Sample records for the query "information records methodological"

  1. Assessment of Registration Information on Methodological Design of Acupuncture RCTs: A Review of 453 Registration Records Retrieved from WHO International Clinical Trials Registry Platform

    PubMed Central

    Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan

    2014-01-01

    Background. This review provides the first assessment of the methodological information in the registration records (protocols) of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP were collected. The methodological design assessment examined whether the randomization methods, allocation concealment, and blinding were adequate, based on the information in the registration records. Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in the registration records: 76.4%, 89.0%, and 21.4% of records provided no information on randomization methods, allocation concealment, and blinding, respectively. Only 107 (23.6%), 48 (10.6%), and 210 (46.4%) records reported adequate randomization methods, allocation concealment, and blinding, respectively. Methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to that of RCTs without ethics approval, and it differed among registries. Conclusions. The overall methodological design reflected in the registration records of acupuncture RCTs is not very good, although it has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be because these descriptions are not taken seriously when acupuncture RCTs are registered. PMID:24688591

  2. Assessment of Registration Information on Methodological Design of Acupuncture RCTs: A Review of 453 Registration Records Retrieved from WHO International Clinical Trials Registry Platform.

    PubMed

    Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan

    2014-01-01

    Background. This review provides the first assessment of the methodological information in the registration records (protocols) of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP were collected. The methodological design assessment examined whether the randomization methods, allocation concealment, and blinding were adequate, based on the information in the registration records. Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in the registration records: 76.4%, 89.0%, and 21.4% of records provided no information on randomization methods, allocation concealment, and blinding, respectively. Only 107 (23.6%), 48 (10.6%), and 210 (46.4%) records reported adequate randomization methods, allocation concealment, and blinding, respectively. Methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to that of RCTs without ethics approval, and it differed among registries. Conclusions. The overall methodological design reflected in the registration records of acupuncture RCTs is not very good, although it has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be because these descriptions are not taken seriously when acupuncture RCTs are registered.

  3. 40 CFR 98.287 - Records that must be retained.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Methodology in § 98.37 and the information listed in this paragraph (a): (1) Records of all petroleum coke... conducted for reported data listed in § 98.286(b). (2) Records of all petroleum coke purchases. (3) Annual...

  4. 40 CFR 98.287 - Records that must be retained.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Methodology in § 98.37 and the information listed in this paragraph (a): (1) Records of all petroleum coke... conducted for reported data listed in § 98.286(b). (2) Records of all petroleum coke purchases. (3) Annual...

  5. 40 CFR 98.287 - Records that must be retained.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Methodology in § 98.37 and the information listed in this paragraph (a): (1) Records of all petroleum coke... conducted for reported data listed in § 98.286(b). (2) Records of all petroleum coke purchases. (3) Annual...

  6. 40 CFR 98.287 - Records that must be retained.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Methodology in § 98.37 and the information listed in this paragraph (a): (1) Records of all petroleum coke... conducted for reported data listed in § 98.286(b). (2) Records of all petroleum coke purchases. (3) Annual...

  7. 40 CFR 98.287 - Records that must be retained.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Methodology in § 98.37 and the information listed in this paragraph (a): (1) Records of all petroleum coke... conducted for reported data listed in § 98.286(b). (2) Records of all petroleum coke purchases. (3) Annual...

  8. A Community Health Record: Improving Health Through Multisector Collaboration, Information Sharing, and Technology.

    PubMed

    King, Raymond J; Garrett, Nedra; Kriseman, Jeffrey; Crum, Melvin; Rafalski, Edward M; Sweat, David; Frazier, Renee; Schearer, Sue; Cutts, Teresa

    2016-09-08

    We present a framework for developing a community health record to bring stakeholders, information, and technology together to collectively improve the health of a community. It is both social and technical in nature and presents an iterative and participatory process for achieving multisector collaboration and information sharing. It proposes a methodology and infrastructure for bringing multisector stakeholders and their information together to inform, target, monitor, and evaluate community health initiatives. The community health record is defined as both the proposed framework and a tool or system for integrating and transforming multisector data into actionable information. It is informed by the electronic health record, personal health record, and County Health Ranking systems but differs in its social complexity, communal ownership, and provision of information to multisector partners at scales ranging from address to zip code.

  9. A Community Health Record: Improving Health Through Multisector Collaboration, Information Sharing, and Technology

    PubMed Central

    Garrett, Nedra; Kriseman, Jeffrey; Crum, Melvin; Rafalski, Edward M.; Sweat, David; Frazier, Renee; Schearer, Sue; Cutts, Teresa

    2016-01-01

    We present a framework for developing a community health record to bring stakeholders, information, and technology together to collectively improve the health of a community. It is both social and technical in nature and presents an iterative and participatory process for achieving multisector collaboration and information sharing. It proposes a methodology and infrastructure for bringing multisector stakeholders and their information together to inform, target, monitor, and evaluate community health initiatives. The community health record is defined as both the proposed framework and a tool or system for integrating and transforming multisector data into actionable information. It is informed by the electronic health record, personal health record, and County Health Ranking systems but differs in its social complexity, communal ownership, and provision of information to multisector partners at scales ranging from address to zip code. PMID:27609300

  10. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    PubMed

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  11. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    PubMed

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) systematic review methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was performed to summarize the steps and methodologies followed to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs; a representative example is the missing description of terminology bindings and of the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good-practice methodology to be used by any clinical information modeler.

  12. A Program Evaluation Using Client Records and Census Data.

    ERIC Educational Resources Information Center

    Bachrach, Kenneth M.; Zautra, Alex

    Use of client records and census data as a research methodology can provide mental health planners with information on community needs as well as the adequacy of existing programs. Three ways of analyzing client records in conjunction with census data are: (1) tract by tract, comparing client geographic distribution with census characteristics;…

  13. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network

    PubMed Central

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using EEMD alone, the proposed method is able to take advantage of all useful information in addition to sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer, and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method, and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network, and the other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that information from human observation or system repair records is very helpful to the fault diagnosis. The method is effective and efficient in diagnosing faults based on uncertain, incomplete information. PMID:25938760

  14. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    PubMed

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using EEMD alone, the proposed method is able to take advantage of all useful information in addition to sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer, and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method, and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network, and the other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that information from human observation or system repair records is very helpful to the fault diagnosis. The method is effective and efficient in diagnosing faults based on uncertain, incomplete information.
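    The feature-extraction step described in the abstract above (IMF energies feeding the fault-feature layer) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the EEMD decomposition itself is not reproduced, and the IMFs below are toy stand-ins for the modes that EEMD would produce from a vibration signal.

```python
# Sketch of the fault-feature step: the energy of each intrinsic mode
# function (IMF) is used as a feature for the Bayesian network's
# fault-feature layer. The IMFs here are hypothetical toy signals.

def imf_energy_features(imfs):
    """Return the normalized energy of each IMF as a feature vector."""
    energies = [sum(x * x for x in imf) for imf in imfs]
    total = sum(energies) or 1.0
    return [e / total for e in energies]

# Hypothetical IMFs (in practice these would come from EEMD).
imfs = [
    [0.9, -0.8, 0.7, -0.9],   # high-frequency mode
    [0.3, 0.2, -0.3, -0.2],   # mid-frequency mode
    [0.1, 0.1, 0.1, 0.1],     # residual trend
]

features = imf_energy_features(imfs)
print([round(f, 3) for f in features])  # -> [0.902, 0.085, 0.013]
```

    The normalized energies sum to one, so they can be discretized and entered as evidence in the fault-feature layer of the diagnostic network.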

  15. Social Neuroscience and Hyperscanning Techniques: Past, Present and Future

    PubMed Central

    Babiloni, Fabio; Astolfi, Laura

    2012-01-01

    This paper reviews the published literature on hyperscanning methodologies using hemodynamic or neuroelectric modalities. In particular, we describe how different brain recording devices have been employed in different experimental paradigms to gain information about the subtle nature of human interactions. The review also includes papers based on single-subject recordings in which a correlation was found between the activities of different (non-simultaneously recorded) participants in the experiment. The description begins with the methodological issues related to simultaneous measurements, followed by the results generated by such approaches. Finally, possible future uses of these new approaches to explore human social interactions are discussed. PMID:22917915

  16. An Information Transmission Measure for the Analysis of Effective Connectivity among Cortical Neurons

    PubMed Central

    Law, Andrew J.; Sharma, Gaurav; Schieber, Marc H.

    2014-01-01

    We present a methodology for detecting effective connections between simultaneously recorded neurons using an information transmission measure to identify the presence and direction of information flow from one neuron to another. Using simulated and experimentally-measured data, we evaluate the performance of our proposed method and compare it to the traditional transfer entropy approach. In simulations, our measure of information transmission outperforms transfer entropy in identifying the effective connectivity structure of a neuron ensemble. For experimentally recorded data, where ground truth is unavailable, the proposed method also yields a more plausible connectivity structure than transfer entropy. PMID:21096617
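    The transfer entropy baseline that the abstract above compares against can be sketched for binary spike trains. This is a hedged illustration of the traditional transfer entropy measure, not the authors' proposed information transmission measure; the series and history length of 1 are assumptions for the example.

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Estimate TE(X -> Y) for equal-length binary series, history 1:
    TE = sum p(y1, y0, x0) * log2( p(y1 | y0, x0) / p(y1 | y0) )."""
    triples = Counter()   # (y_next, y_prev, x_prev)
    pairs = Counter()     # (y_next, y_prev)
    cond = Counter()      # (y_prev, x_prev)
    prev = Counter()      # y_prev
    n = len(y) - 1
    for t in range(n):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs[(y[t + 1], y[t])] += 1
        cond[(y[t], x[t])] += 1
        prev[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_given_both = c / cond[(y0, x0)]
        p_given_y = pairs[(y1, y0)] / prev[y0]
        te += p_joint * log2(p_given_both / p_given_y)
    return te

# Toy example: y copies x with one step of delay, so x drives y and the
# estimated information flow x -> y should exceed y -> x.
x = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y = [0] + x[:-1]
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # expect True
```

    With real spike trains the plug-in probability estimates above would need far longer recordings; the point of the sketch is only the directionality of the measure.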

  17. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    ERIC Educational Resources Information Center

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  18. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices of CIM, software, and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. Its application provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism.

  19. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology combining video, audio, and nonlinear structural dynamic analysis, with potential application in forensic engineering, is presented. The methodology was developed and successfully demonstrated in the analysis of the collapse of a heavy transportable bridge during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks, and rupture of high-performance structural materials. A videotape recording from a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology extracted the relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  20. 36 CFR 1256.28 - Does NARA make any exceptions for access to records containing privacy-restricted information?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... research to qualified persons doing biomedical or social science research under the conditions outlined in... who wish to have access to records restricted by § 1256.56 to conduct biomedical or social science... even for biomedical or social science research; (ii) The methodology proposed by the requester will...

  1. Challenges and methodology for indexing the computerized patient record.

    PubMed

    Ehrler, Frédéric; Ruch, Patrick; Geissbuhler, Antoine; Lovis, Christian

    2007-01-01

    Patient records contain the most crucial documents for managing the treatment and healthcare of patients in the hospital. Retrieving information from these records in an easy, quick, and safe way helps care providers save time and find important facts about their patients' health. This paper presents the scalability issues raised by the indexing and retrieval of the information contained in patient records. For this study, EasyIR, an information retrieval tool that performs full-text queries and retrieves the related documents, was used. An evaluation of the performance reveals that the indexing process suffers from overhead as a consequence of the particular structure of the patient records: most IR tools are designed to manage very large numbers of documents in a single index, whereas our hypothesis imposed one index per record, which usually implies few documents. As the number of modifications and creations of patient records in a day is significant, a specialized and efficient indexing tool is required.
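    The one-index-per-record design discussed in the abstract above can be sketched as a tiny inverted index built over the few documents of a single patient record. This is a hypothetical simplification for illustration, not the EasyIR implementation; the document names and contents are invented.

```python
# Minimal sketch of a per-record inverted index: one small index is
# built per patient record, covering only that record's documents.

def build_index(documents):
    """Map each term to the set of document ids containing it."""
    index = {}
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index.setdefault(term, set()).add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term."""
    sets = [index.get(term, set()) for term in query.lower().split()]
    if not sets:
        return set()
    result = sets[0].copy()
    for s in sets[1:]:
        result &= s
    return result

# One patient record = one small index over its few documents.
record = {
    "note-1": "chest pain radiating to left arm",
    "note-2": "ECG normal no acute findings",
    "lab-1": "troponin elevated repeat in six hours",
}
index = build_index(record)
print(sorted(search(index, "chest pain")))  # -> ['note-1']
```

    The overhead the paper measures comes from maintaining many such small indexes, one per record, instead of one large shared index.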

  2. Research methodology and applied statistics. Part 2: the literature search.

    PubMed

    Prince, B; Makrides, L; Richman, J

    1980-01-01

    This paper presents a basic methodology for an effective and efficient retrieval and recording of written materials in a subject area. The purpose of the literature review is examined and the criteria for selection of materials for inclusion are outlined. The methodology then describes the role of the librarian, various types of information resources, how to choose appropriate indexing and abstracting services, and a simple efficient method of recording the items found. The importance and use of Medical Subject Headings for research in physiotherapy is emphasized. A survey of types of book materials and how to locate them is followed by a detailed description of the most useful indexing and abstracting services available, in particular, the publications of the National Library of Medicine, notably Index Medicus, as well as Excerpta Medica and the Science Citation Index. A discussion of on-line search services, their coverage and availability in Canada, concludes the review of information sources. Finally, guidelines for selecting and summarizing the materials located and comments on the literary style for a review are supplied.

  3. 75 FR 13076 - Privacy Act of 1974; Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-18

    ....C., Sections 141 and 193 and the U.S. Census Bureau; and to undertake methodological evaluations and enhancements leading to improved data collection and quality control studies. Also, information collected by...

  4. Los Alamos National Laboratory: A guide to records series supporting epidemiologic studies conducted for the Department of Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-01-01

    The purpose of this guide is to describe each series of records that pertains to the epidemiologic studies conducted by the Epidemiology Section of the Occupational Medicine Group (ESH-2) at the Department of Energy's (DOE) Los Alamos National Laboratory (LANL) in Los Alamos, New Mexico. The records described in this guide relate to occupational studies performed by the Epidemiology Section, including those pertaining to workers at LANL, Mound Plant, Oak Ridge Reservation, Pantex Plant, Rocky Flats Plant, and Savannah River Site. Also included are descriptions of other health-related records generated or collected by the Epidemiology Section and a small set of records collected by the Industrial Hygiene and Safety Group. This guide is not designed to describe the universe of records generated by LANL which may be used for epidemiologic studies of the LANL work force. History Associates Incorporated (HAI) prepared this guide as part of its work as the support services contractor for DOE's Epidemiologic Records Inventory Project. This introduction briefly describes the Epidemiologic Records Inventory Project, HAI's role in the project, the history of LANL, the history and functions of LANL's Health Division and Epidemiology Section, and the various epidemiologic studies performed by the Epidemiology Section. It provides information on the methodology that HAI used to inventory and describe records housed in the offices of the LANL Epidemiology Section in Technical Area 59 and at the LANL Records Center. Other topics include the methodology used to produce the guide, the arrangement of the detailed record series descriptions, and information concerning access to records repositories.

  5. A data-driven feature extraction framework for predicting the severity of condition of congestive heart failure patients.

    PubMed

    Sideris, Costas; Alshurafa, Nabil; Pourhomayoun, Mohammad; Shahmohammadi, Farhad; Samy, Lauren; Sarrafzadeh, Majid

    2015-01-01

    In this paper, we propose a novel methodology for utilizing disease diagnostic information to predict the severity of condition of Congestive Heart Failure (CHF) patients. Our methodology relies on a novel, clustering-based feature extraction framework built on disease diagnostic information. To reduce dimensionality, we identify disease clusters using co-occurrence frequencies and then utilize these clusters as features to predict patient severity of condition. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS) of the Healthcare Cost and Utilization Project (HCUP), which contains 7 million discharge records with ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3,041 patients. We compare our cluster-based feature set with one that incorporates the Charlson comorbidity score as a feature and demonstrate an accuracy improvement of up to 14% in predicting the severity of condition.
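    The co-occurrence clustering idea in the abstract above can be sketched as follows. This is a hedged illustration under stated assumptions: the greedy merge rule, the `min_count` threshold, and the ICD-9-style codes are all invented for the example and are not the cited study's algorithm or data.

```python
from collections import Counter
from itertools import combinations

# Sketch: diagnosis codes that frequently co-occur across discharge
# records are grouped into clusters, and cluster membership can then
# serve as a lower-dimensional feature.

def cooccurrence_clusters(records, min_count=2):
    """Greedily merge codes whose co-occurrence count >= min_count."""
    pairs = Counter()
    for codes in records:
        for a, b in combinations(sorted(set(codes)), 2):
            pairs[(a, b)] += 1
    cluster_of = {}
    clusters = []
    for (a, b), count in pairs.most_common():
        if count < min_count:
            break
        ca, cb = cluster_of.get(a), cluster_of.get(b)
        if ca is None and cb is None:
            clusters.append({a, b})
            cluster_of[a] = cluster_of[b] = len(clusters) - 1
        elif ca is None:
            clusters[cb].add(a); cluster_of[a] = cb
        elif cb is None:
            clusters[ca].add(b); cluster_of[b] = ca
    return [sorted(c) for c in clusters]

# Hypothetical records (codes chosen for illustration only).
records = [
    ["428.0", "401.9", "250.00"],
    ["428.0", "401.9"],
    ["428.0", "401.9", "585.9"],
    ["250.00", "585.9"],
]
print(cooccurrence_clusters(records))  # -> [['401.9', '428.0']]
```

    Each record can then be represented by which clusters its codes fall into, rather than by the full high-dimensional code vocabulary.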

  6. Digitally enabled patients, professionals and providers: making the case for an electronic health record in mental health services.

    PubMed

    Richardson, Jonathan; McDonald, Joe

    2016-10-01

    The move to a digital health service may improve some components of health systems: information, communication and documentation of care. This article gives a brief definition and history of what is meant by an electronic health record (EHR). There is some evidence of benefits in a number of areas, including legibility, accuracy and the secondary use of information, but there is a need for further research, which may need to use different methodologies to analyse the impact an EHR has on patients, professionals and providers.

  7. Program: A Record of the First 40 Years of Electronic Library and Information Systems

    ERIC Educational Resources Information Center

    Tedd, Lucy A.

    2006-01-01

    Purpose: To provide a broad overview of the history of the journal Program: electronic library and information systems and its contents over its first 40 years. Design/methodology/approach: Analysis of content from the original published material, as well as from abstracting and indexing publications and from minutes of Editorial Board meetings.…

  8. Studies and analyses of the space shuttle main engine: High-pressure oxidizer turbopump failure information propagation model

    NASA Technical Reports Server (NTRS)

    Glover, R. C.; Rudy, S. W.; Tischer, A. E.

    1987-01-01

    The high-pressure oxidizer turbopump (HPOTP) failure information propagation model (FIPM) is presented. The text includes a brief discussion of the FIPM methodology and the various elements which comprise a model. Specific details of the HPOTP FIPM are described. Listings of all the HPOTP data records are included as appendices.

  9. Finding clusters of similar events within clinical incident reports: a novel methodology combining case based reasoning and information retrieval

    PubMed Central

    Tsatsoulis, C; Amthauer, H

    2003-01-01

    A novel methodological approach for identifying clusters of similar medical incidents by analyzing large databases of incident reports is described. The discovery of similar events allows the identification of patterns and trends, and makes possible the prediction of future events and the establishment of barriers and best practices. Two techniques from the fields of information science and artificial intelligence have been integrated, namely case-based reasoning and information retrieval, and very good clustering accuracies have been achieved on a test data set of incident reports from transfusion medicine. This work suggests that clustering should integrate the features of an incident captured in traditional form-based records together with the detailed information found in the narrative included in event reports. PMID:14645892

  10. Disruptive technologies for Massachusetts Bay Transportation Authority business strategy exploration.

    DOT National Transportation Integrated Search

    2013-04-01

    There are three tasks for this research: 1. Methodology to extract Road Usage Patterns from Phone Data: We combined the most complete record of daily mobility, based on large-scale mobile phone data, with detailed Geographic Information System (...

  11. Regional-scale analysis of extreme precipitation from short and fragmented records

    NASA Astrophysics Data System (ADS)

    Libertino, Andrea; Allamano, Paola; Laio, Francesco; Claps, Pierluigi

    2018-02-01

    The rain gauge is the oldest and most accurate instrument for rainfall measurement, able to provide long series of reliable data. However, rain gauge records are often plagued by gaps, spatio-temporal discontinuities, and inhomogeneities that can affect their suitability for a statistical assessment of the characteristics of extreme rainfall. Furthermore, the need to discard shorter series to obtain robust estimates leads to ignoring a significant amount of information, which can be essential, especially when large return-period estimates are sought. This work describes a robust statistical framework for dealing with uneven and fragmented rainfall records on a regional spatial domain. The proposed technique, named "patched kriging", allows one to exploit all the information available from the recorded series, independently of their length, to provide extreme rainfall estimates in ungauged areas. The methodology involves the sequential application of the ordinary kriging equations, producing a homogeneous dataset of synthetic series of uniform length. In this way, the errors inherent in any regional statistical estimation can be easily represented in the spatial domain and, possibly, corrected. Furthermore, the homogeneity of the obtained series provides robustness against local artefacts during the parameter estimation phase. An application to a case study in north-western Italy demonstrates the potential of the methodology and provides a significant basis for discussing its advantages over previous techniques.
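    The ordinary kriging step that the "patched kriging" technique above applies sequentially can be sketched in a few lines. This is a minimal illustration under assumptions: the exponential covariance model, its 50 km range, and the station layout are invented for the example, and the sequential patching of gap-filled series is not reproduced.

```python
from math import exp, hypot

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def ordinary_kriging(stations, values, target, scale=50.0):
    """Estimate the value at `target` from gauged `stations`."""
    cov = lambda p, q: exp(-hypot(p[0] - q[0], p[1] - q[1]) / scale)
    n = len(stations)
    # Kriging system: covariance matrix plus unbiasedness constraint.
    A = [[cov(stations[i], stations[j]) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(s, target) for s in stations] + [1.0]
    weights = solve(A, b)[:n]
    return sum(w * z for w, z in zip(weights, values))

# Hypothetical gauges (km coordinates) and rainfall depths (mm).
stations = [(0.0, 0.0), (30.0, 0.0), (0.0, 40.0)]
rain_mm = [12.0, 20.0, 16.0]
estimate = ordinary_kriging(stations, rain_mm, (10.0, 10.0))
print(round(estimate, 1))
```

    In the patched-kriging scheme, this interpolation would be repeated time step by time step to synthesize homogeneous series at every site, including those with short or fragmented records.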

  12. Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces.

    PubMed Central

    Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.

    1997-01-01

    This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data, including video recordings of health care workers as they interact with systems such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively based usability analyses are detailed, and the application of such analyses in the iterative process of system and interface development is discussed. PMID:9357620

  13. Remembering History: The Work of the Information Services Sub-Committee of the Joint Information Systems Committee in the UK

    ERIC Educational Resources Information Center

    Law, Derek

    2006-01-01

    Purpose: The paper seeks to record the work of the committee and its interaction with the much better known Electronic Libraries (eLib) Programme. It also examines the principles that underlay the development of content acquisition and supporting infrastructure in UK university libraries in the 1990s. Design/methodology/approach: A historical…

  14. [Definition of hospital discharge, serious injury and death from traffic injuries].

    PubMed

    Pérez, Katherine; Seguí-Gómez, María; Arrufat, Vita; Barberia, Eneko; Cabeza, Elena; Cirera, Eva; Gil, Mercedes; Martín, Carlos; Novoa, Ana M; Olabarría, Marta; Lardelli, Pablo; Suelves, Josep Maria; Santamariña-Rubio, Elena

    2014-01-01

    Road traffic injury surveillance involves methodological difficulties due, among other reasons, to the lack of consensus criteria for case definition. Police records have usually been the main source of information for monitoring traffic injuries, while health system data has hardly been used. Police records usually include comprehensive information on the characteristics of the crash, but often underreport injury cases and do not collect reliable information on the severity of injuries. However, statistics on severe traffic injuries have been based almost exclusively on police data. The aim of this paper is to propose criteria based on medical records to define: a) "Hospital discharge for traffic injuries", b) "Person with severe traffic injury", and c) "Death from traffic injuries" in order to homogenize the use of these sources. Copyright © 2014. Published by Elsevier Espana.

  15. Using scenarios and personas to enhance the effectiveness of heuristic usability evaluations for older adults and their care team.

    PubMed

    Kneale, Laura; Mikles, Sean; Choi, Yong K; Thompson, Hilaire; Demiris, George

    2017-09-01

    Using heuristics to evaluate user experience is a common methodology for human-computer interaction studies. One challenge of this method is the inability to tailor results towards specific end-user needs. This manuscript reports on a method that uses validated scenarios and personas of older adults and care team members to enhance heuristic evaluations of the usability of commercially available personal health records for homebound older adults. Our work extends the Chisnell and Redish heuristic evaluation methodology by using a protocol that relies on multiple expert reviews of each system. It further standardizes the heuristic evaluation process through the incorporation of task-based scenarios. We were able to use the modified version of the Chisnell and Redish heuristic evaluation methodology to identify potential usability challenges of two commercially available personal health record systems. This allowed us to: (1) identify potential usability challenges for specific types of users, (2) describe improvements that would be valuable to all end-users of the system, and (3) better understand how the interactions of different users may vary within a single personal health record. The methodology described in this paper may help designers of consumer health information technology tools, such as personal health records, understand the needs of diverse end-user populations. Such methods may be particularly helpful when designing systems for populations that are difficult to recruit for end-user evaluations through traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Dynamic Creation of Social Networks for Syndromic Surveillance Using Information Fusion

    NASA Astrophysics Data System (ADS)

    Holsopple, Jared; Yang, Shanchieh; Sudit, Moises; Stotz, Adam

    To enhance the effectiveness of health care, many medical institutions have started transitioning to electronic health and medical records and sharing these records between institutions. The large amount of complex and diverse data makes it difficult to identify and track relationships and trends, such as disease outbreaks, from the data points. INFERD (Information Fusion Engine for Real-Time Decision-Making) is an information fusion tool that dynamically correlates and tracks event progressions. This paper presents a methodology that utilizes the efficient and flexible structure of INFERD to create social networks representing progressions of disease outbreaks. Individual symptoms are treated as features, allowing multiple hypotheses to be tracked and analyzed for effective and comprehensive syndromic surveillance.

  17. How to convince your manager to invest in an HIS preimplementation methodology for appraisal of material, process and human costs and benefits.

    PubMed Central

    Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.

    2000-01-01

    Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a preimplementation methodology for evaluating and appraising material, process and human costs and benefits. Based on analysis of users' needs and organizational processes, the methodology provides an evaluative set of financial and non-financial indicators that can be integrated into a decision-making and investment evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to extend the method to the entire project. PMID:11079851

  18. Methodologies, Models and Algorithms for Patients Rehabilitation.

    PubMed

    Fardoun, H M; Mashat, A S

    2016-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The objective of this focus theme is to present current solutions, by means of technologies and human factors, related to the use of Information and Communication Technologies (ICT) for improving patient rehabilitation. The focus theme examines distinctive dimensions of strengthening methodologies, models and algorithms for disabled people in terms of rehabilitation and health care, and explores the extent to which ICT is a useful tool in this process. The focus theme records a set of solutions for ICT systems developed to improve the rehabilitation process of disabled people and to help them carry out their daily life. The development and subsequent deployment of computing for patients' rehabilitation is an area of continuing interest and growth.

  19. Protecting UNESCO World Heritage Properties' Integrity: The Role of Recording and Documentation in Risk Management for Petra

    NASA Astrophysics Data System (ADS)

    Santana Quintero, M.; Cesaro, G.; Ishakat, F.; Vandesande, A.; Vileikis, O.; Vadafari, A.; Paolini, A.; Van Balen, K.; Fakhoury, L.

    2012-07-01

    Risk management, as it has been defined, involves the decision-making process following a risk assessment (Ball, Watt, 2003). It is the process of managing so as to minimize losses and impacts on the significance of historic structures, and of balancing the opportunities gained and lost. This contribution explains the "heritage information" platform developed using low-cost recording, documentation and information management tools to serve as a container for assessments resulting from the application of a risk methodology in a pilot area of the Petra Archaeological Park, in particular those tools that make it possible to prepare, digitally and cost-effectively, an adequate baseline record to identify disturbances and threats. Furthermore, this paper reflects on the issue of mapping the World Heritage property's boundaries by illustrating a methodology developed during the project, and on further research to overcome the lack of boundaries and a buffer zone for the protection of the Petra World Heritage site, as identified in this project. This paper is based on an on-going field project by a multidisciplinary team of experts from the Raymond Lemaire International Centre for Conservation (University of Leuven), UNESCO Amman, the Petra Development and Tourism Region Authority (PDTRA), and Jordan's Department of Antiquities (DoA), as well as experts from Jordan. The recording and documentation approach included in this contribution is part of an on-going effort to develop a methodology for mitigating (active and preventive) risks at the Petra Archaeological Park (Jordan). The risk assessment has been performed using non-intrusive techniques, involving a simple global navigation satellite system (GNSS), photography, and structured visual inspection, as well as a heritage information framework based on Geographic Information Systems. The approach compares site vulnerability with the value assessment to prioritize monuments at risk based on their significance and the magnitude of risk, so that the authorities can plan more in-depth assessments of those highly significant monuments or areas at risk. A decision tool is envisaged as an outcome of this project.

  20. Toward on-chip, in-cell recordings from cultured cardiomyocytes by arrays of gold mushroom-shaped microelectrodes

    PubMed Central

    Fendyur, Anna; Spira, Micha E.

    2012-01-01

    Cardiological research relies greatly on the use of cultured primary cardiomyocytes (CMs). The prime methodology for assessing CM network electrophysiology is based on extracellular recordings by substrate-integrated planar Micro-Electrode Arrays (MEAs). Whereas this methodology permits simultaneous, long-term monitoring of CM electrical activity, it limits the information to extracellular field potentials (FPs). The alternative method of intracellular action potential (AP) recordings by sharp or patch microelectrodes is limited to a single cell at a time. Here, we began to merge the advantages of planar MEAs and intracellular microelectrodes. To that end we cultured rat CMs on arrays of micrometer-size protruding gold mushroom-shaped microelectrodes (gMμEs). Cultured CMs engulf the gMμE, permitting FP recordings from individual cells. Local electroporation of a CM converts the extracellular recording configuration to attenuated intracellular APs with shape and duration similar to those recorded intracellularly. The procedure makes it possible to record APs simultaneously from an unlimited number of CMs. The electroporated membrane spontaneously recovers, allowing repeated recordings from the same CM a number of times (>8) over more than 10 days. Further development of the CM-gMμE configuration opens up new avenues for basic and applied biomedical research. PMID:22936913

  1. Eliciting Spontaneous Speech in Bilingual Students: Methods & Techniques.

    ERIC Educational Resources Information Center

    Cornejo, Ricardo J.; And Others

    Intended to provide practical information pertaining to methods and techniques for speech elicitation and production, the monograph offers specific methods and techniques to elicit spontaneous speech in bilingual students. Chapter 1, "Traditional Methodologies for Language Production and Recording," presents an overview of studies using…

  2. 76 FR 62632 - NARA Records Reproduction Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... methodology for creating and changing records reproduction fees, to remove records reproduction fees found in... add the methodology for creating and changing records reproduction fees, to remove records...

  3. New Advanced Technologies to Provide Decentralised and Secure Access to Medical Records: Case Studies in Oncology

    PubMed Central

    Quantin, Catherine; Coatrieux, Gouenou; Allaert, François André; Fassa, Maniane; Bourquard, Karima; Boire, Jean-Yves; de Vlieger, Paul; Maigne, Lydia; Breton, Vincent

    2009-01-01

    The main problem for health professionals and patients in accessing information is that this information is very often distributed over many medical records and locations. This problem is particularly acute in cancerology because patients may be treated for many years and undergo a variety of examinations. Recent advances in technology make it feasible to gain access to medical records anywhere and anytime, allowing the physician or the patient to gather information from an “ephemeral electronic patient record”. However, this easy access to data is accompanied by the requirement for improved security (confidentiality, traceability, integrity, ...) and this issue needs to be addressed. In this paper we propose and discuss a decentralised approach based on recent advances in information sharing and protection: Grid technologies and watermarking methodologies. The potential impact of these technologies for oncology is illustrated by the examples of two experimental cases: a cancer surveillance network and a radiotherapy treatment plan. It is expected that the proposed approach will constitute the basis of a future secure “google-like” access to medical records. PMID:19718446

  4. Use of the My Health Record by people with communication disability in Australia: A review to inform the design and direction of future research.

    PubMed

    Hemsley, Bronwyn; Georgiou, Andrew; Carter, Rob; Hill, Sophie; Higgins, Isabel; van Vliet, Paulette; Balandin, Susan

    2016-12-01

    People with communication disability often struggle to convey their health information to multiple service providers and are at increased risk of adverse health outcomes related to the poor exchange of health information. The purpose of this article was to (a) review the literature informing future research on the Australian personally controlled electronic health record, 'My Health Record' (MyHR), specifically to include people with communication disability and their family members or service providers, and (b) to propose a range of suitable methodologies that might be applied in research to inform training, policy and practice in relation to supporting people with communication disability and their representatives to engage in using MyHR. The authors reviewed the literature and, with a cross-disciplinary perspective, considered ways to apply sociotechnical, health informatics, and inclusive methodologies to research on MyHR use by adults with communication disability. This article outlines a range of research methods suitable for investigating the use of MyHR by people who have communication disability associated with a range of acquired or lifelong health conditions, and their family members, and direct support workers. In planning the allocation of funds towards the health and well-being of adults with disabilities, both disability and health service providers must consider the supports needed for people with communication disability to use MyHR. There is an urgent need to focus research efforts on MyHR in populations with communication disability, who struggle to communicate their health information across multiple health and disability service providers. The design of studies and priorities for future research should be set in consultation with people with communication disability and their representatives. © The Author(s) 2016.

  5. Museum of Comparative Zoology Library--The Agassiz Library: Harvard University.

    ERIC Educational Resources Information Center

    Jonas, Eva S.; Regen, Shari S.

    1986-01-01

    Argues that the Museum of Comparative Zoology Library reflects the union between the nineteenth century natural history values of Louis Agassiz and the twentieth century library and information science methodology. Special collections, records, cataloging and classification, serials and their classification, policies, services, and procedures are…

  6. Cluster randomized trials utilizing primary care electronic health records: methodological issues in design, conduct, and analysis (eCRT Study).

    PubMed

    Gulliford, Martin C; van Staa, Tjeerd P; McDermott, Lisa; McCann, Gerard; Charlton, Judith; Dregan, Alex

    2014-06-11

    There is growing interest in conducting clinical and cluster randomized trials through electronic health records. This paper reports on the methodological issues identified during the implementation of two cluster randomized trials using the electronic health records of the Clinical Practice Research Datalink (CPRD). Two trials were completed in primary care: one aimed to reduce inappropriate antibiotic prescribing for acute respiratory infection; the other aimed to increase physician adherence with secondary prevention interventions after first stroke. The paper draws on documentary records and trial datasets to report on the methodological experience with respect to research ethics and research governance approval, general practice recruitment and allocation, sample size calculation and power, intervention implementation, and trial analysis. We obtained research governance approvals from more than 150 primary care organizations in England, Wales, and Scotland. There were 104 CPRD general practices recruited to the antibiotic trial and 106 to the stroke trial, with the target number of practices being recruited within six months. Interventions were installed into practice information systems remotely over the internet. The mean number of participants per practice was 5,588 in the antibiotic trial and 110 in the stroke trial, with the coefficient of variation of practice sizes being 0.53 and 0.56 respectively. Outcome measures showed substantial correlations between the 12 months before and after intervention, with coefficients ranging from 0.42 for diastolic blood pressure to 0.91 for the proportion of consultations with antibiotics prescribed. Defining practice and participant eligibility for analysis requires careful consideration. Cluster randomized trials may be performed efficiently in large samples from UK general practices using the electronic health records of a primary care database. The geographical dispersal of trial sites presents a difficulty for research governance approval and intervention implementation. Pretrial data analyses should inform trial design and analysis plans. Current Controlled Trials ISRCTN 47558792 and ISRCTN 35701810 (both registered on 17 March 2010).
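
    The sample size implications of the reported cluster sizes can be sketched with a common design-effect approximation for cluster randomized trials with unequal cluster sizes. The formula is a standard approximation (Eldridge et al.), and the intracluster correlation coefficient used below is a hypothetical value, not reported in this record.

```python
def design_effect(mean_cluster_size, icc, cv=0.0):
    """Design effect for a cluster randomized trial with variable
    cluster sizes, using the common approximation
        DEFF = 1 + ((cv**2 + 1) * m - 1) * icc
    where m is the mean cluster size and cv is the coefficient of
    variation of cluster sizes."""
    return 1.0 + ((cv ** 2 + 1.0) * mean_cluster_size - 1.0) * icc

# Numbers from the abstract: stroke trial, 106 practices, mean of 110
# participants per practice, cluster-size CV of 0.56. The ICC (0.05)
# is a hypothetical value for illustration only.
deff = design_effect(110, icc=0.05, cv=0.56)   # ~8.17
effective_n = 106 * 110 / deff                 # ~1426 of 11,660 participants
```

    The large design effect illustrates why such trials need many practices despite the very large raw participant counts available from the database.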

  7. Cluster randomized trials utilizing primary care electronic health records: methodological issues in design, conduct, and analysis (eCRT Study)

    PubMed Central

    2014-01-01

    Background There is growing interest in conducting clinical and cluster randomized trials through electronic health records. This paper reports on the methodological issues identified during the implementation of two cluster randomized trials using the electronic health records of the Clinical Practice Research Datalink (CPRD). Methods Two trials were completed in primary care: one aimed to reduce inappropriate antibiotic prescribing for acute respiratory infection; the other aimed to increase physician adherence with secondary prevention interventions after first stroke. The paper draws on documentary records and trial datasets to report on the methodological experience with respect to research ethics and research governance approval, general practice recruitment and allocation, sample size calculation and power, intervention implementation, and trial analysis. Results We obtained research governance approvals from more than 150 primary care organizations in England, Wales, and Scotland. There were 104 CPRD general practices recruited to the antibiotic trial and 106 to the stroke trial, with the target number of practices being recruited within six months. Interventions were installed into practice information systems remotely over the internet. The mean number of participants per practice was 5,588 in the antibiotic trial and 110 in the stroke trial, with the coefficient of variation of practice sizes being 0.53 and 0.56 respectively. Outcome measures showed substantial correlations between the 12 months before and after intervention, with coefficients ranging from 0.42 for diastolic blood pressure to 0.91 for the proportion of consultations with antibiotics prescribed. Defining practice and participant eligibility for analysis requires careful consideration. Conclusions Cluster randomized trials may be performed efficiently in large samples from UK general practices using the electronic health records of a primary care database. The geographical dispersal of trial sites presents a difficulty for research governance approval and intervention implementation. Pretrial data analyses should inform trial design and analysis plans. Trial registration Current Controlled Trials ISRCTN 47558792 and ISRCTN 35701810 (both registered on 17 March 2010). PMID:24919485

  8. Methodological update in Medicina Intensiva.

    PubMed

    García Garmendia, J L

    2018-04-01

    Research in the critically ill is complex owing to the heterogeneity of patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, the quantity and quality of records are high, as is the relevance of the variables used, such as survival. Methodological tools have evolved, offering new perspectives and analysis models that allow relevant information to be extracted from the data accompanying the critically ill patient. The need for training in methodology and in the interpretation of results is an important challenge for intensivists who wish to stay updated on research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  9. The Philip Morris Information Network: A Library Database on an In-House Timesharing System.

    ERIC Educational Resources Information Center

    DeBardeleben, Marian Z.; And Others

    1983-01-01

    Outlines a database constructed at Philip Morris Research Center Library which encompasses holdings and circulation and acquisitions records for all items in the library. Host computer (DECSYSTEM-2060), software (BASIC), database design, search methodology, cataloging, and accessibility are noted; sample search, circ-in profile, end user profiles,…

  10. A study on design and development of enterprise-wide concepts for clinical documentation templates.

    PubMed

    Zhou, Li; Gurjar, Rupali; Regier, Rachel; Morgan, Stephen; Meyer, Theresa; Aroy, Teal; Goldman, Debora Scavone; Hongsermeier, Tonya; Middleton, Blackford

    2008-11-06

    Structured clinical documents are associated with many potential benefits. Underlying terminologies and structure of information are keys to their successful implementation and use. This paper presents a methodology for design and development of enterprise-wide concepts for clinical documentation templates for an ambulatory Electronic Medical Record (EMR) system.

  11. A cross-sectional survey of 5-year-old children with non-syndromic unilateral cleft lip and palate: the Cleft Care UK study. Part 1: background and methodology.

    PubMed

    Persson, M; Sandy, J R; Waylen, A; Wills, A K; Al-Ghatam, R; Ireland, A J; Hall, A J; Hollingworth, W; Jones, T; Peters, T J; Preston, R; Sell, D; Smallridge, J; Worthington, H; Ness, A R

    2015-11-01

    We describe the methodology for a major study investigating the impact of reconfigured cleft care in the United Kingdom (UK) 15 years after an initial survey, detailed in the Clinical Standards Advisory Group (CSAG) report in 1998, had informed government recommendations on centralization. This is a UK multicentre cross-sectional study of 5-year-olds born with non-syndromic unilateral cleft lip and palate. Children born between 1 April 2005 and 31 March 2007 were seen in cleft centre audit clinics. Consent was obtained for the collection of routine clinical measures (speech recordings, hearing, photographs, models, oral health, psychosocial factors) and anthropometric measures (height, weight, head circumference). The methodology for each clinical measure followed those of the earlier survey as closely as possible. We identified 359 eligible children and recruited 268 (74.7%) to the study. Eleven separate records for each child were collected at the audit clinics. In total, 2666 (90.4%) were collected from a potential 2948 records. The response rates for the self-reported questionnaires, completed at home, were 52.6% for the Health and Lifestyle Questionnaire and 52.2% for the Satisfaction with Service Questionnaire. Response rates and measures were similar to those achieved in the previous survey. There are practical, administrative and methodological challenges in repeating cross-sectional surveys 15 years apart and producing comparable data. © 2015 The Authors. Orthodontics & Craniofacial Research Published by John Wiley & Sons Ltd.

  12. Transmission line relay mis-operation detection based on time-synchronized field data

    DOE PAGES

    Esmaeilian, Ahad; Popovic, Tomo; Kezunovic, Mladen

    2015-05-04

    In this paper, a real-time tool to detect transmission line relay mis-operation is implemented. The tool uses time-synchronized measurements obtained from both ends of the line during disturbances. The proposed fault analysis tool comes into the picture only after the protective device has operated and tripped the line. The proposed methodology is able not only to detect, classify, and locate transmission line faults, but also to accurately confirm whether the line was tripped due to a mis-operation of protective relays. The analysis report includes either a detailed description of the fault type and location or the detection of a relay mis-operation. As such, it can be a source of very useful information to support system restoration. The focus of the paper is on the implementation requirements that allow practical application of the methodology, which is illustrated using field data obtained from a real power system. Testing and validation are done using field data recorded by digital fault recorders and protective relays. The test data included several hundred event records corresponding to both relay mis-operations and actual faults. The discussion of results addresses various challenges encountered during the implementation and validation of the presented methodology.
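
    The two-ended, time-synchronized fault-location principle that such tools build on can be illustrated with the classical lumped-parameter formulation; this is a generic textbook sketch, not the paper's actual algorithm, and the per-unit phasor values are made up for the example.

```python
def fault_distance(Vs, Is, Vr, Ir, Z):
    """Per-unit distance m to the fault from the sending end of a short
    (lumped-parameter) line, from synchronized phasors at both ends.
    Both ends see the same fault-point voltage:
        Vs - m*Z*Is = Vr - (1 - m)*Z*Ir
    which solves to m = (Vs - Vr + Z*Ir) / (Z*(Is + Ir))."""
    m = (Vs - Vr + Z * Ir) / (Z * (Is + Ir))
    return m.real   # imaginary part is ~0 for consistent measurements

# Synthetic check with made-up per-unit phasors: place a fault at
# m = 0.3 and confirm the formula recovers it.
Z = 0.1 + 0.8j                         # total line impedance
Is, Ir, Vf = 1.2 - 0.5j, 0.8 + 0.1j, 0.9 + 0.0j
Vs = Vf + 0.3 * Z * Is                 # sending-end voltage
Vr = Vf + 0.7 * Z * Ir                 # receiving-end voltage
m = fault_distance(Vs, Is, Vr, Ir, Z)  # ~0.3
```

    A large residual imaginary part in m is itself a useful consistency check on the synchronized measurements.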

  13. [Public health education integrated in hospital. An internship proposal, "Medical information and pharmacology"].

    PubMed

    Boulay, F; Chevallier, T; Staccini, P; Chichmanian, R M

    1997-06-01

    According to a recent circular reforming French medical studies, we propose teaching medical information and pharmacology in situ within the hospital. Students could acquire an investigative methodology on the economics of medicine. Over four sessions it will cover the successive stages of medical information processing and will be subject to assessment: case studies and an appraisal of the student's instruction record. By combining public health teaching with clinical practice, our project promotes its development in contact with other learnings and activities such as clinical research.

  14. Piloting a Deceased Subject Integrated Data Repository and Protecting Privacy of Relatives

    PubMed Central

    Huser, Vojtech; Kayaalp, Mehmet; Dodd, Zeyno A.; Cimino, James J.

    2014-01-01

    The use of deceased subject Electronic Health Records can be an important piloting platform for informatics or biomedical research. The existing legal framework allows such research under less strict de-identification criteria; however, the privacy of non-decedents must be protected. We report on the creation of the deceased subject Integrated Data Repository (dsIDR) at the National Institutes of Health Clinical Center and a pilot methodology to remove secondary protected health information or identifiable information (secondary PxI; information about persons other than the primary patient). We characterize the available structured coded data in dsIDR and report estimated frequencies of secondary PxI, ranging from 12.9% (sensitive token presence) to 1.1% (using stricter criteria). Federating decedent EHR data from multiple institutions can address sample size limitations, and our pilot study provides lessons learned and a methodology that can be adopted by other institutions. PMID:25954378
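
    The "sensitive token presence" criterion mentioned above can be sketched as a simple lexicon scan over note text. This is a minimal illustration only; the token list below is hypothetical and the dsIDR project's actual lexicon and criteria are not reproduced in this record.

```python
import re

# Hypothetical relation-token lexicon (assumed for this sketch; the
# actual dsIDR lexicon is not given in this record).
RELATIVE_TOKENS = {"wife", "husband", "son", "daughter", "mother",
                   "father", "brother", "sister", "nephew", "niece"}

def flag_secondary_pxi(note):
    """Sensitive-token-presence criterion: flag a clinical note that
    may mention persons other than the primary patient, returning the
    matched tokens (empty list means no flag)."""
    words = set(re.findall(r"[a-z']+", note.lower()))
    return sorted(words & RELATIVE_TOKENS)
```

    Token presence is a deliberately over-sensitive screen, which matches the record's observation that it flags far more notes (12.9%) than stricter criteria (1.1%).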

  15. Piloting a deceased subject integrated data repository and protecting privacy of relatives.

    PubMed

    Huser, Vojtech; Kayaalp, Mehmet; Dodd, Zeyno A; Cimino, James J

    2014-01-01

    The use of deceased subject Electronic Health Records can be an important piloting platform for informatics or biomedical research. The existing legal framework allows such research under less strict de-identification criteria; however, the privacy of non-decedents must be protected. We report on the creation of the deceased subject Integrated Data Repository (dsIDR) at the National Institutes of Health Clinical Center and a pilot methodology to remove secondary protected health information or identifiable information (secondary PxI; information about persons other than the primary patient). We characterize the available structured coded data in dsIDR and report estimated frequencies of secondary PxI, ranging from 12.9% (sensitive token presence) to 1.1% (using stricter criteria). Federating decedent EHR data from multiple institutions can address sample size limitations, and our pilot study provides lessons learned and a methodology that can be adopted by other institutions.

  16. Recording, Publishing, and Reconstructing Wooden Shipwrecks

    NASA Astrophysics Data System (ADS)

    Castro, F.; Bendig, C.; Bérubé, M.; Borrero, R.; Budsberg, N.; Dostal, C.; Monteiro, A.; Smith, C.; Torres, R.; Yamafune, K.

    2018-04-01

    Almost three decades ago J. Richard Steffy (in: Tzalas (ed) Tropis II, proceedings of the 2nd international symposium on ship construction in antiquity. Athens, pp 315-320, 1990, in: Tzalas (ed) Tropis III, proceedings of the 3rd international symposium on ship construction in antiquity. Athens, pp 417-428, 1995) voiced the need to standardize the recording and publication of shipwrecks. Cluster analysis of construction features is difficult if archaeologists record different and non-overlapping features. This paper discusses the necessity to standardize the recording and publishing of a set of consistent and compatible basic construction features when archaeologists assess, survey, or excavate wooden shipwrecks and proposes a methodology for the recording of wooden hulls. It also emphasizes the urgency of a wide and complete sharing of archaeological information in maritime archaeology.

  17. Call recognition and individual identification of fish vocalizations based on automatic speech recognition: An example with the Lusitanian toadfish.

    PubMed

    Vieira, Manuel; Fonseca, Paulo J; Amorim, M Clara P; Teixeira, Carlos J C

    2015-12-01

    The study of acoustic communication in animals often requires not only the recognition of species-specific acoustic signals but also the identification of individual subjects, all in a complex acoustic background. Moreover, when very long recordings are to be analyzed, automatic recognition and identification processes are invaluable tools to extract the relevant biological information. A pattern recognition methodology based on hidden Markov models is presented, inspired by successful results obtained in the most widely known and complex acoustic communication signal: human speech. This methodology was applied here for the first time to the detection and recognition of fish acoustic signals, specifically in a stream of round-the-clock recordings of Lusitanian toadfish (Halobatrachus didactylus) in their natural estuarine habitat. The results show that this methodology is able not only to detect the mating sounds (boatwhistles) but also to identify individual male toadfish, reaching an identification rate of ca. 95%. Moreover, this method also proved to be a powerful tool to assess signal durations in large data sets. However, the system failed in recognizing other sound types.
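
The individual-identification step can be sketched as likelihood scoring against per-individual HMMs: each candidate's model scores the observed feature sequence, and the highest-scoring model wins. The toy parameters and symbolised observation sequence below are invented for illustration and do not reflect the paper's actual feature extraction or model topology; only the scaled forward algorithm itself is standard.

```python
import numpy as np

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    alpha = start * emit[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]   # predict, then weight by emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()                   # rescale to avoid underflow
    return loglik

# Toy per-individual models: each "male" has a characteristic emission bias.
start = np.array([0.5, 0.5])
trans = np.array([[0.9, 0.1], [0.1, 0.9]])
emit_A = np.array([[0.8, 0.2], [0.2, 0.8]])   # individual A favours symbol 0
emit_B = np.array([[0.2, 0.8], [0.1, 0.9]])   # individual B favours symbol 1

obs = [0, 0, 0, 1, 1, 0, 0]                   # a symbolised call-feature sequence
scores = {name: forward_loglik(obs, start, trans, e)
          for name, e in [("A", emit_A), ("B", emit_B)]}
best = max(scores, key=scores.get)
print(best)  # A
```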

  18. On Modeling Research Work for Describing and Filtering Scientific Information

    NASA Astrophysics Data System (ADS)

    Sicilia, Miguel-Ángel

    Existing models for Research Information Systems (RIS) properly address the description of people and organizations, projects, facilities, and their outcomes, e.g. papers, reports or patents. While this is adequate for the recording and accountability of research investments, helping researchers find relevant people, organizations or results requires considering both the content of research work and its context. The content is not only related to the domain area; it also requires modeling methodological issues such as variables, instruments or scientific methods that can then be used as search criteria. The context of research work is determined by the ongoing projects or scientific interests of an individual or a group, and can be expressed using the same methodological concepts. However, modeling methodological issues is notably complex and dependent on the scientific discipline and research area. This paper sketches the main requirements for such models, providing some motivating examples that could serve as a point of departure for future attempts at developing an upper ontology for research methods and tools.

  19. The Emerging Fourth Tier in K-12 Education Finance in British Columbia, Canada: Increasing Privatisation and Implications for Social Justice

    ERIC Educational Resources Information Center

    Poole, Wendy; Fallon, Gerald

    2015-01-01

    This paper examines increasing privatisation of education in the province of British Columbia, Canada. Conceptually, the paper is informed by theories of privatisation and social justice; and methodologically, it uses policy analysis to examine documents and financial records obtained from government departments. The paper critically analyses…

  20. Integrating Technologies, Methodologies, and Databases into a Comprehensive Terminology Management Environment to Support Interoperability among Clinical Information Systems

    ERIC Educational Resources Information Center

    Shakib, Shaun Cameron

    2013-01-01

    Controlled clinical terminologies are essential to realizing the benefits of electronic health record systems. However, implementing consistent and sustainable use of terminology has proven to be both intellectually and practically challenging. First, this project derives a conceptual understanding of the scope and intricacies of the challenge by…

  1. Exploring Modes of Communication among Pupils in Brazil: Gender Issues in Academic Performance

    ERIC Educational Resources Information Center

    Teixeira, Adla B. M.; Villani, Carlos E.; do Nascimento, Silvania S.

    2008-01-01

    The objective of this study was to identify gender issues in the academic performance of boys and girls during physics classes in a laboratory. The methodology adopted was the observation and interactions of pupils during eight classroom events. The interactions were recorded and events were informally discussed with the teacher. The school…

  2. New-Model Scholarship: How Will It Survive? Optimizing Collections and Services for Scholarly Use.

    ERIC Educational Resources Information Center

    Smith, Abby

    This report explores the following types of emerging scholarship: (1) experimental--designed to develop and model a methodology for generating recorded information about a historical event or an academic discipline that might otherwise go undocumented; (2) open-ended--generates digital objects that are intended to be added to over time; (3)…

  3. Discovering Decision Knowledge from Web Log Portfolio for Managing Classroom Processes by Applying Decision Tree and Data Cube Technology.

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune

    2000-01-01

    Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)

  4. Exploring the Micro-Social Geography of Children's Interactions in Preschool: A Long-Term Observational Study and Analysis Using Geographic Information Technologies

    ERIC Educational Resources Information Center

    Torrens, Paul M.; Griffin, William A.

    2013-01-01

    The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…

  5. Chief Information Officer's Role in Adopting an Interoperable Electronic Health Record System for Medical Data Exchange

    ERIC Educational Resources Information Center

    Akpabio, Akpabio Enebong Ema

    2013-01-01

    Despite huge growth in hospital technology systems, there remains a dearth of literature examining health care administrator's perceptions of the efficacy of interoperable EHR systems. A qualitative research methodology was used in this multiple-case study to investigate the application of diffusion of innovations theory and the technology…

  6. Data Analysis Methods for Library Marketing

    NASA Astrophysics Data System (ADS)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, where people's needs and requests concerning information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to fulfil such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.
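
A first circulation-record analysis of the kind the paper describes can be as simple as profiling which subjects each patron group actually borrows. The loan records below are invented for illustration; a real analysis would draw on the library system's loan log.

```python
from collections import Counter

# Hypothetical circulation records: (patron_group, subject_classification).
loans = [
    ("undergraduate", "engineering"),
    ("undergraduate", "engineering"),
    ("undergraduate", "literature"),
    ("graduate", "engineering"),
    ("graduate", "medicine"),
    ("faculty", "medicine"),
]

# Borrowing profile per patron group: which subjects each group uses most.
profile = {}
for group, subject in loans:
    profile.setdefault(group, Counter())[subject] += 1

for group, counts in sorted(profile.items()):
    top, n = counts.most_common(1)[0]
    print(f"{group}: top subject = {top} ({n} loans)")
```

Such profiles are the raw material for marketing decisions like targeted acquisitions or subject-specific outreach.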

  7. A multidimensional approach to case mix for home health services

    PubMed Central

    Manton, Kenneth G.; Hausner, Tony

    1987-01-01

    Developing a case-mix methodology for home health services is more difficult than developing one for hospitalization and acute health services, because the determinants of need for home health care are more complex and because of the difficulty in defining episodes of care. To evaluate home health service case mix, a multivariate grouping methodology was applied to records from the 1982 National Long-Term Care Survey linked to Medicare records on home health reimbursements. Using this method, six distinct health and functional status dimensions were identified. These dimensions, combined with factors describing informal care resources and local market conditions, were used to explain significant proportions of the variance (r² = 0.45) of individual differences in Medicare home health reimbursements and numbers of visits. Though the data were not collected for that purpose, the high level of prediction strongly suggests the feasibility of developing case-mix strategies for home health services. PMID:10312187

  8. A multidimensional approach to case mix for home health services.

    PubMed

    Manton, K G; Hausner, T

    1987-01-01

    Developing a case-mix methodology for home health services is more difficult than developing one for hospitalization and acute health services, because the determinants of need for home health care are more complex and because of the difficulty in defining episodes of care. To evaluate home health service case mix, a multivariate grouping methodology was applied to records from the 1982 National Long-Term Care Survey linked to Medicare records on home health reimbursements. Using this method, six distinct health and functional status dimensions were identified. These dimensions, combined with factors describing informal care resources and local market conditions, were used to explain significant proportions of the variance (r² = 0.45) of individual differences in Medicare home health reimbursements and numbers of visits. Though the data were not collected for that purpose, the high level of prediction strongly suggests the feasibility of developing case-mix strategies for home health services.

  9. Hidden Markov model analysis of force/torque information in telemanipulation

    NASA Technical Reports Server (NTRS)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    A model for the prediction and analysis of sensor information recorded during robotic performance of telemanipulation tasks is presented. The model uses the hidden Markov model to describe the task structure, the operator's or intelligent controller's goal structure, and the sensor signals. A methodology for constructing the model parameters based on engineering knowledge of the task is described. It is concluded that the model and its optimal state estimation algorithm, the Viterbi algorithm, are very successful at the task of segmenting the data record into phases corresponding to subgoals of the task. The model provides a rich modeling structure within a statistical framework, which enables it to represent complex systems and be robust to real-world sensory signals.
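
The segmentation the authors obtain with the Viterbi algorithm can be illustrated with a minimal discrete-output sketch: given a toy two-phase task model (parameters invented here, not taken from the paper), the algorithm recovers the most likely phase for each sensor sample.

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    """Most likely hidden-state path (in log space) for a discrete-output HMM."""
    n_states, T = len(start), len(obs)
    logdelta = np.log(start) + np.log(emit[:, obs[0]])
    backptr = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        cand = logdelta[:, None] + np.log(trans)  # cand[i, j]: best score entering j from i
        backptr[t] = cand.argmax(axis=0)
        logdelta = cand.max(axis=0) + np.log(emit[:, obs[t]])
    path = [int(logdelta.argmax())]               # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Toy two-phase task: phase 0 mostly emits symbol 0, phase 1 mostly symbol 1.
start = np.array([0.9, 0.1])
trans = np.array([[0.8, 0.2], [0.2, 0.8]])
emit  = np.array([[0.9, 0.1], [0.1, 0.9]])

print(viterbi([0, 0, 0, 1, 1, 1], start, trans, emit))  # [0, 0, 0, 1, 1, 1]
```

The recovered path is exactly the phase boundary a human would draw, which is the sense in which the model "segments the data record into phases".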

  10. Chromosome aberrations in workers occupationally exposed to tritium.

    PubMed

    Tawn, E Janet; Curwen, Gillian B; Riddell, Anthony E

    2018-06-01

    This paper reports the findings of an historical chromosome analysis for unstable aberrations, undertaken on 34 nuclear workers with monitored exposure to tritium. The mean recorded β-particle dose from tritium was 9.33 mGy (range 0.25-79.71 mGy) and the mean occupational dose from external, mainly γ-ray, irradiation was 1.94 mGy (range 0.00-7.71 mGy). The dicentric frequency of 1.91 ± 0.53 × 10⁻³ per cell was significantly raised, in comparison with that of 0.61 ± 0.30 × 10⁻³ per cell for a group of 66 comparable worker controls unexposed to occupational radiation. The frequency of total aberrations was also significantly higher in the tritium workers. Comparisons with in vitro studies indicate that at these dose levels an increase in aberration frequency is not expected. However, the available historical tritium dose records were produced for the purposes of radiological protection and based on a methodology that has since been updated, so tritium doses are subject to considerable uncertainty. It is therefore recommended that, if possible, tritium doses are reassessed using information on historical recording practices in combination with current dosimetry methodology, and that further chromosome studies are undertaken using modern FISH techniques to establish stable aberration frequencies, as these will provide information on a cumulative biological effect.

  11. Compliance With Electronic Medical Records Privacy Policy: An Empirical Investigation of Hospital Information Technology Staff

    PubMed Central

    Sher, Ming-Ling; Talley, Paul C.; Yang, Ching-Wen; Kuo, Kuang-Ming

    2017-01-01

    The employment of Electronic Medical Records is expected to better enhance health care quality and to relieve increased financial pressure. Electronic Medical Records are, however, potentially vulnerable to security breaches that may result in a rise of patients' privacy concerns. The purpose of our study was to explore the factors that motivate hospital information technology staff's compliance with Electronic Medical Records privacy policy from the theoretical lenses of protection motivation theory and the theory of reasoned action. The study collected data using survey methodology. A total of 310 responses from information technology staff of 7 medical centers in Taiwan were analyzed using the Structural Equation Modeling technique. The results revealed that perceived vulnerability and perceived severity of threats from Electronic Medical Records breaches may be used to predict the information technology staff's fear arousal level. Factors including fear arousal, response efficacy, self-efficacy, and subjective norm, in turn, significantly predicted IT staff's behavioral intention to comply with privacy policy. Response cost was not found to have any relationship with behavioral intention. Based on the findings, we suggest that hospitals could plan and design effective strategies, such as initiating privacy-protection awareness and skills training programs, to improve information technology staff members' adherence to privacy policy. Furthermore, enhancing the privacy-protection climate in hospitals is also a viable means to that end. Further practical and research implications are also discussed.

  12. Knowledge Representation and Communication: Imparting Current State Information Flow to CPR Stakeholders

    PubMed Central

    de la Cruz, Norberto B.; Spiece, Leslie J.

    2000-01-01

    Understanding and communicating the who, what, where, when, why, and how of the clinics and services for which the computerized patient record (CPR) will be built is an integral part of the implementation process. Formal methodologies have been developed to diagram information flow: flow charts, state-transition diagrams (STDs), and data flow diagrams (DFDs). For documentation of the processes at our ambulatory CPR pilot site, flowcharting was selected as the preferred method based upon its versatility and understandability.

  13. A cross-sectional survey of 5-year-old children with non-syndromic unilateral cleft lip and palate: the Cleft Care UK study. Part 1: background and methodology

    PubMed Central

    Persson, M; Sandy, J R; Waylen, A; Wills, A K; Al-Ghatam, R; Ireland, A J; Hall, A J; Hollingworth, W; Jones, T; Peters, T J; Preston, R; Sell, D; Smallridge, J; Worthington, H; Ness, A R

    2015-01-01

    Structured Abstract Objectives We describe the methodology for a major study investigating the impact of reconfigured cleft care in the United Kingdom (UK) 15 years after an initial survey, detailed in the Clinical Standards Advisory Group (CSAG) report in 1998, had informed government recommendations on centralization. Setting and Sample Population This is a UK multicentre cross-sectional study of 5-year-olds born with non-syndromic unilateral cleft lip and palate. Children born between 1 April 2005 and 31 March 2007 were seen in cleft centre audit clinics. Materials and Methods Consent was obtained for the collection of routine clinical measures (speech recordings, hearing, photographs, models, oral health, psychosocial factors) and anthropometric measures (height, weight, head circumference). The methodology for each clinical measure followed those of the earlier survey as closely as possible. Results We identified 359 eligible children and recruited 268 (74.7%) to the study. Eleven separate records for each child were collected at the audit clinics. In total, 2666 (90.4%) were collected from a potential 2948 records. The response rates for the self-reported questionnaires, completed at home, were 52.6% for the Health and Lifestyle Questionnaire and 52.2% for the Satisfaction with Service Questionnaire. Conclusions Response rates and measures were similar to those achieved in the previous survey. There are practical, administrative and methodological challenges in repeating cross-sectional surveys 15 years apart and producing comparable data. PMID:26567851

  14. Data collection and information presentation for optimal decision making by clinical managers--the Autocontrol Project.

    PubMed Central

    Grant, A. M.; Richard, Y.; Deland, E.; Després, N.; de Lorenzi, F.; Dagenais, A.; Buteau, M.

    1997-01-01

    The Autocontrol methodology has been developed to support the optimisation of decision making and the use of resources in the context of a clinical unit. The theoretical basis relates to quality assurance and information systems and is influenced by management and cognitive research in the health domain. The methodology uses population rather than individual decision making and, because of its dynamic feedback design, promises to have a rapid and profound effect on practice. Most importantly, the health care professional is the principal user of the Autocontrol system. In this methodology we distinguish three types of evidence necessary for practice change: practice-based or internal evidence; best evidence derived from the literature, or external evidence, concerning the practice in question; and process-based evidence on how to optimise the process of practice change. The software used by the system is of the executive decision support type, which facilitates interrogation of large databases. The Autocontrol system is designed to interrogate the data of the patient medical record; however, the latter often lacks data on concomitant resource use, and this must be supplemented. This paper reviews the Autocontrol methodology and gives examples from current studies. PMID:9357733

  15. Data collection and information presentation for optimal decision making by clinical managers--the Autocontrol Project.

    PubMed

    Grant, A M; Richard, Y; Deland, E; Després, N; de Lorenzi, F; Dagenais, A; Buteau, M

    1997-01-01

    The Autocontrol methodology has been developed to support the optimisation of decision making and the use of resources in the context of a clinical unit. The theoretical basis relates to quality assurance and information systems and is influenced by management and cognitive research in the health domain. The methodology uses population rather than individual decision making and, because of its dynamic feedback design, promises to have a rapid and profound effect on practice. Most importantly, the health care professional is the principal user of the Autocontrol system. In this methodology we distinguish three types of evidence necessary for practice change: practice-based or internal evidence; best evidence derived from the literature, or external evidence, concerning the practice in question; and process-based evidence on how to optimise the process of practice change. The software used by the system is of the executive decision support type, which facilitates interrogation of large databases. The Autocontrol system is designed to interrogate the data of the patient medical record; however, the latter often lacks data on concomitant resource use, and this must be supplemented. This paper reviews the Autocontrol methodology and gives examples from current studies.

  16. An Innovative Methodology for Capturing Young Children's Curiosity, Imagination and Voices Using a Free App: Our Story

    ERIC Educational Resources Information Center

    Canning, Natalie; Payler, Jane; Horsley, Karen; Gomez, Chris

    2017-01-01

    This study explores children's narratives of their curiosity and imagination through innovative use of an information technology app--Our Story. Novel use of the app allowed children to express and record their opinions they considered significant to them. The research captured children's approaches to everyday situations through their play.…

  17. Healthcare Information Systems for the epidemiologic surveillance within the community.

    PubMed

    Diomidous, Marianna; Pistolis, John; Mechili, Aggelos; Kolokathi, Aikaterini; Zimeras, Stelios

    2013-01-01

    Public health and health care are important issues for developing countries, and access to health care is a significant factor that contributes to a healthy population. In response to these issues, the World Health Organization (WHO) has been working on the development of methods and models for measuring physical accessibility to health care using several layers of information integrated in a GIS. This paper describes the methodological approach for the development of a real-time electronic health record, based on statistical and geographic information, for the identification of various diseases and accidents that can occur in a specific place.

  18. 78 FR 79418 - Agency Information Collection Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-30

    ...The Department of Energy (DOE), pursuant to the Paperwork Reduction Act of 1995, intends to extend for three years an information collection request (OMB Control Number 1910-1700) with the Office of Management and Budget (OMB). The proposed voluntary collection will request that an individual or an authorized designee provide pertinent information for easy record retrieval, allowing for increased efficiencies and quicker processing. Pertinent information includes the requester's name, shipping address, phone number, email address, previous work location, the action requested, and any identifying data that will help locate the records (e.g., maiden name, occupational license number, time and place of employment). Comments are invited on: (a) whether the extended collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

  19. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    PubMed

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research oriented data along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specifications of software requirements was adopted. An electronic health record (EHR) standard, EN13606 was used, and clinical modelling was done through archetypes and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data including the assessment scales provides a rich source of objective data for audits and research and to establish study feasibility and identify potential participants for the clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  20. A multichannel model-based methodology for extubation readiness decision of patients on weaning trials.

    PubMed

    Casaseca-de-la-Higuera, Pablo; Simmross-Wattenberg, Federico; Martín-Fernández, Marcos; Alberola-López, Carlos

    2009-07-01

    Discontinuation of mechanical ventilation is a challenging task that involves a number of subtle clinical issues. The gradual removal of the respiratory support (referred to as weaning) should be performed as soon as autonomous respiration can be sustained. However, the prediction rate of successful extubation is still below 25% based on previous studies. Construction of an automatic system that provides information on extubation readiness is thus desirable. Recent works have demonstrated that breathing pattern variability is a useful extubation readiness indicator, with improving performance when multiple respiratory signals are jointly processed. However, the existing methods for predictor extraction present several drawbacks when length-limited time series are to be processed in heterogeneous groups of patients. In this paper, we propose a model-based methodology for automatic readiness prediction. It is intended to deal with multichannel, nonstationary, short records of the breathing pattern. Results on experimental data yield an 87.27% rate of successful readiness prediction, which is in line with the best figures reported in the literature. A comparative analysis shows that our methodology overcomes the shortcomings of previously proposed methods when applied to length-limited records of heterogeneous groups of patients.

  1. Electronic palliative care coordination systems: Devising and testing a methodology for evaluating documentation

    PubMed Central

    Allsop, Matthew J; Kite, Suzanne; McDermott, Sarah; Penn, Naomi; Millares-Martin, Pablo; Bennett, Michael I

    2016-01-01

    Background: The need to improve coordination of care at the end of life has driven electronic palliative care coordination systems implementation across the United Kingdom and internationally. No approaches for evaluating electronic palliative care coordination systems use in practice have been developed. Aim: This study outlines and applies an evaluation framework for examining how and when electronic documentation of advance care planning is occurring in end of life care services. Design: A pragmatic, formative process evaluation approach was adopted. The evaluation drew on the Project Review and Objective Evaluation methodology to guide the evaluation framework design, focusing on clinical processes. Setting/participants: Data were extracted from electronic palliative care coordination systems for 82 of 108 general practices across a large UK city. All deaths (n = 1229) recorded on electronic palliative care coordination systems between April 2014 and March 2015 were included to determine the proportion of all deaths recorded, the median number of days prior to death that key information was recorded, and observations about routine data use. Results: The evaluation identified 26.8% of all deaths recorded on electronic palliative care coordination systems. The median number of days to death was calculated for initiation of an electronic palliative care coordination systems record (31 days), recording a patient's preferred place of death (8 days) and entry of Do Not Attempt Cardiopulmonary Resuscitation decisions (34 days). Where preferred and actual place of death were documented, these matched for 75% of patients. Anomalies were identified in coding used during data entry on electronic palliative care coordination systems. Conclusion: This study reports the first methodology for evaluating how and when electronic palliative care coordination systems documentation is occurring.
It raises questions about what can be drawn from routine data collected through electronic palliative care coordination systems and outlines considerations for future evaluation. Future evaluations should consider work processes of health professionals using electronic palliative care coordination systems. PMID:27507636
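
The evaluation metrics described above (proportion of deaths recorded on the system and median days between record initiation and death) reduce to straightforward date arithmetic. A sketch with invented register entries:

```python
from datetime import date
from statistics import median

# Hypothetical register entries: (date record initiated, date of death).
register = [
    (date(2014, 5, 1),  date(2014, 6, 10)),
    (date(2014, 7, 20), date(2014, 8, 1)),
    (date(2014, 9, 3),  date(2014, 9, 30)),
]
deaths_total = 10   # all deaths in the catchment, known from another source

proportion_recorded = len(register) / deaths_total
days_before_death = [(died - opened).days for opened, died in register]

print(f"recorded: {proportion_recorded:.1%}")                 # recorded: 30.0%
print(f"median days to death: {median(days_before_death)}")   # median days to death: 27
```

The same computation applies to each key-information field (preferred place of death, DNACPR entry) by substituting that field's entry date for the record-initiation date.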

  2. Electronic palliative care coordination systems: Devising and testing a methodology for evaluating documentation.

    PubMed

    Allsop, Matthew J; Kite, Suzanne; McDermott, Sarah; Penn, Naomi; Millares-Martin, Pablo; Bennett, Michael I

    2017-05-01

    The need to improve coordination of care at the end of life has driven electronic palliative care coordination systems implementation across the United Kingdom and internationally. No approaches for evaluating electronic palliative care coordination systems use in practice have been developed. This study outlines and applies an evaluation framework for examining how and when electronic documentation of advance care planning is occurring in end of life care services. A pragmatic, formative process evaluation approach was adopted. The evaluation drew on the Project Review and Objective Evaluation methodology to guide the evaluation framework design, focusing on clinical processes. Data were extracted from electronic palliative care coordination systems for 82 of 108 general practices across a large UK city. All deaths (n = 1229) recorded on electronic palliative care coordination systems between April 2014 and March 2015 were included to determine the proportion of all deaths recorded, the median number of days prior to death that key information was recorded, and observations about routine data use. The evaluation identified 26.8% of all deaths recorded on electronic palliative care coordination systems. The median number of days to death was calculated for initiation of an electronic palliative care coordination systems record (31 days), recording a patient's preferred place of death (8 days) and entry of Do Not Attempt Cardiopulmonary Resuscitation decisions (34 days). Where preferred and actual place of death were documented, these matched for 75% of patients. Anomalies were identified in coding used during data entry on electronic palliative care coordination systems. This study reports the first methodology for evaluating how and when electronic palliative care coordination systems documentation is occurring.
It raises questions about what can be drawn from routine data collected through electronic palliative care coordination systems and outlines considerations for future evaluation. Future evaluations should consider work processes of health professionals using electronic palliative care coordination systems.

  3. Central and Divided Visual Field Presentation of Emotional Images to Measure Hemispheric Differences in Motivated Attention.

    PubMed

    O'Hare, Aminda J; Atchley, Ruth Ann; Young, Keith M

    2017-11-16

    Two dominant theories on lateralized processing of emotional information exist in the literature. One theory posits that unpleasant emotions are processed by right frontal regions, while pleasant emotions are processed by left frontal regions. The other theory posits that the right hemisphere is more specialized for the processing of emotional information overall, particularly in posterior regions. Assessing the different roles of the cerebral hemispheres in processing emotional information can be difficult without the use of neuroimaging methodologies, which are not accessible or affordable to all scientists. Divided visual field presentation of stimuli allows for the investigation of lateralized processing of information without the use of neuroimaging technology. This study compared central versus divided visual field presentations of emotional images to assess differences in motivated attention between the two hemispheres. The late positive potential (LPP) was recorded using electroencephalography (EEG) and event-related potential (ERP) methods to assess motivated attention. Future work will pair this paradigm with a more active behavioral task to explore the behavioral impacts of the attentional differences found.

  4. Linked Records of Children with Traumatic Brain Injury. Probabilistic Linkage without Use of Protected Health Information.

    PubMed

    Bennett, T D; Dean, J M; Keenan, H T; McGlincy, M H; Thomas, A M; Cook, L J

    2015-01-01

    Record linkage may create powerful datasets with which investigators can conduct comparative effectiveness studies evaluating the impact of tests or interventions on health. All linkages of health care data files to date have used protected health information (PHI) in their linkage variables. A technique to link datasets without using PHI would be advantageous both to preserve privacy and to increase the number of potential linkages. We applied probabilistic linkage to records of injured children in the National Trauma Data Bank (NTDB, N = 156,357) and the Pediatric Health Information Systems (PHIS, N = 104,049) databases from 2007 to 2010. Forty-nine match variables without PHI were used, many of them administrative variables and indicators for procedures recorded as International Classification of Diseases, 9th revision, Clinical Modification codes. We validated the accuracy of the linkage using identified data from a single center that submits to both databases. We accurately linked the PHIS and NTDB records for 69% of children with any injury, and 88% of those with severe traumatic brain injury eligible for a study of intervention effectiveness (positive predictive value of 98%, specificity of 99.99%). Accurate linkage was associated with longer lengths of stay, more severe injuries, and multiple injuries. In populations with substantial illness or injury severity, accurate record linkage may be possible in the absence of PHI. This methodology may enable linkages and, in turn, comparative effectiveness studies that would be unlikely or impossible otherwise.
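    The study's 49 non-PHI match variables and its exact weighting scheme are not given in the abstract, but probabilistic linkage of this kind is commonly scored with Fellegi-Sunter-style log-likelihood weights, where m = P(agree | true match) and u = P(agree | non-match). A simplified sketch with invented field names and illustrative m/u values, not the paper's:

```python
import math

# Illustrative agreement probabilities for hypothetical non-PHI fields.
WEIGHTS = {
    "age_years":        {"m": 0.95, "u": 0.02},
    "admission_month":  {"m": 0.98, "u": 0.08},
    "icd9_procedure":   {"m": 0.90, "u": 0.01},
    "discharge_status": {"m": 0.97, "u": 0.30},
}

def link_score(rec_a, rec_b):
    """Sum log2 likelihood ratios over shared non-PHI fields; higher
    scores indicate a more probable true match."""
    score = 0.0
    for field, w in WEIGHTS.items():
        if rec_a.get(field) == rec_b.get(field):
            score += math.log2(w["m"] / w["u"])          # agreement weight
        else:
            score += math.log2((1 - w["m"]) / (1 - w["u"]))  # disagreement weight
    return score

a = {"age_years": 7, "admission_month": 3,
     "icd9_procedure": "01.24", "discharge_status": "home"}
b = dict(a)                                       # agrees on every field
c = dict(a, icd9_procedure="96.71", admission_month=9)  # disagrees on two
s_match = link_score(a, b)
s_nonmatch = link_score(a, c)
```

Record pairs whose score exceeds a calibrated threshold are declared links; the threshold trades off positive predictive value against sensitivity.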

  5. Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.

    PubMed

    Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A

    2002-10-01

    Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.

  6. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals by representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real-time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic, distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
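    The core loop of such a protocol, binarizing a signal into events and delivering stimulation whenever a chosen code appears, can be sketched offline. The threshold rule and the example code below are illustrative assumptions, not the toolbox's actual implementation:

```python
def binarize(signal, threshold):
    """Turn a sampled signal into a binary event sequence:
    1 when the sample reaches threshold (an event), 0 otherwise."""
    return [1 if s >= threshold else 0 for s in signal]

def code_driven_stimulation(events, code):
    """Scan the event sequence; whenever the target code (a specific
    pattern of events and silences) has just occurred, trigger
    stimulation. Returns the indices where stimulation is delivered."""
    n = len(code)
    triggers = []
    for i in range(n, len(events) + 1):
        if events[i - n:i] == code:
            triggers.append(i - 1)   # stimulate at the code's last event
    return triggers

# toy example: stimulate whenever the pattern 1,0,1 is observed
events = binarize([0.2, 0.9, 0.1, 0.8, 0.7, 0.05, 0.95], threshold=0.5)
stims = code_driven_stimulation(events, code=[1, 0, 1])  # → [3, 6]
```

In a real closed-loop setting the scan runs incrementally as samples arrive, with hard real-time constraints on the detection-to-stimulation latency.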

  8. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability analysis, and its physiological interpretation as a marker of autonomic nervous system condition, have been widely published for resting conditions, but much less so for exercise. A methodological framework for heart rate variability (HRV) analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). It is applied to 23 male subjects who underwent maximal and submaximal running and cycling tests, during which the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates obtained with the proposed methodology differ substantially from those based on the standard fixed band: for medium and high exercise levels and during recovery, HF power is 20 to 40% higher. When cycling, HF power is around 40% higher than when running, while CC power is around 20% stronger in running.
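    The key methodological point, centring the HF band on the measured respiratory frequency instead of using the fixed 0.15-0.4 Hz band, can be illustrated with a toy spectral computation. The periodogram below is a simplification of the framework (which also handles non-stationarity and cardiolocomotor coupling); the 0.45 Hz respiratory rate is an invented example of breathing above the standard band, as can happen during exercise:

```python
import numpy as np

def hf_power(rr_resampled, fs, resp_freq, half_width=0.125):
    """Spectral power of an evenly resampled RR-interval series inside an
    HF band centred on the respiratory frequency. Rough periodogram sketch."""
    x = rr_resampled - np.mean(rr_resampled)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    band = (freqs >= resp_freq - half_width) & (freqs <= resp_freq + half_width)
    return np.sum(psd[band]) * (freqs[1] - freqs[0])

# synthetic RR series with respiration-driven oscillation at 0.45 Hz,
# i.e. above the 0.4 Hz upper limit of the standard fixed band
fs = 4.0                                   # resampling rate, Hz
t = np.arange(0, 120, 1 / fs)
rr = 0.60 + 0.02 * np.sin(2 * np.pi * 0.45 * t)

p_resp_centred = hf_power(rr, fs, resp_freq=0.45)                    # captures it
p_fixed_band = hf_power(rr, fs, resp_freq=0.275, half_width=0.125)   # 0.15-0.4 Hz
```

With breathing above 0.4 Hz, the fixed band misses nearly all respiratory power, which is the kind of discrepancy the abstract quantifies at 20-40%.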

  9. Technical Assistance for the Conservation of Built Heritage at Bagan, Myanmar

    NASA Astrophysics Data System (ADS)

    Mezzino, D.; Santana Quintero, M.; Ma Pwint, P.; Tin Htut Latt, W.; Rellensmann, C.

    2016-06-01

    Presenting the outcomes of a capacity-building activity, this contribution illustrates a replicable recording methodology for obtaining timely, relevant and accurate information about the conditions, materials and transformations of heritage structures. The purpose of the training activity was to develop local capabilities for documenting the built heritage at Bagan, Myanmar, employing different IT-supported techniques. Under the direction of UNESCO, with the direct supervision of the chief of its culture unit, and in close consultation and cooperation with the Association of Myanmar Architects and the Department of Archaeology, National Museum and Library (DoA), a documentation strategy was developed in order to set up a recording methodology for the more than three thousand Bagan monuments. The site, located in central Myanmar, in South East Asia, developed between the 9th and the 13th century as the capital of the Myanmar kingdom. In recent years, this outstanding site has been exposed to an increasing number of natural hazards, including earthquakes and flooding, that have strongly affected its built structures. A documentation strategy was therefore needed to quickly capture the shape, color, geometry and conditions of the monuments in order to develop proper conservation projects. The scope of the training activity was to set up a recording strategy updating the existing Bagan inventory, using three Buddhist temples as pilot case studies. The three documented temples differed in size, construction period, condition and shape. The documentation employed several IT-supported techniques, including Electronic Distance Measurements (EDM), SFM photogrammetry, laser scanning and record photography, as well as hand measurement and field notes. The surveying of the monuments was conducted in accordance with the guidelines and standards established by the ICOMOS International Committee for Documentation of Cultural Heritage (CIPA). 
Recommendations on how to extend the adopted methodology to the other Bagan monuments have also been elaborated.

  10. Nursing record systems: effects on nursing practice and health care outcomes.

    PubMed

    Currell, R; Urquhart, C

    2003-01-01

    A nursing record system is the record of care planned and/or given to individual patients/clients by qualified nurses or other caregivers under the direction of a qualified nurse. Nursing record systems may be an effective way of influencing nurse practice. To assess the effects of nursing record systems on nursing practice and patient outcomes. We searched The Cochrane Library, the EPOC trial register (October 2002), MEDLINE, Cinahl, Sigle, and databases of the Royal College of Nursing, King's Fund, the NHS Centre for Reviews and Dissemination, and the Institute of Electrical Engineers up to August 1999; and OCLC First Search, Department of Health database, NHS Register of Computer Applications and the Health Visitors' Association database up to the end of 1995. We hand searched the Journal of Nursing Administration (1971-1999), Computers in Nursing (1984-1999), Information Technology in Nursing (1989-1999) and reference lists of articles. We also hand searched the major health informatics conference proceedings. We contacted experts in the field of nursing informatics, suppliers of nursing computer systems, and relevant Internet groups. To update the review the Medline, Cinahl, British Nursing Index, Aslib Index to Theses databases were all searched from 1998 to 2002. The Journal of Nursing Administration, Computers in Nursing, Information Technology in Nursing were all hand searched up to 2002. The searches of the other databases and grey literature included in the original review, were not updated (except for Health Care Computing Conference and Med Info) as the original searches produced little relevant material. Randomised trials, controlled before and after studies and interrupted time series comparing one kind of nursing record system with another, in hospital, community or primary care settings. 
The participants were qualified nurses, students or health care assistants working under the direction of a qualified nurse, and patients receiving care recorded and/or planned using nursing record systems. Two reviewers independently assessed trial quality and extracted data. Eight trials involving 1497 people were included. In three studies of client-held records, there were no overall positive or negative effects, although some administrative benefits through fewer missing notes were suggested. A paediatric pain management sheet study showed a positive effect on the children's pain intensity. A computerised nursing care planning study showed a negative effect on documented nursing care planning, although two other computerised nursing information studies showed an increase in recording but no change in patient outcomes. Care planning took longer with these computerised systems, but the numbers of patients and nurses included in these studies were small. A controlled before-and-after study of two paper nursing record systems showed improvement in meeting documentation standards. No evidence was found of effects on practice attributable to changes in record systems. Although there is a paucity of studies of sufficient methodological rigour to yield reliable results in this area, it is clear from the literature that it is possible to set up the randomised trials or other quasi-experimental designs needed to produce evidence for practice. The research undertaken so far may have suffered both from methodological problems and from faulty hypotheses. Qualitative nursing research to explore the relationship between practice and information use could be used as a precursor to the design and testing of nursing information systems.

  11. A flexible data-driven comorbidity feature extraction framework.

    PubMed

    Sideris, Costas; Pourhomayoun, Mohammad; Kalantarian, Haik; Sarrafzadeh, Majid

    2016-06-01

    Disease and symptom diagnostic codes are a valuable resource for classifying and predicting patient outcomes. In this paper, we propose a novel methodology for utilizing disease diagnostic information in a predictive machine learning framework. Our methodology relies on a novel, clustering-based feature extraction framework using disease diagnostic information. To reduce the data dimensionality, we identify disease clusters using co-occurrence statistics. We optimize the number of generated clusters in the training set and then utilize these clusters as features to predict patient severity of condition and patient readmission risk. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS) of the Healthcare Cost and Utilization Project (HCUP), which contains 7 million hospital discharge records with ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3041 Congestive Heart Failure (CHF) patients and on the UCI 130-US hospitals diabetes dataset, which includes admissions from 69,980 diabetic patients. We compare our cluster-based feature set with commonly used comorbidity frameworks, including Charlson's index, Elixhauser's comorbidities and their variations. The proposed approach showed significant gains of 10.7-22.1% in predictive accuracy for CHF severity of condition prediction and 4.65-5.75% in diabetes readmission prediction.
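    The abstract's co-occurrence-based clustering can be approximated with a simple union-find over code pairs that co-occur often enough. This is an illustrative stand-in, not the authors' algorithm, and the admissions and threshold below are invented:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_clusters(admissions, min_count=2):
    """Group diagnostic codes that co-occur in at least `min_count`
    admissions, using a small union-find over code pairs."""
    pairs = Counter()
    for codes in admissions:
        pairs.update(combinations(sorted(set(codes)), 2))

    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    for (a, b), n in pairs.items():
        if n >= min_count:
            parent[find(a)] = find(b)       # union the two codes

    clusters = {}
    for codes in admissions:
        for c in set(codes):
            clusters.setdefault(find(c), set()).add(c)
    return list(clusters.values())

# toy admissions with ICD-9-CM-style codes (invented)
admissions = [
    ["428.0", "585.9", "401.9"],   # CHF + CKD + hypertension
    ["428.0", "585.9"],
    ["250.00", "401.9"],
    ["250.00", "362.01"],
    ["250.00", "362.01"],
]
clusters = cooccurrence_clusters(admissions, min_count=2)
```

Each resulting cluster can then serve as one binary feature per patient (any code in the cluster present), reducing dimensionality compared with one feature per raw code.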

  12. Sharply rising prevalence of HIV infection in Bali: a critical assessment of the surveillance data.

    PubMed

    Januraga, P P; Wulandari, L P L; Muliawan, P; Sawitri, S; Causer, L; Wirawan, D N; Kaldor, J M

    2013-08-01

    This study critically examines serological survey data for HIV infection in selected populations in Bali, Indonesia. Sero-survey data reported by the Bali Health Office between 2000 and 2010 were collated, and provincial health staff were interviewed to gain a detailed understanding of survey methods. Analysis of time series restricted to districts that have used the same sampling methods and sites each year indicates that there has been a steady decline in HIV prevalence among prisoners, from 18.7% in 2000 to 4.3% in 2010. In contrast, HIV prevalence among women engaged in sex work increased sharply: from 0.62% in 2000 to 20.2% in 2010 (brothel based), and from 0% in 2000 to 7.2% in 2010 (non-brothel based). The highest prevalence was recorded among people who injected drugs. Recent surveys of gay men and transvestites also found high prevalences, at 18.7% and 40.9%, respectively. Review of the methodology used in the surveys identified inconsistencies in the sampling technique, sample numbers and sites over time, and incomplete recording of individual information about survey participants. Attention to methodological aspects and incorporation of additional information on behavioural factors will ensure that the surveillance system is in the best position to support prevention activities.

  13. Damage Assessment and Monitoring of Cultural Heritage Places in a Disaster and Post-Disaster Event - a Case Study of Syria

    NASA Astrophysics Data System (ADS)

    Vafadari, A.; Philip, G.; Jennings, R.

    2017-08-01

    In recent decades, in response to an increased focus on disastrous events that impact cultural heritage, ranging from armed conflict to natural events, a need has emerged for methodologies and approaches to better manage the effects of disaster on cultural heritage. This paper presents the approaches used in the development of a Historic Environment Record (HER) for Syria. It describes the requirements and methodologies used for systematic emergency recording and assessment of cultural heritage. It also presents the types of information that need to be recorded in the aftermath of disaster to assess the scale of damage and destruction. Started as a project at Durham University, the database is now being developed as part of the EAMENA (Endangered Archaeology in the Middle East and North Africa) project. The core dataset incorporates information and data from archaeological surveys undertaken in Syria by research projects in recent decades and began life as a development of the Shirīn initiative. The focus of this project is to provide a tool not only for the recording and inventory of sites and monuments, but also to record damage and threats and their causes, and to assess their magnitude. It will also record and measure significance in order to prioritize emergency and preservation responses. The database aims to set procedures for carrying out systematic rapid condition assessment (to record damage) and risk assessment (to record threat and level of risk) of heritage places, on the basis of both on-the-ground and remote assessment. Given the large number of heritage properties damaged by conflict, the implementation of rapid assessment methods to quickly identify and record level of damage and condition is essential, as it will provide the evidence to support effective prioritization of efforts and resources, and decisions on the appropriate levels of intervention and methods of treatment. 
The predefined data entry categories, use of a data standard, and systematic methods of assessment will ensure that different users choose from the same prefixed data entry and measurement inputs in order to allow for consistent and comparable assessments across different sites and regions. Given the general lack of appropriate emergency response and assessment databases, this system could also be applied in other locations facing similar threats and damage from conflict or natural disasters.

  14. Analytical optimal pulse shapes obtained with the aid of genetic algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerrero, Rubén D., E-mail: rdguerrerom@unal.edu.co; Arango, Carlos A.; Reyes, Andrés

    2015-09-28

    We propose a methodology to design optimal pulses for achieving quantum optimal control on molecular systems. Our approach constrains pulse shapes to linear combinations of a fixed number of experimentally relevant pulse functions. Quantum optimal control is obtained by maximizing a multi-target fitness function using genetic algorithms. As a first application of the methodology, we generated an optimal pulse that successfully maximized the yield on a selected dissociation channel of a diatomic molecule. Our pulse is obtained as a linear combination of linearly chirped pulse functions. Data recorded along the evolution of the genetic algorithm contained important information regarding the interplay between radiative and diabatic processes. We performed a principal component analysis on these data to retrieve the most relevant processes along the optimal path. Our proposed methodology could be useful for performing quantum optimal control on more complex systems by employing a wider variety of pulse shape functions.

  15. Follow-up methods for retrospective cohort studies in New Zealand.

    PubMed

    Fawcett, Jackie; Garrett, Nick; Bates, Michael N

    2002-01-01

    To define a general methodology for maximising the success of follow-up processes for retrospective cohort studies in New Zealand, and to illustrate an approach to developing country-specific follow-up methodologies. We recently conducted a cohort study of mortality and cancer incidence in New Zealand professional fire fighters. A number of methods were used to trace vital status, including matching with records of the New Zealand Health Information Service (NZHIS), pension records of Work and Income New Zealand (WINZ), and electronic electoral rolls. Non-electronic methods included use of paper electoral rolls and the records of the Registrar of Births Deaths and Marriages. 95% of the theoretical person-years of follow-up of the cohort were traced using these methods. In terms of numbers of cohort members traced to end of follow-up, the most useful tracing methods were fire fighter employment records, the NZHIS, WINZ, and the electronic electoral rolls. The follow-up process used for the cohort study was highly successful. On the basis of this experience, we propose a generic, but flexible, model for follow-up of retrospective cohort studies in New Zealand. Similar models could be constructed for other countries. Successful follow-up of cohort studies is possible in New Zealand using established methods. This should encourage the use of cohort studies for the investigation of epidemiological issues. Similar models for follow-up processes could be constructed for other countries.

  16. Analysis of dynamic brain oscillations: methodological advances.

    PubMed

    Le Van Quyen, Michel; Bragin, Anatol

    2007-07-01

    In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, oscillations of neuronal networks can be identified from simultaneous, multisite recordings. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings also depends on the development of new mathematical methods that can extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of network oscillations and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what property might be inferred from neuronal signals and potentially productive future directions. This review is part of the INMED and TINS special issue, Physiogenic and pathogenic oscillations: the beauty and the beast, derived from presentations at the annual INMED and TINS symposium (http://inmednet.com).

  17. Using fuzzy data mining to diagnose patients' degrees of melancholia

    NASA Astrophysics Data System (ADS)

    Huang, Yo-Ping; Kuo, Wen-Lin

    2011-06-01

    The common treatments for melancholia are psychotherapy and medication. Psychotherapy, the focus of this study, is limited by time and location: psychiatrists can readily gather information from clinical manifestations during consultations, but find it difficult to collect information from patients' daily conversations or emotional states. A system that enables psychiatrists to capture patients' daily symptoms would therefore be of great help in treatment. This study proposes a fuzzy data mining algorithm that finds association rules among keywords segmented from patients' daily voice/text messages, to help psychiatrists extract useful information before outpatient consultations. Patients with melancholia can use devices such as mobile phones or computers to record their own emotions anytime and anywhere, and then upload the recorded files to a back-end server for further analysis. The analytical results can be used by psychiatrists to diagnose patients' degrees of melancholia. Experimental results are given to verify the effectiveness of the proposed methodology.
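    Fuzzy association rule mining of this kind typically generalizes support and confidence by replacing crisp set membership with membership degrees combined through a min operator. A minimal sketch with invented keywords and membership values, not the paper's data or exact algorithm:

```python
def fuzzy_support(messages, itemset):
    """Fuzzy support of a keyword set: each message maps keywords to a
    membership degree in [0, 1]; a message supports the set to the degree
    of its weakest member (min operator)."""
    return sum(min(m.get(k, 0.0) for k in itemset) for m in messages) / len(messages)

def fuzzy_confidence(messages, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent under fuzzy support."""
    return fuzzy_support(messages, antecedent | consequent) / fuzzy_support(messages, antecedent)

# hypothetical membership degrees mined from daily voice/text messages
messages = [
    {"insomnia": 0.8, "fatigue": 0.6, "sad": 0.9},
    {"insomnia": 0.7, "sad": 0.5},
    {"fatigue": 0.4, "sad": 0.3},
    {"insomnia": 0.9, "sad": 0.8},
]
conf = fuzzy_confidence(messages, {"insomnia"}, {"sad"})  # → 0.875
```

Rules whose fuzzy support and confidence exceed chosen thresholds (e.g. "insomnia → sad") would then be surfaced to the psychiatrist before the consultation.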

  18. Fieldwork Methodology in South American Maritime Archaeology: A Critical Review

    NASA Astrophysics Data System (ADS)

    Argüeso, Amaru; Ciarlo, Nicolás C.

    2017-12-01

    In archaeology, data obtained from the analysis of material evidence (i.e., the archaeological record) from extensive excavations have been a significant means for the ultimate development of interpretations about human life in the past. Therefore, the methodological procedures and tools employed during fieldwork are of crucial importance due to their effect on the information likely to be recovered. In the case of maritime archaeology, the development of rigorous methods and techniques allowed for reaching outcomes as solid as those from the work performed on land. These improvements constituted one of the principal supports—if not, the most important pillar—for its acceptance as a scientific field of study. Over time, the growing diversity of sites under study (e.g., shipwrecks, ports, dockyards, and prehistoric settlements) and the underwater environments encountered made it clear that there was a need for the application of specific methodological criteria, in accordance with the particularities of the sites and of each study (e.g., the research aims and the available resources). This article presents some ideas concerning the methodologies used in South American investigations that have exhibited a strong emphasis on the analysis of historical shipwrecks (the sixteenth to twentieth centuries). Based on a state-of-the-knowledge review of these research projects, in particular where excavations were conducted, the article focuses on the details of the main strategies adopted and results achieved. The ideas proposed in this article can be useful as a starting point for future activities of surveying, recording, and excavating shipwrecks.

  19. Information Management for a Large Multidisciplinary Project

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.

    1992-01-01

    In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.

  20. A novel metadata management model to capture consent for record linkage in longitudinal research studies.

    PubMed

    McMahon, Christiana; Denaxas, Spiros

    2017-11-06

    Informed consent is an important feature of longitudinal research studies as it enables the linking of the baseline participant information with administrative data. The lack of standardized models to capture consent elements can lead to substantial challenges. A structured approach to capturing consent-related metadata can address these. a) Explore the state-of-the-art for recording consent; b) Identify key elements of consent required for record linkage; and c) Create and evaluate a novel metadata management model to capture consent-related metadata. The main methodological components of our work were: a) a systematic literature review and qualitative analysis of consent forms; b) the development and evaluation of a novel metadata model. We qualitatively analyzed 61 manuscripts and 30 consent forms. We extracted data elements related to obtaining consent for linkage. We created a novel metadata management model for consent and evaluated it by comparison with the existing standards and by iteratively applying it to case studies. The developed model can facilitate the standardized recording of consent for linkage in longitudinal research studies and enable the linkage of external participant data. Furthermore, it can provide a structured way of recording consent-related metadata and facilitate the harmonization and streamlining of processes.
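    The paper's metadata model itself is not reproduced in the abstract, but the kind of structured consent-for-linkage record it argues for might look like the following sketch. All field names and the linkage-permission rule are hypothetical illustrations, not the published model:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ConsentRecord:
    """Hypothetical consent-for-linkage metadata for one participant."""
    participant_id: str
    consent_given: bool
    date_signed: date
    scope: List[str] = field(default_factory=list)  # datasets linkage is consented for
    expires: Optional[date] = None                  # None = no expiry
    withdrawal_date: Optional[date] = None          # set if consent withdrawn

    def permits_linkage(self, dataset: str, on: date) -> bool:
        """Is linkage to `dataset` permitted on date `on`?"""
        if not self.consent_given or dataset not in self.scope:
            return False
        if self.withdrawal_date and on >= self.withdrawal_date:
            return False
        return self.expires is None or on <= self.expires

rec = ConsentRecord("P001", True, date(2015, 3, 1), scope=["hospital_episodes"])
ok = rec.permits_linkage("hospital_episodes", date(2016, 1, 1))  # → True
```

Capturing scope, expiry and withdrawal as structured fields, rather than free text on a scanned form, is what makes consent machine-checkable at linkage time.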

  1. Validation of the AVM Blast Computational Modeling and Simulation Tool Set

    DTIC Science & Technology

    2015-08-04

    by-construction" methodology is powerful and would not be possible without high -level design languages to support validation and verification. [1,4...to enable the making of informed design decisions.  Enable rapid exploration of the design trade-space for high -fidelity requirements tradeoffs...live-fire tests, the jump height of the target structure is recorded by using either high speed cameras or a string pot. A simple projectile motion

  2. A spatio-temporal landslide inventory for the NW of Spain: BAPA database

    NASA Astrophysics Data System (ADS)

    Valenzuela, Pablo; Domínguez-Cuesta, María José; Mora García, Manuel Antonio; Jiménez-Sánchez, Montserrat

    2017-09-01

A landslide database has been created for the Principality of Asturias, NW Spain: the BAPA (Base de datos de Argayos del Principado de Asturias - Principality of Asturias Landslide Database). Data collection is mainly performed through searching local newspaper archives. Moreover, a BAPA App and a BAPA website (http://geol.uniovi.es/BAPA) have been developed to obtain additional information from citizens and institutions. Presently, the dataset covers the period 1980-2015, recording 2063 individual landslides. The use of free cartographic servers, such as Google Maps, Google Street View and Iberpix (Government of Spain), combined with the spatial descriptions and pictures contained in the press news, makes it possible to assess different levels of spatial accuracy. In the database, 59% of the records show an exact spatial location, and 51% of the records provide accurate dates, showing the usefulness of press archives as temporal records. Thus, 32% of the landslides show the highest spatial and temporal accuracy levels. The database also gathers information about the type and characteristics of the landslides, the triggering factors and the damage and costs caused. Field work was conducted to validate the methodology used in assessing the spatial location, temporal occurrence and characteristics of the landslides.

  3. Supersampling and Network Reconstruction of Urban Mobility.

    PubMed

    Sagarra, Oleguer; Szell, Michael; Santi, Paolo; Díaz-Guilera, Albert; Ratti, Carlo

    2015-01-01

Understanding human mobility is of vital importance for urban planning, epidemiology, and many other fields that draw policies from the activities of humans in space. Despite the recent availability of large-scale data sets of GPS traces or mobile phone records capturing human mobility, typically only a subsample of the population of interest is represented, giving a possibly incomplete picture of the entire system under study. Methods to reliably extract mobility information from such reduced data and to assess their sampling biases are lacking. To that end, we analyzed a data set of millions of taxi movements in New York City. We first show that, once they are appropriately transformed, mobility patterns are highly stable over long time scales. Based on this observation, we develop a supersampling methodology to reliably extrapolate mobility records from a reduced sample based on an entropy maximization procedure, and we propose a number of network-based metrics to assess the accuracy of the predicted vehicle flows. Our approach provides a well-founded way to exploit temporal patterns to save effort in recording mobility data, and opens the possibility to scale up data from limited records when information on the full system is required.
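The entropy-maximization idea behind such supersampling can be illustrated with iterative proportional fitting (IPF), a classic procedure that converges to the maximum-entropy origin-destination matrix consistent with known marginal trip totals. The sampled matrix and totals below are toy values, not the New York City data or the paper's actual algorithm:

```python
import numpy as np

# Toy illustration: scale a sampled origin-destination (OD) trip matrix so
# its row/column sums match full-population totals. IPF converges to the
# maximum-entropy matrix (relative to the sample) satisfying those margins.
def ipf(sample, row_totals, col_totals, iters=100):
    T = sample.astype(float).copy()
    for _ in range(iters):
        T *= (row_totals / T.sum(axis=1))[:, None]   # match row margins
        T *= (col_totals / T.sum(axis=0))[None, :]   # match column margins
    return T

sample = np.array([[5.0, 3.0], [2.0, 4.0]])          # sampled trip counts
flows = ipf(sample,
            row_totals=np.array([80.0, 60.0]),       # known outflow totals
            col_totals=np.array([70.0, 70.0]))       # known inflow totals
print(np.round(flows, 1))
```

The extrapolated matrix preserves the structure of the observed sample while honoring the aggregate constraints, which is the essence of an entropy-maximization extrapolation.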

  4. Search strategies to identify information on adverse effects: a systematic review

    PubMed Central

    Golder, Su; Loke, Yoon

    2009-01-01

    Objectives: The review evaluated studies of electronic database search strategies designed to retrieve adverse effects data for systematic reviews. Methods: Studies of adverse effects were located in ten databases as well as by checking references, hand-searching, searching citations, and contacting experts. Two reviewers screened the retrieved records for potentially relevant papers. Results: Five thousand three hundred thirteen citations were retrieved, yielding 19 studies designed to develop or evaluate adverse effect filters, of which 3 met the inclusion criteria. All 3 studies identified highly sensitive search strategies capable of retrieving over 95% of relevant records. However, 1 study did not evaluate precision, while the level of precision in the other 2 studies ranged from 0.8% to 2.8%. Methodological issues in these papers included the relatively small number of records, absence of a validation set of records for testing, and limited evaluation of precision. Conclusions: The results indicate the difficulty of achieving highly sensitive searches for information on adverse effects with a reasonable level of precision. Researchers who intend to locate studies on adverse effects should allow for the amount of resources and time required to conduct a highly sensitive search. PMID:19404498

  5. A new field-laboratory methodology for assessing human response to noise

    NASA Technical Reports Server (NTRS)

    Borsky, P. N.

    1973-01-01

Gross measures of community annoyance with intrusive noises have been made in a number of real-environment surveys, which indicate that aircraft noise may have to be reduced 30-40 EPNdB before it will generally be considered acceptable. Interview studies, however, cannot provide the precise information that noise-abatement engineers need about the variable human response to different types and degrees of noise exposure. A new methodological field-survey approach has been developed to provide such information. The integrated attitudes and experiences of a random sample of subjects in the real environment are obtained in a prior field survey. These subjects then record their more precise responses to controlled noise exposures in a new realistic laboratory. The laboratory is a sound chamber furnished as a typical living room (18 ft x 14 ft); subjects watch a color TV program while they judge simulated aircraft flyovers that occur at controlled levels and intervals. Methodological experiments indicate that subjects in the laboratory have the sensation that the airplanes are actually moving overhead across the ceiling of the chamber. It was also determined that annoyance judgments in the laboratory stabilize after three flyovers have been heard.

  6. Barriers, Facilitators, and Solutions to Optimal Patient Portal and Personal Health Record Use: A Systematic Review of the Literature

    PubMed Central

    Zhao, Jane Y.; Song, Buer; Anand, Edwin; Schwartz, Diane; Panesar, Mandip; Jackson, Gretchen P.; Elkin, Peter L.

    2017-01-01

    Patient portal and personal health record adoption and usage rates have been suboptimal. A systematic review of the literature was performed to capture all published studies that specifically addressed barriers, facilitators, and solutions to optimal patient portal and personal health record enrollment and use. Consistent themes emerged from the review. Patient attitudes were critical as either barrier or facilitator. Institutional buy-in, information technology support, and aggressive tailored marketing were important facilitators. Interface redesign was a popular solution. Quantitative studies identified many barriers to optimal patient portal and personal health record enrollment and use, and qualitative and mixed methods research revealed thoughtful explanations for why they existed. Our study demonstrated the value of qualitative and mixed research methodologies in understanding the adoption of consumer health technologies. Results from the systematic review should be used to guide the design and implementation of future patient portals and personal health records, and ultimately, close the digital divide. PMID:29854263

  7. Connecticut Highlands Technical Report - Documentation of the Regional Rainfall-Runoff Model

    USGS Publications Warehouse

    Ahearn, Elizabeth A.; Bjerklie, David M.

    2010-01-01

    This report provides the supporting data and describes the data sources, methodologies, and assumptions used in the assessment of existing and potential water resources of the Highlands of Connecticut and Pennsylvania (referred to herein as the “Highlands”). Included in this report are Highlands groundwater and surface-water use data and the methods of data compilation. Annual mean streamflow and annual mean base-flow estimates from selected U.S. Geological Survey (USGS) gaging stations were computed using data for the period of record through water year 2005. The methods of watershed modeling are discussed and regional and sub-regional water budgets are provided. Information on Highlands surface-water-quality trends is presented. USGS web sites are provided as sources for additional information on groundwater levels, streamflow records, and ground- and surface-water-quality data. Interpretation of these data and the findings are summarized in the Highlands study report.

  8. Risk assessment of integrated electronic health records.

    PubMed

    Bjornsson, Bjarni Thor; Sigurdardottir, Gudlaug; Stefansson, Stefan Orri

    2010-01-01

    The paper describes the security concerns related to Electronic Health Records (EHR) both in registration of data and integration of systems. A description of the current state of EHR systems in Iceland is provided, along with the Ministry of Health's future vision and plans. New legislation provides the opportunity for increased integration of EHRs and further collaboration between institutions. Integration of systems, along with greater availability and access to EHR data, requires increased security awareness since additional risks are introduced. The paper describes the core principles of information security as it applies to EHR systems and data. The concepts of confidentiality, integrity, availability, accountability and traceability are introduced and described. The paper discusses the legal requirements and importance of performing risk assessment for EHR data. Risk assessment methodology according to the ISO/IEC 27001 information security standard is described with examples on how it is applied to EHR systems.

  9. Record Linkage Techniques - 1985. Proceedings of the Workshop on Exact Matching Methodologies

    DOT National Transportation Integrated Search

    1985-12-01

    The Workshop on Exact Matching Methodologies was held on May 9-10, 1985, at the Rosslyn Westpark Hotel in Arlington, Virginia. The conference grew out of the efforts of the Matching Group, Administrative Records Subcommittee, of the Federal Committee...

  10. Comprehensible knowledge model creation for cancer treatment decision making.

    PubMed

    Afzal, Muhammad; Hussain, Maqbool; Ali Khan, Wajahat; Ali, Taqdir; Lee, Sungyoung; Huh, Eui-Nam; Farooq Ahmad, Hafiz; Jamshed, Arif; Iqbal, Hassan; Irfan, Muhammad; Abbas Hydari, Manzar

    2017-03-01

    A wealth of clinical data exists in clinical documents in the form of electronic health records (EHRs). This data can be used for developing knowledge-based recommendation systems that can assist clinicians in clinical decision making and education. One of the big hurdles in developing such systems is the lack of automated mechanisms for knowledge acquisition to enable and educate clinicians in informed decision making. An automated knowledge acquisition methodology with a comprehensible knowledge model for cancer treatment (CKM-CT) is proposed. With the CKM-CT, clinical data are acquired automatically from documents. Quality of data is ensured by correcting errors and transforming various formats into a standard data format. Data preprocessing involves dimensionality reduction and missing value imputation. Predictive algorithm selection is performed on the basis of the ranking score of the weighted sum model. The knowledge builder prepares knowledge for knowledge-based services: clinical decisions and education support. Data is acquired from 13,788 head and neck cancer (HNC) documents for 3447 patients, including 1526 patients of the oral cavity site. In the data quality task, 160 staging values are corrected. In the preprocessing task, 20 attributes and 106 records are eliminated from the dataset. The Classification and Regression Trees (CRT) algorithm is selected and provides 69.0% classification accuracy in predicting HNC treatment plans, consisting of 11 decision paths that yield 11 decision rules. Our proposed methodology, CKM-CT, is helpful to find hidden knowledge in clinical documents. In CKM-CT, the prediction models are developed to assist and educate clinicians for informed decision making. The proposed methodology is generalizable to apply to data of other domains such as breast cancer with a similar objective to assist clinicians in decision making and education. Copyright © 2017 Elsevier Ltd. All rights reserved.
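The "ranking score of the weighted sum model" used above for predictive algorithm selection can be sketched as follows. The criteria, weights, and candidate scores are invented for illustration (only the 69.0% CRT accuracy comes from the abstract), so this is a sketch of the general technique, not the CKM-CT implementation:

```python
# Illustrative weighted-sum ranking for choosing a predictive algorithm.
# Criteria weights and most scores are made-up assumptions.
criteria_weights = {"accuracy": 0.5, "interpretability": 0.3, "speed": 0.2}

candidates = {
    "CRT":         {"accuracy": 0.69, "interpretability": 0.9, "speed": 0.8},
    "naive_bayes": {"accuracy": 0.62, "interpretability": 0.7, "speed": 0.9},
    "neural_net":  {"accuracy": 0.71, "interpretability": 0.2, "speed": 0.4},
}

def weighted_sum(scores):
    # Sum of (criterion weight x criterion score) over all criteria.
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

ranked = sorted(candidates,
                key=lambda name: weighted_sum(candidates[name]),
                reverse=True)
print(ranked[0])  # algorithm with the highest weighted-sum score
```

With these (assumed) weights, an interpretable tree model can outrank a slightly more accurate black-box model, which matches the comprehensibility goal the abstract emphasizes.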

  11. The Mother-Infant Study Cohort (MISC): Methodology, challenges, and baseline characteristics

    PubMed Central

    Hashim, Mona; Shaker Obaid, Reyad; Hasan, Hayder; Naja, Farah; Al Ghazal, Hessa; Jan Jan Mohamed, Hamid; Al Hilali, Marwa; Rayess, Rana; Izzaldin, Ghamra

    2018-01-01

Background The United Arab Emirates (UAE) exhibits an alarmingly high prevalence of Non-Communicable Diseases (NCDs) and their risk factors. Emerging evidence has highlighted the role of maternal and early-childhood nutrition in preventing later-onset NCDs. The objectives of this article are to describe the design and methodology of the first Mother and Infant Study Cohort (MISC) in the UAE; present the baseline demographic characteristics of the study participants; and discuss the challenges of the cohort and the respective strategies for responding to them. Methods The MISC is an ongoing two-year prospective cohort study which recruited Arab pregnant women in their third trimester from prenatal clinics in Dubai, Sharjah and Ajman. Participants will be interviewed six times (once during pregnancy, at delivery, and at 2, 6, 12 and 24 months postpartum). Perinatal information is obtained from hospital records. Collected data include socio-demographic characteristics, lifestyle, dietary intake and anthropometry; infant feeding practices and cognitive development; along with maternal and infant blood profiles and breast milk profile. Results The preliminary results show that 256 women completed the baseline assessment (mean age: 30.5±6.0 years; 76.6% multiparous; about 60% either overweight or obese before pregnancy). The prevalence of gestational diabetes was 19.2%. Upon delivery, 208 woman-infant pairs were retained (mean gestational age: 38.5±1.5 weeks; 33.3% caesarean-section deliveries; 5.3% low-birthweight and 5.7% macrosomic deliveries). Besides participant retention, the main challenges encountered pertained to cultural complexity; underestimation of the necessary start-up time, staffing, and costs; and biochemical data collection. Conclusions Despite numerous methodological, logistical and sociocultural challenges, satisfactory follow-up rates are recorded. 
Strategies addressing challenges are documented, providing information for planning and implementing future birth cohort studies locally and internationally. PMID:29851999

  12. Measuring the effect of improvement in methodological techniques on data collection in the Gharbiah population-based cancer registry in Egypt: Implications for other Low- and Middle-Income Countries.

    PubMed

    Smith, Brittney L; Ramadan, Mohamed; Corley, Brittany; Hablas, Ahmed; Seifeldein, Ibrahim A; Soliman, Amr S

    2015-12-01

The purpose of this study was to describe and quantify procedures and methods that maximized the efficiency of the Gharbiah Cancer Registry (GPCR), the only population-based cancer registry in Egypt. The procedures and measures included a locally-developed software program to translate names from Arabic to English, a new national ID number for demographic and occupational information, and linkage of cancer cases to new electronic mortality records of the Ministry of Health. Data were compiled from the 34,058 cases in the registry for the years 1999-2007. Cases and registry variables covering demographic and clinical information were reviewed by year to assess trends associated with each new method or procedure during the study period. The introduction of the name translation software, in conjunction with other demographic variables, increased the identification of detected duplicates from 23.4% to 78.1%. Use of the national ID increased the proportion of cases with occupation information from 27% to 89%. Records with complete mortality information increased from 18% to 43%. The proportion of cases registered from death certificates only decreased from 9.8% to 4.7%. Overall, the study revealed that introducing and utilizing local and culture-specific methodological changes, software, and electronic non-cancer databases had a significant impact on data quality and completeness. This study may have translational implications for improving the quality of cancer registries in LMICs considering the emerging advances in electronic databases and utilization of health software and computerization of data. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. A problem-oriented approach to journal selection for hospital libraries.

    PubMed Central

    Delman, B S

    1982-01-01

    This paper describes a problem-oriented approach to journal selection (PAJS), including general methodology, theoretical terms, and a brief description of results when the system was applied in three different hospitals. The PAJS system relates the objective information which the MEDLARS data base offers about the universe of biomedical literature to objective, problem-oriented information supplied by the hospital's medical records. The results were manipulated quantitatively to determine (1) the relevance of various journals to each of the hospital's defined significant information problems and (2) the overall utility of each journal to the institution as a whole. The utility information was plotted on a graph to identify the collection of journal titles which would be most useful to the given hospital. Attempts made to verify certain aspects of the whole process are also described. The results suggest that the methodology is generally able to provide an effective library response. The system optimizes resources vis-a-vis information and can be used for both budget allocation and justification. It offers an algorithm to which operations researchers can apply any one of a variety of mathematical programming methods. Although originally intended for librarians in the community hospital environment, the PAJS system is generalizable and has application potential in a variety of special library settings. PMID:6758893

  14. A Relevancy Algorithm for Curating Earth Science Data Around Phenomenon

    NASA Technical Reports Server (NTRS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-01-01

Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a standalone web service that is utilized to augment search and usability of data in a variety of tools.
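A minimal sketch of how phenomenon-based curation might score metadata records follows; the dataset names, description texts, phenomenon vocabulary, and term-counting scheme are all invented for illustration and are not the paper's relevancy ranking algorithm:

```python
# Toy relevance ranking: score each dataset's metadata description by how
# often terms from a phenomenon vocabulary occur in it, then sort.
def relevance(record_text, phenomenon_terms):
    words = record_text.lower().split()
    return sum(words.count(t) for t in phenomenon_terms)

records = {  # hypothetical metadata descriptions
    "GPM_precip": "global precipitation measurement rainfall rates for storms",
    "MODIS_veg":  "vegetation index from MODIS imagery",
}
hurricane_terms = {"precipitation", "rainfall", "storms", "wind"}

ranked = sorted(records,
                key=lambda r: relevance(records[r], hurricane_terms),
                reverse=True)
print(ranked[0])  # most relevant dataset for the phenomenon
```

A production service would of course use richer metadata fields and a weighted ranking, but the core pattern of scoring records against a topic vocabulary and sorting is the same.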

  15. A relevancy algorithm for curating earth science data around phenomenon

    NASA Astrophysics Data System (ADS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-09-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a stand-alone web service that is utilized to augment search and usability of data in a variety of tools.

  16. The New Method of Tsunami Source Reconstruction With r-Solution Inversion Method

    NASA Astrophysics Data System (ADS)

    Voronina, T. A.; Romanenko, A. A.

    2016-12-01

Application of the r-solution method to reconstructing the initial tsunami waveform is discussed. This methodology is based on the inversion of remote measurements of water-level data. The wave propagation is considered within the scope of linear shallow-water theory. The ill-posed inverse problem in question is regularized by means of a least-squares inversion using the truncated Singular Value Decomposition method. As a result of the numerical process, an r-solution is obtained. The proposed method allows one to control the instability of the numerical solution and to obtain an acceptable result in spite of the ill-posedness of the problem. Applying this methodology to reconstruct the initial waveform of the 2013 Solomon Islands tsunami validates the conclusion drawn from synthetic data and a model tsunami source: the inversion result depends strongly on data noisiness and on the azimuthal and temporal coverage of the recording stations with respect to the source area. Furthermore, it is possible to make a preliminary selection of the most informative set of the available recording stations used in the inversion process.
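A truncated-SVD least-squares inversion of the kind described (keeping only the r largest singular values to suppress the unstable modes of an ill-conditioned problem) can be sketched in a few lines. The matrix, data, and truncation rank below are toy values, not tsunami observations:

```python
import numpy as np

# Sketch of a truncated-SVD ("r-solution") inversion of A x = b.
def r_solution(A, b, r):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Invert only the r largest singular values; zero out the tiny,
    # noise-amplifying ones that make the problem ill-posed.
    inv_s = np.zeros_like(s)
    inv_s[:r] = 1.0 / s[:r]
    return Vt.T @ (inv_s * (U.T @ b))

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
A[:, 4] = A[:, 3] + 1e-10 * rng.normal(size=20)  # nearly dependent column
x_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
b = A @ x_true
x_r = r_solution(A, b, r=4)      # truncate the near-singular mode
print(np.linalg.norm(A @ x_r - b))  # residual stays tiny
```

Choosing r trades stability against resolution: a smaller r gives a smoother, more stable reconstruction, while a larger r fits the data more closely but amplifies noise.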

  17. Optical characterization of agricultural pest insects: a methodological study in the spectral and time domains

    NASA Astrophysics Data System (ADS)

    Li, Y. Y.; Zhang, H.; Duan, Z.; Lian, M.; Zhao, G. Y.; Sun, X. H.; Hu, J. D.; Gao, L. N.; Feng, H. Q.; Svanberg, S.

    2016-08-01

Identification of agricultural pest insects is an important aspect in insect research and agricultural monitoring. We have performed a methodological study of how spectroscopic techniques and wing-beat frequency analysis might provide relevant information. An optical system based on the combination of close-range remote sensing and reflectance spectroscopy was developed to study the optical characteristics of different flying insects, collected in Southern China. The results demonstrate that the combination of wing-beat frequency assessment and reflectance spectral analysis has the potential to successfully differentiate between insect species. Further, studies of the spectroscopic characteristics of fixed specimens of insects, also from Central China, showed the possibility of refined agricultural pest identification. Here, laser-induced fluorescence spectra were investigated in addition to reflectance recordings for all insect species under study, and were found to provide complementary information for optically distinguishing insects. To prove the practicality of the techniques explored, fieldwork will clearly be needed to elucidate the variability of these parameters, even within species.

  18. Beverage and water intake of healthy adults in some European countries.

    PubMed

    Nissensohn, Mariela; Castro-Quezada, Itandehui; Serra-Majem, Lluis

    2013-11-01

Nutritional surveys frequently collect some data on beverage consumption; however, information from different sources and different methodologies raises issues of comparability. The main objective of this review was to examine the available techniques used for assessing beverage intake in European epidemiological studies and to describe the most frequently applied assessment method. Information on beverage intake available from European surveys and nutritional epidemiological investigations was obtained from gray literature. Twelve articles were included and relevant data were extracted. The studies were carried out on healthy adults using different types of assessment. The most frequently used tool was a 7-d dietary record. Only Germany used a specific beverage assessment tool (Beverage Dietary History). From the limited data available and the diversity of the methodology used, the results show that consumption of beverages differs between countries. Current epidemiological studies in Europe focusing on beverage intake are scarce. Further research is needed to clarify the amount of beverage intake in European populations.

  19. Data-driven approach for creating synthetic electronic medical records.

    PubMed

    Buczak, Anna L; Babin, Steven; Moniz, Linda

    2010-10-14

    New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases. 
The pilot synthetic background records were in the 4-11 year old age group. The adaptations that must be made to the algorithms to produce synthetic background EMRs for other age groups are indicated.
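The three major steps of the generation method (synthetic identity generation, identification of care patterns, and adaptation of those patterns to the synthetic population) can be caricatured as follows. All names, encounter codes, and patterns are invented; the hard-coded pattern list stands in for step 2's mining of real EMR data:

```python
import random

# Toy sketch of the three-step synthetic-EMR idea; nothing here is real data.
random.seed(42)

def make_identity(i):
    # Step 1: synthetic patient identity and basic information
    # (4-11 yr range mirrors the background cohort in the abstract).
    return {"id": f"SYN{i:04d}", "age": random.randint(4, 11),
            "sex": random.choice(["F", "M"])}

# Step 2 stand-in: care patterns that would normally be mined from real
# EMRs of patients with similar health problems.
care_patterns = [
    ["office_visit", "rapid_strep_test", "antibiotic_rx"],
    ["office_visit", "chest_xray", "followup_visit"],
]

def make_record(i):
    patient = make_identity(i)
    # Step 3: adapt a mined care pattern to the synthetic patient.
    patient["encounters"] = random.choice(care_patterns)
    return patient

cohort = [make_record(i) for i in range(5)]
print(cohort[0]["id"], cohort[0]["encounters"])
```

The real method additionally generates laboratory and radiology orders/results and validates the output with medical experts; this sketch only shows the generate-then-attach-pattern control flow.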

  20. 77 FR 4002 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-26

    ... the methodological research previously included in the original System of Record Notice (SORN). This... methodological research on improving various aspects of surveys authorized by Title 13, U.S.C. 8(b), 182, and 196, such as: survey sampling frame design; sample selection algorithms; questionnaire development, design...

  1. Does the patient-held record improve continuity and related outcomes in cancer care: a systematic review.

    PubMed

    Gysels, Marjolein; Richardson, Alison; Higginson, Irene J

    2007-03-01

To assess the effectiveness of the patient-held record (PHR) in cancer care. Patients with cancer may receive care from different services, resulting in gaps in care; a PHR could provide continuity and patient involvement in care. Relevant literature was identified through five electronic databases (Medline, Embase, Cinahl, CCTR and CDSR) and hand searches. The interventions reviewed were patient-held records used in cancer care to improve communication and information exchange between and within different levels of care, and to promote continuity of care and patients' involvement in their own care. Data extraction recorded the characteristics of the intervention, the type of study, and factors that contributed to the methodological quality of individual studies. Data were then contrasted by setting, objectives, population, study design, outcome measures and changes in outcome, including knowledge, satisfaction, anxiety and depression. The methodological quality of randomized controlled trials and non-experimental studies was assessed with separate standard grading scales. Seven randomized controlled trials and six non-experimental studies were identified. Evaluations of the PHR have reached equivocal findings: randomized trials found an absence of effect, while non-experimental evaluations shed light on the conditions for its successful use. Most patients welcomed the introduction of a PHR. The main problems related to its suitability for different patient groups and the lack of agreement between patients and health professionals regarding its function. Further research is required to determine the conditions under which the PHR can realize its potential as a tool to promote continuity of care and patient participation.

  2. Keys to a successful project: Associated data and planning: Data standards. Chapter 5 in Measuring and monitoring biological diversity: Standard methods for amphibians

    USGS Publications Warehouse

    McDiarmid, Roy W.; Heyer, W. Ronald; Donnelly, Maureen A.; McDiarmid, Roy W.; Hayek, Lee-Ann C.; Foster, Mercedes S.

    1994-01-01

The many individual salamanders, frogs, caecilians, and their larvae encountered during the course of an inventory or monitoring project will have to be identified to species. Depending on the goals and sampling method(s) used, some individuals will be identified from a distance by their calls; others will be handled. At the same time, some will be marked for recapture, and others will be sampled as vouchers. For each, certain minimum data should be recorded. In this section, data pertaining to locality and sampling methodology are considered; information on microhabitats and specimen vouchers is covered in the sections that follow. I feel strongly that the data outlined here should be the minimum for any project. Investigators with specific goals may require additional types of data as well. Standardized, printed sheets containing the required data categories provide a convenient, inexpensive, and effective way to ensure that all the desired information is recorded in a consistent format. Data sheets should be well organized, printed on good-quality paper (75%-100% cotton content), and include extra space (e.g., the other side of the sheet) for notes that do not fit preestablished categories. Data should be recorded in the field with permanent (waterproof) ink as simply and directly as possible. I strongly recommend against the use of data codes in the field; it is too easy to forget codes or to enter the wrong code. Original data sheets can be photocopied for security, but they should not be copied by hand. If data are to be coded for computer analysis, the original or photocopied sheets should be used for data entry to minimize transcription errors. Some workers prefer recording information on small tape recorders; this also works well if a list of the standard data categories is checked during taping to ensure that all required information is recorded. Information recorded on tapes should be transcribed to data sheets or into a computer within 24 hours of the sample.

  3. International health IT benchmarking: learning from cross-country comparisons.

    PubMed

    Zelmer, Jennifer; Ronchi, Elettra; Hyppönen, Hannele; Lupiáñez-Villanueva, Francisco; Codagnone, Cristiano; Nøhr, Christian; Huebner, Ursula; Fazzalari, Anne; Adler-Milstein, Julia

    2017-03-01

    To pilot benchmark measures of health information and communication technology (ICT) availability and use to facilitate cross-country learning. A prior Organization for Economic Cooperation and Development-led effort involving 30 countries selected and defined functionality-based measures for availability and use of electronic health records, health information exchange, personal health records, and telehealth. In this pilot, an Organization for Economic Cooperation and Development Working Group compiled results for 38 countries for a subset of measures with broad coverage using new and/or adapted country-specific or multinational surveys and other sources from 2012 to 2015. We also synthesized country learnings to inform future benchmarking. While electronic records are widely used to store and manage patient information at the point of care (all but 2 pilot countries reported use by at least half of primary care physicians, and many had rates above 75%), patient information exchange across organizations/settings is less common. Large variations in the availability and use of telehealth and personal health records also exist. Pilot participation demonstrated interest in cross-national benchmarking. Using the most comparable measures available to date, it showed substantial diversity in health ICT availability and use in all domains. The project also identified methodological considerations (e.g., structural and health systems issues that can affect measurement) important for future comparisons. While health policies and priorities differ, many nations aim to increase access, quality, and/or efficiency of care through effective ICT use. By identifying variations and describing key contextual factors, benchmarking offers the potential to facilitate cross-national learning and accelerate the progress of individual countries. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  4. Electronic health record interoperability as realized in the Turkish health information system.

    PubMed

    Dogac, A; Yuksel, M; Avci, A; Ceyhan, B; Hülür, U; Eryilmaz, Z; Mollahaliloglu, S; Atbakan, E; Akdag, R

    2011-01-01

    The objective of this paper is to describe the techniques used in developing the National Health Information System of Turkey (NHIS-T), a nation-wide infrastructure for sharing electronic health records (EHRs). The UN/CEFACT Core Components Technical Specification (CCTS) methodology was applied to design the logical EHR structure and to increase the reuse of common information blocks in EHRs. The NHIS-T became operational on January 15, 2009. By June 2010, 99% of the public hospitals and 71% of the private and university hospitals were connected to NHIS-T with daily feeds of their patients' EHRs. Out of the 72 million citizens of Turkey, electronic healthcare records of 43 million citizens have already been created in NHIS-T. Currently, only the general practitioners can access the EHRs of their patients. In the second phase of the implementation and once the legal framework is completed, the proper patient consent mechanisms will be available through the personal health record system that is under development. At this time authorized healthcare professionals in secondary and tertiary healthcare systems can access the patients' EHRs. A number of factors affected the successful implementation of NHIS-T. First, all stakeholders have to adopt the specified standards. Second, the UN/CEFACT CCTS approach was applied, which facilitated the development and understanding of rather complex EHR schemas. Finally, the comprehensive testing of vendor-based hospital information systems for their conformance to and interoperability with NHIS-T through an automated testing platform substantially accelerated the integration of vendor-based solutions with the NHIS-T.

  5. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    PubMed

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians of problematic situations, or decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of developments, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is represented by a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes for modelling the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, in the form of a virtual health record (VHR), over the EHR whose contents need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study where the tools and methodology have been employed in a CDSS to support patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes not only has proved satisfactory to achieve interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are of a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, due to their deliberate independence of the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable, e.g., feeding back the EHR with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Challenges and opportunities of undertaking a video ethnographic study to understand medication communication.

    PubMed

    Liu, Wei; Gerdtz, Marie; Manias, Elizabeth

    2015-12-01

    To examine the challenges and opportunities of undertaking a video ethnographic study on medication communication among nurses, doctors, pharmacists and patients. Video ethnography has proved to be a dynamic and useful method to explore clinical communication activities. This approach involves filming the actual behaviours and activities of clinicians to develop new knowledge and to stimulate clinicians' reflections on their behaviours and activities. However, there is limited information about the complex negotiations required to use video ethnography in actual clinical practice. Discursive paper. A video ethnographic approach was used to gain a better understanding of medication communication processes in two general medical wards of a metropolitan hospital in Melbourne, Australia. This paper presents the arduous and delicate process of gaining access to hospital wards to video-record actual clinical practice and the methodological and ethical issues associated with video-recording. Obtaining access to clinical settings and clinician consent are the first hurdles of conducting a video ethnographic study. Clinicians may still feel intimidated or self-conscious about being video-recorded in their medication communication practices, which they could perceive as judgements being passed on their clinical competence. With thoughtful and strategic planning, video ethnography can provide in-depth understandings of medication communication in acute care hospital settings. Ethical issues of informed consent, patient safety and respect for the confidentiality of patients and clinicians need to be carefully addressed to build and maintain trusting relationships between researchers and participants in the clinical environment. By prudently considering the complex ethical and methodological concerns of using video ethnography, this approach can help to reveal the unpredictability and messiness of clinical practice. The visual data generated can stimulate clinicians' reflexivity about their norms of practice and bring about improved communication about managing medications. © 2015 John Wiley & Sons Ltd.

  7. Classification of rainfall events for weather forecasting purposes in andean region of Colombia

    NASA Astrophysics Data System (ADS)

    Suárez Hincapié, Joan Nathalie; Romo Melo, Liliana; Vélez Upegui, Jorge Julian; Chang, Philippe

    2016-04-01

    This work presents a comparative analysis of the results of applying different methodologies for the identification and classification of rainfall events of different duration in meteorological records of the Colombian Andean region. The study area is the urban and rural area of Manizales, which has a hydro-meteorological monitoring network. This network is composed of forty-five (45) strategically located stations, where automatic weather stations record seven climate variables: air temperature, relative humidity, wind speed and direction, rainfall, solar radiation and barometric pressure. All this information is sent wirelessly every five (5) minutes to a data warehouse located at the Institute of Environmental Studies-IDEA. The rainfall series recorded by the Palogrande hydrometeorological station, operated by the National University of Colombia in Manizales (http://froac.manizales.unal.edu.co/bodegaIdea/), was obtained, and with this information the behavior of other meteorological variables, monitored at surface level and influencing the occurrence of such rainfall events, was analyzed. To classify rainfall events, different methodologies were used. The first, following Monjo (2009), calculated the index n of heavy rainfall, through which various types of precipitation are defined according to intensity variability. A second methodology produced a classification in terms of a parameter β introduced by Rice and Holmberg (1973) and adapted by Llasat and Puigcerver (1985, 1997). In the last, rainfall is classified according to the value of its intensity, following Linsley (1977), where rains are considered light, moderate or strong at rates up to 2.5 mm/h, from 2.5 to 7.6 mm/h, and above this value, respectively. The main contribution of this research is obtaining elements to optimize and improve the spatial resolution of the results of mesoscale models such as the Weather Research & Forecasting Model (WRF), used in Colombia for weather forecasting and as input to other tools applied to current issues such as risk management.
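    The Linsley (1977) intensity thresholds cited above translate directly into a small classification routine. A minimal sketch in Python, assuming intensity is given in mm/h (the handling of the exact boundary values is a choice not specified in the abstract):

```python
def classify_rainfall(intensity_mm_per_h):
    """Classify a rainfall event by intensity, following Linsley (1977):
    light below 2.5 mm/h, moderate from 2.5 to 7.6 mm/h, strong above."""
    if intensity_mm_per_h < 2.5:
        return "light"
    if intensity_mm_per_h <= 7.6:
        return "moderate"
    return "strong"
```

    Applying this to each event in a station's five-minute rainfall series yields the third of the three classifications compared in the study.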

  8. Use of IDEF modeling to develop an information management system for drug and alcohol outpatient treatment clinics

    NASA Astrophysics Data System (ADS)

    Hoffman, Kenneth J.

    1995-10-01

    Few information systems create a standardized clinical patient record in which there are discrete and concise observations of patient problems and their resolution. Clinical notes usually are narratives that do not support aggregate and systematic outcome analysis. Many programs collect information on diagnosis and coded procedures but are not focused on patient problems. Integrated definition (IDEF) methodology has been accepted by the Department of Defense as part of the Corporate Information Management Initiative and serves as the foundation that establishes a need for automation. We used IDEF modeling to describe present and idealized patient care activities. A logical IDEF data model was created to support those activities. The modeling process allows for accurate cost estimates based upon performed activities, efficient collection of relevant information, and outputs that allow real-time assessments of process and outcomes. This model forms the foundation for a prototype automated clinical information system (ACIS).

  9. A mapping of information security in health Information Systems in Latin America and Brazil.

    PubMed

    Pereira, Samáris Ramiro; Fernandes, João Carlos Lopes; Labrada, Luis; Bandiera-Paiva, Paulo

    2013-01-01

    In health care, information systems, whether for patient records, hospital administration or other purposes, have advantages such as cost, availability and integration. However, for these benefits to be fully realized, it is necessary to guarantee the security of the information maintained and provided by the systems. The lack of security can lead to serious consequences such as lawsuits and the induction of medical errors. The management of information security is complex and draws on various fields of knowledge. Often, it is left in the background because it is not the ultimate goal of a computer system, causing huge financial losses to corporations. This paper, using systematic review methodology, presents a mapping of the literature in order to identify the most relevant aspects addressed by health information security researchers with respect to the development of computerized systems. From the results, the authors conclude some important aspects about which the managers of computerized health systems should remain alert.

  10. A methodology for assessing the effect of correlations among muscle synergy activations on task-discriminating information.

    PubMed

    Delis, Ioannis; Berret, Bastien; Pozzo, Thierry; Panzeri, Stefano

    2013-01-01

    Muscle synergies have been hypothesized to be the building blocks used by the central nervous system to generate movement. According to this hypothesis, the accomplishment of various motor tasks relies on the ability of the motor system to recruit a small set of synergies on a single-trial basis and combine them in a task-dependent manner. It is conceivable that this requires a fine tuning of the trial-to-trial relationships between the synergy activations. Here we develop an analytical methodology to address the nature and functional role of trial-to-trial correlations between synergy activations, which is designed to help to better understand how these correlations may contribute to generating appropriate motor behavior. The algorithm we propose first divides correlations between muscle synergies into types (noise correlations, quantifying the trial-to-trial covariations of synergy activations at fixed task, and signal correlations, quantifying the similarity of task tuning of the trial-averaged activation coefficients of different synergies), and then uses single-trial methods (task-decoding and information theory) to quantify their overall effect on the task-discriminating information carried by muscle synergy activations. We apply the method to both synchronous and time-varying synergies and exemplify it on electromyographic data recorded during performance of reaching movements in different directions. Our method reveals the robust presence of information-enhancing patterns of signal and noise correlations among pairs of synchronous synergies, and shows that they enhance by 9-15% (depending on the set of tasks) the task-discriminating information provided by the synergy decompositions. We suggest that the proposed methodology could be useful for assessing whether single-trial activations of one synergy depend on activations of other synergies and quantifying the effect of such dependences on the task-to-task differences in muscle activation patterns.
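    The division of correlations into signal and noise types described above can be sketched numerically. The following is a minimal illustration of those two definitions, not the authors' actual pipeline: signal correlation compares the task tuning of the trial-averaged activations of two synergies, while noise correlation pools within-task, trial-to-trial fluctuations:

```python
import numpy as np

def signal_noise_correlations(a1, a2, tasks):
    """Split the correlation between two synergies' single-trial activation
    coefficients into signal and noise components.

    a1, a2 : 1-D arrays of activation coefficients, one value per trial.
    tasks  : integer task label for each trial.
    """
    labels = np.unique(tasks)
    # Signal correlation: similarity of the task tuning of the
    # trial-averaged activation coefficients of the two synergies.
    m1 = np.array([a1[tasks == t].mean() for t in labels])
    m2 = np.array([a2[tasks == t].mean() for t in labels])
    signal = np.corrcoef(m1, m2)[0, 1]
    # Noise correlation: trial-to-trial covariation at fixed task,
    # pooled across tasks after removing each task's mean activation.
    d1 = np.concatenate([a1[tasks == t] - a1[tasks == t].mean() for t in labels])
    d2 = np.concatenate([a2[tasks == t] - a2[tasks == t].mean() for t in labels])
    noise = np.corrcoef(d1, d2)[0, 1]
    return signal, noise
```

    The full method additionally quantifies, via decoding and information theory, how much such correlation patterns change the task-discriminating information; that step is omitted here.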

  11. Dementia ascertainment using existing data in UK longitudinal and cohort studies: a systematic review of methodology.

    PubMed

    Sibbett, Ruth A; Russ, Tom C; Deary, Ian J; Starr, John M

    2017-07-03

    Studies investigating the risk factors for or causation of dementia must consider subjects prior to disease onset. To overcome the limitations of prospective studies and self-reported recall of information, the use of existing data is key. This review provides a narrative account of dementia ascertainment methods using sources of existing data. The literature search was performed using: MEDLINE, EMBASE, PsychInfo and Web of Science. Included articles reported a UK-based study of dementia in which cases were ascertained using existing data. Existing data included that which was routinely collected and that which was collected for previous research. After removing duplicates, abstracts were screened and the remaining articles were included for full-text review. A quality tool was used to evaluate the description of the ascertainment methodology. Of the 3545 abstracts screened, 360 articles were selected for full-text review. 47 articles were included for final consideration. Data sources for ascertainment included: death records, national datasets, research databases and hospital records among others. 36 articles used existing data alone for ascertainment, of which 27 used only a single data source. The most frequently used source was a research database. Quality scores ranged from 7/16 to 16/16. Quality scores were better for articles with dementia ascertainment as an outcome. Some papers performed validation studies of dementia ascertainment and most indicated that observed rates of dementia were lower than expected. We identified a lack of consistency in dementia ascertainment methodology using existing data. With no data source identified as a "gold-standard", we suggest the use of multiple sources. Where possible, studies should access records with evidence to confirm the diagnosis. Studies should also calculate the dementia ascertainment rate for the population being studied to enable a comparison with an expected rate.

  12. Bayesian WLS/GLS regression for regional skewness analysis for regions with large crest stage gage networks

    USGS Publications Warehouse

    Veilleux, Andrea G.; Stedinger, Jery R.; Eash, David A.

    2012-01-01

    This paper summarizes methodological advances in regional log-space skewness analyses that support flood-frequency analysis with the log Pearson Type III (LP3) distribution. A Bayesian Weighted Least Squares/Generalized Least Squares (B-WLS/B-GLS) methodology that relates observed skewness coefficient estimators to basin characteristics in conjunction with diagnostic statistics represents an extension of the previously developed B-GLS methodology. B-WLS/B-GLS has been shown to be effective in two California studies. B-WLS/B-GLS uses B-WLS to generate stable estimators of model parameters and B-GLS to estimate the precision of those B-WLS regression parameters, as well as the precision of the model. The study described here employs this methodology to develop a regional skewness model for the State of Iowa. To provide cost-effective peak-flow data for smaller drainage basins in Iowa, the U.S. Geological Survey operates a large network of crest stage gages (CSGs) that only record flow values above an identified recording threshold (thus producing a censored data record). CSGs are different from continuous-record gages, which record almost all flow values and have been used in previous B-GLS and B-WLS/B-GLS regional skewness studies. The complexity of analyzing a large CSG network is addressed by using the B-WLS/B-GLS framework along with the Expected Moments Algorithm (EMA). Because EMA allows for the censoring of low outliers, as well as the use of estimated interval discharges for missing, censored, and historic data, it complicates the calculations of effective record length (and effective concurrent record length) used to describe the precision of sample estimators because the peak discharges are no longer solely represented by single values. Thus new record length calculations were developed. The regional skewness analysis for the State of Iowa illustrates the value of the new B-WLS/B-GLS methodology with these new extensions.
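    The weighted least-squares step at the core of the B-WLS stage can be illustrated in a few lines. This is a generic WLS solver with inverse-variance weights, a simplified stand-in for the full B-WLS/B-GLS machinery (it ignores the Bayesian treatment of model error and the GLS cross-correlation structure):

```python
import numpy as np

def wls_fit(X, y, sampling_var):
    """Weighted least-squares regression with inverse-variance weights.

    X            : (n, p) design matrix (intercept plus basin characteristics).
    y            : (n,) observed skewness coefficient estimators.
    sampling_var : (n,) sampling variance of each estimator; sites with
                   noisier estimators receive proportionally less weight.
    """
    W = np.diag(1.0 / np.asarray(sampling_var, dtype=float))
    XtW = X.T @ W
    beta = np.linalg.solve(XtW @ X, XtW @ y)   # regression coefficients
    cov = np.linalg.inv(XtW @ X)               # coefficient covariance (sampling part)
    return beta, cov
```

    In the actual methodology, B-WLS produces the coefficient estimates and B-GLS then re-evaluates their precision and the model's; the censored CSG records additionally require EMA-based effective record lengths to set the variances.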

  13. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    PubMed

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation and 3) improvement of indicators (continuous improvement). The indicators were obtained from the Hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other autonomous and national urology departments was performed through the same platform with the help of the Hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction, improved quality indicators reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national Benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Publicado por Elsevier España, S.L.U. All rights reserved.

  14. Self-reports of meaning in life matter.

    PubMed

    Heintzelman, Samantha J; King, Laura A

    2015-09-01

    Replies to the comments made by Friedman (see record 2015-39598-012), Jeffery & Shackelford (see record 2015-39598-013), Brown & Wong (see record 2015-39598-014), Fowers & Lefevor (see record 2015-39598-015), Hill et al. (see record 2015-39598-016) on the current authors' original article, "Life is pretty meaningful," (see record 2014-03265-001). The current authors thank the comment authors for their efforts, and acknowledge their dedication to what is often a difficult and inscrutable construct, meaning in life. One lesson the current authors have learned from these reactions is that a review of self-report responses to items like "My life is purposeful and meaningful" cannot encompass the entirety of the meaning-in-life landscape. In this reply, the current authors reflect on aspects of the commentaries, highlighting what they can garner about meaning in life from the portion of it that is reflected in phenomenological experience and represented in self-reports: These are the data they have. The current authors first consider three methodological concerns that bear on whether these data are informative (at all) and then they consider more conceptual critiques. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  15. Hospital information systems: experience at the fully digitized Seoul National University Bundang Hospital.

    PubMed

    Yoo, Sooyoung; Hwang, Hee; Jheon, Sanghoon

    2016-08-01

    The different levels of health information technology (IT) adoption and its integration into hospital workflow can affect the maximization of the benefits of health IT use. We aimed at sharing our experiences and the journey to the successful adoption of health IT over 13 years at a tertiary university hospital in South Korea. The integrated system of comprehensive applications for direct care, support care, and smart care has been implemented with the latest IT and a rich user information platform, achieving a fully digitized hospital. The user experience design methodology, barcode and radio-frequency identification (RFID) technologies, smartphone and mobile technologies, and data analytics were integrated into hospital workflow. Applications for user-centered electronic medical record (EMR) and clinical decision support (CDS), closed loop medication administration (CLMA), mobile EMR and dashboard systems for care coordination, a clinical data warehouse (CDW) system, and patient engagement solutions were designed and developed to improve quality of care, work efficiency, and patient safety. We believe that comprehensive electronic health record systems and patient-centered smart hospital applications will go a long way in ensuring seamless patient care and experience.

  16. Implementation of Personnel Support Centers in the United States Coast Guard.

    DTIC Science & Technology

    1983-06-01

    test site in Seattle, as an example of change in a complex organization. By compiling a record of what has been done, the reactions of the people to...III will describe the methodology used to gather information and data for the thesis. Findings on what has occurred (is occurring) in the 11th and...processes and decisions which occur in the organization. Figure 2 is a model depicting what Leavitt considers the three primary targets which managers

  17. Three computer codes to read, plot and tabulate operational test-site recorded solar data

    NASA Technical Reports Server (NTRS)

    Stewart, S. D.; Sampson, R. S., Jr.; Stonemetz, R. E.; Rouse, S. L.

    1980-01-01

    Computer programs used to process data that will be used in the evaluation of collector efficiency and solar system performance are described. The program, TAPFIL, reads data from an IBM 360 tape containing information (insolation, flowrates, temperatures, etc.) from 48 operational solar heating and cooling test sites. Two other programs, CHPLOT and WRTCNL, plot and tabulate the data from the direct access, unformatted TAPFIL file. The methodology of the programs, their inputs, and their outputs are described.

  18. An empirical approach to predicting long term behavior of metal particle based recording media

    NASA Technical Reports Server (NTRS)

    Hadad, Allan S.

    1992-01-01

    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3-H2O resulting in acicular, polycrystalline, body centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed in order for iron particles to be considered as a viable recording medium for long term archival (i.e., 25+ years) information storage. The primary means of establishing stability is through passivation or controlled oxidation of the iron particle's surface. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity on silicon-containing iron particles between 50-120 C and 3-89 percent relative humidity. The methodology by which experimental data were collected and analyzed, leading to predictive capability, is discussed.

  19. Development of a relational database to capture and merge clinical history with the quantitative results of radionuclide renography.

    PubMed

    Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T

    2012-12-01

    Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and export a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). 
We have developed a relational database system to integrate the quantitative results of MAG3 image processing with clinical records obtained from the hospital information system. We also have developed a methodology for formatting clinical history for review by physicians and export to a decision support system. We identified several pitfalls, including the fact that important textual information extracted from the hospital information system by knowledgeable transcribers can show substantial interobserver variation, particularly when record retrieval is based on the narrative clinical records.
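    The interobserver agreement figures above pair a raw percent match with Cohen's kappa, which corrects agreement for chance. A minimal sketch of that computation (illustrative, not the study's code):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters judging the same items.

    rater1, rater2 : equal-length lists of categorical judgments
    (e.g. 1/0 for presence/absence of hydronephrosis or a stent).
    """
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed agreement: fraction of items where the raters match.
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)
```

    A high raw match with a modest kappa, as seen for stent presence (95% match, κ = 0.52), typically reflects skewed category frequencies: most items fall in one category, so chance agreement is already high.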

  20. Cell-phone vs microphone recordings: Judging emotion in the voice.

    PubMed

    Green, Joshua J; Eigsti, Inge-Marie

    2017-09-01

    Emotional states can be conveyed by vocal cues such as pitch and intensity. Despite the ubiquity of cellular telephones, there is limited information on how vocal emotional states are perceived during cell-phone transmissions. Emotional utterances (neutral, happy, angry) were elicited from two female talkers and simultaneously recorded via microphone and cell-phone. Ten-step continua (neutral to happy, neutral to angry) were generated using the STRAIGHT algorithm. Analyses compared reaction time (RT) and emotion judgment as a function of recording type (microphone vs cell-phone). Logistic regression revealed no judgment differences between recording types, though there were interactions with emotion type. Multi-level model analyses indicated that RT data were best fit by a quadratic model, with slower RT at the middle of each continuum, suggesting greater ambiguity, and slower RT for cell-phone stimuli across blocks. While preliminary, results suggest that critical acoustic cues to emotion are largely retained in cell-phone transmissions, though with effects of recording source on RT, and support the methodological utility of collecting speech samples by phone.
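    The quadratic RT pattern described above (an inverted U, with the slowest responses near the ambiguous middle of each continuum) can be illustrated with a small fit. The RT values below are hypothetical, invented only to show the shape of such a fit, not data from the study:

```python
import numpy as np

# Continuum steps 1-10 (neutral -> happy); mean RTs in ms are HYPOTHETICAL,
# chosen only to mimic the reported pattern of slower responses near the
# ambiguous middle of the continuum.
steps = np.arange(1, 11)
rt = np.array([520.0, 545, 580, 620, 650, 655, 625, 585, 550, 525])

# Fit a quadratic model: rt ~ a*step**2 + b*step + c.
a, b, c = np.polyfit(steps, rt, 2)

# An inverted U shows up as a negative quadratic coefficient, with the
# vertex (-b / 2a) locating the slowest, most ambiguous continuum step.
peak_step = -b / (2 * a)
```

    In the study proper this shape was established with multi-level models over single-trial RTs rather than a simple polynomial fit over means.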

  1. Cortical network architecture for context processing in primate brain

    PubMed Central

    Chao, Zenas C; Nagasaka, Yasuo; Fujii, Naotaka

    2015-01-01

    Context is information linked to a situation that can guide behavior. In the brain, context is encoded by sensory processing and can later be retrieved from memory. How context is communicated within the cortical network in sensory and mnemonic forms is unknown due to the lack of methods for high-resolution, brain-wide neuronal recording and analysis. Here, we report the comprehensive architecture of a cortical network for context processing. Using hemisphere-wide, high-density electrocorticography, we measured large-scale neuronal activity from monkeys observing videos of agents interacting in situations with different contexts. We extracted five context-related network structures including a bottom-up network during encoding and, seconds later, cue-dependent retrieval of the same network with the opposite top-down connectivity. These findings show that context is represented in the cortical network as distributed communication structures with dynamic information flows. This study provides a general methodology for recording and analyzing cortical network neuronal communication during cognition. DOI: http://dx.doi.org/10.7554/eLife.06121.001 PMID:26416139

  2. Using exceedance probabilities to detect anomalies in routinely recorded animal health data, with particular reference to foot-and-mouth disease in Viet Nam.

    PubMed

    Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L

    2014-10-01

    The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
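    The exceedance-probability idea can be sketched without the paper's full hierarchical spatio-temporal model: given posterior samples of an area's infection probability, simulate replicate counts and ask how often a replicate falls at or below the observed count. A tail probability near 0 or 1 flags an anomalous record. All numbers below are hypothetical, not the Viet Nam data.

```python
import random

random.seed(42)

def tail_probability(observed, n_trials, p_samples):
    """Posterior predictive P(replicate <= observed): for each posterior
    sample of the infection probability, draw a binomial replicate count
    and compare it with the observed count. Values near 0 or 1 flag
    anomalies (the exceedance-probability idea, lower tail here)."""
    hits = 0
    for p in p_samples:
        replicate = sum(random.random() < p for _ in range(n_trials))
        if replicate <= observed:
            hits += 1
    return hits / len(p_samples)

# Hypothetical posterior samples for one province's FMD infection probability
posterior_p = [min(max(random.gauss(0.30, 0.03), 0.0), 1.0) for _ in range(2000)]

typical = tail_probability(observed=30, n_trials=100, p_samples=posterior_p)
too_low = tail_probability(observed=10, n_trials=100, p_samples=posterior_p)
# A mid-range value is unremarkable; a value near 0 suggests the province
# reported far fewer outbreaks than the model expects, as in the abstract.
```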

  3. Expanding the Security Dimension of Surety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SENGLAUB, MICHAEL E.

    1999-10-01

    A small effort was conducted at Sandia National Laboratories to explore the use of a number of modern analytic technologies in the assessment of terrorist actions and to predict trends. This work focuses on Bayesian networks as a means of capturing correlations between groups, tactics, and targets. The data that was used as a test of the methodology was obtained by using a special parsing algorithm written in JAVA to create records in a database from information articles captured electronically. As a vulnerability assessment technique the approach proved very useful. The technology also proved to be a valuable development medium because of the ability to integrate blocks of information into a deployed network rather than waiting to fully deploy only after all relevant information has been assembled.

  4. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Does the patient‐held record improve continuity and related outcomes in cancer care: a systematic review

    PubMed Central

    Gysels, Marjolein; Richardson, Alison; Higginson, Irene J.

    2006-01-01

    Abstract Objectives  To assess the effectiveness of the patient‐held record (PHR) in cancer care. Background  Patients with cancer may receive care from different services resulting in gaps. A PHR could provide continuity and patient involvement in care. Search strategy  Relevant literature was identified through five electronic databases (Medline, Embase, Cinahl, CCTR and CDSR) and hand searches. Inclusion criteria  Patient‐held records in cancer care with the purpose of improving communication and information exchange between and within different levels of care and to promote continuity of care and patients’ involvement in their own care. Data extraction and synthesis  Data extraction recorded characteristics of intervention, type of study and factors that contributed to methodological quality of individual studies. Data were then contrasted by setting, objectives, population, study design, outcome measures and changes in outcome, including knowledge, satisfaction, anxiety and depression. Methodological quality of randomized controlled trials and non‐experimental studies was assessed with separate standard grading scales. Main results and conclusions  Seven randomized controlled trials and six non‐experimental studies were identified. Evaluations of the PHR have reached equivocal findings. Randomized trials found an absence of effect, while non‐experimental evaluations shed light on the conditions for its successful use. Most patients welcomed introduction of a PHR. Main problems related to its suitability for different patient groups and the lack of agreement between patients and health professionals regarding its function. Further research is required to determine the conditions under which the PHR can realize its potential as a tool to promote continuity of care and patient participation. PMID:17324196

  6. Identifying social factors amongst older individuals in linked electronic health records: An assessment in a population based study

    PubMed Central

    van Hoek, Albert J.; Walker, Jemma L.; Mathur, Rohini; Smeeth, Liam; Thomas, Sara L.

    2017-01-01

    Identification and quantification of health inequities amongst specific social groups is a pre-requisite for designing targeted healthcare interventions. This study investigated the recording of social factors in linked electronic health records (EHR) of individuals aged ≥65 years, to assess the potential of these data to identify the social determinants of disease burden and uptake of healthcare interventions. Methodology was developed for ascertaining social factors recorded on or before a pre-specified index date (01/01/2013) using primary care data from the Clinical Practice Research Datalink (CPRD) linked to hospitalisation and deprivation data in a cross-sectional study. Social factors included: religion, ethnicity, immigration status, small area-level deprivation, place of residence (including communal establishments such as care homes), marital status and living arrangements (e.g. living alone, cohabitation). Each social factor was examined for: completeness of recording, including improvements in completeness by using other linked EHR; timeliness of recording for factors that might change over time; and representativeness (compared with English 2011 Census data when available). Data for 591,037 individuals from 389 practices in England were analysed. The completeness of recording varied from 1.6% for immigration status to ~80% for ethnicity. Linkages provided the deprivation data (available for 82% of individuals) and improved completeness of ethnicity recording from 55% to 79% (when hospitalisation data were added). Data for ethnicity, deprivation, living arrangements and care home residence were comparable to the Census data. For time-varying variables such as residence and living alone, ~60% and ~35%, respectively, of those with available data had this information recorded within the last 5 years of the index date. 
This work provides methods to identify social factors in EHR relevant to older individuals and shows that factors such as ethnicity, deprivation, not living alone, cohabitation and care home residence can be ascertained using these data. Applying these methodologies to routinely collected data could improve surveillance programmes and allow assessment of health equity in specific healthcare studies. PMID:29190680
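    The completeness, linkage-improvement and timeliness checks described above can be sketched on toy records: count non-missing values for a field, fill gaps from a linked source, and test whether a time-varying value was recorded within five years of the index date. The records below are invented; CPRD data are not structured this simply.

```python
from datetime import date

INDEX_DATE = date(2013, 1, 1)  # the study's pre-specified index date

# Hypothetical per-patient primary care records; None marks a missing value
primary_care = [
    {"ethnicity": "White", "ethnicity_date": date(2010, 5, 1)},
    {"ethnicity": None,    "ethnicity_date": None},
    {"ethnicity": None,    "ethnicity_date": None},
    {"ethnicity": "Asian", "ethnicity_date": date(2012, 7, 9)},
]
# Linked hospitalisation data (same patient order) fills some gaps
hospital = [None, "Black", None, None]

def completeness(values):
    """Fraction of patients with a non-missing value for the field."""
    return sum(v is not None for v in values) / len(values)

before = completeness([r["ethnicity"] for r in primary_care])
merged = [r["ethnicity"] or h for r, h in zip(primary_care, hospital)]
after = completeness(merged)  # linkage improves completeness

# Timeliness: recorded within ~5 years before the index date
recent = [
    r["ethnicity_date"] is not None
    and (INDEX_DATE - r["ethnicity_date"]).days <= 5 * 365
    for r in primary_care
]
```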

  7. Geographic Information Systems to Assess External Validity in Randomized Trials.

    PubMed

    Savoca, Margaret R; Ludwig, David A; Jones, Stedman T; Jason Clodfelter, K; Sloop, Joseph B; Bollhalter, Linda Y; Bertoni, Alain G

    2017-08-01

    To support claims that RCTs can reduce health disparities (i.e., are translational), it is imperative that methodologies exist to evaluate the tenability of external validity in RCTs when probabilistic sampling of participants is not employed. Typically, attempts at establishing post hoc external validity are limited to a few comparisons across convenience variables, which must be available in both sample and population. A Type 2 diabetes RCT was used as an example of a method that uses a geographic information system to assess external validity in the absence of an a priori probabilistic community-wide diabetes risk sampling strategy. A geographic information system, 2009-2013 county death certificate records, and 2013-2014 electronic medical records were used to identify community-wide diabetes prevalence. Color-coded diabetes density maps provided visual representation of these densities. A chi-square goodness-of-fit test assessed the degree to which the distribution of RCT participants varied across density classes compared to what would be expected, given simple random sampling of the county population. Analyses were conducted in 2016. Diabetes prevalence areas as represented by death certificate and electronic medical records were distributed similarly. The simple random sample model was not a good fit for death certificate record (chi-square, 17.63; p=0.0001) or electronic medical record data (chi-square, 28.92; p<0.0001). Generally, RCT participants were oversampled in high-diabetes density areas. Location is a highly reliable "principal variable" associated with health disparities. It serves as a directly measurable proxy for high-risk underserved communities, thus offering an effective and practical approach for examining external validity of RCTs. Copyright © 2017 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
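    The core test above is a chi-square goodness of fit of participant counts across density classes against the proportions expected under simple random sampling. A minimal sketch with hypothetical counts (not the study's data):

```python
def chi_square_gof(observed, expected_props):
    """Pearson chi-square goodness-of-fit statistic: does the observed
    distribution of counts across classes match the proportions expected
    under simple random sampling of the population?"""
    total = sum(observed)
    stat = 0.0
    for obs, prop in zip(observed, expected_props):
        exp = total * prop
        stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts of RCT participants in low/medium/high diabetes
# density classes, versus each class's share of the county population
participants = [20, 30, 50]          # oversampled in high-density areas
population_share = [0.5, 0.3, 0.2]

stat = chi_square_gof(participants, population_share)
# Compare stat against the chi-square critical value with k-1 = 2 df
# (5.99 at alpha = 0.05); a large statistic rejects simple random sampling.
```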

  8. 'Seed + expand': a general methodology for detecting publication oeuvres of individual researchers.

    PubMed

    Reijnhoudt, Linda; Costas, Rodrigo; Noyons, Ed; Börner, Katy; Scharnhorst, Andrea

    2014-01-01

    The study of science at the individual scholar level requires the disambiguation of author names. The creation of author's publication oeuvres involves matching the list of unique author names to names used in publication databases. Despite recent progress in the development of unique author identifiers, e.g., ORCID, VIVO, or DAI, author disambiguation remains a key problem when it comes to large-scale bibliometric analysis using data from multiple databases. This study introduces and tests a new methodology called seed + expand for semi-automatic bibliographic data collection for a given set of individual authors. Specifically, we identify the oeuvre of a set of Dutch full professors during the period 1980-2011. In particular, we combine author records from a Dutch National Research Information System (NARCIS) with publication records from the Web of Science. Starting with an initial list of 8,378 names, we identify 'seed publications' for each author using five different approaches. Subsequently, we 'expand' the set of publications using three different approaches. The different approaches are compared and the resulting oeuvres are evaluated on precision and recall using a 'gold standard' dataset of authors for which verified publications in the period 2001-2010 are available.
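    A hedged sketch of the 'seed + expand' idea on toy records: seed an oeuvre with high-confidence matches (here, name plus a known affiliation), expand it via shared coauthors, then score precision and recall against a gold standard. The matching rules and records below are illustrative simplifications, not the paper's actual five seed and three expand approaches.

```python
# Hypothetical publication records: (id, author string, affiliation, coauthors)
records = [
    (1, "Jansen, J", "Leiden",  {"Smith"}),
    (2, "Jansen, J", "Leiden",  {"Brown"}),
    (3, "Jansen, J", "Utrecht", {"Smith"}),
    (4, "Jansen, J", "Utrecht", {"Lee"}),
    (5, "Jansen, J", "Leiden",  {"Lee"}),
]

# Seed step: strict match on name plus a known affiliation
seeds = {rid for rid, name, affil, _ in records
         if name == "Jansen, J" and affil == "Leiden"}

# Expand step: add same-name records sharing a coauthor with any seed
seed_coauthors = set().union(*(co for rid, _, _, co in records if rid in seeds))
oeuvre = seeds | {rid for rid, name, _, co in records
                  if name == "Jansen, J" and co & seed_coauthors}

gold = {1, 2, 3, 5}  # hypothetical verified oeuvre for this author
precision = len(oeuvre & gold) / len(oeuvre)
recall = len(oeuvre & gold) / len(gold)
```

    As in the paper's evaluation, expansion trades precision for recall: here one homonym's paper (id 4) is pulled in by a shared coauthor.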

  9. Investigating Perceptual Biases, Data Reliability, and Data Discovery in a Methodology for Collecting Speech Errors From Audio Recordings.

    PubMed

    Alderete, John; Davies, Monica

    2018-04-01

    This work describes a methodology of collecting speech errors from audio recordings and investigates how some of its assumptions affect data quality and composition. Speech errors of all types (sound, lexical, syntactic, etc.) were collected by eight data collectors from audio recordings of unscripted English speech. Analysis of these errors showed that: (i) different listeners find different errors in the same audio recordings, but (ii) the frequencies of error patterns are similar across listeners; (iii) errors collected "online" using on the spot observational techniques are more likely to be affected by perceptual biases than "offline" errors collected from audio recordings; and (iv) datasets built from audio recordings can be explored and extended in a number of ways that traditional corpus studies cannot be.

  10. Connectivity Measures in EEG Microstructural Sleep Elements.

    PubMed

    Sakellariou, Dimitris; Koupparis, Andreas M; Kokkinos, Vasileios; Koutroumanidis, Michalis; Kostopoulos, George K

    2016-01-01

    During Non-Rapid Eye Movement sleep (NREM) the brain is relatively disconnected from the environment, while connectedness between brain areas is also decreased. Evidence indicates that these dynamic connectivity changes are delivered by microstructural elements of sleep: short periods of environmental stimuli evaluation followed by sleep-promoting procedures. The connectivity patterns of the latter, among other aspects of sleep microstructure, are still to be fully elucidated. We suggest here a methodology for the assessment and investigation of the connectivity patterns of EEG microstructural elements, such as sleep spindles. The methodology combines techniques at the preprocessing, estimation, error-assessment, and visualization levels in order to allow the detailed examination of the connectivity aspects (levels and directionality of information flow) over frequency and time with notable resolution, while dealing with volume conduction and EEG reference assessment. The high temporal and frequency resolution of the methodology will allow the association between the microelements and the dynamically forming networks that characterize them, and consequently possibly reveal aspects of the EEG microstructure. The proposed methodology is initially tested on artificially generated signals for proof of concept and subsequently applied to real EEG recordings via a custom-built MATLAB-based tool developed for such studies. Preliminary results from 843 fast sleep spindles recorded in whole-night sleep of 5 healthy volunteers indicate a prevailing pattern of interactions between centroparietal and frontal regions. We demonstrate hereby a first attempt, to our knowledge, to estimate the scalp EEG connectivity that characterizes fast sleep spindles via an "EEG-element connectivity" methodology we propose. 
Its application, via the computational tool we developed, suggests that it can investigate the connectivity patterns related to the occurrence of EEG microstructural elements. Network characterization of specified physiological or pathological EEG microstructural elements can potentially be of great importance in the understanding, identification, and prediction of health and disease.
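    As a much cruder stand-in for the directed connectivity estimation described above (which is frequency-resolved and well beyond a short sketch), the lag maximizing cross-correlation between two channels can hint at which one leads. The signals below are synthetic, and this is not the paper's estimator.

```python
import math

def best_lag(x, y, max_lag):
    """Lag (in samples) at which y best matches x shifted forward; a
    positive lag suggests x leads y (a crude directionality proxy)."""
    def corr_at(lag):
        pairs = [(x[i], y[i + lag]) for i in range(len(x))
                 if 0 <= i + lag < len(y)]
        n = len(pairs)
        mx = sum(a for a, _ in pairs) / n
        my = sum(b for _, b in pairs) / n
        num = sum((a - mx) * (b - my) for a, b in pairs)
        dx = sum((a - mx) ** 2 for a, _ in pairs) ** 0.5
        dy = sum((b - my) ** 2 for _, b in pairs) ** 0.5
        return num / (dx * dy)
    return max(range(-max_lag, max_lag + 1), key=corr_at)

# Synthetic 12 Hz "spindle-band" oscillation at 200 Hz sampling;
# channel B is channel A delayed by 5 samples (25 ms)
fs = 200
chan_a = [math.sin(2 * math.pi * 12 * t / fs) for t in range(400)]
chan_b = [0.0] * 5 + chan_a[:-5]
lag = best_lag(chan_a, chan_b, max_lag=8)  # positive: A leads B
```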

  12. Construction of a database for published phase II/III drug intervention clinical trials for the period 2009-2014 comprising 2,326 records, 90 disease categories, and 939 drug entities.

    PubMed

    Jeong, Sohyun; Han, Nayoung; Choi, Boyoon; Sohn, Minji; Song, Yun-Kyoung; Chung, Myeon-Woo; Na, Han-Sung; Ji, Eunhee; Kim, Hyunah; Rhew, Ki Yon; Kim, Therasa; Kim, In-Wha; Oh, Jung Mi

    2016-06-01

    To construct a database of published clinical drug trials suitable for use 1) as a research tool in accessing clinical trial information and 2) in evidence-based decision-making by regulatory professionals, clinical research investigators, and medical practitioners. Comprehensive information was obtained from a search of design elements and results of clinical trials in peer-reviewed journals using PubMed (http://www.ncbi.nlm.nih.gov/pubmed). The methodology to develop a structured database was devised by a panel composed of experts in medicine, pharmaceuticals, and information technology, and members of the Ministry of Food and Drug Safety (MFDS), using a step-by-step approach. A double-sided system consisting of user mode and manager mode served as the framework for the database; elements of interest from each trial were entered via the secure manager mode, enabling the input information to be accessed in a user-friendly manner (user mode). Information regarding the methodology used and results of drug treatment was extracted as detail elements of each data set and then inputted into the web-based database system. Comprehensive information comprising 2,326 clinical trial records, 90 disease states, and 939 drug entities, and concerning study objectives, background, methods used, results, and conclusions, could be extracted from published information on phase II/III drug intervention clinical trials appearing in SCI journals within the last 10 years. The extracted data were successfully assembled into a clinical drug trial database with easy access, suitable for use as a research tool. The clinically most important therapeutic categories, i.e., cancer, cardiovascular, respiratory, neurological, metabolic, urogenital, gastrointestinal, psychological, and infectious diseases, were covered by the database. Names of test and control drugs, details on primary and secondary outcomes, and indexed keywords could also be retrieved and built into the database. 
The construction used in the database enables the user to sort and download targeted information as a Microsoft Excel spreadsheet. Because of the comprehensive and standardized nature of the clinical drug trial database and its ease of access, it should serve as a valuable information repository and research tool for accessing clinical trial information and making evidence-based decisions by regulatory professionals, clinical research investigators, and medical practitioners.
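    A minimal sketch of the sort-and-download step, assuming hypothetical record fields and writing CSV text in place of the database's Excel export:

```python
import csv
import io

# Hypothetical trial records mirroring a few of the database's core fields
trials = [
    {"disease": "cancer", "drug": "drugA", "phase": "III", "primary_outcome": "OS"},
    {"disease": "cardiovascular", "drug": "drugB", "phase": "II", "primary_outcome": "LDL"},
    {"disease": "cancer", "drug": "drugC", "phase": "II", "primary_outcome": "PFS"},
]

def export_category(records, disease, fields=("drug", "phase", "primary_outcome")):
    """Filter records by disease category and render them as CSV text,
    a stand-in for the database's spreadsheet download."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(fields))
    writer.writeheader()
    for rec in records:
        if rec["disease"] == disease:
            writer.writerow({f: rec[f] for f in fields})
    return buf.getvalue()

cancer_csv = export_category(trials, "cancer")  # only the cancer trials
```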

  13. A systematic review of the quality of homeopathic pathogenetic trials published from 1945 to 1995.

    PubMed

    Dantas, F; Fisher, P; Walach, H; Wieland, F; Rastogi, D P; Teixeira, H; Koster, D; Jansen, J P; Eizayaga, J; Alvarez, M E P; Marim, M; Belon, P; Weckx, L L M

    2007-01-01

    The quality of information gathered from homeopathic pathogenetic trials (HPTs), also known as 'provings', is fundamental to homeopathy. We systematically reviewed HPTs published in six languages (English, German, Spanish, French, Portuguese and Dutch) from 1945 to 1995, to assess their quality in terms of the validity of the information they provide. The literature was comprehensively searched; only published reports of HPTs were included. Information was extracted by two reviewers per trial using a form with 87 items. Information on medicines, volunteers, ethical aspects, blinding, randomization, use of placebo, adverse effects, assessments, presentation of data and number of claimed findings was recorded. Methodological quality was assessed by an index including indicators of internal and external validity, personal judgement and comments of reviewers for each study. 156 HPTs on 143 medicines, involving 2815 volunteers, produced 20,538 pathogenetic effects (median 6.5 per volunteer). There was wide variation in methods and results. Sample size (median 15, range 1-103) and trial duration (mean 34 days) were very variable. Most studies had design flaws, particularly absence of proper randomization, blinding, placebo control and criteria for analysis of outcomes. Mean methodological score was 5.6 (range 4-16). More symptoms were reported from HPTs of poor quality than from better ones. In 56% of trials volunteers took placebo. Pathogenetic effects were claimed in 98% of publications. On average about 84% of volunteers receiving active treatment developed symptoms. The quality of reports was in general poor, and much important information was not available. The HPTs were generally of low methodological quality. There is a high incidence of pathogenetic effects in publications and volunteers but this could be attributable to design flaws. Homeopathic medicines, tested in HPTs, appear safe. 
The central question of whether homeopathic medicines in high dilutions can provoke effects in healthy volunteers has not yet been definitively answered, because of methodological weaknesses of the reports. Improvement of the method and reporting of results of HPTs are required. References to all included RCTs are available on-line at.

  14. Machine Readable Bibliographic Records: Criteria and Creation.

    ERIC Educational Resources Information Center

    Bregzis, Ritvars

    The centrality of bibliographic records in library automation, objectives of the bibliographic record file and elemental factors involved in bibliographic record creation are discussed. The practical work of creating bibliographic records involves: (1) data base environment, (2) technical aspects, (3) cost and (4) operational methodology. The…

  15. Ontological Standardization for Historical Map Collections: Studying the Greek Borderlines of 1881

    NASA Astrophysics Data System (ADS)

    Gkadolou, E.; Tomai, E.; Stefanakis, E.; Kritikos, G.

    2012-07-01

    Historical maps deliver valuable historical information which is applicable in several domains while they document the spatiotemporal evolution of the geographical entities that are depicted therein. In order to use the historical cartographic information effectively, the maps' semantic documentation becomes a necessity for resolving any semantic ambiguities and structuring the relationship between historical and current geographical space. This paper examines cartographic ontologies as a proposed methodology and presents the first outcomes of the methodology applied to the historical map series «Carte de la nouvelle frontière Turco-Grecque» that sets the borderlines between Greece and the Ottoman Empire in 1881. The map entities were modelled and compared to the current ones so as to record the changes in their spatial and thematic attributes, and an ontology was developed in the Protégé OWL Editor 3.4.4 for the attributes that thoroughly define a historical map and the digitised spatial entities. Special focus was given to the Greek borderline and the changes that it caused to other geographic entities.

  16. Methodological aspects of EEG and body dynamics measurements during motion

    PubMed Central

    Reis, Pedro M. R.; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias

    2014-01-01

    EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp which originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior. This is because EEG is relatively cheap, easy to wear, lightweight and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements that are performed in response to the environment. However, there are methodological difficulties which can occur when recording EEG during movement, such as movement artifacts. Thus, most studies about the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, plus hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for determining real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858

  17. Relationships between healthcare and research records.

    PubMed

    Duwe, Helmut

    2004-01-01

    The ultimate end-point of healthcare and health-related life sciences, more or less as a regulatory idea, is the prevention and cure of diseases, considering the fate of individual patients as well as the challenge of providing sufficient care for all. However, all undertakings stand under the "proviso of rightness of action". The movement of evidence-based medicine has triggered a renaissance of systematic self-assurance of best practice. The systematic utilization of healthcare records and research study recordings in an inter-linked manner provides a better enabling environment to improve evidence. Good e-health must contribute to accumulating inter-generational clinical experience. Qualified research should ensure methodological strictness via gold standards like controlled, randomised and masked trials. Building information systems for e-health as well as for e-science has as a special focus the mutual cross-fertilization of these application domains. Analysing a variety of building blocks shows that both areas can benefit from generic solution patterns, keeping in mind that each domain has distinguished knowledge realms. Generic patterns as well as distinguished special features are illustrated by analysing state-of-the-art solutions plus some experimental approaches, such as the generic part of the HL7 V3 RIM, the RCRIM work, laboratory information handling, and vital-sign standardization efforts like ECG information models. Finally, the precision of the usage of the ubiquitous term "metadata" is taken as an example of an open issue.

  18. Video analysis for insight and coding: Examples from tutorials in introductory physics

    NASA Astrophysics Data System (ADS)

    Scherr, Rachel E.

    2009-12-01

    The increasing ease of video recording offers new opportunities to create richly detailed records of classroom activities. These recordings, in turn, call for research methodologies that balance generalizability with interpretive validity. This paper shares methodology for two practices of video analysis: (1) gaining insight into specific brief classroom episodes and (2) developing and applying a systematic observational protocol for a relatively large corpus of video data. These two aspects of analytic practice are illustrated in the context of a particular research interest but are intended to serve as general suggestions.

  19. Clinical decision support systems for improving diagnostic accuracy and achieving precision medicine.

    PubMed

    Castaneda, Christian; Nalley, Kip; Mannion, Ciaran; Bhattacharyya, Pritish; Blake, Patrick; Pecora, Andrew; Goy, Andre; Suh, K Stephen

    2015-01-01

    As research laboratories and clinics collaborate to achieve precision medicine, both communities are required to understand mandated electronic health/medical record (EHR/EMR) initiatives that will be fully implemented in all clinics in the United States by 2015. Stakeholders will need to evaluate current record keeping practices and optimize and standardize methodologies to capture nearly all information in digital format. Collaborative efforts from academic and industry sectors are crucial to achieving higher efficacy in patient care while minimizing costs. Currently existing digitized data and information are present in multiple formats and are largely unstructured. In the absence of a universally accepted management system, departments and institutions continue to generate silos of information. As a result, invaluable and newly discovered knowledge is difficult to access. To accelerate biomedical research and reduce healthcare costs, clinical and bioinformatics systems must employ common data elements to create structured annotation forms enabling laboratories and clinics to capture sharable data in real time. Conversion of these datasets to knowable information should be a routine institutionalized process. New scientific knowledge and clinical discoveries can be shared via integrated knowledge environments defined by flexible data models and extensive use of standards, ontologies, vocabularies, and thesauri. In the clinical setting, aggregated knowledge must be displayed in user-friendly formats so that physicians, non-technical laboratory personnel, nurses, data/research coordinators, and end-users can enter data, access information, and understand the output. The effort to connect astronomical numbers of data points, including '-omics'-based molecular data, individual genome sequences, experimental data, patient clinical phenotypes, and follow-up data is a monumental task. 
Roadblocks to this vision of integration and interoperability include ethical, legal, and logistical concerns. Ensuring data security and protection of patient rights while simultaneously facilitating standardization is paramount to maintaining public support. The capabilities of supercomputing need to be applied strategically. A standardized, methodological approach must be applied to the development of artificial intelligence systems able to integrate data and information into clinically relevant knowledge. Ultimately, the integration of bioinformatics and clinical data in a clinical decision support system promises precision medicine and cost-effective, personalized patient care.

  20. Cleaning up the paper trail - our clinical notes in open view.

    PubMed

    Lambe, Gerard; Linnane, Niall; Callanan, Ian; Butler, Marcus W

    2018-04-16

    Purpose Ireland's physicians have a legal and an ethical duty to protect confidential patient information. Most healthcare records in Ireland remain paper based, so the purpose of this paper is to: assess the protection afforded to paper records; log the highest-risk records; note the variations that occurred during the working week; and observe the varying protection that occurred when staff, students and members of the public were present. Design/methodology/approach A customised audit tool was created using Sphinx software. Data were collected for three months. All wards included in the study were visited once during four discrete time periods across the working week. The medical records trolley's location was noted, and the totals of unattended medical records, unattended nursing records and unattended patient lists were recorded, along with whether nursing personnel, medical students, members of the public and a ward secretary were visibly present. Findings During the 84 occasions on which the authors visited wards, unattended medical records were identified on 33 per cent of occasions; 49 per cent were found during weekend visiting hours and just 4 per cent during morning rounds. The unattended medical records belonged to patients admitted to a medical specialty in 73 per cent of cases and a surgical specialty in 27 per cent. Medical records were found unattended in the nurses' station with much greater frequency when the ward secretary was off duty. Unattended nursing records were identified on 67 per cent of occasions the authors visited the ward and were most commonly found unattended in groups of six or more. Practical implications This study is a timely reminder that confidential patient information is at risk of inappropriate disclosure in the hospital. There are few context-specific data protection standards to guide healthcare professionals, particularly for paper records.
Nursing records are left unattended with twice the frequency of medical records and are found unattended in greater numbers than medical records. Protection is strongest when ward secretaries are on duty. Over-reliance on vigilant ward secretaries could represent a threat to confidential patient information. Originality/value While other studies have identified data protection as an issue, this study assesses how data security varies inside and outside conventional working hours. It provides a rationale and an impetus for specific changes across the whole working week. By identifying the on-duty ward secretary's favourable effect on medical record security, it highlights the need for alternative arrangements when the ward secretary is off duty. Data were collected prospectively in real time, giving a more accurate snapshot of healthcare record security at each data-collection point.

  1. 19 CFR 111.23 - Retention of records.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...

  2. 19 CFR 111.23 - Retention of records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...

  3. 19 CFR 111.23 - Retention of records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...

  4. Multisensor system and artificial intelligence in housing for the elderly.

    PubMed

    Chan, M; Bocquet, H; Campo, E; Val, T; Estève, D; Pous, J

    1998-01-01

    To improve the safety of a growing proportion of elderly and disabled people in developed countries, a multisensor system based on Artificial Intelligence (AI), Advanced Telecommunications (AT) and Information Technology (IT) has been devised and fabricated. The habits and behaviours of these populations can thus be recorded without disturbing their daily activities. AI techniques diagnose any abnormal behaviour or change, and the system warns the relevant professionals. Gerontology issues are presented together with the multisensor system, the AI-based learning and diagnosis methodology, and the main functionalities.

  5. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary Objectives Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods As the vast stores of genomic information increase with next-generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and the adoption of Electronic Health Records (EHRs) in medicine provide a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health-related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. Big data approaches are described that enable scalable markup of EHR events, which can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions Stead and colleagues’ 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744

  6. Nursing record systems: effects on nursing practice and health care outcomes.

    PubMed

    Currell, R; Wainwright, P; Urquhart, C

    2000-01-01

    A nursing record system is the record of care planned and/or given to individual patients/clients by qualified nurses or other caregivers under the direction of a qualified nurse. Nursing record systems may be an effective way of influencing nursing practice. To assess the effects of nursing record systems on nursing practice and patient outcomes. We searched The Cochrane Library, MEDLINE, CINAHL, SIGLE, and databases of the Royal College of Nursing, the King's Fund, the NHS Centre for Reviews and Dissemination, and the Institution of Electrical Engineers up to August 1999; and OCLC FirstSearch, the Department of Health database, the NHS Register of Computer Applications and the Health Visitors' Association database up to the end of 1995. We hand searched the Journal of Nursing Administration (1971-1999), Computers in Nursing (1984-1999), Information Technology in Nursing (1989-1999) and reference lists of articles. We also hand searched the major health informatics conference proceedings. We contacted experts in the field of nursing informatics, suppliers of nursing computer systems, and relevant Internet groups. Randomised trials, controlled before-and-after studies and interrupted time series comparing one kind of nursing record system with another, in hospital, community or primary care settings, were eligible. The participants were qualified nurses, students or health care assistants working under the direction of a qualified nurse, and patients receiving care recorded and/or planned using nursing record systems. Two reviewers independently assessed trial quality and extracted data. Six trials involving 1407 people were included. In three studies of client-held records, there were no overall positive or negative effects, although some administrative benefits through fewer missing notes were suggested. A paediatric pain management sheet study showed a positive effect on the children's pain intensity.
A computerised nursing care planning study showed a negative effect on documented nursing care planning. A controlled before-and-after study of two paper nursing record systems showed improvement in meeting documentation standards. No evidence was found of effects on practice attributable to changes in record systems. Although there is a paucity of studies of sufficient methodological rigour to yield reliable results in this area, it is clear from the literature that it is possible to set up the randomised trials or other quasi-experimental designs needed to produce evidence for practice. The research undertaken so far may have suffered from both methodological problems and faulty hypotheses.

  7. The effect of information technology on hospital performance.

    PubMed

    Williams, Cynthia; Asi, Yara; Raffenaud, Amanda; Bagwell, Matt; Zeini, Ibrahim

    2016-12-01

    While healthcare entities have integrated various forms of health information technology (HIT) into their systems due to claims of increased quality and decreased costs, as well as various incentives, there is little available information about which applications of HIT are actually the most beneficial and efficient. In this study, we aim to assist administrators in understanding the characteristics of top-performing hospitals. We utilized data from the Healthcare Information and Management Systems Society and the Centers for Medicare and Medicaid Services to assess 1039 hospitals. Inputs considered were full-time equivalents, hospital size, and technology inputs. Technology inputs included personal health records (PHRs), electronic medical records (EMRs), computerized physician order entry systems (CPOEs), and electronic access to diagnostic results. Output variables were measures of quality, hospital readmission and mortality rates. The analysis was conducted with a two-stage methodology: Data Envelopment Analysis (DEA) followed by Automatic Interaction Detector (AID) decision-tree regression. Overall, we found that electronic access to diagnostic results was the most influential technological characteristic; however, organizational characteristics were more important than technological inputs. Hospitals with the highest levels of quality indicated no excess in the use of technology inputs, averaging one use of a technology component. This study indicates that prudent consideration of organizational characteristics and technology is needed before investing in innovative programs.

  8. Health information management: an introduction to disease classification and coding.

    PubMed

    Mony, Prem Kumar; Nagaraj, C

    2007-01-01

    Morbidity and mortality data constitute an important component of a health information system, and their coding enables uniform data collation and analysis as well as meaningful comparisons between regions or countries. Strengthening the recording and reporting systems for health monitoring is a basic requirement for an efficient health information management system. Increased advocacy for and awareness of a uniform coding system, together with adequate capacity building of physicians, coders and other allied health and information technology personnel, would pave the way for a valid and reliable health information management system in India. The core requirements for the implementation of disease coding are: (i) support from national/institutional health administrators; (ii) widespread availability of the ICD-10 material for morbidity and mortality coding; (iii) enhanced human and financial resources; and (iv) optimal use of informatics. We describe the methodology of a disease classification and codification system, as well as its applications for developing and maintaining an effective health information management system for India.

  9. Toward Proper Authentication Methods in Electronic Medical Record Access Compliant to HIPAA and C.I.A. Triangle.

    PubMed

    Tipton, Stephen J; Forkey, Sara; Choi, Young B

    2016-04-01

    This paper examines various methods encompassing the authentication of users accessing Electronic Medical Records (EMRs). From a methodological perspective, multiple authentication methods have been researched from both desktop and mobile accessibility perspectives. Each method is investigated at a high level, along with comparative analyses and real-world examples. The projected outcome of this examination is a better understanding of the sophistication required in protecting the vital privacy constraints of an individual's Protected Health Information (PHI). In understanding the implications of protecting healthcare data in today's technological world, the scope of this paper is to provide an overview of confidentiality as it pertains to information security. In addressing this topic, a high-level overview of the three goals of information security is provided; in particular, the goal of confidentiality is the primary focus. Expanding upon the goal of confidentiality, legal aspects of healthcare accessibility are considered, with a focus upon the Health Insurance Portability and Accountability Act of 1996 (HIPAA). With the primary focus of this examination being access to EMRs, the paper considers two types of access of concern: access by a physician, or group of physicians, and access by an individual patient.

  10. Modelling Medications for Public Health Research

    PubMed Central

    van Gaans, D.; Ahmed, S.; D’Onise, K.; Moyon, J.; Caughey, G.; McDermott, R.

    2016-01-01

    Most patients with chronic disease are prescribed multiple medications, which are recorded in their personal health records. This is rich information for clinical public health researchers but also a challenge to analyse. This paper describes the method undertaken within the Public Health Research Data Management System (PHReDMS) to map medication data retrieved from individual patient health records for population health researchers' use. The PHReDMS manages clinical, health service, community and survey research data within a secure web environment that allows for data sharing amongst researchers. The PHReDMS is currently used by researchers to answer a broad range of questions, including monitoring of prescription patterns in different population groups and geographic areas with high incidence/prevalence of chronic renal, cardiovascular, metabolic and mental health issues. In this paper, we present the general notion of an abstraction network, a higher-level network that sits above a terminology and offers a compact and more easily understandable view of its content. We demonstrate the use of the abstraction network methodology to examine medication data from electronic medical records, allowing a compact and more easily understandable view of their content. PMID:28149446

  11. A pragmatic method for electronic medical record-based observational studies: developing an electronic medical records retrieval system for clinical research

    PubMed Central

    Yamamoto, Keiichi; Sumi, Eriko; Yamazaki, Toru; Asai, Keita; Yamori, Masashi; Teramukai, Satoshi; Bessho, Kazuhisa; Yokode, Masayuki; Fukushima, Masanori

    2012-01-01

    Objective The use of electronic medical record (EMR) data is necessary to improve clinical research efficiency. However, it is not easy to identify patients who meet research eligibility criteria and to collect the necessary information from EMRs, because the data collection process must integrate various techniques, including the development of a data warehouse and the translation of eligibility criteria into computable criteria. This research aimed to demonstrate an electronic medical records retrieval system (ERS) and an example of a hospital-based cohort study that identified both patients and exposure with an ERS. We also evaluated the feasibility and usefulness of the method. Design The system was developed and evaluated. Participants In total, 800 000 cases of clinical information stored in EMRs at our hospital were used. Primary and secondary outcome measures The feasibility and usefulness of the ERS, the method to convert text from eligibility criteria to computable criteria, and a confirmation method to increase research data accuracy. Results To comprehensively and efficiently collect information from patients participating in clinical research, we developed an ERS. To create the ERS database, we designed a multidimensional data model optimised for patient identification. We also devised practical methods to translate narrative eligibility criteria into computable parameters. We applied the system to an actual hospital-based cohort study performed at our hospital and converted the test results into computable criteria. Based on this information, we identified eligible patients and extracted the data necessary for confirmation by our investigators and for statistical analyses with our ERS. Conclusions We propose a pragmatic methodology to identify patients from EMRs who meet clinical research eligibility criteria.
Our ERS allowed for the efficient collection of information on the eligibility of a given patient, reduced the labour required from the investigators and improved the reliability of the results. PMID:23117567
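The translation of narrative eligibility criteria into computable parameters, as described above, can be illustrated with a minimal sketch in which a criterion such as "age 20-75 with HbA1c >= 6.5%" becomes a set of field tests applied to EMR rows. All field names, values, and patients below are hypothetical and not drawn from the study's actual ERS.

```python
# Hypothetical sketch: narrative eligibility criteria expressed as
# computable (field, predicate) pairs and applied to EMR-like rows.
criteria = [
    ("age", lambda v: 20 <= v <= 75),
    ("hba1c", lambda v: v >= 6.5),
]

patients = [
    {"id": 1, "age": 54, "hba1c": 7.1},
    {"id": 2, "age": 81, "hba1c": 6.9},  # fails the age criterion
    {"id": 3, "age": 43, "hba1c": 5.8},  # fails the HbA1c criterion
]

eligible = [p["id"] for p in patients
            if all(test(p[field]) for field, test in criteria)]
print(eligible)  # -> [1]
```

In a real ERS these tests would be compiled into queries against the multidimensional warehouse rather than run over in-memory rows.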

  12. Improving core outcome set development: qualitative interviews with developers provided pointers to inform guidance.

    PubMed

    Gargon, Elizabeth; Williamson, Paula R; Young, Bridget

    2017-06-01

    The objective of the study was to explore core outcome set (COS) developers' experiences of their work, to inform methodological guidance on COS development and to identify areas for future methodological research. Semistructured, audio-recorded interviews were conducted with a purposive sample of 32 COS developers. Analysis of transcribed interviews was informed by the constant comparative method and framework analysis. Developers found COS development to be challenging, particularly in relation to patient participation and accessing funding. Their accounts raised fundamental questions about the status of COS development and whether it is consultation or research. Developers emphasized how the absence of guidance had affected their work and identified areas where guidance or evidence about COS development would be useful, including patient participation, ethics, international development, and implementation. They particularly wanted guidance on systematic reviews, Delphi surveys, and consensus meetings. The findings raise important questions about the funding, status, and process of COS development and indicate ways that it could be strengthened. Guidance could help developers to strengthen their work, but over-specification could threaten quality in COS development. Guidance should therefore highlight common issues to consider and encourage tailoring of COS development to the context and circumstances of particular COS. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Coding Early Naturalists' Accounts into Long-Term Fish Community Changes in the Adriatic Sea (1800–2000)

    PubMed Central

    Fortibuoni, Tomaso; Libralato, Simone; Raicevich, Saša; Giovanardi, Otello; Solidoro, Cosimo

    2010-01-01

    The understanding of fish communities' changes over the past centuries has important implications for conservation policy and marine resource management. However, reconstructing these changes is difficult because information on marine communities before the second half of the 20th century is, in most cases, anecdotal and merely qualitative. Historical qualitative records and modern quantitative data are therefore not directly comparable, and their integration for long-term analyses is not straightforward. We developed a methodology that allows the coding of qualitative information provided by early naturalists into semi-quantitative information through an intercalibration with landing proportions. This approach allowed us to reconstruct and quantitatively analyze a 200-year-long time series of fish community structure indicators in the Northern Adriatic Sea (Mediterranean Sea). Our analysis provides evidence of long-term changes in fish community structure, including the decline of Chondrichthyes and of large-sized and late-maturing species. This work highlights the importance of broadening the time frame through which we look at marine ecosystem changes and provides a methodology to exploit, in a quantitative framework, historical qualitative sources. To this purpose, naturalists' eyewitness accounts proved useful for extending the analysis of fish communities back in time, well before the onset of field-based monitoring programs. PMID:21103349
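The coding step the authors describe, turning naturalists' qualitative abundance terms into semi-quantitative scores intercalibrated with landing proportions, might be sketched as follows. The term list, ordinal scale, and cutoffs are illustrative assumptions, not the study's actual calibration.

```python
# Hypothetical sketch: ordinal scores for naturalists' qualitative
# abundance terms, intercalibrated against modern landing proportions.
term_score = {"very rare": 1, "rare": 2, "common": 3, "very common": 4}

def score_account(text):
    """Return the ordinal abundance score for the first known term found,
    or None when the species is not mentioned (a missing value)."""
    text = text.lower()
    for term in sorted(term_score, key=len, reverse=True):  # longest first
        if term in text:
            return term_score[term]
    return None

def calibrate(landing_proportion, cutoffs=(0.001, 0.01, 0.1)):
    """Map a modern landing proportion onto the same 1-4 ordinal scale."""
    return 1 + sum(landing_proportion >= c for c in cutoffs)

assert score_account("Squalus is very common in these waters") == 4
print(calibrate(0.05))  # -> 3 ("common") under these illustrative cutoffs
```

With both historical accounts and modern landings expressed on one ordinal scale, the two eras become directly comparable in a single time series.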

  14. Comparing the application of Health Information Technology in primary care in Denmark and Andalucía, Spain.

    PubMed

    Protti, Denis; Johansen, Ib; Perez-Torres, Francisco

    2009-04-01

    It is generally acknowledged that Denmark is one of the leading countries, if not the leader, in the use of information technology by its primary care physicians. Other countries, notably excluding the United States and Canada, are also advanced in terms of electronic medical records in general practitioner offices and clinics. This paper compares the status of primary care physician office computing in Andalucía to that of Denmark by contrasting the functionality of electronic medical records (EMRs) and the ability to electronically communicate clinical information in both jurisdictions. A novel scoring system has been developed based on data gathered from databases held by the respective jurisdictional programs and on interviews with individuals involved in the deployment of the systems. The scoring methodology was first applied in a comparison of the degree of automation in primary care physician offices in Denmark and the province of Alberta in Canada. It was also used to compare Denmark and New Zealand. This paper is the third application of this method of scoring the adoption of electronic medical records in primary care office settings, which may hopefully be applicable to other health jurisdictions at national, state, or provincial levels. Although similar in many respects, there are significant differences between these two relatively autonomous health systems, which have led to differing rates of uptake of physician office computing. Particularly notable is the fact that Danish primary care physicians have individual "Electronic Medical Records", while in Andalucía the primary care physicians share a common record which, when secondary care is fully implemented, will indeed be an "Electronic Health Record". It is clear that the diffusion of technology within the primary care physician sector of the health care market is subject to historical, financial, legal, cultural, and social factors.
This tale of two places illustrates the issues and the different ways in which they have been addressed.

  15. Estimating magnitude and frequency of floods using the PeakFQ 7.0 program

    USGS Publications Warehouse

    Veilleux, Andrea G.; Cohn, Timothy A.; Flynn, Kathleen M.; Mason, Jr., Robert R.; Hummel, Paul R.

    2014-01-01

    Flood-frequency analysis provides information about the magnitude and frequency of flood discharges based on records of annual maximum instantaneous peak discharges collected at streamgages. The information is essential for defining flood-hazard areas, for managing floodplains, and for designing bridges, culverts, dams, levees, and other flood-control structures. Bulletin 17B (B17B) of the Interagency Advisory Committee on Water Data (IACWD, 1982) codifies the standard methodology for conducting flood-frequency studies in the United States. B17B specifies that annual peak-flow data are to be fit to a log-Pearson Type III distribution. Specific methods are also prescribed for improving skew estimates using regional skew information, tests for high and low outliers, adjustments for low outliers and zero flows, and procedures for incorporating historical flood information. The authors of B17B identified various needs for methodological improvement and recommended additional study. In response to these needs, the Advisory Committee on Water Information (ACWI, successor to IACWD; http://acwi.gov/), through its Subcommittee on Hydrology (SOH) Hydrologic Frequency Analysis Work Group (HFAWG), has recommended modest changes to B17B. These changes include the adoption of a generalized method-of-moments estimator denoted the Expected Moments Algorithm (EMA) (Cohn and others, 1997) and a generalized version of the Grubbs-Beck test for low outliers (Cohn and others, 2013). The SOH requested that the USGS implement these changes in a user-friendly, publicly accessible program.
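The core B17B fitting step, a log-Pearson Type III fit to annual peak flows, can be sketched with a simple method-of-moments estimate on hypothetical data. This illustration omits regional skew weighting, outlier handling, and the EMA estimator that PeakFQ 7.0 actually implements.

```python
import numpy as np
from scipy import stats

def flood_quantiles(peaks, aep=(0.5, 0.1, 0.01)):
    """Fit a log-Pearson Type III distribution to annual peak flows by
    simple method of moments on log10 values, and return the flood
    magnitude for each annual exceedance probability (AEP)."""
    logs = np.log10(np.asarray(peaks, dtype=float))
    mean, sd = logs.mean(), logs.std(ddof=1)
    skew = stats.skew(logs, bias=False)
    # Pearson III quantile of the log-flows, back-transformed to flow units
    return {p: 10 ** stats.pearson3.ppf(1 - p, skew, loc=mean, scale=sd)
            for p in aep}

# Hypothetical annual peak record (flow units are arbitrary)
peaks = [1200, 980, 3400, 2100, 1500, 760, 1900, 2600, 1100, 4200,
         1750, 990, 2300, 3100, 1400]
q = flood_quantiles(peaks)
print({p: round(v) for p, v in q.items()})
```

The AEP of 0.01 corresponds to the "100-year flood"; by construction the estimated magnitude grows as the exceedance probability shrinks.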

  16. Leukemia and brain tumors among children after radiation exposure from CT scans: design and methodological opportunities of the Dutch Pediatric CT Study.

    PubMed

    Meulepas, Johanna M; Ronckers, Cécile M; Smets, Anne M J B; Nievelstein, Rutger A J; Jahnen, Andreas; Lee, Choonsik; Kieft, Mariëtte; Laméris, Johan S; van Herk, Marcel; Greuter, Marcel J W; Jeukens, Cécile R L P N; van Straten, Marcel; Visser, Otto; van Leeuwen, Flora E; Hauptmann, Michael

    2014-04-01

    Computed tomography (CT) scans are indispensable in modern medicine; however, the spectacular rise in global use, coupled with relatively high doses of ionizing radiation per examination, has raised radiation protection concerns. Children are of particular concern because they are more sensitive to radiation-induced cancer than adults and have a long lifespan in which to express harmful effects, which may offset the clinical benefits of performing a scan. This paper describes the design and methodology of a nationwide study, the Dutch Pediatric CT Study, regarding the risk of leukemia and brain tumors in children after radiation exposure from CT scans. It is a retrospective record-linkage cohort study with an expected number of 100,000 children who received at least one electronically archived CT scan, covering the calendar period from the introduction of digital archiving until 2012. Information on all archived CT scans of these children will be obtained, including date of examination, scanned body part and radiologist's report, as well as the machine settings required for organ dose estimation. We will obtain cancer incidence by record linkage with external databases. In this article, we describe several approaches to the collection of data on archived CT scans, the estimation of radiation doses and the assessment of confounding. The proposed approaches provide useful strategies for data collection and confounder assessment for general retrospective record-linkage studies, particularly those using hospital databases on radiological procedures for the assessment of exposure to ionizing or non-ionizing radiation.

  17. Using a terminology server and consumer search phrases to help patients find physicians with particular expertise.

    PubMed

    Cole, Curtis L; Kanter, Andrew S; Cummens, Michael; Vostinar, Sean; Naeymi-Rad, Frank

    2004-01-01

    To design and implement a real-world application using a terminology server to assist patients and physicians who use common-language search terms to find specialist physicians with a particular clinical expertise. Terminology servers have been developed to help users encode information using complicated structured vocabularies during data-entry tasks, such as recording clinical information. We describe a methodology using Personal Health Terminology™ and a SNOMED CT-based hierarchical concept server. We constructed a pilot mediated-search engine to assist users who use vernacular speech in querying data that is more technical than vernacular. This approach, which combines theoretical and practical requirements, provides a useful example of concept-based searching for physician referrals.
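The mediation idea, mapping consumer phrases onto controlled concepts and then onto physicians indexed by expertise, can be sketched as a lookup chain. The concept identifiers and synonym table below are invented placeholders, not real SNOMED CT content or the paper's Personal Health Terminology.

```python
# Hypothetical sketch of vernacular-to-concept mediation: a consumer
# phrase resolves to a controlled concept, which indexes physicians.
synonyms = {
    "heart doctor": "C-CARDIOLOGY",
    "chest pain": "C-CARDIOLOGY",
    "skin rash": "C-DERMATOLOGY",
}
physicians = {
    "C-CARDIOLOGY": ["Dr. A", "Dr. B"],
    "C-DERMATOLOGY": ["Dr. C"],
}

def find_physicians(phrase):
    """Resolve a vernacular phrase to a concept, then list physicians."""
    concept = synonyms.get(phrase.lower().strip())
    return physicians.get(concept, [])

print(find_physicians("Heart doctor"))  # -> ['Dr. A', 'Dr. B']
```

A real terminology server would also exploit the concept hierarchy, so that a phrase mapping to a narrow concept could still match physicians indexed under a broader parent concept.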

  18. Development of a Video Coding Scheme for Analyzing the Usability and Usefulness of Health Information Systems.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Usability has been identified as a key issue in health informatics. Worldwide, numerous projects have been carried out in an attempt to increase and optimize health system usability. Usability testing, which involves observing end users interacting with systems, has been widely applied, and numerous publications have appeared describing such studies. However, to date, fewer works have been published describing methodological approaches to analyzing the rich data stream that results from usability testing, including the analysis of video, audio and screen recordings. In this paper we describe our work on the development and application of a coding scheme for analyzing the usability of health information systems. The phases involved in such analyses are described.

  19. Predicting the severity of motor neuron disease progression using electronic health record data with a cloud computing Big Data approach.

    PubMed

    Ko, Kyung Dae; El-Ghazawi, Tarek; Kim, Dongkyu; Morizono, Hiroki

    2014-05-01

    Motor neuron diseases (MNDs) are a class of progressive neurological diseases that damage the motor neurons. An accurate diagnosis is important for the treatment of patients with MNDs because there is no standard cure for them. However, the rates of false-positive and false-negative diagnoses are still very high in this class of diseases. In the case of Amyotrophic Lateral Sclerosis (ALS), current estimates indicate 10% of diagnoses are false positives, while 44% appear to be false negatives. In this study, we developed a new methodology to profile specific medical information from patient medical records for predicting the progression of motor neuron diseases. We implemented a system using HBase and the Random Forest classifier of Apache Mahout to profile medical records provided by the Pooled Resource Open-Access ALS Clinical Trials Database (PRO-ACT) site, and we achieved 66% accuracy in the prediction of ALS progression.
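As an illustration of the classification step only (the paper's pipeline uses HBase and Apache Mahout), a random forest can be sketched with scikit-learn on synthetic features. The features and labels below are fabricated, not PRO-ACT records.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Fabricated stand-ins for profiled medical-record features
# (e.g. functional scores, vital capacity); NOT real PRO-ACT data.
rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 5))
# Label "fast progression" when a noisy weighted feature sum is positive
y = (X @ np.array([1.0, -0.8, 0.5, 0.0, 0.3])
     + rng.normal(scale=1.0, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

The held-out split mirrors how the paper's 66% accuracy figure would be evaluated: the forest is trained on one subset of profiled records and scored on records it has never seen.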

  20. Meaning of Missing Values in Eyewitness Recall and Accident Records

    PubMed Central

    Uttl, Bob; Kisinger, Kelly

    2010-01-01

    Background Eyewitness recalls and accident records frequently do not mention the conditions and behaviors of interest to researchers, leading to missing values and to uncertainty about the prevalence of these conditions and behaviors surrounding accidents. Missing values may occur because eyewitnesses report the presence but not the absence of obvious clues/accident features. We examined this possibility. Methodology/Principal Findings Participants watched car accident videos and were asked to recall as much information as they could remember about each accident. The results showed that eyewitnesses were far more likely to report the presence of present obvious clues than the absence of absent obvious clues, even though they were aware of their absence. Conclusions One of the principal mechanisms causing missing values may be eyewitnesses' tendency to not report the absence of obvious features. We discuss the implications of our findings for both retrospective and prospective analyses of accident records, and illustrate the consequences of adopting inappropriate assumptions about the meaning of missing values using the Avaluator Avalanche Accident Prevention Card. PMID:20824054
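The paper's central distinction, reported-present versus reported-absent versus simply unmentioned, suggests a three-valued coding under which prevalence can only be bounded. A minimal sketch with made-up records:

```python
# Three-valued coding of one accident-record feature:
# True = explicitly reported present, False = explicitly reported absent,
# None = not mentioned at all (the common case the paper highlights).
records = [True, None, True, False, None, None, True, None]

n = len(records)
present = sum(r is True for r in records)
missing = sum(r is None for r in records)

# Prevalence can only be bounded: the lower bound treats every
# unmentioned value as absent, the upper bound as present.
lower = present / n
upper = (present + missing) / n
print(f"prevalence between {lower:.3f} and {upper:.3f} ({missing} unmentioned)")
```

Collapsing `None` into `False`, as naive analyses do, silently selects the lower bound; the paper's finding implies the truth often lies well above it.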

  1. Recruitment, Methods, and Descriptive Results of a Physiologic Assessment of Latino Farmworkers: The California Heat Illness Prevention Study.

    PubMed

    Mitchell, Diane C; Castro, Javier; Armitage, Tracey L; Vega-Arroyo, Alondra J; Moyce, Sally C; Tancredi, Daniel J; Bennett, Deborah H; Jones, James H; Kjellstrom, Tord; Schenker, Marc B

    2017-07-01

    The California Heat Illness Prevention Study (CHIPS) devised methodology and collected physiological data to assess heat-related illness (HRI) risk in Latino farmworkers. Bilingual researchers monitored HRI across a work shift, recording core temperature, work rate (metabolic equivalents [METs]), and heart rate at one-minute intervals. Hydration status was assessed by changes in weight and blood osmolality. Personal data loggers and a weather station measured exposure to heat. Interviewer-administered questionnaires were used to collect demographic and occupational information. California farmworkers (n = 588) were assessed. Acceptable-quality data were obtained from 80% of participants (core temperature) to 100% of participants (weight change). Of the workers, 8.3% experienced a core body temperature greater than or equal to 38.5 °C, and 11.8% experienced dehydration (losing more than 1.5% of body weight). Methodology is presented for the first comprehensive physiological assessment of HRI risk in California farmworkers.

  2. Benzene exposure in the petroleum distribution industry associated with leukemia in the United Kingdom: overview of the methodology of a case-control study.

    PubMed Central

    Rushton, L

    1996-01-01

    This paper describes the basic principles underlying the methodology for obtaining quantitative estimates of benzene exposure in the petroleum marketing and distribution industry. Work histories for 91 cases of leukemia and 364 matched controls (4 per case), identified from a cohort of oil distribution workers followed up to the end of 1992, were obtained primarily from personnel records. Information on the distribution sites, more than 90% of which were closed at the time of data collection, was obtained from site visits and archive material. Industrial hygiene measurements taken under known conditions were assembled for different tasks. Where measured data were not available, estimates were adjusted using variables known to influence exposure, such as temperature, technology, percentage of benzene in the fuel handled, products handled, number of loads, and job activity. Quantitative estimates of dermal contact and peak exposure were also made. PMID:9118922
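
    As an illustration of the adjustment step, the toy function below scales a measured task-level concentration by multiplicative factors; the factor values and the 2% reference benzene content are invented for the example, not taken from the study.

```python
def exposure_estimate(base_ppm, benzene_pct, tech_factor, temp_factor,
                      hours_per_year):
    """Annualized benzene exposure (ppm-hours) for one job task: a measured
    base level scaled by multiplicative factors for fuel composition,
    loading technology, and climate (all factor values hypothetical)."""
    return (base_ppm * (benzene_pct / 2.0)
            * tech_factor * temp_factor * hours_per_year)

# Hypothetical task: loading fuel with 5% benzene, older technology, warm site.
estimate = exposure_estimate(0.4, 5.0, 1.5, 1.2, 200)  # -> 360.0 ppm-hours
```

    Dermal and peak exposures would be estimated analogously, with task-specific bases and factors.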

  3. Investigating human cognitive performance during spaceflight

    NASA Astrophysics Data System (ADS)

    Pattyn, Nathalie; Migeotte, Pierre-Francois; Demaeseleer, Wim; Kolinsky, Regine; Morais, Jose; Zizi, Martin

    2005-08-01

    Although astronauts' subjective self-evaluations often report impaired cognitive functioning, to date most studies of human higher cognitive functions in space have not yielded unequivocal results. Since no gold standard exists for evaluating the higher cognitive functions, we proposed to assess astronauts' cognitive performance through a novel series of tests combined with simultaneous recording of physiological parameters. We report here the validation of our methodology and the cognitive results of this testing on the cosmonauts of the 11-day Odissea mission to the ISS (2002) and on a control group of pilots carefully matched to the characteristics of the subjects. For the first time, we show a performance decrement in higher cognitive functions during space flight. Our results show a significant performance decrement for in-flight measurements, as well as measurable variations in executive control of cognitive functions. Taken together, our data establish the validity of our methodology and the presence of altered information processing in operational conditions.

  4. Toward a Bio-Medical Thesaurus: Building the Foundation of the UMLS

    PubMed Central

    Tuttle, Mark S.; Blois, Marsden S.; Erlbaum, Mark S.; Nelson, Stuart J.; Sherertz, David D.

    1988-01-01

    The Unified Medical Language System (UMLS) is being designed to provide a uniform user interface to heterogeneous machine-readable bio-medical information resources, such as bibliographic databases, genetic databases, expert systems and patient records.1 Such an interface will have to recognize different ways of saying the same thing, and provide links to ways of saying related things. One way to represent the necessary associations is via a domain thesaurus. As no such thesaurus exists, and because, once built, it will be both sizable and in need of continuous maintenance, its design should include a methodology for building and maintaining it. We propose a methodology, utilizing lexically expanded schema inversion, and a design, called T. Lex, which together form one approach to the problem of defining and building a bio-medical thesaurus. We argue that the semantic locality implicit in such a thesaurus will support model-based reasoning in bio-medicine.2

  5. Detecting eye movements in dynamic environments.

    PubMed

    Reimer, Bryan; Sodhi, Manbir

    2006-11-01

    To take advantage of the increasing number of in-vehicle devices, automobile drivers must divide their attention between primary (driving) and secondary (operating an in-vehicle device) tasks. In dynamic environments such as driving, however, it is not easy to identify and quantify how a driver focuses on the various tasks he/she is simultaneously engaged in, including the distracting tasks. Measures derived from the driver's scan path have been used as correlates of driver attention. This article presents a methodology for analyzing eye positions, which are discrete samples of a subject's scan path, in order to categorize driver eye movements. Previous methods of analyzing eye positions recorded in a dynamic environment have relied entirely on manual identification of the focus of visual attention from a point of regard superimposed on a video of the recorded scene, failing to utilize information about movement structure in the raw recorded eye positions. Although effective, these methods are too time-consuming for the large data sets needed to identify subtle differences between drivers, road conditions, and levels of distraction. The aim of the methods presented in this article is to extend the degree of automation in the processing of eye movement data by proposing a methodology for eye movement analysis that extends automated fixation identification to include smooth and saccadic movements. By identifying eye movements in the recorded eye positions, a method of reducing the analysis of scene video to a finite search space is presented. The implementation of a software tool for the eye movement analysis is described, including an example from an on-road test-driving sample.
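
    A minimal version of the automated classification step, extending fixation identification with smooth-pursuit and saccade labels, can be sketched with a velocity-threshold scheme; the two thresholds below are illustrative, not values from the article.

```python
# Illustrative velocity thresholds in deg/s; real studies tune these
# to the apparatus and the driving task.
FIXATION_MAX = 30.0
SACCADE_MIN = 100.0

def classify_samples(positions, dt):
    """Label the movement between successive eye-position samples (deg) by
    point-to-point angular velocity: fixation, smooth pursuit, or saccade."""
    labels = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if velocity < FIXATION_MAX:
            labels.append("fixation")
        elif velocity < SACCADE_MIN:
            labels.append("smooth")
        else:
            labels.append("saccade")
    return labels

# 60 Hz samples: a hold, a fast jump, then a slow drift.
samples = [(0.0, 0.0), (0.1, 0.0), (5.0, 0.0), (5.8, 0.0)]
print(classify_samples(samples, dt=1 / 60))
```

    Runs of identically labelled samples can then be merged into events, reducing the scene-video review to a finite set of candidate gaze targets.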

  6. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but reviewing the video to determine what was recorded is also time-consuming and expensive. We proposed algorithm and software development to identify and differentiate thermally detected targets of interest, allowing automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information describing the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. In addition, we recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for a method to reliably identify recorded objects.

  7. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for engineering security into large information technology systems under development. The methodology integrates a risk management process with a generic system development life cycle process. It is intended to be used by security system engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  8. Methodology for Generating Conflict Scenarios by Time Shifting Recorded Traffic Data

    NASA Technical Reports Server (NTRS)

    Paglione, Mike; Oaks, Robert; Bilimoria, Karl D.

    2003-01-01

    A methodology is presented for generating conflict scenarios that can be used as test cases to estimate the operational performance of a conflict probe. Recorded air traffic data are time-shifted to create traffic scenarios featuring conflicts with characteristic properties similar to those encountered in typical air traffic operations. First, a reference set of conflicts is obtained from trajectories computed using birth points and nominal flight plans extracted from recorded traffic data. Distributions are obtained for several primary properties (e.g., encounter angle) that are most likely to affect the performance of a conflict probe. A genetic algorithm is then used to determine the values of time shifts for the recorded track data so that the primary properties of conflicts generated by the time-shifted data match those of the reference set. This methodology is successfully demonstrated using recorded traffic data for the Memphis Air Route Traffic Control Center; a key result is that the required time shifts are less than 5 min for 99% of the tracks. It is also observed that close matching of the primary properties used in this study additionally provides a good match for some other secondary properties.
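
    The core time-shifting operation can be sketched as follows; the two toy tracks, units, and the 5 nmi separation criterion are illustrative assumptions, not the Memphis data.

```python
def min_separation(track_a, track_b, shift=0):
    """Minimum horizontal separation between track_a and a copy of track_b
    advanced in time by `shift` seconds. Tracks map time (s) -> (x, y) in
    nautical miles, sampled on the same time grid."""
    best = float("inf")
    for t, (xa, ya) in track_a.items():
        pb = track_b.get(t + shift)     # shifted aircraft's position at t
        if pb is None:
            continue
        xb, yb = pb
        best = min(best, ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5)
    return best

# Two recorded routes crossing the same fix about 100 s apart.
track_a = {t: ((t - 100) * 0.1, 0.0) for t in range(0, 300, 10)}
track_b = {t: (0.0, (t - 200) * 0.1) for t in range(0, 300, 10)}
print(min_separation(track_a, track_b))             # well separated as flown
print(min_separation(track_a, track_b, shift=100))  # the shift induces a conflict
```

    A genetic algorithm would then search over per-track shifts like this one so that the resulting conflicts' encounter properties match the reference distributions.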

  9. Monitoring safety in a phase III real‐world effectiveness trial: use of novel methodology in the Salford Lung Study

    PubMed Central

    Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F.; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P.; McCrae, John; McCorkindale, Sheila; Leather, David

    2016-01-01

    Abstract Background The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once‐daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. Objective The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in‐depth exploration of the safety results will be the subject of future publications. Achievements The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Conclusion Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. 
© 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. PMID:27804174

  10. Monitoring safety in a phase III real-world effectiveness trial: use of novel methodology in the Salford Lung Study.

    PubMed

    Collier, Sue; Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P; McCrae, John; McCorkindale, Sheila; Leather, David

    2017-03-01

    The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. 

  11. Does Metformin Reduce Cancer Risks? Methodologic Considerations.

    PubMed

    Golozar, Asieh; Liu, Shuiqing; Lin, Joeseph A; Peairs, Kimberly; Yeh, Hsin-Chieh

    2016-01-01

    The substantial burden of cancer and diabetes, and the association between the two conditions, have motivated researchers to look for targeted strategies that can simultaneously affect both diseases and reduce their overlapping burden. In the absence of randomized clinical trials, researchers have taken advantage of the availability and richness of administrative databases and electronic medical records to investigate the effects of drugs on cancer risk among diabetic individuals. The majority of these studies suggest that metformin could potentially reduce cancer risk. However, the validity of this purported reduction in cancer risk is limited by several methodological flaws in either study design or analysis. Whether metformin use decreases cancer risk relies heavily on the availability of valid data sources with complete information on confounders, accurate assessment of drug use, appropriate study design, and robust analytical techniques. The majority of observational studies assessing the association between metformin and cancer risk suffer from methodological shortcomings, and efforts to address these issues have been incomplete. Future investigations of the association between metformin and cancer risk should explicitly address confounding by indication, prevalent-user bias, and time-related biases. Although the proposed strategies do not guarantee a bias-free estimate of the association between metformin and cancer, they will reduce the synthesis and reporting of erroneous results.
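
    One of the time-related biases named above, immortal time bias from classifying subjects as "ever users" at baseline, can be demonstrated with a null simulation. Everything below (hazards, sample size) is invented for illustration and is not from the studies reviewed.

```python
import random

def naive_rates(n=20000, seed=1):
    """Null simulation: the drug has no effect on the event hazard, yet
    classifying subjects as 'ever users' at baseline credits users with the
    event-free time before they started the drug (immortal time bias)."""
    rng = random.Random(seed)
    totals = {"user": [0, 0.0], "nonuser": [0, 0.0]}   # [events, person-time]
    for _ in range(n):
        event_time = rng.expovariate(0.10)   # same hazard for everyone
        start_time = rng.expovariate(0.05)   # when the drug would begin
        group = "user" if start_time < event_time else "nonuser"
        totals[group][0] += 1                # every subject has the event
        totals[group][1] += event_time
    return {g: ev / pt for g, (ev, pt) in totals.items()}

rates = naive_rates()
print(rates)   # "users" look protected despite a truly null effect
```

    A time-varying exposure analysis, which assigns pre-initiation person-time to the unexposed group, removes this artefact.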

  12. Cenozoic climate changes: A review based on time series analysis of marine benthic δ18O records

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred; Bickert, Torsten; Lear, Caroline H.; Lohmann, Gerrit

    2014-09-01

    The climate during the Cenozoic era changed in several steps from ice-free poles and warm conditions to ice-covered poles and cold conditions. Since the 1950s, a body of information on ice volume and temperature changes has been built up, predominantly from measurements of the oxygen isotopic composition of shells of benthic foraminifera collected from marine sediment cores. The statistical methodology of time series analysis has also evolved, allowing more information to be extracted from these records. Here we provide a comprehensive view of Cenozoic climate evolution by means of a coherent and systematic application of time series analytical tools to each record from a compilation spanning the interval from 4 to 61 Myr ago. We quantitatively describe several prominent features of the oxygen isotope record, taking into account the various sources of uncertainty (including measurement error, proxy noise, and dating errors). The estimated transition times and amplitudes allow us to assess causal climatological-tectonic influences on the following known features of the Cenozoic oxygen isotopic record: the Paleocene-Eocene Thermal Maximum, the Eocene-Oligocene Transition, the Oligocene-Miocene Boundary, and the Middle Miocene Climate Optimum. We further describe and causally interpret the following features: the Paleocene-Eocene warming trend, the two-step long-term Eocene cooling, and the changes within the most recent interval (Miocene-Pliocene). We review the scope and methods of constructing Cenozoic stacks of benthic oxygen isotope records and present two new latitudinal stacks, which capture not only global ice volume but also bottom-water temperatures at low (less than 30°) and high latitudes. This review concludes by identifying future directions for data collection, statistical method development, and climate modeling.
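
    The kind of transition estimation described above can be illustrated with a toy change-point fit: least squares over all candidate change points in a synthetic step-like series. The real analysis uses ramp models and full uncertainty propagation; the series below is fabricated for the sketch.

```python
def fit_step(times, values):
    """Least-squares fit of a single step change: try every candidate change
    point, keep the one minimising the residual sum of squares, and report
    the transition time and amplitude."""
    best = None
    for k in range(2, len(values) - 1):
        m1 = sum(values[:k]) / k
        m2 = sum(values[k:]) / (len(values) - k)
        sse = (sum((v - m1) ** 2 for v in values[:k])
               + sum((v - m2) ** 2 for v in values[k:]))
        if best is None or sse < best[0]:
            best = (sse, times[k], m2 - m1)
    return best[1], best[2]

# Synthetic benthic d18O-like series: a +1.0 per-mil step at 34 Myr ago
# (roughly the Eocene-Oligocene Transition) plus small deterministic "noise".
t = [40 - 0.5 * i for i in range(25)]      # ages, 40 down to 28 Myr ago
x = [(0.5 if age > 34 else 1.5) + 0.01 * ((i % 3) - 1)
     for i, age in enumerate(t)]
print(fit_step(t, x))
```

    Replacing the step with a ramp (two break points and a linear segment between them) yields transition durations as well as times and amplitudes.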

  13. An empirical approach to predicting long term behavior of metal particle based recording media

    NASA Technical Reports Server (NTRS)

    Hadad, Allan S.

    1991-01-01

    Alpha iron particles used for magnetic recording are prepared through a series of dehydration and reduction steps of alpha-Fe2O3-H2O, resulting in acicular, polycrystalline, body-centered cubic (bcc) alpha-Fe particles that are single magnetic domains. Since fine iron particles are pyrophoric by nature, stabilization processes had to be developed for iron particles to be considered a viable recording medium for long-term archival (i.e., 25+ years) information storage. The primary means of establishing stability is passivation, or controlled oxidation of the iron particle's surface. Since iron particles used for magnetic recording are small, additional oxidation has a direct impact on performance, especially where archival storage of recorded information for long periods is important. Further stabilization chemistry and processes had to be developed to guarantee that iron particles could serve as a viable long-term recording medium. In an effort to retard the diffusion of iron ions through the oxide layer, other elements such as silicon, aluminum, and chromium have been added to the base iron to promote denser scale formation, to alleviate some of the non-stoichiometric behavior of the oxide, or both. The presence of water vapor has been shown to disrupt the passive layer, subsequently increasing the oxidation rate of the iron. A study was undertaken to examine the degradation in magnetic properties as a function of both temperature and humidity for silicon-containing iron particles between 50 and 120 deg C and 3 and 89 percent relative humidity. The methodology by which experimental data were collected and analyzed, leading to predictive capability, is discussed.
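
    A degradation model of the kind such a study might fit can be sketched as an Arrhenius temperature term with a humidity multiplier; every constant below is a hypothetical placeholder chosen only to make the qualitative behavior visible, not a fitted value from the study.

```python
import math

def magnetization_loss(temp_c, rh_pct, days, a=1e6, ea_ev=0.55, b=0.01):
    """Predicted fractional loss of magnetization after `days` of storage:
    an Arrhenius temperature term times a linear humidity multiplier.
    All constants (a, ea_ev, b) are hypothetical placeholders."""
    k_b = 8.617e-5                       # Boltzmann constant, eV/K
    t_k = temp_c + 273.15
    rate = a * math.exp(-ea_ev / (k_b * t_k)) * (1 + b * rh_pct)
    return 1 - math.exp(-rate * days)

# One year of storage: mild conditions versus warmer, more humid conditions
# within the study's tested range.
mild = magnetization_loss(25, 10, 365)
harsh = magnetization_loss(60, 89, 365)
```

    Fitting such a model to accelerated-aging measurements at several temperature/humidity points is what allows extrapolation to 25+ year archival conditions.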

  14. Data driven discrete-time parsimonious identification of a nonlinear state-space model for a weakly nonlinear system with short data record

    NASA Astrophysics Data System (ADS)

    Relan, Rishi; Tiels, Koen; Marconato, Anna; Dreesen, Philippe; Schoukens, Johan

    2018-05-01

    Many real-world systems exhibit quasi-linear or weakly nonlinear behavior during normal operation, and a hard saturation effect for high peaks of the input signal. In this paper, a methodology is proposed to identify a parsimonious discrete-time nonlinear state-space (NLSS) model for such a nonlinear dynamical system from a relatively short data record. The capability of the NLSS model structure is demonstrated by introducing two different initialisation schemes, one of them using multivariate polynomials. In addition, a method using first-order information of the multivariate polynomials and tensor decomposition is employed to obtain a parsimonious decoupled representation of the set of multivariate real polynomials estimated during the identification of the NLSS model. Finally, the model structure is verified experimentally on the cascaded water tanks benchmark identification problem.

  15. Capturing patient information at nursing shift changes: methodological evaluation of speech recognition and information extraction

    PubMed Central

    Suominen, Hanna; Johnson, Maree; Zhou, Liyuan; Sanchez, Paula; Sirel, Raul; Basilakis, Jim; Hanlen, Leif; Estival, Dominique; Dawson, Linda; Kelly, Barbara

    2015-01-01

    Objective We study the use of speech recognition and information extraction to generate drafts of Australian nursing-handover documents. Methods Speech recognition correctness and clinicians' preferences were evaluated using 15 recorder–microphone combinations, six documents, three speakers, Dragon Medical 11, and five survey/interview participants. Information extraction correctness was evaluated using 260 documents, six-class classification for each word, two annotators, and the CRF++ conditional random field toolkit. Results A noise-cancelling lapel microphone with a digital voice recorder gave the best correctness (79%). This microphone was also the option preferred by all but one participant. Although the participants liked the small size of this recorder, their preference was for tablets that can also be used for document proofing and sign-off, among other tasks. Accented speech was harder to recognize than native speech, and a male speaker was recognized better than a female speaker. Information extraction was excellent at filtering out irrelevant text (85% F1) and at identifying text relevant to two classes (87% and 70% F1). Mirroring the annotators' disagreements, there was confusion between the remaining three classes, which explains the modest 62% macro-averaged F1. Discussion We present evidence for the feasibility of speech recognition and information extraction to support clinicians in entering text and to unlock its content for computerized decision-making and surveillance in healthcare. Conclusions The benefits of this automation include storing all information; making drafts available and accessible almost instantly to everyone with authorized access; and avoiding the information loss, delays, and misinterpretations inherent in using a ward clerk or transcription services. PMID:25336589
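
    The evaluation metric quoted above can be made concrete. The snippet computes per-class and macro-averaged F1 from parallel label sequences; the three-class toy labels are invented for the example rather than taken from the study's six-class scheme.

```python
def f1_scores(gold, pred):
    """Per-class F1 and the macro-average over classes."""
    classes = sorted(set(gold) | set(pred))
    per_class = {}
    for c in classes:
        tp = sum(g == c and p == c for g, p in zip(gold, pred))
        fp = sum(g != c and p == c for g, p in zip(gold, pred))
        fn = sum(g == c and p != c for g, p in zip(gold, pred))
        per_class[c] = 2 * tp / (2 * tp + fp + fn) if tp else 0.0
    return per_class, sum(per_class.values()) / len(per_class)

# Invented word-level labels: the classifier confuses "plan" and "assessment",
# which drags the macro-average down even when the common class is perfect.
gold = ["irrelevant"] * 4 + ["plan", "plan", "assessment", "assessment"]
pred = ["irrelevant"] * 4 + ["plan", "assessment", "assessment", "plan"]
per_class, macro = f1_scores(gold, pred)
```

    Macro-averaging weights every class equally, which is why confusion among a few rare classes can pull the overall score well below the per-class figures for frequent classes.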

  16. Methodology of Historical Flood Evaluation from Korean Historical Documents during AD 1392 to 1910

    NASA Astrophysics Data System (ADS)

    Cho, H. B.; Kim, H.; Noh, S.; Jang, C.

    2007-12-01

    Studies of extreme flood events face a critical limitation: a shortage of historical data, because modern systematic records do not span long time series. Historical documentary records can therefore be an important source of additional information on extreme flood events that occurred before instrumental observations began. For proper data mining, documentary records satisfying four conditions are preferred: (1) a sufficiently long time series, (2) official archives covering the whole Korean peninsula, (3) a sufficiently large number of records, and (4) detailed damage descriptions. The Annals of the Choson Dynasty spans about 500 years and contains 511 flood records from the Choson Dynasty in Korea. According to the Annals, flood damage records were most densely clustered in the middle of the 17th century, and the largest human and residence damage occurred in 1739 and 1856, respectively. Another source is Jeungbo-Munheonbigo, a taxonomic document organized by themes such as culture, social systems, and climate, which contains 79 flood damage records. An effective way to analyze these historical floods without water-level data is to classify and categorize the flood damage records, because all records are descriptive. Consequently, 556 records are categorized into 10 flood damage types, and each record is graded on a three-level scale according to how numerically its damage is expressed. These groupings are used to select a reasonable period range for extracting detailed information from the entire inspection period. In addition, a Historical Flood Evaluation Index (HFEI) can thereby be derived, through quantitative and statistical processes, to evaluate the magnitude of each historical flood. In this research, flood damage evaluation focuses mainly on damage to human beings and residences. Degree ranges based on cumulative probability are also derived from the two damage inventories, and the HFEI with conditional weighting factors is applied to every flood record and to the analysis of flood distribution in the annual series.
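
    A toy version of an HFEI-style computation might weight damage-type counts by how numerically each record is graded; the weights and the 1739 counts below are invented for illustration and do not reproduce the paper's conditional weighting factors.

```python
# Hypothetical weights per grade of numerical detail.
GRADE_WEIGHT = {"numeric": 3, "semi-numeric": 2, "descriptive": 1}

def flood_index(categorized_records):
    """Toy HFEI-style score: damage-type counts weighted by how numerically
    each record expresses the damage."""
    return sum(count * GRADE_WEIGHT[grade]
               for (_damage_type, grade), count in categorized_records.items())

# Invented record counts for the large 1739 flood (human and residence damage).
flood_1739 = {("human", "numeric"): 2,
              ("residence", "numeric"): 3,
              ("human", "descriptive"): 1}
print(flood_index(flood_1739))  # -> 16
```

    Scoring every flood year this way produces the annual series on which distributional and cumulative-probability analyses can be run.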

  17. Biodiversity's Big Wet Secret: The Global Distribution of Marine Biological Records Reveals Chronic Under-Exploration of the Deep Pelagic Ocean

    PubMed Central

    Webb, Thomas J.; Vanden Berghe, Edward; O'Dor, Ron

    2010-01-01

    Background Understanding the distribution of marine biodiversity is a crucial first step towards the effective and sustainable management of marine ecosystems. Recent efforts to collate location records from marine surveys enable us to assemble a global picture of recorded marine biodiversity. They also effectively highlight gaps in our knowledge of particular marine regions. In particular, the deep pelagic ocean – the largest biome on Earth – is chronically under-represented in global databases of marine biodiversity. Methodology/Principal Findings We use data from the Ocean Biogeographic Information System to plot the position in the water column of ca 7 million records of marine species occurrences. Records from relatively shallow waters dominate this global picture of recorded marine biodiversity. In addition, standardising the number of records from regions of the ocean differing in depth reveals that regardless of ocean depth, most records come either from surface waters or the sea bed. Midwater biodiversity is drastically under-represented. Conclusions/Significance The deep pelagic ocean is the largest habitat by volume on Earth, yet it remains biodiversity's big wet secret, as it is hugely under-represented in global databases of marine biological records. Given both its value in the provision of a range of ecosystem services, and its vulnerability to threats including overfishing and climate change, there is a pressing need to increase our knowledge of Earth's largest ecosystem. PMID:20689845
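
    The standardising idea, classifying each record by its position in the water column before comparing regions, can be sketched as below; the 50 m surface/seabed bands and the toy records are assumptions for the example, not OBIS values.

```python
from collections import Counter

def position_in_column(record_depth, floor_depth, band=50.0):
    """Crudely place one biological record in the water column using a
    hypothetical 50 m band at the surface and above the seabed."""
    if record_depth <= band:
        return "surface"
    if record_depth >= floor_depth - band:
        return "seabed"
    return "midwater"

# Toy (record depth, seafloor depth) pairs in metres.
records = [(5, 4000), (10, 200), (3980, 4000), (2000, 4000), (150, 200)]
counts = Counter(position_in_column(d, f) for d, f in records)
print(counts)
```

    Normalising such counts by the volume of each band is what reveals the midwater deficit: the deep pelagic zone dominates by volume yet contributes few records.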

  18. 76 FR 50993 - Agency Information Collection Activities: Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...

  19. A methodology to event reconstruction from trace images.

    PubMed

    Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre

    2015-03-01

    The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. The earlier steps of the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous one. The methodology is not linear, however; it is a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. 
This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  20. The inverse problem in electroencephalography using the bidomain model of electrical activity.

    PubMed

    Lopez Rincon, Alejandro; Shimoda, Shingo

    2016-12-01

Acquiring information about the distribution of electrical sources in the brain from electroencephalography (EEG) data remains a significant challenge. An accurate solution would provide an understanding of the inner mechanisms of the electrical activity in the brain and information about damaged tissue. In this paper, we present a methodology for reconstructing brain electrical activity from EEG data by using the bidomain formulation. The bidomain model considers continuous active neural tissue coupled with a nonlinear cell model. Using this technique, we aim to find the brain sources that give rise to the scalp potential recorded by EEG measurements, taking into account a non-static reconstruction. We simulate electrical sources in the brain volume and compare the reconstruction to the minimum norm estimates (MNEs) and low resolution electrical tomography (LORETA) results. Then, with the EEG dataset from the EEG Motor Movement/Imagery Database of PhysioBank, we identify the reaction to visual stimuli by calculating the time between stimulus presentation and the spike in electrical activity. Finally, we compare the activation in the brain with the registered activation using the LinkRbrain platform. Our methodology shows an improved reconstruction of the electrical activity and source localization in comparison with MNE and LORETA. For the Motor Movement/Imagery Database, the reconstruction is consistent with the expected position and time delay generated by the stimuli. Thus, this methodology is a suitable option for continuously reconstructing brain potentials. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
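The minimum norm estimate used above as a comparison baseline admits a compact closed form; here is a minimal sketch of an MNE-style Tikhonov-regularized inverse, assuming a known lead-field matrix `L` with toy dimensions (the bidomain formulation itself involves coupled tissue and cell models and is not reproduced here):

```python
import numpy as np

def minimum_norm_estimate(L, y, lam=1e-6):
    """Minimum norm estimate of source amplitudes s from sensor data y.

    L   : (n_sensors, n_sources) lead-field matrix
    y   : (n_sensors,) scalp potentials at one time point
    lam : Tikhonov regularization weight
    Solves min ||L s - y||^2 + lam ||s||^2 via the dual form,
    s_hat = L^T (L L^T + lam I)^{-1} y.
    """
    n = L.shape[0]
    return L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n), y)

# Toy example: 8 sensors, 20 candidate sources, one active source.
rng = np.random.default_rng(0)
L = rng.standard_normal((8, 20))
s_true = np.zeros(20)
s_true[3] = 1.0
y = L @ s_true
s_hat = minimum_norm_estimate(L, y)
```

With small regularization, the forward-projected estimate `L @ s_hat` reproduces the data closely, although the source estimate itself is spatially smeared — the known limitation that motivates alternatives such as LORETA or the bidomain approach described above.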

  1. Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches

    PubMed Central

    Saneiro, Mar; Salmeron-Majadas, Sergio

    2014-01-01

We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expressions and body movements that correspond to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources such as qualitative, self-reported, physiological, and behavioral information. Together, these data are to be used to train data mining algorithms that automatically identify changes in the learners' affective states when dealing with cognitive tasks, which helps to provide personalized emotional support. PMID:24892055

  2. Towards emotion detection in educational scenarios from facial expressions and body movements through multimodal approaches.

    PubMed

    Saneiro, Mar; Santos, Olga C; Salmeron-Majadas, Sergio; Boticario, Jesus G

    2014-01-01

We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expressions and body movements that correspond to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources such as qualitative, self-reported, physiological, and behavioral information. Together, these data are to be used to train data mining algorithms that automatically identify changes in the learners' affective states when dealing with cognitive tasks, which helps to provide personalized emotional support.

  3. How geostatistics can help you find lead and galvanized water service lines: The case of Flint, MI.

    PubMed

    Goovaerts, Pierre

    2017-12-01

In the aftermath of the Flint drinking water crisis, most US cities have been scrambling to locate all lead service lines (LSLs) in their water supply systems. This information, which is most often inaccurate or lacking, is critical to assess compliance with the Lead and Copper Rule and to plan the replacement of lead and galvanized service lines (GSLs) as currently under way in Flint. This paper presents the first geospatial approach to predict the likelihood that a home has a LSL or GSL based on neighboring field data (i.e., house inspection) and secondary information (i.e., construction year and city records). The methodology is applied to the City of Flint, where 3254 homes have been inspected by the Michigan Department of Environmental Quality to identify service line material. GSLs and LSLs were mostly observed in houses built prior to 1934 and during World War II, respectively. City records led to the over-identification of LSLs, likely because old records were not updated as these lines were being replaced. Indicator semivariograms indicated that both types of service line are spatially clustered, with a range of 1.4 km for LSLs and 2.8 km for GSLs. This spatial autocorrelation was integrated with secondary data using residual indicator kriging to predict the probability of finding each type of material at the tax parcel level. Cross-validation analysis using Receiver Operating Characteristic (ROC) curves demonstrated the greater accuracy of the kriging model relative to the current approach targeting houses built in the forties, in particular as more field data become available. Anticipated rates of false positives and percentages of detection were computed for different sampling strategies. This approach is flexible enough to accommodate additional sources of information, such as local code and regulatory changes, historical permit records, maintenance and operation records, or customer self-reporting. Copyright © 2017 Elsevier B.V. All rights reserved.
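The indicator semivariograms mentioned above can be estimated empirically by averaging squared differences of the binary indicator over distance bins. A minimal numpy sketch follows; the toy coordinates and indicator values are illustrative, not the Flint data:

```python
import numpy as np

def indicator_semivariogram(coords, z, lag_edges):
    """Empirical semivariogram of a binary indicator z (e.g., 1 = LSL found).

    coords   : (n, 2) point coordinates
    z        : (n,) 0/1 indicator values
    lag_edges: increasing bin edges for pair separation distance
    Returns gamma(h) = mean of 0.5 * (z_i - z_j)^2 per lag bin.
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Toy data: two spatial clusters with uniform indicator values inside each.
coords = np.array([[0, 0], [0, 1], [0, 2], [10, 0], [10, 1], [10, 2]], float)
z = np.array([1, 1, 1, 0, 0, 0])
gamma = indicator_semivariogram(coords, z, np.array([0.0, 3.0, 12.0]))
```

The clustered toy data yield zero semivariance at short lags and higher semivariance across clusters — the pattern summarized above by the 1.4 km and 2.8 km ranges.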

  4. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  5. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads but not between different visual-spatial aspects, the reverse was true for V1.
Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel activity.
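The stage-discrimination idea can be illustrated with a deliberately simple multivariate classifier. This sketch applies a nearest-centroid rule to synthetic voxel patterns for two hypothetical task stages; it does not reproduce the authors' linear classifiers or time series bootstraps:

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Compute class centroids from training patterns X (n_samples, n_voxels)."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def nearest_centroid_predict(X, classes, centroids):
    """Assign each pattern to the class with the nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

rng = np.random.default_rng(1)
n_voxels = 30
# Synthetic "BOLD patterns": two task stages with distinct mean patterns.
mu_delay, mu_choice = rng.standard_normal((2, n_voxels))
X_train = np.vstack([mu_delay + 0.3 * rng.standard_normal((40, n_voxels)),
                     mu_choice + 0.3 * rng.standard_normal((40, n_voxels))])
y_train = np.array([0] * 40 + [1] * 40)
X_test = np.vstack([mu_delay + 0.3 * rng.standard_normal((10, n_voxels)),
                    mu_choice + 0.3 * rng.standard_normal((10, n_voxels))])
y_test = np.array([0] * 10 + [1] * 10)

classes, centroids = nearest_centroid_fit(X_train, y_train)
accuracy = float((nearest_centroid_predict(X_test, classes, centroids)
                  == y_test).mean())
```

In a real analysis, significance would be assessed against a null distribution, e.g. by permuting stage labels or by block-resampling the time series to respect temporal autocorrelation.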

  6. The two-brain approach: how can mutually interacting brains teach us something about social interaction?

    PubMed Central

    Konvalinka, Ivana; Roepstorff, Andreas

    2012-01-01

    Measuring brain activity simultaneously from two people interacting is intuitively appealing if one is interested in putative neural markers of social interaction. However, given the complex nature of interactions, it has proven difficult to carry out two-person brain imaging experiments in a methodologically feasible and conceptually relevant way. Only a small number of recent studies have put this into practice, using fMRI, EEG, or NIRS. Here, we review two main two-brain methodological approaches, each with two conceptual strategies. The first group has employed two-brain fMRI recordings, studying (1) turn-based interactions on the order of seconds, or (2) pseudo-interactive scenarios, where only one person is scanned at a time, investigating the flow of information between brains. The second group of studies has recorded dual EEG/NIRS from two people interacting, in (1) face-to-face turn-based interactions, investigating functional connectivity between theory-of-mind regions of interacting partners, or in (2) continuous mutual interactions on millisecond timescales, to measure coupling between the activity in one person's brain and the activity in the other's brain. We discuss the questions these approaches have addressed, and consider scenarios when simultaneous two-brain recordings are needed. Furthermore, we suggest that (1) quantification of inter-personal neural effects via measures of emergence, and (2) multivariate decoding models that generalize source-specific features of interaction, may provide novel tools to study brains in interaction. This may allow for a better understanding of social cognition as both representation and participation. PMID:22837744

  7. Beyond annual streamflow reconstructions for the Upper Colorado River Basin: a paleo-water-balance approach

    USGS Publications Warehouse

    Gangopadhyay, Subhrendu; McCabe, Gregory J.; Woodhouse, Connie A.

    2015-01-01

In this paper, we present a methodology to use annual tree-ring chronologies and a monthly water balance model to generate annual reconstructions of water balance variables (e.g., potential evapotranspiration (PET), actual evapotranspiration (AET), snow water equivalent (SWE), soil moisture storage (SMS), and runoff (R)). The method involves resampling monthly temperature and precipitation from the instrumental record directed by variability indicated by the paleoclimate record. The generated time series of monthly temperature and precipitation are subsequently used as inputs to a monthly water balance model. The methodology is applied to the Upper Colorado River Basin, and results indicate that the methodology reliably simulates water-year runoff, maximum snow water equivalent, and seasonal soil moisture storage for the instrumental period. As a final application, the methodology is used to produce time series of PET, AET, SWE, SMS, and R for the 1404–1905 period for the Upper Colorado River Basin.
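The monthly water balance step can be sketched as a simple Thornthwaite-style bucket model. All parameter values, the crude temperature-proxy PET, and the melt rule below are assumptions for illustration, not the calibrated model of the study:

```python
def monthly_water_balance(temps_c, precip_mm, soil_cap=150.0,
                          melt_coeff=0.5, snow_temp=0.0):
    """Simplified Thornthwaite-style monthly water balance (illustrative).

    temps_c, precip_mm : monthly temperature (deg C) and precipitation (mm)
    Returns monthly lists of snow water equivalent (SWE), soil moisture
    storage (SMS) and runoff (R).
    """
    swe, soil = 0.0, soil_cap / 2.0
    swe_out, sms_out, runoff_out = [], [], []
    for t, p in zip(temps_c, precip_mm):
        if t <= snow_temp:                  # precipitation falls as snow
            swe += p
            water_in = 0.0
        else:                               # rain plus partial snowmelt
            melt = min(swe, melt_coeff * t * 10.0)
            swe -= melt
            water_in = p + melt
        pet = max(0.0, 10.0 * t)            # crude temperature-based PET (mm)
        soil += water_in - min(pet, water_in + soil)   # AET-limited demand
        soil = max(soil, 0.0)
        runoff = max(soil - soil_cap, 0.0)  # surplus above capacity runs off
        soil -= runoff
        swe_out.append(swe)
        sms_out.append(soil)
        runoff_out.append(runoff)
    return swe_out, sms_out, runoff_out

# Two cold months accumulate snow; warm wet months melt it and generate runoff.
swe, sms, runoff = monthly_water_balance([-5, -5, 5, 15], [50, 50, 100, 200])
```

The reconstruction methodology would drive such a model with resampled monthly temperature and precipitation sequences rather than with observed ones.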

  8. Assessment of COPD-related outcomes via a national electronic medical record database.

    PubMed

    Asche, Carl; Said, Quayyim; Joish, Vijay; Hall, Charles Oaxaca; Brixner, Diana

    2008-01-01

The technology and sophistication of healthcare utilization databases have expanded over the last decade to include results of lab tests, vital signs, and other clinical information. This review provides an assessment of the methodological and analytical challenges of conducting chronic obstructive pulmonary disease (COPD) outcomes research in a national electronic medical records (EMR) dataset and its potential application to the assessment of national health policy issues, as well as a description of the challenges and limitations. An EMR database and its application to measuring outcomes for COPD are described. The ability to measure adherence to the COPD evidence-based practice guidelines generated by the NIH, and to HEDIS quality indicators, in this database was examined. Case studies were used to assess adherence to the guidelines before and after their publication and to gauge conformity to the quality indicators. The EMR was the only source of information for pulmonary function tests, although infrequent ordering by primary care providers was an issue. The EMR data can be used to explore the impact of variation in healthcare provision on clinical outcomes. The EMR database permits access to specific lab data and biometric information. The richness and depth of information on "real world" use of health services for large population-based analytical studies at relatively low cost render such databases an attractive resource for outcomes research. Various sources of information exist to perform outcomes research. It is important to understand the desired endpoints of such research and choose the appropriate database source.

  9. Automatic generation of computable implementation guides from clinical information models.

    PubMed

    Boscá, Diego; Maldonado, José Alberto; Moner, David; Robles, Montserrat

    2015-06-01

Clinical information models are increasingly used to describe the contents of Electronic Health Records. Implementation guides are a common specification mechanism used to define such models. They contain, among other reference materials, all the constraints and rules that clinical information must obey. However, these implementation guides are typically oriented toward human readability, and thus cannot be processed by computers. As a consequence, they must be reinterpreted and transformed manually into an executable language such as Schematron or Object Constraint Language (OCL). This task can be difficult and error-prone owing to the large gap between the two representations. The challenge is to develop a methodology for the specification of implementation guides in such a way that humans can read and understand them easily and, at the same time, computers can process them. In this paper, we propose and describe a novel methodology that uses archetypes as the basis for the generation of implementation guides. We use archetypes to generate formal rules expressed in Natural Rule Language (NRL) and other reference materials usually included in implementation guides, such as sample XML instances. We also generate Schematron rules from NRL rules to be used for the validation of data instances. We have implemented these methods in LinkEHR, an archetype editing platform, and exemplify our approach by generating NRL rules and implementation guides from EN ISO 13606, openEHR, and HL7 CDA archetypes. Copyright © 2015 Elsevier Inc. All rights reserved.
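As a toy illustration of the target representation, the following sketch renders a single validation constraint as Schematron text. The element names, the constraint itself, and the string templating are hypothetical and do not reproduce LinkEHR's NRL-to-Schematron transformation:

```python
def schematron_rule(context, test, message):
    """Render one Schematron rule/assert pair as text (toy templating)."""
    return (
        f'<sch:rule context="{context}">\n'
        f'  <sch:assert test="{test}">{message}</sch:assert>\n'
        f'</sch:rule>'
    )

# Hypothetical constraint of the kind an archetype might express:
rule = schematron_rule(
    context="/ClinicalDocument/observation/systolic",
    test="number(.) >= 0 and not(number(.) > 300)",
    message="Systolic blood pressure must lie between 0 and 300 mmHg.",
)
```

Real implementation guides wrap such rules in namespaces, patterns, and phase declarations; the point here is only that each machine-processable constraint pairs a context, a boolean test, and a human-readable message.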

  10. Clinical diabetes research using data mining: a Canadian perspective.

    PubMed

    Shah, Baiju R; Lipscombe, Lorraine L

    2015-06-01

    With the advent of the digitization of large amounts of information and the computer power capable of analyzing this volume of information, data mining is increasingly being applied to medical research. Datasets created for administration of the healthcare system provide a wealth of information from different healthcare sectors, and Canadian provinces' single-payer universal healthcare systems mean that data are more comprehensive and complete in this country than in many other jurisdictions. The increasing ability to also link clinical information, such as electronic medical records, laboratory test results and disease registries, has broadened the types of data available for analysis. Data-mining methods have been used in many different areas of diabetes clinical research, including classic epidemiology, effectiveness research, population health and health services research. Although methodologic challenges and privacy concerns remain important barriers to using these techniques, data mining remains a powerful tool for clinical research. Copyright © 2015 Canadian Diabetes Association. Published by Elsevier Inc. All rights reserved.

  11. User-oriented views in health care information systems.

    PubMed

    Portoni, Luisa; Combi, Carlo; Pinciroli, Francesco

    2002-12-01

In this paper, we present the methodology we adopted in designing and developing an object-oriented database system for the management of medical records. The designed system provides technical solutions to important requirements of most clinical information systems, such as 1) the support of tools to create and manage views on data and view schemas, offering different users specific perspectives on data tailored to their needs; 2) the capability to handle in a suitable way the temporal aspects related to clinical information; and 3) the effective integration of multimedia data. Remote data access for authorized users is also considered. As a clinical application, we describe the prototype of a user-oriented clinical information system for the archiving and management of multimedia and temporally oriented clinical data related to percutaneous transluminal coronary angioplasty (PTCA) patients. Suitable view schemas for various user roles (cath-lab physician, ward nurse, general practitioner) have been modeled and implemented on the basis of a detailed analysis of the considered clinical environment, carried out by an object-oriented approach.

  12. Linking Supermarket Sales Data To Nutritional Information: An Informatics Feasibility Study

    PubMed Central

    Brinkerhoff, Kristina M.; Brewster, Philip J.; Clark, Edward B.; Jordan, Kristine C.; Cummins, Mollie R.; Hurdle, John F.

    2011-01-01

    Grocery sales are a data source of potential value to dietary assessment programs in public health informatics. However, the lack of a computable method for mapping between nutrient and food item information represents a major obstacle. We studied the feasibility of linking point-of-sale data to USDA-SR nutrient database information in a sustainable way. We analyzed 2,009,533 de-identified sales items purchased by 32,785 customers over a two-week period. We developed a method using the item category hierarchy in the supermarket’s database to link purchased items to records from the USDA-SR. We describe our methodology and its rationale and limitations. Approximately 70% of all items were mapped and linked to the SR; approximately 90% of all items could be mapped with an equivalent expenditure of additional effort. 100% of all items were mapped to USDA standard food groups. We conclude that mapping grocery sales data to nutritional information is feasible. PMID:22195115
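The hierarchy-walking link step can be sketched as follows. The category names and lookup tables are hypothetical, though `09003` is the USDA-SR record number for raw apples with skin:

```python
def map_to_usda(item_category, parent_of, category_to_sr):
    """Walk up the supermarket category hierarchy until some level has a
    USDA-SR mapping; return the SR record id, or None if no level maps.

    parent_of      : child category -> parent category (hypothetical table)
    category_to_sr : category -> USDA-SR record id (hypothetical table)
    """
    cat = item_category
    while cat is not None:
        if cat in category_to_sr:
            return category_to_sr[cat]
        cat = parent_of.get(cat)
    return None

# Toy hierarchy: fuji apples -> apples -> fresh fruit -> produce.
parent_of = {"fuji apples": "apples", "apples": "fresh fruit",
             "fresh fruit": "produce"}
category_to_sr = {"apples": "09003"}   # SR 09003: Apples, raw, with skin
sr_id = map_to_usda("fuji apples", parent_of, category_to_sr)
```

Items whose entire ancestry lacks an SR mapping fall into the unmapped remainder — roughly the 30% gap reported above that further curation effort would close.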

  13. Bayesian analysis of U.S. hurricane climate

    USGS Publications Warehouse

    Elsner, James B.; Bossak, Brian H.

    2001-01-01

    Predictive climate distributions of U.S. landfalling hurricanes are estimated from observational records over the period 1851–2000. The approach is Bayesian, combining the reliable records of hurricane activity during the twentieth century with the less precise accounts of activity during the nineteenth century to produce a best estimate of the posterior distribution on the annual rates. The methodology provides a predictive distribution of future activity that serves as a climatological benchmark. Results are presented for the entire coast as well as for the Gulf Coast, Florida, and the East Coast. Statistics on the observed annual counts of U.S. hurricanes, both for the entire coast and by region, are similar within each of the three consecutive 50-yr periods beginning in 1851. However, evidence indicates that the records during the nineteenth century are less precise. Bayesian theory provides a rational approach for defining hurricane climate that uses all available information and that makes no assumption about whether the 150-yr record of hurricanes has been adequately or uniformly monitored. The analysis shows that the number of major hurricanes expected to reach the U.S. coast over the next 30 yr is 18 and the number of hurricanes expected to hit Florida is 20.
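The combination of precise and imprecise records can be illustrated with a conjugate gamma-Poisson model in which the nineteenth-century counts enter the likelihood at reduced weight. The prior, the weight, and the toy counts below are assumptions for illustration, not the paper's exact formulation:

```python
def posterior_rate(modern_counts, early_counts, w_early=0.5,
                   a0=0.5, b0=0.01):
    """Gamma posterior for the annual hurricane rate.

    Gamma(a0, b0) prior; modern annual counts enter the Poisson likelihood
    at full weight, early (less reliable) counts at weight w_early.
    Returns (shape, rate) of the posterior Gamma distribution.
    """
    a = a0 + sum(modern_counts) + w_early * sum(early_counts)
    b = b0 + len(modern_counts) + w_early * len(early_counts)
    return a, b

# Toy records: 10 reliable years and 10 less precise years of annual counts.
modern = [2, 1, 3, 0, 2, 1, 2, 2, 1, 3]
early = [1, 0, 2, 1, 1, 0, 1, 2, 0, 1]
a, b = posterior_rate(modern, early)
mean_rate = a / b                    # posterior mean, hurricanes per year
expected_30yr = 30 * mean_rate       # predictive expectation over 30 years
```

Setting `w_early` below one formalizes the judgment that the nineteenth-century record is less precise without discarding it, which is the spirit of the Bayesian combination described above.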

  14. The importance of independent chronology in integrating records of past climate change for the 60-8 ka INTIMATE time interval

    NASA Astrophysics Data System (ADS)

    Brauer, Achim; Hajdas, Irka; Blockley, Simon P. E.; Bronk Ramsey, Christopher; Christl, Marcus; Ivy-Ochs, Susan; Moseley, Gina E.; Nowaczyk, Norbert N.; Rasmussen, Sune O.; Roberts, Helen M.; Spötl, Christoph; Staff, Richard A.; Svensson, Anders

    2014-12-01

    This paper provides a brief overview of the most common dating techniques applied in palaeoclimate and palaeoenvironmental studies including four radiometric and isotopic dating methods (radiocarbon, 230Th disequilibrium, luminescence, cosmogenic nuclides) and two incremental methods based on layer counting (ice layer, varves). For each method, concise background information about the fundamental principles and methodological approaches is provided. We concentrate on the time interval of focus for the INTIMATE (Integrating Ice core, MArine and TErrestrial records) community (60-8 ka). This dating guide addresses palaeoclimatologists who aim at interpretation of their often regional and local proxy time series in a wider spatial context and, therefore, have to rely on correlation with proxy records obtained from different archives from various regions. For this reason, we especially emphasise scientific approaches for harmonising chronologies for sophisticated and robust proxy data integration. In this respect, up-to-date age modelling techniques are presented as well as tools for linking records by age equivalence including tephrochronology, cosmogenic 10Be and palaeomagnetic variations. Finally, to avoid inadequate documentation of chronologies and assure reliable correlation of proxy time series, this paper provides recommendations for minimum standards of uncertainty and age datum reporting.

  15. Use of a GIS-based hybrid artificial neural network to prioritize the order of pipe replacement in a water distribution network.

    PubMed

    Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien

    2010-07-01

A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of the ANN, while "the number of monthly breaks" was used as the prediction output. This study is the first attempt to incorporate earthquake data in a break-event ANN prediction model. The spatial distribution of the pipeline break-event data was analyzed and visualized by GIS. Through this, users can quickly identify leakage hotspots. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to the traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where break-event records are unavailable.
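A break-prediction network of this kind can be sketched as a one-hidden-layer regressor trained by gradient descent. The architecture, the feature encoding, and the synthetic data below are assumptions for illustration, not the study's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy features: [diameter/100, material one-hot (PVC, iron), quakes per month]
X = rng.uniform(0.0, 1.0, size=(200, 4))
# Synthetic target: smaller diameter, iron pipes and quakes raise break counts.
y = 2.0 * (1.0 - X[:, 0]) + 1.5 * X[:, 2] + 0.8 * X[:, 3] \
    + 0.05 * rng.standard_normal(200)

# One-hidden-layer regression network trained with plain gradient descent.
W1 = 0.5 * rng.standard_normal((4, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal(8); b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))       # loss before training

lr = 0.2
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                              # gradient of MSE (up to 2/n)
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)     # backprop through tanh
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
loss = float(np.mean((pred - y) ** 2))          # loss after training
```

The fitted monthly break predictions could then be joined to pipe segments in a GIS layer to rank replacement priority, mirroring the ANN-plus-GIS integration described above.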

  16. Methodology to determine the parameters of historical earthquakes in China

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Lin, Guoliang; Zhang, Zhe

    2017-12-01

China is one of the countries with the longest cultural traditions, and it has suffered very heavy earthquake disasters; consequently, abundant earthquake records exist. In this paper, we try to sketch out historical earthquake sources and research achievements in China. We introduce some basic information about the collections of historical earthquake sources, the establishment of the intensity scale and the editions of historical earthquake catalogues. Spatial-temporal and magnitude distributions of historical earthquakes are analyzed briefly. Besides traditional methods, we also illustrate a new approach to amend the parameters of historical earthquakes or even identify candidate zones for large historical or palaeo-earthquakes. In the new method, a relationship between instrumentally recorded small earthquakes and strong historical earthquakes is built up. Abundant historical earthquake sources and the achievements of historical earthquake research in China are a valuable cultural heritage of the world.

  17. Considerations in the Use of Interactive Voice Recording for the Temporal Assessment of Suicidal Ideation and Alcohol Use.

    PubMed

    Bishop, Todd M; Maisto, Stephen A; Britton, Peter C; Pigeon, Wilfred R

    2016-09-01

A greater understanding of the temporal variation of suicidal ideation and suicidal behavior is needed to inform more effective prevention efforts. Interactive voice recording (IVR) allows for the study of temporal relationships that cannot be captured with most traditional methodologies. This study examined the feasibility of implementing IVR for the assessment of suicidal ideation. Participants (n = 4) receiving a brief intervention based on dialectical behavior therapy were asked to respond to three phone-based surveys each day over 6 weeks that assessed suicidal ideation and alcohol consumption. Participants completed 77.7% of daily assessments, reported that calls were not burdensome, and indicated that calls were sometimes helpful in interrupting suicidal ideation. The preliminary data reported here provide optimism for the use of IVR and other forms of ecological momentary assessment in the exploration of the antecedents of suicidal behavior.

  18. SHOULDER ARTHROPLASTY RECORDS

    PubMed Central

    Filho, Geraldo Motta; Galvão, Marcus Vinicius; Monteiro, Martim; Cohen, Marcio; Brandão, Bruno

    2015-01-01

The study's objective is to evaluate the characteristics and problems of patients who underwent shoulder arthroplasties between July 2004 and November 2006. Methodology: During the period of the study, 145 shoulder arthroplasties were performed. A prospective protocol was used for every patient; demographic, clinical and surgical procedure data were collected. All gathered data were included in the database. The patients were divided into three major groups: fractures, degenerative diseases and trauma sequelae. Information obtained from the database was correlated in order to determine the patients' epidemiologic, injury, and surgical procedure profiles. Results: Of the 145 shoulder arthroplasties performed, 37% presented trauma sequelae, 30% degenerative diseases, and 33% proximal humerus fractures. 12% of the cases required total arthroplasties and 88% partial arthroplasties. Five major complications were observed in the early postoperative period. Conclusion: Shoulder arthroplasties have become a common procedure in orthopaedic practice. Surgical records are important for documenting progressive evolution and for enabling future evaluation of clinical outcomes. PMID:26998463

  19. A cloud-based approach for interoperable electronic health records (EHRs).

    PubMed

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology, which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.

  20. The Impact of Boundary Spanning Scholarly Publications and Patents

    PubMed Central

    Shi, Xiaolin; Adamic, Lada A.; Tseng, Belle L.; Clarkson, Gavin S.

    2009-01-01

Background Human knowledge and innovation are recorded in two media: scholarly publication and patents. These records not only document a new scientific insight or new method developed, but they also carefully cite prior work upon which the innovation is built. Methodology We quantify the impact of information flow across fields using two large citation datasets: one spanning over a century of scholarly work in the natural sciences, social sciences and humanities, and the second spanning a quarter century of United States patents. Conclusions We find that a publication's citing across disciplines is tied to its subsequent impact. In the case of patents and natural science publications, those that are cited at least once are cited slightly more when they draw on research outside of their area. In contrast, in the social sciences, citing within one's own field tends to be positively correlated with impact. PMID:19688087

  1. Operational Interoperability Challenges on the Example of GEOSS and WIS

    NASA Astrophysics Data System (ADS)

    Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.

    2015-12-01

The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While WIS has strong governance with its own metadata profile for its hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed backward compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved across the different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.

  2. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework.

    PubMed

    Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. 
The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
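
The quantitizing step described above - turning qualitative codes into matrices of codes, not frequencies - can be sketched as follows. The segments and codes are invented examples; real studies would use an ad hoc observation instrument as described.

```python
# Sketch: convert coded text segments into a binary code matrix
# (rows = segments, columns = codes) ready for quantitative analysis.

segments = [
    {"seg": 1, "codes": {"greeting", "request"}},
    {"seg": 2, "codes": {"request", "complaint"}},
    {"seg": 3, "codes": {"complaint"}},
]

# Stable column order across the whole dataset.
code_list = sorted({c for s in segments for c in s["codes"]})
matrix = [[1 if c in s["codes"] else 0 for c in code_list] for s in segments]

print(code_list)        # column labels
for row in matrix:
    print(row)          # one row of the code matrix per segment
```

Such a matrix is the input the paper envisages for detecting underlying structures and behavioral patterns (e.g. by sequential or lag analysis).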

  3. Determining return water levels at ungauged coastal sites: a case study for northern Germany

    NASA Astrophysics Data System (ADS)

    Arns, Arne; Wahl, Thomas; Haigh, Ivan D.; Jensen, Jürgen

    2015-04-01

    We estimate return periods and levels of extreme still water levels for the highly vulnerable and historically and culturally important small marsh islands known as the Halligen, located in the Wadden Sea offshore of the coast of northern Germany. This is a challenging task, as only a few water level records are available for this region, and they are currently too short to apply traditional extreme value analysis methods. Therefore, we use the Regional Frequency Analysis (RFA) approach. This approach originates from hydrology but has been used in several coastal studies before, and is currently applied by the local federal administration responsible for coastal protection in the study area. The RFA enables us to indirectly estimate return levels by transferring hydrological information from gauged to related ungauged sites. Our analyses highlight that this methodology has some drawbacks and may over- or underestimate return levels compared to direct analyses using station data. To overcome these issues, we present an alternative approach, combining numerical and statistical models. First, we produced a numerical multidecadal model hindcast of water levels for the entire North Sea. Predicted water levels from the hindcast are bias corrected using the information from the available tide gauge records, so that the simulated water levels agree well with the measured water levels at gauged sites. The bias correction is then interpolated spatially to obtain correction functions for the simulated water levels at each coastal and island model grid point in the study area. Using a procedure for conducting extreme value analyses recommended in a companion study, return water levels suitable for coastal infrastructure design are estimated continuously along the entire coastline of the study area, including the offshore islands. A similar methodology can be applied in other regions of the world where tide gauge observations are sparse.
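
The bias-correction idea can be illustrated numerically: at gauged points the model bias (observed minus simulated) is known, and it is interpolated to ungauged grid points. Inverse-distance weighting is used here purely as a stand-in for the study's spatial interpolation of correction functions; the coordinates and bias values are invented.

```python
import numpy as np

# Sketch: spatially interpolate tide-gauge bias corrections to an
# ungauged model grid point (IDW as an illustrative interpolator).

gauge_xy   = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # gauge positions
gauge_bias = np.array([0.15, -0.05, 0.10])    # metres, observed - simulated

def idw_bias(p, xy=gauge_xy, b=gauge_bias, power=2.0):
    d = np.linalg.norm(xy - p, axis=1)
    if np.any(d < 1e-9):                      # exactly on a gauge
        return float(b[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * b) / np.sum(w))

simulated = 2.40                              # modelled level at ungauged point (m)
corrected = simulated + idw_bias(np.array([5.0, 5.0]))
print(round(corrected, 3))
```

Extreme value analysis would then be run on the corrected series at each grid point, which is what allows return levels to be mapped continuously along the coastline.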

  4. Methodological challenges to human medical study.

    PubMed

    Zhong, Yixin; Liu, Baoyan; Qu, Hua; Xie, Qi

    2014-09-01

    With the transformation of the modern medical paradigm, medical studies are confronted with methodological challenges. By analyzing the two methodologies used in the study of physical matter systems and of information systems, the article points out that traditional Chinese medicine (TCM), especially treatment based on syndrome differentiation, embodies an information conception in its methodological position, while western medicine represents a matter conception. It proposes a new way of thinking about the combination of TCM and western medicine by combining these two kinds of methodologies.

  5. Implications for registry-based vaccine effectiveness studies from an evaluation of an immunization registry: a cross-sectional study.

    PubMed

    Mahon, Barbara E; Shea, Kimberly M; Dougherty, Nancy N; Loughlin, Anita M

    2008-05-14

    Population-based electronic immunization registries create the possibility of using registry data to conduct vaccine effectiveness studies that could have methodological advantages over traditional observational studies. For study validity, the base population would have to be clearly defined and the immunization status of members of the population accurately recorded in the registry. We evaluated a city-wide immunization registry, focusing on its potential as a tool to study pertussis vaccine effectiveness, especially in adolescents. We conducted two evaluations - one in sites that were active registry participants and one in sites that had implemented an electronic medical record with plans for future direct data transfer to the registry - of the ability to match patients' medical records to registry records and the accuracy of immunization records in the registry. For each site, records from current pediatric patients were chosen randomly. Data regarding pertussis-related immunizations, clinic usage, and demographic and identifying information were recorded; for 11-17-year-old subjects, information on MMR, hepatitis B, and varicella immunizations was also collected. Records were then matched, when possible, to registry records. For records with a registry match, immunization data were compared. Among 350 subjects from sites that were current registry users, 307 (87.7%) matched a registry record. Discrepancies in pertussis-related data were common for up-to-date status (22.6%), number of immunizations (34.7%), dates (10.2%), and formulation (34.4%). Among 442 subjects from sites that planned direct electronic transfer of immunization data to the registry, 393 (88.9%) would have matched a registry record; discrepancies occurred frequently in number of immunizations (11.9%), formulation (29.1%), manufacturer (94.4%), and lot number (95.1%).
Inability to match and immunization discrepancies were both more common in subjects who were older at their first visit to the provider site. For 11-17-year-old subjects, discrepancies were also common for MMR, hepatitis B, and varicella vaccination data. Provider records frequently could not be matched to registry records or had discrepancies in key immunization data. These issues were more common for older children and were present even with electronic data transfer. These results highlight general challenges that may face investigators wishing to use registry-based immunization data for vaccine effectiveness studies, especially in adolescents.
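
The evaluation logic - match provider records to registry records on an identifier, then count field-level discrepancies among matched pairs - is simple to sketch. The record contents and field names below are invented for illustration; the actual study matched on demographic identifiers.

```python
# Sketch: match rate and field-level discrepancy rates between a provider's
# records and a registry. All record contents are hypothetical.

provider = {"A1": {"doses": 5, "formulation": "DTaP"},
            "A2": {"doses": 4, "formulation": "DTaP"},
            "A3": {"doses": 5, "formulation": "DTaP"}}
registry = {"A1": {"doses": 5, "formulation": "DTaP"},
            "A2": {"doses": 3, "formulation": "DTwP"}}

matched = {k: (provider[k], registry[k]) for k in provider if k in registry}
match_rate = len(matched) / len(provider)

def discrepancy_rate(field):
    """Share of matched pairs whose values disagree on the given field."""
    diffs = sum(1 for p, r in matched.values() if p[field] != r[field])
    return diffs / len(matched)

print(match_rate, discrepancy_rate("doses"), discrepancy_rate("formulation"))
```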

  6. Implications for registry-based vaccine effectiveness studies from an evaluation of an immunization registry: A cross-sectional study

    PubMed Central

    Mahon, Barbara E; Shea, Kimberly M; Dougherty, Nancy N; Loughlin, Anita M

    2008-01-01

    Background Population-based electronic immunization registries create the possibility of using registry data to conduct vaccine effectiveness studies that could have methodological advantages over traditional observational studies. For study validity, the base population would have to be clearly defined and the immunization status of members of the population accurately recorded in the registry. We evaluated a city-wide immunization registry, focusing on its potential as a tool to study pertussis vaccine effectiveness, especially in adolescents. Methods We conducted two evaluations – one in sites that were active registry participants and one in sites that had implemented an electronic medical record with plans for future direct data transfer to the registry – of the ability to match patients' medical records to registry records and the accuracy of immunization records in the registry. For each site, records from current pediatric patients were chosen randomly. Data regarding pertussis-related immunizations, clinic usage, and demographic and identifying information were recorded; for 11–17-year-old subjects, information on MMR, hepatitis B, and varicella immunizations was also collected. Records were then matched, when possible, to registry records. For records with a registry match, immunization data were compared. Results Among 350 subjects from sites that were current registry users, 307 (87.7%) matched a registry record. Discrepancies in pertussis-related data were common for up-to-date status (22.6%), number of immunizations (34.7%), dates (10.2%), and formulation (34.4%). Among 442 subjects from sites that planned direct electronic transfer of immunization data to the registry, 393 (88.9%) would have matched a registry record; discrepancies occurred frequently in number of immunizations (11.9%), formulation (29.1%), manufacturer (94.4%), and lot number (95.1%).
Inability to match and immunization discrepancies were both more common in subjects who were older at their first visit to the provider site. For 11–17-year-old subjects, discrepancies were also common for MMR, hepatitis B, and varicella vaccination data. Conclusion Provider records frequently could not be matched to registry records or had discrepancies in key immunization data. These issues were more common for older children and were present even with electronic data transfer. These results highlight general challenges that may face investigators wishing to use registry-based immunization data for vaccine effectiveness studies, especially in adolescents. PMID:18479517

  7. A standard methodology for the analysis, recording, and control of verbal behavior

    PubMed Central

    Drash, Philip W.; Tudor, Roger M.

    1991-01-01

    Lack of a standard methodology has been one of the major obstacles preventing advancement of behavior analytic research in verbal behavior. This article presents a standard method for the analysis, recording, and control of verbal behavior that overcomes several major methodological problems that have hindered operant research in verbal behavior. The system divides all verbal behavior into four functional response classes, correct, error, no response, and inappropriate behavior, from which all vocal responses of a subject may be classified and consequated. The effects of contingencies of reinforcement on verbal operants within each category are made immediately visible to the researcher as changes in frequency of response. Incorporating frequency of response within each category as the unit of response allows both rate and probability of verbal response to be utilized as basic dependent variables. This method makes it possible to record and consequate verbal behavior in essentially the same way as any other operant response. It may also facilitate an experimental investigation of Skinner's verbal response categories. PMID:22477629
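
The four-category scheme with frequency as the unit of response can be sketched directly: classify each scored vocal response and express every category as a rate. The session data below are invented; real use would score responses live or from recordings.

```python
# Sketch: the four functional response classes proposed above, with each
# category expressed as a rate (responses per minute). Data are invented.

CATEGORIES = ("correct", "error", "no_response", "inappropriate")

session = ["correct", "correct", "error", "no_response", "correct",
           "inappropriate", "correct", "error"]     # scored vocal responses
session_minutes = 4.0

rates = {c: session.count(c) / session_minutes for c in CATEGORIES}
print(rates["correct"])     # correct responses per minute
```

Rate per category is what makes the verbal response comparable to any other free-operant response and lets reinforcement effects show up as frequency changes.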

  8. AERIS: An Integrated Domain Information System for Aerospace Science and Technology

    ERIC Educational Resources Information Center

    Hatua, Sudip Ranjan; Madalli, Devika P.

    2011-01-01

    Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…

  9. An Information Theoretic Investigation Of Complex Adaptive Supply Networks With Organizational Topologies

    DTIC Science & Technology

    2016-12-22

    …assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool…

  10. Accelerated numerical processing of electronically recorded holograms with reduced speckle noise.

    PubMed

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2013-09-01

    The numerical reconstruction of digitally recorded holograms suffers from speckle noise. An accelerated method that uses general-purpose computing in graphics processing units to reduce that noise is shown. The proposed methodology utilizes parallelized algorithms to record, reconstruct, and superimpose multiple uncorrelated holograms of a static scene. For the best tradeoff between reduction of the speckle noise and processing time, the method records, reconstructs, and superimposes six holograms of 1024 × 1024 pixels in 68 ms; for this case, the methodology reduces the speckle noise by 58% compared with that exhibited by a single hologram. The fully parallelized method running on a commodity graphics processing unit is one order of magnitude faster than the same technique implemented on a regular CPU using its multithreading capabilities. Experimental results are shown to validate the proposal.
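
The noise-reduction principle behind the method is worth making concrete: averaging N uncorrelated speckle intensity patterns lowers the speckle contrast (std/mean) by roughly 1/sqrt(N), so six patterns give about a 59% reduction, consistent with the ~58% reported. The sketch below models fully developed speckle intensity as exponentially distributed; it illustrates the statistics only, not the GPU reconstruction pipeline.

```python
import numpy as np

# Sketch: speckle contrast of one pattern vs. the average of six
# uncorrelated patterns (exponential intensity model).

rng = np.random.default_rng(0)
shape = (512, 512)

def contrast(img):
    """Speckle contrast: standard deviation over mean of the intensity."""
    return img.std() / img.mean()

single = rng.exponential(1.0, shape)
stack  = rng.exponential(1.0, (6, *shape)).mean(axis=0)

print(round(contrast(single), 3))   # close to 1.0 for fully developed speckle
print(round(contrast(stack), 3))    # close to 1/sqrt(6), i.e. ~0.41
```

The paper's contribution is doing the recording, reconstruction, and superposition of the six holograms fast enough (68 ms on a GPU) that this statistical averaging is practical.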

  11. A systematic intercomparison of regional flood frequency analysis models in a simulation framework

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Laio, Francesco; Claps, Pierluigi

    2015-04-01

    Regional frequency analysis (RFA) is a well-established methodology for estimating the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting temporal information at a site (no data or short time series) with observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of the methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework within which to carry out such an intercomparison: we synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario, where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. The regional approaches considered include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis, which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model, which defines a homogeneous subregion for each site; (iv) a spatially-smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. 
    A comprehensive analysis of the results of the simulations shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches have comparable performances. Moreover, as expected, regional estimates are much more reliable than the at-site estimates. If the scenario is heterogeneous, the performances of the regional models depend on the pattern of heterogeneity; in general, however, the spatially-smooth regional approach performs better than the others, and its performances improve for increasing record lengths. For heterogeneous scenarios, the at-site estimates appear to be comparatively more efficient than in the homogeneous case, and in general less biased than the regional estimates.
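
The homogeneous-scenario result can be reproduced in miniature. The sketch below simplifies the GEV parent to a Gumbel (a GEV with zero shape) so that a method-of-moments fit stays short, and compares regional model (i) - one pooled estimate for the whole region - against purely at-site estimation of the 200-year quantile; it is an illustration of the simulation framework, not the paper's full experiment.

```python
import numpy as np

# Monte Carlo sketch, homogeneous scenario: pooled-regional vs. at-site
# estimation of the 200-year quantile under a Gumbel(0, 1) parent.

rng = np.random.default_rng(42)
EULER = 0.5772156649
T = 200
n_sites, n_years = 30, 30

def gumbel_quantile_mom(x, T=T):
    """Method-of-moments Gumbel fit, then the T-year quantile."""
    beta = x.std() * np.sqrt(6) / np.pi
    mu = x.mean() - EULER * beta
    return mu - beta * np.log(-np.log(1 - 1 / T))

true_q = -np.log(-np.log(1 - 1 / T))           # exact 200-yr quantile, known here
data = rng.gumbel(0.0, 1.0, (n_sites, n_years))

at_site  = np.array([gumbel_quantile_mom(row) for row in data])
regional = gumbel_quantile_mom(data.ravel())   # model (i): one pooled estimate

rmse_site = np.sqrt(np.mean((at_site - true_q) ** 2))
rmse_reg  = abs(regional - true_q)
print(rmse_site, rmse_reg)
```

With 30-year records the pooled regional estimate is far more reliable than the at-site ones, which is exactly the homogeneous-scenario conclusion above; heterogeneous scenarios would be built by varying the parent parameters across sites.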

  12. Principles of Successful Implementation of Lecture Recordings in Higher Education

    ERIC Educational Resources Information Center

    Ollermann, Frank; Rolf, Rüdiger; Greweling, Christian; Klaßen, André

    2017-01-01

    Purpose: This paper aims to describe the principles underlying the successful implementation of a lecture recording service in higher education. Design/methodology/approach: The paper qualitatively reviews the practices and experiences of several years of automated lecture recording at a medium-sized university in Germany. Findings: The paper…

  13. BAPA Database: a Landslide Inventory in the Principality of Asturias (NW Spain) by Using Press Archives and Free Cartographic Servers

    NASA Astrophysics Data System (ADS)

    Valenzuela, P.; Domínguez-Cuesta, M. J.; Jiménez-Sánchez, M.; Mora García, M. A.

    2015-12-01

    Owing to its geological and climatic conditions, the Principality of Asturias (NW Spain) experiences very common and widespread landslides, causing economic losses and, sometimes, human victims. In this scenario, temporal prediction of instabilities becomes particularly important. Although previous knowledge indicates that rainfall is the main trigger, the lack of data hinders proper temporal forecasting of landslides in the region. To remedy this deficiency, a new landslide inventory is being developed: the BAPA (Base de datos de Argayos del Principado de Asturias - Principality of Asturias Landslide Database). Data collection is mainly performed by gathering local newspaper archives, with special emphasis on the registration of spatial and temporal information. Moreover, a BAPA App and a BAPA website (http://geol.uniovi.es/BAPA) have been developed to easily obtain additional information from authorities and private individuals. At present, the dataset covers the period 1980-2015, registering more than 2000 individual landslide events. Fifty-two per cent of the records provide accurate dates, showing the usefulness of press archives as temporal records. The use of free cartographic servers, such as Google Maps, Google Street View and Iberpix (Government of Spain), combined with the spatial descriptions and photographs contained in the press releases, makes it possible to determine the exact location in fifty-eight per cent of the records. Field work performed to date has allowed validation of the methodology proposed to obtain spatial data. In addition, the BAPA database contains information about source, landslide typology, triggers, damage and costs.

  14. The Use of Intensity Scales In Exploiting Tsunami Historical Databases

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Scheele, F.

    2015-12-01

    Post-disaster assessments for historical tsunami events (>15 years old) are either scarce or contain limited information. In this study, we assess ways to examine tsunami impacts by utilizing data from old events but, more importantly, we examine how best to utilize the information contained in tsunami historical databases in order to provide meaningful products that describe the impact of an event. To this end, a tsunami intensity scale was applied to two historical events observed in New Zealand (one local and one distant), in order to utilize the largest possible number of observations in our dataset. This is especially important for countries like New Zealand, where the tsunami historical record is short, going back only to the 19th century, and where instrument recordings are only available for the most recent events. We found that, despite a number of challenges in using intensities - uncertainties partly due to limitations of historical event data - these data, with the help of GIS tools, can be used to produce hazard maps and offer an alternative way to exploit tsunami historical records. Most importantly, the assignment of intensities at each point of observation allows the utilization of many more observations than if one depends on physical information alone, such as water heights. We hope these results may be used towards developing a well-defined methodology for hazard assessments, refining our knowledge of past tsunami events for which the tsunami sources are largely unknown, and covering cases where physical quantities describing the tsunami (e.g. water height, flood depth, run-up) are scarce.

  15. Palaeotsunamis in the Pacific Islands

    USGS Publications Warehouse

    Goff, J.; Chague-Goff, C.; Dominey-Howes, D.; McAdoo, B.; Cronin, S.; Bonte-Grapetin, Michael; Nichol, S.; Horrocks, M.; Cisternas, M.; Lamarche, G.; Pelletier, B.; Jaffe, B.; Dudley, W.

    2011-01-01

    The recent 29 September 2009 South Pacific and 27 February 2010 Chilean events are a graphic reminder that the tsunami hazard and risk for the Pacific Ocean region should not be forgotten. Pacific Islands Countries (PICs) generally have short (<150 years) historic records, which means that to understand their tsunami hazard and risk researchers must study evidence for prehistoric events. However, our current state of knowledge of palaeotsunamis in PICs, as opposed to their circum-Pacific counterparts, is minimal at best. We briefly outline the limited extent of our current knowledge and propose an innovative methodology for future research in the Pacific. Each PIC represents a point source of information in the Pacific Ocean, and this would allow their palaeotsunami records to be treated akin to palaeo-DART (Deep-ocean Assessment and Reporting of Tsunamis) buoys. Contemporaneous palaeotsunamis from local, regional and distant sources could be identified by using the spatial distribution of island records throughout the Pacific Ocean in conjunction with robust event chronologies. This would be highly innovative and, more importantly, would help provide the building blocks necessary to achieve more meaningful disaster risk reduction for PICs. © 2010 Elsevier B.V.

  16. Dynamic Decision Making under Uncertainty and Partial Information

    DTIC Science & Technology

    2017-01-30

    …In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those…

  17. A methodology based on openEHR archetypes and software agents for developing e-health applications reusing legacy systems.

    PubMed

    Cardoso de Moraes, João Luís; de Souza, Wanderley Lopes; Pires, Luís Ferreira; do Prado, Antonio Francisco

    2016-10-01

    In Pervasive Healthcare, novel information and communication technologies are applied to support the provision of health services anywhere, at anytime and to anyone. Since health systems may offer their health records in different electronic formats, the openEHR Foundation prescribes the use of archetypes for describing clinical knowledge in order to achieve semantic interoperability between these systems. Software agents have been applied to simulate human skills in some healthcare procedures. This paper presents a methodology, based on the use of openEHR archetypes and agent technology, which aims to overcome the weaknesses typically found in legacy healthcare systems, thereby adding value to the systems. This methodology was applied in the design of an agent-based system, which was used in a realistic healthcare scenario in which a medical staff meeting to prepare a cardiac surgery has been supported. We conducted experiments with this system in a distributed environment composed by three cardiology clinics and a center of cardiac surgery, all located in the city of Marília (São Paulo, Brazil). We evaluated this system according to the Technology Acceptance Model. The case study confirmed the acceptance of our agent-based system by healthcare professionals and patients, who reacted positively with respect to the usefulness of this system in particular, and with respect to task delegation to software agents in general. The case study also showed that a software agent-based interface and a tools-based alternative must be provided to the end users, which should allow them to perform the tasks themselves or to delegate these tasks to other people. A Pervasive Healthcare model requires efficient and secure information exchange between healthcare providers. 
    The proposed methodology allows designers to build communication systems for message exchange among heterogeneous healthcare systems, and to shift from systems that rely on informal communication between actors to a more automated and less error-prone agent-based system. Our methodology preserves the significant investment of many years in the legacy systems and allows developers to extend them with new features, providing proactive assistance to end-users and increasing user mobility with appropriate support. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
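
The integration idea - a wrapper that reads a legacy-format record and re-expresses it in an archetype-conformant structure before an agent forwards it - can be sketched as below. The legacy field names and dict layout are invented for illustration; real openEHR archetypes are formal ADL definitions with template-driven validation, not Python dicts.

```python
# Illustrative sketch only: re-express a legacy record in an
# archetype-shaped message. Legacy field names are hypothetical.

def wrap_legacy(legacy_row):
    """Translate a legacy blood-pressure row into an archetype-shaped message."""
    return {
        "archetype_id": "openEHR-EHR-OBSERVATION.blood_pressure.v1",
        "data": {
            "systolic":  {"magnitude": legacy_row["sys"], "units": "mm[Hg]"},
            "diastolic": {"magnitude": legacy_row["dia"], "units": "mm[Hg]"},
        },
    }

msg = wrap_legacy({"sys": 120, "dia": 80})
print(msg["data"]["systolic"]["magnitude"])
```

Because every system on the wire sees the same archetype-shaped messages, agents can route and act on clinical content without knowing each legacy system's native format, which is the semantic-interoperability point made above.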

  18. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies and is appropriate for the development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and to record and report the emerging design.

  19. One approach to design of speech emotion database

    NASA Astrophysics Data System (ADS)

    Uhrin, Dominik; Chmelikova, Zdenka; Tovarek, Jaromir; Partila, Pavol; Voznak, Miroslav

    2016-05-01

    This article describes a system for evaluating the credibility of recordings with emotional character. The sound recordings form a Czech-language database for training and testing systems for speech emotion recognition, which are designed to detect human emotions in the speaker's voice. Information about a speaker's emotional state is useful to the security forces and to emergency call services. Personnel in action (soldiers, police officers and firefighters) are often exposed to stress, and information about their emotional state, carried in the voice, helps the dispatcher adapt control commands during an intervention. Call agents of an emergency call service must recognize the mental state of the caller to adjust the mood of the conversation; in this case, evaluation of the psychological state is the key factor for a successful intervention. A quality database of sound recordings is essential for creating such systems. Quality databases exist, such as the Berlin Database of Emotional Speech or Humaine, but they were recorded by actors in audio studios, which means the recordings contain simulated emotions, not real ones. Our research aims at creating a database of Czech emotional recordings of real human speech. Collecting sound samples for the database is only one of the tasks; another, no less important, is to evaluate the significance of the recordings from the perspective of emotional states. The design of a methodology for evaluating the credibility of emotional recordings is described in this article. The results describe the advantages and applicability of the developed method.
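
One simple way to score a recording's credibility is the proportion of listeners whose label matches the intended emotion. The rating scheme and data below are invented for illustration and are not claimed to be the authors' actual evaluation method.

```python
# Sketch: credibility of an emotional recording as listener-label agreement
# with the intended emotion. Labels and recordings are invented.

def credibility(intended, rater_labels):
    """Share of raters whose label matches the intended emotion."""
    return sum(1 for lab in rater_labels if lab == intended) / len(rater_labels)

recordings = {
    "rec_001": ("anger",   ["anger", "anger", "fear", "anger"]),
    "rec_002": ("sadness", ["neutral", "sadness", "neutral", "neutral"]),
}

for rid, (intended, labels) in recordings.items():
    print(rid, credibility(intended, labels))
```

Recordings scoring below a chosen threshold would be excluded from the training set, so that only samples listeners reliably perceive as the intended emotion remain.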

  20. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S wave arrivals based on their lower apparent phase velocities and distinct higher amount of generated rotational motion relative to compressional (P-) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. 
The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic material showing high (>0.4 down to 60 m) Poisson's ratios. Our new model can be used in future studies to better constrain the deep interior of the Moon. Given the rich information derived from the minimalistic recording configuration, our results demonstrate that wavefield gradient analysis should be critically considered for future space missions that aim to explore the interior structure of extraterrestrial objects by seismic methods. Additionally, we anticipate that the proposed shear wave identification methodology can also be applied to the routinely recorded vertical component data from land seismic exploration on Earth.
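
The wavefield-gradient principle used above can be demonstrated on synthetic data: for a plane wave u(x, t) = f(t - px), the spatial and temporal derivatives satisfy du/dx = -p du/dt, so the apparent slowness p (and hence the apparent phase velocity 1/p) follows from a regression of the two gradients. The geometry and wavelet below are synthetic stand-ins, not the LSPE data.

```python
import numpy as np

# Sketch: estimate apparent phase velocity from spatial and temporal
# wavefield gradients on a small line of vertical-component receivers.

v_true = 300.0                    # m/s, true apparent velocity
dt, dx = 1e-3, 1.0                # sample interval (s), receiver spacing (m)
t = np.arange(0, 0.4, dt)
x = np.arange(4) * dx             # four receivers, as in the LSPE geophone array

def pulse(tau):
    return np.exp(-((tau - 0.2) / 0.02) ** 2)   # smooth synthetic wavelet

u = pulse(t[None, :] - x[:, None] / v_true)     # traces: (station, time)

du_dx = np.gradient(u, dx, axis=0)
du_dt = np.gradient(u, dt, axis=1)

# Least-squares slowness from du/dx = -p * du/dt over all samples.
p = -np.sum(du_dx * du_dt) / np.sum(du_dt ** 2)
print(round(1.0 / p, 1))          # estimated apparent velocity in m/s
```

Because S waves arrive with lower apparent velocity (larger p) than P waves, this kind of gradient-derived attribute is what allows the shear arrivals to be picked out of the reverberating lunar records.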

  1. Using the FAIMS Mobile App for field data recording

    NASA Astrophysics Data System (ADS)

    Ballsun-Stanton, Brian; Klump, Jens; Ross, Shawn

    2016-04-01

Multiple people creating data in the field poses a hard technical problem: our ``web 2.0'' environment presumes constant connectivity, data ``authority'' held by centralised servers, and sees mobile devices as tools for presentation rather than origination. A particular design challenge is the remoteness of the sampling locations, hundreds of kilometres away from network access. The alternative, however, is hand collection with a lengthy, error-prone, and expensive digitisation process. This poster will present a field-tested open-source solution to field data recording. This solution, originally created by a community of archaeologists, needed to accommodate diverse recording methodologies. The community could not agree on standard vocabularies, workflows, attributes, or methodologies, but most agreed that an app to ``record data in the field'' was desirable. As a result, the app is generalised for field data collection; not only can it record a range of data types, but it is deeply customisable. The NeCTAR / ARC funded FAIMS Project therefore created an app which allows for arbitrary data collection in the field. To accomplish this ambitious goal, FAIMS relied heavily on OSS projects including spatialite and gdal (for GIS support), sqlite (for a lightweight key-attribute-value datastore), Javarosa and Beanshell (for UI and scripting), Ruby, and Linux. Only by standing on the shoulders of giants was FAIMS able to make a flexible and highly generalisable field data collection system that CSIRO geoscientists were able to customise to suit most of their completely unanticipated needs. While single-task apps (e.g. those commissioned by structural geologists to take strikes and dips) will excel in their domains, other geoscientists (palaeoecologists, palaeontologists, anyone taking samples) likely cannot afford to commission domain- and methodology-specific recording tools for their custom recording needs.
FAIMS shows the utility of OSS software development and provides geoscientists a way forward for edge-case field data collection. Moreover, as the data is completely open and exports are scriptable, federation with other data services is both possible and encouraged. This poster will describe the internal architecture of the FAIMS app, show how it was used by CSIRO in the field, and display a graph of its OSS heritage. The app is available from Google Play, the recording module can be found at https://github.com/FAIMS/CSIRO-Water-Samples, and the exporter we used can be found at https://github.com/FAIMS/shapefileExport. You can make your own data-collection modules for free via the documentation at https://www.fedarch.org/support/#2. See chapter by Sobotkova et al. in {Mobilizing the Past}, forthcoming 2016. Ross, S., et al. (2013) Creating eResearch tools for archaeologists: The federated archaeological information management systems project [online]. {Australian Archaeology}. Ross, S., et al. (2015). Building the bazaar: enhancing archaeological field recording through an open source approach. In Wilson, A. T., & Edwards, B. (Eds.), {Open Source Archaeology: Ethics and Practice}. Reid, N., et al. (2015) {A mobile app for geochemical field data acquisition.} Poster presented at AGU Fall Meeting 2015, San Francisco.

  2. Topical Review: Families Coping With Child Trauma: A Naturalistic Observation Methodology

    PubMed Central

    Barrett, Anna; Bowles, Peter; Conroy, Rowena; Mehl, Matthias R.

    2016-01-01

Objective To introduce a novel, naturalistic observational methodology (the Electronically Activated Recorder; EAR) as an opportunity to better understand the central role of the family environment in children’s recovery from trauma. Methods Discussion of current research methods and a systematic literature review of EAR studies on health and well-being. Results Surveys, experience sampling, and the EAR method each provide different opportunities and challenges for studying family interactions. We identified 17 articles describing relevant EAR studies. These investigated questions of emotional well-being, communicative behaviors, and interpersonal relationships, predominantly in adults. Five articles reported innovative research in children, triangulating EAR-observed behavioral data (e.g., on child conflict at home) with neuroendocrine assay, sociodemographic information, and parent report. Finally, we discussed psychometric, practical, and ethical considerations for conducting EAR research with children and families. Conclusions Naturalistic observation methods such as the EAR have potential for pediatric psychology studies regarding trauma and the family environment. PMID:25797943

  3. Whole slide imaging of unstained tissue using lensfree microscopy

    NASA Astrophysics Data System (ADS)

    Morel, Sophie Nhu An; Hervé, Lionel; Bordy, Thomas; Cioni, Olivier; Delon, Antoine; Fromentin, Catherine; Dinten, Jean-Marc; Allier, Cédric

    2016-04-01

Pathologist examination of tissue slides provides insightful information about a patient's disease. Traditional analysis of tissue slides is performed under a binocular microscope, which requires staining of the sample and delays the examination. We present a simple, cost-effective lensfree imaging method to record 2-4 μm resolution wide-field (10 mm2 to 6 cm2) images of unstained tissue slides. The sample processing time is reduced as there is no need for staining. A wide field of view (10 mm2) lensfree hologram is recorded in a single shot and the image is reconstructed in 2 s, providing a very fast acquisition chain. The acquisition is multispectral, i.e. multiple holograms are recorded simultaneously at three different wavelengths, and a dedicated holographic reconstruction algorithm is used to retrieve both amplitude and phase. Whole-slide imaging is obtained by recording 130 holograms with X-Y translation stages and by computing the mosaic of a 25 x 25 mm2 reconstructed image. The reconstructed phase provides a phase-contrast-like image of the unstained specimen, revealing structures of healthy and diseased tissue. Slides from various organs can be reconstructed, e.g. lung, colon, ganglion. To our knowledge, our method is the first technique that enables fast wide-field lensfree imaging of such unlabeled dense samples. This technique is much cheaper and more compact than a conventional phase contrast microscope and could be made portable. In sum, we present a new methodology that could quickly provide useful information when a rapid diagnosis is needed, such as tumor margin identification on frozen section biopsies during surgery.
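The record's reconstruction uses a dedicated multispectral algorithm; as a hedged single-wavelength stand-in, the standard angular-spectrum method sketches the core numerical step: back-propagating a recorded hologram to refocus the sample plane. All parameters (wavelength, pixel pitch, distance) are assumed values, not those of the described instrument.

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex optical field over distance z (angular-spectrum method)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, dx)
    fy = np.fft.fftfreq(ny, dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)   # transfer function; evanescent waves dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy "hologram": a Gaussian aperture propagated 100 um to the sensor plane,
# then numerically back-propagated (negative z) to refocus the sample plane.
wl, dx, z = 0.5e-6, 1.0e-6, 100e-6
n = 128
yy, xx = np.mgrid[:n, :n] - n // 2
sample = np.exp(-(xx ** 2 + yy ** 2) / (2 * 10.0 ** 2)).astype(complex)
hologram = angular_spectrum(sample, wl, dx, z)
refocused = angular_spectrum(hologram, wl, dx, -z)
```

Retrieving the phase of the refocused field is what yields the phase-contrast-like image of an unstained specimen; the multi-wavelength phase retrieval itself is beyond this sketch.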

  4. Methodological approaches of health technology assessment.

    PubMed

    Goodman, C S; Ahn, R

    1999-12-01

    In this era of evolving health care systems throughout the world, technology remains the substance of health care. Medical informatics comprises a growing contribution to the technologies used in the delivery and management of health care. Diverse, evolving technologies include artificial neural networks, computer-assisted surgery, computer-based patient records, hospital information systems, and more. Decision-makers increasingly demand well-founded information to determine whether or how to develop these technologies, allow them on the market, acquire them, use them, pay for their use, and more. The development and wider use of health technology assessment (HTA) reflects this demand. While HTA offers systematic, well-founded approaches for determining the value of medical informatics technologies, HTA must continue to adapt and refine its methods in response to these evolving technologies. This paper provides a basic overview of HTA principles and methods.

  5. The Value of Metrics for Science Data Center Management

    NASA Astrophysics Data System (ADS)

    Moses, J.; Behnke, J.; Watts, T. H.; Lu, Y.

    2005-12-01

The Earth Observing System Data and Information System (EOSDIS) has been collecting and analyzing records of science data archive, processing and product distribution for more than 10 years. The types of information collected and the analyses performed have matured and progressed to become an integral and necessary part of the system management and planning functions. Science data center managers are realizing the importance that metrics can play in influencing and validating their business model. New efforts focus on better understanding of users and their methods. Examples include tracking user web site interactions and conducting user surveys such as the government-authorized American Customer Satisfaction Index survey. This paper discusses the metrics methodology, processes and applications that are growing in EOSDIS, the driving requirements and compelling events, and the future envisioned for metrics as an integral part of earth science data systems.

  6. Novel Representation of Clinical Information in the ICU

    PubMed Central

    Pickering, B.W.; Herasevich, V.; Ahmed, A.; Gajic, O.

    2010-01-01

The introduction of electronic medical records (EMRs) and computerized physician order entry (CPOE) into the intensive care unit (ICU) is transforming the way health care providers currently work. The challenge facing developers of EMRs is to create products which add value to systems of health care delivery. As EMRs become more prevalent, the potential impact they have on quality and safety, both negative and positive, will be amplified. In this paper we outline the key barriers to effective use of EMRs and describe the methodology, using a worked example of the output: AWARE (Ambient Warning and Response Evaluation), a physician-led, electronic-environment enhancement program in an academic, tertiary care institution’s ICU. The development process is focused on reducing information overload, improving efficiency and eliminating medical error in the ICU. PMID:23616831

  7. How can hospitals better protect the privacy of electronic medical records? Perspectives from staff members of health information management departments.

    PubMed

    Sher, Ming-Ling; Talley, Paul C; Cheng, Tain-Junn; Kuo, Kuang-Ming

    2017-05-01

The adoption of electronic medical records (EMR) is expected to improve overall healthcare quality and to offset the financial pressure of excessive administrative burden. However, safeguarding EMR against potentially hostile security breaches from both inside and outside healthcare facilities has heightened patients' privacy concerns. The aim of our study was to examine the factors influencing privacy protection for EMR by healthcare professionals. We used survey methodology to collect questionnaire responses from staff members in health information management departments of nine Taiwanese hospitals active in EMR utilisation. A total of 209 valid responses were collected in 2014. We used partial least squares for analysing the collected data. Perceived benefits, perceived barriers, self-efficacy and cues to action were found to have a significant association with the intention to protect EMR privacy, while perceived susceptibility and perceived severity were not. Based on these findings, we suggest that hospitals should provide continuous ethics awareness training to relevant staff and design more effective strategies for improving the protection of EMR privacy in their charge. Further practical and research implications are also discussed.

  8. Systematic review of the methodological and reporting quality of case series in surgery.

    PubMed

    Agha, R A; Fowler, A J; Lee, S-Y; Gundogan, B; Whitehurst, K; Sagoo, H K; Jeong, K J L; Altman, D G; Orgill, D P

    2016-09-01

    Case series are an important and common study type. No guideline exists for reporting case series and there is evidence of key data being missed from such reports. The first step in the process of developing a methodologically sound reporting guideline is a systematic review of literature relevant to the reporting deficiencies of case series. A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, Embase, Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation index, from the start of indexing to 5 November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were then analysed for five areas of deficiency: failure to use standardized definitions, missing or selective data (including the omission of whole cases or important variables), transparency or incomplete reporting, whether alternative study designs were considered, and other issues. Database searching identified 2205 records. Through the process of screening and eligibility assessments, 92 articles met inclusion criteria. Frequencies of methodological and reporting issues identified were: failure to use standardized definitions (57 per cent), missing or selective data (66 per cent), transparency or incomplete reporting (70 per cent), whether alternative study designs were considered (11 per cent) and other issues (52 per cent). The methodological and reporting quality of surgical case series needs improvement. The data indicate that evidence-based guidelines for the conduct and reporting of case series may be useful. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.

  9. Integrating Laser Scanner and Bim for Conservation and Reuse: "the Lyric Theatre of Milan"

    NASA Astrophysics Data System (ADS)

    Utica, G.; Pinti, L.; Guzzoni, L.; Bonelli, S.; Brizzolari, A.

    2017-12-01

The paper underlines the importance of applying a methodology that integrates Building Information Modeling (BIM), the Work Breakdown Structure (WBS) and the laser scanner tool in conservation and reuse projects. As is known, laser scanner technology provides a survey of the building which is more accurate than one carried out using traditional methodologies. Today most existing buildings present their attributes in a dispersed way, stored and collected in paper documents, in sheets of equipment information, and in file folders of maintenance records. In some cases, it is difficult to find updated technical documentation, and the search for reliable data can be a costly and time-consuming process. Therefore, this new survey technology, embedded with BIM systems, represents a valid tool to obtain a coherent picture of the building's state. The following case consists of the conservation and reuse project of the Milan Lyric Theatre, started in 2013 from the collaboration between the Milan Polytechnic and the Municipality. This project is a first attempt to integrate these new techniques, which are already professional standards in many other countries such as the US, Norway, Finland, England and so on. Concerning the methodology, the choice has been to use BIM software for the structured analysis of the project, with the aim of defining a single code of communication to develop coherent documentation according to rules, in a consistent manner and on tight schedules. This process provides the definition of an effective and efficient operating method that can be applied to other projects.

  10. A method for the determination of potentially profitable service patterns for commuter air carriers

    NASA Technical Reports Server (NTRS)

    Ransone, R. K.; Kuhlthau, A. R.; Deptula, D. A.

    1975-01-01

A methodology for estimating market conception was developed as a part of the short-haul air transportation program. It is based upon an analysis of actual documents which provide a record of known travel history. Applying this methodology, a forecast was made of the demand for an air feeder service between Charlottesville, Virginia and Dulles International Airport. Local business travel vouchers and local travel agent records were selected to provide the documentation. The market was determined to be profitable for an 8-passenger Cessna 402B aircraft flying a 2-hour daily service pattern designed to mesh as well as possible with the connecting schedules at Dulles. The Charlottesville - Dulles air feeder service market conception forecast and its methodology are documented.

  11. Depressed Mothers as Informants on Child Behavior: Methodological Issues

    PubMed Central

    Ordway, Monica Roosa

    2011-01-01

Mothers with depressive symptoms report behavioral problems among their children more frequently than non-depressed mothers, leading to a debate regarding the accuracy of depressed mothers as informants of children’s behavior. The purpose of this integrative review was to identify methodological challenges in research related to the debate. Data were extracted from 43 papers (6 theoretical, 36 research reports, and 1 instrument scoring manual). The analysis focused on the methodologies considered when using depressed mothers as informants. Nine key themes were identified, and I concluded that researchers should incorporate multiple informants, identify the characteristics of maternal depression, and incorporate advanced statistical methodology. The use of a conceptual framework to understand informant discrepancies within child behavior evaluations is suggested for future research. PMID:21964958

  12. 40 CFR 98.247 - Records that must be retained.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....246(b), and records of any annual average HHV calculations. (b) If you comply with the mass balance... produced, if you comply with the alternative methodology in § 98.243(c)(4) for determining carbon content...

  13. Building the Qualification File of EGNOS with DOORS

    NASA Astrophysics Data System (ADS)

    Fabre, J.

    2008-08-01

EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is approaching final deployment and is being initially operated towards qualification and certification to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS system design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at the QR is a consolidated, consistent and complete Qualification file. The information included shall give confidence to the QR reviewers that the qualification activities performed are complete. An important concern for the project team is therefore to focus on concise and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, references to details shall be available, and the reviewer shall have the possibility to link automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong methodological and tool support, providing the System Engineering and Verification teams with a single reference technical database in which all team members consult the applicable requirements, compliance, justification and design data, and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.

  14. [Demonstrating patient safety requires acceptance of a broader scientific palette].

    PubMed

    Leistikow, I

    2017-01-01

It is high time the medical community recognised that patient-safety research can be assessed using scientific methods other than the traditional medical ones. There is often a fundamental mismatch between the methodology of patient-safety research and the methodology used to assess the quality of this research. One example is research into the reliability and validity of record review as a method for detecting adverse events. This type of research is based on logical positivism, while record review itself is based on social constructivism. Record review does not lead to "one truth": adverse events are not measured on the basis of the records themselves, but by weighing the probability of certain situations being classifiable as adverse events. Healthcare should welcome the behavioural and social sciences to its scientific palette. Restricting ourselves to the randomised controlled trial paradigm is short-sighted and dangerous; it deprives patients of much-needed improvements in safety.

  15. MUSIC-Expected maximization gaussian mixture methodology for clustering and detection of task-related neuronal firing rates.

    PubMed

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A

    2017-01-15

Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate them into groups based on the expertise of the investigator. In cases where neuron populations are small, it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic, objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on the integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates. Copyright © 2016 Elsevier B.V. All rights reserved.
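The MUSIC-then-EM-GMM pipeline named in this record can be sketched in simplified, one-dimensional form (this is not the authors' implementation; the simulated firing rates, window length m, model order p, and frequency grid are all assumed): each neuron's dominant firing-rate modulation frequency is read off a MUSIC pseudospectrum, and those features are clustered with a hand-rolled EM Gaussian mixture.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, dur = 100.0, 2.0
t = np.arange(0, dur, 1.0 / fs)

# Two hypothetical functional classes: rates modulated at ~3 Hz vs ~8 Hz
rates = [10 + 5 * np.sin(2 * np.pi * f * t) + rng.normal(0, 0.5, t.size)
         for f in [3] * 5 + [8] * 5]

def music_peak_freq(x, fs, m=20, p=2):
    """Dominant modulation frequency from the MUSIC pseudospectrum of x."""
    x = x - x.mean()
    X = np.array([x[i:i + m] for i in range(len(x) - m + 1)])  # snapshots
    R = X.T @ X / len(X)                       # sample covariance (m x m)
    w, V = np.linalg.eigh(R)
    En = V[:, :m - p]                          # noise subspace (smallest eigenvalues)
    freqs = np.linspace(0.5, 20.0, 400)
    n = np.arange(m)
    pseudo = [1.0 / np.linalg.norm(En.T @ np.exp(2j * np.pi * f / fs * n)) ** 2
              for f in freqs]
    return freqs[int(np.argmax(pseudo))]       # pseudospectrum peak

feats = np.array([music_peak_freq(r, fs) for r in rates])

def em_gmm_1d(x, k=2, iters=200):
    """Plain EM for a 1-D Gaussian mixture; returns hard labels and means."""
    mu = np.percentile(x, np.linspace(10, 90, k))
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities
        r = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
        pi = nk / len(x)
    return r.argmax(axis=1), mu

labels, means = em_gmm_1d(feats)
```

The hard labels recover the two simulated classes; in the paper's setting the features are richer and the clusters are validated against behavior and an empirical wavelet transform.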

  16. 78 FR 48720 - Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...

  17. 78 FR 66954 - Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...

  18. Information security system quality assessment through the intelligent tools

    NASA Astrophysics Data System (ADS)

    Trapeznikov, E. V.

    2018-04-01

The development of technology has demonstrated the need for comprehensive analysis of information security in automated systems, and analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of information security systems based on intelligent tools. The basis of the methodology is a model that assesses the information security of an information system by means of a neural network. The paper presents the security assessment model and its algorithm, and the results of the practical implementation of the methodology are represented in the form of a software flow diagram. The conclusions note the practical significance of the model being developed.

  19. Standards for the Analysis and Processing of Surface-Water Data and Information Using Electronic Methods

    USGS Publications Warehouse

    Sauer, Vernon B.

    2002-01-01

Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U. S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs.
The electronic processing system also provides for documentation, review, finalization, and publication of records, as well as archiving, quality assurance, and quality control.
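One computational step these standards cover, converting unit-value stage records to discharge through a rating curve with time-interpolated shifts, can be sketched as follows. The power-law rating form Q = C(h - e)^b is standard practice, but every coefficient and the synthetic 15-minute record below are assumed for illustration, not USGS-issued values.

```python
import numpy as np

# Assumed rating coefficients: Q = C * (h + shift - e)^b, h in ft, Q in ft3/s
C, e, b = 35.0, 0.5, 2.0

def discharge(stage, shift):
    """Apply the shifted stage-discharge rating to a stage value."""
    return C * np.maximum(stage + shift - e, 0.0) ** b

# Unit values: a synthetic 15-minute stage record over one day
t = np.arange(0, 24, 0.25)                        # hours
stage = 2.0 + 0.5 * np.sin(2 * np.pi * t / 24)    # stage (ft)

# Shifts diagnosed from two discharge measurements, linearly
# interpolated in time between them (a common shift-application scheme)
shift_times, shift_vals = np.array([0.0, 24.0]), np.array([0.0, -0.10])
shifts = np.interp(t, shift_times, shift_vals)

q = discharge(stage, shifts)                      # unit-value discharge record
```

In the proposed system this computation would run automatically, with the interactive graphics used to review the rating, the applied shifts, and the resulting hydrograph.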

  20. Fast maximum likelihood estimation using continuous-time neural point process models.

    PubMed

    Lepage, Kyle Q; MacDonald, Christopher J

    2015-06-01

A recent report estimates that the number of simultaneously recorded neurons is growing exponentially. A commonly employed statistical paradigm using discrete-time point process models of neural activity involves the computation of a maximum-likelihood estimate. The time to compute this estimate, per neuron, is proportional to the number of bins in a finely spaced discretization of time. By using continuous-time models of neural activity and optimally efficient Gaussian quadrature, memory requirements and computation times are dramatically decreased in the commonly encountered situation where the number of parameters p is much less than the number of time bins n. In this regime, with q equal to the quadrature order, memory requirements are decreased from O(np) to O(qp), and the number of floating-point operations is decreased from O(np²) to O(qp²). Accuracy of the proposed estimates is assessed based upon physiological considerations, error bounds, and mathematical results describing the relation between numerical integration error and the numerical error affecting both parameter estimates and the observed Fisher information. A check is provided which is used to adapt the order of numerical integration. The procedure is verified in simulation and for hippocampal recordings. It is found that in 95% of hippocampal recordings a q of 60 yields numerical error negligible with respect to parameter estimate standard error. Statistical inference using the proposed methodology is a fast and convenient alternative to statistical inference performed using a discrete-time point process model of neural activity. It enables the employment of the statistical methodology available with discrete-time inference, but is faster, uses less memory, and avoids any error due to discretization.
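The computational gain described here can be sketched for an assumed log-linear intensity λ(t) = exp(θ0 + θ1 t): the integral term of the Poisson point-process log-likelihood, log L = Σ_i log λ(t_i) - ∫ λ(t) dt, is evaluated at q Gauss-Legendre nodes instead of n fine time bins. This is an illustrative sketch with made-up spike times, not the authors' code.

```python
import numpy as np

def loglik(theta, spikes, T, q=60):
    """Continuous-time Poisson log-likelihood; integral via Gauss-Legendre quadrature."""
    lam = lambda tt: np.exp(theta[0] + theta[1] * tt)   # assumed intensity model
    nodes, weights = np.polynomial.legendre.leggauss(q)
    tt = 0.5 * T * (nodes + 1.0)                        # map [-1, 1] onto [0, T]
    integral = 0.5 * T * np.sum(weights * lam(tt))      # O(q) work per evaluation
    return np.sum(np.log(lam(spikes))) - integral

theta = np.array([1.0, 0.3])
T = 10.0
spikes = np.array([0.3, 1.2, 4.5, 7.8, 9.9])            # hypothetical spike times (s)
ll_quad = loglik(theta, spikes, T)

# For this log-linear intensity the integral has a closed form,
# so the quadrature result can be checked exactly
ll_exact = (np.sum(theta[0] + theta[1] * spikes)
            - np.exp(theta[0]) * (np.exp(theta[1] * T) - 1.0) / theta[1])
```

Because each likelihood (and gradient) evaluation touches q nodes instead of n bins, maximization cost scales with q rather than n, which is the O(qp²) versus O(np²) saving the abstract reports.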

  1. 78 FR 50111 - Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National...: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...

  2. Effective Information Systems: What's the Secret?

    ERIC Educational Resources Information Center

    Kirkham, Sandi

    1994-01-01

    Argues that false assumptions about user needs implicit in methodologies for building information systems have resulted in inadequate and inflexible systems. Checkland's Soft Systems Methodology is examined as a useful alternative. Its fundamental features are described, and examples of models demonstrate how the methodology can facilitate…

  3. The EHR-ARCHE project: satisfying clinical information needs in a Shared Electronic Health Record system based on IHE XDS and Archetypes.

    PubMed

    Duftschmid, Georg; Rinner, Christoph; Kohler, Michael; Huebner-Bloder, Gudrun; Saboor, Samrend; Ammenwerth, Elske

    2013-12-01

While contributing to an improved continuity of care, Shared Electronic Health Record (EHR) systems may also lead to information overload of healthcare providers. Document-oriented architectures, such as the commonly employed IHE XDS profile, which only support information retrieval at the level of documents, are particularly susceptible to this problem. The objective of the EHR-ARCHE project was to develop a methodology and a prototype to efficiently satisfy healthcare providers' information needs when accessing a patient's Shared EHR during a treatment situation. We especially aimed to investigate whether this objective can be reached by integrating EHR Archetypes into an IHE XDS environment. Using methodical triangulation, we first analysed the information needs of healthcare providers, focusing on the treatment of diabetes patients as an exemplary application domain. We then designed ISO/EN 13606 Archetypes covering the identified information needs. To support a content-based search for fine-grained information items within EHR documents, we extended the IHE XDS environment with two additional actors. Finally, we conducted a formative and summative evaluation of our approach within a controlled study. We identified 446 frequently needed diabetes-specific information items, representing typical information needs of healthcare providers. We then created 128 Archetypes and 120 EHR documents for two fictive patients. All seven diabetes experts who evaluated our approach preferred the content-based search to a conventional XDS search. The success rate of finding relevant information was higher for the content-based search (100% versus 80%), which was also more time-efficient (8-14 min versus 20 min or more).
Our results show that for an efficient satisfaction of health care providers' information needs, a content-based search that rests upon the integration of Archetypes into an IHE XDS-based Shared EHR system is superior to a conventional metadata-based XDS search. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  4. The EHR-ARCHE project: Satisfying clinical information needs in a Shared Electronic Health Record System based on IHE XDS and Archetypes☆

    PubMed Central

    Duftschmid, Georg; Rinner, Christoph; Kohler, Michael; Huebner-Bloder, Gudrun; Saboor, Samrend; Ammenwerth, Elske

    2013-01-01

    Purpose: While contributing to an improved continuity of care, Shared Electronic Health Record (EHR) systems may also lead to information overload of healthcare providers. Document-oriented architectures, such as the commonly employed IHE XDS profile, which only support information retrieval at the level of documents, are particularly susceptible to this problem. The objective of the EHR-ARCHE project was to develop a methodology and a prototype to efficiently satisfy healthcare providers’ information needs when accessing a patient's Shared EHR during a treatment situation. We especially aimed to investigate whether this objective can be reached by integrating EHR Archetypes into an IHE XDS environment. Methods: Using methodical triangulation, we first analysed the information needs of healthcare providers, focusing on the treatment of diabetes patients as an exemplary application domain. We then designed ISO/EN 13606 Archetypes covering the identified information needs. To support a content-based search for fine-grained information items within EHR documents, we extended the IHE XDS environment with two additional actors. Finally, we conducted a formative and summative evaluation of our approach within a controlled study. Results: We identified 446 frequently needed diabetes-specific information items, representing typical information needs of healthcare providers. We then created 128 Archetypes and 120 EHR documents for two fictitious patients. All seven diabetes experts who evaluated our approach preferred the content-based search to a conventional XDS search. The success rate of finding relevant information was higher for the content-based search (100% versus 80%), which was also more time-efficient (8–14 min versus 20 min or more). 
Conclusions: Our results show that for an efficient satisfaction of health care providers’ information needs, a content-based search that rests upon the integration of Archetypes into an IHE XDS-based Shared EHR system is superior to a conventional metadata-based XDS search. PMID:23999002

  5. Development of Product Availability Monitoring System In Production Unit In Automotive Component Industry

    NASA Astrophysics Data System (ADS)

    Hartono, Rachmad; Raharno, Sri; Yuwana Martawirya, Yatna; Arthaya, Bagus

    2018-03-01

    This paper describes a methodology for monitoring the availability of products in a production unit in the automotive component industry. The components considered are made by sheet metal working. Raw material arrives at the production unit as pieces of plate of a certain size and is stored in the warehouse. Data on each raw material in the warehouse are recorded and stored in a database system. The material then undergoes several production processes in the production unit. When material is taken from the warehouse, its data are also recorded in the database: the amount of material, the material type, and the date the material left the warehouse. Material leaving the warehouse is labeled with information on the production processes it must pass through; this material represents a product to be made. Completed products are stored in the product warehouse, and when a product enters it, the product data are recorded by scanning the barcode on its label. By recording the condition of the product at each stage of production, the availability of products in a production unit can be known in the form of raw material, product being processed, and finished product.
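
The recording scheme described above can be sketched as a small relational log of warehouse movements. Table, column and item names below are invented for illustration and are not taken from the paper.

```python
import sqlite3

# Every movement of material into or out of a stage is logged, so current
# availability at each production stage (raw, in-process, finished) can be
# derived by summing the recorded movements.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE movement (
    material_type TEXT,
    stage         TEXT,     -- 'raw', 'in_process', or 'finished'
    quantity      INTEGER,  -- positive = into stage, negative = out of stage
    moved_on      TEXT      -- date scanned from the barcode label
)""")

events = [
    ("plate-A", "raw", 100, "2018-01-05"),         # raw plates received
    ("plate-A", "raw", -40, "2018-01-10"),         # issued to production
    ("plate-A", "in_process", 40, "2018-01-10"),
    ("plate-A", "in_process", -25, "2018-01-20"),  # finished and stored
    ("plate-A", "finished", 25, "2018-01-20"),
]
conn.executemany("INSERT INTO movement VALUES (?, ?, ?, ?)", events)

# Availability per stage = net sum of recorded movements.
availability = dict(conn.execute(
    "SELECT stage, SUM(quantity) FROM movement "
    "WHERE material_type = 'plate-A' GROUP BY stage"))
print(availability)
```

Running this yields the net stock per stage (60 raw plates, 15 in process, 25 finished), mirroring how barcode-logged movements make availability visible at every production stage.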

  6. On Estimation of the Survivor Average Causal Effect in Observational Studies when Important Confounders are Missing Due to Death

    PubMed Central

    Egleston, Brian L.; Scharfstein, Daniel O.; MacKenzie, Ellen

    2008-01-01

    We focus on estimation of the causal effect of treatment on the functional status of individuals at a fixed point in time t* after they have experienced a catastrophic event, from observational data with the following features: (1) treatment is imposed shortly after the event and is non-randomized, (2) individuals who survive to t* are scheduled to be interviewed, (3) there is interview non-response, (4) individuals who die prior to t* are missing information on pre-event confounders, (5) medical records are abstracted on all individuals to obtain information on post-event, pre-treatment confounding factors. To address the issue of survivor bias, we seek to estimate the survivor average causal effect (SACE), the effect of treatment on functional status among the cohort of individuals who would survive to t* regardless of whether or not assigned to treatment. To estimate this effect from observational data, we need to impose untestable assumptions, which depend on the collection of all confounding factors. Since pre-event information is missing on those who die prior to t*, it is unlikely that these data are missing at random (MAR). We introduce a sensitivity analysis methodology to evaluate the robustness of SACE inferences to deviations from the MAR assumption. We apply our methodology to the evaluation of the effect of trauma center care on vitality outcomes using data from the National Study on Costs and Outcomes of Trauma Care. PMID:18759833

  7. Multilingual Information Retrieval in Thoracic Radiology: Feasibility Study

    PubMed Central

    Castilla, André Coutinho; Furuie, Sérgio Shiguemi; Mendonça, Eneida A.

    2014-01-01

    Most of the essential information contained in an Electronic Medical Record is stored as text, which complicates automated data extraction and retrieval. Natural language processing (NLP) is an approach that can unlock clinical information from free text. The proposed methodology uses the specialized natural language processor MEDLEE, developed for the English language. To apply this processor to Portuguese medical texts, chest x-ray reports were machine translated (MT) into English. The result of serially coupling MT and NLP is tagged text, which requires further processing to extract clinical findings. The objective of this experiment was to investigate normal reports and reports with device descriptions in a set of 165 chest x-ray reports. We obtained sensitivity and specificity of 1 and 0.71 for the first condition, and 0.97 and 0.97 for the second, respectively. The reference standard was formed by the opinion of two radiologists. The results of this experiment indicate the viability of extracting clinical findings from chest x-ray reports by coupling MT and NLP. PMID:17911745
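
The sensitivity and specificity figures above come from comparing extracted findings against the radiologists' reference opinion. A minimal sketch of that computation, using a hypothetical confusion matrix (the abstract reports only the resulting rates), could look like:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for the device-description condition; only the
# resulting rates (0.97 and 0.97) are given in the abstract.
sens, spec = sensitivity_specificity(tp=33, fn=1, tn=127, fp=4)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

With these made-up counts (which happen to total 165 reports), both rates round to 0.97, matching the figures quoted for the second condition.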

  8. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.
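
As a hedged illustration of the bias point above (not the authors' MSEM model), the sketch below simulates data that are missing at random: complete-case analysis biases the mean of the partially observed variable, while a casewise full information maximum likelihood fit, which maximizes each row's likelihood over its observed entries, recovers it.

```python
import numpy as np
from scipy.optimize import minimize

# MAR missingness: y is missing more often when x is large.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(scale=0.6, size=n)        # true E[y] = 0
missing = rng.random(n) < 1.0 / (1.0 + np.exp(-2.0 * x))

mean_listwise = y[~missing].mean()                 # biased downward here

def negloglik(theta):
    mx, my, log_sx, log_sy, z = theta
    sx, sy, r = np.exp(log_sx), np.exp(log_sy), np.tanh(z)
    dx, dy = x - mx, y - my
    # complete rows contribute the bivariate normal log-density ...
    q = (dx / sx) ** 2 - 2 * r * dx * dy / (sx * sy) + (dy / sy) ** 2
    ll_full = (-np.log(2 * np.pi * sx * sy * np.sqrt(1 - r ** 2))
               - q / (2 * (1 - r ** 2)))
    # ... rows with y missing contribute only the marginal density of x
    ll_marg = -0.5 * np.log(2 * np.pi * sx ** 2) - dx ** 2 / (2 * sx ** 2)
    return -np.sum(np.where(missing, ll_marg, ll_full))

x0 = np.array([x.mean(), mean_listwise, 0.0, 0.0, 0.5])
fit = minimize(negloglik, x0, method="Nelder-Mead",
               options={"maxiter": 5000})
mean_fiml = fit.x[1]
print(f"listwise mean: {mean_listwise:.2f}, FIML mean: {mean_fiml:.2f}")
```

Listwise deletion discards roughly half the rows and produces a clearly negative mean estimate, whereas the FIML estimate stays near the true value of zero without imputing anything.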

  9. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    PubMed

    Owens, John

    2009-01-01

    Technological advances in the acquisition of DNA and protein sequence information, and the resulting onrush of data, can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is comparable to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, and interactive HTML pages, or are exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are portable among current versions of Microsoft Windows, Macintosh, and UNIX operating systems.
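
A minimal sketch of the relational storage and unification described above, using SQLite; the schema, clone names and annotation text are invented for illustration.

```python
import sqlite3

# One table holds sequence records, another holds free-text annotation;
# a relational join unifies them for ad hoc queries and reports.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sequence (
    seq_id   INTEGER PRIMARY KEY,
    clone    TEXT,
    chain    TEXT,            -- 'heavy' or 'light'
    residues TEXT
);
CREATE TABLE annotation (
    seq_id  INTEGER REFERENCES sequence(seq_id),
    note    TEXT
);
""")
db.executemany("INSERT INTO sequence VALUES (?, ?, ?, ?)", [
    (1, "mAb-17", "heavy", "EVQLVESGGGLVQ"),
    (2, "mAb-17", "light", "DIQMTQSPSSLSA"),
    (3, "mAb-42", "heavy", "QVQLQQSGAELAR"),
])
db.executemany("INSERT INTO annotation VALUES (?, ?)", [
    (1, "somatic mutation in CDR2"),
    (3, "germline match IGHV1-69"),
])

# Join sequence data with its annotation text, as a report query would.
rows = db.execute("""
    SELECT s.clone, s.chain, a.note
    FROM sequence s JOIN annotation a ON a.seq_id = s.seq_id
    ORDER BY s.clone
""").fetchall()
for clone, chain, note in rows:
    print(clone, chain, note)
```

The join selectively unifies independent sequence and annotation tables into a single view, the same pattern that scales up to families of per-project databases.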

  10. Retrospective hospital based surveillance of intussusception in children in a sentinel paediatric hospital: benefits and pitfalls for use in post-marketing surveillance of rotavirus vaccines.

    PubMed

    Lloyd-Johnsen, C; Justice, F; Donath, S; Bines, J E

    2012-04-27

    Evaluation of the safety of rotavirus vaccines, particularly with respect to the risk of intussusception, is recommended for countries planning to introduce rotavirus vaccines into the National Immunisation Program. However, as prospective studies are costly, require time to conduct and may be difficult to perform in some settings, retrospective hospital based surveillance at sentinel sites has been suggested as an option for surveillance for intussusception following introduction of rotavirus vaccines. To assess the value of retrospective hospital based surveillance to describe clinical and epidemiological features of intussusception in children aged <24 months and to investigate any temporal association between receipt of a rotavirus vaccine and intussusception. A retrospective chart review of all patients diagnosed with intussusception at Royal Children's Hospital, Melbourne, Australia over an 8-year period including before and after rotavirus vaccine introduction into the National Immunisation Program, was conducted using patients identified by a medical record database (ICD-10-CM 56.1). Patient profile, clinical presentation, treatment and outcome were analysed along with records of immunisation status obtained using the Australian Childhood Immunisation Register. A 9% misclassification rate of discharge diagnosis of intussusception was identified on critical chart review. The incidence rate of intussusception at the Royal Children's Hospital over the study period was 1.91 per 10,000 infants <24 months (95% CI 1.65-2.20). Intestinal resection was required in 6.5% of infants (95% CI 3.6%, 11.0%). Intussusception occurred within 30 days after vaccination in 2 of 27 patients who had received at least 1 dose of a rotavirus vaccine. Valuable data on the incidence, clinical presentation and treatment outcomes of intussusception can be obtained from data retrieved from hospital medical records in a sentinel paediatric hospital using standardised methodology. 
However, there are methodological limitations, and the quality of the data is highly dependent on the accuracy and completeness of the patient information recorded, and on the system of coding and record retrieval. Copyright © 2011 Elsevier Ltd. All rights reserved.
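
The incidence rate and confidence interval quoted above are of the kind produced by an exact Poisson interval on a count of cases over person-time. The sketch below uses hypothetical counts chosen to reproduce the reported rate, since the abstract gives only the rate itself.

```python
from scipy.stats import chi2

def poisson_rate_ci(events, person_time, level=0.95):
    """Exact (Garwood) confidence interval for a Poisson rate."""
    a = 1.0 - level
    lo = chi2.ppf(a / 2, 2 * events) / (2 * person_time) if events else 0.0
    hi = chi2.ppf(1 - a / 2, 2 * (events + 1)) / (2 * person_time)
    return lo, hi

# Hypothetical: 191 cases over 1,000,000 units of observation -> 1.91/10,000
lo, hi = poisson_rate_ci(191, 1_000_000)
print(f"CI = ({lo * 1e4:.2f}, {hi * 1e4:.2f}) per 10,000")
```

With these made-up counts the interval comes out close to the 1.65-2.20 per 10,000 reported above.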

  11. The patient perspective on the effects of medical record accessibility: a systematic review.

    PubMed

    Vermeir, Peter; Degroote, Sophie; Vandijck, Dominique; Van Tiggelen, Hanne; Peleman, Renaat; Verhaeghe, Rik; Mariman, An; Vogelaers, Dirk

    2017-06-01

    Health care is shifting from a paternalistic to a participatory model, with increasing patient involvement. Medical record accessibility to patients may contribute significantly to patient co-management. To systematically review the literature on the patient perspective of the effects of personal medical record accessibility on the individual patient, the patient-physician relationship, and the quality of medical care. Screening of PubMed, Web of Science, Cinahl, and Cochrane Library on the keywords 'medical record', 'patient record', 'communication', 'patient participation', 'doctor-patient relationship', and 'physician-patient relationship' between 1 January 2002 and 31 January 2016; systematic review after assessment for methodological quality. Out of 557 papers screened, only 12 studies qualified for the systematic review. Only a minority of patients spontaneously request access to their medical file, in contrast to frequent awareness of this patient right and the fact that patients in general have a positive view of open visit notes. The majority of those who have actually consulted their file are positive about this experience. Access to personal files improves the adequacy and efficiency of communication between physician and patient, in turn facilitating decision-making and self-management. Increased documentation through patient involvement and feedback on the medical file reduces medical errors, in turn increasing satisfaction and quality of care. Improved information through personal medical file accessibility increased reassurance and a sense of involvement and responsibility. From the patient perspective, medical record accessibility contributes to co-management of personal health care.

  12. Step-By-Step Instructions for Retina Recordings with Perforated Multi Electrode Arrays

    PubMed Central

    Idrees, Saad; Mutter, Marion; Benkner, Boris; Münch, Thomas A.

    2014-01-01

    Multi-electrode arrays are a state-of-the-art tool in electrophysiology, also in retina research. The output cells of the retina, the retinal ganglion cells, form a monolayer in many species and are well accessible due to their proximity to the inner retinal surface. This structure has allowed the use of multi-electrode arrays for high-throughput, parallel recordings of retinal responses to presented visual stimuli, and has led to significant new insights into retinal organization and function. However, using conventional arrays where electrodes are embedded into a glass or ceramic plate can be associated with three main problems: (1) low signal-to-noise ratio due to poor contact between electrodes and tissue, especially in the case of strongly curved retinas from small animals, e.g. rodents; (2) insufficient oxygen and nutrient supply to cells located on the bottom of the recording chamber; and (3) displacement of the tissue during recordings. Perforated multi-electrode arrays (pMEAs) have been found to alleviate all three issues in brain slice recordings. Over the last years, we have been using such perforated arrays to study light evoked activity in the retinas of various species including mouse, pig, and human. In this article, we provide detailed step-by-step instructions for the use of perforated MEAs to record visual responses from the retina, including spike recordings from retinal ganglion cells and in vitro electroretinograms (ERG). In addition, we provide in-depth technical and methodological troubleshooting information, and show example recordings of good quality as well as examples for the various problems which might be encountered. While our description is based on the specific equipment we use in our own lab, it may also prove useful when establishing retinal MEA recordings with other equipment. PMID:25165854

  13. Harnessing the Power of Education Research Databases with the Pearl-Harvesting Methodological Framework for Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter

    2010-01-01

    Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…

  14. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology: group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  15. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are described in detail. Finally, the article describes the presentation of the results of analysing users' information needs and the rationale for the use of classifiers.

  16. 75 FR 14165 - National Institute of Child Health and Human Development; Revision to Proposed Collection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... Information Collection: The purpose of the proposed methodological study is to evaluate the feasibility... the NCS, the multiple methodological studies conducted during the Vanguard phase will inform the... methodological study is identification of recruitment strategies and components of recruitment strategies that...

  17. 77 FR 15092 - U.S. Energy Information Administration; Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-14

    ... conducted under this clearance will generally be methodological studies of 500 cases or less. The samples... conducted under this clearance will generally be methodological studies of 500 cases or less, but will... the methodological design, sampling procedures (where possible) and questionnaires of the full scale...

  18. Comparing Characteristics of Highly Circulated Titles for Demand-Driven Collection Development.

    ERIC Educational Resources Information Center

    Britten, William A; Webster, Judith D.

    1992-01-01

    Describes methodology for analyzing MARC (machine-readable cataloging) records of highly circulating titles to document common characteristics for collection development purposes. Application of the methodology in a university library is discussed, and data are presented on commonality of subject heading, author, language, and imprint date for…

  19. Qualitative Shadowing as a Research Methodology for Exploring Early Childhood Leadership in Practice

    ERIC Educational Resources Information Center

    Bøe, Marit; Hognestad, Karin; Waniganayake, Manjula

    2017-01-01

    This article explores qualitative shadowing as an interpretivist methodology, and explains how two researchers participating simultaneously in data collection using a video recorder, contextual interviews and video-stimulated recall interviews, conducted a qualitative shadowing study at six early childhood centres in Norway. This paper emerged…

  20. Evaluation of methodologies for interpolation of data for hydrological modeling in glacierized basins with limited information

    NASA Astrophysics Data System (ADS)

    Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier

    2017-04-01

    The availability and consistency of data is a determining factor for the reliability of any hydrological model and its simulated results. Unfortunately, there are many regions worldwide where data are not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables but ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series. This database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when dealing with the assessment of current and future water resources. In this context, we present the evaluation of different methodologies to interpolate temperature and precipitation data at a monthly time step, as well as ice volume data, in glacierized basins with limited data. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data have been available from nearby monitoring stations since 1983. The study period was 1983-1999, with a validation period of 1993-1999. For the temperature series, the aim was to extend the observed data and interpolate them. NCEP reanalysis data were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very similar results; for parsimony, the simple correlation is a viable choice. For the precipitation series, the aim was to interpolate observed data. 
Two methodologies were evaluated: 1) Inverse Distance Weighting, whose results underestimate the amount of precipitation in high-altitude zones, and 2) ordinary Kriging (OK), whose variograms were calculated from the multi-annual monthly mean precipitation and applied to the whole study period. OK leads to better results in both low- and high-altitude zones. For ice volume, the aim was to estimate values from historical data: 1) with the GlabTop algorithm, which requires digital elevation models that have only been available at an appropriate scale since 2009, and 2) with a widely applied but controversially discussed glacier area-volume relation, whose parameters were calibrated with results from the GlabTop model. Both methodologies provide reasonable results, but for historical data the area-volume scaling only requires the glacier area, which is easy to calculate from satellite images available since 1986. In conclusion, the simple correlation, OK, and the calibrated area-volume relation proved the best ways to interpolate glacio-climatic information. However, these methods must be carefully applied and revisited for each specific situation of high complexity. This is a first step towards identifying the most appropriate methods to interpolate and extend observed data in glacierized basins with limited information. Future research should evaluate other methodologies and meteorological data in order to improve hydrological models and water management policies.
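
For concreteness, the Inverse Distance Weighting step evaluated above can be sketched as follows; station coordinates and precipitation values are invented. Plain IDW weights by horizontal distance only, which is consistent with the underestimation in high-altitude zones noted above.

```python
import numpy as np

def idw(stations_xy, values, target_xy, power=2.0):
    """Interpolate at target_xy as a distance-weighted mean of station values."""
    d = np.linalg.norm(stations_xy - target_xy, axis=1)
    if np.any(d == 0):                      # target coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # km, illustrative
monthly_precip = np.array([120.0, 80.0, 60.0])               # mm, illustrative
result = idw(stations, monthly_precip, np.array([2.0, 2.0]))
print(result)
```

The estimate is pulled toward the nearest station's value, regardless of any altitude difference between target and stations.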

  1. Research ethics committee decision-making in relation to an efficient neonatal trial.

    PubMed

    Gale, C; Hyde, M J; Modi, N

    2017-07-01

    Randomised controlled trials, a gold-standard approach to reducing uncertainties in clinical practice, are growing in cost and are often slow to recruit. We determined whether methodological approaches to facilitate large, efficient clinical trials were acceptable to UK research ethics committees (RECs). We developed a protocol, in collaboration with parents, for a comparative-effectiveness, randomised controlled trial comparing two widely used blood transfusion practices in preterm infants. We incorporated four approaches to improve recruitment and efficiency: (i) a point-of-care design using electronic patient records for patient identification, randomisation and data acquisition; (ii) a short two-page information sheet; (iii) explicit mention of possible inclusion benefit; and (iv) opt-out consent with enrolment as the default. With the support of the UK Health Research Authority, we submitted an identical protocol to 12 UK RECs. The setting was RECs in the UK; the main outcome was the number of RECs granting a favourable opinion. The use of electronic patient records was acceptable to all RECs; one REC raised concerns about the short parent information sheet, 10 about inclusion benefit and 9 about opt-out consent. Following responses to queries, nine RECs granted a favourable final opinion and three rejected the application because they considered the opt-out consent process invalid. A majority of RECs in this study consider the use of electronic patient record data, short information sheets, opt-out consent and mention of possible inclusion benefit to be acceptable in neonatal comparative-effectiveness research. We identified a need for guidance for RECs in relation to opt-out consent processes. These methods provide an opportunity to facilitate large randomised controlled trials. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  2. Probabilistic Risk Analysis of Run-up and Inundation in Hawaii due to Distant Tsunamis

    NASA Astrophysics Data System (ADS)

    Gica, E.; Teng, M. H.; Liu, P. L.

    2004-12-01

    Risk assessment of natural hazards usually includes two aspects, namely, the probability of the natural hazard occurrence and the degree of damage caused by the natural hazard. Our current study is focused on the first aspect, i.e., the development and evaluation of a methodology that can predict the probability of coastal inundation due to distant tsunamis in the Pacific Basin. The calculation of the probability of tsunami inundation could be a simple statistical problem if a sufficiently long record of field data on inundation were available. Unfortunately, such field data are very limited in the Pacific Basin because field measurement of inundation requires the physical presence of surveyors on site. In some areas, no field measurements were ever conducted in the past. Fortunately, there are more complete and reliable historical data on earthquakes in the Pacific Basin, partly because earthquakes can be measured remotely. There are also numerical simulation models such as the Cornell COMCOT model that can predict tsunami generation by an earthquake, propagation in the open ocean, and inundation onto a coastal land. Our objective is to develop a methodology that can link the probability of earthquakes in the Pacific Basin with the inundation probability in a coastal area. The probabilistic methodology applied here involves the following steps: first, the Pacific Rim is divided into blocks of potential earthquake sources based on the past earthquake record and fault information. Then the COMCOT model is used to predict the inundation at a distant coastal area due to a tsunami generated by an earthquake of a particular magnitude in each source block. This simulation generates a response relationship between the coastal inundation and an earthquake of a particular magnitude and location. 
Since the earthquake statistics are known for each block, by summing the probability over all earthquakes in the Pacific Rim, the probability of inundation in a coastal area can be determined through the response relationship. Although the idea of the statistical methodology applied here is not new, this study is the first to apply it to the probability of inundation caused by earthquake-generated distant tsunamis in the Pacific Basin. As a case study, the methodology is applied to predict the tsunami inundation risk in Hilo Bay in Hawaii. Since relatively more field data on tsunami inundation are available for Hilo Bay, this case study can help to evaluate the applicability of the methodology for predicting tsunami inundation risk in the Pacific Basin. Detailed results will be presented at the AGU meeting.
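
The total-probability combination described above can be sketched with toy numbers (all rates and response values are invented): each (source block, magnitude) pair carries an annual earthquake rate, and the precomputed response relationship maps it to a simulated inundation height at the coastal site.

```python
import math

annual_rate = {                  # annual earthquake rate per (block, magnitude)
    ("block_A", 8.0): 0.010,
    ("block_A", 8.5): 0.002,
    ("block_B", 8.0): 0.005,
}
inundation_m = {                 # response relationship from tsunami simulations
    ("block_A", 8.0): 1.2,
    ("block_A", 8.5): 3.5,
    ("block_B", 8.0): 0.6,
}

def annual_exceedance_rate(threshold_m):
    # Sum the rates of all source events whose simulated inundation at the
    # coastal site exceeds the threshold (total-probability combination).
    return sum(rate for key, rate in annual_rate.items()
               if inundation_m[key] > threshold_m)

lam = annual_exceedance_rate(1.0)     # events per year exceeding 1 m
p_annual = 1.0 - math.exp(-lam)       # annual exceedance probability (Poisson)
print(lam, p_annual)
```

Here only the two block_A events exceed 1 m, so their rates sum to 0.012 per year, giving an annual exceedance probability just under 1.2%.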

  3. Ontology development for provenance tracing in National Climate Assessment of the US Global Change Research Program

    NASA Astrophysics Data System (ADS)

    Ma, X.; Zheng, J. G.; Goldstein, J.; Duggan, B.; Xu, J.; Du, C.; Akkiraju, A.; Aulenbach, S.; Tilmes, C.; Fox, P. A.

    2013-12-01

    The periodical National Climate Assessment (NCA) of the US Global Change Research Program (USGCRP) [1] produces reports about findings of global climate change and the impacts of climate change on the United States. Those findings are of great public and academic concern and are used in policy and management decisions, which makes the provenance information of findings in those reports especially important. The USGCRP is developing a Global Change Information System (GCIS), in which the NCA reports and associated provenance information are the primary records. We modeled and developed Semantic Web applications for the GCIS. By applying a use case-driven iterative methodology [2], we developed an ontology [3] to represent the content structure of a report and the associated provenance information. We also mapped the classes and properties in our ontology to the W3C PROV-O ontology [4] to realize a formal representation of provenance. We successfully implemented the ontology in several pilot systems for a recent National Climate Assessment report (i.e., the NCA3). They provide users with the functionality to browse and search provenance information by topic of interest. Provenance information of the NCA3 has been made structured and interoperable by applying the developed ontology. Besides the pilot systems we developed, other tools and services are also able to interact with the data in the context of the 'Web of data' and thus create added value. Our research shows that the use case-driven iterative method bridges the gap between Semantic Web researchers and earth and environmental scientists and can be deployed rapidly for developing Semantic Web applications. Our work also provides first-hand experience in re-using the W3C PROV-O ontology in the field of earth and environmental sciences, as the PROV-O ontology was only recently ratified (on 04/30/2013) by the W3C as a recommendation and relevant applications are still rare. 
[1] http://www.globalchange.gov [2] Fox, P., McGuinness, D.L., 2008. TWC Semantic Web Methodology. Accessible at: http://tw.rpi.edu/web/doc/TWC_SemanticWebMethodology [3] https://scm.escience.rpi.edu/svn/public/projects/gcis/trunk/rdf/schema/GCISOntology.ttl [4] http://www.w3.org/TR/prov-o/

  4. Method for automatic detection of wheezing in lung sounds.

    PubMed

    Riella, R J; Nohama, P; Maia, J M

    2009-07-01

    The present report describes the development of a technique for automatic wheezing recognition in digitally recorded lung sounds. The method is based on the extraction and processing of spectral information from the respiratory cycle and the use of these data for user feedback and automatic recognition. The respiratory cycle is first pre-processed to normalize its spectral information, and its spectrogram is then computed. The spectrogram image is then processed by a two-dimensional convolution filter and a half-threshold, in order to increase the contrast and to isolate its highest-amplitude components, respectively. Next, to generate more compact data for automatic recognition, the spectral projection of the processed spectrogram is computed and stored as an array. The highest-magnitude values of the array, together with their corresponding spectral values, are then located and used as inputs to a multi-layer perceptron artificial neural network, which produces an automatic indication of the presence of wheezes. For validation of the methodology, lung sounds recorded from three different repositories were used. The results show that the proposed technique achieves 84.82% accuracy in the detection of wheezing for an isolated respiratory cycle and 92.86% accuracy when detection is carried out using groups of respiratory cycles obtained from the same person. The system also presents the original recorded sound and the post-processed spectrogram image so that users can draw their own conclusions from the data.
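
    The processing chain described in this abstract (normalization, spectrogram, 2-D convolution filtering, half-thresholding, spectral projection, peak extraction) can be sketched roughly as follows. The window length, hop size, smoothing kernel, and the `wheeze_features` helper are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def wheeze_features(cycle, fs=8000, nperseg=256, n_peaks=5):
    """Hypothetical sketch of the described pipeline: normalize one
    respiratory cycle, compute its spectrogram, smooth it with a 2-D
    convolution filter, apply a half-threshold, project onto the
    frequency axis, and return peak magnitudes plus their frequencies."""
    cycle = cycle / (np.max(np.abs(cycle)) + 1e-12)   # amplitude normalization
    hop = nperseg // 2
    n_frames = 1 + (len(cycle) - nperseg) // hop
    win = np.hanning(nperseg)
    frames = np.stack([cycle[i * hop:i * hop + nperseg] * win
                       for i in range(n_frames)])
    S = np.abs(np.fft.rfft(frames, axis=1)).T          # frequency x time
    # 3x3 mean filter (assumed kernel) applied as a 2-D convolution
    P = np.pad(S, 1, mode="edge")
    S = sum(P[i:i + S.shape[0], j:j + S.shape[1]]
            for i in range(3) for j in range(3)) / 9.0
    S = np.where(S >= S.max() / 2.0, S, 0.0)           # half-threshold
    proj = S.sum(axis=1)                               # spectral projection array
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    idx = np.argsort(proj)[-n_peaks:]                  # strongest spectral bins
    return np.concatenate([proj[idx], freqs[idx]])     # candidate MLP inputs
```

    The returned vector would then be fed to a multi-layer perceptron classifier, which is not reproduced here.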

  5. The need for calcium imaging in nonhuman primates: New motor neuroscience and brain-machine interfaces.

    PubMed

    O'Shea, Daniel J; Trautmann, Eric; Chandrasekaran, Chandramouli; Stavisky, Sergey; Kao, Jonathan C; Sahani, Maneesh; Ryu, Stephen; Deisseroth, Karl; Shenoy, Krishna V

    2017-01-01

    A central goal of neuroscience is to understand how populations of neurons coordinate and cooperate in order to give rise to perception, cognition, and action. Nonhuman primates (NHPs) are an attractive model with which to understand these mechanisms in humans, primarily due to the strong homology of their brains and the cognitively sophisticated behaviors they can be trained to perform. Using electrode recordings, the activity of one to a few hundred individual neurons may be measured electrically, which has enabled many scientific findings and the development of brain-machine interfaces. Despite these successes, electrophysiology samples sparsely from neural populations and provides little information about the genetic identity and spatial micro-organization of recorded neurons. These limitations have spurred the development of all-optical methods for neural circuit interrogation. Fluorescent calcium signals serve as a reporter of neuronal responses, and when combined with post-mortem optical clearing techniques such as CLARITY, provide dense recordings of neuronal populations, spatially organized and annotated with genetic and anatomical information. Here, we advocate that this methodology, which has been of tremendous utility in smaller animal models, can and should be developed for use with NHPs. We review several of the key opportunities and challenges for calcium-based optical imaging in NHPs. We focus on motor neuroscience and brain-machine interface design as representative domains of opportunity within the larger field of NHP neuroscience. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  6. A diabetes dashboard and physician efficiency and accuracy in accessing data needed for high-quality diabetes care.

    PubMed

    Koopman, Richelle J; Kochendorfer, Karl M; Moore, Joi L; Mehr, David R; Wakefield, Douglas S; Yadamsuren, Borchuluun; Coberly, Jared S; Kruse, Robin L; Wakefield, Bonnie J; Belden, Jeffery L

    2011-01-01

    We compared use of a new diabetes dashboard screen with use of a conventional approach of viewing multiple electronic health record (EHR) screens to find data needed for ambulatory diabetes care. We performed a usability study, including a quantitative time study and qualitative analysis of information-seeking behaviors. While being recorded with Morae Recorder software and "think-aloud" interview methods, 10 primary care physicians first searched their EHR for 10 diabetes data elements using a conventional approach for a simulated patient, and then using a new diabetes dashboard for another. We measured time, number of mouse clicks, and accuracy. Two coders analyzed think-aloud and interview data using grounded theory methodology. The mean time needed to find all data elements was 5.5 minutes using the conventional approach vs 1.3 minutes using the diabetes dashboard (P <.001). Physicians correctly identified 94% of the data requested using the conventional method, vs 100% with the dashboard (P <.01). The mean number of mouse clicks was 60 for conventional searching vs 3 clicks with the diabetes dashboard (P <.001). A common theme was that in everyday practice, if physicians had to spend too much time searching for data, they would either continue without it or order a test again. Using a patient-specific diabetes dashboard improves both the efficiency and accuracy of acquiring data needed for high-quality diabetes care. Usability analysis tools can provide important insights into the value of optimizing physician use of health information technologies.

  7. Easier said than done!: methodological challenges with conducting maternal death review research in Malawi.

    PubMed

    Combs Thorsen, Viva; Sundby, Johanne; Meguid, Tarek; Malata, Address

    2014-02-21

    Maternal death auditing is widely used to ascertain in-depth information on the clinical, social, cultural, and other contributing factors that result in a maternal death. As the 2015 deadline for Millennium Development Goal 5 of reducing maternal mortality by three quarters between 1990 and 2015 draws near, this information becomes even more critical for informing intensified maternal mortality reduction strategies. Studies using maternal death audit methodologies are widely available, but few discuss the challenges in their implementation. The purpose of this paper is to discuss the methodological issues that arose while conducting maternal death review research in Lilongwe, Malawi. Critical reflections were based on a recently conducted maternal mortality study in Lilongwe, Malawi in which a facility-based maternal death review approach was used. The five-step maternal mortality surveillance cycle provided the framework for discussion. The steps included: 1) identification of cases, 2) data collection, 3) data analysis, 4) recommendations, and 5) evaluation. Challenges experienced were related to the first three steps of the surveillance cycle. They included: 1) identification of cases: conflicting maternal death numbers, and missing medical charts, 2) data collection: poor record keeping, poor quality of documentation, difficulties in identifying and locating appropriate healthcare workers for interviews, the potential introduction of bias through the use of an interpreter, and difficulties with locating family and community members and recall bias; and 3) data analysis: determining the causes of death and clinical diagnoses. Conducting facility-based maternal death reviews for the purpose of research has several challenges. This paper illustrated that performing such an activity, particularly the data collection phase, was not as easy as conveyed in international guidelines and in published studies. However, these challenges are not insurmountable. 
If they are anticipated and proper steps are taken in advance, they can be avoided or their effects minimized.

  8. [Simulation and data mining model for identifying and predicting budget changes in the care of patients with hypertension].

    PubMed

    Joyanes-Aguilar, Luis; Castaño, Néstor J; Osorio, José H

    2015-10-01

    Objective To present a simulation model that establishes the economic impact on the health care system produced by the diagnostic evolution of patients suffering from arterial hypertension. Methodology The information used corresponds to that available in Individual Health Records (RIPS, in Spanish). A statistical characterization was carried out and a model for matrix storage in MATLAB was proposed. Data mining was used to create predictors. Finally, a simulation environment was built to determine the economic cost of diagnostic evolution. Results 5.7% of the population progresses from the initial diagnosis, and the associated cost overrun is 43.2%. Conclusions The results show the applicability and possibility of focusing research on establishing diagnostic relationships using all the information reported in the RIPS, in order to create econometric indicators that can determine which diagnostic evolutions are most relevant to budget allocation.

  9. Determining accurate vaccination coverage rates for adolescents: the National Immunization Survey-Teen 2006.

    PubMed

    Jain, Nidhi; Singleton, James A; Montgomery, Margrethe; Skalland, Benjamin

    2009-01-01

    Since 1994, the Centers for Disease Control and Prevention has funded the National Immunization Survey (NIS), a large telephone survey used to estimate vaccination coverage of U.S. children aged 19-35 months. The NIS is a two-phase survey that obtains vaccination receipt information from a random-digit-dialed survey, designed to identify households with eligible children, followed by a provider record check, which obtains provider-reported vaccination histories for eligible children. In 2006, the survey was expanded for the first time to include a national sample of adolescents aged 13-17 years, called the NIS-Teen. This article summarizes the methodology used in the NIS-Teen. In 2008, the NIS-Teen was expanded to collect state-specific and national-level data to determine vaccination coverage estimates. This survey provides valuable information to guide immunization programs for adolescents.

  10. Why Consumers Misattribute Sponsorships to Non-Sponsor Brands: Differential Roles of Item and Relational Communications.

    PubMed

    Weeks, Clinton S; Humphreys, Michael S; Cornwell, T Bettina

    2018-02-01

    Brands engaged in sponsorship of events commonly have objectives that depend on consumer memory for the sponsor-event relationship (e.g., sponsorship awareness). Consumers, however, often misattribute sponsorships to non-sponsor competitor brands, indicating erroneous memory for these relationships. The current research uses an item and relational memory framework to reveal that sponsor brands may inadvertently foster this misattribution when they communicate relational linkages to events. Effects can be explained via differential roles of communicating item information (information that supports processing item distinctiveness) versus relational information (information that supports processing relationships among items) in contributing to memory outcomes. Experiment 1 uses event-cued brand recall to show that correct memory retrieval is best supported by communicating relational information when sponsorship relationships are not obvious (low congruence). In contrast, correct retrieval is best supported by communicating item information when relationships are obvious (high congruence). Experiment 2 uses brand-cued event recall to show that, against conventional marketing recommendations, relational information increases misattribution, whereas item information guards against misattribution. Results suggest sponsor brands must distinguish between item and relational communications to enhance correct retrieval and limit misattribution. Methodologically, the work shows that the choice of cueing direction is critical in differentially revealing patterns of correct and incorrect retrieval with pair relationships. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Methodological, technical, and ethical issues of a computerized data system.

    PubMed

    Rice, C A; Godkin, M A; Catlin, R J

    1980-06-01

    This report examines some methodological, technical, and ethical issues which need to be addressed in designing and implementing a valid and reliable computerized clinical data base. The report focuses on the data collection system used by four residency based family health centers, affiliated with the University of Massachusetts Medical Center. It is suggested that data reliability and validity can be maximized by: (1) standardizing encounter forms at affiliated health centers to eliminate recording biases and ensure data comparability; (2) using forms with a diagnosis checklist to reduce coding errors and increase the number of diagnoses recorded per encounter; (3) developing uniform diagnostic criteria; (4) identifying sources of error, including discrepancies of clinical data as recorded in medical records, encounter forms, and the computer; and (5) improving provider cooperation in recording data by distributing data summaries which reinforce the data's applicability to service provision. Potential applications of the data for research purposes are restricted by personnel and computer costs, confidentiality considerations, programming related issues, and, most importantly, health center priorities, largely focused on patient care, not research.

  12. Cloud cameras at the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Winnick, Michael G.

    2010-06-01

    This thesis presents the results of measurements made by infrared cloud cameras installed at the Pierre Auger Observatory in Argentina. These cameras were used to record cloud conditions during operation of the observatory's fluorescence detectors. As cloud may affect the measurement of fluorescence from cosmic ray extensive air showers, the cloud cameras provide a record of which measurements have been interfered with by cloud. Several image processing algorithms were developed, along with a methodology for the detection of cloud within infrared images taken by the cloud cameras. A graphical user interface (GUI) was developed to expedite this, as a large number of images needed to be checked for cloud. A cross-check between images recorded by three of the observatory's cloud cameras is presented, along with a comparison with independent cloud measurements made by LIDAR. Despite the cloud cameras and LIDAR observing different areas of the sky, good agreement is observed in the measured cloud fraction between the two instruments, particularly on very clear and overcast nights. Cloud information recorded by the cloud cameras, with cloud height information measured by the LIDAR, was used to identify those extensive air showers that were obscured by cloud. These events were used to study the effectiveness of standard quality cuts at removing cloud-afflicted events. Of all of the standard quality cuts studied in this thesis, the LIDAR cloud fraction cut was the most effective at preferentially removing cloud-obscured events. A 'cloudy pixel' veto is also presented, whereby cloud-obscured measurements are excluded during the standard hybrid analysis, and new extensive air shower reconstruction parameters determined. The application of such a veto would provide a slight increase in the number of events available for higher level analysis.

  13. 75 FR 80425 - Satellite Television Extension and Localism Act of 2010 and Satellite Home Viewer Extension and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... comment the submission of additional information concerning the methodological changes for the digital... additional information concerning the methodological changes suggested in the comments by Mr. Shumate for the...-loss. The Commission is requesting a detailed description of the methodological changes that would be...

  14. Managing In-House Development of a Campus-Wide Information System

    ERIC Educational Resources Information Center

    Shurville, Simon; Williams, John

    2005-01-01

    Purpose: To show how a combination of hard and soft project and change management methodologies guided successful in-house development of a campus-wide information system. Design/methodology/approach: A case study of the methodologies and management structures that guided the development is presented. Findings: Applying a combination of the…

  15. Calibration of diatom-pH-alkalinity methodology for the interpretation of the sedimentary record in Emerald Lake Integrated watershed study. Final report, 6 May 1985-10 October 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmes, R.W.

    1986-10-10

    The present study was designed to establish quantitative relationships between lake air-equilibrated pH, alkalinity, and diatoms occurring in the surface sediments in high-elevation Sierra Nevada Lakes. These relationships provided the necessary information to develop predictive equations relating lake pH to the composition of surface-sediment diatom assemblages in 27 study lakes. Using the Hustedt diatom pH classification system, Index B of Renberg and Hellberg, and multiple linear regression analysis, two equations were developed which predict lake pH from the relative abundance of sediment diatoms occurring in each of four diatom pH groupings.
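
    The predictive step described above, regressing lake pH on the relative abundances of diatom pH groups, can be illustrated with a minimal least-squares sketch. The function names and the four-group design are hypothetical; the actual equations used Index B of Renberg and Hellberg and the Hustedt groupings:

```python
import numpy as np

def fit_ph_model(group_abundances, observed_ph):
    """Ordinary least-squares fit of lake pH on the relative abundances
    of surface-sediment diatom pH groups (illustrative sketch)."""
    X = np.column_stack([np.ones(len(observed_ph)), group_abundances])
    coef, *_ = np.linalg.lstsq(X, observed_ph, rcond=None)
    return coef  # [intercept, one slope per pH group]

def predict_ph(coef, abundances):
    """Predict air-equilibrated lake pH for one sediment sample."""
    return coef[0] + np.asarray(abundances) @ coef[1:]
```

    Fitting such an equation on the 27 study lakes, then applying it down-core, is what turns the sedimentary diatom record into a pH history.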

  16. Experiences of Structured Elicitation for Model-Based Cost-Effectiveness Analyses.

    PubMed

    Soares, Marta O; Sharples, Linda; Morton, Alec; Claxton, Karl; Bojke, Laura

    2018-06-01

    Empirical evidence supporting the cost-effectiveness estimates of particular health care technologies may be limited, or it may even be missing entirely. In these situations, additional information, often in the form of expert judgments, is needed to reach a decision. There are formal methods to quantify experts' beliefs, termed structured expert elicitation (SEE), but only limited research is available in support of methodological choices. Perhaps as a consequence, the use of SEE in the context of cost-effectiveness modelling is limited. This article reviews applications of SEE in cost-effectiveness modelling with the aim of summarizing the basis for methodological choices made in each application and recording the difficulties and challenges reported by the authors in the design, conduct, and analyses. The methods used in each application were extracted along with the criteria used to support methodological and practical choices and any issues or challenges discussed in the text. Issues and challenges were extracted using an open field and then categorised and grouped for reporting. The review demonstrates considerable heterogeneity in the methods used, and authors acknowledge great methodological uncertainty in justifying their choices. Features of this context that emerge as potentially important in guiding further methodological research in elicitation include between-expert variation and its interpretation; the fact that substantive experts in the area may not be trained in quantitative subjects; the need for judgments on various parameter types; the need for some form of assessment of validity; and the need for more integration with behavioural research to devise relevant debiasing strategies. This review of experiences of SEE highlights a number of specificities and constraints that can shape the development of guidance and target future research efforts in this area. 
Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and "Braslet-M" Occlusion Cuffs

    NASA Technical Reports Server (NTRS)

    Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.

    2008-01-01

    Recent advances in remotely guided imaging techniques on ISS allow the acquisition of high quality ultrasound data using crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS Ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller) to broaden the spectrum of anatomical and functional information on the human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede venous return from the lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. The Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are performed during these volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, its effects, and the indications for its use.

  18. Analyzing workplace exposures using direct reading instruments and video exposure monitoring techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gressel, M.G.; Heitbrink, W.A.; Jensen, P.A.

    1992-08-01

    The techniques for conducting video exposure monitoring were described, along with the equipment required to monitor and record worker breathing zone concentrations, the analysis of the real-time exposure data using video recordings, and the use of real-time concentration data from a direct reading instrument to determine the effective ventilation rate and the mixing factor of a given room at a specific time. Case studies which made use of video exposure monitoring techniques to provide information not available through integrated sampling were also discussed. The process being monitored and the methodology used to monitor the exposures were described for each of the case studies. The case studies included manual material weigh-out, ceramic casting cleaning, dumping bags of powdered materials, furniture stripping, administration of nitrous oxide during dental procedures, hand-held sanding operations, methanol exposures in maintenance garages, brake servicing, bulk loading of railroad cars and trucks, and grinding operations.

  19. Recommendations for the use of electroencephalography and evoked potentials in comatose patients.

    PubMed

    André-Obadia, Nathalie; Zyss, Julie; Gavaret, Martine; Lefaucheur, Jean-Pascal; Azabou, Eric; Boulogne, Sébastien; Guérit, Jean-Michel; McGonigal, Aileen; Merle, Philippe; Mutschler, Véronique; Naccache, Lionel; Sabourdy, Cécile; Trébuchon, Agnès; Tyvaert, Louise; Vercueil, Laurent; Rohaut, Benjamin; Delval, Arnaud

    2018-05-18

    Predicting the outcome of a comatose or poorly responsive patient is a major issue for intensive care unit teams, in order to give the most accurate information to the family and to choose the best therapeutic option. However, determining the level of cortical activity in patients with disorders of consciousness is a real challenge. Reliable criteria are required to help clinicians in the decision-making process, especially in the acute phase of coma. In this paper, we propose recommendations for recording and interpreting electroencephalography and evoked potentials in comatose patients based on the literature and the clinical experience of a group of neurophysiologists trained in the management of comatose patients. We propose methodological guidelines and discuss prognostic value of each test as well as the limitations concerning recording and interpretation. Recommendations for the strategy and timing of neurophysiological assessments are also proposed according to various clinical situations. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  20. Estimating length of avian incubation and nestling stages in afrotropical forest birds from interval-censored nest records

    USGS Publications Warehouse

    Stanley, T.R.; Newmark, W.D.

    2010-01-01

    In the East Usambara Mountains in northeast Tanzania, research on the effects of forest fragmentation and disturbance on nest survival in understory birds resulted in the accumulation of 1,002 nest records between 2003 and 2008 for 8 poorly studied species. Because information on the length of the incubation and nestling stages in these species is nonexistent or sparse, our objectives in this study were (1) to estimate the lengths of the incubation and nestling stages and (2) to compute nest survival using these estimates in combination with calculated daily survival probability. Because our data were interval-censored, we developed and applied two new statistical methods to estimate stage length. In the 8 species studied, the incubation stage lasted 9.6-21.8 days and the nestling stage 13.9-21.2 days. Combining these results with estimates of daily survival probability, we found that nest survival ranged from 6.0% to 12.5%. We conclude that our methodology for estimating stage lengths from interval-censored nest records is a reasonable and practical approach. © 2010 The American Ornithologists' Union.
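
    The final step described above, combining stage lengths with daily survival probability, is a standard Mayfield-style compounding calculation. A minimal sketch (function name assumed; the authors' interval-censoring estimators are not reproduced here):

```python
def nest_survival(daily_survival, incubation_days, nestling_days):
    """Probability that a nest survives the whole nest period, assuming
    a constant daily survival probability compounded over the incubation
    plus nestling stages (Mayfield-style calculation)."""
    return daily_survival ** (incubation_days + nestling_days)
```

    For example, a daily survival probability of 0.90 over a 22-day nest period gives roughly 9.8% nest survival, within the 6.0-12.5% range reported above.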

  1. Studying Information Needs as Question-Negotiations in an Educational Context: A Methodological Comment

    ERIC Educational Resources Information Center

    Lundh, Anna

    2010-01-01

    Introduction: The concept of information needs is significant within the field of Information Needs Seeking and Use. "How" information needs can be studied empirically is however something that has been called into question. The main aim of this paper is to explore the methodological consequences of discursively oriented theories when…

  2. Methodologies and Methods for User Behavioral Research.

    ERIC Educational Resources Information Center

    Wang, Peiling

    1999-01-01

    Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…

  3. Thinking about Museum Information.

    ERIC Educational Resources Information Center

    Reed, Patricia Ann; Sledge, Jane

    1988-01-01

    Describes work in progress at the Smithsonian Institution in developing a system to understand and articulate the information needed to support collection related functions. The discussion covers the data modeling methodology used and the advantages of this methodology in structuring museum collections information. (one reference) (CLB)

  4. A New Kind of Single-Well Tracer Test for Assessing Subsurface Heterogeneity

    NASA Astrophysics Data System (ADS)

    Hansen, S. K.; Vesselinov, V. V.; Lu, Z.; Reimus, P. W.; Katzman, D.

    2017-12-01

    Single-well injection-withdrawal (SWIW) tracer tests have historically been interpreted under the idealized assumption of tracer path reversibility (i.e., negligible background flow), with background flow due to the natural hydraulic gradient treated as an un-modeled confounding factor. However, we have recently discovered that background flow can be used to our advantage to extract additional information about the subsurface: we have developed a new kind of single-well tracer test that exploits flow under the natural gradient to estimate the variance of the log hydraulic conductivity field of a heterogeneous aquifer. The test methodology involves injection under a forced gradient and withdrawal under the natural gradient, and makes use of a relationship, discovered through a large-scale Monte Carlo study and machine learning techniques, between the power-law breakthrough-curve tail exponent and the log hydraulic conductivity variance. We will discuss how we performed the computational study and derived this relationship, and then show an application example in which our new single-well tracer test interpretation scheme was applied to estimating the heterogeneity of a formation at the chromium contamination site at Los Alamos National Laboratory. Detailed core hole records exist at the same site, from which it was possible to estimate the log hydraulic conductivity variance using a Kozeny-Carman relation. The variances estimated with our new tracer test methodology and by direct inspection of core were nearly identical, corroborating the new methodology. Assessment of aquifer heterogeneity is critically important to the deployment of amendments associated with in-situ remediation strategies, since permeability contrasts potentially reduce the interaction between amendment and contaminant. Our new tracer test provides an easy way to obtain this information.
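
    The key empirical quantity in this methodology, the power-law exponent of the breakthrough-curve tail, can be estimated by log-log regression over the late-time data. This is a sketch only: the fitting window is an assumption, and the authors' mapping from exponent to log-conductivity variance (derived from their Monte Carlo study) is not reproduced here:

```python
import numpy as np

def tail_exponent(times, conc, tail_frac=0.3):
    """Estimate b in C(t) ~ t**(-b) for a breakthrough-curve tail by
    linear regression in log-log space over the final `tail_frac` of
    the record (illustrative; window choice is an assumption)."""
    times, conc = np.asarray(times), np.asarray(conc)
    i0 = int(len(times) * (1.0 - tail_frac))
    slope, _ = np.polyfit(np.log(times[i0:]), np.log(conc[i0:]), 1)
    return -slope  # power-law tail exponent
```
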

  5. Development of a risk-based prioritisation methodology to inform public health emergency planning and preparedness in case of accidental spill at sea of hazardous and noxious substances (HNS).

    PubMed

    Harold, P D; de Souza, A S; Louchart, P; Russell, D; Brunt, H

    2014-11-01

    Hazardous and noxious chemicals are increasingly being transported by sea. Current estimates indicate that some 2000 hazardous and noxious substances (HNS) are carried regularly by sea, with bulk trade of 165 million tonnes per year worldwide. Over 100 incidents involving HNS have been reported in EU waters. Incidents occurring in a port or coastal area can have potential and actual public health implications. A methodology has been developed for the prioritisation of HNS based upon potential public health risks. The work, undertaken for the Atlantic Region Pollution Response programme (ARCOPOL), aims to provide information for incident planning and preparedness. HNS were assessed using conventional methodology based upon acute toxicity, behaviour, and reactivity. Tonnage was used as a proxy for likelihood, although other factors such as shipping frequency and local navigation may also contribute. Analysis of 350 individual HNS identified the highest-priority HNS as those presenting an inhalation risk. Limitations were identified around obtaining accurate data on HNS handled at a local and regional level, due to a lack of port records as well as political and commercial confidentiality issues. To account for this, the project also developed a software tool capable of combining chemical data from the study with user-defined shipping data, to be used by operators to produce area-specific prioritisations. In conclusion, a risk prioritisation matrix has been developed to assess the acute risks to public health from the transportation of HNS. Its potential use in emergency planning and preparedness is discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
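
    A risk-matrix score of the kind described, hazard (acute toxicity, behaviour/reactivity) combined with tonnage as a likelihood proxy, might be sketched as below. The functional form, weights, and function name are assumptions for illustration, not the ARCOPOL scheme:

```python
import math

def priority_score(toxicity_rank, tonnage, behaviour_factor=1.0):
    """Illustrative risk-matrix score: an acute-toxicity rank (hazard)
    scaled by a behaviour/reactivity factor and by log-tonnage as a
    proxy for incident likelihood. All weights are assumptions."""
    return toxicity_rank * behaviour_factor * math.log10(1.0 + tonnage)
```

    The log on tonnage reflects that likelihood grows sub-linearly with volume shipped; a real scheme would calibrate this against incident records.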

  6. Assessing the spatial representability of charcoal and PAH-based paleofire records with integrated GIS, modelling, and empirical approaches

    NASA Astrophysics Data System (ADS)

    Vachula, R. S.; Huang, Y.; Russell, J. M.

    2017-12-01

    Lake sediment-based fire reconstructions offer paleoenvironmental context in which to assess modern fires and predict future burning. However, despite their ubiquity, many uncertainties remain regarding the taphonomy of paleofire proxies and the spatial scales at which they record variations in fire history. Here we present down-core proxy analyses of polycyclic aromatic hydrocarbons (PAHs) and three size fractions of charcoal (63-150, >150, and >250 μm) from Swamp Lake, California, an annually laminated lacustrine archive. Using a statewide historical GIS dataset of area burned, we assess the spatial scales at which these proxies are reliable recorders of fire history. We find that the coherence of observed and proxy-recorded fire history inherently depends upon spatial scale. Contrary to conventional thinking that charcoal mainly records local fires, our results indicate that macroscopic charcoal (>150 μm) may record spatially broader (<25 km) changes in fire history, and as such, the coarsest charcoal particles (>250 μm) may be a more conservative proxy for local burning. We find that sub-macroscopic charcoal particles (63-150 μm) reliably record regional (up to 150 km) changes in fire history. These results indicate that charcoal-based fire reconstructions may represent spatially broader fire history than previously thought, which has major implications for our understanding of spatiotemporal paleofire variations. Our analyses of PAHs show that dispersal mobility is heterogeneous between compounds, but that PAH fluxes are reliable proxies of fire history within 25-50 km, which suggests PAHs may be a better spatially constrained paleofire proxy than sedimentary charcoal. Further, using a linear discriminant analysis model informed by modern emissions analyses, we show that PAH assemblages preserved in lake sediments can differentiate the vegetation type burned, and are thus promising paleoecological biomarkers warranting further research and implementation.
In sum, our analyses offer new insight into the spatial dimensions of paleofire proxies and constitute a methodology that can be applied to other locations and proxies to better inform site-specific reconstructions.

  7. [Linking anonymous databases for national and international multicenter epidemiological studies: a cryptographic algorithm].

    PubMed

    Quantin, C; Fassa, M; Coatrieux, G; Riandey, B; Trouessin, G; Allaert, F A

    2009-02-01

    Compiling individual records that come from different sources remains very important for multicenter epidemiological studies, but at the same time European directives and other national legislation concerning nominal data processing have to be respected. These legal aspects can be satisfied by implementing mechanisms that allow anonymization of patient data (such as hashing techniques). Moreover, for security reasons, official recommendations suggest using different cryptographic keys in combination with a cryptographic hash function for each study. Unfortunately, such an anonymization procedure contradicts a common requirement of public health and biomedical research, as it becomes almost impossible to link records from separate data collections in which the same entity is not referenced in the same way. Solving this paradox with a methodology based on the combination of hashing and enciphering techniques is the main aim of this article. The method relies on one of the best-known hashing functions (the secure hash algorithm) to ensure the anonymity of personal information while providing greater resistance to dictionary attacks, combined with encryption techniques. The originality of the method lies in the way the combination of hashing and enciphering techniques is performed: as in asymmetric encryption, two keys are used, but the private key depends on the patient's identity. The combination of hashing and enciphering techniques provides a great improvement in the overall security of the proposed scheme. This methodology makes the stored data available for use in the field of public health for the benefit of patients, while respecting legal security requirements.
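    As a rough illustration of the keyed-hashing idea (not the authors' exact algorithm, whose private key depends on the patient's identity), a keyed hash such as HMAC-SHA-256 with a per-study secret produces study-specific anonymous linkage codes while resisting the dictionary attacks that defeat a bare hash:

```python
import hmac
import hashlib

def pseudonym(identity: str, study_key: bytes) -> str:
    """Derive an anonymous linkage code from a patient identity string.
    A keyed hash (HMAC-SHA-256) resists dictionary attacks better than a
    bare hash: without the study key, an attacker cannot precompute codes
    for candidate identities."""
    normalized = identity.strip().lower()   # same entity -> same code
    return hmac.new(study_key, normalized.encode(), hashlib.sha256).hexdigest()

key = b"per-study secret key"              # hypothetical key management
a = pseudonym("Dupont, Marie, 1957-03-14", key)
b = pseudonym("dupont, marie, 1957-03-14", key)
print(a == b)  # True: records from two sources link on the same code
```

    The per-study key is why linkage across studies fails by design, which is exactly the paradox the article sets out to resolve.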

  8. A model for national outcome audit in vascular surgery.

    PubMed

    Prytherch, D R; Ridler, B M; Beard, J D; Earnshaw, J J

    2001-06-01

    The aim was to model vascular surgical outcome in a national study using POSSUM scoring. One hundred and twenty-one British and Irish surgeons completed data questionnaires on patients undergoing arterial surgery under their care (mean 12 patients, range 1-49) in May/June 1998. A total of 1480 completed data records were available for logistic regression analysis using P-POSSUM methodology. Information collected included all POSSUM data items plus other factors thought to have a significant bearing on patient outcome: "extra items". The main outcome measures were death and major postoperative complications. The data were checked and inconsistent records were excluded. The remaining 1313 were divided into two sets for analysis. The first "training" set was used to obtain logistic regression models that were applied prospectively to the second "test" dataset. Using POSSUM data items alone, it was possible to predict both mortality and morbidity after vascular reconstruction using P-POSSUM analysis. The addition of the "extra items" found significant in regression analysis did not significantly improve the accuracy of prediction. It was also possible to predict both mortality and morbidity from the preoperative physiology components of the POSSUM data items alone. This study has shown that P-POSSUM methodology can be used to predict outcome after arterial surgery across a range of surgeons in different hospitals and could form the basis of a national outcome audit. It was also possible to obtain accurate models for both mortality and major morbidity from the POSSUM physiology scores alone. Copyright 2001 Harcourt Publishers Limited.
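    For context, the P-POSSUM prediction referred to here is a fixed logistic equation. The coefficients below are the widely quoted ones from the original P-POSSUM publication (Prytherch et al., 1998), not values reported in this abstract; the scores passed in are invented for illustration:

```python
import math

def p_possum_mortality(physiology_score: int, operative_score: int) -> float:
    """Predicted mortality risk R from the widely cited P-POSSUM equation:
        ln(R / (1 - R)) = -9.065 + 0.1692*PS + 0.1550*OS
    where PS is the physiology score and OS the operative severity score.
    Coefficients are quoted from the literature, not from this abstract."""
    logit = -9.065 + 0.1692 * physiology_score + 0.1550 * operative_score
    return 1.0 / (1.0 + math.exp(-logit))

# Illustrative low-risk vs high-risk score combinations
print(round(p_possum_mortality(12, 6), 4))
print(round(p_possum_mortality(30, 20), 4))
```

    Fitting study-specific coefficients by logistic regression, as the authors did, follows the same functional form with the intercept and weights re-estimated from the training set.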

  9. Knowledge discovery for Deep Phenotyping serious mental illness from Electronic Mental Health records.

    PubMed

    Jackson, Richard; Patel, Rashmi; Velupillai, Sumithra; Gkotsis, George; Hoyle, David; Stewart, Robert

    2018-01-01

    Background: Deep Phenotyping is the precise and comprehensive analysis of phenotypic features in which the individual components of the phenotype are observed and described. In UK mental health clinical practice, most clinically relevant information is recorded as free text in the Electronic Health Record, and offers a granularity of information beyond what is expressed in most medical knowledge bases. The SNOMED CT nomenclature potentially offers the means to model such information at scale, yet given a sufficiently large body of clinical text collected over many years, it is difficult to identify the language that clinicians favour to express concepts. Methods: By utilising a large corpus of healthcare data, we sought to make use of semantic modelling and clustering techniques to represent the relationship between the clinical vocabulary of internationally recognised SMI symptoms and the preferred language used by clinicians within a care setting. We explore how such models can be used for discovering novel vocabulary relevant to the task of phenotyping Serious Mental Illness (SMI) with only a small amount of prior knowledge. Results: 20 403 terms were derived and curated via a two-stage methodology. The list was reduced to 557 putative concepts based on eliminating redundant information content. These were then organised into 9 distinct categories pertaining to different aspects of psychiatric assessment. 235 concepts were found to be expressions of putative clinical significance. Of these, 53 were identified as having novel synonymy with existing SNOMED CT concepts. 106 had no mapping to SNOMED CT. Conclusions: We demonstrate a scalable approach to discovering new concepts of SMI symptomatology based on real-world clinical observation.
Such approaches may offer the opportunity to consider broader manifestations of SMI symptomatology than is typically assessed via current diagnostic frameworks, and create the potential for enhancing nomenclatures such as SNOMED CT based on real-world expressions.

  10. Knowledge discovery for Deep Phenotyping serious mental illness from Electronic Mental Health records

    PubMed Central

    Jackson, Richard; Patel, Rashmi; Velupillai, Sumithra; Gkotsis, George; Hoyle, David; Stewart, Robert

    2018-01-01

    Background: Deep Phenotyping is the precise and comprehensive analysis of phenotypic features in which the individual components of the phenotype are observed and described. In UK mental health clinical practice, most clinically relevant information is recorded as free text in the Electronic Health Record, and offers a granularity of information beyond what is expressed in most medical knowledge bases. The SNOMED CT nomenclature potentially offers the means to model such information at scale, yet given a sufficiently large body of clinical text collected over many years, it is difficult to identify the language that clinicians favour to express concepts. Methods: By utilising a large corpus of healthcare data, we sought to make use of semantic modelling and clustering techniques to represent the relationship between the clinical vocabulary of internationally recognised SMI symptoms and the preferred language used by clinicians within a care setting. We explore how such models can be used for discovering novel vocabulary relevant to the task of phenotyping Serious Mental Illness (SMI) with only a small amount of prior knowledge. Results: 20 403 terms were derived and curated via a two-stage methodology. The list was reduced to 557 putative concepts based on eliminating redundant information content. These were then organised into 9 distinct categories pertaining to different aspects of psychiatric assessment. 235 concepts were found to be expressions of putative clinical significance. Of these, 53 were identified as having novel synonymy with existing SNOMED CT concepts. 106 had no mapping to SNOMED CT. Conclusions: We demonstrate a scalable approach to discovering new concepts of SMI symptomatology based on real-world clinical observation.
Such approaches may offer the opportunity to consider broader manifestations of SMI symptomatology than is typically assessed via current diagnostic frameworks, and create the potential for enhancing nomenclatures such as SNOMED CT based on real-world expressions. PMID:29899974

  11. Opportunities and methodological challenges in EEG and MEG resting state functional brain network research.

    PubMed

    van Diessen, E; Numan, T; van Dellen, E; van der Kooi, A W; Boersma, M; Hofman, D; van Lutterveld, R; van Dijk, B W; van Straaten, E C W; Hillebrand, A; Stam, C J

    2015-08-01

    Electroencephalogram (EEG) and magnetoencephalogram (MEG) recordings during resting state are increasingly used to study functional connectivity and network topology. Moreover, the number of different analysis approaches is expanding along with the rising interest in this research area. The comparison between studies can therefore be challenging, and discussion is needed to underscore methodological opportunities and pitfalls in functional connectivity and network studies. In this overview we discuss methodological considerations throughout the analysis pipeline of recording and analyzing resting state EEG and MEG data, with a focus on functional connectivity and network analysis. We summarize current common practices with their advantages and disadvantages, provide practical tips, and offer suggestions for future research. Finally, we discuss how methodological choices in resting state research can affect the construction of functional networks. By taking advantage of current best practices and avoiding the most obvious pitfalls, functional connectivity and network studies can be improved, enabling more accurate interpretation and comparison between studies. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Quantifying the probability of record-setting heat events in the historical record and at different levels of climate forcing

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2017-12-01

    Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as in periods with lower and higher greenhouse gas concentrations than the present.
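    A useful baseline behind such quantification: in a stationary climate where annual values are exchangeable, the nth year sets a new record with probability 1/n, so the expected number of records over n years is the harmonic number H_n; record counts in excess of this baseline point to external forcing. A minimal simulation of the stationary case (illustrative only, not the presenter's method):

```python
import random

def count_records(series):
    """Count record-setting values: entries strictly exceeding all before them."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

# Under a stationary (i.i.d.) climate the expected number of records in
# n years is H_n = 1 + 1/2 + ... + 1/n, since year i sets a record with
# probability 1/i.  A warming trend inflates the count above this baseline.
random.seed(0)
n, trials = 100, 2000
mean_stationary = sum(count_records([random.gauss(0.0, 1.0) for _ in range(n)])
                      for _ in range(trials)) / trials
harmonic = sum(1.0 / i for i in range(1, n + 1))
print(round(mean_stationary, 2), round(harmonic, 2))
```

    The 1/n result holds for any continuous distribution of exchangeable values, which is what makes the stationary baseline distribution-free.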

  13. Cognitive chrono-ethnography lite.

    PubMed

    Nakajima, Masato; Yamada, Kosuke C; Kitajima, Muneo

    2012-01-01

    Conducting field research facilitates understanding of human daily activities. Cognitive Chrono-Ethnography (CCE) is a study methodology used to understand how people select actions in daily life by conducting ethnographical field research. CCE consists of measuring monitors' daily activities in a specified field and conducting in-depth interviews afterward using the recorded videos. However, privacy issues may arise when conducting standard CCE with video recordings in a daily field. To resolve these issues, we developed a new study methodology, CCE Lite. To replace video recordings, we created pseudo-first-person-view (PFPV) movies using a computer-graphics technique. The PFPV movies were used to remind the monitors of their activities. These movies replicated monitors' activities (e.g., locomotion and changes in physical direction), with no human images or voices. We applied CCE Lite in a case study that involved female employees of hotels at a spa resort. In-depth interviews conducted while showing the PFPV movies determined the service schema of the employees (i.e., hospitality). Results indicated that using PFPV movies helped the employees to remember and reconstruct the situations of recorded activities.

  14. The influence of biological sex, sexuality and gender role on interpersonal distance.

    PubMed

    Uzzell, David; Horne, Nathalie

    2006-09-01

    This research reports on a conceptually and methodologically innovative study, which sought to measure the influence of gender on interpersonal distance. In so doing, we argue for an important distinction to be made between biological sex, gender role, and sexuality. To date, however, progress in the study of interpersonal distance (IPD) has been inhibited by poor operational definitions and inadequate measurement methodologies. For our own investigation, we innovated on methodology by devising the digital video-recording IPD method (DiVRID) that records interpersonal spatial relationships using high quality digital video equipment. The findings highlighted not only the validity of our innovative method of investigation, but also that a more sophisticated conceptualization of the impact of gender on IPD is warranted than can be accounted for by biological sex differences. In this study, we found that gender role accounts for more of the variation in IPD than the conventionally reported gender variable, sex.

  15. Moving from theory to practice: A participatory social network mapping approach to address unmet need for family planning in Benin.

    PubMed

    Igras, Susan; Diakité, Mariam; Lundgren, Rebecka

    2017-07-01

    In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.

  16. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
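    The abstract is truncated, but the CPM computation such a methodology builds on can be sketched in a few lines. The task network below is invented for illustration; the forward pass finds each task's earliest finish time, and the project length is set by the critical path:

```python
# Minimal Critical Path Method (CPM) forward pass over a hypothetical task
# network; task names and durations are invented for illustration.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

earliest_finish = {}

def finish(task):
    """Earliest finish time: own duration plus the latest predecessor finish."""
    if task not in earliest_finish:
        start = max((finish(p) for p in predecessors[task]), default=0)
        earliest_finish[task] = start + durations[task]
    return earliest_finish[task]

project_length = max(finish(t) for t in durations)
print(project_length)  # 9: the critical path is A -> C -> D (3 + 4 + 2)
```

    A full PERT/CPM treatment adds a backward pass to obtain latest start times and slack, which is what identifies the critical (zero-slack) activities.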

  17. A Proposed Methodology for the Conceptualization, Operationalization, and Empirical Validation of the Concept of Information Need

    ERIC Educational Resources Information Center

    Afzal, Waseem

    2017-01-01

    Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…

  18. Facilitating biomedical researchers' interrogation of electronic health record data: Ideas from outside of biomedical informatics.

    PubMed

    Hruby, Gregory W; Matsoukas, Konstantina; Cimino, James J; Weng, Chunhua

    2016-04-01

    Electronic health records (EHR) are a vital data resource for research uses, including cohort identification, phenotyping, pharmacovigilance, and public health surveillance. To realize the promise of EHR data for accelerating clinical research, it is imperative to enable efficient and autonomous EHR data interrogation by end users such as biomedical researchers. This paper surveys state-of-the-art approaches and key methodological considerations for this purpose. We adapted a previously published conceptual framework for interactive information retrieval, which defines three entities: user, channel, and source, by elaborating on channels for query formulation in the context of facilitating end users to interrogate EHR data. We show that current progress in biomedical informatics mainly lies in support for query execution and information modeling, primarily due to emphases on infrastructure development for data integration and data access via self-service query tools, but has neglected the user support needed during iterative query formulation, which can be costly and error-prone. In contrast, the information science literature has offered elaborate theories and methods for user modeling and query formulation support. The two bodies of literature are complementary, implying opportunities for cross-disciplinary idea exchange. On this basis, we outline directions for future informatics research to improve our understanding of user needs and requirements for facilitating autonomous interrogation of EHR data by biomedical researchers. We suggest that cross-disciplinary translational research between biomedical informatics and information science can benefit our research in facilitating efficient data access in the life sciences. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. The use of personal data from medical records and biological materials: ethical perspectives and the basis for legal restrictions in health research.

    PubMed

    Regidor, Enrique

    2004-11-01

    This paper discusses the moral justification for using personal data without informed consent, from both medical records and biological materials, in research where subjects are not physically present in the study and will never have any contact with the study investigators. Although the idea of waiving the requirement for informed consent in certain investigations has been mentioned in several ethical guidelines formulated by epidemiologists and physicians since the late 1980s, these guidelines are now of limited use due to legal restrictions on the use of personal data in most western countries. Several misconceptions that form the basis for legal restriction of health research are discussed: lack of knowledge of the need to link personal information from health services with personal information produced outside the health system in many biomedical investigations; the assumption of a deterministic model of disease causation in which the prediction of disease occurrence is based on a genetic association despite the fact that most genotypes for common diseases are incompletely penetrant; the lack of a logical rationale for the recommendation in the Declaration of Helsinki that only research that offers some benefit to study subjects is justified; the great lack of knowledge about research methodology revealed in some alternatives proposed to avoid using personal data; and the lack of a debate about the ethical double standard of institutions and investigators in countries that prohibit the use of personal data but finance and carry out studies in other countries where it is permitted.

  20. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and process and existing XML-based representations of project management. A demonstration example of project management information system's configuration is provided.

  1. Indigenous ancestral sayings contribute to modern conservation partnerships: examples using Phormium tenax.

    PubMed

    Wehi, Priscilla M

    2009-01-01

    Traditional ecological knowledge (TEK) is central to indigenous worldviews and practices and is one of the most important contributions that indigenous people can bring to conservation management partnerships. However, researchers and managers may have difficulty accessing such knowledge, particularly where knowledge transmission has been damaged. A new methodological approach analyzes ancestral sayings from Maori oral traditions for ecological information about Phormium tenax, a plant with high cultural value that is a dominant component in many threatened wetland systems, and frequently used in restoration plantings in New Zealand. Maori ancestral sayings record an association with nectar-feeding native parrots that has only rarely been reported, as well as indications of important environmental parameters (rainfall and drought) for this species. These sayings provide evidence of indigenous management that has not been reported from interviews with elders, including evidence of fire use to create Phormium cultivations. TEK in Maori ancestral sayings implies landscape-scale processes, in comparison to the intensive, small-scale management methods often reported in interviews. TEK in ancestral sayings can be used to generate new scientific hypotheses, negotiate collaborative pathways, and identify ecological management strategies that support biodiversity retention. TEK can inform restoration ecology, historical ecology, and conservation management of species and ecosystems, especially where data from pollen records and archaeological artifacts are incomplete.

  2. Topical Review: Families Coping With Child Trauma: A Naturalistic Observation Methodology.

    PubMed

    Alisic, Eva; Barrett, Anna; Bowles, Peter; Conroy, Rowena; Mehl, Matthias R

    2016-01-01

    To introduce a novel, naturalistic observational methodology (the Electronically Activated Recorder; EAR) as an opportunity to better understand the central role of the family environment in children's recovery from trauma. Discussion of current research methods and a systematic literature review of EAR studies on health and well-being. Surveys, experience sampling, and the EAR method each provide different opportunities and challenges for studying family interactions. We identified 17 articles describing relevant EAR studies. These investigated questions of emotional well-being, communicative behaviors, and interpersonal relationships, predominantly in adults. Five articles reported innovative research in children, triangulating EAR-observed behavioral data (e.g., on child conflict at home) with neuroendocrine assays, sociodemographic information, and parent report. Finally, we discussed psychometric, practical, and ethical considerations for conducting EAR research with children and families. Naturalistic observation methods such as the EAR have potential for pediatric psychology studies regarding trauma and the family environment. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. The inland water macro-invertebrate occurrences in Flanders, Belgium.

    PubMed

    Vannevel, Rudy; Brosens, Dimitri; De Cooman, Ward; Gabriels, Wim; Lavens, Frank; Mertens, Joost; Vervaeke, Bart

    2018-01-01

    The Flanders Environment Agency (VMM) has been performing biological water quality assessments on inland waters in Flanders (Belgium) since 1989 and sediment quality assessments since 2000. The water quality monitoring network is a combined physico-chemical and biological network, the biological component focusing on macro-invertebrates. The sediment monitoring programme produces biological data to assess the sediment quality. Both monitoring programmes aim to provide index values, applying a similar conceptual methodology based on the presence of macro-invertebrates. The biological data obtained from both monitoring networks are consolidated in the VMM macro-invertebrates database and include identifications at family and genus level of the freshwater phyla Coelenterata, Platyhelminthes, Annelida, Mollusca, and Arthropoda. This paper discusses the content of this database, and the dataset published thereof: 282,309 records of 210 observed taxa from 4,140 monitoring sites located on 657 different water bodies, collected during 22,663 events. This paper provides some background information on the methodology, temporal and spatial coverage, and taxonomy, and describes the content of the dataset. The data are distributed as open data under the Creative Commons CC-BY license.

  4. The use of phenomenology in mental health nursing research.

    PubMed

    Picton, Caroline Jane; Moxham, Lorna; Patterson, Christopher

    2017-12-18

    Historically, mental health research has been strongly influenced by the underlying positivism of the quantitative paradigm. Quantitative research dominates scientific enquiry and contributes significantly to understanding our natural world. It has also greatly benefitted the medical model of healthcare. However, the more literary, silent, qualitative approach is gaining prominence in human sciences research, particularly mental healthcare research. This paper examines the qualitative methodological assumptions of phenomenology to illustrate the benefits to mental health research of studying the experiences of people with mental illness. Phenomenology is well positioned to ask how people with mental illness reflect on their experiences. Phenomenological research is congruent with the principles of contemporary mental healthcare, as person-centred care is favoured at all levels of mental healthcare: treatment, service and research. Phenomenology is a highly appropriate and suitable methodology for mental health research, given that it includes people's experiences and enables silent voices to be heard. This overview of the development of phenomenology informs researchers new to phenomenological enquiry. ©2017 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  5. Developing mobile- and BIM-based integrated visual facility maintenance management system.

    PubMed

    Lin, Yu-Cheng; Su, Yu-Chih

    2013-01-01

    Facility maintenance management (FMM) has become an important topic for research on the operation phase of the construction life cycle. Managing FMM effectively is extremely difficult owing to various factors and environments. One of the difficulties is the performance of 2D graphics when depicting maintenance services. Building information modeling (BIM) uses precise geometry and relevant data to support the maintenance service of facilities depicted in 3D object-oriented CAD. This paper proposes a new and practical methodology with application to FMM using BIM technology. Using BIM technology, this study proposes a BIM-based facility maintenance management (BIMFMM) system for maintenance staff in the operation and maintenance phase. The BIMFMM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FMM practice. Using the BIMFMM system, maintenance staff can access and review 3D BIM models for updating related maintenance records in a digital format. Moreover, this study presents a generic system architecture and its implementation. The combined results demonstrate that a BIMFMM-like system can be an effective visual FMM tool.

  6. Proceedings of the Seminar on the DOD Computer Security Initiative (4th) Held at the National Bureau of Standards, Gaithersburg, Maryland on August 10-12, 1981.

    DTIC Science & Technology

    1981-01-01

    comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD...computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf... [Methodologies] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes

  7. A Computer-Based System Integrating Instruction and Information Retrieval: A Description of Some Methodological Considerations.

    ERIC Educational Resources Information Center

    Selig, Judith A.; And Others

    This report, summarizing the activities of the Vision Information Center (VIC) in the field of computer-assisted instruction from December, 1966 to August, 1967, describes the methodology used to load a large body of information--a programmed text on basic ophthalmology--onto a computer for subsequent information retrieval and computer-assisted…

  8. Modifiable worker risk factors contributing to workplace absence: a stakeholder-centred best-evidence synthesis of systematic reviews.

    PubMed

    Wagner, Shannon; White, Marc; Schultz, Izabela; Murray, Eleanor; Bradley, Susan M; Hsu, Vernita; McGuire, Lisa; Schulz, Werner

    2014-01-01

    A challenge facing stakeholders is the identification and translation of relevant high-quality research to inform policy and practice. This study engaged academic and community stakeholders in a best-evidence synthesis to identify modifiable worker disability risk and protective factors across common health conditions impacting work-related absence. We searched Medline, Embase, CINAHL, The Cochrane Library, PsycINFO, BusinessSourceComplete, and ABI/Inform from 2000 to 2011. Quantitative, qualitative, or mixed-methods systematic reviews of work-focused populations were considered for inclusion. Two or more reviewers independently reviewed articles for inclusion and methodological screening. The search strategy, expert input and grey literature identified 2,467 unique records. One hundred and forty-two full-text articles underwent comprehensive review. Twenty-four systematic reviews met eligibility criteria. Modifiable worker factors found to have consistent evidence across two or more health conditions included emotional distress, negative enduring psychological/personality factors, negative health and disability perception, decreased physical activity, lack of family support, poor general health, increased functional disability, increased pain, increased fatigue and lack of motivation to return to work. Systematic reviews are limited by the availability of high-quality studies, inconsistency of methodological screening and reporting, and variability of the outcome measures used.

  9. Acoustic Seabed Characterization of the Porcupine Bank, Irish Margin

    NASA Astrophysics Data System (ADS)

    O'Toole, Ronan; Monteys, Xavier

    2010-05-01

    The Porcupine Bank represents a large section of continental shelf situated west of the Irish landmass, located in water depths ranging between 150 and 500 m. Under the Irish National Seabed Survey (INSS 1999-2006) this area was comprehensively mapped, generating multiple acoustic datasets including high-resolution multibeam echosounder data. The unique nature of the area's datasets in terms of data density, consistency and geographic extent has allowed the development of a large-scale integrated physical characterization of the Porcupine Bank for multidisciplinary applications. Integrated analysis of backscatter and bathymetry data has resulted in a baseline delineation of sediment distribution, seabed geology and geomorphological features on the bank, along with an inclusive set of related database information. The methodology used incorporates a variety of statistical techniques which are necessary for isolating sonar system artefacts and addressing sonar-geometry-related issues. A number of acoustic backscatter parameters at several angles of incidence have been analysed in order to complement the characterization for both surface and subsurface sediments. Acoustic sub-bottom records have also been incorporated in order to investigate the physical characteristics of certain features on the Porcupine Bank. Where available, ground-truthing information in terms of sediment samples, video footage and cores has been applied to add physical descriptors and validation to the characterization. Extensive mapping of different rock outcrops, sediment drifts, seabed features and other geological classes has been achieved using this methodology.

  10. Design of a terminal solution for integration of in-home health care devices and services towards the Internet-of-Things

    NASA Astrophysics Data System (ADS)

    Pang, Zhibo; Zheng, Lirong; Tian, Junzhe; Kao-Walter, Sharon; Dubrova, Elena; Chen, Qiang

    2015-01-01

    In-home health care services based on the Internet-of-Things are promising to resolve the challenges caused by the ageing of the population. But the existing research is rather scattered and shows a lack of interoperability. In this article, a business-technology co-design methodology is proposed for cross-boundary integration of in-home health care devices and services. In this framework, three key elements of a solution (business model, device and service integration architecture, and information system integration architecture) are organically integrated and aligned. In particular, a cooperative Health-IoT ecosystem is formulated, and the information systems of all stakeholders are integrated in a cooperative health cloud as well as extended to patients' homes through the in-home health care station (IHHS). Design principles of the IHHS include the reuse of the 3C platform, certification of the Health Extension, interoperability and extendibility, convenient and trusted software distribution, standardised and secure electronic health care record handling, effective service composition, and efficient data fusion. These principles are applied to the design of an IHHS solution called iMedBox. A detailed device and service integration architecture and hardware and software architecture are presented and verified by an implemented prototype. The quantitative performance analysis and field trials have confirmed the feasibility of the proposed design methodology and solution.

  11. Methodological variation in economic evaluations conducted in low- and middle-income countries: information for reference case development.

    PubMed

    Santatiwongchai, Benjarin; Chantarastapornchit, Varit; Wilkinson, Thomas; Thiboonboon, Kittiphong; Rattanavipapong, Waranya; Walker, Damian G; Chalkidou, Kalipso; Teerawattananon, Yot

    2015-01-01

    Information generated from economic evaluation is increasingly being used to inform health resource allocation decisions globally, including in low- and middle-income countries. However, a crucial consideration for users of the information at a policy level, e.g. funding agencies, is whether the studies are comparable, provide sufficient detail to inform policy decision making, and incorporate inputs from data sources that are reliable and relevant to the context. This review was conducted to inform a methodological standardisation workstream at the Bill and Melinda Gates Foundation (BMGF) and assesses BMGF-funded cost-per-DALY economic evaluations in four programme areas (malaria, tuberculosis, HIV/AIDS and vaccines) in terms of variation in methodology, use of evidence, and quality of reporting. The findings suggest that there is room for improvement in the three areas of assessment, and support the case for the introduction of a standardised methodology or reference case by the BMGF. The findings are also instructive for all institutions that fund economic evaluations in LMICs and who have a desire to improve the ability of economic evaluations to inform resource allocation decisions.

  12. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    PubMed Central

    Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028
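    The quantitization step described above (turning coded qualitative segments into matrices of codes, not frequencies) can be sketched in a few lines of Python. This is purely illustrative: the codebook, segment identifiers and assigned codes below are invented, not taken from the study's observation instrument.

    ```python
    def build_code_matrix(coded_segments, codebook):
        """Turn (segment_id, assigned_codes) pairs into a binary code matrix.

        Each row represents one text segment; each column records whether a
        code from the (ad hoc) observation instrument was assigned to it.
        """
        matrix = []
        for segment_id, codes in coded_segments:
            row = [1 if code in codes else 0 for code in codebook]
            matrix.append((segment_id, row))
        return matrix

    # Hypothetical instrument and coded segments for illustration only.
    codebook = ["complaint", "request", "emotion"]
    coded_segments = [
        ("tweet-01", {"complaint", "emotion"}),
        ("tweet-02", {"request"}),
    ]
    for segment, row in build_code_matrix(coded_segments, codebook):
        print(segment, row)
    ```

    The resulting matrix of 0/1 codes (rather than frequency counts) is the kind of structure that can then be passed to quantitative pattern-detection techniques.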

  13. Understanding the neural basis of cognitive bias modification as a clinical treatment for depression.

    PubMed

    Eguchi, Akihiro; Walters, Daniel; Peerenboom, Nele; Dury, Hannah; Fox, Elaine; Stringer, Simon

    2017-03-01

    [Correction Notice: An Erratum for this article was reported in Vol 85(3) of Journal of Consulting and Clinical Psychology (see record 2017-07144-002). In the article, there was an error in the Discussion section's first paragraph for Implications and Future Work. The in-text reference citation for Penton-Voak et al. (2013) was incorrectly listed as "Blumenfeld, Preminger, Sagi, and Tsodyks (2006)". All versions of this article have been corrected.] Objective: Cognitive bias modification (CBM) eliminates cognitive biases toward negative information and is efficacious in reducing depression recurrence, but the mechanisms behind the bias elimination are not fully understood. The present study investigated, through computer simulation of neural network models, the neural dynamics underlying the use of CBM in eliminating the negative biases in the way that depressed patients evaluate facial expressions. We investigated 2 new CBM methodologies using biologically plausible synaptic learning mechanisms, continuous transformation learning and trace learning, which guide learning by exploiting either the spatial or temporal continuity between visual stimuli presented during training. We first describe simulations with a simplified 1-layer neural network, and then we describe simulations in a biologically detailed multilayer neural network model of the ventral visual pathway. After training with either the continuous transformation learning rule or the trace learning rule, the 1-layer neural network eliminated biases in interpreting neutral stimuli as sad. The multilayer neural network trained with realistic face stimuli was also shown to be able to use continuous transformation learning or trace learning to reduce biases in the interpretation of neutral stimuli. The simulation results suggest 2 biologically plausible synaptic learning mechanisms, continuous transformation learning and trace learning, that may subserve CBM. The results are highly informative for the development of experimental protocols to produce optimal CBM training methodologies with human participants. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Post-trial follow-up methodology in large randomized controlled trials: a systematic review protocol.

    PubMed

    Llewellyn-Bennett, Rebecca; Bowman, Louise; Bulbulia, Richard

    2016-12-15

    Clinical trials typically have a relatively short follow-up period, and may both underestimate the potential benefits of the treatments investigated and fail to detect hazards, which can take much longer to emerge. Prolonged follow-up of trial participants after the end of the scheduled trial period can provide important information on both efficacy and safety outcomes. This protocol describes a systematic review to qualitatively compare methods of post-trial follow-up used in large randomized controlled trials. A systematic search of electronic databases and clinical trial registries will use a predefined search strategy. All large (more than 1,000 adult participants) randomized controlled trials will be evaluated. Two reviewers will screen and extract data according to this protocol, with the aim of 95% concordance on the papers checked; discrepancies will be resolved by a third reviewer. Trial methods, participant retention rates and the prevalence of missing data will be recorded and compared. The potential for bias will be evaluated using the Cochrane Risk of Bias tool (applied to the methods used during the in-trial period), with the aim of investigating whether the quality of the post-trial follow-up methodology might be predicted by the quality of the methods used for the original trial. Post-trial follow-up can provide valuable information about the long-term benefits and hazards of medical interventions. However, it can be logistically challenging and costly. The aim of this systematic review is to describe how trial participants have been followed up post-trial in order to inform future post-trial follow-up designs. Not applicable for PROSPERO registration.
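    The dual-screening step in protocols like this one can be illustrated with a small concordance check. The reviewer decisions and the simple percent-agreement measure below are hypothetical illustrations, not the authors' actual procedure (which may use a different agreement statistic):

    ```python
    def percent_agreement(decisions_a, decisions_b):
        """Percent agreement between two reviewers' include/exclude calls."""
        assert len(decisions_a) == len(decisions_b), "one call per record each"
        agree = sum(a == b for a, b in zip(decisions_a, decisions_b))
        return agree / len(decisions_a)

    # Invented include (True) / exclude (False) decisions on five records.
    reviewer_a = [True, True, False, False, True]
    reviewer_b = [True, False, False, False, True]

    rate = percent_agreement(reviewer_a, reviewer_b)
    print(f"{rate:.0%} agreement")
    if rate < 0.95:
        print("below target: refer discrepancies to a third reviewer")
    ```

    In practice a chance-corrected statistic such as Cohen's kappa is often preferred over raw percent agreement, since two reviewers who mostly exclude records will agree often by chance alone.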

  15. Strengths and weaknesses of 'real-world' studies involving non-vitamin K antagonist oral anticoagulants.

    PubMed

    Camm, A John; Fox, Keith A A

    2018-01-01

    Randomised controlled trials (RCTs) provide the reference standard for comparing the efficacy of one therapy or intervention with another. However, RCTs have restrictive inclusion and exclusion criteria; thus, they are not fully representative of an unselected real-world population. Real-world evidence (RWE) studies encompass a wide range of research methodologies and data sources and can be broadly categorised as non-interventional studies, patient registries, claims database studies, patient surveys and electronic health record studies. If appropriately designed, RWE studies include a patient population that is far more representative of unselected patient populations than those of RCTs, but they do not provide a robust basis for comparing treatment strategies. RWE studies can have very large sample sizes, can provide information on treatments in patient groups that are usually excluded from RCTs, are generally less expensive and quicker than RCTs, and can assess a broad range of outcomes. Limitations of RWE studies can include low internal validity, lack of quality control surrounding data collection and susceptibility to multiple sources of bias for comparing outcomes. RWE studies can complement the findings from RCTs by providing valuable information on treatment practices and patient characteristics among unselected patients. This information is necessary to guide treatment decisions and for reimbursement and payment decisions. RWE studies have been extensively applied in the postmarketing approval assessment of non-vitamin K antagonist oral anticoagulants since 2010. However, the benefits, costs, limitations and methodological challenges associated with the different types of RWE must be considered carefully when interpreting the findings.

  16. Strengths and weaknesses of ‘real-world’ studies involving non-vitamin K antagonist oral anticoagulants

    PubMed Central

    Camm, A John; Fox, Keith A A

    2018-01-01

    Randomised controlled trials (RCTs) provide the reference standard for comparing the efficacy of one therapy or intervention with another. However, RCTs have restrictive inclusion and exclusion criteria; thus, they are not fully representative of an unselected real-world population. Real-world evidence (RWE) studies encompass a wide range of research methodologies and data sources and can be broadly categorised as non-interventional studies, patient registries, claims database studies, patient surveys and electronic health record studies. If appropriately designed, RWE studies include a patient population that is far more representative of unselected patient populations than those of RCTs, but they do not provide a robust basis for comparing treatment strategies. RWE studies can have very large sample sizes, can provide information on treatments in patient groups that are usually excluded from RCTs, are generally less expensive and quicker than RCTs, and can assess a broad range of outcomes. Limitations of RWE studies can include low internal validity, lack of quality control surrounding data collection and susceptibility to multiple sources of bias for comparing outcomes. RWE studies can complement the findings from RCTs by providing valuable information on treatment practices and patient characteristics among unselected patients. This information is necessary to guide treatment decisions and for reimbursement and payment decisions. RWE studies have been extensively applied in the postmarketing approval assessment of non-vitamin K antagonist oral anticoagulants since 2010. However, the benefits, costs, limitations and methodological challenges associated with the different types of RWE must be considered carefully when interpreting the findings. PMID:29713485

  17. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-09-25

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality, inter alia, of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting, including searching; the development of Standard Training Materials for Cochrane Reviews; and work on peer review of electronic search strategies. Almost all Cochrane Review Groups and some Cochrane Centres and Fields now have a Trials Search Co-ordinator responsible for study identification, and medical librarians and other information specialists are increasingly experienced in searching for studies for systematic reviews. Prospective registration of clinical trials is increasing, and searching trials registers is now mandatory for Cochrane Reviews, where relevant. Portals such as the WHO International Clinical Trials Registry Platform (ICTRP) are likely to become increasingly attractive, given concerns about the number of trials which may not be registered and/or published. The importance of access to information from regulatory and reimbursement agencies is likely to increase. Cross-database searching, gateways or portals and improved access to full-text databases will impact on how searches are conducted and reported, as will services such as Google Scholar, Scopus and Web of Science. Technologies such as textual analysis, semantic analysis, text mining and data linkage will have a major impact on the search process, but efficient and effective updating of reviews may remain a challenge. In twenty years' time, we envisage that the impact of universal social networking, as well as national and international legislation, will mean that all trials involving humans will be registered at inception and detailed trial results will be routinely available to all. Challenges will remain, however, to ensure the discoverability of relevant information in diverse and often complex sources and the availability of metadata to provide the most efficient access to information. We envisage an ongoing role for information professionals as experts in identifying new resources, researching efficient ways to link or mine them for relevant data and managing their content for the efficient production of systematic reviews.
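    One routine step behind combining records retrieved from MEDLINE, Embase and other sources into a single register is deduplication by a shared identifier. The sketch below is a hypothetical illustration of that step only; the field names and records are invented, and real register-building also matches on titles, authors and other metadata when identifiers are missing.

    ```python
    def deduplicate(records, key="pmid"):
        """Keep the first record seen for each identifier.

        Records lacking the identifier are kept as-is, since they cannot be
        matched automatically and need manual checking.
        """
        seen = set()
        unique = []
        for record in records:
            record_id = record.get(key)
            if record_id is None or record_id not in seen:
                unique.append(record)
                if record_id is not None:
                    seen.add(record_id)
        return unique

    # Invented records: the same trial report retrieved from two databases,
    # plus one hand-searched record with no identifier.
    records = [
        {"pmid": "100", "source": "MEDLINE"},
        {"pmid": "100", "source": "Embase"},
        {"pmid": "200", "source": "Embase"},
        {"source": "hand-search"},
    ]
    print(len(deduplicate(records)))  # 3
    ```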

  18. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed Central

    2013-01-01

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality, inter alia, of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting, including searching; the development of Standard Training Materials for Cochrane Reviews; and work on peer review of electronic search strategies. Almost all Cochrane Review Groups and some Cochrane Centres and Fields now have a Trials Search Co-ordinator responsible for study identification, and medical librarians and other information specialists are increasingly experienced in searching for studies for systematic reviews. Prospective registration of clinical trials is increasing, and searching trials registers is now mandatory for Cochrane Reviews, where relevant. Portals such as the WHO International Clinical Trials Registry Platform (ICTRP) are likely to become increasingly attractive, given concerns about the number of trials which may not be registered and/or published. The importance of access to information from regulatory and reimbursement agencies is likely to increase. Cross-database searching, gateways or portals and improved access to full-text databases will impact on how searches are conducted and reported, as will services such as Google Scholar, Scopus and Web of Science. Technologies such as textual analysis, semantic analysis, text mining and data linkage will have a major impact on the search process, but efficient and effective updating of reviews may remain a challenge. In twenty years' time, we envisage that the impact of universal social networking, as well as national and international legislation, will mean that all trials involving humans will be registered at inception and detailed trial results will be routinely available to all. Challenges will remain, however, to ensure the discoverability of relevant information in diverse and often complex sources and the availability of metadata to provide the most efficient access to information. We envisage an ongoing role for information professionals as experts in identifying new resources, researching efficient ways to link or mine them for relevant data and managing their content for the efficient production of systematic reviews. PMID:24066664

  19. Freedom, Flow and Fairness: Exploring How Children Develop Socially at School through Outdoor Play

    ERIC Educational Resources Information Center

    Waite, Sue; Rogers, Sue; Evans, Julie

    2013-01-01

    In this article, we report on a study that sought to discover micro-level social interactions in fluid outdoor learning spaces. Our methodology was centred around the children; our methods moved with them and captured their social interactions through mobile audio-recording. We argue that our methodological approach supported access to…

  20. Beyond "on" or "with": Questioning Power Dynamics and Knowledge Production in "Child-Oriented" Research Methodology

    ERIC Educational Resources Information Center

    Hunleth, Jean

    2011-01-01

    By taking a reflexive approach to research methodology, this article contributes to discussions on power dynamics and knowledge production in the social studies of children. The author describes and analyzes three research methods that she used with children--drawing, child-led tape-recording and focus group discussions. These methods were carried…

  1. A Dynamic System Approach to Willingness to Communicate: Developing an Idiodynamic Method to Capture Rapidly Changing Affect

    ERIC Educational Resources Information Center

    Macintyre, Peter D.; Legatto, James Jason

    2011-01-01

    Willingness to communicate (WTC) can be conceptualized as changing from moment to moment, as opportunities for second-language communication arise. In this study we present an idiodynamic methodology for studying rapid changes in WTC. The methodology consists of recording responses from six young adult, female speakers to second-language…

  2. Management Information System Based on the Balanced Scorecard

    ERIC Educational Resources Information Center

    Kettunen, Juha; Kantola, Ismo

    2005-01-01

    Purpose: This study seeks to describe the planning and implementation in Finland of a campus-wide management information system using a rigorous planning methodology. Design/methodology/approach: The structure of the management information system is planned on the basis of the management process, where strategic management and the balanced…

  3. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Treesearch

    Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  4. Opinion: Clarifying Two Controversies about Information Mapping's Method.

    ERIC Educational Resources Information Center

    Horn, Robert E.

    1992-01-01

    Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…

  5. High-density digital recording

    NASA Technical Reports Server (NTRS)

    Kalil, F. (Editor); Buschman, A. (Editor)

    1985-01-01

    The problems associated with high-density digital recording (HDDR) are discussed. The problems, solutions, and insights of five independent users of HDDR systems are provided as guidance for other users of HDDR systems. Various pulse code modulation coding techniques are reviewed. An introduction to error detection and correction, head optimization theory, and perpendicular recording is provided. Competing tape recorder manufacturers apply all of the above theories and techniques and present their offerings. The methodology used by the HDDR Users Subcommittee of THIC to evaluate parallel HDDR systems is presented.

  6. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…

  7. Does the Recording Medium Influence Phonetic Transcription of Cleft Palate Speech?

    ERIC Educational Resources Information Center

    Klintö, Kristina; Lohmander, Anette

    2017-01-01

    Background: In recent years, analyses of cleft palate speech based on phonetic transcriptions have become common. However, the results vary considerably among different studies. It cannot be excluded that differences in assessment methodology, including the recording medium, influence the results. Aims: To compare phonetic transcriptions from…

  8. 76 FR 66917 - Privacy Act of 1974; Notice To Amend an Existing System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... methodologies. DATES: The proposed amendment to this existing system of records will become effective without... that this amendment should not become effective on that date. Comments regarding this amendment must be...: Name, date and place of birth, social security number, citizenship status, grade, organization...

  9. Physician Sensemaking and Readiness for Electronic Medical Records

    ERIC Educational Resources Information Center

    Riesenmy, Kelly Rouse

    2010-01-01

    Purpose: The purpose of this paper is to explore physician sensemaking and readiness to implement electronic medical records (EMR) as a first step to finding strategies that enhance EMR adoption behaviors. Design/methodology/approach: The case study approach provides a detailed analysis of individuals within an organizational unit. Using a…

  10. The medical educator, the discourse analyst, and the phonetician: a collaborative feedback methodology for clinical communication.

    PubMed

    Woodward-Kron, Robyn; Stevens, Mary; Flynn, Eleanor

    2011-05-01

Frameworks for clinical communication assist educators in making explicit the principles of good communication and providing feedback to medical trainees. However, existing frameworks rarely take into account the roles of culture and language in communication, which can be important for international medical graduates (IMGs) whose first language is not English. This article describes the collaboration by a medical educator, a discourse analyst, and a phonetician to develop a communication and language feedback methodology to assist IMG trainees at a Victorian hospital in Australia with developing their doctor-patient communication skills. The Communication and Language Feedback (CaLF) methodology incorporates a written tool and video recording of role-plays of doctor-patient interactions in a classroom setting or in an objective structured clinical examination (OSCE) practice session with a simulated patient. IMG trainees receive verbal feedback from their hospital-based medical clinical educator, the simulated patient, and linguists. The CaLF tool was informed by a model of language in context, observation of IMG communication training, and process evaluation by IMG participants during January to August 2009. The authors provided participants with a feedback package containing their practice video (which included verbal feedback) and the completed CaLF tool. The CaLF methodology provides a tool for medical educators and language practitioners to work collaboratively with IMGs to enhance communication and language skills. The ongoing interdisciplinary collaboration also provides much-needed applied research opportunities in intercultural health communication, an area the authors believe cannot be adequately addressed from the perspective of one discipline alone. Copyright © by the Association of American Medical Colleges.

  11. How to Reduce Head CT Orders in Children with Hydrocephalus Using the Lean Six Sigma Methodology: Experience at a Major Quaternary Care Academic Children's Center.

    PubMed

    Tekes, A; Jackson, E M; Ogborn, J; Liang, S; Bledsoe, M; Durand, D J; Jallo, G; Huisman, T A G M

    2016-06-01

    Lean Six Sigma methodology is increasingly used to drive improvement in patient safety, quality of care, and cost-effectiveness throughout the US health care delivery system. To demonstrate our value as specialists, radiologists can combine lean methodologies along with imaging expertise to optimize imaging elements-of-care pathways. In this article, we describe a Lean Six Sigma project with the goal of reducing the relative use of pediatric head CTs in our population of patients with hydrocephalus by 50% within 6 months. We applied a Lean Six Sigma methodology using a multidisciplinary team at a quaternary care academic children's center. The existing baseline imaging practice for hydrocephalus was outlined in a Kaizen session, and potential interventions were discussed. An improved radiation-free workflow with ultrafast MR imaging was created. Baseline data were collected for 3 months by using the departmental radiology information system. Data collection continued postintervention and during the control phase (each for 3 months). The percentage of neuroimaging per technique (head CT, head ultrasound, ultrafast brain MR imaging, and routine brain MR imaging) was recorded during each phase. The improved workflow resulted in a 75% relative reduction in the percentage of hydrocephalus imaging performed by CT between the pre- and postintervention/control phases (Z-test, P = .0001). Our lean interventions in the pediatric hydrocephalus care pathway resulted in a significant reduction in head CT orders and increased use of ultrafast brain MR imaging. © 2016 by American Journal of Neuroradiology.
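
The pre/post comparison reported in this abstract (Z-test, P = .0001) is a standard two-proportion z-test. A sketch with hypothetical counts (the study's actual counts are not given in the abstract):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test: returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))        # two-sided tail probability
    return z, pval

# hypothetical: 80/100 studies by CT before, 20/100 after the intervention
z, p = two_proportion_z(80, 100, 20, 100)
```

With a drop of this size the p-value is vanishingly small, consistent with the kind of result the authors report.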

  12. 39 CFR 262.4 - Records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION RECORDS AND INFORMATION MANAGEMENT DEFINITIONS § 262.4 Records. Recorded information, regardless of media, format, or physical characteristics...) Corporate records. Those records series that are designated by the Records Office as containing information...

  13. 39 CFR 262.4 - Records.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION RECORDS AND INFORMATION MANAGEMENT DEFINITIONS § 262.4 Records. Recorded information, regardless of media, format, or physical characteristics...) Corporate records. Those records series that are designated by the Records Office as containing information...

  14. 39 CFR 262.4 - Records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION RECORDS AND INFORMATION MANAGEMENT DEFINITIONS § 262.4 Records. Recorded information, regardless of media, format, or physical characteristics...) Corporate records. Those records series that are designated by the Records Office as containing information...

  15. 39 CFR 262.4 - Records.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION RECORDS AND INFORMATION MANAGEMENT DEFINITIONS § 262.4 Records. Recorded information, regardless of media, format, or physical characteristics...) Corporate records. Those records series that are designated by the Records Office as containing information...

  16. 39 CFR 262.4 - Records.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION RECORDS AND INFORMATION MANAGEMENT DEFINITIONS § 262.4 Records. Recorded information, regardless of media, format, or physical characteristics...) Corporate records. Those records series that are designated by the Records Office as containing information...

  17. High-resolution behavioral mapping of electric fishes in Amazonian habitats.

    PubMed

    Madhav, Manu S; Jayakumar, Ravikrishnan P; Demir, Alican; Stamper, Sarah A; Fortune, Eric S; Cowan, Noah J

    2018-04-11

    The study of animal behavior has been revolutionized by sophisticated methodologies that identify and track individuals in video recordings. Video recording of behavior, however, is challenging for many species and habitats including fishes that live in turbid water. Here we present a methodology for identifying and localizing weakly electric fishes on the centimeter scale with subsecond temporal resolution based solely on the electric signals generated by each individual. These signals are recorded with a grid of electrodes and analyzed using a two-part algorithm that identifies the signals from each individual fish and then estimates the position and orientation of each fish using Bayesian inference. Interestingly, because this system involves eavesdropping on electrocommunication signals, it permits monitoring of complex social and physical interactions in the wild. This approach has potential for large-scale non-invasive monitoring of aquatic habitats in the Amazon basin and other tropical freshwater systems.
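
The localization step can be illustrated with a toy version: assume each fish's signal amplitude falls off with squared distance from an electrode, and search a grid of candidate positions for the one that best explains the measured amplitudes. With a flat prior and Gaussian measurement noise, this least-squares grid search is a crude stand-in for the paper's Bayesian position estimate; the decay model, grid geometry, and all names below are illustrative assumptions.

```python
electrodes = [(x, y) for x in range(5) for y in range(5)]  # hypothetical 5x5 grid

def amplitude(src, e, a0=1.0, eps=0.1):
    """Assumed forward model: amplitude decays with squared distance."""
    d2 = (src[0] - e[0]) ** 2 + (src[1] - e[1]) ** 2 + eps
    return a0 / d2

def localize(measured):
    """Grid search for the source position minimizing squared amplitude
    error (equivalent to a MAP estimate under flat prior + Gaussian noise)."""
    candidates = [i / 10 for i in range(41)]   # 0.0 .. 4.0 in 0.1 steps
    best, best_err = None, float("inf")
    for gx in candidates:
        for gy in candidates:
            err = sum((amplitude((gx, gy), e) - m) ** 2
                      for e, m in zip(electrodes, measured))
            if err < best_err:
                best, best_err = (gx, gy), err
    return best

true_src = (1.3, 2.7)
measured = [amplitude(true_src, e) for e in electrodes]  # noiseless measurements
est = localize(measured)
```

The actual system additionally identifies which signal belongs to which individual and estimates orientation; this sketch covers only the position step.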

  18. The Ideal Oriented Co-design Approach Revisited

    NASA Astrophysics Data System (ADS)

    Johnstone, Christina

A large number of different methodologies for developing information systems exist on the market, which implies that there are also a large number of "best" ways of developing those information systems. Avison and Fitzgerald (2003) state that every methodology is built on a philosophy, by which they mean the underlying attitudes and viewpoints, and the different assumptions and emphases, to be found within the specific methodology.

  19. Identification of redundant and synergetic circuits in triplets of electrophysiological data

    NASA Astrophysics Data System (ADS)

    Erramuzpe, Asier; Ortega, Guillermo J.; Pastor, Jesus; de Sola, Rafael G.; Marinazzo, Daniele; Stramaglia, Sebastiano; Cortes, Jesus M.

    2015-12-01

    Objective. Neural systems are comprised of interacting units, and relevant information regarding their function or malfunction can be inferred by analyzing the statistical dependencies between the activity of each unit. While correlations and mutual information are commonly used to characterize these dependencies, our objective here is to extend interactions to triplets of variables to better detect and characterize dynamic information transfer. Approach. Our approach relies on the measure of interaction information (II). The sign of II provides information as to the extent to which the interaction of variables in triplets is redundant (R) or synergetic (S). Three variables are said to be redundant when a third variable, say Z, added to a pair of variables (X, Y), diminishes the information shared between X and Y. Similarly, the interaction in the triplet is said to be synergetic when conditioning on Z enhances the information shared between X and Y with respect to the unconditioned state. Here, based on this approach, we calculated the R and S status for triplets of electrophysiological data recorded from drug-resistant patients with mesial temporal lobe epilepsy in order to study the spatial organization and dynamics of R and S close to the epileptogenic zone (the area responsible for seizure propagation). Main results. In terms of spatial organization, our results show that R matched the epileptogenic zone while S was distributed more in the surrounding area. In relation to dynamics, R made the largest contribution to high frequency bands (14-100 Hz), while S was expressed more strongly at lower frequencies (1-7 Hz). Thus, applying II to such clinical data reveals new aspects of epileptogenic structure in terms of the nature (redundancy versus synergy) and dynamics (fast versus slow rhythms) of the interactions. Significance. 
We expect that this methodology, which is robust and simple, can reveal new aspects beyond pair interactions in networks of interacting units in other settings with multi-recording data sets (and thus not necessarily in epilepsy, the pathology we have approached here).
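
The quantity at the heart of this approach can be computed directly from plug-in entropies: one common definition is II(X;Y;Z) = I(X;Y|Z) − I(X;Y), which expands into a signed sum of joint entropies. Sign conventions vary across the literature, so the redundancy/synergy mapping below is one convention rather than necessarily the authors' exact estimator. A sketch on synthetic binary data:

```python
import random
from collections import Counter
from math import log2

def H(*vars_):
    """Plug-in joint entropy (bits) of one or more aligned sample lists."""
    joint = list(zip(*vars_))
    n = len(joint)
    return -sum(c / n * log2(c / n) for c in Counter(joint).values())

def interaction_information(x, y, z):
    """II = I(X;Y|Z) - I(X;Y), expanded into joint entropies.
    Positive -> synergy, negative -> redundancy (in this convention)."""
    return (H(x, y) + H(x, z) + H(y, z)
            - H(x) - H(y) - H(z) - H(x, y, z))

rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(4000)]
y = [rng.randint(0, 1) for _ in range(4000)]
xor = [a ^ b for a, b in zip(x, y)]

syn = interaction_information(x, y, xor)  # XOR triplet: purely synergetic (~ +1 bit)
red = interaction_information(x, x, x)    # identical copies: purely redundant (~ -1 bit)
```

Conditioning on the XOR variable reveals one bit shared between X and Y that is invisible pairwise, while for three identical copies the third variable removes the bit X and Y already share.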

  20. Exploiting neurovascular coupling: a Bayesian sequential Monte Carlo approach applied to simulated EEG fNIRS data

    NASA Astrophysics Data System (ADS)

    Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria

    2017-08-01

    Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, such as oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.
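
A bootstrap particle filter of the kind used here is easy to sketch on a toy one-dimensional state-space model. The paper applies it to EEG/fNIRS forward models of neurovascular coupling; the scalar AR(1) dynamics, noise levels, and particle count below are illustrative assumptions only.

```python
import math
import random

def particle_filter(ys, n_particles, a=0.9, q=0.5, r=0.5, rng=None):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0, q^2),
    y_t = x_t + N(0, r^2): propagate, weight by likelihood, resample."""
    rng = rng or random.Random(0)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in ys:
        parts = [a * p + rng.gauss(0.0, q) for p in parts]            # propagate
        w = [math.exp(-((y - p) ** 2) / (2 * r * r)) for p in parts]  # likelihood
        tot = sum(w)
        w = [wi / tot for wi in w] if tot > 0 else [1.0 / n_particles] * n_particles
        means.append(sum(wi * pi for wi, pi in zip(w, parts)))        # posterior mean
        parts = rng.choices(parts, weights=w, k=n_particles)          # resample
    return means

# simulate a ground-truth trajectory and noisy observations
rng = random.Random(1)
xs, x = [], 0.0
for _ in range(100):
    x = 0.9 * x + rng.gauss(0.0, 0.5)
    xs.append(x)
ys = [xi + rng.gauss(0.0, 0.5) for xi in xs]

est = particle_filter(ys, 500, rng=random.Random(2))
rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(est, xs)) / len(xs))
```

The filtered estimate should track the latent state with smaller error than the raw observations; in the paper the same machinery is run jointly over electrical and hemodynamic measurements coupled through a physical model.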

  1. Automated identification of diagnosis and co-morbidity in clinical records.

    PubMed

    Cano, C; Blanco, A; Peshkin, L

    2009-01-01

Automated understanding of clinical records is a challenging task involving various legal and technical difficulties. Clinical free text is inherently redundant, unstructured, and full of acronyms, abbreviations and domain-specific language which make it challenging to mine automatically. There is much effort in the field focused on creating specialized ontologies, lexicons, and heuristics based on expert knowledge of the domain. However, ad-hoc solutions poorly generalize across diseases or diagnoses. This paper presents a successful approach for rapid prototyping of a diagnosis classifier based on a popular computational linguistics platform. The corpus consists of several hundred full-length discharge summaries provided by Partners Healthcare. The goal is to identify a diagnosis and assign co-morbidity. Our approach is based on the rapid implementation of a logistic regression classifier using an existing toolkit: LingPipe (http://alias-i.com/lingpipe). We implement and compare three different classifiers. The baseline approach uses character 5-grams as features. The second approach uses a bag-of-words representation enriched with a small additional set of features. The third approach reduces a feature set to the most informative features according to the information content. The proposed systems achieve high performance (average F-micro 0.92) for the task. We discuss the relative merit of the three classifiers. Supplementary material with detailed results is available at: http://decsai.ugr.es/~ccano/LR/supplementary_material/ We show that our methodology for rapid prototyping of a domain-unaware system is effective for building an accurate classifier for clinical records.
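
The baseline the authors describe (character 5-grams fed to a logistic regression classifier) was built on LingPipe, a Java toolkit; the same idea can be sketched in pure Python. The toy "discharge summaries", labels, and class name below are invented for illustration, and the tiny training set serves only to show the mechanics.

```python
import math

def char_ngrams(text, n=5):
    """Bag of character n-grams, the paper's baseline feature representation."""
    text = text.lower()
    grams = {}
    for i in range(len(text) - n + 1):
        g = text[i:i + n]
        grams[g] = grams.get(g, 0) + 1
    return grams

class NgramLogisticRegression:
    """Sparse logistic regression trained by stochastic gradient ascent."""
    def __init__(self, lr=0.5):
        self.w, self.b, self.lr = {}, 0.0, lr

    def _score(self, feats):
        return self.b + sum(self.w.get(g, 0.0) * c for g, c in feats.items())

    def predict_proba(self, text):
        return 1.0 / (1.0 + math.exp(-self._score(char_ngrams(text))))

    def fit(self, texts, labels, epochs=100):
        data = [(char_ngrams(t), y) for t, y in zip(texts, labels)]
        for _ in range(epochs):
            for feats, y in data:
                err = y - 1.0 / (1.0 + math.exp(-self._score(feats)))
                self.b += self.lr * err
                for g, c in feats.items():
                    self.w[g] = self.w.get(g, 0.0) + self.lr * err * c

docs = ["patient admitted with morbid obesity and hypertension",
        "obesity noted on physical examination",
        "admitted for asthma exacerbation; no obesity documented",
        "asthma treated with nebulized steroids"]
labels = [1, 1, 0, 0]  # 1 = obesity diagnosis present (toy labels)
clf = NgramLogisticRegression()
clf.fit(docs, labels)
```

The real system trained on hundreds of full-length summaries and additionally explored enriched bag-of-words features and information-content feature selection; this sketch covers only the character 5-gram baseline.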

  2. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems.

    PubMed

    Seo, Hwa Jeong; Kim, Hye Hyeon; Kim, Ju Han

    2011-09-01

Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. Through the use of SWOT analysis, it can be shown that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and managed systematically. An owner of medical information only grants data access to the specific person who gave the authorization for backup based on the mutual trust between an individual and an organization. By employing SWOT analysis, we concluded that a linkage among medical organizations or between an individual and an organization can provide an efficient backup system.

  3. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems

    PubMed Central

    Seo, Hwa Jeong; Kim, Hye Hyeon

    2011-01-01

Objectives Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Methods Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. Results Through the use of SWOT analysis, it can be shown that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and managed systematically. An owner of medical information only grants data access to the specific person who gave the authorization for backup based on the mutual trust between an individual and an organization. Conclusions By employing SWOT analysis, we concluded that a linkage among medical organizations or between an individual and an organization can provide an efficient backup system. PMID:22084811

  4. Approach to Managing MeaSURES Data at the GSFC Earth Science Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Vollmer, Bruce; Kempler, Steven J.; Ramapriyan, Hampapuram K.

    2009-01-01

A major need stated by the NASA Earth science research strategy is to develop long-term, consistent, and calibrated data and products that are valid across multiple missions and satellite sensors (NASA Solicitation for Making Earth System data records for Use in Research Environments (MEaSUREs), 2006-2010). Selected projects create long-term records of a given parameter, called Earth Science Data Records (ESDRs), based on mature algorithms that bring together continuous multi-sensor data. ESDRs and their associated algorithms, vetted by the appropriate community, are held at a NASA-affiliated data center for archiving, stewardship, and distribution. See http://measures-projects.gsfc.nasa.gov/ for more details. This presentation describes the NASA GSFC Earth Science Data and Information Services Center (GES DISC) approach to managing the MEaSUREs ESDR datasets assigned to GES DISC (energy/water-cycle-related and atmospheric composition ESDRs). GES DISC will utilize its experience to integrate existing and proven reusable data management components to accommodate the new ESDRs. Components include a data archive system (S4PA), a data discovery and access system (Mirador), and various web services for data access. In addition, if determined to be useful to the user community, the Giovanni data exploration tool will be made available for ESDRs. The GES DISC data integration methodology to be used for the MEaSUREs datasets is presented. The goals of this presentation are to share an approach to ESDR integration and to initiate discussions amongst the data centers, data managers, and data providers for the purpose of gaining efficiencies in data management for MEaSUREs projects.

  5. A Diabetes Dashboard and Physician Efficiency and Accuracy in Accessing Data Needed for High-Quality Diabetes Care

    PubMed Central

    Koopman, Richelle J.; Kochendorfer, Karl M.; Moore, Joi L.; Mehr, David R.; Wakefield, Douglas S.; Yadamsuren, Borchuluun; Coberly, Jared S.; Kruse, Robin L.; Wakefield, Bonnie J.; Belden, Jeffery L.

    2011-01-01

    PURPOSE We compared use of a new diabetes dashboard screen with use of a conventional approach of viewing multiple electronic health record (EHR) screens to find data needed for ambulatory diabetes care. METHODS We performed a usability study, including a quantitative time study and qualitative analysis of information-seeking behaviors. While being recorded with Morae Recorder software and “think-aloud” interview methods, 10 primary care physicians first searched their EHR for 10 diabetes data elements using a conventional approach for a simulated patient, and then using a new diabetes dashboard for another. We measured time, number of mouse clicks, and accuracy. Two coders analyzed think-aloud and interview data using grounded theory methodology. RESULTS The mean time needed to find all data elements was 5.5 minutes using the conventional approach vs 1.3 minutes using the diabetes dashboard (P <.001). Physicians correctly identified 94% of the data requested using the conventional method, vs 100% with the dashboard (P <.01). The mean number of mouse clicks was 60 for conventional searching vs 3 clicks with the diabetes dashboard (P <.001). A common theme was that in everyday practice, if physicians had to spend too much time searching for data, they would either continue without it or order a test again. CONCLUSIONS Using a patient-specific diabetes dashboard improves both the efficiency and accuracy of acquiring data needed for high-quality diabetes care. Usability analysis tools can provide important insights into the value of optimizing physician use of health information technologies. PMID:21911758

  6. Feminist-informed participatory action research: a methodology of choice for examining critical nursing issues.

    PubMed

    Corbett, Andrea M; Francis, Karen; Chapman, Ysanne

    2007-04-01

Identifying a methodology to guide a study that aims to enhance service delivery can be challenging. Participatory action research offers a solution to this challenge as it both informs and is informed by critical social theory. In addition, using a feminist lens helps position this approach as a suitable methodology for changing practice. This methodology embraces empowerment, self-determination, and the facilitation of agreed change as central tenets that guide the research process. Encouraged by the work of Foucault, Freire, Habermas, and Maguire, this paper explicates the philosophical assumptions underpinning critical social theory and outlines how feminist influences are complementary in exploring the processes and applications of nursing research that seeks to embrace change.

  7. Methodology to incorporate EIA in land-use ordering -- case study: The Cataniapo River basin, Venezuela

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sebastiani, M.; Llambi, L.D.; Marquez, E.

    1998-07-01

In Venezuela, the idea of tiering information between land-use ordering instruments and impact assessment is absent. In this article the authors explore a methodological alternative to bridge the information presented in land-use ordering instruments with the information requirements for impact assessment. The methodology is based on the steps carried out for an environmental impact assessment as well as on those considered to develop land-use ordering instruments. The methodology is applied to the territorial ordering plan and its proposal for the protection zone of the Cataniapo River basin. The purpose of the protection zone is to preserve the water quality and quantity of the river basin for human consumption.

  8. Approach to Teaching Research Methodology for Information Technology

    ERIC Educational Resources Information Center

    Steenkamp, Annette Lerine; McCord, Samual Alan

    2007-01-01

    The paper reports on an approach to teaching a course in information technology research methodology in a doctoral program, the Doctor of Management in Information Technology (DMIT), in which research, with focus on finding innovative solutions to problems found in practice, comprises a significant part of the degree. The approach makes a…

  9. PNNL Strategic Goods Testbed: A Data Library for Illicit Nuclear Trafficking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Jennifer B.; Erikson, Luke E.; Toomey, Christopher M.

    2014-05-12

Pacific Northwest National Laboratory (PNNL) has put significant effort into nonproliferation activities as an institution, both in terms of the classical nuclear material focused approach and in the examination of other strategic goods necessary to implement a nuclear program. To assist in these efforts, several projects in the Analysis in Motion (AIM) and Signature Discovery (SDI) Initiatives at PNNL are developing machine learning methodology for human-computer interaction in real time environments to assist analysts in this domain. All of these technical projects require access to data – whether it is in terms of detector data, shipping records, financial information, company relations, or other communications. The first question that mathematical and computational researchers come up with when asked to build analyst assist or automated tools is “What does the data look like?” They become frustrated when basic questions like this cannot be easily answered, and this can have the effect of pushing researchers away from the nuclear trafficking domain, especially in strategic commodity and export control areas where data sets cannot easily be generated through standard experimental techniques. For small projects that are building a proof of concept for their methodology, obtaining this data can be arduous and expensive. To relieve the burden of data collection from these projects and grow a lab-wide capability, the Strategic Goods Testbed Team has taken over data collection and placed subscriptions and access to flat data files in a centralized location so that all projects can benefit from these items. We have collected shipping data in the form of PIERS records, judicial information about export control cases, NAC data on the nuclear fuel industry, and financial data from Dun and Bradstreet, and our data sets continue to expand.
With a single access agreement, researchers in data-mining and other fields can utilize all of the records that have been downloaded, make requests through subscription services, and interact with other researchers through our interface. Our testbed team provides more than a simple static repository by working with researchers to refine their data needs and ensure data quality as well as quantity. We are currently working with laboratory- and initiative-specific management to examine effective ways for continuing data growth and sustainability.

  10. Preprocessing film-copied MRI for studying morphological brain changes.

    PubMed

    Pham, Tuan D; Eisenblätter, Uwe; Baune, Bernhard T; Berger, Klaus

    2009-06-15

Magnetic resonance imaging (MRI) of the brain is one of the important data items for studying memory and morbidity in the elderly, as these images can provide useful information through quantitative measures of various regions of interest of the brain. In an effort to fully automate the biomedical analysis of the brain, which can be combined with the genetic data of the same human population and where the records of the original MRI data are missing, this paper presents two effective methods for addressing this imaging problem. The first method handles the restoration of the film-copied MRI; the second involves the segmentation of the image data. Experimental results and comparisons with other methods suggest the usefulness of the proposed image analysis methodology.

  11. Cervical motion testing: methodology and clinical implications.

    PubMed

    Prushansky, Tamara; Dvir, Zeevi

    2008-09-01

    Measurement of cervical motion (CM) is probably the most commonly applied functional outcome measure in assessing the status of patients with cervical pathology. In general terms, CM refers to motion of the head relative to the trunk as well as conjunct motions within the cervical spine. Multiple techniques and instruments have been used for assessing CM. These were associated with a wide variety of parameters relating to accuracy, reproducibility, and validity. Modern measurement systems enable recording, processing, and documentation of CM with a high degree of precision. Cervical motion measures provide substantial information regarding the severity of motion limitation and level of effort in cervically involved patients. They may also be used for following up performance during and after conservative or invasive interventions.

  12. Security Protection on Trust Delegated Data in Public Mobile Networks

    NASA Astrophysics Data System (ADS)

    Weerasinghe, Dasun; Rajarajan, Muttukrishnan; Rakocevic, Veselin

    This paper provides detailed solutions for trust delegation and security protection for medical records in public mobile communication networks. The solutions presented in this paper enable the development of software for mobile devices that can be used by emergency medical units in urgent need of sensitive personal information about unconscious patients. In today's world, technical improvements in mobile communication systems mean that users can expect to have access to data at any time regardless of their location. This paper presents a token-based procedure for the data security at a mobile device and delegation of trust between a requesting mobile unit and secure medical data storage. The data security at the mobile device is enabled using identity based key generation methodology.

  13. An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

Crucial to an efficient aircraft simulation-based design is a robust data modeling methodology for both recording the information and providing data transfer readily and reliably. To meet this goal, data modeling issues involved in the aircraft multidisciplinary design are first analyzed in this study. Next, an XML-based, extensible data object model for multidisciplinary aircraft design is constructed and implemented. The implementation of the model through aircraft databinding allows design applications to access and manipulate any disciplinary data with a lightweight and easy-to-use API. In addition, language-independent representation of aircraft disciplinary data in the model fosters interoperability amongst heterogeneous systems, thereby facilitating data sharing and exchange between various design tools and systems.
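
The idea of binding disciplinary data to an XML representation behind a lightweight API can be sketched with Python's standard library (the element names and the "wing" object below are invented for illustration; the paper's actual schema is not given here):

```python
import xml.etree.ElementTree as ET

def wing_to_xml(span_m, area_m2, sweep_deg):
    """Serialize one hypothetical disciplinary data object to XML."""
    wing = ET.Element("wing")
    for tag, val in [("span_m", span_m), ("area_m2", area_m2),
                     ("sweep_deg", sweep_deg)]:
        ET.SubElement(wing, tag).text = str(val)
    return ET.tostring(wing, encoding="unicode")

def xml_to_wing(xml_text):
    """Databinding in the other direction: XML back to a typed object."""
    root = ET.fromstring(xml_text)
    return {child.tag: float(child.text) for child in root}

doc = wing_to_xml(35.8, 122.4, 25.0)   # what one design tool writes
obj = xml_to_wing(doc)                 # what another tool reads back
```

Because the interchange format is plain XML, any tool in the design loop can round-trip the same data regardless of implementation language, which is the interoperability point the abstract makes.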

  14. Structured prediction models for RNN based sequence labeling in clinical text.

    PubMed

    Jagannatha, Abhyuday N; Yu, Hong

    2016-11-01

    Sequence labeling is a widely used method for named entity recognition and information extraction from unstructured natural language data. In the clinical domain, one major application of sequence labeling involves extraction of medical entities, such as medication, indication, and side effects, from Electronic Health Record narratives. Sequence labeling in this domain presents its own set of challenges and objectives. In this work we experimented with various CRF-based structured learning models combined with recurrent neural networks. We extend the previously studied LSTM-CRF models with explicit modeling of pairwise potentials. We also propose an approximate version of skip-chain CRF inference with RNN potentials. We use these methodologies for structured prediction in order to improve the exact phrase detection of various medical entities.

  15. Structured prediction models for RNN based sequence labeling in clinical text

    PubMed Central

    Jagannatha, Abhyuday N; Yu, Hong

    2016-01-01

    Sequence labeling is a widely used method for named entity recognition and information extraction from unstructured natural language data. In the clinical domain, one major application of sequence labeling involves extraction of medical entities, such as medication, indication, and side effects, from Electronic Health Record narratives. Sequence labeling in this domain presents its own set of challenges and objectives. In this work we experimented with various CRF-based structured learning models combined with recurrent neural networks. We extend the previously studied LSTM-CRF models with explicit modeling of pairwise potentials. We also propose an approximate version of skip-chain CRF inference with RNN potentials. We use these methodologies for structured prediction in order to improve the exact phrase detection of various medical entities. PMID:28004040
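    The exact phrase detection evaluated in this work depends on decoding predicted tag sequences into entity spans. As an illustrative sketch (not the authors' code), the following decodes BIO-style tags, a common scheme for medication/indication/side-effect labeling, into (start, end, label) spans; the example sentence and tags are hypothetical:

```python
def bio_to_spans(tags):
    """Decode a BIO tag sequence into (start, end, label) entity spans.

    `end` is exclusive. An I- tag that does not continue the current
    entity is treated as starting a new one (a lenient convention).
    """
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:
                spans.append((start, i, label))
            start, label = i, tag[2:]
        elif tag.startswith("I-") and start is not None and label == tag[2:]:
            continue  # entity continues
        elif tag.startswith("I-"):
            if start is not None:
                spans.append((start, i, label))
            start, label = i, tag[2:]
        else:  # "O" tag closes any open entity
            if start is not None:
                spans.append((start, i, label))
            start, label = None, None
    if start is not None:
        spans.append((start, len(tags), label))
    return spans

# Hypothetical snippet: "Patient started lisinopril 10 mg for hypertension"
tags = ["O", "O", "B-Drug", "B-Dose", "I-Dose", "O", "B-Indication"]
print(bio_to_spans(tags))  # → [(2, 3, 'Drug'), (3, 5, 'Dose'), (6, 7, 'Indication')]
```

    Exact phrase detection then counts a prediction as correct only when the predicted span and label both match the gold span, which is why decoding conventions like the one above matter.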

  16. Incorporating ideas from computer-supported cooperative work.

    PubMed

    Pratt, Wanda; Reddy, Madhu C; McDonald, David W; Tarczy-Hornoch, Peter; Gennari, John H

    2004-04-01

    Many information systems have failed when deployed into complex health-care settings. We believe that one cause of these failures is the difficulty of systematically accounting for the collaborative and exception-filled nature of medical work. In this methodological review paper, we highlight research from the field of computer-supported cooperative work (CSCW) that could help biomedical informaticists recognize and design around the kinds of challenges that lead to unanticipated breakdowns and eventual abandonment of their systems. The field of CSCW studies how people collaborate with each other and the role that technology plays in this collaboration, across a wide variety of organizational settings. Thus, biomedical informaticists could benefit from the lessons learned by CSCW researchers. In this paper, we provide a focused review of CSCW methods and ideas: we review aspects of the field that could be applied to improve the design and deployment of medical information systems. To make our discussion concrete, we use electronic medical record systems as an example medical information system, and present three specific principles from CSCW: accounting for incentive structures, understanding workflow, and incorporating awareness.

  17. Burden of informal care giving to patients with psychoses: a descriptive and methodological study.

    PubMed

    Flyckt, Lena; Löthman, Anna; Jörgensen, Leif; Rylander, Anders; Koernig, Thomas

    2013-03-01

    There is a lack of studies on the size of the burden associated with informal care giving in psychosis. The aims were to evaluate the objective and subjective burden of informal care giving to patients with psychoses, and to compare a diary and a recall method for assessing objective burden. Patients and their informal caregivers were recruited from nine Swedish psychiatric outpatient centres. Subjective burden was assessed at inclusion using the CarerQoL and COPE index scales. The objective burden (time and money spent) was assessed by the caregivers daily, using diaries over four weeks, and by recall at the end of weeks 1 and 2. One hundred and seven patients (53% females; mean age 43 ± 11) and 118 informal caregivers (67%; 58 ± 15 years) were recruited. Informal caregivers spent 22.5 hours/week and about 14% of their gross income on care-related activities. The time spent was underestimated by two to 20 hours when assessed by recall rather than by daily diary records. The most prominent aspects of the subjective burden were mental problems. Despite the substantial amount of time and money spent on care giving, the informal caregivers perceived the mental aspects of burden as the most troublesome. The informal caregiver burden is considerable and should be taken into account when evaluating the effects of health care provided to patients with psychoses.

  18. Patient-Reported Safety Information: A Renaissance of Pharmacovigilance?

    PubMed

    Härmark, Linda; Raine, June; Leufkens, Hubert; Edwards, I Ralph; Moretti, Ugo; Sarinic, Viola Macolic; Kant, Agnes

    2016-10-01

    The role of patients as key contributors to pharmacovigilance was acknowledged in the new EU pharmacovigilance legislation. This contains several efforts to increase the involvement of the general public, including making patient adverse drug reaction (ADR) reporting systems mandatory. Three years have passed since the legislation was introduced and the key question is: does pharmacovigilance yet make optimal use of patient-reported safety information? Independent research has shown beyond doubt that patients make an important contribution to pharmacovigilance signal detection. Patient reports provide first-hand information about the suspected ADR and the circumstances under which it occurred, including medication errors, quality failures, and 'near misses'. Patient-reported safety information leads to a better understanding of the patient's experiences of the ADR. Patients are better than healthcare professionals at explaining the nature, personal significance and consequences of ADRs, and they give more detailed information regarding quality of life, including psychological effects and effects on everyday tasks. Current methods used in pharmacovigilance need to optimise the use of the information reported by patients. To make the most of information from patients, the systems we use for collecting, coding and recording patient-reported information, and the methodologies applied for signal detection and assessment, need to be further developed: for example, a patient-specific reporting form, development of severity grading, and evolution of the database structure and the signal detection methods applied. It is time for a renaissance of pharmacovigilance.

  19. A simple landslide susceptibility analysis for hazard and risk assessment in developing countries

    NASA Astrophysics Data System (ADS)

    Guinau, M.; Vilaplana, J. M.

    2003-04-01

    In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Working with polygonal maps in a GIS allowed us to develop a simple methodology, which was applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photography interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of landslides were digitized to obtain a failure zone digital map. A terrain unit digital map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the study area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map is superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this area using the degree of superposition between the susceptibility map and the failure zone map. The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows a landslide susceptibility analysis to be carried out in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
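    The zone-A calibration step can be sketched numerically. The function below is a hypothetical illustration, not the authors' GIS procedure: it cross-tabulates failed area against terrain-unit classes and normalises by the overall failure density, so classes scoring above 1.0 are more failure-prone than average. The class names and areas are invented:

```python
def susceptibility_index(units, failures):
    """Relative landslide density per terrain-unit class.

    units:    dict class -> total area (km2) in the calibration zone
    failures: dict class -> failed area (km2) within that class
    Returns class -> (class failure density) / (overall failure density),
    so values above 1.0 mark classes more prone to failure than average.
    """
    total_area = sum(units.values())
    total_fail = sum(failures.get(c, 0.0) for c in units)
    overall = total_fail / total_area
    return {c: (failures.get(c, 0.0) / a) / overall for c, a in units.items()}

# Hypothetical calibration data for two terrain classes (zone A)
units = {"steep_volcanic": 100.0, "gentle_alluvial": 300.0}
failures = {"steep_volcanic": 9.0, "gentle_alluvial": 3.0}
print(susceptibility_index(units, failures))
```

    Applied to zone B, the per-class indices would be mapped back onto the terrain units to produce the susceptibility map without needing a failure inventory there.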

  20. 40 CFR 98.247 - Records that must be retained.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Tier 4 Calculation Methodology in § 98.37. (b) If you comply with the mass balance methodology in § 98... with § 98.243(c)(4). (2) Start and end times and calculated carbon contents for time periods when off... determining carbon content of feedstock or product. (3) A part of the monitoring plan required under § 98.3(g...

  1. Methodological standards and patient-centeredness in comparative effectiveness research: the PCORI perspective.

    PubMed

    2012-04-18

    Rigorous methodological standards help to ensure that medical research produces information that is valid and generalizable, and are essential in patient-centered outcomes research (PCOR). Patient-centeredness refers to the extent to which the preferences, decision-making needs, and characteristics of patients are addressed, and is the key characteristic differentiating PCOR from comparative effectiveness research. The Patient Protection and Affordable Care Act signed into law in 2010 created the Patient-Centered Outcomes Research Institute (PCORI), which includes an independent, federally appointed Methodology Committee. The Methodology Committee is charged to develop methodological standards for PCOR. The 4 general areas identified by the committee in which standards will be developed are (1) prioritizing research questions, (2) using appropriate study designs and analyses, (3) incorporating patient perspectives throughout the research continuum, and (4) fostering efficient dissemination and implementation of results. A Congressionally mandated PCORI methodology report (to be issued in its first iteration in May 2012) will begin to provide standards in each of these areas, and will inform future PCORI funding announcements and review criteria. The work of the Methodology Committee is intended to enable generation of information that is relevant and trustworthy for patients, and to enable decisions that improve patient-centered outcomes.

  2. Review and evaluation of innovative technologies for measuring diet in nutritional epidemiology.

    PubMed

    Illner, A-K; Freisling, H; Boeing, H; Huybrechts, I; Crispim, S P; Slimani, N

    2012-08-01

    The use of innovative technologies is deemed to improve dietary assessment in various research settings. However, their relative merits in nutritional epidemiological studies, which require accurate quantitative estimates of usual intake at the individual level, still need to be evaluated. The aims were to report on the inventory of available innovative technologies for dietary assessment and to critically evaluate their strengths and weaknesses compared with the conventional methodologies (i.e. Food Frequency Questionnaires, food records, 24-hour dietary recalls) used in epidemiological studies. A list of currently available technologies was identified from English-language journals, using PubMed and Web of Science. The search criteria were principally based on the date of publication (between 1995 and 2011) and pre-defined search keywords. Six main groups of innovative technologies were identified ('Personal Digital Assistant-', 'Mobile-phone-', 'Interactive computer-', 'Web-', 'Camera- and tape-recorder-' and 'Scan- and sensor-based' technologies). Compared with conventional food records, Personal Digital Assistant and mobile phone devices seem to improve recording through the possibility of 'real-time' recording at eating events, but their validity for estimating individual dietary intakes was low to moderate. In 24-hour dietary recalls, there is still limited knowledge regarding the accuracy of fully automated approaches; and methodological problems, such as inaccuracy in self-reported portion sizes, might be more critical than in interview-based applications. In contrast, measurement errors in innovative web-based and in conventional paper-based Food Frequency Questionnaires are most likely similar, suggesting that the underlying methodology is unchanged by the technology. Most of the new technologies in dietary assessment were seen to have overlapping methodological features with the conventional methods predominantly used for nutritional epidemiology.
Their main potential to enhance dietary assessment is through more cost- and time-effective, less laborious ways of data collection and higher subject acceptance, though their integration in epidemiological studies would need additional considerations, such as the study objectives, the target population and the financial resources available. However, even in innovative technologies, the inherent individual bias related to self-reported dietary intake will not be resolved. More research is therefore crucial to investigate the validity of innovative dietary assessment technologies.

  3. The role of health informatics in clinical audit: part of the problem or key to the solution?

    PubMed

    Georgiou, Andrew; Pearson, Michael

    2002-05-01

    The concepts of quality assurance (of which clinical audit is an essential part), evaluation and clinical governance each depend on the ability to derive and record measurements that describe clinical performance. Rapid IT developments have raised many new possibilities for managing health care. They have allowed for easier collection and processing of data in greater quantities. These developments have encouraged the growth of quality assurance as a key feature of health care delivery. In the past most of the emphasis has been on hospital information systems designed predominantly for the administration of patients and the management of financial performance. Large, hi-tech information system capacity does not guarantee quality information. The task of producing information that can be confidently used to monitor the quality of clinical care requires attention to key aspects of the design and operation of the audit. The Myocardial Infarction National Audit Project (MINAP) utilizes an IT-based system to collect and process data on large numbers of patients and make them readily available to contributing hospitals. The project shows that IT systems that employ rigorous health informatics methodologies can do much to improve the monitoring and provision of health care.

  4. Complying with Executive Order 13148 using the Enterprise Environmental Safety And Occupational Health Management Information System.

    PubMed

    McFarland, Michael J; Nelson, Tim M; Rasmussen, Steve L; Palmer, Glenn R; Olivas, Arthur C

    2005-03-01

    All U.S. Department of Defense (DoD) facilities are required under Executive Order (EO) 13148, "Greening the Government through Leadership in Environmental Management," to establish quality-based environmental management systems (EMSs) that support environmental decision-making and verification of continuous environmental improvement by December 31, 2005. Compliance with EO 13148 as well as other federal, state, and local environmental regulations places a significant information management burden on DoD facilities. Cost-effective management of environmental data compels DoD facilities to establish robust database systems that not only address the complex and multifaceted environmental monitoring, record-keeping, and reporting requirements demanded by these rules but enable environmental management decision-makers to gauge improvements in environmental performance. The Enterprise Environmental Safety and Occupational Health Management Information System (EESOH-MIS) is a new electronic database developed by the U.S. Air Force to manage both the data needs associated with regulatory compliance programs across its facilities and the non-regulatory environmental information that supports installation business practices. The U.S. Air Force has adopted the Plan-Do-Check-Act methodology as the EMS standard that it will employ to address EO 13148 requirements.

  5. Natural Language Processing in Radiology: A Systematic Review.

    PubMed

    Pons, Ewoud; Braun, Loes M M; Hunink, M G Myriam; Kors, Jan A

    2016-05-01

    Radiological reporting has generated large quantities of digital content within the electronic health record, which is potentially a valuable source of information for improving clinical care and supporting research. Although radiology reports are stored for communication and documentation of diagnostic imaging, harnessing their potential requires efficient and automated information extraction: they exist mainly as free-text clinical narrative, from which it is a major challenge to obtain structured data. Natural language processing (NLP) provides techniques that aid the conversion of text into a structured representation, and thus enables computers to derive meaning from human (ie, natural language) input. Used on radiology reports, NLP techniques enable automatic identification and extraction of information. By exploring the various purposes for their use, this review examines how radiology benefits from NLP. A systematic literature search identified 67 relevant publications describing NLP methods that support practical applications in radiology. This review takes a close look at the individual studies in terms of tasks (ie, the extracted information), the NLP methodology and tools used, and their application purpose and performance results. Additionally, limitations, future challenges, and requirements for advancing NLP in radiology will be discussed. © RSNA, 2016. Online supplemental material is available for this article.

  6. Methodological Variation in Economic Evaluations Conducted in Low- and Middle-Income Countries: Information for Reference Case Development

    PubMed Central

    2015-01-01

    Information generated from economic evaluation is increasingly being used to inform health resource allocation decisions globally, including in low- and middle- income countries. However, a crucial consideration for users of the information at a policy level, e.g. funding agencies, is whether the studies are comparable, provide sufficient detail to inform policy decision making, and incorporate inputs from data sources that are reliable and relevant to the context. This review was conducted to inform a methodological standardisation workstream at the Bill and Melinda Gates Foundation (BMGF) and assesses BMGF-funded cost-per-DALY economic evaluations in four programme areas (malaria, tuberculosis, HIV/AIDS and vaccines) in terms of variation in methodology, use of evidence, and quality of reporting. The findings suggest that there is room for improvement in the three areas of assessment, and support the case for the introduction of a standardised methodology or reference case by the BMGF. The findings are also instructive for all institutions that fund economic evaluations in LMICs and who have a desire to improve the ability of economic evaluations to inform resource allocation decisions. PMID:25950443

  7. Validation of the Earthquake Archaeological Effects methodology by studying the San Clemente cemetery damages generated during the Lorca earthquake of 2011

    NASA Astrophysics Data System (ADS)

    Martín-González, Fidel; Martín-Velazquez, Silvia; Rodrigez-Pascua, Miguel Angel; Pérez-López, Raul; Silva, Pablo

    2014-05-01

    Intensity scales determine the damage caused by an earthquake. However, a new methodology takes into account not only the damage but also the type of damage, the "Earthquake Archaeological Effects" (EAEs), and its orientation (e.g. displaced masonry blocks, conjugated fractures, fallen and oriented columns, impact marks, dipping broken corners, etc.) (Rodriguez-Pascua et al., 2011; Giner-Robles et al., 2012). Its main contribution is that it focuses not only on the amount of damage but also on its orientation, giving information about the ground motion during the earthquake. These orientations can therefore be correlated with instrumental data and with historical earthquakes. In 2011 an earthquake of magnitude Mw 5.2 took place in Lorca (SE Spain), causing 9 casualties and 460 million Euros in repairs. The study of the EAEs was carried out through the whole city (Giner-Robles et al., 2012). The present study aimed to a) validate the EAE methodology by applying it to a single, small site, the cemetery of San Clemente in Lorca, and b) constrain the range of orientations for each EAE. This cemetery was selected because its damage orientation data can be correlated with the available instrumental information, and because the site offers: a) a wide variety of architectural styles (neo-Gothic, neo-Baroque, neo-Arabian), b) Cultural Interest (BIC) status, and c) different building materials (brick, limestone, marble). The procedure involved two main phases: a) inventory and identification of damage (EAEs) from photographs, and b) analysis of the damage orientations. The orientation was calculated for each EAE and plotted on maps. The results show a NW-SE damage orientation. This orientation is consistent with that recorded by the accelerometer in Lorca (N160°E) and with that obtained from the analysis of EAEs for the whole town of Lorca (N130°E) (Giner-Robles et al., 2012).
    Because an accelerometer record exists, the orientation of the peak ground acceleration is known, and we have been able to constrain the ranges of orientation for each EAE. The orientation of damage is not usually recorded after an earthquake; however, it can provide information on the seismic source of historical earthquakes. References: Giner-Robles, J. L., Perez-Lopez, R., Silva Barroso, P., Rodriguez-Pascua, M. A., Martin-Gonzalez, F. and Cabanas, L. 2012. Análisis estructural de daños orientados en el terremoto de Lorca del 11 de mayo de 2011. Aplicaciones en arqueosismología. Boletín Geológico y Minero, 123 (4): 503-513. Rodriguez-Pascua, M.A., Perez-Lopez, R., Silva, P.G., Giner-Robles, J.L., Garduno-Monroy, V.H. and Reicherter, K. 2011. A Comprehensive Classification of Earthquake Archaeological Effects (EAE) for Archaeoseismology. Quaternary International, 242, 20-30.
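    Averaging damage azimuths needs care because orientations are axial: a fracture trending N150°E is the same feature as one trending N330°E. A minimal sketch of the standard double-angle method for a mean orientation, with invented azimuths, is:

```python
import math

def mean_orientation(azimuths_deg):
    """Mean of axial data (orientations modulo 180 degrees).

    Double-angle trick: double each azimuth, sum the resulting unit
    vectors, then halve the angle of the vector sum.
    """
    s = sum(math.sin(math.radians(2 * a)) for a in azimuths_deg)
    c = sum(math.cos(math.radians(2 * a)) for a in azimuths_deg)
    return (math.degrees(math.atan2(s, c)) / 2) % 180

# Hypothetical damage azimuths clustered around NW-SE (~N150E)
print(round(mean_orientation([140, 150, 160, 155, 145]), 1))  # → 150.0
```

    A naive arithmetic mean would fail near the 0°/180° wrap-around, e.g. for azimuths of 170° and 0° the axial mean is 175°, not 85°.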

  8. Key Challenges of Using Video When Investigating Social Practices in Education: Contextualization, Magnification, and Representation

    ERIC Educational Resources Information Center

    Blikstad-Balas, Marte

    2017-01-01

    Audio- and video-recordings are increasingly popular data sources in contemporary qualitative research, making discussions about methodological implications of such recordings timelier than ever. This article goes beyond discussing practical issues and issues of "camera effect" and reactivity to identify three major challenges of using…

  9. Functional Requirements for Bibliographic Records: An Investigation of Two Prototypes

    ERIC Educational Resources Information Center

    Pisanski, Jan; Zumer, Maja

    2007-01-01

    Purpose: This paper aims to establish how the Functional Requirements for Bibliographic Records (FRBR) conceptual model, which holds a lot of potential in theory, works in practice. It also aims to identify, and if possible, give solutions to problems found in two of the existing prototypes. Design/methodology/approach: An independent evaluation…

  10. 76 FR 66325 - Agency Information Collection Activities: Proposed Collection, Comments Requested; E-FOIA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... Division Record Information Dissemination Section (RIDS) will be submitting the following information... collection: Records Management Division/ Record Information Dissemination Section, Federal Bureau of... requesters etc). Abstract: The Record/Information Dissemination Section (RIDS) effectively plans, develops...

  11. Single-shot speckle reduction in numerical reconstruction of digitally recorded holograms.

    PubMed

    Hincapie, Diego; Herrera-Ramírez, Jorge; Garcia-Sucerquia, Jorge

    2015-04-15

    A single-shot method to reduce the speckle noise in the numerical reconstructions of electronically recorded holograms is presented. A recorded hologram with dimensions N×M is split into S=T×T sub-holograms. The uncorrelated superposition of the individually reconstructed sub-holograms leads to an image with the speckle noise reduced in proportion to 1/S. Experimental results are presented to support the proposed methodology.
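    The partition and superposition steps can be illustrated with a toy example. The sketch below is an assumption about the layout (the abstract does not prescribe how the S sub-holograms are cut): it splits an N×M hologram into S = T×T contiguous blocks, and averages reconstructed intensity images pixel by pixel, the incoherent superposition behind the 1/S noise reduction. The actual numerical reconstruction (diffraction propagation of each sub-hologram) is outside this sketch:

```python
def split_blocks(holo, t):
    """Partition an N x M hologram (list of lists) into S = t*t blocks."""
    n, m = len(holo), len(holo[0])
    bn, bm = n // t, m // t
    blocks = []
    for i in range(t):
        for j in range(t):
            blocks.append([row[j * bm:(j + 1) * bm]
                           for row in holo[i * bn:(i + 1) * bn]])
    return blocks

def incoherent_average(intensities):
    """Average S reconstructed intensity images pixel by pixel."""
    s = len(intensities)
    rows, cols = len(intensities[0]), len(intensities[0][0])
    return [[sum(img[r][c] for img in intensities) / s for c in range(cols)]
            for r in range(rows)]

# 4x4 toy hologram split into S = 2x2 = 4 sub-holograms
holo = [[r * 4 + c for c in range(4)] for r in range(4)]
blocks = split_blocks(holo, 2)
print(len(blocks), blocks[0])  # → 4 [[0, 1], [4, 5]]
```

    Because the S speckle patterns are mutually uncorrelated, averaging their intensities reduces the speckle variance by the factor S while preserving the mean image.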

  12. 28 CFR 20.33 - Dissemination of criminal history record information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Dissemination of criminal history record... SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.33 Dissemination of criminal history record information. (a) Criminal history record information contained in the III System...

  13. 28 CFR 20.33 - Dissemination of criminal history record information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Dissemination of criminal history record... SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.33 Dissemination of criminal history record information. (a) Criminal history record information contained in the III System...

  14. 28 CFR 20.33 - Dissemination of criminal history record information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Dissemination of criminal history record... SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.33 Dissemination of criminal history record information. (a) Criminal history record information contained in the III System...

  15. 28 CFR 20.33 - Dissemination of criminal history record information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Dissemination of criminal history record... SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.33 Dissemination of criminal history record information. (a) Criminal history record information contained in the III System...

  16. 28 CFR 20.33 - Dissemination of criminal history record information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Dissemination of criminal history record... SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.33 Dissemination of criminal history record information. (a) Criminal history record information contained in the III System...

  17. Measurement-based auralization methodology for the assessment of noise mitigation measures

    NASA Astrophysics Data System (ADS)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditory preview of noise abatement measures for road traffic noise, based on the direction-dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road-traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the approach followed.
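    Applying a direction-dependent insertion loss to the directional channels reduces, per direction, to a dB-to-amplitude conversion. A minimal sketch with hypothetical channel data (in practice the per-angle IL values would come from the ISO 9613-2, Pierce or Harmonoise models named above):

```python
def apply_insertion_loss(channels, il_db):
    """Attenuate each directional channel by its angular insertion loss.

    channels: list of per-direction sample lists (e.g. beamformed from
              a spherical microphone array)
    il_db:    insertion loss in dB, one value per direction
    An insertion loss of x dB scales the pressure signal by 10**(-x / 20).
    """
    return [[s * 10 ** (-il / 20) for s in ch]
            for ch, il in zip(channels, il_db)]

# Hypothetical: two directions, 10 dB loss behind the mound, 0 dB elsewhere
out = apply_insertion_loss([[1.0, -1.0], [1.0, -1.0]], [10.0, 0.0])
print(out)
```

    Summing the attenuated directional signals then yields the auralized preview, with the unscreened directions left untouched so the ambient listening context is preserved.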

  18. Accuracies of breeding values for dry matter intake using nongenotyped animals and predictor traits in different lactations.

    PubMed

    Manzanilla-Pech, C I V; Veerkamp, R F; de Haas, Y; Calus, M P L; Ten Napel, J

    2017-11-01

    Given the interest in including dry matter intake (DMI) in the breeding goal, accurate estimated breeding values (EBV) for DMI are needed, preferably for separate lactations. Due to the limited number of DMI records available, 2 main approaches have been suggested to compute those EBV: (1) the inclusion of predictor traits, such as fat- and protein-corrected milk (FPCM) and live weight (LW), and (2) the addition of genomic information on animals, known as genomic prediction. Recently, several methodologies to estimate EBV utilizing genomic information have become available. In this study, a new method known as single-step ridge-regression BLUP (SSRR-BLUP) is suggested. The SSRR-BLUP method does not have an imposed limit on the number of genotyped animals, as the commonly used methods do. The objective of this study was to estimate genetic parameters using a relatively large data set with DMI records, and to compare the accuracies of the EBV for DMI. These accuracies were obtained using 4 different methods: BLUP (using pedigree for all animals with phenotypes), genomic BLUP (GBLUP; only for genotyped animals), single-step GBLUP (SS-GBLUP), and SSRR-BLUP (for genotyped and nongenotyped animals). Records from different lactations, with or without predictor traits (FPCM and LW), were used in the model. Accuracies of EBV for DMI (defined as the correlation between the EBV and pre-adjusted DMI phenotypes divided by the average accuracy of those phenotypes) ranged between 0.21 and 0.38 across methods and scenarios. Accuracies of EBV for DMI using BLUP were the lowest obtained across methods. Meanwhile, accuracies of EBV for DMI were similar in SS-GBLUP and SSRR-BLUP, and lower for the GBLUP method. Hence, SSRR-BLUP could be used when the number of genotyped animals is large, avoiding construction of the inverse genomic relationship matrix.
    Adding information on DMI from different lactations to the reference population gave higher accuracies than including lactation 1 only. Finally, no benefit was obtained from adding information on predictor traits to the reference population when DMI was already included. However, in the absence of DMI records, having records on FPCM and LW from different lactations is a good way to obtain EBV with relatively good accuracy. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
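    The accuracy definition quoted above can be made concrete. The sketch below is illustrative only, with invented numbers: it computes the Pearson correlation between candidate EBV and pre-adjusted phenotypes and divides by the average phenotype accuracy:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def ebv_accuracy(ebv, phenotypes, mean_phenotype_accuracy):
    """Accuracy as defined in the abstract: cor(EBV, pre-adjusted
    phenotypes) divided by the average accuracy of those phenotypes."""
    return pearson(ebv, phenotypes) / mean_phenotype_accuracy

# Invented example: 4 animals' EBV vs. pre-adjusted DMI phenotypes
r = pearson([1.0, 2.0, 3.0, 4.0], [1.5, 1.0, 3.5, 3.0])
print(round(r, 3))  # → 0.759
```

    Dividing by the average phenotype accuracy corrects for the fact that the phenotypes themselves measure the genetic merit imperfectly.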

  19. The Perceptions of U.S.-Based IT Security Professionals about the Effectiveness of IT Security Frameworks: A Quantitative Study

    ERIC Educational Resources Information Center

    Warfield, Douglas L.

    2011-01-01

    The evolution of information technology has included new methodologies that use information technology to control and manage various industries and government activities. Information Technology has also evolved as its own industry with global networks of interconnectivity, such as the Internet, and frameworks, models, and methodologies to control…

  20. 36 CFR 1290.3 - Sources of assassination records and additional records and information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 36 (Parks, Forests, and Public Property), Volume 3, revised as of 2010-07-01. Section 1290.3 - Sources of assassination records and additional records and information.

  1. Using TELOS for the planning of the information system audit

    NASA Astrophysics Data System (ADS)

    Drljaca, D. P.; Latinovic, B.

    2018-01-01

    The intent of this paper is to analyse different aspects of information system audit and to synthesise them into a feasibility study report, in order to facilitate decision making and the planning of the information system audit process. The TELOS methodology provides a comprehensive and holistic framework for conducting a feasibility study in general. This paper examines the use of TELOS in the identification of possible factors that may influence the decision to implement an information system audit. The research question is whether TELOS provides decision makers with sufficient information to plan an information system audit. It was found that the TELOS methodology can be successfully applied in the process of approving and planning an information system audit. The five aspects of the feasibility study, if assessed objectively, can provide decision makers with sufficient information to commission an information system audit, and also contribute to better planning of the audit. Using the TELOS methodology can ensure an evidence-based and cost-effective decision-making process and facilitate planning of the audit. The paper proposes an original approach, not examined until now: TELOS is usually applied for other purposes when a feasibility study is needed, but not in the planning of an information system audit. This gives the paper its originality and opens further research questions about evaluation of the feasibility study and possible research on comparative and complementary methodologies.
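The five TELOS aspects (technical, economic, legal, operational, and scheduling feasibility) lend themselves to a simple weighted scoring of an audit proposal. The sketch below is a hypothetical illustration of such scoring; the weights and 0-10 scale are assumptions, not part of the paper:

```python
# Illustrative TELOS feasibility scoring for deciding whether to commission
# an information-system audit. Scores and weights are hypothetical.
TELOS_ASPECTS = ("technical", "economic", "legal", "operational", "scheduling")

def feasibility_score(scores, weights=None):
    """Weighted mean of the five TELOS aspect scores (each on a 0-10 scale)."""
    if weights is None:
        weights = {a: 1.0 for a in TELOS_ASPECTS}
    total_w = sum(weights[a] for a in TELOS_ASPECTS)
    return sum(scores[a] * weights[a] for a in TELOS_ASPECTS) / total_w

audit_case = {"technical": 8, "economic": 6, "legal": 9,
              "operational": 7, "scheduling": 5}
print(feasibility_score(audit_case))  # 7.0
```

A decision maker would compare this aggregate against a go/no-go threshold while still inspecting each aspect individually, since a single failing aspect (e.g. legal) may veto the project regardless of the average.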

  2. Selected Streamflow Statistics and Regression Equations for Predicting Statistics at Stream Locations in Monroe County, Pennsylvania

    USGS Publications Warehouse

    Thompson, Ronald E.; Hoffman, Scott A.

    2006-01-01

    A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations to concurrent daily mean flows at continuous-record index stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The methodology used to develop the regression equations was originally devised for estimating low-flow frequencies. This study and a companion study found that the methodology also has potential application for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and of the equations used to predict them. Caution is indicated in using the predicted statistics for small drainage areas.
Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
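Regional regression equations of this kind are commonly fitted in log space, relating a flow statistic to basin characteristics such as drainage area. A minimal sketch on simulated basins (the coefficients and data are invented, not the report's actual equations):

```python
import numpy as np

# Simulated gaged basins: drainage area (mi^2) and a low-flow statistic (cfs).
rng = np.random.default_rng(7)
area = rng.uniform(5, 200, size=17)
q_low = 0.05 * area**1.1 * rng.lognormal(sigma=0.1, size=17)

# Fit log10(Q) = b0 + b1 * log10(area) by ordinary least squares.
X = np.column_stack([np.ones_like(area), np.log10(area)])
b, *_ = np.linalg.lstsq(X, np.log10(q_low), rcond=None)

def predict_statistic(drainage_area):
    """Predict the flow statistic at an ungaged site from drainage area."""
    return 10 ** (b[0] + b[1] * np.log10(drainage_area))
```

Fitting in log space keeps predictions positive and matches the roughly multiplicative scaling of flow with basin size; real regional equations typically add further basin characteristics (precipitation, soils, relief) as extra regressors.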

  3. Using barometric time series of the IMS infrasound network for a global analysis of thermally induced atmospheric tides

    NASA Astrophysics Data System (ADS)

    Hupe, Patrick; Ceranna, Lars; Pilger, Christoph

    2018-04-01

    The International Monitoring System (IMS) has been established to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty and comprises four technologies, one of which is infrasound. When fully established, the IMS infrasound network consists of 60 sites uniformly distributed around the globe. Besides its primary purpose of detecting explosions in the atmosphere, the recorded data reveal information on other anthropogenic and natural infrasound sources. Furthermore, the almost continuous multi-year recordings of differential and absolute air pressure allow for analysing the atmospheric conditions. In this paper, spectral analysis tools are applied to derive atmospheric dynamics from barometric time series. Based on the solar atmospheric tides, a methodology for performing geographic and temporal variability analyses is presented, which is intended to serve upcoming studies related to atmospheric dynamics. The added value of using the IMS infrasound network data for such purposes is demonstrated by comparing the findings on the thermal tides with previous studies and the Modern-Era Retrospective analysis for Research and Applications Version 2 (MERRA-2), which represents the solar tides well in its surface pressure fields. Absolute air pressure recordings reveal geographical characteristics of atmospheric tides related to the solar day and even to the lunar day. We therefore conclude that the chosen methodology of using the IMS infrasound network is applicable to global and temporal studies of specific atmospheric dynamics. Given the accuracy and high temporal resolution of the barometric data from the IMS infrasound network, interactions with gravity waves and planetary waves can be examined in the future to refine the knowledge of atmospheric dynamics, e.g. the origin of tidal harmonics up to 9 cycles per day as found in the barometric data sets.
Data assimilation in empirical models of solar tides would be a valuable application of the IMS infrasound data.
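Extracting the solar tidal harmonics S1 (1 cycle per day) and S2 (2 cycles per day) from a barometric time series amounts to reading the amplitude spectrum at those frequencies. A minimal sketch on a synthetic hourly pressure series standing in for IMS data:

```python
import numpy as np

# Synthetic 30-day hourly surface-pressure series containing an S1 tide
# of 100 Pa and an S2 tide of 50 Pa (amplitudes chosen for illustration).
hours = np.arange(30 * 24)
t_days = hours / 24.0
pressure = (100 * np.sin(2 * np.pi * 1 * t_days) +   # S1: 1 cycle/day
            50 * np.sin(2 * np.pi * 2 * t_days))     # S2: 2 cycles/day

spectrum = np.fft.rfft(pressure)
freqs = np.fft.rfftfreq(len(pressure), d=1 / 24.0)   # in cycles per day
amplitude = 2 * np.abs(spectrum) / len(pressure)

def harmonic_amplitude(cpd):
    """Amplitude at the spectral bin closest to `cpd` cycles per day."""
    return amplitude[np.argmin(np.abs(freqs - cpd))]
```

With a record length that is an integer number of days, the tidal frequencies fall exactly on spectral bins, so `harmonic_amplitude(1.0)` and `harmonic_amplitude(2.0)` recover the S1 and S2 amplitudes directly; real barometric data would additionally require detrending and averaging over many windows.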

  4. Building a house on shifting sand: methodological considerations when evaluating the implementation and adoption of national electronic health record systems

    PubMed Central

    2012-01-01

    Background A commitment to Electronic Health Record (EHR) systems now constitutes a core part of many governments’ healthcare reform strategies. The resulting politically-initiated large-scale or national EHR endeavors are challenging because of their ambitious agendas of change, the scale of resources needed to make them work, the (relatively) short timescales set, and the large number of stakeholders involved, all of whom pursue somewhat different interests. These initiatives need to be evaluated to establish if they improve care and represent value for money. Methods Critical reflections on these complexities in the light of experience of undertaking the first national, longitudinal, and sociotechnical evaluation of the implementation and adoption of England’s National Health Service’s Care Records Service (NHS CRS). Results/discussion We advance two key arguments. First, national programs for EHR implementations are likely to take place in the shifting sands of evolving sociopolitical and sociotechnical contexts, which are likely to shape them in significant ways. This poses challenges to conventional evaluation approaches which draw on a model of baseline operations → intervention → changed operations (outcome). Second, evaluation of such programs must account for this changing context by adapting to it. This requires careful and creative choice of ontological, epistemological and methodological assumptions. Summary New and significant challenges are faced in evaluating national EHR implementation endeavors.
Based on experiences from this national evaluation of the implementation and adoption of the NHS CRS in England, we argue for an approach to these evaluations which moves away from seeing EHR systems as Information and Communication Technologies (ICT) projects requiring an essentially outcome-centred assessment towards a more interpretive approach that reflects the situated and evolving nature of EHR seen within multiple specific settings and reflecting a constantly changing milieu of policies, strategies and software, with constant interactions across such boundaries. PMID:22545646

  5. Building a house on shifting sand: methodological considerations when evaluating the implementation and adoption of national electronic health record systems.

    PubMed

    Takian, Amirhossein; Petrakaki, Dimitra; Cornford, Tony; Sheikh, Aziz; Barber, Nicholas

    2012-04-30

    A commitment to Electronic Health Record (EHR) systems now constitutes a core part of many governments' healthcare reform strategies. The resulting politically-initiated large-scale or national EHR endeavors are challenging because of their ambitious agendas of change, the scale of resources needed to make them work, the (relatively) short timescales set, and the large number of stakeholders involved, all of whom pursue somewhat different interests. These initiatives need to be evaluated to establish if they improve care and represent value for money. Critical reflections on these complexities in the light of experience of undertaking the first national, longitudinal, and sociotechnical evaluation of the implementation and adoption of England's National Health Service's Care Records Service (NHS CRS). We advance two key arguments. First, national programs for EHR implementations are likely to take place in the shifting sands of evolving sociopolitical and sociotechnical contexts, which are likely to shape them in significant ways. This poses challenges to conventional evaluation approaches which draw on a model of baseline operations → intervention → changed operations (outcome). Second, evaluation of such programs must account for this changing context by adapting to it. This requires careful and creative choice of ontological, epistemological and methodological assumptions. New and significant challenges are faced in evaluating national EHR implementation endeavors.
Based on experiences from this national evaluation of the implementation and adoption of the NHS CRS in England, we argue for an approach to these evaluations which moves away from seeing EHR systems as Information and Communication Technologies (ICT) projects requiring an essentially outcome-centred assessment towards a more interpretive approach that reflects the situated and evolving nature of EHR seen within multiple specific settings and reflecting a constantly changing milieu of policies, strategies and software, with constant interactions across such boundaries.

  6. SOAP Methodology in General Practice/Family Medicine Teaching in Practical Context.

    PubMed

    Santiago, Luiz Miguel; Neto, Isabel

    2016-12-30

    Medical records in General Practice/Family Medicine are an essential source of information on the health status of the patient and a communication document between health professionals. The development of competencies in General Practice/Family Medicine during pre-graduation must include the ability to make adequate medical records in a practical context. As of 2012, medicine students at the University of Beira Interior have been performing visits using the Subjective, Objective, Assessment and Plan - SOAP methodology, with a performance evaluation of the visit, with the aim of checking on which Subjective, Objective, Assessment and Plan - SOAP aspects students reveal the most difficulties, in order to define improvement techniques and to correlate the patient grade with the tutor evaluation. We analysed the evaluation data for the 2015 - 2016 school year for the General Practice/Family Medicine visit carried out by fourth-year students in medicine, comparing the averages of each item in the Subjective, Objective, Assessment and Plan - SOAP checklist and the patient evaluation. In the Subjective, Objective, Assessment and Plan - SOAP, 29.7% of students are in the best grade quartile, 37.1% are in the best competencies quartile and 27.2% in the best patient grade quartile. 'Evolution was verified/noted' received the worst grades in Subjective, 'Record of physical examination focused on the problem of the visit' received the worst grades in Objective, 'Notes of diagnostic reasoning / differential diagnosis' received the worst grades in Assessment and 'Negotiation of aims to achieve' received the worst grades in Plan. The best tutor evaluation is found in 'communication'. Only one previous study evaluated students' performance under examination during a visit, with results similar to the present one, and none addressed the patient's evaluation. Students revealed a good performance in using the Subjective, Objective, Assessment and Plan - SOAP.
The findings represent the beginning of the introduction of the Subjective, Objective, Assessment and Plan - SOAP to the students. This evaluation breaks ground towards better ways to teach the most difficult aspects.

  7. Using Nocturnal Flight Calls to Assess the Fall Migration of Warblers and Sparrows along a Coastal Ecological Barrier

    PubMed Central

    Smith, Adam D.; Paton, Peter W. C.; McWilliams, Scott R.

    2014-01-01

    Atmospheric conditions fundamentally influence the timing, intensity, energetics, and geography of avian migration. While radar is typically used to infer the influence of weather on the magnitude and spatiotemporal patterns of nocturnal bird migration, monitoring the flight calls produced by many bird species during nocturnal migration represents an alternative methodology and provides information regarding the species composition of nocturnal migration. We used nocturnal flight call (NFC) recordings of at least 22 migratory songbirds (14 warbler and 8 sparrow species) during fall migration from eight sites along the mainland and island coasts of Rhode Island to evaluate five hypotheses regarding NFC detections. Patterns of warbler and sparrow NFC detections largely supported our expectations in that (1) NFC detections associated positively and strongly with wind conditions that influence the intensity of coastal bird migration and negatively with regional precipitation; (2) NFCs increased during conditions with reduced visibility (e.g., high cloud cover); (3) NFCs decreased with higher wind speeds, presumably due mostly to increased ambient noise; and (4) coastal mainland sites recorded five to nine times more NFCs, on average, than coastal nearshore or offshore island sites. However, we found little evidence that (5) nightly or intra-night patterns of NFCs reflected the well-documented latitudinal patterns of migrant abundance on an offshore island. Despite some potential complications in inferring migration intensity and species composition from NFC data, the acoustic monitoring of NFCs provides a viable and complementary methodology for exploring the spatiotemporal patterns of songbird migration as well as evaluating the atmospheric conditions that shape these patterns. PMID:24643060

  8. The value of using DNA markers for beef bull selection in the seedstock sector.

    PubMed

    Van Eenennaam, A L; van der Werf, J H J; Goddard, M E

    2011-02-01

    The objective of this study was to estimate the value derived from using DNA information to increase the accuracy of beef sire selection in a closed seedstock herd. Breeding objectives for commercial production systems targeting 2 diverse markets were examined using multiple-trait selection indexes developed for the Australian cattle industry. Indexes included those for both maternal (self-replacing) and terminal herds targeting either a domestic market, where steers are finished on pasture, or the export market, where steers are finished on concentrate rations in feedlots and marbling has a large value. Selection index theory was used to predict the response to conventional selection based on phenotypic performance records, and this was compared with including information from 2 hypothetical marker panels. In 1 case the marker panel explained a percentage of additive genetic variance equal to the heritability for all traits in the breeding objective and selection criteria, and in the other case to one-half of this amount. Discounted gene flow methodology was used to calculate the value derived from the use of superior bulls selected using DNA test information and performance recording over that derived from conventional selection using performance recording alone. Results were ultimately calculated as discounted returns per DNA test purchased by the seedstock operator. The DNA testing using these hypothetical marker panels increased the selection response by 29 to 158%. The value of this improvement above that obtained using traditional performance recording ranged from $89 to $565 per commercial bull, and $5,332 to $27,910 per stud bull. Assuming that the entire bull calf crop was tested to achieve these gains, the value of the genetic gain derived from DNA testing ranged from $204 to $1,119 per test.
All values assumed that the benefits derived from using superior bulls were efficiently transferred along the production chain to the seedstock producer incurring the costs of genotyping. These results suggest that the development of greater-accuracy DNA tests for beef cattle selection could be beneficial from an industry-wide perspective, but the commercial viability will strongly depend on price signaling throughout the production chain.
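The discounted-returns logic can be illustrated with a toy present-value calculation: extra returns from a genetically superior bull accrue over several years and are discounted back to the year the DNA test is purchased. The figures and discount rate below are made up for illustration, not taken from the study:

```python
# Illustrative discounted-gene-flow style calculation: the stream of extra
# annual returns from a superior bull, discounted to present value.
def discounted_value(annual_benefits, rate=0.07):
    """Present value of a list of annual benefits (year 1, year 2, ...)."""
    return sum(b / (1 + rate) ** t
               for t, b in enumerate(annual_benefits, start=1))

# e.g. $150/year of extra progeny value expressed over 5 years
pv = discounted_value([150] * 5)
print(round(pv, 2))
```

Full discounted gene flow additionally tracks how the superior genes spread through descendants and discounts each generation's expression of the trait, which is why benefits differ so strongly between commercial and stud bulls.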

  9. Information for decision making from imperfect national data: tracking major changes in health care use in Kenya using geostatistics.

    PubMed

    Gething, Peter W; Noor, Abdisalan M; Goodman, Catherine A; Gikandi, Priscilla W; Hay, Simon I; Sharif, Shahnaaz K; Atkinson, Peter M; Snow, Robert W

    2007-12-11

    Most Ministries of Health across Africa invest substantial resources in some form of health management information system (HMIS) to coordinate the routine acquisition and compilation of monthly treatment and attendance records from health facilities nationwide. Despite the expense of these systems, poor data coverage means they are rarely, if ever, used to generate reliable evidence for decision makers. One critical weakness across Africa is the current lack of capacity to effectively monitor patterns of service use through time so that the impacts of changes in policy or service delivery can be evaluated. Here, we present a new approach that, for the first time, allows national changes in health service use during a time of major health policy change to be tracked reliably using imperfect data from a national HMIS. Monthly attendance records were obtained from the Kenyan HMIS for 1,271 government-run and 402 faith-based outpatient facilities nationwide between 1996 and 2004. A space-time geostatistical model was used to compensate for the large proportion of missing records caused by non-reporting health facilities, allowing robust estimation of monthly and annual use of services by outpatients during this period. We were able to reconstruct robust time series of mean levels of outpatient utilisation of health facilities at the national level and for all six major provinces in Kenya. These plots revealed reliably for the first time a period of steady nationwide decline in the use of health facilities in Kenya between 1996 and 2002, followed by a dramatic increase from 2003. This pattern was consistent across different causes of attendance and was observed independently in each province. The methodological approach presented can compensate for missing records in health information systems to provide robust estimates of national patterns of outpatient service use.
This represents the first such use of HMIS data and contributes to the resurrection of these hugely expensive but underused systems as national monitoring tools. Applying this approach to Kenya has yielded output with immediate potential to enhance the capacity of decision makers in monitoring nationwide patterns of service use and assessing the impact of changes in health policy and service delivery.
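As a much simpler stand-in for the space-time geostatistical model, the effect of compensating for non-reporting facilities can be illustrated by filling each missing facility-month with the mean of the facilities that did report in that month. This is only a conceptual sketch; the study itself used kriging-style space-time prediction rather than this crude averaging:

```python
import numpy as np

def impute_missing(attendance):
    """attendance: 2-D array (facility x month), NaN marks a non-report.
    Fill each gap with the mean of reporting facilities for that month."""
    filled = attendance.copy()
    month_means = np.nanmean(attendance, axis=0)
    rows, cols = np.where(np.isnan(filled))
    filled[rows, cols] = month_means[cols]
    return filled

data = np.array([[100.0, np.nan, 120.0],
                 [ 80.0,  90.0,  np.nan],
                 [np.nan, 110.0, 100.0]])
print(impute_missing(data))
```

A geostatistical model improves on this by weighting nearby facilities and adjacent months more heavily and by providing uncertainty estimates for each imputed value.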

  10. Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology

    NASA Astrophysics Data System (ADS)

    Morgan, T. W.; Thurgood, R. L.

    1984-05-01

    This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
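Once relative-value weights have been elicited from decision makers, the simplest way to turn a multi-objective tradeoff into a single ranking is a weighted-sum scalarization. The sketch below illustrates only that baseline step; VODCA's data filtering for weight consistency is not shown, and all names and numbers are hypothetical:

```python
# Weighted-sum scalarization of a multi-objective engineering tradeoff:
# each alternative's objective scores are collapsed into one value using
# elicited relative-value weights, then alternatives are ranked.
def rank_alternatives(alternatives, weights):
    """alternatives: {name: {objective: score}}; weights: {objective: w}."""
    def value(scores):
        return sum(weights[o] * s for o, s in scores.items())
    return sorted(alternatives, key=lambda n: value(alternatives[n]),
                  reverse=True)

designs = {"A": {"cost": 3, "performance": 9},
           "B": {"cost": 8, "performance": 5},
           "C": {"cost": 6, "performance": 6}}
weights = {"cost": 0.4, "performance": 0.6}  # hypothetical elicited weights
print(rank_alternatives(designs, weights))   # ['A', 'B', 'C']
```

The quality of such a ranking depends entirely on the weights, which is why VODCA concentrates on filtering the elicited relative-value information for consistency before it is used.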

  11. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Joseph Daniel; Anderson, Robert Stephen

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  12. Evaluation of internal peer-review to train nurses recruiting to a randomized controlled trial--Internal Peer-review for Recruitment Training in Trials (InterPReTiT).

    PubMed

    Mann, Cindy; Delgado, Debbie; Horwood, Jeremy

    2014-04-01

    A discussion and qualitative evaluation of the use of peer-review to train nurses and optimize recruitment practice in a randomized controlled trial. Sound recruitment processes are critical to the success of randomized controlled trials. Nurses recruiting to trials must obtain consent for an intervention that is administered for reasons other than anticipated benefit to the patient. This requires not only patients' acquiescence but also evidence that they have weighed the relevant information in reaching their decision. How trial information is explained is vital, but communication and training can be inadequate. A discussion of a new process to train nurses recruiting to a randomized controlled trial. Literature from 1999-2013 about consenting to trials is included. Over 3 months from 2009-2010, recruiting nurses reviewed recruitment interviews recorded during the pilot phase of a single-site randomized controlled trial and noted content, communication style and interactions. They discussed their findings during peer-review meetings, which were audio-recorded and analysed using qualitative methodology. Peer-review can enhance nurses' training in trial recruitment procedures by supporting development of the necessary communication skills, facilitating consistency in information provision and sharing best practice. Nurse-led peer-review can provide a forum to share communication strategies that will elicit and address participant concerns and obtain evidence of participant understanding prior to consent. Comparing practice can improve consistency and accuracy of trial information and facilitate identification of recruitment issues. Internal peer-review was well accepted and promoted team cohesion. Further evaluation is needed. © 2013 John Wiley & Sons Ltd.

  13. On the importance of local dynamics in statokinesigram: A multivariate approach for postural control evaluation in elderly.

    PubMed

    Bargiotas, Ioannis; Audiffren, Julien; Vayatis, Nicolas; Vidal, Pierre-Paul; Buffat, Stephane; Yelnik, Alain P; Ricard, Damien

    2018-01-01

    The fact that almost one third of the population >65 years old has at least one fall per year makes risk-of-fall assessment through easy-to-use measurements an important issue in current clinical practice. A common way to evaluate posture is through the recording of the center-of-pressure (CoP) displacement (statokinesigram) with force platforms. Most previous studies, assuming homogeneous statokinesigrams in quiet standing, used global parameters to characterize the statokinesigrams. However, such analysis provides little information about local characteristics of statokinesigrams. In this study, we propose a multidimensional scoring approach which locally characterizes statokinesigrams on small time-periods, or blocks, while highlighting those which are more indicative of the individual's general class (faller/non-faller). Moreover, this information can be used to provide a global score to evaluate postural control and classify fallers/non-fallers. We evaluate our approach using the statokinesigrams of 126 community-dwelling elderly (78.5 ± 7.7 years). Participants were recorded with eyes open and eyes closed (25 seconds each acquisition) and information about previous falls was collected. The performance of our findings is assessed using receiver operating characteristic (ROC) analysis and the area under the curve (AUC). The results show that global scores provided by splitting statokinesigrams into smaller blocks and analyzing them locally classify fallers/non-fallers more effectively (AUC = 0.77 ± 0.09 instead of AUC = 0.63 ± 0.12 for global analysis when splitting is not used). These promising results indicate that such a methodology might provide supplementary information about an individual's risk of fall and be of major usefulness in the assessment of balance-related diseases such as Parkinson's disease.
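The AUC used above to compare the block-wise and global scores has a direct probabilistic reading: the chance that a randomly chosen faller scores higher than a randomly chosen non-faller. A minimal Mann-Whitney-style computation on synthetic scores (not the study's data):

```python
# AUC computed directly from its probabilistic definition: the fraction of
# (faller, non-faller) pairs in which the faller receives the higher score,
# counting ties as half a win.
def auc(scores, labels):
    """AUC = P(score of a random faller > score of a random non-faller)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]
labels = [1,   1,   0,   1,   0,   0]   # 1 = faller, 0 = non-faller
print(auc(scores, labels))
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is what makes the reported improvement from 0.63 to 0.77 meaningful.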

  14. On the importance of local dynamics in statokinesigram: A multivariate approach for postural control evaluation in elderly

    PubMed Central

    Audiffren, Julien; Vayatis, Nicolas; Vidal, Pierre-Paul; Buffat, Stephane; Yelnik, Alain P.; Ricard, Damien

    2018-01-01

    The fact that almost one third of the population >65 years old has at least one fall per year makes risk-of-fall assessment through easy-to-use measurements an important issue in current clinical practice. A common way to evaluate posture is through the recording of the center-of-pressure (CoP) displacement (statokinesigram) with force platforms. Most previous studies, assuming homogeneous statokinesigrams in quiet standing, used global parameters to characterize the statokinesigrams. However, such analysis provides little information about local characteristics of statokinesigrams. In this study, we propose a multidimensional scoring approach which locally characterizes statokinesigrams on small time-periods, or blocks, while highlighting those which are more indicative of the individual’s general class (faller/non-faller). Moreover, this information can be used to provide a global score to evaluate postural control and classify fallers/non-fallers. We evaluate our approach using the statokinesigrams of 126 community-dwelling elderly (78.5 ± 7.7 years). Participants were recorded with eyes open and eyes closed (25 seconds each acquisition) and information about previous falls was collected. The performance of our findings is assessed using receiver operating characteristic (ROC) analysis and the area under the curve (AUC). The results show that global scores provided by splitting statokinesigrams into smaller blocks and analyzing them locally classify fallers/non-fallers more effectively (AUC = 0.77 ± 0.09 instead of AUC = 0.63 ± 0.12 for global analysis when splitting is not used). These promising results indicate that such a methodology might provide supplementary information about an individual’s risk of fall and be of major usefulness in the assessment of balance-related diseases such as Parkinson’s disease. PMID:29474402

  15. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review.

    PubMed

    Booth, Andrew

    2016-05-04

    Qualitative systematic reviews or qualitative evidence syntheses (QES) are increasingly recognised as a way to enhance the value of systematic reviews (SRs) of clinical trials. They can explain the mechanisms by which interventions, evaluated within trials, might achieve their effect. They can investigate differences in effects between different population groups. They can identify which outcomes are most important to patients, carers, health professionals and other stakeholders. QES can explore the impact of acceptance, feasibility, meaningfulness and implementation-related factors within a real world setting and thus contribute to the design and further refinement of future interventions. To produce valid, reliable and meaningful QES requires systematic identification of relevant qualitative evidence. Although the methodologies of QES, including methods for information retrieval, are well-documented, little empirical evidence exists to inform their conduct and reporting. This structured methodological overview examines papers on searching for qualitative research identified from the Cochrane Qualitative and Implementation Methods Group Methodology Register and from citation searches of 15 key papers. A single reviewer reviewed 1299 references. Papers reporting methodological guidance, use of innovative methodologies or empirical studies of retrieval methods were categorised under eight topical headings: overviews and methodological guidance, sampling, sources, structured questions, search procedures, search strategies and filters, supplementary strategies and standards. This structured overview presents a contemporaneous view of information retrieval for qualitative research and identifies a future research agenda. This review concludes that current practice in the information retrieval of qualitative research rests on weak empirical evidence.
A trend towards improved transparency of search methods and further evaluation of key search procedures offers the prospect of rapid development of search methods.

  16. Engineering design knowledge recycling in near-real-time

    NASA Technical Reports Server (NTRS)

    Leifer, Larry; Baya, Vinod; Toye, George; Baudin, Catherine; Underwood, Jody Gevins

    1994-01-01

It is hypothesized that the capture and reuse of machine-readable design records is cost beneficial. This informal engineering notebook design knowledge can be used to model the artifact and the design process. Design rationale is, in part, preserved and available for examination. Redesign cycle time is significantly reduced (Baya et al., 1992). These factors contribute to making it less costly to capture and reuse knowledge than to recreate comparable knowledge (current practice). To test the hypothesis, we have focused on validation of the concept and tools in two 'real design' projects this past year: (1) a short (8 month) turnaround project for NASA life science bioreactor researchers was done by a team of three mechanical engineering graduate students at Stanford University (in a class, ME210abc 'Mechatronic Systems Design and Methodology', taught by one of the authors, Leifer); and (2) a long range (8 to 20 year) international consortium project for NASA's Space Science program (STEP: satellite test of the equivalence principle). Design knowledge capture was supported this year by assigning the use of a Team-Design PowerBook. Design records were cataloged in near-real time. These records were used to qualitatively model the artifact design as it evolved. Dedal, an 'intelligent librarian' developed at NASA-ARC, was used to navigate and retrieve captured knowledge for reuse.

  17. Recording Information on Architectural Heritage Should Meet the Requirements for Conservation Digital Recording Practices at the Summer Palace

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Cong, Y.; Wu, C.; Bai, C.; Wu, C.

    2017-08-01

The recording of architectural heritage information is the foundation of research, conservation, management, and the display of architectural heritage. In other words, such recording supports heritage research, conservation, management and display. What information do we record and collect, and what technology do we use for information recording? How do we determine the level of accuracy required when recording architectural information? What method do we use for information recording? These questions should be addressed in relation to the nature of the particular heritage site and the specific conditions of the conservation work. In recent years, with the rapid development of information acquisition technology such as Close Range Photogrammetry, 3D Laser Scanning as well as high speed and high precision Aerial Photogrammetry, many Chinese universities, research institutes and heritage management bureaux have purchased considerable equipment for information recording. However, the lack of understanding of both the nature of architectural heritage and the purpose for which the information is being collected has led to several problems. For example, some institutions, when recording architectural heritage information, aim solely at high accuracy. Some consider that advanced measuring methods must automatically replace traditional ones. Information collection becomes the purpose, rather than the means, of architectural heritage conservation. Addressing these issues, this paper briefly reviews the history of architectural heritage information recording at the Summer Palace (Yihe Yuan, first built in 1750), Beijing.
Using the recording practices at the Summer Palace during the past ten years as examples, we illustrate our achievements and lessons in recording architectural heritage information with regard to the following aspects: the buildings' desired ideal status, their current status, structural distortion analysis, display, statue restoration and thematic research. Three points are highlighted in our discussion: 1. Understanding the heritage is more important than the particular technology used: architectural heritage information collection and recording are based on an understanding of the value and nature of the architectural heritage. Understanding is the purpose, whereas information collection and recording are the means. 2. Demand determines technology: collecting and recording architectural heritage information serves the needs of heritage research, conservation, management and display. These different needs determine the different technologies that we use. 3. Set the level of accuracy appropriately: for information recording, high accuracy is not the key criterion; rather, an appropriate level of accuracy is key. There is considerable deviation between the nominal accuracy of any instrument and the accuracy of any particular measurement.

  18. An integrative review of information systems and terminologies used in local health departments.

    PubMed

    Olsen, Jeanette; Baisch, Mary Jo

    2014-02-01

The purpose of this integrative review based on the published literature was to identify information systems currently being used by local health departments and to determine the extent to which standard terminology was used to communicate data, interventions, and outcomes to improve public health informatics at the local health department (LHD) level and better inform research, policy, and programs. Whittemore and Knafl's integrative review methodology was used. Data were obtained through keyword searches of three publication databases and reference lists of retrieved articles, and by consulting experts to identify landmark works. The final sample included 45 articles analyzed and synthesized using the matrix method. The results indicated that a wide array of information systems was used by LHDs, supporting diverse functions aligned with five categories: administration; surveillance; health records; registries; and consumer resources. Detail regarding the specific programs being used, the location or extent of use, or their effectiveness was lacking. The synthesis indicated evidence of growing interest in health information exchange groups, yet few studies described the use of data standards or standard terminology in LHDs. Research to address these gaps is needed to provide current, meaningful data that inform public health informatics research, policy, and initiatives at and across the LHD level. Coordination at a state or national level is recommended to collect information efficiently about LHD information systems that will inform improvements while minimizing duplication of efforts and financial burden. Until this happens, efforts to strengthen LHD information systems and policies may be significantly challenged.

  19. Increasing the information rates of optical communications via coded modulation: a study of transceiver performance

    NASA Astrophysics Data System (ADS)

    Maher, Robert; Alvarado, Alex; Lavery, Domaniç; Bayvel, Polina

    2016-02-01

    Optical fibre underpins the global communications infrastructure and has experienced an astonishing evolution over the past four decades, with current commercial systems transmitting data rates in excess of 10 Tb/s over a single fibre core. The continuation of this dramatic growth in throughput has become constrained due to a power dependent nonlinear distortion arising from a phenomenon known as the Kerr effect. The mitigation of fibre nonlinearities is an area of intense research. However, even in the absence of nonlinear distortion, the practical limit on the transmission throughput of a single fibre core is dominated by the finite signal-to-noise ratio (SNR) afforded by current state-of-the-art coherent optical transceivers. Therefore, the key to maximising the number of information bits that can be reliably transmitted over a fibre channel hinges on the simultaneous optimisation of the modulation format and code rate, based on the SNR achieved at the receiver. In this work, we use an information theoretic approach based on the mutual information and the generalised mutual information to characterise a state-of-the-art dual polarisation m-ary quadrature amplitude modulation transceiver and subsequently apply this methodology to a 15-carrier super-channel to achieve the highest throughput (1.125 Tb/s) ever recorded using a single coherent receiver.
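The mutual-information characterisation described in this abstract can be illustrated with a generic Monte Carlo estimator for equiprobable m-QAM over a complex AWGN channel. This is a minimal sketch only: the function names, sample sizes and the plain-AWGN assumption are ours, not the authors' transceiver model, and the generalised mutual information is not computed here.

```python
import numpy as np

def qam_constellation(m):
    """Square m-QAM constellation normalised to unit average symbol energy."""
    k = int(np.sqrt(m))
    pam = np.arange(-(k - 1), k, 2)                  # e.g. [-3, -1, 1, 3] for m = 16
    pts = (pam[:, None] + 1j * pam[None, :]).ravel()
    return pts / np.sqrt(np.mean(np.abs(pts) ** 2))

def qam_mutual_information(m, snr_db, n_samples=100_000, seed=0):
    """Monte Carlo estimate (bits/symbol) of the mutual information of
    equiprobable m-QAM over a complex AWGN channel at the given SNR."""
    rng = np.random.default_rng(seed)
    x = qam_constellation(m)
    n0 = 10.0 ** (-snr_db / 10.0)                    # noise variance for a unit-energy signal
    tx = rng.choice(x, size=n_samples)               # uniformly drawn transmitted symbols
    noise = np.sqrt(n0 / 2) * (rng.standard_normal(n_samples)
                               + 1j * rng.standard_normal(n_samples))
    y = tx + noise
    # I(X;Y) = log2(m) - E[ log2 sum_j exp(-(|y - x_j|^2 - |n|^2) / N0) ]
    d = np.abs(y[:, None] - x[None, :]) ** 2
    ratios = np.exp(-(d - np.abs(noise)[:, None] ** 2) / n0)
    return np.log2(m) - np.mean(np.log2(ratios.sum(axis=1)))
```

At high SNR the estimate approaches log2(m) bits per symbol, while at low SNR it is bounded by the Shannon capacity log2(1 + SNR); choosing the modulation format and code rate so that the operating point tracks this curve is the joint optimisation the abstract describes.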

  20. Increasing the information rates of optical communications via coded modulation: a study of transceiver performance

    PubMed Central

    Maher, Robert; Alvarado, Alex; Lavery, Domaniç; Bayvel, Polina

    2016-01-01

    Optical fibre underpins the global communications infrastructure and has experienced an astonishing evolution over the past four decades, with current commercial systems transmitting data rates in excess of 10 Tb/s over a single fibre core. The continuation of this dramatic growth in throughput has become constrained due to a power dependent nonlinear distortion arising from a phenomenon known as the Kerr effect. The mitigation of fibre nonlinearities is an area of intense research. However, even in the absence of nonlinear distortion, the practical limit on the transmission throughput of a single fibre core is dominated by the finite signal-to-noise ratio (SNR) afforded by current state-of-the-art coherent optical transceivers. Therefore, the key to maximising the number of information bits that can be reliably transmitted over a fibre channel hinges on the simultaneous optimisation of the modulation format and code rate, based on the SNR achieved at the receiver. In this work, we use an information theoretic approach based on the mutual information and the generalised mutual information to characterise a state-of-the-art dual polarisation m-ary quadrature amplitude modulation transceiver and subsequently apply this methodology to a 15-carrier super-channel to achieve the highest throughput (1.125 Tb/s) ever recorded using a single coherent receiver. PMID:26864633

  1. 77 FR 26047 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-02

    ... Service System (SSS) records. The SSS records contain both classification records and registration cards... information from or copies of SSS records they must provide on forms or letters certain information about the... obtain information from SSS records stored at NARA facilities. Dated: April 26, 2012. Michael L. Wash...

  2. Concurrent recordings of Electrical Current Emissions and Acoustic Emissions detected from marble specimens subjected to mechanical stress up to fracture

    NASA Astrophysics Data System (ADS)

    Stavrakas, I.; Hloupis, G.; Triantis, D.; Vallianatos, F.

    2012-04-01

The emission of electrical signals during the application of mechanical stress to brittle geo-materials (the so-called Pressure Stimulated Current, PSC [1,2]) provides significant information regarding the mechanical status of the studied rock sample, since PSCs originate from the opening of cracks and microfractures [3]. This mechanism is straightforward to associate with the recording of acoustic emissions (AE). To confirm the common origin of PSCs and AE in crack opening, a combined study was performed involving the concurrent recording of electric current emissions and AE on marble samples subjected to a linearly increasing mechanical load up to fracture. The detected electric signal was recorded by an ultra-sensitive electrometer (Keithley 6514). The sensor used for detecting the electric current is a pair of gold-plated electrodes attached bilaterally to the sample under axial mechanical stress [4]. The AE were recorded with the Physical Acoustics PCI-2 acquisition system. The experimental results demonstrate a strong association between the recorded electrical signals and the corresponding acoustic emissions, supporting their common origin in the opening of microfractures. Furthermore, when the applied mechanical load exceeds the yield stress, an increase in PSC amplitude along with the AE rate is observed. Acknowledgments. This work was partly supported by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC".

  3. 28 CFR 20.34 - Individual's right to access criminal history record information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... history record information. 20.34 Section 20.34 Judicial Administration DEPARTMENT OF JUSTICE CRIMINAL JUSTICE INFORMATION SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.34 Individual's right to access criminal history record information. The procedures by which an individual may...

  4. 28 CFR 20.34 - Individual's right to access criminal history record information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... history record information. 20.34 Section 20.34 Judicial Administration DEPARTMENT OF JUSTICE CRIMINAL JUSTICE INFORMATION SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.34 Individual's right to access criminal history record information. The procedures by which an individual may...

  5. 28 CFR 20.34 - Individual's right to access criminal history record information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... history record information. 20.34 Section 20.34 Judicial Administration DEPARTMENT OF JUSTICE CRIMINAL JUSTICE INFORMATION SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.34 Individual's right to access criminal history record information. The procedures by which an individual may...

  6. 28 CFR 20.34 - Individual's right to access criminal history record information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... history record information. 20.34 Section 20.34 Judicial Administration DEPARTMENT OF JUSTICE CRIMINAL JUSTICE INFORMATION SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.34 Individual's right to access criminal history record information. The procedures by which an individual may...

  7. 28 CFR 20.34 - Individual's right to access criminal history record information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... history record information. 20.34 Section 20.34 Judicial Administration DEPARTMENT OF JUSTICE CRIMINAL JUSTICE INFORMATION SYSTEMS Federal Systems and Exchange of Criminal History Record Information § 20.34 Individual's right to access criminal history record information. The procedures by which an individual may...

  8. Evaluation of the visual performance of image processing pipes: information value of subjective image attributes

    NASA Astrophysics Data System (ADS)

    Nyman, G.; Häkkinen, J.; Koivisto, E.-M.; Leisti, T.; Lindroos, P.; Orenius, O.; Virtanen, T.; Vuori, T.

    2010-01-01

    Subjective image quality data for 9 image processing pipes and 8 image contents (taken with mobile phone camera, 72 natural scene test images altogether) from 14 test subjects were collected. A triplet comparison setup and a hybrid qualitative/quantitative methodology were applied. MOS data and spontaneous, subjective image quality attributes to each test image were recorded. The use of positive and negative image quality attributes by the experimental subjects suggested a significant difference between the subjective spaces of low and high image quality. The robustness of the attribute data was shown by correlating DMOS data of the test images against their corresponding, average subjective attribute vector length data. The findings demonstrate the information value of spontaneous, subjective image quality attributes in evaluating image quality at variable quality levels. We discuss the implications of these findings for the development of sensitive performance measures and methods in profiling image processing systems and their components, especially at high image quality levels.

  9. The York Gospels: a 1000-year biological palimpsest

    PubMed Central

    Fiddyment, Sarah; Vnouček, Jiří; Mattiangeli, Valeria; Speller, Camilla; Binois, Annelise; Carver, Martin; Dand, Catherine; Newfield, Timothy P.; Webb, Christopher C.; Bradley, Daniel G.; Collins, Matthew J.

    2017-01-01

    Medieval manuscripts, carefully curated and conserved, represent not only an irreplaceable documentary record but also a remarkable reservoir of biological information. Palaeographic and codicological investigation can often locate and date these documents with remarkable precision. The York Gospels (York Minster Ms. Add. 1) is one such codex, one of only a small collection of pre-conquest Gospel books to have survived the Reformation. By extending the non-invasive triboelectric (eraser-based) sampling technique eZooMS, to include the analysis of DNA, we report a cost-effective and simple-to-use biomolecular sampling technique for parchment. We apply this combined methodology to document for the first time a rich palimpsest of biological information contained within the York Gospels, which has accumulated over the 1000-year lifespan of this cherished object that remains an active participant in the life of York Minster. These biological data provide insights into the decisions made in the selection of materials, the construction of the codex and the use history of the object. PMID:29134095

  10. An analysis of specialist and non-specialist user requirements for geographic climate change information.

    PubMed

    Maguire, Martin C

    2013-11-01

The EU EuroClim project developed a system to monitor and record climate change indicator data based on satellite observations of snow cover, sea ice and glaciers in Northern Europe and the Arctic. It also contained projection data for temperature, rainfall and average wind speed for Europe. These were all stored as data sets in a GIS database for users to download. The process of gathering requirements for a user population including scientists, researchers, policy makers, educationalists and the general public is described. Using an iterative design methodology, a user survey was administered to obtain initial feedback on the system concept, followed by panel sessions where users were presented with the system concept and a demonstrator to interact with. The requirements of both specialist and non-specialist users are summarised, together with strategies for the effective communication of geographic climate change information. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  11. The study of insect blood-feeding behaviour. 2. Recording techniques and the use of flow charts.

    PubMed

    Smith, J J; Friend, W G

    1987-01-01

    This paper continues a discussion of approaches and methodologies we have used in our studies of feeding in haematophagous insects. Described are techniques for directly monitoring behaviour: electrical recording of feeding behaviour via resistance changes in the food canal, optical methods for monitoring mouthpart activity, and a computer technique for behavioural event recording. Also described is the use of "flow charts" or "decision diagrams" to model interrelated sequences of behaviours.

  12. The disclosure of diagnosis codes can breach research participants' privacy.

    PubMed

    Loukides, Grigorios; Denny, Joshua C; Malin, Bradley

    2010-01-01

De-identified clinical data in standardized form (eg, diagnosis codes), derived from electronic medical records, are increasingly combined with research data (eg, DNA sequences) and disseminated to enable scientific investigations. This study examines whether released data can be linked with identified clinical records that are accessible via various resources to jeopardize patients' anonymity, and the ability of popular privacy protection methodologies to prevent such an attack. The study experimentally evaluates the re-identification risk of a de-identified sample of Vanderbilt's patient records involved in a genome-wide association study. It also measures the level of protection from re-identification, and data utility, provided by suppression and generalization. Privacy protection is quantified using the probability of re-identifying a patient in a larger population through diagnosis codes. Data utility is measured at a dataset level, using the percentage of retained information, as well as its description, and at a patient level, using two metrics based on the difference between the distribution of International Classification of Diseases (ICD) version 9 codes before and after applying privacy protection. More than 96% of 2800 patients' records are shown to be uniquely identified by their diagnosis codes with respect to a population of 1.2 million patients. Generalization is shown to further reduce the percentage of de-identified records by less than 2%, and over 99% of the three-digit ICD-9 codes need to be suppressed to prevent re-identification. Popular privacy protection methods are inadequate to deliver a sufficiently protected and useful result when sharing data derived from complex clinical systems. The development of alternative privacy protection models is thus required.
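The uniqueness-based risk measure described here can be sketched in a few lines: the fraction of records whose exact combination of diagnosis codes appears only once in the population is a simple proxy for re-identifiability, and generalising codes to their three-digit categories illustrates the protection/utility trade-off. This is a toy illustration with made-up records; the ICD-9-style codes shown are examples, not the study's data.

```python
from collections import Counter

def uniqueness_rate(records):
    """Fraction of records whose exact set of diagnosis codes is shared
    with no other record -- a simple proxy for re-identification risk."""
    combos = Counter(frozenset(r) for r in records)
    return sum(combos[frozenset(r)] == 1 for r in records) / len(records)

# Hypothetical five-digit diagnosis code sets for four patients.
sample = [
    {"250.00", "401.9"},
    {"250.00", "401.9"},
    {"250.01", "401.9"},   # differs from the first two only in the fifth digit
    {"296.20"},
]
print(uniqueness_rate(sample))            # 0.5 -- two of four records are unique

# Generalization: truncate every code to its three-digit category.
generalized = [{c.split(".")[0] for c in r} for r in sample]
print(uniqueness_rate(generalized))       # 0.25 -- only one record remains unique
```

The same counting idea scales to the study's setting of millions of patients; suppression corresponds to deleting codes from the sets before counting.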

  13. Words matter: Implementing the electronically activated recorder in schizotypy.

    PubMed

    Minor, Kyle S; Davis, Beshaun J; Marggraf, Matthew P; Luther, Lauren; Robbins, Megan L

    2018-03-01

    In schizophrenia-spectrum populations, analyzing the words people use has offered promise for unlocking information about affective states and social behaviors. The electronically activated recorder (EAR) is an application-based program that is combined with widely used smartphone technology to capture a person's real-world interactions via audio recordings. It improves on the ecological validity of current methodologies by providing objective and naturalistic samples of behavior. This study is the first to implement the EAR in people endorsing elevated traits of schizophrenia-spectrum personality disorders (i.e., schizotypy), and we expected the EAR to (a) differentiate high and low schizotypy groups on affective disturbances and social engagement and (b) show that high schizotypy status moderates links between affect and social behavior using a multimethod approach. Lexical analysis of EAR recordings revealed greater negative affect and decreased social engagement in those high in schizotypy. When assessing specific traits, EAR and ecological momentary assessment (EMA) converged to show that positive schizotypy predicted negative affect. Finally, high schizotypy status moderated links between negative affect and social engagement when the EAR was combined with EMA. Adherence did not influence results, as both groups wore the EAR more than 90% of their waking hours. Findings supported using the EAR to assess real-world expressions of personality and functioning in schizotypy. Evidence also showed that the EAR can be used alongside EMA to provide a mixed-method, real-world assessment that is high in ecological validity and offers a window into the daily lives of those with elevated traits of schizophrenia-spectrum personality disorders. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  14. Nineteenth Century Long-Term Instrumental Records, Examples From the Southeastern United States

    NASA Astrophysics Data System (ADS)

    Mock, C. J.

    2001-12-01

Early instrumental records in the United States, defined as those from before 1892, the period regarded as preceding the modern climate record, provide a longer perspective on climatic variability at decadal and interannual timescales. Such reconstructions also provide a means of verification for other proxy data. This paper provides an American perspective on historical climatic research, emphasizing the urgent need to properly evaluate data quality and apply the corrections necessary to make early records compatible with the modern record. Different fixed observation times, different practices of weather instrument exposure, and statistical methods for calibration are the main issues in applying corrections and conducting proper climatic interpretations. I illustrate several examples of methodologies for this historical climatic research, focusing on the following in the Southeastern United States: daily reconstructed temperature time series centered on Charleston, SC and Natchez, MS back to the late eighteenth century, and precipitation frequency reconstructions during the antebellum period for the Gulf Coast and coastal Southeast Atlantic states. Results indicate several prominent extremes unprecedented in the modern record, such as the widespread warm winter of 1827-28 and the severe cold winters of 1856 and 1857. The reconstructions also yield important information concerning responses to past ENSO events, the PNA, NAO, and the PDO, particularly when compared with instrumental data from other regions. A high potential also exists for applying the climate reconstructions to assess historical climatic impacts on society in the Southeast, such as understanding climatic linkages to famous case studies of yellow fever epidemics and severe drought.

  15. Unveiling the geography of historical patents in the United States from 1836 to 1975

    PubMed Central

    Petralia, Sergio; Balland, Pierre-Alexandre; Rigby, David L.

    2016-01-01

    It is clear that technology is a key driver of economic growth. Much less clear is where new technologies are produced and how the geography of U.S. invention has changed over the last two hundred years. Patent data report the geography, history, and technological characteristics of invention. However, those data have only recently become available in digital form and at the present time there exists no comprehensive dataset on the geography of knowledge production in the United States prior to 1975. The database presented in this paper unveils the geography of historical patents granted by the United States Patent and Trademark Office (USPTO) from 1836 to 1975. This historical dataset, HistPat, is constructed using digitalized records of original patent documents that are publicly available. We describe a methodological procedure that allows recovery of geographical information on patents from the digital records. HistPat can be used in different disciplines ranging from geography, economics, history, network science, and science and technology studies. Additionally, it is easily merged with post-1975 USPTO digital patent data to extend it until today. PMID:27576103

  16. Fire danger index efficiency as a function of fuel moisture and fire behavior.

    PubMed

    Torres, Fillipe Tamiozzo Pereira; Romeiro, Joyce Machado Nunes; Santos, Ana Carolina de Albuquerque; de Oliveira Neto, Ricardo Rodrigues; Lima, Gumercindo Souza; Zanuncio, José Cola

    2018-08-01

Assessment of the performance of forest fire hazard indices is important for prevention and management strategies, such as planning prescribed burnings, public notifications and firefighting resource allocation. The objective of this study was to evaluate the performance of fire hazard indices considering fire behavior variables and susceptibility expressed by the moisture of combustible material. Controlled burns were carried out at different times, and information on meteorological conditions, characteristics of the combustible material and fire behavior variables was recorded. All variables analyzed (fire behavior and fuel moisture content) can be explained by the prediction indices. The Brazilian EVAP/P showed the best performance at predicting both the moisture content of the fuel material and the fire behavior variables, and the Canadian system showed the best performance at predicting the rate of spread. The coherence of the correlations between the indices and the variables analyzed makes the methodology, which can be applied anywhere, important for decision-making in regions with no records or with only unreliable forest fire data. Copyright © 2018 Elsevier B.V. All rights reserved.
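The kind of index evaluation described above reduces, at its core, to correlating daily index values against observed fuel moisture and fire behaviour. A minimal sketch follows; the numbers are invented for illustration and are not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical daily values: a danger index should correlate negatively
# with fuel moisture (drier fuel -> higher danger) and positively with
# the observed rate of spread.
index    = [12.0, 35.0, 51.0, 8.0, 44.0, 60.0]
moisture = [18.5, 11.2, 8.0, 21.0, 9.5, 6.1]   # % fuel moisture content
spread   = [0.4, 1.1, 1.9, 0.2, 1.6, 2.4]      # rate of spread, m/min

print(pearson_r(index, moisture))   # strongly negative
print(pearson_r(index, spread))     # strongly positive
```

Comparing such correlations across candidate indices (EVAP/P, the Canadian system, and so on) is one simple way to rank their predictive performance when longer fire records are unavailable.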

  17. Clinical epidemiology in the era of big data: new opportunities, familiar challenges.

    PubMed

    Ehrenstein, Vera; Nielsen, Henrik; Pedersen, Alma B; Johnsen, Søren P; Pedersen, Lars

    2017-01-01

Routinely recorded health data have evolved from mere by-products of health care delivery or billing into a powerful research tool for studying and improving patient care through clinical epidemiologic research. Big data in the context of epidemiologic research means large interlinkable data sets within a single country or networks of multinational databases. Several Nordic, European, and other multinational collaborations are now well established. Advantages of big data for clinical epidemiology include improved precision of estimates, which is especially important for reassuring ("null") findings; the ability to conduct meaningful analyses in subgroups of patients; and rapid detection of safety signals. Big data will also provide new possibilities for research by enabling access to linked information from biobanks, electronic medical records, patient-reported outcome measures, automatic and semiautomatic electronic monitoring devices, and social media. The sheer amount of data, however, does not eliminate and may even amplify systematic error. Therefore, methodologies addressing systematic error, clinical knowledge, and underlying hypotheses are more important than ever to ensure that the signal is discernable behind the noise.

  18. Analysis of the Emitted Wavelet of High-Resolution Bowtie GPR Antennas

    PubMed Central

    Rial, Fernando I.; Lorenzo, Henrique; Pereira, Manuel; Armesto, Julia

    2009-01-01

Most Ground Penetrating Radars (GPR) cover a wide frequency range by emitting very short time wavelets. In this work, we study in detail the wavelet emitted by two bowtie GPR antennas with nominal frequencies of 800 MHz and 1 GHz. Knowledge of this emitted wavelet allows us to extract as much information as possible from recorded signals, using advanced processing techniques and computer simulations. Following the previously published methodology of Rial et al. [1], which ensures system stability and reliability in data acquisition, a thorough analysis of the wavelet in both the time and frequency domains is performed. Most of the tests were carried out with air as the propagation medium, allowing a proper analysis of the geometrical attenuation factor. Furthermore, we attempt to determine, for each antenna, a time zero in the records that allows us to correctly assign a position to the reflectors detected by the radar. The obtained results indicate that the time zero is not a constant value for the evaluated antennas, but instead depends on the characteristics of the material in contact with the antenna. PMID:22408523

  19. A Review of Issues Related to Data Acquisition and Analysis in EEG/MEG Studies.

    PubMed

    Puce, Aina; Hämäläinen, Matti S

    2017-05-31

    Electroencephalography (EEG) and magnetoencephalography (MEG) are non-invasive electrophysiological methods, which record electric potentials and magnetic fields due to electric currents in synchronously-active neurons. With MEG being more sensitive to neural activity from tangential currents and EEG being able to detect both radial and tangential sources, the two methods are complementary. Over the years, neurophysiological studies have changed considerably: high-density recordings are becoming de rigueur; there is interest in both spontaneous and evoked activity; and sophisticated artifact detection and removal methods are available. Improved head models for source estimation have also increased the precision of the current estimates, particularly for EEG and combined EEG/MEG. Because of their complementarity, more investigators are beginning to perform simultaneous EEG/MEG studies to gain more complete information about neural activity. Given the increase in methodological complexity in EEG/MEG, it is important to gather data that are of high quality and that are as artifact free as possible. Here, we discuss some issues in data acquisition and analysis of EEG and MEG data. Practical considerations for different types of EEG and MEG studies are also discussed.

  20. Security evaluation and assurance of electronic health records.

    PubMed

    Weber-Jahnke, Jens H

    2009-01-01

    Electronic Health Records (EHRs) maintain information of sensitive nature. Security requirements in this context are typically multilateral, encompassing the viewpoints of multiple stakeholders. Two main research questions arise from a security assurance point of view, namely how to demonstrate the internal correctness of EHRs and how to demonstrate their conformance in relation to multilateral security regulations. The above notions of correctness and conformance directly relate to the general concept of system verification, which asks the question "are we building the system right?" This should not be confused with the concept of system validation, which asks the question "are we building the right system?" Much of the research in the medical informatics community has been concerned with the latter aspect (validation). However, trustworthy security requires assurances that standards are followed and specifications are met. The objective of this paper is to contribute to filling this gap. We give an introduction to fundamentals of security assurance, summarize current assurance standards, and report on experiences with using security assurance methodology applied to the EHR domain, specifically focusing on case studies in the Canadian context.

  1. Clinical epidemiology in the era of big data: new opportunities, familiar challenges

    PubMed Central

    Ehrenstein, Vera; Nielsen, Henrik; Pedersen, Alma B; Johnsen, Søren P; Pedersen, Lars

    2017-01-01

    Routinely recorded health data have evolved from mere by-products of health care delivery or billing into a powerful research tool for studying and improving patient care through clinical epidemiologic research. Big data in the context of epidemiologic research means large interlinkable data sets within a single country or networks of multinational databases. Several Nordic, European, and other multinational collaborations are now well established. Advantages of big data for clinical epidemiology include improved precision of estimates, which is especially important for reassuring (“null”) findings; ability to conduct meaningful analyses in subgroups of patients; and rapid detection of safety signals. Big data will also provide new possibilities for research by enabling access to linked information from biobanks, electronic medical records, patient-reported outcome measures, automatic and semiautomatic electronic monitoring devices, and social media. The sheer amount of data, however, does not eliminate and may even amplify systematic error. Therefore, methodologies addressing systematic error, clinical knowledge, and underlying hypotheses are more important than ever to ensure that the signal is discernible behind the noise. PMID:28490904

  2. A Deterrence Approach to Regulate Nurses' Compliance with Electronic Medical Records Privacy Policy.

    PubMed

    Kuo, Kuang-Ming; Talley, Paul C; Hung, Ming-Chien; Chen, Yen-Liang

    2017-11-03

    Hospitals have become increasingly aware that electronic medical records (EMR) may bring tangible and intangible benefits to managing institutions, including reduced medical errors, improved quality of care, curtailed costs, and broadened access to patient information for healthcare professionals. However, increased dependence on EMR has led to a corresponding increase in the impact of EMR breaches. Such incursions, which have been significantly facilitated by the introduction of mobile devices for accessing EMR, may cause tangible and intangible damage to both hospitals and the individuals concerned. The purpose of this study was to explore factors that may inhibit nurses' intentions to violate privacy policy concerning EMR, based upon the deterrence theory perspective. Utilizing survey methodology, 262 responses were analyzed via structural equation modeling. Results revealed that punishment certainty, detection certainty, and subjective norm significantly reduce nurses' intentions to violate established EMR privacy policy. With these findings, recommendations are discussed for health administrators in planning and designing effective strategies that may inhibit nurses from violating EMR privacy policy.

  3. On necessity and sufficiency in counseling and psychotherapy (revisited).

    PubMed

    Lazarus, Arnold A

    2007-09-01

    It seems to me that Carl Rogers (see record 2007-14639-002) was far too ambitious in trying to specify general conditions of necessity and sufficiency that would be relevant to the entire spectrum of problems and the diverse expectancies and personalities of the people who seek our help. Rogers' position and orientation almost totally overlook the array of problems under the rubric of "response deficits" that stem from misinformation and missing information and call for active correction, training, and retraining. Rogers also paid scant attention to problems with significant biological determinants. Nevertheless, as exemplified by his seminal 1957 article and many other articles and books, Rogers made major contributions within the domain of the therapeutic alliance. Today, the scientific emphasis looks at accountability, the need to establish various treatments of choice, and the need to understand their presumed mechanisms. Treatment efficacy and generalizability across different methodologies are now considered key issues. The efficacy narrowing and clinically self-limiting consequences of adhering to one particular school of thought are now self-evident to most. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  4. Consumer facial expression in relation to smoked ham with the use of face reading technology. The methodological aspects and informative value of research results.

    PubMed

    Kostyra, Eliza; Rambuszek, Michał; Waszkiewicz-Robak, Bożena; Laskowski, Wacław; Blicharski, Tadeusz; Poławska, Ewa

    2016-09-01

    The study determined consumers' emotional reactions to hams using a face visualization method, recorded by FaceReader (FR). The aims of the research were to determine the effect of the ham samples on the type of emotion, to examine individual consumers' emotional reactions in more depth, and to analyse emotional variability in the temporal measurement of impressions. The research involved testing the effectiveness of measuring emotions in response to the ongoing flavour impression after consumption of smoked hams. For all of the assessed samples, neutral and negative emotions prevailed among the overall emotions recorded during assessment of the taste/flavour impression. The range of variability of the overall emotions depended more on consumer reactions and less on the properties of the assessed product. Consumers expressed various emotions over time, and the ham samples evoked different emotional reactions as an effect of the duration of the impression. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Chase: Control of Heterogeneous Autonomous Sensors for Situational Awareness

    DTIC Science & Technology

    2016-08-03

    remained the discovery and analysis of new foundational methodology for information collection and fusion that exercises rigorous feedback control over...simultaneously achieve quantified information and physical objectives. New foundational methodology for information collection and fusion that exercises...11.2.1. In the general area of novel stochastic systems analysis it seems appropriate to mention the pioneering work on non -Bayesian distributed learning

  6. Instructional Methodology and Experimental Design for Evaluating Audio-Video Support to Undergraduate Pilot Training.

    ERIC Educational Resources Information Center

    Purifoy, George R., Jr.

    This report presents a detailed description of the methods by which airborne video recording will be utilized in training Air Force pilots, and presents the format for an experiment testing the effectiveness of such training. Portable airborne recording with ground playback permits more economical and efficient teaching of the critical visual and…

  7. 2010-2012 Minneapolis - St. Paul Travel Behavior Inventory |

    Science.gov Websites

    -St. Paul area conducted the survey. Methodology The TBI consists of a paper-based survey and a wearable global positioning system (GPS) survey. The data-collection process for these two surveys was independent, and the results are not intended to function together. Survey Records Survey records include

  8. Selected Guidelines for the Management of Records and Archives: A RAMP Reader.

    ERIC Educational Resources Information Center

    Walne, Peter, Comp.

    The guidelines contained in this book are taken from studies published by UNESCO's Records and Archives Management Program (RAMP) between 1981 and 1987. Each set of guidelines is accompanied by an introduction to provide chronological or methodological context. The guidelines are titled as follows: (1) "The Use of Sampling Techniques in the…

  9. Evaluation of a Records-Review Surveillance System Used to Determine the Prevalence of Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Avchen, Rachel Nonkin; Wiggins, Lisa D.; Devine, Owen; Van Naarden Braun, Kim; Rice, Catherine; Hobson, Nancy C.; Schendel, Diana; Yeargin-Allsopp, Marshalyn

    2011-01-01

    We conducted the first study that estimates the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of a population-based autism spectrum disorders (ASD) surveillance system developed at the Centers for Disease Control and Prevention. The system employs a records-review methodology that yields ASD…

  10. Modeling Business Processes in Public Administration

    NASA Astrophysics Data System (ADS)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  11. Reconstructing missing information on precipitation datasets: impact of tails on adopted statistical distributions.

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele; Beckie, Roger Daniel

    2014-05-01

    Missing data are ubiquitous in hydrological time-series databases, yet many practical applications require educated decisions based on exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For applications directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of the basic physical and chemical dynamics of precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which exert an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value, and Gamma distributions. The methods were first tested on synthetic examples, to have complete control over the impact of several variables, such as the minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. We then applied the methodology to precipitation datasets collected in the Vancouver area and at a mining site in Peru.
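The probability-based pairing approach described in this abstract can be sketched roughly as follows: a gap at the target station is filled by mapping the reference station's empirical exceedance probability onto the target station's empirical quantiles. The station data, gap positions, and this simple quantile mapping are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fill_by_exceedance(target, reference):
    """Fill NaN gaps in `target` by mapping the reference station's empirical
    non-exceedance probability onto the target station's empirical quantiles."""
    observed = np.sort(target[~np.isnan(target)])
    ref_sorted = np.sort(reference)
    filled = target.copy()
    for i in np.flatnonzero(np.isnan(target)):
        # empirical non-exceedance probability of the co-recorded reference depth
        p = np.searchsorted(ref_sorted, reference[i]) / len(ref_sorted)
        filled[i] = np.quantile(observed, min(p, 1.0))
    return filled

rng = np.random.default_rng(1)
reference = rng.gamma(0.8, 12.0, size=365)   # synthetic daily depths (mm)
target = 1.5 * reference                     # a wetter, rank-correlated station
target[[40, 200]] = np.nan                   # two recording failures
filled = fill_by_exceedance(target, reference)
```

A heavy-tailed fit (e.g. Generalized Pareto) could replace the empirical quantiles here; as the abstract notes, the choice of tail behavior then controls the reconstructed extremes.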

  12. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to account for a significant proportion of errors in laboratory processes and to contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group reviews the various data collection methods available. Our recommendation is the use of laboratory information management systems as the recording mechanism for preanalytical errors, as this provides the easiest and most standardized mechanism of data capture.
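Pareto analysis, one of the quality-management tools named in this abstract, finds the "vital few" error categories responsible for roughly 80% of incidents. A minimal sketch follows; the categories and counts are invented for illustration, not data from the review.

```python
# Invented preanalytical error tallies for one reporting period.
errors = {
    "haemolysed sample": 412,
    "insufficient volume": 198,
    "mislabelled tube": 95,
    "wrong container": 41,
    "clotted sample": 30,
    "missing request form": 12,
}

total = sum(errors.values())
cumulative, vital_few = 0, []
# Walk categories from most to least frequent, accumulating their share.
for category, count in sorted(errors.items(), key=lambda kv: -kv[1]):
    cumulative += count
    vital_few.append(category)
    if cumulative / total >= 0.8:  # the "vital few" causing ~80% of errors
        break

print(vital_few)
```

In practice, targeting only these top categories is what makes Pareto analysis useful for directing continual-improvement effort.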

  13. Comparison of methodologies for calculating quality measures based on administrative data versus clinical data from an electronic health record system: implications for performance measures.

    PubMed

    Tang, Paul C; Ralston, Mary; Arrigotti, Michelle Fernandez; Qureshi, Lubna; Graham, Justin

    2007-01-01

    New reimbursement policies and pay-for-performance programs to reward providers for producing better outcomes are proliferating. Although electronic health record (EHR) systems could provide essential clinical data upon which to base quality measures, most metrics in use were derived from administrative claims data. We compared commonly used quality measures calculated from administrative data to those derived from clinical data in an EHR based on a random sample of 125 charts of Medicare patients with diabetes. Using standard definitions based on administrative data (which require two visits with an encounter diagnosis of diabetes during the measurement period), only 75% of diabetics determined by manually reviewing the EHR (the gold standard) were identified. In contrast, 97% of diabetics were identified using coded information in the EHR. The discrepancies in identified patients resulted in statistically significant differences in the quality measures for frequency of HbA1c testing, control of blood pressure, frequency of testing for urine protein, and frequency of eye exams for diabetic patients. New development of standardized quality measures should shift from claims-based measures to clinically based measures that can be derived from coded information in an EHR. Using data from EHRs will also leverage their clinical content without adding burden to the care process.
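The denominator discrepancy described above (75% vs. 97% of diabetics identified) comes down to two different inclusion rules. The toy contrast below uses invented patient records and hypothetical coding to show why a claims-style rule ("two visits with a diabetes encounter diagnosis") undercounts relative to an EHR-style rule (diabetes on the coded problem list).

```python
# Invented records: visit encounter diagnoses vs. the coded problem list.
patients = {
    "p1": {"visit_dx": ["E11", "E11"], "problem_list": ["E11"]},
    "p2": {"visit_dx": ["E11"],        "problem_list": ["E11"]},  # only one coded visit
    "p3": {"visit_dx": [],             "problem_list": ["E11"]},  # diabetes managed elsewhere
    "p4": {"visit_dx": [],             "problem_list": []},       # not diabetic
}

# Claims-style rule: at least two visits coded for diabetes in the period.
claims_based = {p for p, r in patients.items() if r["visit_dx"].count("E11") >= 2}
# EHR-style rule: diabetes present on the coded problem list.
ehr_based = {p for p, r in patients.items() if "E11" in r["problem_list"]}

print(sorted(claims_based), sorted(ehr_based))
```

Patients like p2 and p3 are exactly those the study found missing from the claims-based denominator, which then distorts the computed quality measures.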

  14. [Current state and prospects of military personnel health monitoring].

    PubMed

    Rezvantsev, M V; Kuznetsov, S M; Ivanov, V V; Zakurdaev, V V

    2014-01-01

    This article addresses features of health monitoring of Russian Federation Armed Forces military personnel, including its legal and informational provision, the methodological basis of its functioning, and the historical formation and development of social and hygienic monitoring in the Russian Federation Armed Forces. The term "military personnel health monitoring" is defined as an analytical system of constant, long-term observation, analysis, assessment, and study of the factors determining military personnel health, the correlations among these factors, and the management of health risk factors in order to minimize them. The current state of military personnel health monitoring supports the conclusion that the military health system has the forces and resources to implement the state policy of establishing a population health monitoring system. The following directions for improving military personnel health monitoring are proposed: reorganizing the Russian Federation Armed Forces medical service record and report system to bring it closer to the civilian one, and implementing an integrated approach to medical service informatisation, namely, monitoring of military personnel health status and medical service resources. The leading means in this direction are the development and introduction of a system for monitoring an individual serviceman's health status on the basis of an electronic medical record card. It is also proposed to improve the current Russian Federation Armed Forces social and hygienic monitoring through informational interaction between the two subsystems on the basis of a unified military medical service space.

  15. Costs of occupational injury and illness across industries.

    PubMed

    Leigh, J Paul; Waehrer, Geetha; Miller, Ted R; Keenan, Craig

    2004-06-01

    This study ranked industries by estimated total costs and costs per worker. This incidence study of nationwide data was carried out in 1993. The main outcome measure was total cost for medical care, lost productivity, and pain and suffering for the entire United States (US). The analysis was conducted using fatal and nonfatal injury and illness data recorded in large data sets from the US Bureau of Labor Statistics. Cost data were derived from workers' compensation records, estimates of lost wages, and jury awards. Current-value calculations were used to express all costs in 1993 US dollars. The following industries were at the top of the list for average cost (cost per worker): taxicabs, bituminous coal and lignite mining, logging, crushed stone, oil field services, water transportation services, sand and gravel, and trucking. Industries high on the total-cost list were trucking, eating and drinking places, hospitals, grocery stores, nursing homes, motor vehicles, and department stores. Industries at the bottom of the cost-per-worker list included legal services, security brokers, mortgage bankers, security exchanges, and labor union offices. A detailed methodology was developed for ranking industries by total cost and by cost per worker. Ranking by total costs provided information on the total burden of hazards, and ranking by cost per worker provided information on risk. Industries that ranked high on both lists deserve increased research and regulatory attention.
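The dual-ranking idea above (total burden vs. per-worker risk) is easy to sketch. The figures below are invented placeholders, not the study's estimates; they are chosen only so that the toy result echoes the abstract's pattern of trucking leading on total cost and taxicabs on cost per worker.

```python
# Invented figures: industry -> (total cost in $M, number of workers).
industries = {
    "trucking": (4200, 1_800_000),
    "taxicabs": (310, 95_000),
    "hospitals": (3900, 3_500_000),
    "legal services": (120, 900_000),
}

# Rank by total cost (burden) and by cost per worker (risk).
by_total = sorted(industries, key=lambda k: -industries[k][0])
by_per_worker = sorted(industries, key=lambda k: -industries[k][0] / industries[k][1])

print(by_total[0], by_per_worker[0])
```

An industry appearing near the top of both orderings is the paper's criterion for priority attention.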

  16. 42 CFR 485.721 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... information and retrieval of records for research or administrative action. (f) Standard: Location and... information. (a) Standard: Protection of clinical record information. The organization recognizes the confidentiality of clinical record information and provides safeguards against loss, destruction, or unauthorized...

  17. 42 CFR 485.721 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... information and retrieval of records for research or administrative action. (f) Standard: Location and... information. (a) Standard: Protection of clinical record information. The organization recognizes the confidentiality of clinical record information and provides safeguards against loss, destruction, or unauthorized...

  18. 42 CFR 485.721 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information and retrieval of records for research or administrative action. (f) Standard: Location and... information. (a) Standard: Protection of clinical record information. The organization recognizes the confidentiality of clinical record information and provides safeguards against loss, destruction, or unauthorized...

  19. 42 CFR 485.721 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... information and retrieval of records for research or administrative action. (f) Standard: Location and... information. (a) Standard: Protection of clinical record information. The organization recognizes the confidentiality of clinical record information and provides safeguards against loss, destruction, or unauthorized...

  20. 42 CFR 485.721 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... information and retrieval of records for research or administrative action. (f) Standard: Location and... information. (a) Standard: Protection of clinical record information. The organization recognizes the confidentiality of clinical record information and provides safeguards against loss, destruction, or unauthorized...

  1. Architecture for networked electronic patient record systems.

    PubMed

    Takeda, H; Matsumura, Y; Kuwata, S; Nakano, H; Sakamoto, N; Yamamoto, R

    2000-11-01

    There have been two major approaches to the development of networked electronic patient record (EPR) architecture. One uses object-oriented methodologies for constructing the model, as in the GEHR project, Synapses, HL7 RIM and so on. The second approach uses document-oriented methodologies, as applied in examples of HL7 PRA. It is practically beneficial to take advantage of both approaches and to add solution technologies for network security such as PKI. In recognition of the similarity with electronic commerce, a certificate authority acting as a trusted third party will be organised for establishing the networked EPR system. This paper describes a Japanese functional model that has been developed, and proposes a document-object-oriented architecture, which is compared with other existing models.

  2. On the use of EEG or MEG brain imaging tools in neuromarketing research.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio

    2011-01-01

    Here we present an overview of published papers of interest for marketing research employing electroencephalogram (EEG) and magnetoencephalogram (MEG) methods. The appeal of these methodologies lies in their high temporal resolution, as opposed to functional Magnetic Resonance Imaging (fMRI), which is also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution in recent decades with the introduction of advanced signal-processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show what kind of information can be gathered with these methodologies while participants watch marketing-relevant stimuli. This information relates to the memorization of, and the pleasantness evoked by, such stimuli. We note that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. Such information could be unobtainable through the tools used in standard marketing research. We also show an example of how an EEG methodology can be used to analyze cultural differences in the viewing of video commercials for carbonated beverages in Western and Eastern countries.

  4. Completeness and reliability of mortality data in Viet Nam: Implications for the national routine health management information system

    PubMed Central

    Phuong Hoa, Nguyen; Walker, Sue M.; Hill, Peter S.; Rao, Chalapati

    2018-01-01

    Background: Mortality statistics form a crucial component of national Health Management Information Systems (HMIS). However, there are limitations in the availability and quality of mortality data at the national level in Viet Nam. This study assessed the completeness of recorded deaths and the reliability of recorded causes of death (COD) in the A6 death registers in the national routine HMIS in Viet Nam. Methodology and findings: 1477 identified deaths in 2014 were reviewed in two provinces. A capture-recapture method was applied to assess the completeness of the A6 death registers. 1365 household verbal autopsy (VA) interviews were successfully conducted, and these were reviewed by physicians who assigned multiple causes of death and an underlying cause of death (UCOD). These UCODs from VA were then compared with the CODs recorded in the A6 death registers, using kappa scores to assess the reliability of the A6 death register diagnoses. The overall completeness of the A6 death registers in the two provinces was 89.3% (95%CI: 87.8–90.8). No COD recorded in the A6 death registers demonstrated good reliability. Reliability was very low for recording of cardiovascular deaths (kappa = 0.47 for stroke and 0.42 for ischaemic heart disease) and diabetes (kappa = 0.33). Reporting of deaths due to road traffic accidents, HIV, and some cancers showed moderate reliability, with kappa scores ranging from 0.57 to 0.69 (p<0.01). VA methods identify more specific CODs than the A6 death registers, and also allow identification of multiple CODs. Conclusions: The study results suggest that data completeness in the HMIS A6 death registers in the study sample of communes was relatively high (nearly 90%), but triangulation with death records from other sources would improve the completeness of this system. Further, there is an urgent need to enhance the reliability of CODs recorded in the A6 death registers, for which VA methods could be effective. Focussed consultation among stakeholders is needed to develop a suitable mechanism and process for integrating VA methods into the national routine HMIS A6 death registers in Viet Nam. PMID:29370191
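A two-source capture-recapture completeness estimate of the kind used above can be sketched with Chapman's nearly unbiased estimator. The counts below are invented for illustration and are not the study's data.

```python
def chapman_estimate(n1, n2, m):
    """Chapman's two-source capture-recapture estimate of total deaths.
    n1, n2: deaths captured by each source; m: deaths captured by both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 1300 deaths in the register, 1250 found by an
# independent household listing, 1150 matched in both sources.
n_total = chapman_estimate(1300, 1250, 1150)
completeness = 1300 / n_total  # register captures this share of estimated deaths

print(round(n_total), round(100 * completeness, 1))
```

The estimator assumes the two sources capture deaths independently; violations of that assumption bias the completeness figure, which is one reason the study recommends triangulating with additional record sources.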

  5. Flexible data integration and curation using a graph-based approach.

    PubMed

    Croset, Samuel; Rupp, Joachim; Romacker, Martin

    2016-03-15

    The increasing diversity of data available to the biomedical scientist holds promise for better understanding of diseases and discovery of new treatments for patients. In order to provide a complete picture of a biomedical question, data from many different origins need to be combined into a unified representation. During this data integration process, inevitable errors and ambiguities present in the initial sources compromise the quality of the resulting data warehouse and greatly diminish the scientific value of the content. Expensive and time-consuming manual curation is then required to improve the quality of the information. However, it becomes increasingly difficult to dedicate and optimize the resources for data integration projects, as available repositories are growing both in size and in number every day. We present a new generic methodology to identify problematic records, causing what we describe as 'data hairball' structures. The approach is graph-based and relies on two metrics traditionally used in the social sciences: graph density and betweenness centrality. We evaluate and discuss these measures and show their relevance for flexible, optimized and automated data curation and linkage. The methodology focuses on information coherence and correctness to improve the scientific meaningfulness of data integration endeavors, such as knowledge bases and large data warehouses. samuel.croset@roche.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
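The two metrics named in this abstract, graph density and betweenness centrality, can be computed with networkx on a toy integration graph. The graph below is an invented example, not the authors' data: one ambiguous record ("geneX") is wired to many otherwise-unrelated entities, which is the kind of dense, high-betweenness hub the paper flags as a "hairball".

```python
import networkx as nx

G = nx.Graph()
# Hypothetical integration graph: "geneX" links four unrelated entities.
G.add_edges_from(
    [("geneX", n) for n in ("diseaseA", "diseaseB", "drugC", "pathwayD")]
    + [("diseaseA", "drugC")]  # one legitimate direct relation
)

density = nx.density(G)                      # edges present / edges possible
centrality = nx.betweenness_centrality(G)    # share of shortest paths through each node
suspect = max(centrality, key=centrality.get)

print(round(density, 2), suspect)
```

In a real warehouse, records whose local neighborhood shows unusually high density combined with a dominant-betweenness node would be queued for manual curation.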

  6. The blurred vision of Lady Justice for minors with mental disorders: records of the juvenile court in Belgium.

    PubMed

    Merlevede, Sofie; Vander Laenen, Freya; Cappon, Leen

    2014-01-01

    This study examined (1) the information present in juvenile court records in Belgium (Flanders) and (2) whether there are differences in information between records that mention a mental disorder and those that do not. The file study sample included 107 court records, and we used a Pearson's chi-square test and a t-test to analyze the information within those records. Information in juvenile court records varied considerably. This variability was evident when we compared juvenile court records with and without mention of a mental disorder. Significantly more information about school-related problems, the functioning of the minor, and the occurrence of domestic violence was included in records that mentioned a mental disorder compared with records that did not. The content of the juvenile court records varied, particularly with regard to the mental health status of the minor in question. We suggest guidelines to standardize the information contained in juvenile court records. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Application of a multivariate normal distribution methodology to the dissociation of doubly ionized molecules: The DMDS (CH3 -SS-CH3 ) case.

    PubMed

    Varas, Lautaro R; Pontes, F C; Santos, A C F; Coutinho, L H; de Souza, G G B

    2015-09-15

The ion-ion-coincidence mass spectroscopy technique provides useful information about the fragmentation dynamics of doubly and multiply charged ionic species. We advocate the use of a matrix-parameter methodology to represent and interpret the entire ion-ion spectra associated with the ionic dissociation of doubly charged molecules. This method makes it possible, among other things, to infer fragmentation processes and to extract information about overlapped ion-ion coincidences, an important piece of information that is difficult to obtain from previously described methodologies. A Wiley-McLaren time-of-flight mass spectrometer was used to discriminate the positively charged fragment ions resulting from ionization of the sample by a pulsed 800 eV electron beam. We exemplify the application of this methodology by analyzing the fragmentation and ionic dissociation of the dimethyl disulfide (DMDS) molecule as induced by fast electrons. The ion-ion spectrum of the DMDS molecule was obtained at an incident electron energy of 800 eV, and the doubly charged dissociation was represented in matrix form and analyzed using the Multivariate Normal Distribution. The proposed methodology allows us to distinguish information among the [CHnSHn]+/[CH3]+ (n = 1-3) fragment ions in the overlapped regions of the ion-ion coincidence spectra. Using the momentum-balance methodology for the inferred parameters, a secondary decay mechanism is proposed for the formation of the [CHS]+ ion. As an additional check on the methodology, previously published data on the SiF4 molecule were re-analyzed with the present methodology and the results were shown to be statistically equivalent. The use of a Multivariate Normal Distribution allows the whole ion-ion mass spectrum of doubly or multiply ionized molecules to be represented as a combination of parameters and information to be extracted from overlapped data. We have successfully applied this methodology to the analysis of the fragmentation of the DMDS molecule. Copyright © 2015 John Wiley & Sons, Ltd.
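The core ingredient of such an analysis can be sketched compactly: each ion-ion coincidence island in the (t1, t2) time-of-flight plane is parameterised as a bivariate normal component, and overlapping islands become a weighted mixture. The snippet below is an illustrative sketch only (all parameter values are invented, not fitted values from the paper); the correlation parameter rho encodes the island's slope, which the momentum-balance argument interprets physically.

```python
import math

def coincidence_island(t1, t2, mu1, mu2, s1, s2, rho):
    """Bivariate normal density at (t1, t2): one ion-ion coincidence
    island centred on flight times (mu1, mu2); rho encodes the
    momentum correlation between the two fragments (island slope)."""
    z1 = (t1 - mu1) / s1
    z2 = (t2 - mu2) / s2
    q = (z1 * z1 - 2.0 * rho * z1 * z2 + z2 * z2) / (1.0 - rho * rho)
    norm = 2.0 * math.pi * s1 * s2 * math.sqrt(1.0 - rho * rho)
    return math.exp(-0.5 * q) / norm

def mixture(t1, t2, components):
    """Overlapping islands as a weighted mixture of normal components;
    per-event component weights let overlapped coincidences be
    apportioned between fragmentation channels."""
    return sum(w * coincidence_island(t1, t2, *p) for w, p in components)
```

With fitted component parameters in hand, each recorded coincidence event can be attributed to the channel whose weighted density dominates at that (t1, t2) point.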

  8. The Dust Storm Index (DSI): A method for monitoring broadscale wind erosion using meteorological records

    NASA Astrophysics Data System (ADS)

    O'Loingsigh, T.; McTainsh, G. H.; Tews, E. K.; Strong, C. L.; Leys, J. F.; Shinkfield, P.; Tapper, N. J.

    2014-03-01

Wind erosion of soils is a natural process that has shaped semi-arid and arid landscapes for millennia. This paper describes the Dust Storm Index (DSI), a methodology for monitoring wind erosion at continental scale using Australian Bureau of Meteorology (ABM) meteorological observational data collected since the mid-1960s. While the 46-year length of the DSI record is its greatest strength from a wind erosion monitoring perspective, there are a number of technical challenges to its use, because the World Meteorological Organisation (WMO) recording protocols were never established with wind erosion monitoring in mind. Data recording and storage protocols are examined, including the effects of changes to the definition of how observers should interpret and record dust events. A method is described for selecting the 180 long-term ABM stations used in this study, and the limitations of variable observation frequencies between stations are in part resolved. The rationale behind the DSI equation is explained, and the temporal and spatial data visualisation products presented include a long-term national wind erosion record (1965-2011), continental DSI maps, and maps of the erosion event types that are factored into the DSI equation. The DSI is tested against dust concentration data and found to provide an accurate representation of wind erosion activity. As the ABM observational records used here were collected according to WMO protocols, the DSI methodology could be used in all countries with WMO-compatible meteorological observation and recording systems.
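The abstract does not reproduce the DSI equation itself, but its structure, a weighted sum over the recorded dust-event types, can be sketched as follows. The weights below are illustrative placeholders chosen to show the principle (severe dust storms dominate, local dust events contribute least); they are not the published DSI constants.

```python
# Illustrative weights only -- heavier for severe dust storms, light
# for local dust events; the published DSI constants may differ.
WEIGHTS = {"severe_dust_storm": 5.0,
           "moderate_dust_storm": 1.0,
           "local_dust_event": 0.05}

def dust_index(event_counts):
    """Weighted sum of dust-event counts from one station's record,
    e.g. tallied per year from meteorological observer reports."""
    return sum(WEIGHTS[kind] * n for kind, n in event_counts.items())
```

Mapping station-level values of such an index over time then yields the kind of continental, multi-decade erosion record the paper describes.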

  9. For the Record: Information on Individuals [and] Remote Access to Corporate Public Records: Scanning the Field [and] The INCORP Files: Extracting Company Information from State Files.

    ERIC Educational Resources Information Center

    Paul, Nora; And Others

    1991-01-01

Three articles assess increased availability of information about individuals and corporations. The first discusses databases that provide information on individuals--e.g., court records, real estate transactions, motor vehicle records, and credit information. The second compares databases that provide corporate information, and the third…

  10. Using Social Networking to Understand Social Networks: Analysis of a Mobile Phone Closed User Group Used by a Ghanaian Health Team

    PubMed Central

    Akosah, Eric; Ohemeng-Dapaah, Seth; Sakyi Baah, Joseph; Kanter, Andrew S

    2013-01-01

Background The network structure of an organization influences how well or poorly an organization communicates and manages its resources. In the Millennium Villages Project site in Bonsaaso, Ghana, a mobile phone closed user group has been introduced for use by the Bonsaaso Millennium Villages Project Health Team and other key individuals. No assessment of the benefits of or barriers to the use of the closed user group had been carried out. Objective The purpose of this research was to make the case for applying social network analysis methods in health systems research, specifically in relation to mobile health. Methods This study used mobile phone voice records of, conducted interviews with, and reviewed call journals kept by a mobile phone closed user group consisting of the Bonsaaso Millennium Villages Project Health Team. Social network analysis methodology, complemented by a qualitative component, was used. Monthly voice data of the closed user group from Airtel Bharti Ghana were analyzed using UCINET, and visual depictions of the network were created using NetDraw. Interviews and call journals kept by informants were analyzed using NVivo. Results The methodology was successful in helping identify effective organizational structure. Members of the Health Management Team were the more central players in the network, rather than the Community Health Nurses (who might have been expected to be central). Conclusions Social network analysis methodology can be used to determine the most productive structure for an organization or team, identify gaps in communication, identify key actors with the greatest influence, and more. In conclusion, this methodology can be a useful analytical tool, especially in the context of mobile health, health services, and operational and managerial research. PMID:23552721

  11. Using social networking to understand social networks: analysis of a mobile phone closed user group used by a Ghanaian health team.

    PubMed

    Kaonga, Nadi Nina; Labrique, Alain; Mechael, Patricia; Akosah, Eric; Ohemeng-Dapaah, Seth; Sakyi Baah, Joseph; Kodie, Richmond; Kanter, Andrew S; Levine, Orin

    2013-04-03

The network structure of an organization influences how well or poorly an organization communicates and manages its resources. In the Millennium Villages Project site in Bonsaaso, Ghana, a mobile phone closed user group has been introduced for use by the Bonsaaso Millennium Villages Project Health Team and other key individuals. No assessment of the benefits of or barriers to the use of the closed user group had been carried out. The purpose of this research was to make the case for applying social network analysis methods in health systems research, specifically in relation to mobile health. This study used mobile phone voice records of, conducted interviews with, and reviewed call journals kept by a mobile phone closed user group consisting of the Bonsaaso Millennium Villages Project Health Team. Social network analysis methodology, complemented by a qualitative component, was used. Monthly voice data of the closed user group from Airtel Bharti Ghana were analyzed using UCINET, and visual depictions of the network were created using NetDraw. Interviews and call journals kept by informants were analyzed using NVivo. The methodology was successful in helping identify effective organizational structure. Members of the Health Management Team were the more central players in the network, rather than the Community Health Nurses (who might have been expected to be central). Social network analysis methodology can be used to determine the most productive structure for an organization or team, identify gaps in communication, identify key actors with the greatest influence, and more. In conclusion, this methodology can be a useful analytical tool, especially in the context of mobile health, health services, and operational and managerial research.
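As a minimal illustration of the centrality idea behind this kind of analysis (UCINET computes far richer measures), the pure-Python sketch below derives degree centrality from a hypothetical call log. All names and calls are invented for illustration, not data from the study; the toy result simply mirrors the pattern of one highly connected coordinator.

```python
from collections import defaultdict

# Hypothetical call log from a closed user group: (caller, callee) pairs.
calls = [("manager", "nurse_a"), ("manager", "nurse_b"),
         ("manager", "driver"), ("nurse_a", "nurse_b"),
         ("manager", "nurse_a")]

# Build an undirected contact network: who has ever spoken with whom.
neighbours = defaultdict(set)
for caller, callee in calls:
    neighbours[caller].add(callee)
    neighbours[callee].add(caller)

# Degree centrality: fraction of the other members each person
# is directly connected to.
n = len(neighbours)
degree_centrality = {member: len(peers) / (n - 1)
                     for member, peers in neighbours.items()}
most_central = max(degree_centrality, key=degree_centrality.get)
```

Visualising the same adjacency structure (as NetDraw does) then makes communication gaps, members with few or no ties, immediately visible.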

  12. Naturalistic Observation of Health-Relevant Social Processes: The Electronically Activated Recorder (EAR) Methodology in Psychosomatics

    PubMed Central

    Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große

    2012-01-01

    This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  13. Naturalistic observation of health-relevant social processes: the electronically activated recorder methodology in psychosomatics.

    PubMed

    Mehl, Matthias R; Robbins, Megan L; Deters, Fenne Große

    2012-05-01

    This article introduces a novel observational ambulatory monitoring method called the electronically activated recorder (EAR). The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants' momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people's days as they naturally unfold. In sampling only a fraction of the time, it protects participants' privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer's account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, subtle emotional expressions). This article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior; (b) provide ecological observational measures of health-related social processes that are independent of self-report; and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional self-report-based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential aspects (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health.
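The "sampling only a fraction of the time" design described above can be sketched as a simple scheduler. The snippet length, sampling period, and jitter below are illustrative assumptions, not the EAR's actual parameters; the point is that even frequent snippets cover only a small slice of the day, which is what keeps the method privacy-preserving and feasible for large studies.

```python
import random

def ear_schedule(day_minutes=16 * 60, snippet_s=30, period_min=12, jitter_min=2):
    """Hypothetical EAR-style sampling plan: one short ambient-sound
    snippet roughly every `period_min` minutes across a 16-hour waking
    day, with random jitter so recording moments are unpredictable."""
    t, starts = 0.0, []
    while t < day_minutes:
        starts.append(t)  # snippet start time, in minutes from waking
        t += period_min + random.uniform(-jitter_min, jitter_min)
    # Fraction of the day actually recorded.
    coverage = len(starts) * snippet_s / (day_minutes * 60)
    return starts, coverage
```

With the illustrative defaults this yields on the order of 80 snippets a day while recording roughly 5% of it.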

  14. Disruption of Information Technology Projects: The Reactive Decoupling of Project Management Methodologies

    ERIC Educational Resources Information Center

    Schmitz, Kurt W.

    2013-01-01

    Information Technology projects have migrated toward two dominant Project Management (PM) methodologies. Plan-driven practices provide organizational control through highly structured plans, schedules, and specifications that facilitate oversight by hierarchical bureaucracies. In contrast, agile practices emphasize empowered teams using flexible…

  15. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  16. 42 CFR 485.60 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... retrieval and compilation of information. (a) Standard: Content. Each clinical record must contain sufficient information to identify the patient clearly and to justify the diagnosis and treatment. Entries in...: Protection of clinical record information. The facility must safeguard clinical record information against...

  17. 42 CFR 485.60 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... retrieval and compilation of information. (a) Standard: Content. Each clinical record must contain sufficient information to identify the patient clearly and to justify the diagnosis and treatment. Entries in...: Protection of clinical record information. The facility must safeguard clinical record information against...

  18. 42 CFR 485.60 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... retrieval and compilation of information. (a) Standard: Content. Each clinical record must contain sufficient information to identify the patient clearly and to justify the diagnosis and treatment. Entries in...: Protection of clinical record information. The facility must safeguard clinical record information against...

  19. 42 CFR 485.60 - Condition of participation: Clinical records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... retrieval and compilation of information. (a) Standard: Content. Each clinical record must contain sufficient information to identify the patient clearly and to justify the diagnosis and treatment. Entries in...: Protection of clinical record information. The facility must safeguard clinical record information against...

  20. 76 FR 54743 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ...; and also including information in the following categories: Personnel: Records concern military and... permanent records providing core information technology to records management support programs (Freedom of..., including any personal identifiers or contact information. FOR FURTHER INFORMATION CONTACT: Mr. Leroy Jones...

  1. Tracing the decision-making process of physicians with a Decision Process Matrix.

    PubMed

    Hausmann, Daniel; Zulian, Cristina; Battegay, Edouard; Zimmerli, Lukas

    2016-10-18

Decision-making processes in a medical setting are complex, dynamic and under time pressure, often with serious consequences for a patient's condition. The principal aim of the present study was to trace and map the individual diagnostic process of real medical cases using a Decision Process Matrix (DPM). The naturalistic decision-making processes of 11 residents across a total of 55 medical cases were recorded in an emergency department, and a DPM was drawn up according to a semi-structured technique following four steps: 1) observing and recording relevant information throughout the entire diagnostic process, 2) assessing options in terms of suspected diagnoses, 3) drawing up an initial version of the DPM, and 4) verifying the DPM while adding the confidence ratings. The DPM comprised an average of 3.2 suspected diagnoses and 7.9 information units (cues). The following three-phase pattern could be observed: option generation, option verification, and final diagnosis determination. Residents strove for the highest possible level of confidence before making the final diagnosis (in two-thirds of the medical cases with a rating of practically certain) or excluding suspected diagnoses (with practically impossible in half of the cases). The following challenges have to be addressed in the future: real-time capturing of emerging suspected diagnoses in the memory of the physician, definition of meaningful information units, and a more contemporary measurement of confidence. The DPM is a useful tool for tracing real, individual diagnostic processes. The methodological approach with the DPM allows further investigation of the underlying cognitive diagnostic processes on a theoretical level and improvement of individual clinical reasoning skills in practice.
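The paper does not publish the DPM's exact data layout, so the sketch below is a hypothetical rendering of the idea only: suspected diagnoses as rows, information units (cues) as columns, each cell recording whether the cue supports, contradicts, or is neutral for that option, plus a per-option confidence rating. All diagnoses, cues, and values are invented.

```python
# Hypothetical Decision Process Matrix: rows are suspected diagnoses,
# columns are information units (cues); +1 = cue supports the option,
# -1 = contradicts it, 0 = neutral.
dpm = {
    "pneumonia":          {"fever": +1, "crackles": +1, "d_dimer_high": 0},
    "pulmonary_embolism": {"fever": 0,  "crackles": -1, "d_dimer_high": +1},
}
# Confidence ratings on the scale the study describes.
confidence = {"pneumonia": "practically certain",
              "pulmonary_embolism": "practically impossible"}

def cue_support(dpm, diagnosis):
    """Net cue support accumulated for one suspected diagnosis."""
    return sum(dpm[diagnosis].values())

# Final diagnosis determination: the option with the strongest support.
final = max(dpm, key=lambda d: cue_support(dpm, d))
```

The three observed phases map naturally onto this structure: option generation adds rows, option verification fills cells and updates confidence, and final diagnosis determination selects the best-supported row.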

  2. Optimising Use of Electronic Health Records to Describe the Presentation of Rheumatoid Arthritis in Primary Care: A Strategy for Developing Code Lists

    PubMed Central

    Nicholson, Amanda; Ford, Elizabeth; Davies, Kevin A.; Smith, Helen E.; Rait, Greta; Tate, A. Rosemary; Petersen, Irene; Cassell, Jackie

    2013-01-01

Background Research using electronic health records (EHRs) relies heavily on coded clinical data. Due to variation in coding practices, it can be difficult to aggregate the codes for a condition in order to define cases. This paper describes a methodology to develop ‘indicator markers’ found in patients with early rheumatoid arthritis (RA); these are a broader range of codes which may allow a probabilistic case definition to use in cases where no diagnostic code is yet recorded. Methods We examined EHRs of 5,843 patients in the General Practice Research Database, aged ≥30y, with a first coded diagnosis of RA between 2005 and 2008. Lists of indicator markers for RA were developed initially by panels of clinicians drawing up code-lists and then modified based on scrutiny of available data. The prevalence of indicator markers, and their temporal relationship to RA codes, was examined in patients from 3y before to 14d after recorded RA diagnosis. Findings Indicator markers were common throughout the EHRs of RA patients, with 83.5% having 2 or more markers. 34% of patients received a disease-specific prescription before RA was coded; 42% had a referral to rheumatology, and 63% had a test for rheumatoid factor. 65% had at least one joint symptom or sign recorded, and in 44% this was at least 6 months before recorded RA diagnosis. Conclusion Indicator markers of RA may be valuable for case definition in cases which do not yet have a diagnostic code. The clinical diagnosis of RA is likely to occur some months before it is coded, shown by markers frequently occurring ≥6 months before recorded diagnosis. It is difficult to differentiate delay in diagnosis from delay in recording. Information concealed in free text may be required for the accurate identification of patients and to assess the quality of care in general practice. PMID:23451024
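The probabilistic case-definition idea translates directly into code. In this hypothetical sketch (the marker names, patient records, and threshold are all invented for illustration, not the study's code lists), a patient is flagged as a probable early-RA case when at least two indicator markers appear in the record, even if no diagnostic code is present yet:

```python
# Hypothetical indicator-marker code list for early RA -- in the study
# these were drawn up by clinician panels and refined against the data.
INDICATOR_CODES = {"rheum_referral", "rf_test", "dmard_rx", "joint_pain"}

# Hypothetical coded-event histories per patient.
patients = {
    "p1": ["joint_pain", "rf_test", "dmard_rx"],
    "p2": ["cough"],
    "p3": ["joint_pain"],
}

def probable_cases(patients, threshold=2):
    """Patients whose records contain at least `threshold` distinct
    indicator markers -- a probabilistic case definition."""
    return {pid for pid, codes in patients.items()
            if len(INDICATOR_CODES.intersection(codes)) >= threshold}
```

Varying the threshold trades sensitivity against specificity, which is exactly the calibration question a marker-based definition raises.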

  3. The trials methodological research agenda: results from a priority setting exercise.

    PubMed

    Tudur Smith, Catrin; Hickey, Helen; Clarke, Mike; Blazeby, Jane; Williamson, Paula

    2014-01-23

    Research into the methods used in the design, conduct, analysis, and reporting of clinical trials is essential to ensure that effective methods are available and that clinical decisions made using results from trials are based on the best available evidence, which is reliable and robust. An on-line Delphi survey of 48 UK Clinical Research Collaboration registered Clinical Trials Units (CTUs) was undertaken. During round one, CTU Directors were asked to identify important topics that require methodological research. During round two, their opinion about the level of importance of each topic was recorded, and during round three, they were asked to review the group's average opinion and revise their previous opinion if appropriate. Direct reminders were sent to maximise the number of responses at each round. Results are summarised using descriptive methods. Forty one (85%) CTU Directors responded to at least one round of the Delphi process: 25 (52%) responded in round one, 32 (67%) responded in round two, 24 (50%) responded in round three. There were only 12 (25%) who responded to all three rounds and 18 (38%) who responded to both rounds two and three. Consensus was achieved amongst CTU Directors that the top three priorities for trials methodological research were 'Research into methods to boost recruitment in trials' (considered the highest priority), 'Methods to minimise attrition' and 'Choosing appropriate outcomes to measure'. Fifty other topics were included in the list of priorities and consensus was reached that two topics, 'Radiotherapy study designs' and 'Low carbon trials', were not priorities. This priority setting exercise has identified the research topics felt to be most important to the key stakeholder group of Directors of UKCRC registered CTUs. 
The use of robust methodology to identify these priorities will help ensure that this work informs the trials methodological research agenda, with a focus on topics that will have most impact and relevance.

  4. "Could I return to my life?" Integrated Narrative Nursing Model in Education (INNE).

    PubMed

    Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sulla, Francesco; Sollami, Alfonso; Taffurelli, Chiara

    2018-03-28

The Integrated Narrative Nursing Model (INNM) is an approach that integrates the qualitative methodology typical of the human sciences with the quantitative methodology more often associated with the natural sciences. This complex model, which combines a focus on narrative with quantitative measures, has recently been applied effectively to the assessment of chronic patients. In this study, the model is applied to the planning phase of education (Integrated Narrative Nursing Education, INNE) and proves to be a valid instrument for promoting the current educational paradigm, which is centered on the engagement of both the patient and the caregiver in their own path of care. The aim of this study is therefore to describe the nurse's strategy in planning an educational intervention using the INNE model. The case of a 70-year-old woman with pulmonary neoplasm is described at her first admission to hospice. Each step conducted by the reference nurse, who uses INNE to record the nurse-patient narrative and collect subsequent questionnaires in order to create a shared educational plan, is also described. The information collected was analyzed, following a grounded methodology, at the following four levels: I. Needs Assessment, II. Narrative Diagnosis, III. Quantitative Outcome, IV. Integrated Outcome. Step IV, derived from the integration of all levels of analysis, allows a nurse to define, even graphically, a conceptual map of the patient's needs, resources and perspectives in a completely tailored manner. The INNE model offers valid methodological support for the professional who intends to educate the patient through an inter-subjective and engaged pathway between the professional, the patient and the socio-relational context. It requires a complex vision that combines processes and methods demanding a steady scientific basis and advanced methodological expertise with active listening and empathy, skills which require emotional intelligence.

  5. The trials methodological research agenda: results from a priority setting exercise

    PubMed Central

    2014-01-01

    Background Research into the methods used in the design, conduct, analysis, and reporting of clinical trials is essential to ensure that effective methods are available and that clinical decisions made using results from trials are based on the best available evidence, which is reliable and robust. Methods An on-line Delphi survey of 48 UK Clinical Research Collaboration registered Clinical Trials Units (CTUs) was undertaken. During round one, CTU Directors were asked to identify important topics that require methodological research. During round two, their opinion about the level of importance of each topic was recorded, and during round three, they were asked to review the group’s average opinion and revise their previous opinion if appropriate. Direct reminders were sent to maximise the number of responses at each round. Results are summarised using descriptive methods. Results Forty one (85%) CTU Directors responded to at least one round of the Delphi process: 25 (52%) responded in round one, 32 (67%) responded in round two, 24 (50%) responded in round three. There were only 12 (25%) who responded to all three rounds and 18 (38%) who responded to both rounds two and three. Consensus was achieved amongst CTU Directors that the top three priorities for trials methodological research were ‘Research into methods to boost recruitment in trials’ (considered the highest priority), ‘Methods to minimise attrition’ and ‘Choosing appropriate outcomes to measure’. Fifty other topics were included in the list of priorities and consensus was reached that two topics, ‘Radiotherapy study designs’ and ‘Low carbon trials’, were not priorities. Conclusions This priority setting exercise has identified the research topics felt to be most important to the key stakeholder group of Directors of UKCRC registered CTUs. 
The use of robust methodology to identify these priorities will help ensure that this work informs the trials methodological research agenda, with a focus on topics that will have most impact and relevance. PMID:24456928

  6. 36 CFR 1236.10 - What records management controls must agencies establish for records in electronic information...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Implementing Electronic Information Systems § 1236.10 What records management controls must agencies establish for records in electronic information systems? The following types of records management controls are... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false What records management...

  7. 36 CFR 1236.10 - What records management controls must agencies establish for records in electronic information...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Implementing Electronic Information Systems § 1236.10 What records management controls must agencies establish for records in electronic information systems? The following types of records management controls are... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false What records management...

  8. 36 CFR 1236.10 - What records management controls must agencies establish for records in electronic information...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Implementing Electronic Information Systems § 1236.10 What records management controls must agencies establish for records in electronic information systems? The following types of records management controls are... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false What records management...

  9. 36 CFR 1236.10 - What records management controls must agencies establish for records in electronic information...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Implementing Electronic Information Systems § 1236.10 What records management controls must agencies establish for records in electronic information systems? The following types of records management controls are... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true What records management...

  10. 36 CFR 1236.10 - What records management controls must agencies establish for records in electronic information...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Implementing Electronic Information Systems § 1236.10 What records management controls must agencies establish for records in electronic information systems? The following types of records management controls are... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false What records management...

  11. Comparison of two drug safety signals in a pharmacovigilance data mining framework.

    PubMed

    Tubert-Bitter, Pascale; Bégaud, Bernard; Ahmed, Ismaïl

    2016-04-01

    Since adverse drug reactions are a major public health concern, early detection of drug safety signals has become a top priority for regulatory agencies and the pharmaceutical industry. Quantitative methods for analyzing spontaneous reporting material recorded in pharmacovigilance databases through data mining have been proposed over recent decades and are increasingly used to flag potential safety problems. While automated data mining is motivated by the usually huge size of pharmacovigilance databases, it does not systematically produce relevant alerts. Moreover, each detected signal requires appropriate assessment that may involve investigation of the whole therapeutic class. The goal of this article is to provide a methodology for comparing two detected signals. It is nested within the automated surveillance framework because (1) no extra information is required and (2) no simple inference on the actual risks can be extrapolated from spontaneous reporting data. We designed our methodology on the basis of two classical methods used for automated signal detection: the Bayesian Gamma Poisson Shrinker and the frequentist Proportional Reporting Ratio. A simulation study was conducted to assess the performances of both proposed methods. The latter were used to compare cardiovascular signals for two HIV treatments from the French pharmacovigilance database. © The Author(s) 2012.
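    The frequentist Proportional Reporting Ratio named in this abstract is conventionally computed from a 2×2 contingency table of spontaneous reports. A minimal sketch, with hypothetical counts (the paper's actual data and thresholds are not given here):

    ```python
    # Illustrative sketch of the Proportional Reporting Ratio (PRR) from a
    # standard 2x2 contingency table of spontaneous reports. Counts are toy
    # values, not taken from the study.

    def prr(a, b, c, d):
        """PRR = [a / (a + b)] / [c / (c + d)], where
        a = reports of the event for the drug of interest,
        b = reports of other events for that drug,
        c = reports of the event for all other drugs,
        d = reports of other events for all other drugs."""
        return (a / (a + b)) / (c / (c + d))

    # Hypothetical counts: the event appears in 30 of 200 reports for the
    # drug, versus 100 of 10000 reports for the rest of the database.
    signal = prr(a=30, b=170, c=100, d=9900)
    print(round(signal, 2))  # 15.0
    ```

    A PRR well above 1 (here 15.0) flags disproportionate reporting; in practice a count threshold on `a` is usually applied before flagging.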

  12. Multichannel myopic deconvolution in underwater acoustic channels via low-rank recovery

    PubMed Central

    Tian, Ning; Byun, Sung-Hoon; Sabra, Karim; Romberg, Justin

    2017-01-01

    This paper presents a technique for solving the multichannel blind deconvolution problem. The authors observe the convolution of a single (unknown) source with K different (unknown) channel responses; from these channel outputs, the authors want to estimate both the source and the channel responses. The authors show how this classical signal processing problem can be viewed as solving a system of bilinear equations, and in turn can be recast as recovering a rank-1 matrix from a set of linear observations. Results of prior studies in the area of low-rank matrix recovery have identified effective convex relaxations for problems of this type and efficient, scalable heuristic solvers that enable these techniques to work with thousands of unknown variables. The authors show how a priori information about the channels can be used to build a linear model for the channels, which in turn makes solving these systems of equations well-posed. This study demonstrates the robustness of this methodology to measurement noises and parametrization errors of the channel impulse responses with several stylized and shallow water acoustic channel simulations. The performance of this methodology is also verified experimentally using shipping noise recorded on short bottom-mounted vertical line arrays. PMID:28599565
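    The recasting of bilinear convolution equations as rank-1 matrix recovery rests on a "lifting" identity: each output sample of a convolution is a linear functional of the outer product of source and channel. A toy sketch of that identity (signals below are arbitrary assumptions, not the paper's acoustic data):

    ```python
    # Sketch of the "lifting" behind the abstract: every sample of
    # y = s * h sums one anti-diagonal of the rank-1 matrix M = outer(s, h),
    # so the bilinear deconvolution equations become linear measurements of
    # a rank-1 matrix. Toy signals only.

    def convolve(s, h):
        """Full linear convolution of two sequences."""
        y = [0.0] * (len(s) + len(h) - 1)
        for i, si in enumerate(s):
            for j, hj in enumerate(h):
                y[i + j] += si * hj
        return y

    def outer(s, h):
        """Rank-1 lifted matrix M with M[i][j] = s[i] * h[j]."""
        return [[si * hj for hj in h] for si in s]

    def conv_from_lifted(M, n_out):
        """y[n] = sum of M's entries on anti-diagonal i + j == n:
        linear in the entries of M."""
        y = [0.0] * n_out
        for i, row in enumerate(M):
            for j, mij in enumerate(row):
                y[i + j] += mij
        return y

    s = [1.0, -2.0, 0.5]           # unknown source (toy)
    h = [0.3, 0.0, 1.0, -0.7]      # unknown channel response (toy)

    direct = convolve(s, h)
    lifted = conv_from_lifted(outer(s, h), len(s) + len(h) - 1)
    print(direct == lifted)  # True
    ```

    Recovering the rank-1 matrix from such linear measurements (e.g. via nuclear-norm relaxation, as the low-rank recovery literature cited in the abstract does) then yields both `s` and `h` up to scale.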

  13. Antecedents of narcotic use and addiction. A study of 898 Vietnam veterans.

    PubMed

    Helzer, J E; Robins, L N; Davis, D H

    1976-02-01

    Previous studies of predictors of narcotic abuse have been retrospective and based on samples of long-term addicts obtained from legal or medical channels. There are several methodological problems in this approach. The present study is an attempt to test certain alleged predictors of narcotic use in a cohort of 898 Vietnam veterans. The design overcomes several of the methodological weaknesses of previous studies. Eight variables that have been reported as predictors of drug use or addiction in the drug literature were inquired about during a personal interview which covered the premilitary life of each subject. The antecedent variables were socioeconomic background, inner city residence, psychiatric illness, broken home, race, employment history, education and antisocial history. Using information obtained from interviews and military records, we then tested the predictive value of each of these antecedents by comparing narcotic use and addiction in Vietnam and use after Vietnam in men differing with respect to each antecedent. Results indicate that some of the variables were very poor, and others very good, predictors of the various levels of narcotic involvement. The predictive value and overall importance of each of the variables we tested are discussed.

  14. Knowledge management in secondary pharmaceutical manufacturing by mining of data historians-A proof-of-concept study.

    PubMed

    Meneghetti, Natascia; Facco, Pierantonio; Bezzo, Fabrizio; Himawan, Chrismono; Zomer, Simeone; Barolo, Massimiliano

    2016-05-30

    In this proof-of-concept study, a methodology is proposed to systematically analyze large data historians of secondary pharmaceutical manufacturing systems using data mining techniques. The objective is to develop an approach that automatically retrieves operation-relevant information to assist the management in the periodic review of a manufacturing system. The proposed methodology allows one to automatically perform three tasks: the identification of single batches within the entire data sequence of the historical dataset, the identification of distinct operating phases within each batch, and the characterization of a batch with respect to an assigned multivariate set of operating characteristics. The approach is tested on a six-month dataset of a commercial-scale granulation/drying system, where several million data entries are recorded. The quality of results and the generality of the approach indicate that there is a strong potential for extending the method to even larger historical datasets and to different operations, thus making it an advanced PAT tool that can assist the implementation of continual improvement paradigms within a quality-by-design framework. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Developing Mobile- and BIM-Based Integrated Visual Facility Maintenance Management System

    PubMed Central

    Su, Yu-Chih

    2013-01-01

    Facility maintenance management (FMM) has become an important topic for research on the operation phase of the construction life cycle. Managing FMM effectively is extremely difficult owing to various factors and environments. One of the difficulties is the limited ability of 2D graphics to depict maintenance service. Building information modeling (BIM) uses precise geometry and relevant data to support the maintenance service of facilities depicted in 3D object-oriented CAD. This paper proposes a new and practical methodology with application to FMM using BIM technology. Using BIM technology, this study proposes a BIM-based facility maintenance management (BIMFMM) system for maintenance staff in the operation and maintenance phase. The BIMFMM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FMM practice. Using the BIMFMM system, maintenance staff can access and review 3D BIM models for updating related maintenance records in a digital format. Moreover, this study presents a generic system architecture and its implementation. The combined results demonstrate that a BIMFMM-like system can be an effective visual FMM tool. PMID:24227995

  16. Use of pragmatism to explore women's experiences of traumatic brain injury: a kaleidoscopic view of the world.

    PubMed

    O'Reilly, Kate; Peters, Kath; Wilson, Nathan; Kwok, Cannas

    2018-03-16

    Although more men than women sustain a traumatic brain injury (TBI), approximately one quarter of people with TBIs are women. The experiences of TBI reported in the literature are informed from the masculine perspective and do not adequately represent women's experiences. Pragmatism provides an overarching methodological framework to explore and critique a broader perspective of health, including psychosocial, cultural, spiritual, political and environmental factors, while attempting to address gender inequity. To describe the philosophical background validating the use of pragmatism to research women's experiences of TBI. Given the limited understanding of the interplay of socially constructed barriers with the complex impairments women have following TBI, a novel approach to research is required. Pragmatism offers a way to incorporate critical thinking and advocacy into research designs. The critical feminist transformative framework presented in this paper demonstrates the strengths of using pragmatism as a framework to explore complex phenomena. This paper illustrates how methodology, which is influenced by various philosophical perspectives, can be woven throughout the design of a research project. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  17. 17 CFR 160.30 - Procedures to safeguard customer records and information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... customer records and information. 160.30 Section 160.30 Commodity and Securities Exchanges COMMODITY... Date § 160.30 Procedures to safeguard customer records and information. Every futures commission... physical safeguards for the protection of customer records and information. These policies and procedures...

  18. Physician Interactions with Electronic Health Records in Primary Care

    PubMed Central

    Montague, Enid; Asan, Onur

    2013-01-01

    Objective It is essential to design technologies and systems that promote appropriate interactions between physicians and patients. This study explored how physicians interact with Electronic Health Records (EHRs) to understand the qualities of the interaction between the physician and the EHR that may contribute to positive physician-patient interactions. Study Design Video-taped observations of 100 medical consultations were used to evaluate interaction patterns between physicians and EHRs. Quantified observational methods were used to contribute to ecological validity. Methods Ten primary care physicians and 100 patients from five clinics participated in the study. Clinical encounters were recorded with video cameras and coded using a validated objective coding methodology in order to examine how physicians interact with electronic health records. Results Three distinct styles were identified that characterize physician interactions with the EHR: technology-centered, human-centered, and mixed. Physicians who used a technology-centered style spent more time typing and gazing at the computer during the visit. Physicians who used a mixed style shifted their attention and body language between their patients and the technology throughout the visit. Physicians who used the human-centered style spent the least amount of time typing and focused more on the patient. Conclusion A variety of EHR interaction styles may be effective in facilitating patient-centered care. However, potential drawbacks of each style exist and are discussed. Future research on this topic and design strategies for effective health information technology in primary care are also discussed. PMID:24009982

  19. Space-time precipitation extremes for urban hydrology

    NASA Astrophysics Data System (ADS)

    Bardossy, A.; Pegram, G. G. S.

    2017-12-01

    Precipitation extremes are essential for hydrological design. In urban hydrology, intensity-duration-frequency curves (IDFs) are estimated from observation records to design sewer systems. The conventional approaches seldom consider the areal extent of events; when they do, duration-dependent areal reduction factors (ARFs) are applied. In this contribution we investigate the influence of the size of the target urban area on the frequency of occurrence of extremes. We introduce two new concepts, (i) the maximum over an area and (ii) the sub-areal extremes, and discuss their properties. The space-time dependence of extremes strongly influences these statistics. The findings of this presentation show that the risk of urban flooding is routinely underestimated. We do this by sampling a long sequence of radar rainfall fields of 1 km resolution, rather than the usual limited information from gauge records at scattered point locations. The procedure we use is to generate 20 years of plausible 'radar' fields of 5-minute precipitation on a square frame of 128x128 one-kilometer pixels and sample them in a regimented way. We find that the traditional calculations underestimate the extremes (by up to 30% to 50%, depending on size and duration) and we show how they can be revised sensibly. The methodology we devise from simulated radar fields is checked against the records of a dense network of pluviometers covered by a radar in Baden-Württemberg, with a (regrettably) short 4-year record, as proof of concept.
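    The "maximum over an area" concept can be illustrated with a toy sketch (not the authors' simulation): at every time step the areal maximum dominates the value at any fixed gauge pixel, so statistics built from a single gauge record cannot exceed, and typically understate, the areal extreme. The grid size and the random field model below are arbitrary assumptions:

    ```python
    # Toy illustration of areal vs. point maxima on a synthetic rainfall
    # grid. The field model (i.i.d. exponential intensities) and sizes are
    # assumptions for demonstration only; real fields are space-time
    # correlated, which is exactly what the study quantifies.
    import random

    random.seed(0)
    n_steps, n_pixels = 1000, 64   # time steps x pixels covering the "area"

    # Synthetic 5-minute intensities: one value per pixel per time step.
    fields = [[random.expovariate(1.0) for _ in range(n_pixels)]
              for _ in range(n_steps)]

    gauge = 0  # a single gauge sits at pixel 0
    point_record = [row[gauge] for row in fields]
    areal_record = [max(row) for row in fields]

    # The areal maximum dominates the point record at every step.
    print(max(areal_record) >= max(point_record))  # True
    print(round(max(areal_record) / max(point_record), 2))
    ```

    With correlated real fields the gap between areal and point extremes is smaller than in this independent toy case, which is why duration-dependent areal reduction factors are needed rather than a fixed ratio.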

  20. Deep Neural Architectures for Mapping Scalp to Intracranial EEG.

    PubMed

    Antoniades, Andreas; Spyrou, Loukianos; Martin-Lopez, David; Valentin, Antonio; Alarcon, Gonzalo; Sanei, Saeid; Took, Clive Cheong

    2018-03-19

    Data are often plagued by noise that encumbers the machine learning of clinically useful biomarkers, and electroencephalogram (EEG) data are no exception. Intracranial EEG (iEEG) data enhance the training of deep learning models of the human brain, yet are often prohibitive due to the invasive recording process. A more convenient alternative is to record brain activity using scalp electrodes. However, the inherent noise associated with scalp EEG data often impedes the learning process of neural models, yielding substandard performance. Here, an ensemble deep learning architecture for nonlinearly mapping scalp to iEEG data is proposed. The proposed architecture exploits the information from a limited number of joint scalp-intracranial recordings to establish a novel methodology for detecting epileptic discharges from the sEEG of a general population of subjects. Statistical tests and qualitative analysis have revealed that the generated pseudo-intracranial data are highly correlated with the true intracranial data. This facilitated the detection of IEDs from the scalp recordings, where such waveforms are often not visible. As a real-world clinical application, these pseudo-iEEGs are then used by a convolutional neural network for the automated classification of intracranial epileptic discharge (IED) and non-IED trials in the context of epilepsy analysis. Although the aim of this work was to circumvent the unavailability of iEEG and the limitations of sEEG, we have achieved a classification accuracy of 68%, an increase of 6% over the previously proposed linear regression mapping.
