Gillum, Richard F
2013-10-01
A major transition is underway in documentation of patient-related data in clinical settings with rapidly accelerating adoption of the electronic health record and electronic medical record. This article examines the history of the development of medical records in the West in order to suggest lessons applicable to the current transition. The first documented major transition in the evolution of the clinical medical record occurred in antiquity, with the development of written case history reports for didactic purposes. Benefiting from Classical and Hellenistic models earlier than physicians in the West, medieval Islamic physicians continued the development of case histories for didactic use. A forerunner of modern medical records first appeared in Paris and Berlin by the early 19th century. Development of the clinical record in America was pioneered in the 19th century in major teaching hospitals. However, a clinical medical record useful for direct patient care in hospital and ambulatory settings was not developed until the 20th century. Several lessons are drawn from the 4000-year history of the medical record that may help physicians improve patient care in the digital age. Copyright © 2013 Elsevier Inc. All rights reserved.
75 FR 29818 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-27
... computers used with the particular project are available to authorized personnel only. Records on... Research and Development Project Records--VA'' (34VA12) as set forth in the Federal Register 40 FR 38095... include electronic or other databases containing research information developed during a research project...
Improving record linkage performance in the presence of missing linkage data.
Ong, Toan C; Mannino, Michael V; Schilling, Lisa M; Kahn, Michael G
2014-12-01
Existing record linkage methods do not handle missing linking field values in an efficient and effective manner. The objective of this study is to investigate three novel methods for improving the accuracy and efficiency of record linkage when record linkage fields have missing values. By extending the Fellegi-Sunter scoring implementations available in the open-source Fine-grained Record Linkage (FRIL) software system we developed three novel methods to solve the missing data problem in record linkage, which we refer to as: Weight Redistribution, Distance Imputation, and Linkage Expansion. Weight Redistribution removes fields with missing data from the set of quasi-identifiers and redistributes the weight from the missing attribute based on relative proportions across the remaining available linkage fields. Distance Imputation imputes the distance between the missing data fields rather than imputing the missing data value. Linkage Expansion adds previously considered non-linkage fields to the linkage field set to compensate for the missing information in a linkage field. We tested the linkage methods using simulated data sets with varying field value corruption rates. The methods developed had sensitivity ranging from .895 to .992 and positive predictive values (PPV) ranging from .865 to 1 in data sets with low corruption rates. Increased corruption rates lead to decreased sensitivity for all methods. These new record linkage algorithms show promise in terms of accuracy and efficiency and may be valuable for combining large data sets at the patient level to support biomedical and clinical research. Copyright © 2014 Elsevier Inc. All rights reserved.
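The Weight Redistribution method described above can be sketched in a few lines. This is an illustrative sketch only: the field names, weights, and record layout are assumptions for demonstration, not the FRIL implementation.

```python
def redistribute_weights(weights, rec_a, rec_b):
    """Drop linkage fields missing in either record and scale the remaining
    Fellegi-Sunter-style weights so they still sum to the original total."""
    available = {f: w for f, w in weights.items()
                 if rec_a.get(f) is not None and rec_b.get(f) is not None}
    if not available:
        return {}
    total = sum(weights.values())
    avail_total = sum(available.values())
    # Each remaining field absorbs a share of the missing field's weight
    # proportional to its own weight.
    return {f: w * total / avail_total for f, w in available.items()}

def match_score(weights, rec_a, rec_b):
    """Toy exact-match scoring over the redistributed weights."""
    rw = redistribute_weights(weights, rec_a, rec_b)
    return sum(wt for f, wt in rw.items() if rec_a[f] == rec_b[f])
```

For example, with weights `{"name": 4.0, "dob": 3.0, "zip": 1.0}` and a record pair missing `dob`, the 3.0 points of `dob` weight are reallocated to `name` and `zip` in a 4:1 ratio, so the total attainable score is preserved.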
Developing, implementing and disseminating a core outcome set for neonatal medicine.
Webbe, James; Brunton, Ginny; Ali, Shohaib; Duffy, James Mn; Modi, Neena; Gale, Chris
2017-01-01
In high-resource settings, 1 in 10 newborn babies require admission to a neonatal unit. Research evaluating neonatal care involves recording and reporting many different outcomes and outcome measures. Such variation limits the usefulness of research as studies cannot be compared or combined. To address these limitations, we aim to develop, disseminate and implement a core outcome set for neonatal medicine. A steering group that includes parents and former patients, healthcare professionals and researchers has been formed to guide the development of the core outcome set. We will review neonatal trials systematically to identify previously reported outcomes. Additionally, we will specifically identify outcomes of importance to parents, former patients and healthcare professionals through a systematic review of qualitative studies. Outcomes identified will be entered into an international, multi-perspective eDelphi survey. All key stakeholders will be invited to participate. The Delphi method will encourage individual and group stakeholder consensus to identify a core outcome set. The core outcome set will be mapped to existing, routinely recorded data where these exist. Use of a core set will ensure outcomes of importance to key stakeholders, including former patients and parents, are recorded and reported in a standard fashion in future research. Embedding the core outcome set within future clinical studies will extend the usefulness of research to inform practice, enhance patient care and ultimately improve outcomes. Using routinely recorded electronic data will facilitate implementation with minimal additional burden. Core Outcome Measures in Effectiveness Trials (COMET) database: 842 (www.comet-initiative.org/studies/details/842).
The Development, Test, and Evaluation of Three Pilot Performance Reference Scales.
ERIC Educational Resources Information Center
Horner, Walter R.; And Others
A set of pilot performance reference scales was developed based upon airborne Audio-Video Recording (AVR) of student performance in T-37 undergraduate Pilot Training. After selection of the training maneuvers to be studied, video tape recordings of the maneuvers were selected from video tape recordings already available from a previous research…
A toolbox and a record for scientific model development
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
Sarkar, Archana; Dutta, Arup; Dhingra, Usha; Dhingra, Pratibha; Verma, Priti; Juyal, Rakesh; Black, Robert E; Menon, Venugopal P; Kumar, Jitendra; Sazawal, Sunil
2006-08-01
In settings in developing countries, children often socialize with multiple socializing agents (peers, siblings, neighbors) apart from their parents, and thus, a measurement of a child's social interactions should be expanded beyond parental interactions. Since the environment plays a role in shaping a child's development, the measurement of child-socializing agents' interactions is important. We developed and used computerized observational software, the Behavior and Social Interaction Software (BASIS), with a preloaded coding scheme installed on a handheld Palm device to record complex observations of interactions between children and socializing agents. Using BASIS, social interaction assessments were conducted on 573 preschool children for 1 h in their natural settings. Multiple screens with a set of choices in each screen were designed that included the child's location, broad activity, state, and interactions with child-socializing agents. Data were downloaded onto a computer and systematically analyzed. BASIS, installed on Palm OS (M-125), enabled the recording of the complex interactions of child-socializing agents that could not be recorded with manual forms. Thus, this tool provides an innovative and relatively accurate method for the systematic recording of social interactions in an unrestricted environment.
Keikha, Leila; Farajollah, Seyede Sedigheh Seied; Safdari, Reza; Ghazisaeedi, Marjan; Mohammadzadeh, Niloofar
2018-01-01
Background In developing countries such as Iran, international standards offer good sources to survey and use for appropriate planning in the domain of electronic health records (EHRs). Therefore, in this study, HL7 and ASTM standards were considered as the main sources from which to extract EHR data. Objective The objective of this study was to propose a hospital data set for a national EHR consisting of data classes and data elements by adjusting data sets extracted from the standards and paper-based records. Method This comparative study was carried out in 2017 by studying the contents of the paper-based records approved by the health ministry in Iran and the international ASTM and HL7 standards in order to extract a minimum hospital data set for a national EHR. Results As a result of studying the standards and paper-based records, a total of 526 data elements in 174 classes were extracted. An examination of the data indicated that the highest number of extracted data came from the free text elements, both in the paper-based records and in the standards related to the administrative data. The major sources of data extracted from ASTM and HL7 were the E1384 and HL7 V.x standards, respectively. In the paper-based records, data were extracted from 19 forms sporadically. Discussion By declaring the confidentiality of information, the ASTM standards acknowledge the issue of confidentiality of information as one of the main challenges of EHR development, and propose new types of admission, such as teleconference, tele-video, and home visit, which are inevitable with the advent of new technology for providing healthcare and treating diseases. Data related to finance and insurance, which were scattered in different categories by three organizations, emerged as the financial category. Documenting the role and responsibility of the provider by adding the authenticator/signature data element was deemed essential.
Conclusion Not only using well-defined and standardized data, but also adapting EHR systems to the local facilities and the existing social and cultural conditions, will facilitate the development of structured data sets. PMID:29618962
Lamas, Daniela; Panariello, Natalie; Henrich, Natalie; Hammes, Bernard; Hanson, Laura C; Meier, Diane E; Guinn, Nancy; Corrigan, Janet; Hubber, Sean; Luetke-Stahlman, Hannah; Block, Susan
2018-04-01
To develop a set of clinically relevant recommendations to improve the state of advance care planning (ACP) documentation in the electronic health record (EHR). Advance care planning is a key process that supports goal-concordant care. For preferences to be honored, clinicians must be able to reliably record, find, and use ACP documentation. However, there are no standards to guide ACP documentation in the EHR. We interviewed 21 key informants to understand the strengths and weaknesses of EHR documentation systems for ACP and identify best practices. We analyzed these interviews using a qualitative content analysis approach and subsequently developed a preliminary set of recommendations. These recommendations were vetted and refined in a second round of input from a national panel of content experts. Informants identified six themes regarding current inadequacies in documentation and accessibility of ACP information and opportunities for improvement. We offer a set of concise, clinically relevant recommendations, informed by expert opinion, to improve the state of ACP documentation in the EHR.
Kimura, Shinya; Sato, Toshihiko; Ikeda, Shunya; Noda, Mitsuhiko; Nakayama, Takeo
2010-01-01
Health insurance claims (ie, receipts) record patient health care treatments and expenses and, although created for the health care payment system, are potentially useful for research. Combining different types of receipts generated for the same patient would dramatically increase the utility of these receipts. However, technical problems, including standardization of disease names and classifications, and anonymous linkage of individual receipts, must be addressed. In collaboration with health insurance societies, all information from receipts (inpatient, outpatient, and pharmacy) was collected. To standardize disease names and classifications, we developed a computer-aided post-entry standardization method using a disease name dictionary based on International Classification of Diseases (ICD)-10 classifications. We also developed an anonymous linkage system by using an encryption code generated from a combination of hash values and stream ciphers. Using different sets of the original data (data set 1: insurance certificate number, name, and sex; data set 2: insurance certificate number, date of birth, and relationship status), we compared the percentage of successful record matches obtained by using data set 1 to generate key codes with the percentage obtained when both data sets were used. The dictionary's automatic conversion of disease names successfully standardized 98.1% of approximately 2 million new receipts entered into the database. The percentage of anonymous matches was higher for the combined data sets (98.0%) than for data set 1 (88.5%). The use of standardized disease classifications and anonymous record linkage substantially contributed to the construction of a large, chronologically organized database of receipts. This database is expected to aid in epidemiologic and health services research using receipt information.
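The anonymous linkage step described above can be illustrated with a keyed hash. The study combines hash values with a stream cipher; in this sketch an HMAC-SHA-256 digest stands in for that scheme, and the field names and shared secret are assumptions, not the authors' design.

```python
import hashlib
import hmac

# Hypothetical shared secret held by the linking party; in practice this
# would be managed and protected, not hard-coded.
SECRET_KEY = b"example-shared-secret"

def linkage_key(insurance_no, name, sex):
    """Derive a deterministic, irreversible key from identifying fields so
    receipts for the same person can be matched without exposing the
    identifiers themselves."""
    payload = "|".join([insurance_no, name, sex]).encode("utf-8")
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
```

Because the same inputs always yield the same digest, inpatient, outpatient, and pharmacy receipts carrying identical identifying fields link to one anonymous key, while any difference in the inputs produces an unrelated key.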
Ethics of Implementing Electronic Health Records in Developing Countries: Points to Consider
Were, Martin C.; Meslin, Eric M.
2011-01-01
Electronic Health Record systems (EHRs) are increasingly being used in many developing countries, several of which have moved beyond isolated pilot projects to active large-scale implementation as part of their national health strategies. Despite growing enthusiasm for adopting EHRs in resource-poor settings, almost no attention has been paid to the ethical issues that might arise. In this article we argue that these ethical issues should be addressed now if EHRs are to be appropriately implemented in these settings. We take a systematic approach guided by a widely accepted ethical framework currently in use for developing countries to first describe the ethical issues, and then propose a set of ‘Points to Consider’ to guide further thinking and decision-making. PMID:22195214
Patient Core Data Set. Standard for a longitudinal health/medical record.
Renner, A L; Swart, J C
1997-01-01
Blue Chip Computers Company, in collaboration with Wright State University-Miami Valley College of Nursing and Health, with support from the Agency for Health Care Policy and Research, Public Health Service, completed Small Business Innovation Research (SBIR) research to design a comprehensive, integrated patient information system. The Wright State University consultants undertook the development of a Patient Core Data Set (PCDS) in response to the lack of uniform standards of minimum data sets, and lack of standards in data transfer for continuity of care. The purpose of the Patient Core Data Set is to develop a longitudinal patient health record and medical history using a common set of standard data elements with uniform definitions and coding consistent with Health Level 7 (HL7) protocol and the American Society for Testing and Materials (ASTM) standards. The PCDS, intended for transfer across all patient-care settings, is essential information for clinicians, administrators, researchers, and health policy makers.
MacRae, J; Darlow, B; McBain, L; Jones, O; Stubbe, M; Turner, N; Dowell, A
2015-08-21
To develop a natural language processing software inference algorithm to classify the content of primary care consultations using electronic health record Big Data and subsequently test the algorithm's ability to estimate the prevalence and burden of childhood respiratory illness in primary care. Algorithm development and validation study. To classify consultations, the algorithm is designed to interrogate clinical narrative entered as free text, diagnostic (Read) codes created and medications prescribed on the day of the consultation. Thirty-six consenting primary care practices from a mixed urban and semirural region of New Zealand. Three independent sets of 1200 child consultation records were randomly extracted from a data set of all general practitioner consultations in participating practices between 1 January 2008-31 December 2013 for children under 18 years of age (n=754,242). Each consultation record within these sets was independently classified by two expert clinicians as respiratory or non-respiratory, and subclassified according to respiratory diagnostic categories to create three 'gold standard' sets of classified records. These three gold standard record sets were used to train, test and validate the algorithm. Sensitivity, specificity, positive predictive value and F-measure were calculated to illustrate the algorithm's ability to replicate judgements of expert clinicians within the 1200 record gold standard validation set. The algorithm was able to identify respiratory consultations in the 1200 record validation set with a sensitivity of 0.72 (95% CI 0.67 to 0.78) and a specificity of 0.95 (95% CI 0.93 to 0.98). The positive predictive value of algorithm respiratory classification was 0.93 (95% CI 0.89 to 0.97). 
The positive predictive value of the algorithm classifying consultations as being related to specific respiratory diagnostic categories ranged from 0.68 (95% CI 0.40 to 1.00; other respiratory conditions) to 0.91 (95% CI 0.79 to 1.00; throat infections). A software inference algorithm that uses primary care Big Data can accurately classify the content of clinical consultations. This algorithm will enable accurate estimation of the prevalence of childhood respiratory illness in primary care and resultant service utilisation. The methodology can also be applied to other areas of clinical care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
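As a rough illustration of how an inference algorithm can combine the three evidence sources named above (free-text narrative, Read codes, and prescribed medications), here is a toy rule-based classifier. The keyword list, code prefix, and medication list are invented placeholders, not the published rule set.

```python
# Placeholder evidence lists; the real algorithm's vocabulary is far larger.
RESP_TERMS = {"cough", "wheeze", "asthma", "bronchiolitis"}
RESP_READ_PREFIXES = ("H",)           # assumed respiratory Read chapter
RESP_MEDS = {"salbutamol", "amoxicillin"}

def classify_consultation(narrative, read_codes, medications):
    """Flag a consultation as respiratory if any of the three sources
    recorded on the day of the consultation provides evidence."""
    words = set(narrative.lower().split())
    if words & RESP_TERMS:
        return "respiratory"
    if any(code.startswith(RESP_READ_PREFIXES) for code in read_codes):
        return "respiratory"
    if {m.lower() for m in medications} & RESP_MEDS:
        return "respiratory"
    return "non-respiratory"
```

A production system would of course need negation handling, misspelling tolerance, and subclassification into diagnostic categories, which is where the reported sensitivity/specificity trade-offs arise.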
Instruments and attachments for electronystagmography
NASA Technical Reports Server (NTRS)
Mironenko, Y. T.; Vilenskiy, A. A.
1980-01-01
A portable set of instruments and devices was developed which makes it possible to record spontaneous nystagmus with open and closed eyes. Rotational, caloric, position, and pressure nystagmus under any conditions may also be recorded.
Integration of Evidence into a Detailed Clinical Model-based Electronic Nursing Record System
Park, Hyeoun-Ae; Jeon, Eunjoo; Chung, Eunja
2012-01-01
Objectives The purpose of this study was to test the feasibility of an electronic nursing record system for perinatal care that is based on detailed clinical models and clinical practice guidelines in perinatal care. Methods This study was carried out in five phases: 1) generating nursing statements using detailed clinical models; 2) identifying the relevant evidence; 3) linking nursing statements with the evidence; 4) developing a prototype electronic nursing record system based on detailed clinical models and clinical practice guidelines; and 5) evaluating the prototype system. Results We first generated 799 nursing statements describing nursing assessments, diagnoses, interventions, and outcomes using entities, attributes, and value sets of detailed clinical models for perinatal care which we developed in a previous study. We then extracted 506 recommendations from nine clinical practice guidelines and created sets of nursing statements to be used for nursing documentation by grouping nursing statements according to these recommendations. Finally, we developed and evaluated a prototype electronic nursing record system that can provide nurses with recommendations for nursing practice and sets of nursing statements based on the recommendations for guiding nursing documentation. Conclusions The prototype system was found to be sufficiently complete, relevant, useful, and applicable in terms of content, and easy to use and useful in terms of system user interface. This study has revealed the feasibility of developing such an ENR system. PMID:22844649
Hydro-Climatic Data Network (HCDN) Streamflow Data Set, 1874-1988
Slack, James Richard; Lumb, Alan M.; Landwehr, Jurate Maciunas
1993-01-01
The potential consequences of climate change to continental water resources are of great concern in the management of those resources. Critically important to society is what effect fluctuations in the prevailing climate may have on hydrologic conditions, such as the occurrence and magnitude of floods or droughts and the seasonal distribution of water supplies within a region. Records of streamflow that are unaffected by artificial diversions, storage, or other works of man in or on the natural stream channels or in the watershed can provide an account of hydrologic responses to fluctuations in climate. By examining such records given known past meteorologic conditions, we can better understand hydrologic responses to those conditions and anticipate the effects of postulated changes in current climate regimes. Furthermore, patterns in streamflow records can indicate when a change in the prevailing climate regime may have occurred in the past, even in the absence of concurrent meteorologic records. A streamflow data set, which is specifically suitable for the study of surface-water conditions throughout the United States under fluctuations in the prevailing climatic conditions, has been developed. This data set, called the Hydro-Climatic Data Network, or HCDN, consists of streamflow records for 1,659 sites throughout the United States and its Territories. Records cumulatively span the period 1874 through 1988, inclusive, and represent a total of 73,231 water years of information. Development of the HCDN Data Set: Records for the HCDN were obtained through a comprehensive search of the extensive surface-water data holdings of the U.S. Geological Survey (USGS), which are contained in the USGS National Water Storage and Retrieval System (WATSTORE). All streamflow discharge records in WATSTORE through September 30, 1988, were examined for inclusion in the HCDN in accordance with strictly defined criteria of measurement accuracy and natural conditions. 
No reconstructed records of 'natural flow' were permitted, nor was any record extended or had missing values 'filled in' using computational algorithms. If the streamflow at a station was judged to be free of controls for only a part of the entire period of record that is available for the station, then only that part was included in the HCDN, but only if it was of sufficient length (generally 20 years) to warrant inclusion. In addition to the daily mean discharge values, complete station identification information and basin characteristics were retrieved from WATSTORE for inclusion in the HCDN. Statistical characteristics, including the monthly mean discharge, as well as the annual mean, minimum and maximum discharge values, were derived for the records in the HCDN data set. For a full description of the development and content of the Hydro-Climatic Data Network, please take a look at the HCDN Report.
5 CFR 293.101 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
....101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PERSONNEL RECORDS Basic Policies on Maintenance of Personnel Records § 293.101 Purpose and scope. (a) This subpart sets forth basic policies governing the creation, development, maintenance, processing, use...
ERIC Educational Resources Information Center
Harrison, R. W.; And Others
The worksheets have been developed for use with any production occupational or work experience record book for high school vocational agriculture programs. Separate units have been developed for each of 11 areas in ornamental horticulture, so the student and teacher can select the appropriate one, or several, for the experiences planned by the…
A scoring system for ascertainment of incident stroke; the Risk Index Score (RISc).
Kass-Hout, T A; Moyé, L A; Smith, M A; Morgenstern, L B
2006-01-01
The main objective of this study was to develop and validate a computer-based statistical algorithm that could be translated into a simple scoring system in order to ascertain incident stroke cases using hospital admission medical records data. The Risk Index Score (RISc) algorithm was developed using data collected prospectively by the Brain Attack Surveillance in Corpus Christi (BASIC) project, 2000. The validity of RISc was evaluated by estimating the concordance of scoring system stroke ascertainment to stroke ascertainment by physician and/or abstractor review of hospital admission records. RISc was developed on 1718 randomly selected patients (training set) and then statistically validated on an independent sample of 858 patients (validation set). A multivariable logistic model was used to develop RISc and subsequently evaluated by goodness-of-fit and receiver operating characteristic (ROC) analyses. The higher the value of RISc, the higher the patient's risk of potential stroke. The study showed RISc was well calibrated and discriminated those who had potential stroke from those that did not on initial screening. In this study we developed and validated a rapid, easy, efficient, and accurate method to ascertain incident stroke cases from routine hospital admission records for epidemiologic investigations. Validation of this scoring system was achieved statistically; however, clinical validation in a community hospital setting is warranted.
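Translating a multivariable logistic model into a simple additive score, as described above, is commonly done by scaling and rounding the model coefficients into integer points. The predictors and coefficients below are invented for illustration; the actual RISc variables are not given in the abstract.

```python
import math

# Hypothetical coefficients from a fitted logistic model on admission-record
# variables; these are assumptions, not the published RISc model.
COEFS = {"age_over_65": 0.8, "icd_stroke_code": 2.1, "ct_head_ordered": 1.3}
INTERCEPT = -3.0
POINT_SCALE = 0.4  # smallest coefficient unit worth one point (assumed)

def risc_points(features):
    """Additive integer score: each present predictor contributes its
    coefficient scaled and rounded to whole points."""
    return sum(round(COEFS[f] / POINT_SCALE)
               for f, present in features.items()
               if present and f in COEFS)

def predicted_probability(features):
    """Underlying logistic probability, for comparison with the score."""
    z = INTERCEPT + sum(c for f, c in COEFS.items() if features.get(f))
    return 1.0 / (1.0 + math.exp(-z))
```

The higher the point total, the higher the modeled probability that the admission record represents an incident stroke; a screening threshold on the score then trades sensitivity against positive predictive value, which is what the ROC analysis in the study evaluates.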
Development and Evaluation of e-CA, an Electronic Mobile-Based Food Record
Bucher Della Torre, Sophie; Carrard, Isabelle; Farina, Eddy; Danuser, Brigitta; Kruseman, Maaike
2017-01-01
Measures that capture diet as validly and reliably as possible are cornerstones of nutritional research, and mobile-based devices offer new opportunities to improve and simplify data collection. The balance between precision and acceptability of these data collection tools remains debated, and rigorous validations are warranted. Our objective was to develop and evaluate an electronic mobile-based food record for a research setting. We developed e-CA, which includes almost 900 foods and beverages classified in 14 categories and 60 subcategories. e-CA was evaluated using three different methods: (1) usability and acceptability through a logbook and qualitative interviews; (2) dietary intake accuracy through comparison with 2 unannounced 24-h phone recalls on overlapping days; and (3) reliability and process comparison with a paper-based food record in a laboratory setting with a randomized design. e-CA proved to be intuitive and practical and was perceived as modern, trendy, and fun. Comparisons of e-CA with 24-h telephone recalls or paper-based food records in a laboratory setting with two small convenience samples showed good agreement but highlighted the well-known difficulty of estimating portion sizes and a necessary learning time to use the app. e-CA is a functional tool that has the potential to facilitate food intake measurement for research by increasing the pleasure of using the food record tool and reducing the perceived burden for the participants. It also decreases the workload, costs and the risk of transcription errors for researchers. PMID:28106767
Zhou, Yuefang; Cameron, Elaine; Forbes, Gillian; Humphris, Gerry
2012-08-01
To develop and validate the St Andrews Behavioural Interaction Coding Scheme (SABICS): a tool to record nurse-child interactive behaviours. The SABICS was developed primarily from observation of video-recorded interactions and refined through an iterative process of applying the scheme to new data sets. Its practical applicability was assessed by implementing the scheme in specialised behavioural coding software. Reliability was calculated using Cohen's Kappa. Discriminant validity was assessed using logistic regression. The SABICS contains 48 codes. Fifty-five nurse-child interactions were successfully coded by administering the scheme in The Observer XT 8.0 system. Two visualizations of interaction patterns demonstrated the scheme's capability to capture complex interaction processes. Cohen's Kappa was 0.66 (inter-coder) and 0.88 and 0.78 (two intra-coders). The frequency of nurse behaviours such as "instruction" (OR = 1.32, p = 0.027) and "praise" (OR = 2.04, p = 0.027) predicted a child receiving the intervention. The SABICS is a unique system for recording interactions between dental nurses and 3-5-year-old children. It records and displays complex nurse-child interactive behaviours, is easily administered, and demonstrates reasonable psychometric properties. The SABICS has potential for other paediatric settings, and its development procedure may inform the development of similar coding schemes. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
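The reliability figures reported above (inter-coder Kappa 0.66; intra-coder 0.88 and 0.78) are chance-corrected agreement rates. As a minimal illustration of the statistic itself, not of the SABICS tooling, Cohen's Kappa for two coders' label sequences can be computed as:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' label sequences."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labelled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Expected agreement if each coder labelled independently at random
    # according to their own marginal label frequencies.
    expected = sum(freq_a[lab] * freq_b[lab] for lab in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

For two coders agreeing on 3 of 4 items with balanced labels this yields 0.5; identical sequences yield 1.0 (the statistic is undefined when both coders assign a single identical label throughout).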
Device Oriented Project Controller
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dalesio, Leo; Kraimer, Martin
2013-11-20
This proposal is directed at the issue of developing control systems for very large HEP projects. A de-facto standard in accelerator control is the Experimental Physics and Industrial Control System (EPICS), which has been applied successfully to many physics projects. EPICS is a channel-based system that requires that each channel of each device be configured and controlled. In Phase I, the feasibility of a device-oriented extension to the distributed channel database was demonstrated by prototyping a device-aware version of an EPICS I/O controller that functions with the current version of the channel access communication protocol. Extensions have been made to the grammar to define the database. Only a multi-stage position controller with limit switches was developed in the demonstration, but the grammar should support a full range of functional record types. In Phase II, a full set of record types will be developed to support all existing record types, a set of process control functions for closed-loop control, and support for experimental beam line control. A tool to configure these records will be developed. A communication protocol will be developed, or extensions will be made to Channel Access, to support introspection of components of a device. Performance benchmarks will be made on both the communication protocol and the database. After these records and performance tests are under way, a second revision of the grammar will be undertaken.
Openness of patients' reporting with use of electronic records: psychiatric clinicians' views
Blackford, Jennifer Urbano; Rosenbloom, S Trent; Seidel, Sandra; Clayton, Ellen Wright; Dilts, David M; Finder, Stuart G
2010-01-01
Objectives Improvements in electronic health record (EHR) system development will require an understanding of psychiatric clinicians' views on EHR system acceptability, including effects on psychotherapy communications, data-recording behaviors, data accessibility versus security and privacy, data quality and clarity, communications with medical colleagues, and stigma. Design Multidisciplinary development of a survey instrument targeting psychiatric clinicians who recently switched to EHR system use, focus group testing, data analysis, and data reliability testing. Measurements Survey of 120 university-based, outpatient mental health clinicians, with 56 (47%) responding, conducted 18 months after transition from a paper to an EHR system. Results Factor analysis gave nine item groupings that overlapped strongly with five a priori domains. Respondents both praised and criticized the EHR system. A strong majority (81%) felt that open therapeutic communications were preserved. Regarding data quality, content, and privacy, clinicians (63%) were less willing to record highly confidential information and disagreed (83%) with including their own psychiatric records among routinely accessed EHR systems. Limitations Single time point; single academic medical center clinic setting; modest sample size; lack of prior instrument validation; survey conducted in 2005. Conclusions In an academic medical center clinic, the presence of electronic records was not seen as a dramatic impediment to therapeutic communications. Concerns regarding privacy and data security were significant, and may contribute to reluctance to adopt electronic records in other settings. Further study of clinicians' views and use patterns may be helpful in guiding development and deployment of electronic records systems. PMID:20064802
Mahmoudvand, Zahra; Kamkar, Mehran; Shahmoradi, Leila; Nejad, Ahmadreza Farzaneh
2016-04-01
Determination of a minimum data set (MDS) for echocardiography reports is necessary for documenting and presenting information in a standard way; it enhances echocardiographic studies by providing access to precise and complete reports, and supports the development of a standard database for echocardiographic reports. The objective was to determine the minimum data set of an echocardiography reporting system for exchange with Iran's electronic health record (EHR) system. First, a list of minimum data elements was prepared after reviewing the literature and studying cardiac patients' records. Then, to determine the content validity of the prepared MDS, the expert views of 10 cardiologists and 10 health information management (HIM) specialists were obtained; to estimate the reliability of the set, the test-retest method was employed. Finally, the data were analyzed using SPSS software. The highest degree of consensus was found for the following data elements: patient's name and family name (5), accepting doctor's name and family name, familial death records due to cardiac disorders, the image identification code, mitral valve, aortic valve, tricuspid valve, pulmonary valve, left ventricle, hole, atrium valve, Doppler examination of ventricular and atrial movement models, and diagnoses with an average of. To prepare a model of an echocardiography reporting system for exchange with the EHR system, creating a standard data set is essential. Therefore, based on the research findings, the minimum reporting-system data to exchange with Iran's electronic health record system include information on entity, management, medical record, carried-out acts, and the main content of the echocardiography report, which planners of the reporting system should consider.
Thomas, E; Sexton, J; Helmreich, R
2004-01-01
Improving teamwork in healthcare may help reduce and manage errors. This paper takes a step toward that goal by (1) proposing a set of teamwork behaviours, or behavioural markers, for neonatal resuscitation; (2) presenting a data form for recording observations about these markers; and (3) comparing and contrasting different sets of teamwork behaviours that have been developed for healthcare. Data from focus groups of neonatal providers, surveys, and video recordings of neonatal resuscitations were used to identify some new teamwork behaviours, to translate existing aviation team behaviours to this setting, and to develop a data collection form. This behavioural marker audit form for neonatal resuscitation lists and defines 10 markers that describe specific, observable behaviours seen during the resuscitation of newborn infants. These markers are compared with those developed by other groups. Future research should determine the relations among these behaviours and errors, and test their usefulness in measuring the impact of team training interventions. PMID:15465957
Herscovici, Sarah; Pe'er, Avivit; Papyan, Surik; Lavie, Peretz
2007-02-01
Scoring of REM sleep based on polysomnographic recordings is a laborious and time-consuming process. The growing number of ambulatory devices designed for cost-effective home-based diagnostic sleep recordings necessitates the development of a reliable automatic REM sleep detection algorithm that is not based on the traditional electroencephalographic, electrooculographic and electromyographic recordings trio. This paper presents an automatic REM detection algorithm based on the peripheral arterial tone (PAT) signal and actigraphy, which are recorded with an ambulatory wrist-worn device (Watch-PAT100). The PAT signal is a measure of the pulsatile volume changes at the fingertip reflecting sympathetic tone variations. The algorithm was developed using a training set of 30 patients recorded simultaneously with polysomnography and Watch-PAT100. Sleep records were divided into 5 min intervals and two time series were constructed from the PAT amplitudes and PAT-derived inter-pulse periods in each interval. A prediction function based on 16 features extracted from the above time series that determines the likelihood of detecting a REM epoch was developed. The coefficients of the prediction function were determined using a genetic algorithm (GA) optimizing process tuned to maximize a price function depending on the sensitivity, specificity and agreement of the algorithm in comparison with the gold standard of polysomnographic manual scoring. Based on a separate validation set of 30 patients, the overall sensitivity, specificity and agreement of the automatic algorithm in identifying standard 30 s epochs of REM sleep were 78%, 92% and 89%, respectively. Deploying this REM detection algorithm in a wrist-worn device could be very useful for unattended ambulatory sleep monitoring. The innovative method of optimization using a genetic algorithm has proven to yield robust results in the validation set.
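The abstract describes a prediction function whose coefficients are tuned by a genetic algorithm to match manual scoring. The paper's 16 PAT-derived features and its price function are not reproduced here; the sketch below only illustrates the general pattern (a linear prediction function plus selection, crossover and mutation) on invented three-feature epochs, with all names and parameters illustrative:

```python
import random

def predict(weights, features, threshold=0.0):
    # Linear prediction function: an epoch is scored REM when the
    # weighted feature sum exceeds the threshold.
    return sum(w * f for w, f in zip(weights, features)) > threshold

def agreement(weights, epochs):
    # Fraction of epochs where the prediction matches the manual label.
    return sum(predict(weights, x) == label for x, label in epochs) / len(epochs)

def evolve(epochs, n_features=3, pop_size=20, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: agreement(w, epochs), reverse=True)
        parents = pop[:pop_size // 2]            # selection: keep the fittest half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_features)] += rng.gauss(0, 0.1)  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda w: agreement(w, epochs))

# Toy training set: "REM" iff the first feature exceeds the second.
epochs = [((0.9, 0.1, 0.4), True), ((0.2, 0.8, 0.5), False),
          ((0.7, 0.3, 0.1), True), ((0.1, 0.6, 0.9), False)]
best = evolve(epochs)
```

The real algorithm optimized sensitivity, specificity and agreement jointly against polysomnographic scoring; swapping a richer fitness function into `agreement` is the natural extension point.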
De Clercq, Etienne
2008-09-01
It is widely accepted that the development of electronic patient records, or even of a common electronic patient record, is one possible way to improve cooperation and data communication between nurses and physicians. Yet, little has been done so far to develop a common conceptual model for both medical and nursing patient records, which is a first challenge that should be met to set up a common electronic patient record. In this paper, we describe a problem-oriented conceptual model and we show how it may suit both nursing and medical perspectives in a hospital setting. We started from existing nursing theory and from an initial model previously set up for primary care. In a hospital pilot site, a multi-disciplinary team refined this model using one large and complex clinical case (retrospective study) and nine ongoing cases (prospective study). An internal validation was performed through hospital-wide multi-professional interviews and through discussions around a graphical user interface prototype. To assess the consistency of the model, a computer engineer specified it. Finally, a Belgian expert working group performed an external assessment of the model. As a basis for a common patient record we propose a simple problem-oriented conceptual model with two levels of meta-information. The model is mapped with current nursing theories and it includes the following concepts: "health care element", "health approach", "health agent", "contact", "subcontact" and "service". These concepts, their interrelationships and some practical rules for using the model are illustrated in this paper. Our results are compatible with ongoing standardization work at the Belgian and European levels. Our conceptual model is potentially a foundation for a multi-professional electronic patient record that is problem-oriented and therefore patient-centred.
Evans, D. A.; Brownlow, N. D.; Hersh, W. R.; Campbell, E. M.
1996-01-01
We discuss the development and evaluation of an automated procedure for extracting drug-dosage information from clinical narratives. The process was developed rapidly using existing technology and resources, including categories of terms from UMLS96. Evaluations over a large training set and a smaller test set of medical records demonstrate an approximately 80% rate of exact and partial matches on target phrases, with few false positives and a modest rate of false negatives. The results suggest a strategy for automating general concept identification in electronic medical records. PMID:8947694
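The authors' pipeline relied on UMLS96 term categories, which are not reproduced here. As a loose illustration of phrase-level dosage extraction, not the authors' method, a regular expression over a common "<drug> <dose> <unit> <frequency>" phrase shape:

```python
import re

# Illustrative pattern only: real systems need drug lexicons (e.g. UMLS
# categories) rather than a bare word before the number.
DOSE_RE = re.compile(
    r"\b(?P<drug>[A-Za-z]+)\s+"
    r"(?P<dose>\d+(?:\.\d+)?)\s*"
    r"(?P<unit>mg|mcg|g|mL|units)\b"
    r"(?:\s+(?P<freq>daily|b\.?i\.?d\.?|t\.?i\.?d\.?|q\.?i\.?d\.?))?",
    re.IGNORECASE,
)

def extract_dosages(narrative):
    """Return one dict per matched drug-dosage phrase."""
    return [m.groupdict() for m in DOSE_RE.finditer(narrative)]
```

For example, `extract_dosages("Continue lisinopril 10 mg daily and metformin 500 mg b.i.d.")` yields two matches with drug, dose, unit and frequency fields populated.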
ERIC Educational Resources Information Center
Carstens, B. A.; Wright, J. M.; Coles, J. T.; McCleary, L. N.; Williams, R. L.
2013-01-01
This study developed a reliable and valid self-monitoring procedure for student use in recording and rating the quality of their individual comments in large college classes. Students used daily record cards to immediately record and rate each comment they made each day. However, a limit was set on the amount of credit students could claim for…
Developmental milestones record - 3 years
... as helping set the table or picking up toys. Encourage play with other children to help develop ...
NASA Technical Reports Server (NTRS)
Baumert, L. D.; Mceliece, R. J.; Rodemich, E. R.; Rumsey, H., Jr.
1978-01-01
The design of an optimal merged keycode data base information retrieval system is detailed. A probability distribution of n-bit binary words that minimized false drops was developed for the case where the set of desired records was a subset of tagged records.
Verhagen, Stans C; Janssen, Mireille AE; Dekhuijzen, Richard PNR; Vissers, Kris CP; Engels, Yvonne; Heijdra, Yvonne
2016-01-01
To identify patients hospitalized for an acute exacerbation of chronic obstructive pulmonary disease (COPD) who have a poor prognosis and might benefit from proactive palliative care, a set of indicators had been developed from the literature. A patient is considered eligible for proactive palliative care when meeting ≥2 criteria of the proposed set of 11 indicators. In order to develop a doctor-friendly and patient-convenient tool, our primary objective was to examine whether these indicators are documented consistently in the medical records. In addition, the percentage of patients with a poor prognosis and the prognostic value of the set were explored. We conducted a retrospective medical record review of 33 patients. Five indicators were always documented: non-invasive ventilation (NIV), comorbidity, body mass index (BMI), previous admissions for acute exacerbation of COPD, and age. Three indicators were documented in more than half of the records: hypoxaemia and/or hypercapnia, professional home care, and actual forced expiratory volume (FEV1%); whereas the Clinical COPD Questionnaire (CCQ), Medical Research Council dyspnoea scale (MRC dyspnoea) and the surprise question were never registered. In addition, 78.8% of the patients met ≥2 criteria, and there was a significant association between meeting ≥2 criteria and mortality within 1 year (one-sided Fisher's exact test, p = 0.04). The set of indicators for proactive palliative care in patients with COPD appeared to be user-friendly and feasible. PMID:27872166
Wearable, multimodal, vitals acquisition unit for intelligent field triage.
Beck, Christoph; Georgiou, Julius
2016-09-01
In this Letter, the authors describe the design, development and characterisation of their wearable, multimodal vitals acquisition unit for intelligent field triage. The unit records the standard electrocardiogram, blood oxygen and body temperature parameters and also has the unique capability to record up to eight custom-designed acoustic streams for heart and lung sound auscultation. These acquisition channels are highly synchronised to fully maintain the time correlation of the signals. The unit is a key component enabling systematic and intelligent field triage by continuously acquiring vital patient information. With the realised unit, a novel data set with highly synchronised vital signs was recorded. The new data set may be used for algorithm design in vital sign analysis or decision making. The monitoring unit is the only known body-worn system that records standard emergency parameters plus eight multi-channel auscultatory streams, stores the recordings, and wirelessly transmits them to mobile response teams.
Improving data quality in neuronal population recordings
Harris, Kenneth D.; Quian Quiroga, Rodrigo; Freeman, Jeremy; Smith, Spencer
2017-01-01
Understanding how the brain operates requires understanding how large sets of neurons function together. Modern recording technology makes it possible to simultaneously record the activity of hundreds of neurons, and technological developments will soon allow recording of thousands or tens of thousands. As with all experimental techniques, these methods are subject to confounds that complicate the interpretation of such recordings, and could lead to erroneous scientific conclusions. Here, we discuss methods for assessing and improving the quality of data from these techniques, and outline likely future directions in this field. PMID:27571195
Jensen, Roxanne E; Snyder, Claire F; Basch, Ethan; Frank, Lori; Wu, Albert W
2016-11-01
In recent years, patient-reported outcomes have become increasingly collected and integrated into electronic health records. However, there are few cross-cutting recommendations and limited guidance available in this rapidly developing research area. Our goal is to report key findings from a 2013 Patient-Centered Outcomes Research Institute workshop on this topic and a summary of actions that followed from the workshop, and present resulting recommendations that address patient, clinical and research/quality improvement barriers to regular use. These findings provide actionable guidance across research and practice settings to promote and sustain widespread adoption of patient-reported outcomes across patient populations, healthcare settings and electronic health record systems.
The quality of care in occupational therapy: an assessment of selected Michigan hospitals.
Kirchman, M M
1979-07-01
In this study, a methodology was developed and tested for assessing the quality of care in occupational therapy between educational and noneducational clinical settings, as measured by process and outcome. An instrument was constructed for an external audit of the hospital record. Standards drafted by the investigator were established as normative by a panel of experts for use in judging the programs. Hospital records of 84 patients with residual hemiparesis or hemiplegia in three noneducational settings and of 100 patients with similar diagnoses in two educational clinical settings from selected Michigan facilities were chosen by proportionate stratified random sampling. The process study showed that occupational therapy was of significantly higher quality in the educational settings. The outcome study did not show significant differences between types of settings. Implications for education and practice are discussed.
Scobbie, Lesley; McLean, Donald; Dixon, Diane; Duncan, Edward; Wyke, Sally
2013-05-24
Goal setting is considered 'best practice' in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. G-AP was mostly implemented according to protocol with deviations noted at the action planning and the appraisal and feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage and the appraisal and feedback stage included an explicit decision-making component. Only two issues were raised regarding G-AP's acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patients' well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process.
G-AP has been perceived as both beneficial and broadly acceptable in one community rehabilitation team; however, implementation of novel aspects of the framework was inconsistent. The regulatory function of goal non-attainment and the importance of creating flexible partnerships with patients have been highlighted. Further development of the G-AP framework, training package and patient held record is required to address the specific issues highlighted by this process evaluation. Further evaluation of G-AP is required across diverse community rehabilitation settings.
Deriving a Set of Privacy Specific Heuristics for the Assessment of PHRs (Personal Health Records).
Furano, Riccardo F; Kushniruk, Andre; Barnett, Jeff
2017-01-01
With personal health record (PHR) platforms becoming more widely available, this research focused on the development of privacy heuristics to assess PHRs regarding privacy. Existing sets of heuristics are typically not application specific and do not address patient-centric privacy as a main concern prior to PHR procurement. A set of privacy-specific heuristics was developed based on a scoping review of the literature. An internet-based, commercially available, vendor-specific PHR application was evaluated using the derived set of privacy-specific heuristics. The proposed set of derived privacy-specific heuristics is explored in detail in relation to ISO 29100. The assessment of the internet-based, commercially available, vendor-specific PHR application indicated numerous violations, which were noted within the study. It is argued that the newly derived privacy heuristics should be used in addition to Nielsen's well-established set of heuristics. Privacy-specific heuristics could be used to assess PHR portal system-level privacy mechanisms during procurement of a PHR application and may prove to be a beneficial form of assessment to prevent the selection of a PHR platform with a poor privacy-specific interface design.
The Effect of Health Information Technology on Hospital Quality of Care
ERIC Educational Resources Information Center
Sun, Ruirui
2016-01-01
Health Information Technology (Health IT) is designed to store patients' records safely and clearly, to reduce input errors and missing records, and to make communications more efficiently. Concerned with the relatively lower adoption rate among the US hospitals compared to most developed countries, the Bush Administration set up the Office of…
Baus, Adam; Coben, Jeffrey; Zullig, Keith; Pollard, Cecil; Mullett, Charles; Taylor, Henry; Cochran, Jill; Jarrett, Traci; Long, Dustin
2017-01-01
Screening for risk of unintentional falls remains low in the primary care setting because of the time constraints of brief office visits. National studies suggest that physicians caring for older adults provide recommended fall risk screening only 30 to 37 percent of the time. Given prior success in developing methods for repurposing electronic health record data for the identification of fall risk, this study involves building a model in which electronic health record data could be applied for use in clinical decision support to bolster screening by proactively identifying patients for whom screening would be beneficial and targeting efforts specifically to those patients. The final model, consisting of priority and extended measures, demonstrates moderate discriminatory power, indicating that it could prove useful in a clinical setting for identifying patients at risk of falls. Focus group discussions reveal important contextual issues involving the use of fall-related data and provide direction for the development of health systems–level innovations for the use of electronic health record data for fall risk identification. PMID:29118679
Evaluating the Risk of Re-identification of Patients from Hospital Prescription Records.
Emam, Khaled El; Dankar, Fida K; Vaillancourt, Régis; Roffey, Tyson; Lysyk, Mary
2009-07-01
Pharmacies often provide prescription records to private research firms, on the assumption that these records are de-identified (i.e., identifying information has been removed). However, concerns have been expressed about the potential that patients can be re-identified from such records. Recently, a large private research firm requested prescription records from the Children's Hospital of Eastern Ontario (CHEO), as part of a larger effort to develop a database of hospital prescription records across Canada. To evaluate the ability to re-identify patients from CHEO's prescription records and to determine ways to appropriately de-identify the data if the risk was too high. The risk of re-identification was assessed for 18 months' worth of prescription data. De-identification algorithms were developed to reduce the risk to an acceptable level while maintaining the quality of the data. The probability of patients being re-identified from the original variables and data set requested by the private research firm was deemed quite high. A new de-identified record layout was developed, which had an acceptable level of re-identification risk. The new approach involved replacing the admission and discharge dates with the quarter and year of admission and the length of stay in days, reporting the patient's age in weeks, and including only the first character of the patient's postal code. Additional requirements were included in the data-sharing agreement with the private research firm (e.g., audit requirements and a protocol for notification of a breach of privacy). Without a formal analysis of the risk of re-identification, assurances of data anonymity may not be accurate. A formal risk analysis at one hospital produced a clinically relevant data set that also protects patient privacy and allows the hospital pharmacy to explicitly manage the risks of breach of patient privacy.
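The de-identified layout described (quarter and year of admission, length of stay in days, age in weeks, first character of the postal code) is straightforward to express in code. A minimal sketch with illustrative field names; the actual CHEO record layout is not given here:

```python
from datetime import date

def deidentify(record):
    """Apply the generalizations described above to one prescription record.

    Expects date fields as datetime.date values; field names are
    illustrative, not the hospital's actual schema.
    """
    admit, discharge = record["admission_date"], record["discharge_date"]
    return {
        "admission_quarter": (admit.month - 1) // 3 + 1,   # 1..4
        "admission_year": admit.year,
        "length_of_stay_days": (discharge - admit).days,
        "age_weeks": (admit - record["birth_date"]).days // 7,
        "postal_prefix": record["postal_code"][0],          # first character only
    }
```

A May admission thus reports only "Q2" of its year, and a full postal code such as "K1H8L1" is reduced to "K".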
A First Standardized Swiss Electronic Maternity Record.
Murbach, Michel; Martin, Sabine; Denecke, Kerstin; Nüssli, Stephan
2017-01-01
During the nine months of pregnancy, women have to regularly visit several physicians for continuous monitoring of the health and development of the fetus and mother. Comprehensive examination results of different types are generated in this process; documentation and data transmission standards are still unavailable or not in use. Relevant information is collected in a paper-based maternity record carried by the pregnant women. To improve availability and transmission of data, we aim at developing a first prototype of an electronic maternity record for Switzerland. By analyzing the documentation workflow during pregnancy, we determined a maternity record data set. Further, we collected requirements for a digital maternity record. As data exchange format, the Swiss-specific exchange format SMEEX (swiss medical data exchange) was exploited. Feedback from 27 potential users was collected to identify further improvements. The relevant data are extracted from the primary care information system as a SMEEX file, stored in a database, and made available in a web and a mobile application, developed as prototypes of an electronic maternity record. The users confirmed the usefulness of the system and provided multiple suggestions for extensions. An electronic maternity record as developed in this work could in the future be linked to the electronic patient record.
Error sources in passive and active microwave satellite soil moisture over Australia
USDA-ARS?s Scientific Manuscript database
Development of a long-term climate record of soil moisture (SM) involves combining historic and present satellite-retrieved SM data sets. This in turn requires a consistent characterization and deep understanding of the systematic differences and errors in the individual data sets, which vary due to...
LUXSim: A component-centric approach to low-background simulations
Akerib, D. S.; Bai, X.; Bedikian, S.; ...
2012-02-13
Geant4 has been used throughout the nuclear and high-energy physics community to simulate energy depositions in various detectors and materials. These simulations have mostly been run with a source beam outside the detector. In the case of low-background physics, however, a primary concern is the effect on the detector from radioactivity inherent in the detector parts themselves. From this standpoint, there is no single source or beam, but rather a collection of sources with potentially complicated spatial extent. LUXSim is a simulation framework used by the LUX collaboration that takes a component-centric approach to event generation and recording. A new set of classes allows for multiple radioactive sources to be set within any number of components at run time, with the entire collection of sources handled within a single simulation run. Various levels of information can also be recorded from the individual components, with these record levels also being set at runtime. This flexibility in both source generation and information recording is possible without the need to recompile, reducing the complexity of code management and the proliferation of versions. Within the code itself, casting geometry objects within this new set of classes rather than as the default Geant4 classes automatically extends this flexibility to every individual component. No additional work is required on the part of the developer, reducing development time and increasing confidence in the results. Here, we describe the guiding principles behind LUXSim, detail some of its unique classes and methods, and give examples of usage.
Eid, Wael E; Pottala, James V
2010-01-01
To develop a receiver operating characteristic (ROC) curve of glycosylated hemoglobin (HbA1c) for diagnosing diabetes mellitus within a chronic disease management system. A case-control study including medical records from January 1, 1997, to December 31, 2005, was conducted at the Sioux Falls Veterans Affairs Medical Center. Medical records for the case group (patients with diabetes) were selected based on 1 of 3 criteria: International Classification of Diseases, Ninth Revision, Clinical Modification or Current Procedural Terminology codes specific for type 1 and type 2 diabetes; patients' use of medications (oral hypoglycemic agents, antidiabetes agents, or insulin); or results from random blood or plasma glucose tests (at least 2 measurements of blood glucose > or = 200 mg/dL). Records for the control group were selected based on patients having HbA1c measured, but not meeting the above diagnostic criteria for diabetes during the study period. Records for cases and controls were randomly frequency-matched, one-to-one. The control group was randomly divided into 5 sets of an equal number of records. Five sets of an equal number of cases were then randomly selected from the total number of cases. Each test data set included 1 case group and 1 control group, resulting in 5 independent data sets. In total, 5040 patient records met the case definition in the diabetes registry. Records of 15 patients who were prescribed metformin only, but did not meet any other case criteria, were reviewed and excluded after determining the patients were not diabetic. The control group consisted of 5 sets of 616 records each (totaling 3080 records), and the case group consisted of 5 sets of 616 records each (totaling 3080 records). Thus, each of the 5 independent data sets of 1 case group and 1 control group contained 1232 records. The case group was predominantly composed of white men (mean age, 69 years; mean body mass index, 31 kg/m2). 
Demographic data were similar for control patients. The ROC curve revealed that an HbA1c ≥ 6.3% (mean + 1 SD) offered the most accurate cutoff value for diagnosing type 2 diabetes mellitus, with the following statistical values: C statistic, 0.78; sensitivity, 70%; specificity, 85%; and positive likelihood ratio, 4.6 (95% confidence interval, 4.2-5.0). An HbA1c value ≥ 6.3% may be a useful benchmark for diagnosing diabetes mellitus within a chronic disease management system and may be a useful tool for monitoring high-risk populations.
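The cutoff-selection logic described above can be sketched in a few lines: sweep candidate HbA1c thresholds, compute sensitivity and specificity at each, and pick the threshold maximizing Youden's J. The data, function names and the Youden criterion are illustrative assumptions, not the study's actual method or records.

```python
# Hypothetical sketch of ROC-based cutoff selection. Toy data, not the study's.

def roc_point(records, cutoff):
    """Sensitivity and specificity of the rule 'HbA1c >= cutoff => diabetic'."""
    tp = sum(1 for a1c, dm in records if dm and a1c >= cutoff)
    fn = sum(1 for a1c, dm in records if dm and a1c < cutoff)
    tn = sum(1 for a1c, dm in records if not dm and a1c < cutoff)
    fp = sum(1 for a1c, dm in records if not dm and a1c >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# (HbA1c %, has_diabetes) pairs -- invented for illustration
records = [(5.2, False), (5.8, False), (6.1, False), (6.4, True),
           (6.9, True), (7.4, True), (6.0, False), (6.5, True)]

# Pick the cutoff that maximizes Youden's J = sensitivity + specificity - 1
cutoffs = sorted({a1c for a1c, _ in records})
best = max(cutoffs, key=lambda c: sum(roc_point(records, c)) - 1)
sens, spec = roc_point(records, best)
```

On this toy data the sweep selects 6.4% as the cutoff; on real records the chosen threshold would depend on the case mix, as in the abstract's 6.3% result.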
Jensen, Roxanne E; Snyder, Claire F; Basch, Ethan; Frank, Lori; Wu, Albert W
2016-01-01
In recent years, patient-reported outcomes have been increasingly collected and integrated into electronic health records. However, there are few cross-cutting recommendations and limited guidance available in this rapidly developing research area. Our goal is to report key findings from a 2013 Patient-Centered Outcomes Research Institute workshop on this topic, summarize the actions that followed from the workshop, and present the resulting recommendations, which address patient, clinical, and research/quality improvement barriers to regular use. These findings provide actionable guidance across research and practice settings to promote and sustain widespread adoption of patient-reported outcomes across patient populations, healthcare settings and electronic health record systems. PMID:27586855
Use of graph theory measures to identify errors in record linkage.
Randall, Sean M; Boyd, James H; Ferrante, Anna M; Bauer, Jacqueline K; Semmens, James B
2014-07-01
Ensuring high linkage quality is important in many record linkage applications. Current methods for ensuring quality are manual and resource intensive. This paper seeks to determine the effectiveness of graph theory techniques in identifying record linkage errors. A range of graph theory techniques was applied to two linked datasets, with known truth sets. The ability of graph theory techniques to identify groups containing errors was compared to a widely used threshold setting technique. This methodology shows promise; however, further investigations into graph theory techniques are required. The development of more efficient and effective methods of improving linkage quality will result in higher quality datasets that can be delivered to researchers in shorter timeframes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
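One way the graph-theoretic idea could look in practice, as a hedged sketch: treat linked record pairs as graph edges, take connected components as linkage groups, and flag components whose size or internal edge density looks anomalous (a group of truly matching records should be densely interlinked). The thresholds and data here are invented for illustration, not taken from the paper.

```python
from collections import defaultdict

def components(edges):
    """Connected components of an undirected graph given as edge pairs."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def flag_suspect_groups(edges, max_size=3, min_density=0.8):
    """Flag components that are too large or too sparsely interlinked
    (a clique of truly matching records has density near 1)."""
    suspects = []
    for comp in components(edges):
        n = len(comp)
        e = sum(1 for a, b in edges if a in comp and b in comp)
        density = 1.0 if n < 2 else e / (n * (n - 1) / 2)
        if n > max_size or density < min_density:
            suspects.append(frozenset(comp))
    return suspects

links = [(1, 2), (2, 3), (1, 3),          # tight triangle: plausible match group
         (4, 5), (5, 6), (6, 7), (7, 8)]  # long sparse chain: likely error
suspects = flag_suspect_groups(links)
```

The sparse five-record chain is flagged for manual review while the fully interlinked triangle passes, mirroring the paper's goal of directing scarce clerical effort at the groups most likely to contain errors.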
Invite yourself to the table: librarian contributions to the electronic medical record.
Brandes, Susan; Wells, Karen; Bandy, Margaret
2013-01-01
Librarians from Exempla Healthcare hospitals initiated contact with the chief medical information officer regarding evidence-based medicine activities related to the development of the system's Electronic Medical Record (EMR). This column reviews the librarians' involvement in specific initiatives that included providing comparative information on point-of-care resources to integrate into the EMR, providing evidence as needed for the order sets being developed, and participating with clinicians on an evidence-based advisory committee.
Decades After Developing Technology, NREL Sets New Solar-to-Hydrogen Record
A photoelectrochemical device that uses sunlight to split water into hydrogen and oxygen recently achieved 16.2% solar-to-hydrogen conversion efficiency.
ERIC Educational Resources Information Center
West, Christopher E.
2010-01-01
Research objectives: This dissertation examines the state of development of each of the eight core electronic health record (EHR) functionalities as described by the IOM and describes how the current state of these functionalities limits quality improvement efforts in ambulatory care settings. There is a great deal of literature describing both the…
Wearable, multimodal, vitals acquisition unit for intelligent field triage
Georgiou, Julius
2016-01-01
In this Letter, the authors describe the characterisation, design and development of the authors’ wearable, multimodal vitals acquisition unit for intelligent field triage. The unit is able to record the standard electrocardiogram, blood oxygen and body temperature parameters and also has the unique capability to record up to eight custom-designed acoustic streams for heart and lung sound auscultation. These acquisition channels are highly synchronised to fully maintain the time correlation of the signals. The unit is a key component enabling systematic and intelligent field triage by continuously acquiring vital patient information. With the realised unit, a novel data-set with highly synchronised vital signs was recorded. The new data-set may be used for algorithm design in vital sign analysis or decision making. The monitoring unit is the only known body-worn system that records standard emergency parameters plus eight multi-channel auscultatory streams, stores the recordings, and wirelessly transmits them to mobile response teams. PMID:27733926
Improving the Effectiveness of Electronic Health Record-Based Referral Processes
2012-01-01
Electronic health records are increasingly being used to facilitate referral communication in the outpatient setting. However, despite support by technology, referral communication between primary care providers and specialists is often unsatisfactory and is unable to eliminate care delays. This may be in part due to lack of attention to how information and communication technology fits within the social environment of health care. Making electronic referral communication effective requires a multifaceted “socio-technical” approach. Using an 8-dimensional socio-technical model for health information technology as a framework, we describe ten recommendations that represent good clinical practices to design, develop, implement, improve, and monitor electronic referral communication in the outpatient setting. These recommendations were developed on the basis of our previous work, current literature, sound clinical practice, and a systems-based approach to understanding and implementing health information technology solutions. Recommendations are relevant to system designers, practicing clinicians, and other stakeholders considering use of electronic health records to support referral communication. PMID:22973874
Wright, Adam; Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W
2011-01-01
Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. To develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100,000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100,000 records to assess its accuracy. Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100,000 randomly selected patients showed high sensitivity (range: 62.8-100.0%) and positive predictive value (range: 79.8-99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts.
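A minimal sketch of the rule-based inference the abstract describes, with invented rules and thresholds (the study's validated rule set is not reproduced here): each rule maps structured evidence such as medications, laboratory results and billing codes to an inferred problem.

```python
# Illustrative problem-inference rules. Medication names, lab thresholds and
# billing codes below are assumptions for the sketch, not the validated rules.

RULES = {
    "diabetes": lambda r: ("metformin" in r["meds"]
                           or r["labs"].get("hba1c", 0) >= 6.5
                           or "250.00" in r["billing"]),
    "hypothyroidism": lambda r: ("levothyroxine" in r["meds"]
                                 or r["labs"].get("tsh", 0) > 10),
}

def infer_problems(record):
    """Return the sorted list of problems whose rule fires on this record."""
    return sorted(name for name, rule in RULES.items() if rule(record))

patient = {"meds": {"metformin", "lisinopril"},
           "labs": {"hba1c": 7.1},
           "billing": set()}
problems = infer_problems(patient)
```

Combining several evidence types in one rule is what lets this approach outperform the problem list or billing data alone, as the validation results above report.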
Signal analysis of accelerometry data using gravity-based modeling
NASA Astrophysics Data System (ADS)
Davey, Neil P.; James, Daniel A.; Anderson, Megan E.
2004-03-01
Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of data is difficult due to interference sources, including interaction with external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometry outputs were derived, thus giving an ideal, or noiseless, dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for investigation of swimming. Model data were compared against recorded data and showed strong correlation. Comparison of recorded and modeled data can be used to identify changes in body movement; this is especially useful when cyclic patterns are present in the activity. Strong correlations between data sets allowed development of signal processing algorithms for swimming stroke analysis, first using the pure noiseless data set and then applying them to performance data. Video analysis was also used to validate study results and has shown potential to provide acceptable results.
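The model-versus-recording comparison can be illustrated with a Pearson correlation between a synthetic noiseless trace and a noisy "recorded" version of it. The signals below are invented stand-ins, not the study's swimming data.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Idealised cyclic "stroke" signal and a perturbed recorded version of it
model = [math.sin(2 * math.pi * t / 20) for t in range(100)]
noise = [0.1 * math.sin(t) for t in range(100)]   # deterministic stand-in for noise
recorded = [m + e for m, e in zip(model, noise)]
r = pearson(model, recorded)
```

A high correlation supports using the noiseless model output to develop stroke-analysis algorithms before applying them to real performance data, as described above.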
2013-01-01
Background Goal setting is considered ‘best practice’ in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning, and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. Methods G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. Results G-AP was mostly implemented according to protocol, with deviations noted at the planning and the appraisal & feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage, and the appraisal & feedback stage included an explicit decision-making component. Only two issues were raised regarding G-AP's acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patients' well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process.
Conclusions G-AP has been perceived as both beneficial and broadly acceptable in one community rehabilitation team; however, implementation of novel aspects of the framework was inconsistent. The regulatory function of goal non-attainment and the importance of creating flexible partnerships with patients have been highlighted. Further development of the G-AP framework, training package and patient-held record is required to address the specific issues highlighted by this process evaluation. Further evaluation of G-AP is required across diverse community rehabilitation settings. PMID:23705824
24 CFR 15.103 - How can I get other records from HUD?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false How can I get other records from... request form on HUD's Internet web site at http://www.hud.gov. (d) What should I include in my FOIA... on his or her behalf; and (7) If you are requesting expedited processing, your request should set out...
Using mortuary statistics in the development of an injury surveillance system in Ghana.
London, Jason; Mock, Charles; Abantanga, Francis A.; Quansah, Robert E.; Boateng, K. A.
2002-01-01
OBJECTIVE: To develop, in a mortuary setting, a pilot programme for improving the accuracy of records of deaths caused by injury. METHODS: The recording of injury-related deaths was upgraded at the mortuary of the Komfo Anokye Teaching Hospital, Kumasi, Ghana, in 1996 through the creation of a prospectively gathered database. FINDINGS: There was an increase in the number of deaths reported annually as attributable to injury from 72 before 1995 to 633 in 1996-99. Injuries accounted for 8.6% of all deaths recorded in the mortuary and for 12% of deaths in the age range 15-59 years; 80% of deaths caused by injury occurred outside the hospital and thus would not have been indicated in hospital statistics; 88% of injury-related deaths were associated with transport, and 50% of these involved injuries to pedestrians. CONCLUSIONS: Injury was a significant cause of mortality in this urban African setting, especially among adults of working age. The reporting of injury-related deaths in a mortuary was made more complete and accurate by means of simple inexpensive methods. This source of data could make a significant contribution to an injury surveillance system, along with hospital records and police accident reports. PMID:12077610
Artificial Intelligence Controls Tape-Recording Sequence
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.; Otamura, Roy M.; Zottarelli, Lawrence J.
1989-01-01
Developmental expert-system computer program intended to schedule recording of large amounts of data on limited amount of magnetic tape. Schedules recording using two sets of rules. First set incorporates knowledge of locations for recording of new data. Second set incorporates knowledge about issuing commands to recorder. Designed primarily for use on Voyager Spacecraft, also applicable to planning and sequencing in industry.
First Plant Phenological Records in the Carpathians and their Possible Use
NASA Astrophysics Data System (ADS)
Tekusova, M.; Horecká, V.; Mikulová, K.
2009-04-01
Phenological observations have a long history. Long time series come from Korea and some other parts of Asia, while wine harvest dates form the oldest phenological data sets in Europe. One of them started as early as 1457 in Vienna, i.e. on the border of the Carpathian region. However, the first systematic phenological observations in the south Carpathians started almost four hundred years later, following the establishment of the phenological network in Austria and later in the Hungarian Kingdom. A medical doctor, P. Wierbitzky, made the first phenological observations in the Carpathian region in the early 1830s in Orawicza. The first systematic observations and records of plant development in this region are connected with the establishment of the Austrian Institute for Meteorology and Geomagnetism in 1851. Although the historical significance of these observations is high, the recorded data are of lower quality, frequently interrupted and fragmented. Further development of phenological observations came with the observation methodology introduced by Karl Fritsch in the early 1860s, and mainly with the establishment of the Hungarian Meteorological Service in 1871. These historical data were recorded and published in the yearbooks and, despite the fragmentary character of the records, they are usable for some evaluations. This article describes the data sets of the systematic phenological network in the Carpathian region and considers some possible phenological evaluations. The phenological observations were in some cases made at the same localities as the climatologic observations, but the number of phenological stations was considerably lower in several years. The historical plant phenological records were in many cases based on the observation of four phenological phases: leafing, flowering, ripening and fall of leaves.
Both the volume and the quality of the records vary from station to station. In some cases records were given in detail, including geographical details regarding the position of the observed individual plant (orientation of the slopes) and the damage caused by frosts, but this was not a general feature. All phenological observations were done on a voluntary basis. Moreover, even the stations that performed the observations for more than ten years changed the observed species from year to year. This makes the data sets quite fragmented, with many gaps; the standard statistical characteristics of any station can hardly be obtained and their statistical significance is very low. As standard statistical processing of the data sets was not possible, we tried to elaborate some descriptions that can characterize the distribution of phenological manifestations in space and time. Climatologic records available in the yearbooks were expressed as monthly mean values and totals. There are also gaps and missing data in the climatologic records. Nevertheless, these data sets enable us to derive general characteristics of months and seasons. Another possible evaluation follows the local phenological calendar; this was also done in 1874. As only three phenological phases were recorded, it was difficult to follow the development and growth of a particular plant. That is why only flowering was considered, for plants characterizing the start of early spring (Corylus avellana), full spring (Cornus mas, Salix alba and Prunus spinosa), late spring (Syringa vulgaris, Aesculus hippocastanum and Crataegus laevigata) and early summer (Robinia pseudoacacia and Sambucus nigra). The full start of summer is usually indicated by the flowering of Tilia platyphyllos. Three stations from the lowlands in the northern region with relatively good data sets were selected in order to capture this course of flowering.
The northernmost station showed a delay in the beginning of flowering for the plants which flower in full spring and early summer, while the plants flowering in early spring showed dates comparable with other localities. Selected data were compared with the averages of flowering from 1993 to 2008 at localities close to the stations. The differences showed flowering of plants 1-3 weeks earlier, corresponding to temperatures higher by 1.0-1.5 °C from February to June. The inventory of phenological records from the period 1871-1885 in the Carpathian region revealed a rather fragmented data set that is not suitable for standard statistical evaluation. Some possibilities for phenological evaluation lie in spatial and temporal analysis of the development of different plants in particular years/seasons that either represent average climatic conditions or include some climatic extremes. Deeper analysis of such phenological events will require daily climatologic/temperature data. The advantage of the data sets discussed above is the fact that one methodology of observations was applied and that they cover a large area of Central Europe and part of the Balkans. Further development of phenological observations in the region after the break-down of the Austro-Hungarian Empire depended on the conditions in each particular country. This meant changes in the methods of observations and the number of stations. Recent cooperation in creating phenological databases has encountered considerable difficulties, as some networks were cancelled and re-established again in the 20th century.
Infrared Video Pupillography Coupled with Smart Phone LED for Measurement of Pupillary Light Reflex.
Chang, Lily Yu-Li; Turuwhenua, Jason; Qu, Tian Yuan; Black, Joanna M; Acosta, Monica L
2017-01-01
Clinical assessment of pupil appearance and pupillary light reflex (PLR) may inform us about the integrity of the autonomic nervous system (ANS). Current clinical pupil assessment is limited to qualitative examination and relies on clinical judgment. Infrared (IR) video pupillography combined with image processing software offers the possibility of recording quantitative parameters. In this study we describe an IR video pupillography set-up intended for human and animal testing. As part of the validation, resting pupil diameter was measured in human subjects using the NeurOptics™ (Irvine, CA, USA) pupillometer, to compare against that measured by our IR video pupillography set-up, and PLR was assessed in guinea pigs. The set-up consisted of a smart phone with a light emitting diode (LED) strobe light (0.2 s light ON, 5 s light OFF cycles) as the stimulus and an IR camera to record pupil kinetics. The consensual response was recorded, and the video recording was processed using a custom MATLAB program. The parameters assessed were resting pupil diameter (D1), constriction velocity (CV), percentage constriction ratio, re-dilation velocity (DV) and percentage re-dilation ratio. We report that the IR video pupillography set-up provided results comparable to those of the NeurOptics™ pupillometer in human subjects, and was able to detect larger resting pupil size in juvenile male guinea pigs compared to juvenile female guinea pigs. At juvenile age, male guinea pigs also had stronger pupil kinetics for both pupil constriction and dilation. Furthermore, our IR video pupillography set-up was able to detect an age-specific increase in pupil diameter (female guinea pigs only) and reduction in CV (male and female guinea pigs) as animals developed from juvenile (3 months) to adult age (7 months). This technique demonstrated accurate and quantitative assessment of pupil parameters, and may provide the foundation for further development of an integrated system useful for clinical applications.
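As a hedged sketch of how the reported parameters might be derived once the pupil has been segmented from video, the following computes resting diameter (D1), mean constriction velocity (CV) and percentage constriction from a diameter time series. The sampling rate and waveform are invented, and the authors' MATLAB implementation may differ.

```python
# Toy pupil-kinetics extraction from a diameter-versus-time trace.

def pupil_metrics(diam_mm, fs_hz):
    """Resting diameter, mean constriction velocity and % constriction."""
    d1 = diam_mm[0]                      # resting diameter before the flash
    d_min = min(diam_mm)                 # maximal constriction
    t_min = diam_mm.index(d_min) / fs_hz
    cv = (d1 - d_min) / t_min            # mean constriction velocity, mm/s
    ratio = 100.0 * (d1 - d_min) / d1    # percentage constriction
    return d1, cv, ratio

# Invented trace at 10 Hz: 5 mm resting pupil constricting to 3 mm, then re-dilating
trace = [5.0, 5.0, 4.4, 3.8, 3.4, 3.1, 3.0, 3.2, 3.5, 3.8]
d1, cv, ratio = pupil_metrics(trace, fs_hz=10)
```

Re-dilation velocity (DV) and percentage re-dilation would follow the same pattern applied to the segment after maximal constriction.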
Prioritized Contact Transport Stream
NASA Technical Reports Server (NTRS)
Hunt, Walter Lee, Jr. (Inventor)
2015-01-01
A detection process, contact recognition process, classification process, and identification process are applied to raw sensor data to produce an identified contact record set containing one or more identified contact records. A prioritization process is applied to the identified contact record set to assign a contact priority to each contact record in the identified contact record set. Data are removed from the contact records in the identified contact record set based on the contact priorities assigned to those contact records. A first contact stream is produced from the resulting contact records. The first contact stream is streamed in a contact transport stream. The contact transport stream may include and stream additional contact streams. The contact transport stream may be varied dynamically over time based on parameters such as available bandwidth, contact priority, presence/absence of contacts, system state, and configuration parameters.
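The prioritization and data-removal steps can be sketched as follows; the field names and the confidence-based priority rule are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch: rank identified contact records, then strip bulky
# payload from lower-priority records when the bandwidth budget runs out.

def prioritize(contacts):
    """Assign priority by ranking on an assumed per-contact confidence score."""
    return sorted(contacts, key=lambda c: c["confidence"], reverse=True)

def build_stream(contacts, max_payload_bytes):
    budget = max_payload_bytes
    stream = []
    for c in prioritize(contacts):
        rec = dict(c)                      # leave the original record intact
        if budget >= rec["payload_bytes"]:
            budget -= rec["payload_bytes"]
        else:
            rec.pop("raw_sensor_data", None)  # drop bulky data, keep the summary
            rec["payload_bytes"] = 0
        stream.append(rec)
    return stream

contacts = [
    {"id": "A", "confidence": 0.9, "payload_bytes": 800, "raw_sensor_data": b"..."},
    {"id": "B", "confidence": 0.4, "payload_bytes": 800, "raw_sensor_data": b"..."},
]
stream = build_stream(contacts, max_payload_bytes=1000)
```

Varying `max_payload_bytes` per pass is one way to model the dynamic adjustment to available bandwidth that the abstract describes.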
ERIC Educational Resources Information Center
Van Engen, Kristin J.; Baese-Berk, Melissa; Baker, Rachel E.; Choi, Arim; Kim, Midam; Bradlow, Ann R.
2010-01-01
This paper describes the development of the Wildcat Corpus of native- and foreign-accented English, a corpus containing scripted and spontaneous speech recordings from 24 native speakers of American English and 52 non-native speakers of English. The core element of this corpus is a set of spontaneous speech recordings, for which a new method of…
Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W
2011-01-01
Background Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. Objective To develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. Study design and methods We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100 000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100 000 records to assess its accuracy. Results Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100 000 randomly selected patients showed high sensitivity (range: 62.8–100.0%) and positive predictive value (range: 79.8–99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. Conclusion We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts. PMID:21613643
Esteban, Santiago; Rodríguez Tablado, Manuel; Peper, Francisco; Mahumud, Yamila S; Ricci, Ricardo I; Kopitowski, Karin; Terrasa, Sergio
2017-01-01
Precision medicine requires extremely large samples. Electronic health records (EHR) are thought to be a cost-effective source of data for that purpose. Phenotyping algorithms help reduce classification errors, making EHR a more reliable source of information for research. Four algorithm development strategies for classifying patients according to their diabetes status (diabetic; non-diabetic; inconclusive) were tested: one codes-only algorithm; one Boolean algorithm; four statistical learning algorithms; and six stacked generalization meta-learners. The best performing algorithms within each strategy were tested on the validation set. The stacked generalization algorithm yielded the highest Kappa coefficient on the validation set (0.95, 95% CI 0.91-0.98). The implementation of these algorithms allows data from thousands of patients to be exploited accurately, greatly reducing the cost of constructing retrospective cohorts for research.
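A toy flavor of the stacked generalization strategy, under the assumption of two invented base classifiers: their boolean outputs form the feature vector for a simple table-lookup meta-learner, with unseen patterns mapped to "inconclusive". This is illustrative only; the study's actual learners and features differ.

```python
from collections import Counter, defaultdict

def base_codes(rec):   # invented code-based rule
    return "250" in rec["codes"]

def base_meds(rec):    # invented medication-based rule
    return bool({"metformin", "insulin"} & rec["meds"])

BASES = (base_codes, base_meds)

def fit_meta(train):
    """For each tuple of base-learner outputs, remember the majority label."""
    votes = defaultdict(Counter)
    for rec, label in train:
        key = tuple(b(rec) for b in BASES)
        votes[key][label] += 1
    return {key: c.most_common(1)[0][0] for key, c in votes.items()}

def predict(meta, rec):
    key = tuple(b(rec) for b in BASES)
    return meta.get(key, "inconclusive")   # unseen pattern: three-way output

train = [
    ({"codes": {"250"}, "meds": {"metformin"}}, "diabetic"),
    ({"codes": set(), "meds": set()}, "non-diabetic"),
    ({"codes": {"250"}, "meds": set()}, "diabetic"),
]
meta = fit_meta(train)
label = predict(meta, {"codes": set(), "meds": {"insulin"}})
```

The "inconclusive" fallback mirrors the three-class output in the abstract; a real stacked learner would fit a statistical model over out-of-fold base predictions rather than a lookup table.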
Corridor consultations and the medical microbiological record: is patient safety at risk?
Heard, S R; Roberts, C; Furrows, S J; Kelsey, M; Southgate, L
2003-01-01
The performance procedures of the General Medical Council are aimed at identifying seriously deficient performance in a doctor. The performance procedures require the medical record to be of a standard that enables the next doctor seeing the patient to give adequate care based on the available information. Setting standards for microbiological record keeping has proved difficult. Over one fifth of practising medical microbiologists (including virologists) in the UK (139 of 676) responded to a survey undertaken by the working group developing the performance procedures for microbiology, to identify current practice and to develop recommendations for agreement within the profession about the standards of the microbiological record. The cumulative frequencies for the surveyed recording methods indicated that at various times 65% (90 of 139) of respondents used a daybook, 62% (86 of 139) used the back of the clinical request card, 57% (79 of 139) used a computer record, and 22% (30 of 139) used an index card system to record microbiological advice, suggesting wide variability in how medical microbiologists maintain clinical records. PMID:12499432
Infectious hematopoietic necrosis virus virological and genetic surveillance 2000–2012
Breyta, Rachel; Brito, Ilana L.; Kurath, Gael; LaDeau, Shannon L.
2017-01-01
Surveillance records of the acute RNA pathogen of Pacific salmonid fish, infectious hematopoietic necrosis virus, are combined for the first time to enable landscape-level ecological analyses and modeling. The study area is the freshwater ecosystems of the large Columbia River watershed in the U.S. states of Washington, Oregon, and Idaho, as well as coastal rivers in Washington and Oregon. The study period is 2000–2012, and records were contributed by all five resource management agencies that operate conservation hatcheries in the study area. Additional records from wild fish were collected from the National Wild Fish Health Survey, operated by the U.S. Fish and Wildlife Service. After curation and normalization, the data set consists of 6766 records, representing 1146 sample sites and 15 different fish hosts. The virus was found in an average of 12.4% of records, and of these 66.2% also have viral genetic analysis available. This data set is used to conduct univariate ecological and epidemiological analyses and develop a novel hierarchical landscape transmission model for an aquatic pathogen.
Design and implementation of an affordable, public sector electronic medical record in rural Nepal.
Raut, Anant; Yarbrough, Chase; Singh, Vivek; Gauchan, Bikash; Citrin, David; Verma, Varun; Hawley, Jessica; Schwarz, Dan; Harsha Bangura, Alex; Shrestha, Biplav; Schwarz, Ryan; Adhikari, Mukesh; Maru, Duncan
2017-06-23
Globally, electronic medical records are central to the infrastructure of modern healthcare systems. Yet the vast majority of electronic medical records have been designed for resource-rich environments and are not feasible in settings of poverty. Here we describe the design and implementation of an electronic medical record at a public sector district hospital in rural Nepal, and its subsequent expansion to an additional public sector facility. Development: The electronic medical record was designed to solve for the following elements of public sector healthcare delivery: 1) integration of the systems across inpatient, surgical, outpatient, emergency, laboratory, radiology, and pharmacy sites of care; 2) effective data extraction for impact evaluation and government regulation; 3) optimization for longitudinal care provision and patient tracking; and 4) effectiveness for quality improvement initiatives. For these purposes, we adapted Bahmni, a product built with open-source components for patient tracking, clinical protocols, pharmacy, laboratory, imaging, financial management, and supply logistics. In close partnership with government officials, we deployed the system in February of 2015, added on additional functionality, and iteratively improved the system over the following year. This experience enabled us then to deploy the system at an additional district-level hospital in a different part of the country in under four weeks. We discuss the implementation challenges and the strategies we pursued to build an electronic medical record for the public sector in rural Nepal. Discussion: Over the course of 18 months, we were able to develop, deploy and iterate upon the electronic medical record, and then deploy the refined product at an additional facility within only four weeks. Our experience suggests the feasibility of an integrated electronic medical record for public sector care delivery even in settings of rural poverty.
Design and implementation of an affordable, public sector electronic medical record in rural Nepal
Raut, Anant; Yarbrough, Chase; Singh, Vivek; Gauchan, Bikash; Citrin, David; Verma, Varun; Hawley, Jessica; Schwarz, Dan; Harsha, Alex; Shrestha, Biplav; Schwarz, Ryan; Adhikari, Mukesh; Maru, Duncan
2018-01-01
Introduction Globally, electronic medical records are central to the infrastructure of modern healthcare systems. Yet the vast majority of electronic medical records have been designed for resource-rich environments and are not feasible in settings of poverty. Here we describe the design and implementation of an electronic medical record at a public sector district hospital in rural Nepal, and its subsequent expansion to an additional public sector facility. Development The electronic medical record was designed to solve for the following elements of public sector healthcare delivery: 1) integration of the systems across inpatient, surgical, outpatient, emergency, laboratory, radiology, and pharmacy sites of care; 2) effective data extraction for impact evaluation and government regulation; 3) optimization for longitudinal care provision and patient tracking; and 4) effectiveness for quality improvement initiatives. Application For these purposes, we adapted Bahmni, a product built with open-source components for patient tracking, clinical protocols, pharmacy, laboratory, imaging, financial management, and supply logistics. In close partnership with government officials, we deployed the system in February of 2015, added on additional functionality, and iteratively improved the system over the following year. This experience enabled us then to deploy the system at an additional district-level hospital in a different part of the country in under four weeks. We discuss the implementation challenges and the strategies we pursued to build an electronic medical record for the public sector in rural Nepal. Discussion Over the course of 18 months, we were able to develop, deploy and iterate upon the electronic medical record, and then deploy the refined product at an additional facility within only four weeks. Our experience suggests the feasibility of an integrated electronic medical record for public sector care delivery even in settings of rural poverty. PMID:28749321
ERIC Educational Resources Information Center
Sutradhar, B.
2006-01-01
Purpose: To describe how an institutional repository (IR) was set up, using open source software, at the Indian Institute of Technology (IIT) in Kharagpur. Members of the IIT can publish their research documents in the IR for online access as well as digital preservation. Material in this IR includes instructional materials, records, data sets,…
50 CFR 403.04 - Determinations and hearings under section 109(c) of the MMPA.
Code of Federal Regulations, 2010 CFR
2010-10-01
... management program the state must provide for a process, consistent with section 109(c) of the Act, to... must include the elements set forth below. (b) Basis, purpose, and scope. The process set forth in this... made solely on the basis of the record developed at the hearing. The state agency in making its final...
Daugherty, Bethany L; Schap, TusaRebecca E; Ettienne-Gittens, Reynolette; Zhu, Fengqing M; Bosch, Marc; Delp, Edward J; Ebert, David S; Kerr, Deborah A; Boushey, Carol J
2012-04-13
The development of a mobile telephone food record has the potential to ameliorate much of the burden associated with current methods of dietary assessment. When using the mobile telephone food record, respondents capture an image of their foods and beverages before and after eating. Methods of image analysis and volume estimation allow for automatic identification and volume estimation of foods. To obtain a suitable image, all foods and beverages and a fiducial marker must be included in the image. To evaluate a defined set of skills among adolescents and adults when using the mobile telephone food record to capture images and to compare the perceptions and preferences between adults and adolescents regarding their use of the mobile telephone food record. We recruited 135 volunteers (78 adolescents, 57 adults) to use the mobile telephone food record for one or two meals under controlled conditions. Volunteers received instruction for using the mobile telephone food record prior to their first meal, captured images of foods and beverages before and after eating, and participated in a feedback session. We used chi-square for comparisons of the set of skills, preferences, and perceptions between the adults and adolescents, and McNemar test for comparisons within the adolescents and adults. Adults were more likely than adolescents to include all foods and beverages in the before and after images, but both age groups had difficulty including the entire fiducial marker. Compared with adolescents, significantly more adults had to capture more than one image before (38% vs 58%, P = .03) and after (25% vs 50%, P = .008) meal session 1 to obtain a suitable image. Despite being less efficient when using the mobile telephone food record, adults were more likely than adolescents to perceive remembering to capture images as easy (P < .001). 
A majority of both age groups were able to follow the defined set of skills; however, adults were less efficient when using the mobile telephone food record. Additional interactive training will likely be necessary for all users to provide extra practice in capturing images before entering a free-living situation. These results will inform age-specific development of the mobile telephone food record that may translate to a more accurate method of dietary assessment.
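The statistical comparisons described above use the chi-square test between groups and the McNemar test within groups. As a rough, self-contained illustration (with invented counts, not the study's data), the McNemar statistic for paired binary outcomes and its 1-df chi-square p-value can be computed as:

```python
import math

def mcnemar(b, c):
    """McNemar's chi-square for paired binary outcomes.

    b: pairs that succeeded on occasion 1 but failed on occasion 2
    c: pairs that failed on occasion 1 but succeeded on occasion 2
    Returns (statistic, two-sided p-value). For a chi-square variable
    with 1 df, the survival function is erfc(sqrt(x / 2)).
    """
    if b + c == 0:
        raise ValueError("no discordant pairs")
    stat = (b - c) ** 2 / (b + c)
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p

# Invented example: 20 participants improved between meals, 8 got worse.
stat, p = mcnemar(20, 8)
print(f"chi2 = {stat:.3f}, p = {p:.4f}")
```

For small numbers of discordant pairs an exact binomial version is usually preferred; the form above is the large-sample approximation.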
Carrell, David S; Schoen, Robert E; Leffler, Daniel A; Morris, Michele; Rose, Sherri; Baer, Andrew; Crockett, Seth D; Gourevitch, Rebecca A; Dean, Katie M; Mehrotra, Ateev
2017-09-01
Widespread application of clinical natural language processing (NLP) systems requires taking existing NLP systems and adapting them to diverse and heterogeneous settings. We describe the challenges faced and lessons learned in adapting an existing NLP system for measuring colonoscopy quality. Colonoscopy and pathology reports from 4 settings during 2013-2015, varying by geographic location, practice type, compensation structure, and electronic health record. Though successful, adaptation required considerably more time and effort than anticipated. Typical NLP challenges in assembling corpora, diverse report structures, and idiosyncratic linguistic content were greatly magnified. Strategies for addressing adaptation challenges include assessing site-specific diversity, setting realistic timelines, leveraging local electronic health record expertise, and undertaking extensive iterative development. More research is needed on how to make it easier to adapt NLP systems to new clinical settings. A key challenge in widespread application of NLP is adapting existing systems to new clinical settings. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Gizaw, Solomon; Goshme, Shenkute; Getachew, Tesfaye; Haile, Aynalem; Rischkowsky, Barbara; van Arendonk, Johan; Valle-Zárate, Anne; Dessie, Tadelle; Mwai, Ally Okeyo
2014-06-01
Pedigree recording and genetic selection in village flocks of smallholder farmers have been deemed infeasible by researchers and development workers. This is mainly due to the difficulty of sire identification under uncontrolled village breeding practices. A cooperative village sheep-breeding scheme was designed to achieve controlled breeding and implemented for Menz sheep of Ethiopia in 2009. In this paper, we evaluated the reliability of pedigree recording in village flocks by comparing genetic parameters estimated from data sets collected in the cooperative village and in a nucleus flock maintained under controlled breeding. Effectiveness of selection in the cooperative village was evaluated based on trends in breeding values over generations. Heritability estimates for 6-month weight recorded in the village and the nucleus flock were very similar. There was an increasing trend over generations in average estimated breeding values for 6-month weight in the village flocks. These results have a number of implications: the pedigree recorded in the village flocks was reliable; genetic parameters, which have so far been estimated based on nucleus data sets, can be estimated based on village recording; and appreciable genetic improvement could be achieved in village sheep selection programs under low-input smallholder farming systems.
Technical solutions for simultaneous MEG and SEEG recordings: towards routine clinical use.
Badier, J M; Dubarry, A S; Gavaret, M; Chen, S; Trébuchon, A S; Marquis, P; Régis, J; Bartolomei, F; Bénar, C G; Carron, R
2017-09-21
The simultaneous recording of intracerebral EEG (stereotaxic EEG, SEEG) and magnetoencephalography (MEG) is a promising strategy that provides both local and global views on brain pathological activity. Yet, acquiring simultaneous signals poses difficult technical issues that hamper their use in clinical routine. Our objective was thus to develop a set of solutions for recording a high number of SEEG channels while preserving signal quality. We recorded data in a patient with drug resistant epilepsy during presurgical evaluation. We used dedicated insertion screws and optically insulated amplifiers. We recorded 137 SEEG contacts on 10 depth electrodes (5-15 contacts each) and 248 MEG channels (magnetometers). Signal quality was assessed by comparing the distribution of RMS values in different frequency bands to a reference set of MEG acquisitions. The quality of signals was excellent for both MEG and SEEG; for MEG, it was comparable to that of MEG signals without concurrent SEEG. Discharges involving several structures on SEEG were visible on MEG, whereas discharges limited in space were not seen at the surface. SEEG can now be recorded simultaneously with whole-head MEG in routine. This opens new avenues, both methodologically for understanding signals and improving signal processing methods, and clinically for future combined analyses.
Jiang, Bo; Huang, Yu Dong
2014-01-01
Near infrared spectra combined with partial least squares were proposed as a means of non-contact analysis of the adsorptive ink capacity of recording coating materials in ink jet printing. First, the recording coating materials were prepared based on nano silica pigments, and 80 samples were selected to develop the calibration of adsorptive ink capacity against ink adsorption (g/m2). The model predicted samples in the validation set with r2 = 0.80 and SEP = 1.108; these results showed that near infrared spectra have significant potential for estimating the adsorptive ink capacity of recording coatings. The influence of factors such as recording coating thickness, the silica : polyvinyl alcohol binder mass ratio, and the solution concentration on the adsorptive ink capacity was also studied. With the help of near infrared spectra, the adsorptive ink capacity of a recording coating material can be rapidly controlled. PMID:25329464
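The calibration quality figures quoted in this abstract, r2 and SEP (standard error of prediction), are standard chemometric validation metrics. A minimal sketch of how they are computed for a validation set, using made-up reference and predicted values rather than the study's spectra:

```python
import math

def r_squared(y_true, y_pred):
    """Coefficient of determination between reference and predicted values."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def sep(y_true, y_pred):
    """Standard error of prediction: std. dev. of the bias-corrected residuals."""
    residuals = [t - p for t, p in zip(y_true, y_pred)]
    bias = sum(residuals) / len(residuals)
    return math.sqrt(sum((r - bias) ** 2 for r in residuals) / (len(residuals) - 1))

# Invented validation data: ink adsorption in g/m2 (reference vs NIR-predicted).
y_ref = [10.2, 12.5, 9.8, 14.1, 11.0, 13.3]
y_nir = [10.8, 12.1, 9.1, 14.9, 11.5, 12.6]
print(f"r2 = {r_squared(y_ref, y_nir):.3f}, SEP = {sep(y_ref, y_nir):.3f}")
```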
Highly scalable parallel processing of extracellular recordings of Multielectrode Arrays.
Gehring, Tiago V; Vasilaki, Eleni; Giugliano, Michele
2015-01-01
Technological advances of Multielectrode Arrays (MEAs) used for multisite, parallel electrophysiological recordings, lead to an ever increasing amount of raw data being generated. Arrays with hundreds up to a few thousands of electrodes are slowly seeing widespread use and the expectation is that more sophisticated arrays will become available in the near future. In order to process the large data volumes resulting from MEA recordings there is a pressing need for new software tools able to process many data channels in parallel. Here we present a new tool for processing MEA data recordings that makes use of new programming paradigms and recent technology developments to unleash the power of modern highly parallel hardware, such as multi-core CPUs with vector instruction sets or GPGPUs. Our tool builds on and complements existing MEA data analysis packages. It shows high scalability and can be used to speed up some performance critical pre-processing steps such as data filtering and spike detection, helping to make the analysis of larger data sets tractable.
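The performance-critical pre-processing steps named above, filtering and spike detection, are naturally parallel across channels. The outline below is an illustrative pure-Python sketch, not the tool's actual implementation: a crude moving-average filter, a robust median-based threshold (a common choice in extracellular spike detection), and a worker pool fanning out over channels.

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import median

def smooth(signal, width=3):
    """Crude moving-average filter standing in for a real band-pass stage."""
    half = width // 2
    return [
        sum(signal[max(0, i - half):i + half + 1])
        / len(signal[max(0, i - half):i + half + 1])
        for i in range(len(signal))
    ]

def detect_spikes(signal, k=5.0):
    """Indices where |x| exceeds k * (median(|x|) / 0.6745),
    a robust noise estimate widely used for extracellular recordings."""
    noise = median(abs(x) for x in signal) / 0.6745
    threshold = k * noise
    return [i for i, x in enumerate(signal) if abs(x) > threshold]

def process_channel(signal):
    return detect_spikes(smooth(signal))

# Two toy channels: low-amplitude noise with one large "spike" each.
channels = [
    [0.1, -0.1, 0.2, -0.2, 9.0, 0.1, -0.1, 0.2, -0.2, 0.1],
    [0.2, -0.2, 0.1, -0.1, 0.1, -8.5, 0.2, -0.1, 0.1, -0.2],
]
with ThreadPoolExecutor() as pool:  # CPU-bound work would favor processes
    spikes = list(pool.map(process_channel, channels))
print(spikes)
```

Note that smoothing smears each spike across neighboring samples, so a single event may be reported at several adjacent indices; real pipelines collapse such runs into one detection.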
7 Steps to Better Reading--A Districtwide Approach
ERIC Educational Resources Information Center
Scroggins, John; Powers, Linda
2004-01-01
The Ponca City (Okla.) Public Schools set a higher literacy level as a goal, then developed a comprehensive plan to achieve that goal. A district focus, commitment, professional development for administrators and teachers, assessments to drive instruction, on-site coaching, monitoring, and careful record keeping all contributed to systemic reform.
Collection Development Analysis Using OCLC Archival Tapes. Final Report.
ERIC Educational Resources Information Center
Evans, Glyn T.; And Others
The purpose of this project is to develop a set of computer programs to perform a variety of collection development analyses on the machine-readable cataloging (MARC) records that are produced as a byproduct of use of the online cataloging subsystem of the Ohio College Library System (OCLC), and made available through the OCLC Distribution Tape…
Ghitza, Udi E; Gore-Langton, Robert E; Lindblad, Robert; Shide, David; Subramaniam, Geetha; Tai, Betty
2013-01-01
Electronic health records (EHRs) are essential in improving quality and enhancing efficiency of health-care delivery. By 2015, medical care receiving service reimbursement from US Centers for Medicare and Medicaid Services (CMS) must show 'meaningful use' of EHRs. Substance use disorders (SUD) are grossly under-detected and under-treated in current US medical care settings. Hence, an urgent need exists for improved identification of and clinical intervention for SUD in medical settings. The National Institute on Drug Abuse Clinical Trials Network (NIDA CTN) has leveraged its infrastructure and expertise and brought relevant stakeholders together to develop consensus on brief screening and initial assessment tools for SUD in general medical settings, with the objective of incorporation into US EHRs. Stakeholders were identified and queried for input and consensus on validated screening and assessment for SUD in general medical settings to develop common data elements to serve as shared resources for EHRs on screening, brief intervention and referral to treatment (SBIRT), with the intent of supporting interoperability and data exchange in a developing Nationwide Health Information Network. Through consensus of input from stakeholders, a validated screening and brief assessment instrument, supported by Clinical Decision Support tools, was chosen to be used at out-patient general medical settings. The creation and adoption of a core set of validated common data elements and the inclusion of such consensus-based data elements for general medical settings will enable the integration of SUD treatment within mainstream health care, and support the adoption and 'meaningful use' of the US Office of the National Coordinator for Health Information Technology (ONC)-certified EHRs, as well as CMS reimbursement. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.
Greenstone belts: Their boundaries, surrounding rock terrains and interrelationships
NASA Technical Reports Server (NTRS)
Percival, J. A.; Card, K. D.
1986-01-01
Greenstone belts are an important part of the fragmented record of crustal evolution, representing samples of the magmatic activity that formed much of the Earth's crust. Most belts developed rapidly, in less than 100 Ma, leaving large gaps in the geological record. Surrounding terrains provide information on the context of greenstone belts. The effects of tectonic setting, structural geometry and evolution, associated plutonic activity and sedimentation are discussed.
An e-consent-based shared EHR system architecture for integrated healthcare networks.
Bergmann, Joachim; Bott, Oliver J; Pretschner, Dietrich P; Haux, Reinhold
2007-01-01
Virtual integration of distributed patient data promises advantages over a consolidated health record, but raises questions mainly about practicability and authorization concepts. Our work aims at the specification and development of a virtual shared health record architecture using a patient-centred integration and authorization model. A literature survey summarizes considerations of current architectural approaches. Complementing the survey with a methodical analysis in two regional settings, we specified and implemented a formal architecture model. Results presented in this paper are a survey of architectural approaches for shared health records and an architecture model for a virtual shared EHR, which combines a patient-centred integration policy with provider-oriented document management. An electronic consent system assures that access to the shared record remains under control of the patient. A corresponding system prototype has been developed and is currently being introduced and evaluated in a regional setting. The proposed architecture is capable of partly replacing message-based communications. Operating highly available provider repositories for the virtual shared EHR requires advanced technology and probably means additional costs for care providers. Acceptance of the proposed architecture depends on transparently embedding document validation and digital signature into the work processes. The paradigm shift from paper-based messaging to a "pull model" needs further evaluation.
de Araujo Furtado, Marcio; Zheng, Andy; Sedigh-Sarvestani, Madineh; Lumley, Lucille; Lichtenstein, Spencer; Yourick, Debra
2009-10-30
The organophosphorus compound soman is an acetylcholinesterase inhibitor that causes damage to the brain. Exposure to soman causes neuropathology as a result of prolonged and recurrent seizures. In the present study, long-term recordings of cortical EEG were used to develop an unbiased means to quantify measures of seizure activity in a large data set while excluding other signal types. Rats were implanted with telemetry transmitters and exposed to soman followed by treatment with therapeutics similar to those administered in the field after nerve agent exposure. EEG, activity and temperature were recorded continuously for a minimum of 2 days pre-exposure and 15 days post-exposure. A set of automatic MATLAB algorithms has been developed to remove artifacts and measure the characteristics of long-term EEG recordings. The algorithms use short-time Fourier transforms to compute the power spectrum of the signal for 2-s intervals. The spectrum is then divided into the delta, theta, alpha, and beta frequency bands. A linear fit to the power spectrum is used to distinguish normal EEG activity from artifacts and high amplitude spike wave activity. Changes in time spent in seizure over a prolonged period are a powerful indicator of the effects of novel therapeutics against seizures. A graphical user interface has been created that simultaneously plots the raw EEG in the time domain, the power spectrum, and the wavelet transform. Motor activity and temperature are associated with EEG changes. The accuracy of this algorithm is also verified against visual inspection of video recordings up to 3 days after exposure.
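The core of the pipeline, computing a power spectrum for each 2-s interval and splitting it into the delta, theta, alpha, and beta bands, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' MATLAB code: a naive DFT stands in for an optimized FFT, and the band edges (1-4, 4-8, 8-13, 13-30 Hz) are the conventional ranges, assumed rather than taken from the paper. The artifact-rejection step (a linear fit to the power spectrum) is omitted here.

```python
import cmath
import math

FS = 256  # assumed sampling rate (Hz); a 2-s window is then N = 512 samples
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def power_spectrum(window):
    """Naive DFT power spectrum (|X_k|^2); O(N^2), fine for a demo."""
    n = len(window)
    return [
        abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(window))) ** 2
        for k in range(n // 2)
    ]

def band_powers(window, fs=FS):
    """Total spectral power in each classical EEG band for one window."""
    spectrum = power_spectrum(window)
    hz_per_bin = fs / len(window)
    return {
        name: sum(spectrum[int(lo / hz_per_bin):int(hi / hz_per_bin)])
        for name, (lo, hi) in BANDS.items()
    }

# A 2-s window dominated by a 10 Hz (alpha-range) oscillation.
window = [math.sin(2 * math.pi * 10 * i / FS) for i in range(2 * FS)]
powers = band_powers(window)
print(max(powers, key=powers.get))
```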
Online personal medical records: are they reliable for acute/critical care?
Schneider, J H
2001-08-01
To provide an introduction to Internet-based Online Personal Medical Records (OPMRs), to assess their use and limitations in acute/critical care situations, and to identify potential improvements that could increase their usefulness. A review of publicly available Internet-based OPMRs conducted in April 2001. Twenty-nine OPMR sites were identified in March 2000 using ten Internet search engines with the search term "Personal Medical Records." Through 2000 and 2001, an additional 37 sites were identified using lists obtained from trade journals and through the author's participation in standards-setting meetings. Each publicly available site was reviewed to assess suitability for acute/critical care situations using four measures developed by the author and for general use using eight measures developed in a standards-setting process described in the article. Of the 66 companies identified, only 16 still offer OPMRs that are available to the public on the Internet. None of these met all of the evaluation measures. Only 19% had rapid emergency access capabilities and only 63% provided medical summaries of the record. Security and confidentiality issues were well addressed in 94% of sites. Data portability was virtually nonexistent because all OPMRs lacked the ability to exchange data electronically with other OPMRs, and only two OPMRs permitted data transfer from physician electronic medical records. Controls over data accuracy were poor: 81% of sites allowed entry of dates for medical treatment before the patient's date of birth, and one site actually gave incorrect medical advice. OPMRs were periodically inaccessible because of programming deficiencies. Finally, approximately 40 sites ceased providing OPMRs in the past year, with the probable loss of patient information. Most OPMRs are not ready for use in acute/critical care situations. Many are just electronic versions of the paper-based health record notebooks that patients have used for years. 
They have, however, great promise and, with further development, could form the basis of a new medical record system that could contribute to improving the quality of medical care.
Strasser, Torsten; Peters, Tobias; Jägle, Herbert; Zrenner, Eberhart
2018-02-01
The ISCEV standards and recommendations for electrophysiological recordings in ophthalmology define a set of protocols with stimulus parameters, acquisition settings, and recording conditions, to unify the data and enable comparability of results across centers. Up to now, however, there are no standards to define the storage and exchange of such electrophysiological recordings. The aim of this study was to develop an open standard data format for the exchange and storage of visual electrophysiological data (ElVisML). We first surveyed existing data formats for biomedical signals and examined their suitability for electrophysiological data in ophthalmology. We then compared the suitability of text-based and binary formats, as well as encoding in Extensible Markup Language (XML) and character/comma-separated values. The results of the methodological consideration led to the development of ElVisML with an XML-encoded text-based format. This allows referential integrity, extensibility, the storing of accompanying units, as well as ensuring confidentiality and integrity of the data. A visualization of ElVisML documents (ElVisWeb) has additionally been developed, which facilitates the exchange of recordings on mailing lists and allows open access to data along with published articles. The open data format ElVisML ensures the quality, validity, and integrity of electrophysiological data transmission and storage as well as providing manufacturer-independent access and long-term archiving in a future-proof format. Standardization of the format of such neurophysiology data would promote the development of new techniques and open software for the use of neurophysiological data in both clinic and research.
Ryan, K E; Walsh, J P; Corbett, D R; Winter, A
2008-06-01
Increased sediment flux to the coastal ocean due to coastal development is considered a major threat to the viability of coral reefs. A change in the nature of sediment supply and storage has been identified in a variety of coastal settings, particularly in response to European colonization, but sedimentation around reefs has received less attention. This research examines the sedimentary record adjacent to a coastal village that has experienced considerable land-use change over the last few decades. Sediment cores were analyzed to characterize composition and sediment accumulation rates. Sedimentation rates decreased seaward across the shelf from 0.85 cm y(-1) in a nearshore bay to 0.19 cm y(-1) in a fore-reef setting. Data reflected a significant (up to 2x) increase over the last approximately 80 years in terrestrial sediment accumulating in the back-reef setting, suggesting greater terrestrial sediment flux to the area. Reef health has declined, and increased turbidity is believed to be an important impact, particularly when combined with additional stressors.
Childrens Hospital Integrated Patient Electronic Record System Continuation (CHIPERS)
2015-12-01
of our interactive BPAs and order set technology. The Cumberland Group has been the primary consulting group assisting the entire inpatient and...to allow for continued BPA development and iteration during the upgrade period, while also effecting a seamless transition of BPA and order set...clinical decision support system. Our BPA logic includes: Severe Sepsis Alert Logic Summary: Age + Temperature + White Blood Cell and (lethargy or
Aero-Optics Measurement System for the AEDC Aero-Optics Test Facility
1991-02-01
Pulse Energy Statistics, 150 Pulses ... AEDC-TR-90-20, Appendixes: A. Optical Performance of Heated Windows ... hypersonic wind tunnel, where the requisite extensive statistical database can be developed in a cost- and time-effective manner. Ground testing ... At the present time at AEDC, measured AO parameter statistics are derived from sets of image-spot recordings with a set containing as many as 150
The Evolution of Ambulatory Medical Record Systems in the U.S.
Kuhn, Ingeborg M.; Wiederhold, Gio
1981-01-01
This paper is an overview of the developments in Automated Ambulatory Medical Record Systems (AAMRS) from 1975 to the present. A summary of findings from a 1975 state-of-the-art review is presented with the current findings of a follow-up study of the AAMRS. The studies revealed that effective automated medical record systems have been developed for ambulatory care settings and that they are now in the process of being transferred to other sites or users, either privately or as a commercial product. Since 1975 there have been no significant advances in system design. However, progress has been substantial in terms of achieving production goals. Even though a variety of systems are commercially available, there is a continuing need for research and development to improve the effectiveness of the systems in use today.
Martin, B C
2000-01-01
The high cost of emergency department (ED) care is often viewed as an area for achieving cost savings through reduced utilization for inappropriate conditions. The implementation of outpatient prospective payment for Medicare ED patients heightens scrutiny of costs and utilization in the ED versus primary care settings. Data from hospital clinical records, financial records, and a provider survey were used to develop a costing methodology and complete a comparative analysis of the cost of care for three diagnoses by setting. Total costs were significantly higher in the ED due primarily to differences in ancillary tests and prescription drugs ordered.
Technical Assistance for the Conservation of Built Heritage at Bagan, Myanmar
NASA Astrophysics Data System (ADS)
Mezzino, D.; Santana Quintero, M.; Ma Pwint, P.; Tin Htut Latt, W.; Rellensmann, C.
2016-06-01
Presenting the outcomes of a capacity building activity, this contribution illustrates a replicable recording methodology to obtain timely, relevant and accurate information about conditions, materials and transformations of heritage structures. The purpose of the presented training activity consisted of developing local capabilities for the documentation of the built heritage at Bagan, Myanmar, employing different IT-supported techniques. Under the direction of UNESCO, with the direct supervision of the chief of the culture unit, and in close consultation and cooperation with the Association of Myanmar Architects and the Department of Archaeology National Museum and Library (DoA), a documentation strategy was developed in order to set up a recording methodology for the over three thousand Bagan monuments. The site, located in central Myanmar, in South East Asia, was developed between the 9th and the 13th century as the capital of the Myanmar kingdom. In recent years, this outstanding site has been exposed to an increasing number of natural hazards, including earthquakes and flooding, that have strongly affected its built structures. Therefore, a documentation strategy to quickly capture the shape, color, geometry and conditions of the monuments, in order to develop proper conservation projects, was needed. The scope of the training activity consisted of setting up a recording strategy to update the existing Bagan inventory, using three Buddhist temples as pilot case studies. The three documented temples differed in size, construction period, conditions and shape. The documentation comprised several IT-supported techniques: Electronic Distance Measurements (EDM), SFM Photogrammetry, Laser Scanning, Record Photography, as well as hand measurements and field notes. The surveying of the monuments was carried out in accordance with the guidelines and standards established by the ICOMOS International Committee for Documentation of Cultural Heritage (CIPA). 
Recommendations on how to extend the adopted methodology to the other Bagan monuments have been also elaborated.
A Temporal Pattern Mining Approach for Classifying Electronic Health Record Data
Batal, Iyad; Valizadegan, Hamed; Cooper, Gregory F.; Hauskrecht, Milos
2013-01-01
We study the problem of learning classification models from complex multivariate temporal data encountered in electronic health record systems. The challenge is to define a good set of features that are able to represent well the temporal aspect of the data. Our method relies on temporal abstractions and temporal pattern mining to extract the classification features. Temporal pattern mining usually returns a large number of temporal patterns, most of which may be irrelevant to the classification task. To address this problem, we present the Minimal Predictive Temporal Patterns framework to generate a small set of predictive and non-spurious patterns. We apply our approach to the real-world clinical task of predicting patients who are at risk of developing heparin induced thrombocytopenia. The results demonstrate the benefit of our approach in efficiently learning accurate classifiers, which is a key step for developing intelligent clinical monitoring systems. PMID:25309815
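As a toy illustration of the first step, temporal abstraction, a numeric series (say, platelet counts) can be mapped to a sequence of qualitative states, and candidate patterns counted across patients. This is a deliberately simplified sketch of the general idea, not the paper's Minimal Predictive Temporal Patterns algorithm; the thresholds and patient data are invented.

```python
def abstract_states(values, low=150, high=400):
    """Map numeric lab values to qualitative states (value abstraction)."""
    def state(v):
        if v < low:
            return "low"
        if v > high:
            return "high"
        return "normal"
    out = []
    for v in values:  # collapse consecutive repeats into episodes
        s = state(v)
        if not out or out[-1] != s:
            out.append(s)
    return out

def contains_pattern(states, pattern):
    """True if `pattern` occurs as a (not necessarily contiguous) subsequence."""
    it = iter(states)
    return all(p in it for p in pattern)

# Invented platelet-count series for three patients.
patients = {
    "p1": [250, 240, 120, 90],      # normal -> low: the pattern of interest
    "p2": [300, 310, 305],
    "p3": [200, 140, 180, 110],
}
pattern = ("normal", "low")
support = sum(contains_pattern(abstract_states(v), pattern)
              for v in patients.values())
print(support)
```

Counting how many patients match a pattern gives its support; the paper's contribution is pruning the huge space of such patterns down to a small, predictive, non-spurious subset.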
An ontology-based method for secondary use of electronic dental record data.
Schleyer, Titus Kl; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P; Liu, Kaihong; Hernandez, Pedro
2013-01-01
A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance.
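The extraction queries described run as SPARQL over an RDF knowledge base. As a toy illustration of the underlying idea, matching basic graph patterns against triples, here is a pure-Python sketch; the predicate and identifier names are invented, not actual OHD terms:

```python
# Toy triple store: (subject, predicate, object), with None as a wildcard,
# mimicking the shape of a SPARQL basic graph pattern.
TRIPLES = [
    ("patient:1", "hasTooth", "tooth:1_3"),
    ("tooth:1_3", "hasRestoration", "restoration:amalgam"),
    ("patient:2", "hasTooth", "tooth:2_14"),
]

def match(pattern, triples=TRIPLES):
    """Return all triples consistent with a (s, p, o) pattern."""
    s, p, o = pattern
    return [
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# "Which teeth does patient:1 have?"
teeth = [o for _, _, o in match(("patient:1", "hasTooth", None))]
print(teeth)
```

In practice this role is played by a SPARQL engine (for example via rdflib or a triple store) evaluating queries against the OHD-structured knowledge base.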
Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities*
Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab
2006-01-01
This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546
PDP Implementation at English Universities: What Are the Issues?
ERIC Educational Resources Information Center
Quinton, Sarah; Smallbone, Teresa
2008-01-01
Personal development planning (PDP) is now a nationally required part of undergraduate and postgraduate education in the United Kingdom. Little is known about how universities in general are implementing personal development plans, nor how engaged students will become in compiling a set of records of their learning and progress, which they…
Developing Inquiry-as-Stance and Repertoires of Practice: Teacher Learning across Two Settings
ERIC Educational Resources Information Center
Braaten, Melissa L.
2011-01-01
Sixteen science educators joined a science teacher video club for one school year to collaboratively inquire into each other's classroom practice through the use of records of practice including classroom video clips and samples of student work. This group was focused on developing ambitious, equitable science teaching that capitalizes on…
Developing the System for Observing Behavioral Ecology for Youth in Schools Instrument
ERIC Educational Resources Information Center
Lorenz, Kent A.; van der Mars, Hans; Kulinna, Pamela H.; Ainsworth, Barbara E.; Hovell, Melbourne F.
2017-01-01
Background: Behavioral support may be effective in increasing physical activity (PA) in school settings. However, there are no data collection systems to concurrently record PA and behavioral support. This paper describes the development and validation of the System for Observing Behavioral Ecology for Youth in Schools (SOBEYS)--an instrument used…
Braitstein, Paula; Einterz, Robert M; Sidle, John E; Kimaiyo, Sylvester; Tierney, William
2009-11-01
Health care for patients with HIV infection in developing countries has increased substantially in response to major international funding. Scaling up treatment programs requires timely data on the type, quantity, and quality of care being provided. Increasingly, such programs are turning to electronic health records (EHRs) to provide these data. We describe how a medical school in the United States and another in Kenya collaborated to develop and implement an EHR in a large HIV/AIDS care program in western Kenya. These data were used to manage patients, providers, and the program itself as it grew to encompass 18 sites serving more than 90,000 patients. Lessons learned have been applicable beyond HIV/AIDS to include primary care, chronic disease management, and community-based health screening and disease prevention programs. EHRs will be key to providing the highest possible quality of care for the funds developing countries can commit to health care. Public, private, and academic partnerships can facilitate the development and implementation of EHRs in resource-constrained settings.
Nemeth, Lynne S; Feifer, Chris; Stuart, Gail W; Ornstein, Steven M
2008-01-16
Implementing change in primary care is difficult, and little practical guidance is available to assist small primary care practices. Methods to structure care and develop new roles are often needed to implement an evidence-based practice that improves care. This study explored the process of change used to implement clinical guidelines for primary and secondary prevention of cardiovascular disease in primary care practices that used a common electronic medical record (EMR). Multiple conceptual frameworks informed this study, which was designed to explain the complex phenomena of implementing change in primary care practice. Qualitative methods were used to examine the processes of change that practice members used to implement the guidelines. Purposive sampling in eight primary care practices within the Practice Partner Research Network-Translating Researching into Practice (PPRNet-TRIP II) clinical trial yielded 28 staff members and clinicians who were interviewed regarding how change in practice occurred while implementing clinical guidelines for primary and secondary prevention of cardiovascular disease and strokes. A conceptual framework for implementing clinical guidelines into primary care practice was developed through this research.
Seven concepts and their relationships were modelled within this framework: leaders setting a vision with clear goals for staff to embrace; involving the team to enable the goals and vision for the practice to be achieved; enhancing communication systems to reinforce goals for patient care; developing the team to enable the staff to contribute toward practice improvement; taking small steps, encouraging practices' tests of small changes in practice; assimilating the electronic medical record to maximize clinical effectiveness, enhancing practices' use of the electronic tool they have invested in for patient care improvement; and providing feedback within a culture of improvement, leading to an iterative cycle of goal setting by leaders. This conceptual framework provides a mental model which can serve as a guide for practice leaders implementing clinical guidelines in primary care practice using electronic medical records. Using the concepts as implementation and evaluation criteria, program developers and teams can stimulate improvements in their practice settings. Investing in collaborative team development of clinicians and staff may enable the practice environment to be more adaptive to change and improvement.
Background noise model development for seismic stations of KMA
NASA Astrophysics Data System (ADS)
Jeon, Youngsoo
2010-05-01
Background noise recorded by a seismometer is present in any seismic signal, owing to the natural properties of the medium the signal passes through. Reducing seismic noise is very important for improving data quality in seismic studies, but the most important step in reducing it is to select an appropriate site before installing the seismometer. For this reason, NIMR (the National Institute of Meteorological Research) began developing a standard background noise model for the broadband seismic stations of the KMA (Korea Meteorological Administration), using a continuous data set obtained from 13 broadband stations during 2007 and 2008. We also developed a model using short-period seismic data from 10 stations recorded in 2009. The method of McNamara and Buland (2004) was applied to analyze the background noise of the Korean Peninsula. The fact that borehole seismometer records show lower noise levels at frequencies greater than 1 Hz than surface records indicates that cultural noise in the Korean interior must be considered when processing the seismic data set. The double-frequency peak must also be addressed, because the Korean Peninsula is surrounded by seas to the east, west, and south. The development of the KMA background model shows that the Peterson model (1993) does not fit the background noise generated in the Korean Peninsula.
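The McNamara and Buland approach characterizes station noise by computing power spectral densities for many overlapping segments and examining their statistical distribution per frequency bin, rather than a single averaged spectrum. A minimal numeric sketch of that idea (synthetic data, illustrative parameters only):

```python
import numpy as np

# Minimal sketch of the McNamara & Buland (2004) idea: many overlapping
# segment PSDs, summarized per frequency bin by percentiles rather than a
# single mean. Sampling rate and segment length are illustrative.

rng = np.random.default_rng(0)
fs = 20.0                        # sampling rate (Hz)
x = rng.normal(size=20 * 3600)   # one "hour" of synthetic noise

seg_len = 4096
psds = []
for start in range(0, len(x) - seg_len, seg_len // 2):   # 50% overlap
    seg = x[start:start + seg_len] * np.hanning(seg_len)  # taper
    spec = np.fft.rfft(seg)
    psd = (np.abs(spec) ** 2) / (fs * seg_len)
    psds.append(10 * np.log10(psd[1:]))                   # dB, drop DC bin

psds = np.array(psds)
# Per-bin percentiles approximate the noise probability density function
p10, p90 = np.percentile(psds, [10, 90], axis=0)
print(psds.shape, p10.shape)
```

A station model like Peterson's new low/high noise models is then compared against these percentile curves rather than against any single PSD estimate.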
Avecilla-Ramírez, G N; Ruiz-Correa, S; Marroquin, J L; Harmony, T; Alba, A; Mendoza-Montoya, O
2011-12-01
This study presents evidence suggesting that electrophysiological responses to language-related auditory stimuli recorded at 46 weeks postconceptional age (PCA) are associated with language development, particularly in infants with periventricular leukomalacia (PVL). In order to investigate this hypothesis, electrophysiological responses to a set of auditory stimuli consisting of series of syllables and tones were recorded from a population of infants with PVL at 46 weeks PCA. A communicative development inventory (i.e., parent report) was administered to this population during a follow-up study performed at 14 months of age. The results of this inventory were analyzed with a statistical clustering procedure, which yielded two well-defined groups identified as the high-score (HS) and low-score (LS) groups. The event-induced power of the EEG data recorded at 46 weeks PCA was analyzed using a dimensionality reduction approach, resulting in a new set of descriptive variables. The LS and HS groups formed well-separated clusters in the space spanned by these descriptive variables, which can therefore be used to predict whether a new subject will belong to either of these groups. A predictive classification rate of 80% was obtained by using a linear classifier that was trained with a leave-one-out cross-validation technique. 2011 Elsevier Inc. All rights reserved.
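The validation scheme described, leave-one-out cross-validation of a linear classifier on a low-dimensional feature space, can be sketched as follows. This is a generic illustration with made-up data and a nearest-centroid rule standing in for the authors' linear classifier:

```python
# Hedged sketch of leave-one-out cross-validation of a simple linear
# (nearest-centroid) classifier. The data and labels are invented stand-ins
# for the HS/LS descriptive variables.

def nearest_centroid_predict(train, labels, x):
    """Assign x to the class with the nearer mean (a linear decision rule)."""
    dists = {}
    for lab in set(labels):
        pts = [p for p, l in zip(train, labels) if l == lab]
        mean = [sum(c) / len(pts) for c in zip(*pts)]
        dists[lab] = sum((a - b) ** 2 for a, b in zip(x, mean))
    return min(dists, key=dists.get)

# Two well-separated clusters standing in for the HS and LS groups
data = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.3), (1.1, 0.9), (0.9, 1.2), (1.0, 1.0)]
labels = ["LS", "LS", "LS", "HS", "HS", "HS"]

correct = 0
for i in range(len(data)):                    # leave-one-out loop
    train = data[:i] + data[i + 1:]
    lab = labels[:i] + labels[i + 1:]
    if nearest_centroid_predict(train, lab, data[i]) == labels[i]:
        correct += 1
rate = correct / len(data)
print(rate)  # 1.0 on this cleanly separable toy set
```

The reported 80% classification rate would be the analogous `rate` computed on the real infants' EEG-derived variables.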
NASA Technical Reports Server (NTRS)
Hubert, Daan; Lambert, Jean-Christopher; Verhoelst, Tijl; Granville, Jose; Keppens, Arno; Baray, Jean-Luc; Cortesi, Ugo; Degenstein, D. A.; Froidevaux, Lucien; Godin-Beekmann, Sophie;
2015-01-01
Most recent assessments of long-term changes in the vertical distribution of ozone (by e.g. WMO and SI2N) rely on data sets that integrate observations by multiple instruments. Several merged satellite ozone profile records have been developed over the past few years; each considers a particular set of instruments and adopts a particular merging strategy. Their intercomparison by Tummon et al. revealed that the current merging schemes are not sufficiently refined to correct for all major differences between the limb/occultation records. This shortcoming introduces uncertainties that need to be known to obtain a sound interpretation of the different satellite-based trend studies. In practice however, producing realistic uncertainty estimates is an intricate task which depends on a sufficiently detailed understanding of the characteristics of each contributing data record and on the subsequent interplay and propagation of these through the merging scheme. Our presentation discusses these challenges in the context of limb/occultation ozone profile records, but they are equally relevant for other instruments and atmospheric measurements. We start by showing how the NDACC and GAW-affiliated ground-based networks of ozonesonde and lidar instruments allowed us to characterize fourteen limb/occultation ozone profile records, together providing a global view over the last three decades. Our prime focus will be on techniques to estimate long-term drift since our results suggest this is the main driver of the major trend differences between the merged data sets. The single-instrument drift estimates are then used for a tentative estimate of the systematic uncertainty in the profile trends from merged data records. We conclude by reflecting on possible further steps needed to improve the merging algorithms and to obtain a better characterization of the uncertainties involved.
Dziadkowiec, Oliwier; Callahan, Tiffany; Ozkaynak, Mustafa; Reeder, Blaine; Welton, John
2016-01-01
Objectives: We examine the following: (1) the appropriateness of using a data quality (DQ) framework developed for relational databases as a data-cleaning tool for a data set extracted from two EPIC databases, and (2) the differences in statistical parameter estimates between a data set cleaned with the DQ framework and a data set not cleaned with the DQ framework. Background: The use of data contained within electronic health records (EHRs) has the potential to open doors for a new wave of innovative research. Without adequate preparation of such large data sets for analysis, the results might be erroneous, which might affect clinical decision-making or the results of Comparative Effectiveness Research studies. Methods: Two emergency department (ED) data sets extracted from EPIC databases (adult ED and children's ED) were used as examples for examining the five concepts of DQ based on a DQ assessment framework designed for EHR databases. The first data set contained 70,061 visits; the second data set contained 2,815,550 visits. SPSS Syntax examples as well as step-by-step instructions for applying the five key DQ concepts to these EHR database extracts are provided. Conclusions: SPSS Syntax to address each of the DQ concepts proposed by Kahn et al. (2012) was developed. The data set cleaned using Kahn's framework yielded more accurate results than the data set cleaned without this framework. Future plans involve creating functions in R language for cleaning data extracted from the EHR as well as an R package that combines DQ checks with missing data analysis functions. PMID:27429992
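The study's checks were written in SPSS Syntax, but the underlying DQ concepts (e.g., completeness and plausibility from Kahn et al.'s framework) translate directly to any language. A hedged Python sketch with invented field names and thresholds:

```python
# Illustrative data-quality checks in the spirit of Kahn et al.'s framework
# (completeness, plausibility). Field names and thresholds are invented, not
# taken from the EPIC extracts described in the study.

visits = [
    {"id": 1, "age": 34, "disposition": "discharged"},
    {"id": 2, "age": -5, "disposition": "admitted"},   # implausible age
    {"id": 3, "age": 67, "disposition": None},         # missing value
]

def completeness(rows, field):
    """Fraction of rows with a non-missing value for `field`."""
    return sum(r[field] is not None for r in rows) / len(rows)

def plausible(rows, field, lo, hi):
    """Rows whose value for `field` lies in the expected range [lo, hi]."""
    return [r for r in rows if r[field] is not None and lo <= r[field] <= hi]

print(completeness(visits, "disposition"))    # 2 of 3 rows are complete
print(len(plausible(visits, "age", 0, 120)))  # 2 rows pass the age check
```

Running such checks before analysis, and comparing parameter estimates before and after exclusion, mirrors the study's comparison of cleaned versus uncleaned data sets.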
Woodman, Jenny; Allister, Janice; Rafi, Imran; de Lusignan, Simon; Belsey, Jonathan; Petersen, Irene; Gilbert, Ruth
2012-01-01
Background Information is lacking on how concerns about child maltreatment are recorded in primary care records. Aim To determine how the recording of child maltreatment concerns can be improved. Design and setting Development of a quality improvement intervention involving: clinical audit, a descriptive survey, telephone interviews, a workshop, database analyses, and consensus development in UK general practice. Method Descriptive analyses and incidence estimates were carried out based on 11 study practices and 442 practices in The Health Improvement Network (THIN). Telephone interviews, a workshop, and a consensus development meeting were conducted with lead GPs from 11 study practices. Results The rate of children with at least one maltreatment-related code was 8.4/1000 child years (11 study practices, 2009–2010), and 8.0/1000 child years (THIN, 2009–2010). Of 25 patients with known maltreatment, six had no maltreatment-related codes recorded, but all had relevant free text, scanned documents, or codes. When stating their reasons for undercoding maltreatment concerns, GPs cited damage to the patient relationship, uncertainty about which codes to use, and having concerns about recording information on other family members in the child’s records. Consensus recommendations are to record the code ‘child is cause for concern’ as a red flag whenever maltreatment is considered, and to use a list of codes arranged around four clinical concepts, with an option for a templated short data entry form. Conclusion GPs under-record maltreatment-related concerns in children’s electronic medical records. As failure to use codes makes it impossible to search or audit these cases, an approach designed to be simple and feasible to implement in UK general practice was recommended. PMID:22781996
Automated Assessment of Child Vocalization Development Using LENA.
Richards, Jeffrey A; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance
2017-07-12
To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Assessment was based on full-day audio recordings collected in a child's unrestricted, natural language environment. AVA estimates were derived using automatic speech recognition modeling techniques to categorize and quantify the sounds in child vocalizations (e.g., protophones and phonemes). These were expressed as phone and biphone frequencies, reduced to principal components, and inputted to age-based multiple linear regression models to predict independently collected criterion-expressive language scores. From these models, we generated vocal development AVA estimates as age-standardized scores and development age estimates. AVA estimates demonstrated strong statistical reliability and validity when compared with standard criterion expressive language assessments. Automated analysis of child vocalizations extracted from full-day recordings in natural settings offers a novel and efficient means to assess children's expressive vocal development. More research remains to identify specific mechanisms of operation.
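One concrete step in the pipeline above is turning decoded child vocalizations into phone and biphone frequencies before dimensionality reduction. A minimal sketch of the biphone-counting idea (the phone symbols are invented, and this is not the LENA implementation):

```python
from collections import Counter

# Toy version of the biphone-frequency feature step: count adjacent phone
# pairs in an automatically decoded vocalization. Phone labels are made up.

def biphone_counts(phones):
    """Frequency of each ordered adjacent phone pair."""
    return Counter(zip(phones, phones[1:]))

decoded = ["b", "a", "b", "a", "m"]
print(biphone_counts(decoded))
```

In the AVA approach, vectors of such frequencies are reduced to principal components and fed to age-based regression models to predict expressive language scores.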
Abstracting ICU Nursing Care Quality Data From the Electronic Health Record.
Seaman, Jennifer B; Evans, Anna C; Sciulli, Andrea M; Barnato, Amber E; Sereika, Susan M; Happ, Mary Beth
2017-09-01
The electronic health record is a potentially rich source of data for clinical research in the intensive care unit setting. We describe the iterative, multi-step process used to develop and test a data abstraction tool, used for collection of nursing care quality indicators from the electronic health record, for a pragmatic trial. We computed Cohen's kappa coefficient (κ) to assess interrater agreement or reliability of data abstracted using preliminary and finalized tools. In assessing the reliability of study data (n = 1,440 cases) using the finalized tool, 108 randomly selected cases (10% of first half sample; 5% of last half sample) were independently abstracted by a second rater. We demonstrated mean κ values ranging from 0.61 to 0.99 for all indicators. Nursing care quality data can be accurately and reliably abstracted from the electronic health records of intensive care unit patients using a well-developed data collection tool and detailed training.
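The agreement statistic used above, Cohen's kappa, corrects observed agreement for the agreement two raters would reach by chance. A self-contained sketch with toy ratings (the data are invented, not the study's):

```python
# Sketch of Cohen's kappa for two raters abstracting a binary indicator
# from the same charts. Ratings below are toy data.

def cohens_kappa(r1, r2):
    """kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(r1)
    cats = set(r1) | set(r2)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

rater1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater2 = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(rater1, rater2), 2))  # 0.78
```

Raw percent agreement here is 90%, but kappa is lower (0.78) because some of that agreement is expected by chance; the study's κ range of 0.61 to 0.99 reflects the same chance-corrected scale.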
Setting the Record Straight. The Truth About Fad Diets.
ERIC Educational Resources Information Center
Wheat Foods Council, Parker, CO.
The Setting the Record Straight information packet presents facts to set the record straight about nutrition and debunk fad diets. The kit features materials designed to communicate the importance of balanced eating. Materials include: a time line of fad diets; four reproducible fad diet book review handouts that show the misleading claims rampant…
Programmable CGH on photochromic material using DMD
NASA Astrophysics Data System (ADS)
Alata, Romain; Pariani, Giorgio; Zamkotsian, Frederic; Lanzoni, Patrick; Bianco, Andrea; Bertarelli, Chiara
2016-07-01
Computer Generated Holograms (CGHs) are useful for wavefront shaping and complex optics testing, including aspherical and free-form optics. Today, CGHs are recorded directly with a laser or via intermediate masks, but these methods permit only binary CGHs; binary CGHs are efficient but can reconstruct only pixelated images. We propose to use a Digital Micro-mirror Device (DMD) for writing binary CGHs as well as grayscale CGHs, which can reconstruct continuous-tone images. The DMD is currently being studied at LAM for generating programmable slit masks in multi-object spectrographs. It is composed of 2048x1080 individually controllable micro-mirrors with a pitch of 13.68 μm: a real-time reconfigurable mask, well suited to recording CGHs. A first setup was developed for hologram recording, in which the DMD is illuminated with a collimated beam and exposes a photosensitive plate through an Offner relay with a magnification of 1:1. Our setup's resolution is 2-3 μm, so the CGH resolution equals the DMD micro-mirror size. In order to write and erase CGHs during a test procedure or on request, we use a photochromic plate called PUR-GD71-50-ST, developed at Politecnico di Milano. It is opaque at rest and becomes transparent when illuminated with visible light between 500 and 700 nm; it can then be erased by a UV flash. We chose to encode the CGHs in equally spaced levels, so-called stepped CGHs. We recorded CGHs of up to 1000x1000 pixels with a contrast greater than 50, knowing that the material can reach an ultimate contrast of 1000. A second bench was also developed, dedicated to reconstructing the recorded images with a 632.8 nm He-Ne laser beam. Very faithful reconstructions were obtained.
Thanks to our recording and reconstruction setups, we have been able to successfully record binary and stepped CGHs and reconstruct them with high fidelity, revealing the potential of this method for generating programmable/rewritable stepped CGHs on photochromic materials.
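The "stepped" encoding mentioned above means quantizing a continuous value into equally spaced discrete levels. As a hedged numeric illustration (level count and values are arbitrary, not the bench's parameters):

```python
# Sketch of equally spaced (stepped) quantization, as used to encode
# grayscale CGH levels. The number of levels and inputs are illustrative.

def quantize(value, levels):
    """Map a value in [0, 1] to the nearest of `levels` equally spaced steps."""
    step = round(value * (levels - 1))
    return step / (levels - 1)

print([quantize(v, 5) for v in (0.0, 0.1, 0.49, 0.8, 1.0)])
```

With 5 levels the allowed outputs are 0, 0.25, 0.5, 0.75, and 1; more levels trade write time and contrast budget for a smoother reconstructed image.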
Pion-Massicotte, Joëlle; Godbout, Roger; Savard, Pierre; Roy, Jean-François
2018-02-23
Portable polysomnography is often too complex and cumbersome for recording sleep at home. We recorded sleep using a biometric shirt (electrocardiogram sensors, respiratory inductance plethysmography bands, and an accelerometer) in 21 healthy young adults studied in a sleep laboratory for two consecutive nights, together with standard polysomnography. Polysomnographic recordings were scored using standard methods. An algorithm was developed to classify the biometric shirt recordings into rapid eye movement sleep, non-rapid eye movement sleep and wake. The algorithm was based on breathing rate and heart rate variability, body movement, and included a correction for sleep onset and offset. The overall mean percentage of agreement between the two sets of recordings was 77.4%; when non-rapid eye movement and rapid eye movement sleep epochs were grouped together, it increased to 90.8%. The overall kappa coefficient was 0.53. Five of the seven sleep variables were significantly correlated. The findings of this pilot study indicate that this simple portable system could be used to estimate the general sleep pattern of young healthy adults. © 2018 European Sleep Research Society.
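The classification logic described, wake/REM/NREM from movement, heart rate variability, and breathing, can be caricatured as a rule-based epoch classifier. The thresholds and feature scales below are invented placeholders, not the authors' algorithm:

```python
# Loose rule-based sketch of shirt-based sleep staging: thresholds and
# feature ranges are invented, not the published algorithm's values.

def classify_epoch(movement, hrv, breath_var):
    """Label a 30-s epoch as wake / REM / NREM from normalized features."""
    if movement > 0.5:                   # strong accelerometer activity
        return "wake"
    if hrv > 0.7 and breath_var > 0.6:   # variable HR and breathing
        return "REM"
    return "NREM"                        # quiet, regular physiology

epochs = [(0.8, 0.2, 0.3), (0.1, 0.9, 0.8), (0.05, 0.3, 0.2)]
print([classify_epoch(*e) for e in epochs])  # ['wake', 'REM', 'NREM']
```

Agreement with polysomnography is then measured epoch by epoch, which is how figures like the 77.4% agreement and κ = 0.53 above are obtained.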
The Cost of Doing Business: Cost Structure of Electronic Immunization Registries
Fontanesi, John M; Flesher, Don S; De Guire, Michelle; Lieberthal, Allan; Holcomb, Kathy
2002-01-01
Objective To predict the true cost of developing and maintaining an electronic immunization registry, and to set the framework for developing future cost-effectiveness and cost-benefit analyses. Data Sources/Study Setting Primary data collected at three immunization registries located in California, accounting for 90 percent of all immunization records in registries in the state during the study period. Study Design A parametric cost analysis compared registry development and maintenance expenditures to registry performance requirements. Data Collection/Extraction Methods Data were collected at each registry through interviews and reviews of expenditure records, technical accomplishments, development schedules, and immunization coverage rates. Principal Findings The cost of building immunization registries is predictable and independent of the hardware/software combination employed. The effort requires four man-years of technical effort, or approximately $250,000 in 1998 dollars. Costs for maintaining a registry were approximately $5,100 per end user per three-year period. Conclusions There is a predictable cost structure for both developing and maintaining immunization registries. The cost structure can be used as a framework for examining the cost-effectiveness and cost-benefits of registries. The greatest factor affecting improvement in coverage rates was ongoing, user-based administrative investment. PMID:12479497
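The reported cost structure lends itself to a back-of-envelope projection: a fixed development cost plus a per-user maintenance cost per three-year period. A small sketch using the figures quoted above (the function and its defaults are for illustration only):

```python
# Back-of-envelope use of the reported cost structure: development of about
# $250,000 (1998 dollars) plus maintenance of about $5,100 per end user per
# three-year period. The function itself is a hypothetical convenience.

def registry_cost(users, years, dev_cost=250_000, maint_per_user_3yr=5_100):
    """Projected total cost for `users` end users over `years`."""
    periods = years / 3
    return dev_cost + users * maint_per_user_3yr * periods

print(registry_cost(100, 3))  # 250,000 + 100 * 5,100 = 760,000.0
```

For example, a registry with 100 end users would cost roughly $760,000 over its first three years under this model, with maintenance dominating as the user base grows.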
Computer-Mediated Assessment of Intelligibility in Aphasia and Apraxia of Speech
Haley, Katarina L.; Roth, Heidi; Grindstaff, Enetta; Jacks, Adam
2011-01-01
Background Previous work indicates that single word intelligibility tests developed for dysarthria are sensitive to segmental production errors in aphasic individuals with and without apraxia of speech. However, potential listener learning effects and difficulties adapting elicitation procedures to coexisting language impairments limit their applicability to left hemisphere stroke survivors. Aims The main purpose of this study was to examine basic psychometric properties for a new monosyllabic intelligibility test developed for individuals with aphasia and/or AOS. A related purpose was to examine clinical feasibility and potential to standardize a computer-mediated administration approach. Methods & Procedures A 600-item monosyllabic single word intelligibility test was constructed by assembling sets of phonetically similar words. Custom software was used to select 50 target words from this test in a pseudo-random fashion and to elicit and record production of these words by 23 speakers with aphasia and 20 neurologically healthy participants. To evaluate test-retest reliability, two identical sets of 50-word lists were elicited by requesting repetition after a live speaker model. To examine the effect of a different word set and auditory model, an additional set of 50 different words was elicited with a pre-recorded model. The recorded words were presented to normal-hearing listeners for identification via orthographic and multiple-choice response formats. To examine construct validity, production accuracy for each speaker was estimated via phonetic transcription and rating of overall articulation. Outcomes & Results Recording and listening tasks were completed in less than six minutes for all speakers and listeners. Aphasic speakers were significantly less intelligible than neurologically healthy speakers and displayed a wide range of intelligibility scores. Test-retest and inter-listener reliability estimates were strong. 
No significant difference was found in scores based on recordings from a live model versus a pre-recorded model, but some individual speakers favored the live model. Intelligibility test scores correlated highly with segmental accuracy derived from broad phonetic transcription of the same speech sample and a motor speech evaluation. Scores correlated moderately with rated articulation difficulty. Conclusions We describe a computerized, single-word intelligibility test that yields clinically feasible, reliable, and valid measures of segmental speech production in adults with aphasia. This tool can be used in clinical research to facilitate appropriate participant selection and to establish matching across comparison groups. For a majority of speakers, elicitation procedures can be standardized by using a pre-recorded auditory model for repetition. This assessment tool has potential utility for both clinical assessment and outcomes research. PMID:22215933
NASA Astrophysics Data System (ADS)
Gordon, S.; Dattore, E.; Williams, S.
2014-12-01
Even when a data center makes its datasets accessible, they can still be hard to discover if the user is unaware of the laboratory or organization the data center supports. NCAR's Earth Observing Laboratory (EOL) is no exception. In response to this problem, and as an inquiry into the feasibility of inter-connecting all of NCAR's repositories at a discovery layer, ESRI's Geoportal was researched. It was determined that an implementation of Geoportal would be a good choice around which to build a proof-of-concept model of inter-repository discovery. This collaborative project between the University of Illinois and NCAR is coordinated through the Data Curation Education in Research Centers program, which is funded by the Institute of Museum and Library Services.
Geoportal is open source software. It serves as an aggregation point for metadata catalogs of earth science datasets, with a focus on geospatial information. EOL's metadata is in static THREDDS catalogs. Geoportal can only create records from a THREDDS Data Server. The first step was to make EOL metadata more accessible by utilizing the ISO 19115-2 standard. It was also decided to create DIF records so EOL datasets could be ingested in NASA's Global Change Master Directory (GCMD).
To offer records for harvest, it was decided to develop an OAI-PMH server. To make a compliant server, the OAI_DC standard was also implemented. A server was written in Perl to serve a set of static records. We created a sample set of records in ISO 19115-2, FGDC, DIF, and OAI_DC. We utilized GCMD shared vocabularies to enhance discoverability and precision. The proof of concept was tested and verified by having another NCAR laboratory's Geoportal harvest our sample set.
To prepare for production, templates for each standard were developed and mapped to the database. These templates will help the automated creation of records. Once the OAI-PMH server is re-written in a Grails framework a dynamic representation of EOL's metadata will be available for harvest.
EOL will need to develop an implementation of a Geoportal and point GCMD to the OAI-PMH server. We will also seek out partnerships with other earth science and related discipline repositories that can communicate by OAI-PMH or Geoportal so that the scientific community will benefit from more discoverable data.
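An OAI-PMH server of the kind described ultimately just emits namespaced XML envelopes (e.g., a `ListRecords` response wrapping `oai_dc` records). The prototype above was written in Perl; as a hedged Python sketch of the same output shape (identifiers and fields are placeholders, not real EOL records):

```python
import xml.etree.ElementTree as ET

# Rough sketch of serving one static record in an OAI-PMH ListRecords
# response, analogous to the Perl prototype described above. The identifier
# and title are invented placeholders.

OAI = "http://www.openarchives.org/OAI/2.0/"
DC = "http://purl.org/dc/elements/1.1/"

root = ET.Element(f"{{{OAI}}}OAI-PMH")
records = ET.SubElement(root, f"{{{OAI}}}ListRecords")
rec = ET.SubElement(records, f"{{{OAI}}}record")
header = ET.SubElement(rec, f"{{{OAI}}}header")
ET.SubElement(header, f"{{{OAI}}}identifier").text = "oai:eol.example:ds001"
meta = ET.SubElement(rec, f"{{{OAI}}}metadata")
ET.SubElement(meta, f"{{{DC}}}title").text = "Example EOL dataset"

xml = ET.tostring(root, encoding="unicode")
print("ListRecords" in xml and "ds001" in xml)  # True
```

A harvester (such as another laboratory's Geoportal, or GCMD) would request such responses over HTTP and ingest the contained metadata records.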
Age model for a continuous, ca 250-ka Quaternary lacustrine record from Bear Lake, Utah-Idaho
Colman, Steven M.; Kaufman, D.S.; Bright, Jordon; Heil, C.; King, J.W.; Dean, W.E.; Rosenbaum, J.G.; Forester, R.M.; Bischoff, J.L.; Perkins, Marie; McGeehin, J.P.
2006-01-01
The Quaternary sediments sampled by continuous 120-m-long drill cores from Bear Lake (Utah-Idaho) comprise one of the longest lacustrine sequences recovered from an extant lake. The cores serve as a good case study for the construction of an age model for sequences that extend beyond the range of radiocarbon dating. From a variety of potential age indicators, we selected a combination of radiocarbon ages, one magnetic excursion (correlated to a standard sequence), and a single Uranium-series age to develop an initial data set. The reliability of the excursion and U-series data requires consideration of their position with respect to sediments of inferred interglacial character, but not direct correlation with other paleoclimate records. Data omitted from the age model include amino acid age estimates, which have a large amount of scatter, and tephrochronology correlations, which have relatively large uncertainties. Because the initial data set was restricted to the upper half of the BL00-1 core, we inferred additional ages by direct correlation to the independently dated paleoclimate record from Devils Hole. We developed an age model for the entire core using statistical methods that consider both the uncertainties of the original data and that of the curve-fitting process, with a combination of our initial data set and the climate correlations as control points. This age model represents our best estimate of the chronology of deposition in Bear Lake. Because the age model contains assumptions about the correlation of Bear Lake to other climate records, the model cannot be used to address some paleoclimate questions, such as phase relationships with other areas.
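The core of such an age model is fitting a smooth curve through dated control points and interpolating ages for undated horizons. A toy illustration (the depths, ages, and quadratic form are invented, not the Bear Lake data or the authors' statistical method):

```python
import numpy as np

# Toy age-depth model: fit a smooth curve through dated control points and
# interpolate undated horizons. Depths and ages are invented, and a simple
# least-squares quadratic stands in for the study's statistical fit.

depth = np.array([0.0, 10.0, 30.0, 60.0, 100.0])  # metres below lake floor
age = np.array([0.0, 15.0, 60.0, 130.0, 240.0])   # ka, from control points

coeffs = np.polyfit(depth, age, deg=2)            # quadratic age-depth curve
model = np.poly1d(coeffs)

# Interpolated age for an undated horizon at 45 m depth
print(round(float(model(45.0)), 1))
```

A production age model would also propagate the dating uncertainties and the curve-fit uncertainty, as the abstract notes, rather than reporting a single interpolated value.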
Development of the TeamOBS-PPH - targeting clinical performance in postpartum hemorrhage.
Brogaard, Lise; Hvidman, Lone; Hinshaw, Kim; Kierkegaard, Ole; Manser, Tanja; Musaeus, Peter; Arafeh, Julie; Daniels, Kay I; Judy, Amy E; Uldbjerg, Niels
2018-06-01
This study aimed to develop a valid and reliable TeamOBS-PPH tool for assessing clinical performance in the management of postpartum hemorrhage (PPH). The tool was evaluated using video-recordings of teams managing PPH in both real-life and simulated settings. A Delphi panel consisting of 12 obstetricians from the UK, Norway, Sweden, Iceland, and Denmark achieved consensus on (i) the elements to include in the assessment tool, (ii) the weighting of each element, and (iii) the final tool. The validity and reliability were evaluated according to Cook and Beckman. (Level 1) Four raters scored four video-recordings of in situ simulations of PPH. (Level 2) Two raters scored 85 video-recordings of real-life teams managing patients with PPH ≥1000 mL in two Danish hospitals. (Level 3) Two raters scored 15 video-recordings of in situ simulations of PPH from a US hospital. The tool was designed with scores from 0 to 100. (Level 1) Teams of novices had a median score of 54 (95% CI 48-60), whereas experienced teams had a median score of 75 (95% CI 71-79; p < 0.001). (Level 2) The intra-rater [intra-class correlation (ICC) = 0.96] and inter-rater (ICC = 0.83) agreements for real-life PPH were strong. The tool was applicable in all cases: atony, retained placenta, and lacerations. (Level 3) The tool was easily adapted to in situ simulation settings in the USA (ICC = 0.86). The TeamOBS-PPH tool appears to be valid and reliable for assessing clinical performance in real-life and simulated settings. The tool will be shared as the free TeamOBS App. © 2018 Nordic Federation of Societies of Obstetrics and Gynecology.
ERIC Educational Resources Information Center
Hayes, John; Pulliam, Robert
A video performance monitoring system was developed by the URS/Matrix Company, under contract to the USAF Human Resources Laboratory and was evaluated experimentally in three technical training settings. Using input from 1 to 8 video cameras, the system provided a flexible combination of signal processing, direct monitor, recording and replay…
Mining temporal data sets: hypoplastic left heart syndrome case study
NASA Astrophysics Data System (ADS)
Kusiak, Andrew; Caldarone, Christopher A.; Kelleher, Michael D.; Lamb, Fred S.; Persoon, Thomas J.; Gan, Yuan; Burns, Alex
2003-03-01
Hypoplastic left heart syndrome (HLHS) affects infants and is uniformly fatal without surgery. Post-surgery mortality rates are highly variable and dependent on postoperative management. The high mortality after the first-stage surgery usually occurs within the first few days after the procedure. Typically, the deaths are attributed to the unstable balance between the pulmonary and systemic circulations. An experienced team of physicians, nurses, and therapists is required to successfully manage the infant. However, even the most experienced teams report significant mortality due to the extremely complex relationships among physiologic parameters in a given patient. A data acquisition system was developed for the simultaneous collection of 73 physiologic, laboratory, and nurse-assessed variables. Data records were created at intervals of 30 seconds. An expert-validated wellness score was computed for each data record. A training data set consisting of over 5000 data records from multiple patients was collected. Preliminary results demonstrated that the knowledge discovery approach was over 94.57% accurate in predicting the "wellness score" of an infant. The discovered knowledge can improve the care of complex patients through the development of an intelligent simulator to support decisions.
Two-dimensional seismic velocity models of southern Taiwan from TAIGER transects
NASA Astrophysics Data System (ADS)
McIntosh, K. D.; Kuochen, H.; Van Avendonk, H. J.; Lavier, L. L.; Wu, F. T.; Okaya, D. A.
2013-12-01
We use a broad combination of wide-angle seismic data sets to develop high-resolution crustal-scale, two-dimensional, velocity models across southern Taiwan and the adjacent Huatung Basin. The data were recorded primarily during the TAIGER project and include records of thousands of marine airgun shots, several land explosive sources, and ~90 earthquakes. Both airgun sources and earthquake data were recorded by dense land arrays, and ocean bottom seismographs (OBS) recorded airgun sources east of Taiwan. This combination of data sets enables us to develop a high-resolution upper- to mid-crustal model defined by marine and explosive sources, while also constraining the full crustal structure - with depths approaching 50 km - by using the earthquake and explosive sources. These data and the resulting models are particularly important for understanding the development of arc-continent collision in Taiwan. McIntosh et al. (2013) have shown that highly extended continental crust of the northeastern South China Sea rifted margin is underthrust at the Manila trench southwest of Taiwan but then is structurally underplated to the accretionary prism. This process of basement accretion is confirmed in the southern Central Range of Taiwan, where basement outcrops can be directly linked to high seismic velocities measured in the accretionary prism well south of the continental shelf, even south of Taiwan. These observations indicate that the southern Central Range begins to grow well before there is any direct interaction between the North Luzon arc and the Eurasian continent. Our transects provide information on how the accreted mass behaves as it approaches the continental shelf and on deformation of the arc and forearc as this occurs. We suggest that arc-continent collision in Taiwan actually develops as arc-prism-continent collision.
Detection of Erroneous Payments Utilizing Supervised And Unsupervised Data Mining Techniques
2004-09-01
will look at which statistical analysis technique will work best in developing and enhancing existing erroneous payment models. Chapter I and II... payment models that are used for selection of records to be audited. The models are set up such that if two or more records have the same payment... Identification Number, Invoice Number and Delivery Order Number are not compared. The DM0102 Duplicate Payment Model will be analyzed in this thesis
Evaluation and implementation of chemotherapy regimen validation in an electronic health record.
Diaz, Amber H; Bubalo, Joseph S
2014-12-01
Computerized provider order entry of chemotherapy regimens is quickly becoming the standard for prescribing chemotherapy in both inpatient and ambulatory settings. One of the difficulties with implementation of chemotherapy regimen computerized provider order entry lies in verifying the accuracy and completeness of all regimens built in the system library. Our goal was to develop, implement, and evaluate a process for validating chemotherapy regimens in an electronic health record. We describe our experience developing and implementing a process for validating chemotherapy regimens in the setting of a standard, commercially available computerized provider order entry system. The pilot project focused on validating chemotherapy regimens in the adult inpatient oncology setting and adult ambulatory hematologic malignancy setting. A chemotherapy regimen validation process was defined as a result of the pilot project. Over a 27-week pilot period, 32 chemotherapy regimens were validated using the process we developed. Results of the study suggest that by validating chemotherapy regimens, the amount of time spent by pharmacists in daily chemotherapy review was decreased. In addition, the number of pharmacist modifications required to make regimens complete and accurate was decreased. Both physician and pharmacy disciplines showed improved satisfaction and confidence levels with chemotherapy regimens after implementation of the validation system. Chemotherapy regimen validation required a considerable amount of planning and time but resulted in increased pharmacist efficiency and improved provider confidence and satisfaction. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Academic Research Record-Keeping: Best Practices for Individuals, Group Leaders, and Institutions
Schreier, Alan A.; Wilson, Kenneth; Resnik, David
2014-01-01
During the last half of the 20th century, social and technological changes in academic research groups have challenged traditional research record-keeping practices, making them either insufficient or obsolete. New practices have developed but standards (best practices) are still evolving. Based on the authors’ review and analysis of a number of sources, they present a set of systematically compiled best practices for research record-keeping for academic research groups. These best practices were developed as an adjunct to a research project on research ethics aimed at examining the actual research record-keeping practices of active academic scientists and their impact on research misconduct inquiries. The best practices differentiate and provide separate standards for three different levels within the university: the individual researcher, the research group leader, and the department/institution. They were developed using a combination of literature reviews, surveys of university integrity officials, focus groups of active researchers, and inspection of university policies on research record-keeping. The authors believe these best practices constitute a “snapshot” of the current normative standards for research records within the academic research community. They are offered as ethical and practical guidelines subject to continuing evolution and not as absolute rules. They may be especially useful in training the next generation of researchers. PMID:16377817
Ionospheric Signatures in Radio Occultation Data
NASA Technical Reports Server (NTRS)
Mannucci, Anthony J.; Ao, Chi; Iijima, Byron A.; Kursinkski, E. Robert
2012-01-01
We can robustly extend the radio occultation data record by 6 years (+60%) by developing a single-frequency processing method for GPS/MET data. We will produce a calibrated data set with profile-by-profile data characterization to determine robust upper bounds on ionospheric bias. This is part of an effort to produce a calibrated RO data set addressing other key error sources, such as upper boundary initialization. Planned: AIRS-GPS water vapor cross validation (water vapor climatology and trends).
Developing a data dictionary for the irish nursing minimum dataset.
Henry, Pamela; Mac Neela, Pádraig; Clinton, Gerard; Scott, Anne; Treacy, Pearl; Butler, Michelle; Hyde, Abbey; Morris, Roisin; Irving, Kate; Byrne, Anne
2006-01-01
One of the challenges in health care in Ireland is the relatively slow acceptance of standardised clinical information systems. Yet the national Irish health reform programme indicates that an Electronic Health Care Record (EHCR) will be implemented on a phased basis [3-5]. While nursing has a key role in ensuring the quality and comparability of health information, the so-called 'invisibility' of some nursing activities makes this a challenging aim to achieve [3-5]. Any integrated health care system requires the adoption of uniform standards for electronic data exchange [1-2]. One of the pre-requisites for uniform standards is the composition of a data dictionary. Inadequate definition of data elements in a particular dataset hinders the development of an integrated data depository or electronic health care record (EHCR). This paper outlines how work on the data dictionary for the Irish Nursing Minimum Dataset (INMDS) has addressed this issue. Data set elements were devised on the basis of a large-scale empirical research programme. ISO 18104, the reference terminology for nursing [6], was used to cross-map the data set elements with semantic domains, categories, and links, and data set items were dissected.
The current state of cancer family history collection tools in primary care: a systematic review.
Qureshi, Nadeem; Carroll, June C; Wilson, Brenda; Santaguida, Pasqualina; Allanson, Judith; Brouwers, Melissa; Raina, Parminder
2009-07-01
Systematic collection of family history is a prerequisite for identifying genetic risk. This study reviewed tools applicable to the primary care assessment of family history of breast, colorectal, ovarian, and prostate cancer. MEDLINE, EMBASE, CINAHL, and Cochrane Central were searched for publications. All primary study designs were included. Characteristics of the studies, the family history collection tools, and the setting were evaluated. Of 40 eligible studies, 18 relevant family history tools were identified, with 11 developed for use in primary care. Most collected information on more than one cancer and on affected relatives, used self-administered questionnaires, and employed paper-based formats. Eleven tools had been evaluated relative to current practice, demonstrating 46-78% improvement in data recording over family history recording in patient charts and 75-100% agreement with structured genetic interviews. Few tools have been developed specifically for primary care settings. The few that have been evaluated performed well. The very limited evidence, which depends in part on extrapolation from studies in settings other than primary care, suggests that systematic tools may add significant family health information compared with current primary care practice. The effect of their use on health outcomes has not been evaluated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilchrist, Kristin H., E-mail: kgilchrist@rti.org; Lewis, Gregory F.; Gay, Elaine A.
Microelectrode arrays (MEAs) recording extracellular field potentials of human-induced pluripotent stem cell-derived cardiomyocytes (hiPS-CM) provide a rich data set for functional assessment of drug response. The aim of this work is the development of a method for a systematic analysis of arrhythmia using MEAs, with emphasis on the development of six parameters accounting for different types of cardiomyocyte signal irregularities. We describe a software approach to carry out such analysis automatically, including generation of a heat map that enables quick visualization of the arrhythmic liability of compounds. We also implemented signal processing techniques for reliable extraction of the repolarization peak for field potential duration (FPD) measurement, even from recordings with low signal-to-noise ratios. We measured hiPS-CMs on a 48-well MEA system with 5-minute recordings at multiple time points (0.5, 1, 2 and 4 h) after drug exposure. We evaluated concentration responses for seven compounds with a combination of hERG, QT and clinical proarrhythmia properties: Verapamil, Ranolazine, Flecainide, Amiodarone, Ouabain, Cisapride, and Terfenadine. The predictive utility of MEA parameters as surrogates of these clinical effects was examined. The beat rate and FPD results exhibited good correlations with previous MEA studies in stem cell-derived cardiomyocytes and with clinical data. The six-parameter arrhythmia assessment exhibited excellent predictive agreement with the known arrhythmogenic potential of the tested compounds, and holds promise as a new method to predict arrhythmic liability. - Highlights: • Six parameters describing arrhythmia were defined and measured for known compounds. • Software for efficient parameter extraction from large MEA data sets was developed. • The proposed cellular parameter set is predictive of clinical drug proarrhythmia.
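The paper's six irregularity parameters are not spelled out in the abstract, but one parameter of this general kind can be sketched: the coefficient of variation of inter-beat intervals extracted from detected beat times, with higher values flagging irregular beating. This is an illustrative stand-in, not the authors' definition:

```python
# Hedged sketch of one beat-irregularity parameter: the coefficient of
# variation (CV) of inter-beat intervals computed from beat timestamps.
# Beat times below are invented for illustration.
from statistics import mean, pstdev

def beat_interval_cv(beat_times_s):
    """CV of inter-beat intervals; higher values indicate irregular beating."""
    ibis = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    return pstdev(ibis) / mean(ibis)

regular   = [0.0, 1.0, 2.0, 3.0, 4.0]  # steady 1 Hz beating
irregular = [0.0, 0.6, 2.0, 2.5, 4.0]  # variable intervals

print(beat_interval_cv(regular))    # 0.0 for perfectly steady beating
print(beat_interval_cv(irregular))  # well above zero
```

A per-well heat map like the one described could then be built by evaluating such parameters across wells and concentrations.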
[Essential data set's archetypes for nursing care of endometriosis patients].
Spigolon, Dandara Novakowski; Moro, Claudia Maria Cabral
2012-12-01
This study aimed to develop an Essential Data Set for Nursing Care of Patients with Endometriosis (CDEEPE), represented by archetypes. An exploratory applied research study with specialists' participation was carried out at the Health Informatics Laboratory of PUCPR between February and November of 2010. It was divided into two stages: CDEEPE construction and evaluation, including Nursing Process phases and Basic Human Needs, and archetype development based on this data set. CDEEPE was evaluated by doctors and nurses with 95.9% consensus and contains 51 data items. The archetype "Perception of Organs and Senses" was created to represent this data set. This study identified important information for nursing practice, contributing to the computerization and application of the nursing process during care. The CDEEPE was the basis for archetype creation, which will make structured, organized, efficient, interoperable, and semantic records possible.
Gold, Rachel; Cottrell, Erika; Bunce, Arwen; Middendorf, Mary; Hollombe, Celine; Cowburn, Stuart; Mahr, Peter; Melgar, Gerardo
2017-01-01
"Social determinants of health" (SDHs) are nonclinical factors that profoundly affect health. Helping community health centers (CHCs) document patients' SDH data in electronic health records (EHRs) could yield substantial health benefits, but little has been reported about CHCs' development of EHR-based tools for SDH data collection and presentation. We worked with 27 diverse CHC stakeholders to develop strategies for optimizing SDH data collection and presentation in their EHR, and approaches for integrating SDH data collection and the use of those data (eg, through referrals to community resources) into CHC workflows. We iteratively developed a set of EHR-based SDH data collection, summary, and referral tools for CHCs. We describe considerations that arose while developing the tools and present some preliminary lessons learned. Standardizing SDH data collection and presentation in EHRs could lead to improved patient and population health outcomes in CHCs and other care settings. We know of no previous reports of processes used to develop similar tools. This article provides an example of 1 such process. Lessons from our process may be useful to health care organizations interested in using EHRs to collect and act on SDH data. Research is needed to empirically test the generalizability of these lessons. © Copyright 2017 by the American Board of Family Medicine.
Ferrie, Gina M; Sky, Christy; Schutz, Paul J; Quinones, Glorieli; Breeding, Shawnlei; Plasse, Chelle; Leighty, Katherine A; Bettinger, Tammie L
2016-01-01
Incorporating technology with research is becoming increasingly important to enhance animal welfare in zoological settings. Video technology is used in the management of avian populations to facilitate efficient information collection on aspects of avian reproduction that are impractical or impossible to obtain through direct observation. Disney's Animal Kingdom(®) maintains a successful breeding colony of Northern carmine bee-eaters. This African species is a cavity nester, making their nesting behavior difficult to study and manage in an ex situ setting. After initial research focused on developing a suitable nesting environment, our goal was to continue developing methods to improve reproductive success and increase likelihood of chicks fledging. We installed infrared bullet cameras in five nest boxes and connected them to a digital video recording system, with data recorded continuously through the breeding season. We then scored and summarized nesting behaviors. Using remote video methods of observation provided much insight into the behavior of the birds in the colony's nest boxes. We observed aggression between birds during the egg-laying period, and therefore immediately removed all of the eggs for artificial incubation which completely eliminated egg breakage. We also used observations of adult feeding behavior to refine chick hand-rearing diet and practices. Although many video recording configurations have been summarized and evaluated in various reviews, we found success with the digital video recorder and infrared cameras described here. Applying emerging technologies to cavity nesting avian species is a necessary addition to improving management in and sustainability of zoo avian populations. © 2015 Wiley Periodicals, Inc.
Analysis of male volleyball players' motor activities during a top level match.
Mroczek, Dariusz; Januszkiewicz, Aleksander; Kawczyński, Adam S; Borysiuk, Zbigniew; Chmura, Jan
2014-08-01
The present study aims to assess motor activity of volleyball players using an original video recording method developed by the authors. Twenty-eight volleyball players taking part in 4 matches of the Polish Volleyball League were examined. The recorded data were analyzed in view of the mean total distance covered by volleyball players on different court positions during a match, set, and rally. The results showed that volleyball players cover the mean total distance of 1221 ± 327 m (mean ± SD) in a 3-set match, and 1757 ± 462 m in a 4-set match. A statistically significant difference (p ≤ 0.005) was found between the distance covered by the middle blockers and setters, defenders, spikers, and libero players in a match and in a set. The study revealed a tendency to lengthen the distance by the players in the final sets, which is indicative of the extended time of individual rallies. The mean distance covered in a single rally amounted to 10.92 ± 0.9 m in 4 matches (between 9.12 and 12.56 m). Considering the limited size of the field of play, volleyball players cover relatively long distances during a match and individual sets, with the shortest distance covered by middle blockers, and the longest by setters. From a practical application point of view, detailed topographic analysis of a player's movements on the court as well as precise data on the time of activity and rest breaks provide the coach with valuable information on the ways of development of arrhythmic, changing and dynamic training loads.
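The per-rally and per-match distances reported above come from digitised player positions. A minimal sketch of the underlying computation sums Euclidean steps between successive court coordinates; the track below is invented, and the authors' exact video tracking method is not specified in the abstract:

```python
# Hedged sketch: total distance covered by a player, summed from successive
# (x, y) court positions digitised from video. Coordinates are illustrative.
from math import hypot

def distance_covered(positions_m):
    """Sum of straight-line distances between consecutive position samples."""
    return sum(hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(positions_m, positions_m[1:]))

# A player moving along and across the court during one rally (metres).
track = [(0.0, 0.0), (3.0, 4.0), (3.0, 1.0), (0.0, 1.0)]
print(distance_covered(track))  # 5.0 + 3.0 + 3.0 = 11.0 m
```

Sampled at a sufficient frame rate, this per-rally sum is what accumulates into the roughly 1200-1800 m per-match totals reported.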
Clinical Nursing Records Study
1991-08-01
In-depth assessment of current AMEDD nursing documentation system used in fixed facilities; 2 - 4) development, implementation and assessment of...used in fixed facilities to: a) identify system problems; b) identify potential solutions to problems; c) set priorities for problem resolution; d...enhance compatibility between any "hard copy" forms the group might develop and automation requirements. Discussions were also held with personnel from
USDA-ARS?s Scientific Manuscript database
The parasympathetic nervous system (PS) influences are critical in the autonomic control of the heart. To examine how early postnatal diet affects PS development, we used a measure of tonic PS control of cardiac activity, vagal tone, derived from resting heart rate recordings in 158 breastfed (BF), ...
Target Detection Routine (TADER). User’s Guide.
1987-09-01
o System range capability subset (one record - omitted for standoff SLAR and penetrating system)
o System inherent detection probability subset (IELT records, i.e., one per element type)
o System capability modifier subset/A=1, E=1 (IELT records)
o System capability modifier subset/A=1, E=2 (IELT records)
o System capability modifier subset/A=2, E=1 (IELT records)
o System capability modifier subset/A=2, E=2 (IELT records)
Unit Data Set (one set
Developing Multi-Voice Speech Recognition Confidence Measures and Applying Them to AHLTA-Mobile
2011-05-01
target application, then only the phoneme models used in that application’s command set need be adapted. For the purpose of the recorder app, I opted...and solve it. We also plan on creating a simplified civilian version of the recorder for iPhone and Android. Conclusion: First, speaker search...pushed forward to the field hospital before the injured soldier arrives. It is not onerous to play all of them. Trouble Shooting: You say “Blood
Accident investigation: Analysis of aircraft motions from ATC radar recordings
NASA Technical Reports Server (NTRS)
Wingrove, R. C.
1976-01-01
A technique was developed for deriving time histories of an aircraft's motion from air traffic control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data (from an onboard Mode-C transponder), to derive an expanded set of data which includes airspeed, lift, thrust-drag, attitude angles (pitch, roll, and heading), etc. This method of analyzing aircraft motions was evaluated through flight experiments which used the CV-990 research aircraft and recordings from both the enroute and terminal ATC radar systems. The results indicate that the values derived from the ATC radar records are for the most part in good agreement with the corresponding values obtained from airborne measurements. In an actual accident, this analysis of ATC radar records can complement the flight-data recorders, now onboard airliners, and provide a source of recorded information for other types of aircraft that are equipped with Mode-C transponders but not with onboard recorders.
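The first step of such a reconstruction can be sketched: converting each radar sweep's range and azimuth into ground-plane coordinates, then differencing successive positions to estimate groundspeed. The values are illustrative, and the NASA technique additionally smooths the data and derives lift, thrust-drag, and attitude angles, which are omitted here:

```python
# Hedged sketch: radar range/azimuth returns converted to Cartesian track
# points, then differenced for a groundspeed estimate. Slant-range and
# altitude corrections used in the full method are omitted.
from math import sin, cos, radians, hypot

def to_xy(range_m, azimuth_deg):
    """Ground-plane position relative to the radar (azimuth from north)."""
    a = radians(azimuth_deg)
    return range_m * sin(a), range_m * cos(a)

def groundspeed_mps(p1, p2, dt_s):
    """Average speed between two track points dt_s seconds apart."""
    return hypot(p2[0] - p1[0], p2[1] - p1[1]) / dt_s

# Two sweeps 4 s apart for an aircraft flying due east of the radar.
p1 = to_xy(20000.0, 90.0)  # 20 km due east
p2 = to_xy(20800.0, 90.0)  # 800 m farther east
print(groundspeed_mps(p1, p2, 4.0))  # 200.0 m/s
```

Chaining such estimates over the whole recording, with the Mode-C altitude channel supplying the vertical profile, yields the time histories from which the aerodynamic quantities are then derived.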
Tetherless ergonomics workstation to assess nurses' physical workload in a clinical setting.
Smith, Warren D; Nave, Michael E; Hreljac, Alan P
2011-01-01
Nurses are at risk of physical injury when moving immobile patients. This paper describes the development and testing of a tetherless ergonomics workstation that is suitable for studying nurses' physical workload in a clinical setting. The workstation uses wearable sensors to record multiple channels of body orientation and muscle activity and wirelessly transmits them to a base station laptop computer for display, storage, and analysis. In preparation for use in a clinical setting, the workstation was tested in a laboratory equipped for multi-camera video motion analysis. The testing included a pilot study of the effect of bed height on student nurses' physical workload while they repositioned a volunteer posing as a bedridden patient toward the head of the bed. Each nurse subject chose a preferred bed height, and data were recorded, in randomized order, with the bed at this height, at 0.1 m below this height, and at 0.1 m above this height. The testing showed that the body orientation recordings made by the wearable sensors agreed closely with those obtained from the video motion analysis system. The pilot study showed the following trends: As the bed height was raised, the nurses' trunk flexion at both thoracic and lumbar sites and lumbar muscle effort decreased, whereas trapezius and deltoid muscle effort increased. These trends will be evaluated by further studies of practicing nurses in the clinical setting.
Electronic Health Record Design and Implementation for Pharmacogenomics: a Local Perspective
Peterson, Josh F.; Bowton, Erica; Field, Julie R.; Beller, Marc; Mitchell, Jennifer; Schildcrout, Jonathan; Gregg, William; Johnson, Kevin; Jirjis, Jim N; Roden, Dan M.; Pulley, Jill M.; Denny, Josh C.
2014-01-01
Purpose The design of electronic health records (EHR) to translate genomic medicine into clinical care is crucial to the successful introduction of new genomic services, yet there are few published guides to implementation. Methods The design, implemented features, and evolution of a locally developed EHR that supports a large pharmacogenomics program at a tertiary care academic medical center were tracked over a 4-year development period. Results Developers and program staff created EHR mechanisms for ordering a pharmacogenomics panel in advance of clinical need (preemptive genotyping) and in response to a specific drug indication. Genetic data from panel-based genotyping were sequestered from the EHR until drug-gene interactions (DGIs) met evidentiary standards and were deemed clinically actionable. A service to translate genotype to predicted drug response phenotype populated a summary of DGIs, triggered inpatient and outpatient clinical decision support, updated laboratory records, and created gene results within online personal health records. Conclusion The design of a locally developed EHR supporting pharmacogenomics has generalizable utility. The challenge of representing genomic data in a comprehensible and clinically actionable format is discussed, along with reflection on the scalability of the model to larger sets of genomic data. PMID:24009000
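The genotype-to-phenotype translation service described above can be sketched as a lookup table. The simplified CYP2C19 mapping below follows widely published CPIC conventions (and ignores increased-function alleles such as *17); the Vanderbilt system's actual tables and trigger logic are not given in the abstract:

```python
# Hedged sketch of a genotype-to-predicted-phenotype translation of the kind
# the abstract describes, simplified to loss-of-function allele counting.
NO_FUNCTION = {"*2", "*3"}  # common CYP2C19 loss-of-function alleles

def cyp2c19_phenotype(allele1, allele2):
    """Map a CYP2C19 diplotype to a predicted metabolizer phenotype."""
    lost = sum(a in NO_FUNCTION for a in (allele1, allele2))
    return {0: "normal metabolizer",
            1: "intermediate metabolizer",
            2: "poor metabolizer"}[lost]

print(cyp2c19_phenotype("*1", "*1"))  # normal metabolizer
print(cyp2c19_phenotype("*1", "*2"))  # intermediate metabolizer
print(cyp2c19_phenotype("*2", "*2"))  # poor metabolizer
```

In an EHR, the predicted phenotype string, rather than the raw genotype, is what typically drives clinical decision support rules.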
Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee
2016-07-01
In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.
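Monitoring coder drift and inter-coder reliability, as described above, usually comes down to a chance-corrected agreement statistic. A minimal sketch is Cohen's kappa between two coders' labels on the same recordings; the labels below are invented, and the study's abstract does not specify which statistic it used:

```python
# Hedged sketch: Cohen's kappa for two coders labeling the same utterances.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two label sequences."""
    n = len(labels_a)
    po = sum(a == b for a, b in zip(labels_a, labels_b)) / n  # observed
    ca, cb = Counter(labels_a), Counter(labels_b)
    pe = sum(ca[k] * cb[k] for k in ca) / n**2                # expected
    return (po - pe) / (1 - pe)

a = ["empathy", "task", "task", "empathy", "task", "task"]
b = ["empathy", "task", "task", "task",    "task", "task"]
print(round(cohens_kappa(a, b), 2))  # 0.57
```

Recomputing kappa on periodically re-coded recordings is one standard way to catch drift before it contaminates a large data set.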
An ontology-based method for secondary use of electronic dental record data
Schleyer, Titus KL; Ruttenberg, Alan; Duncan, William; Haendel, Melissa; Torniai, Carlo; Acharya, Amit; Song, Mei; Thyvalikakath, Thankam P.; Liu, Kaihong; Hernandez, Pedro
A key question for healthcare is how to operationalize the vision of the Learning Healthcare System, in which electronic health record data become a continuous information source for quality assurance and research. This project presents an initial, ontology-based, method for secondary use of electronic dental record (EDR) data. We defined a set of dental clinical research questions; constructed the Oral Health and Disease Ontology (OHD); analyzed data from a commercial EDR database; and created a knowledge base, with the OHD used to represent clinical data about 4,500 patients from a single dental practice. Currently, the OHD includes 213 classes and reuses 1,658 classes from other ontologies. We have developed an initial set of SPARQL queries to allow extraction of data about patients, teeth, surfaces, restorations and findings. Further work will establish a complete, open and reproducible workflow for extracting and aggregating data from a variety of EDRs for research and quality assurance. PMID:24303273
Activity Catalog Tool (ACT) user manual, version 2.0
NASA Technical Reports Server (NTRS)
Segal, Leon D.; Andre, Anthony D.
1994-01-01
This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.
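The core of the summary ACT produces, frequency and duration per event type, can be sketched from a timestamped activity log. The log entries below are invented for illustration:

```python
# Hedged sketch of an ACT-style summary: tally each observed event's
# frequency and total duration from (event, start_s, end_s) records.
from collections import defaultdict

def summarize(log):
    """Return per-event counts and total durations in seconds."""
    freq = defaultdict(int)
    dur = defaultdict(float)
    for event, start, end in log:
        freq[event] += 1
        dur[event] += end - start
    return dict(freq), dict(dur)

log = [("looks_at_display", 0.0, 2.5),
       ("adjusts_control", 2.5, 4.0),
       ("looks_at_display", 4.0, 9.0)]

freq, dur = summarize(log)
print(freq["looks_at_display"], dur["looks_at_display"])  # 2 7.5
```

Merging several such logs, as ACT supports, amounts to concatenating the record lists before summarizing.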
MedEx: a medication information extraction system for clinical narratives
Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C
2010-01-01
Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
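The extraction task itself can be illustrated with a single regular expression over one common prescription pattern. MedEx's actual pipeline (lexicon-based semantic tagging plus parsing) is far more robust than this; the sketch only shows the target output fields (drug, strength, route, frequency):

```python
# Hedged sketch of medication-field extraction from free text. This is an
# illustrative toy pattern, not MedEx's method; real sigs vary enormously.
import re

SIG = re.compile(r"(?P<drug>[A-Za-z]+)\s+(?P<strength>\d+\s?mg)\s+"
                 r"(?P<route>po|iv|im)\s+(?P<freq>daily|bid|tid|qid)",
                 re.IGNORECASE)

def extract(text):
    """Return a dict of medication fields, or None if no match."""
    m = SIG.search(text)
    return m.groupdict() if m else None

fields = extract("Continue lisinopril 10 mg po daily for hypertension.")
print(fields["drug"], fields["strength"], fields["route"], fields["freq"])
```

Turning such extracted fields into coded data is what makes free-text notes usable by downstream applications, which is the gap the abstract describes.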
13 CFR 314.2 - Federal Interest.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 314.2 Business Credit and Assistance ECONOMIC DEVELOPMENT ADMINISTRATION, DEPARTMENT OF COMMERCE..., with Investment Assistance shall be held in trust by the Recipient for the benefit of the Project for..., statement or other recordable instrument setting forth EDA's Property interest in a Project (e.g., a...
Slack, J.R.; Landwehr, Jurate Maciunas
1992-01-01
Records of streamflow can provide an account of climatic variation over a hydrologic basin. The ability to do so is conditioned on the absence of confounding factors that diminish the climate signal. A national data set of streamflow records that are relatively free of confounding anthropogenic influences has been developed for the purpose of studying the variation in surface-water conditions throughout the United States. Records in the U.S. Geological Survey (USGS) National Water Storage and Retrieval System (WATSTORE) data base for active and discontinued streamflow gaging stations through water year 1988 (that is, through September 30, 1988) were reviewed jointly with data specialists in each USGS District office. The resulting collection of stations, each with its respective period of record satisfying the qualifying criteria, is called the Hydro-Climatic Data Network, or HCDN. The HCDN consists of 1,659 sites throughout the United States and its territories, totaling 73,231 water years of daily mean discharge values. For each station in the HCDN, information necessary for its identification, along with any qualifying comments about the available record and a set of descriptive watershed characteristics are provided in tabular format in this report, both on paper and on computer disk (enclosed). For each station in the HCDN, the appropriate daily mean discharge values were compiled, and statistical characteristics, including monthly mean discharges and annual mean, minimum and maximum discharges, were derived. The discharge data values are provided in a companion report.
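The station statistics described above (monthly mean discharges; annual mean, minimum, and maximum) are straightforward aggregations over daily mean discharge values. A minimal sketch with synthetic numbers, assuming records keyed by (year, month, day):

```python
from statistics import mean

# Synthetic daily mean discharge values keyed by (year, month, day);
# real HCDN records span full water years per station.
daily = {(1988, 9, d): q for d, q in enumerate([12.0, 15.0, 9.0], start=1)}

def monthly_mean(records, year, month):
    """Mean of all daily discharges falling in the given month."""
    vals = [q for (y, m, _), q in records.items() if (y, m) == (year, month)]
    return mean(vals)

def annual_stats(records, year):
    """Annual mean, minimum, and maximum daily discharge."""
    vals = [q for (y, _, _), q in records.items() if y == year]
    return mean(vals), min(vals), max(vals)

print(monthly_mean(daily, 1988, 9))  # 12.0
print(annual_stats(daily, 1988))     # (12.0, 9.0, 15.0)
```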
Bouadjenek, Mohamed Reda; Verspoor, Karin; Zobel, Justin
2017-07-01
We investigate and analyse the data quality of nucleotide sequence databases with the objective of automatic detection of data anomalies and suspicious records. Specifically, we demonstrate that the published literature associated with each data record can be used to automatically evaluate its quality, by cross-checking the consistency of the key content of the database record with the referenced publications. Focusing on GenBank, we describe a set of quality indicators based on the relevance paradigm of information retrieval (IR). Then, we use these quality indicators to train an anomaly detection algorithm to classify records as "confident" or "suspicious". Our experiments on the PubMed Central collection show that assessing the coherence between the literature and database records, through our algorithms, is an effective mechanism for assisting curators to perform data cleansing. Although fewer than 0.25% of the records in our data set are known to be faulty, we would expect that there are many more in GenBank that have not yet been identified. By automated comparison with the literature, they can be identified with a precision of up to 10% and a recall of up to 30%, while strongly outperforming several baselines. While these results leave substantial room for improvement, they reflect both the very imbalanced nature of the data and the limited explicitly labelled data that is available. Overall, the obtained results show promise for the development of a new kind of approach to detecting low-quality and suspicious sequence records based on literature analysis and consistency. From a practical point of view, this will greatly help curators identify inconsistent records in large-scale sequence databases by highlighting records that are likely to be inconsistent with the literature. Copyright © 2017 Elsevier Inc. All rights reserved.
Lee, Jisan; Kim, James G Boram; Jin, Meiling; Ahn, Kiwhan; Kim, Byungjun; Kim, Sukwha; Kim, Jeongeun
2017-11-01
Healthcare consumers must be able to make decisions based on accurate health information. To assist with this, we designed and developed an integrated system connected with electronic medical records in hospitals to ensure delivery of accurate health information. The system-called the Consumer-centered Open Personal Health Record platform-is composed of two services: a portal for users with any disease and a mobile application for users with cleft lip/palate. To assess the benefits of these services, we used a quasi-experimental, pretest-posttest design, assigning participants to the portal (n = 50) and application (n = 52) groups. Both groups showed significantly increased knowledge, both objective (actual knowledge of health information) and subjective (perceived knowledge of health information), after the intervention. Furthermore, while both groups showed higher information needs satisfaction after the intervention, the application group was significantly more satisfied. Knowledge changes were more affected by participant characteristics in the application group. Our results may be due to the application's provision of specific disease information and a personalized treatment plan based on the participant and other users' data. We recommend that services connected with electronic medical records target specific diseases to provide personalized health management to patients in a hospital setting.
Building a Smartphone Seismic Network
NASA Astrophysics Data System (ADS)
Kong, Q.; Allen, R. M.
2013-12-01
We are exploring building a new type of seismic network using smartphones. The accelerometers in smartphones can be used to record earthquakes, the GPS unit can provide an accurate location, and the built-in communication unit makes data transmission easier for this network. In the future, these smartphones may serve as a supplementary network to the current traditional network for scientific research and real-time applications. In order to build this network, we developed an application for Android phones and a server to record acceleration in real time. These records can be sent back to the server in real time and analyzed there. We evaluated the performance of the smartphone as a seismic recording instrument by comparing it with a high-quality accelerometer on controlled shake tables in a variety of tests, including a noise-floor test. Based on the daily human activity data recorded by volunteers and the shake-table test data, we also developed an algorithm for the smartphones to distinguish earthquakes from daily human activities. These elements form the basis for setting up a new prototype smartphone seismic network in the near future.
New scientific ocean drilling depth record extends study of subseafloor life
NASA Astrophysics Data System (ADS)
Showstack, Randy
2012-09-01
The Japanese deep-sea drilling vessel Chikyu set a new depth record for scientific ocean drilling and core retrieval by reaching a depth of 2119.5 meters below the seafloor (mbsf) on 6 September. This is 8.5 meters deeper than the prior record, set 19 years ago. Three days later, on 9 September, Chikyu set another record by reaching a drilling depth of 2466 mbsf, the maximum depth that will be attempted during the current expedition. The 6 September record was set on day 44 of the Deep Coalbed Biosphere off Shimokita expedition, which is expedition 337 of the Integrated Ocean Drilling Program (IODP). It occurred at drilling site C0020 in the northwestern Pacific Ocean, approximately 80 kilometers northeast from Hachinohe, Japan. The expedition is scheduled to conclude on 30 September.
A device for emulating cuff recordings of action potentials propagating along peripheral nerves.
Rieger, Robert; Schuettler, Martin; Chuang, Sheng-Chih
2014-09-01
This paper describes a device that emulates propagation of action potentials along a peripheral nerve, suitable for reproducible testing of bio-potential recording systems using nerve cuff electrodes. The system is a microcontroller-based stand-alone instrument which uses established nerve and electrode models to represent neural activity of real nerves recorded with a nerve cuff interface, taking into consideration electrode impedance, voltages picked up by the electrodes, and action potential propagation characteristics. The system emulates different scenarios including compound action potentials with selectable propagation velocities and naturally occurring nerve traffic from different velocity fiber populations. Measured results from a prototype implementation are reported and compared with in vitro recordings from Xenopus laevis frog sciatic nerve, demonstrating that the electrophysiological setting is represented to a satisfactory degree, useful for the development, optimization and characterization of future recording systems.
An experimental result of estimating an application volume by machine learning techniques.
Hasegawa, Tatsuhito; Koshino, Makoto; Kimura, Haruhiko
2015-01-01
In this study, we improved the usability of smartphones by automating a user's operations. We developed an intelligent system using machine learning techniques that periodically detects a user's context on a smartphone. We selected the Android operating system because it has the largest market share and highest flexibility of its development environment. In this paper, we describe an application that automatically adjusts application volume. Adjusting the volume can be easily forgotten because users need to push the volume buttons to alter the volume depending on the given situation. Therefore, we developed an application that automatically adjusts the volume based on learned user settings. Application volume can be set differently from ringtone volume on Android devices, and these volume settings are associated with each specific application including games. Our application records a user's location, the volume setting, the foreground application name and other such attributes as learning data, thereby estimating whether the volume should be adjusted using machine learning techniques via Weka.
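The core idea, learning per-context volume settings from logged records and predicting the volume for a new context, can be sketched without the study's actual Weka models. Everything below (a plain most-common-value lookup, the field names, the default) is an illustrative assumption:

```python
from collections import Counter

# Minimal sketch: learn volume settings from logged (location, app) contexts
# and predict for a new context. This is a simple majority-vote lookup,
# not the Weka machine learning pipeline used in the study.
def train(records):
    """records: list of ((location, app), volume) observations."""
    by_context = {}
    for context, volume in records:
        by_context.setdefault(context, []).append(volume)
    return by_context

def predict(model, context, default=50):
    """Return the most common logged volume for this context, else a default."""
    volumes = model.get(context)
    if not volumes:
        return default
    return Counter(volumes).most_common(1)[0][0]

log = [(("home", "game"), 0), (("home", "game"), 0),
       (("office", "music"), 30), (("home", "game"), 10)]
model = train(log)
print(predict(model, ("home", "game")))    # 0
print(predict(model, ("bus", "podcast")))  # 50 (unseen context falls back)
```

A learned classifier generalizes across unseen contexts, which a lookup table like this cannot; that is the motivation for the machine learning approach in the abstract.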
Separation and reconstruction of BCG and EEG signals during continuous EEG and fMRI recordings
Xia, Hongjing; Ruan, Dan; Cohen, Mark S.
2014-01-01
Despite considerable effort to remove it, the ballistocardiogram (BCG) remains a major artifact in electroencephalographic data (EEG) acquired inside magnetic resonance imaging (MRI) scanners, particularly in continuous (as opposed to event-related) recordings. In this study, we have developed a new Direct Recording Prior Encoding (DRPE) method to extract and separate the BCG and EEG components from contaminated signals, and have demonstrated its performance by comparing it quantitatively to the popular Optimal Basis Set (OBS) method. Our modified recording configuration allows us to obtain representative bases of the BCG- and EEG-only signals. Further, we have developed an optimization-based reconstruction approach to maximally incorporate prior knowledge of the BCG/EEG subspaces, and of the signal characteristics within them. Both OBS and DRPE methods were tested with experimental data, and compared quantitatively using cross-validation. In the challenging continuous EEG studies, DRPE outperforms the OBS method by nearly sevenfold in separating the continuous BCG and EEG signals. PMID:25002836
The development of the Project NetWork administrative records database for policy evaluation.
Rupp, K; Driessen, D; Kornfeld, R; Wood, M
1999-01-01
This article describes the development of SSA's administrative records database for the Project NetWork return-to-work experiment targeting persons with disabilities. The article is part of a series of papers on the evaluation of the Project NetWork demonstration. In addition to 8,248 Project NetWork participants randomly assigned to receive case management services and a control group, the simulation identified 138,613 eligible nonparticipants in the demonstration areas. The output data files contain detailed monthly information on Supplemental Security Income (SSI) and Disability Insurance (DI) benefits, annual earnings, and a set of demographic and diagnostic variables. The data allow for the measurement of net outcomes and the analysis of factors affecting participation. The results suggest that it is feasible to simulate complex eligibility rules using administrative records, and create a clean and edited data file for a comprehensive and credible evaluation. The study shows that it is feasible to use administrative records data for selecting control or comparison groups in future demonstration evaluations.
NASA Astrophysics Data System (ADS)
Ouellette, G., Jr.; DeLong, K. L.
2016-02-01
High-resolution proxy records of sea surface temperature (SST) are increasingly being produced using trace element and isotope variability within the skeletal materials of marine organisms such as corals, mollusks, sclerosponges, and coralline algae. Translating the geochemical variations within these organisms into records of SST requires calibration with SST observations using linear regression methods, preferably with in situ SST records that span several years. However, locations with such records are sparse; therefore, calibration is often accomplished using gridded SST data products such as the Hadley Centre's HadSST (5°) and interpolated HadISST (1°) data sets, NOAA's extended reconstructed SST data set (ERSST; 2°), optimum interpolation SST (OISST; 1°), and Kaplan SST data sets (5°). From these data products, the SST used for proxy calibration is obtained for a single grid cell that includes the proxy's study site. The gridded data sets are based on the International Comprehensive Ocean-Atmosphere Data Set (ICOADS), and each uses different methods of interpolation to produce globally and temporally complete data products, except for HadSST, which is quality controlled but not interpolated. This study compares SST for a single site from these gridded data products, a high-resolution satellite-based SST data set from NOAA (Pathfinder; 4 km), in situ SST data, and coral Sr/Ca variability for our study site in Haiti to assess differences among these SST records with a focus on seasonal variability. Our results indicate substantial differences in the seasonal variability captured for the same site among these data sets, on the order of 1-3°C. This analysis suggests that, of the data products, high-resolution satellite SST best captured seasonal variability at the study site. Unfortunately, satellite SST records are limited to the past few decades. If satellite SST are to be used to calibrate proxy records, collecting modern, living samples is desirable.
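The linear-regression calibration step mentioned above fits proxy values against observed SST and then inverts the fit to reconstruct temperature. A minimal sketch with synthetic numbers (the Sr/Ca values and the fitted coefficients below are illustrative, not from the study):

```python
# Illustrative ordinary-least-squares calibration of a coral Sr/Ca proxy
# against observed SST; all values are synthetic.
def ols(x, y):
    """Return (slope, intercept) of the least-squares line y = slope*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

sst = [26.0, 27.5, 29.0, 30.5]    # observed SST, degrees C
sr_ca = [9.20, 9.11, 9.02, 8.93]  # coral Sr/Ca, mmol/mol (synthetic)
slope, intercept = ols(sst, sr_ca)

# Invert the fit to reconstruct SST from a new Sr/Ca measurement
reconstructed = (9.065 - intercept) / slope
print(round(slope, 3), round(reconstructed, 2))  # -0.06 28.25
```

The choice of SST series used as `sst` here is exactly what the study evaluates: a 1-3°C difference in seasonal amplitude between gridded products propagates directly into the fitted slope.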
Wald, Hedy S; George, Paul; Reis, Shmuel P; Taylor, Julie Scott
2014-03-01
While electronic health record (EHR) use is becoming state-of-the-art, deliberate teaching of health care information technology (HCIT) competencies is not keeping pace with burgeoning use. Medical students require training to become skilled users of HCIT, but formal pedagogy within undergraduate medical education (UME) is sparse. How can medical educators best meet the needs of learners while integrating EHRs into medical education and practice? How can they help learners preserve and foster effective communication skills within the computerized setting? In general, how can UME curricula be devised for skilled use of EHRs to enhance rather than hinder provision of effective, humanistic health care?Within this Perspective, the authors build on recent publications that "set the stage" for next steps: EHR curricula innovation and implementation as concrete embodiments of theoretical underpinnings. They elaborate on previous calls for maximizing benefits and minimizing risks of EHR use with sufficient focus on physician-patient communication skills and for developing core competencies within medical education. The authors describe bridging theory into practice with systematic longitudinal curriculum development for EHR training in UME at their institution, informed by Kern and colleagues' curriculum development framework, narrative medicine, and reflective practice. They consider this innovation within a broader perspective-the overarching goal of empowering undergraduate medical students' patient- and relationship-centered skills while effectively demonstrating HCIT-related skills.
Development of a tethered personal health record framework for early end-of-life discussions.
Bose-Brill, Seuli; Kretovics, Matthew; Ballenger, Taylor; Modan, Gabriella; Lai, Albert; Belanger, Lindsay; Koesters, Stephen; Pressler-Vydra, Taylor; Wills, Celia
2016-06-01
End-of-life planning, known as advance care planning (ACP), is associated with numerous positive outcomes, such as improved patient satisfaction with care and improved patient quality of life in terminal illness. However, patient-provider ACP conversations are rarely performed or documented due to a number of barriers, including the time required, a perceived lack of skill, and a limited number of resources. Use of tethered personal health records (PHRs) may help streamline ACP conversations and documentation in outpatient workflows. Our objective was to develop an ACP-PHR framework for use in a primary care outpatient setting. Qualitative content analysis of focus groups and cognitive interviews (participatory design). A novel PHR-ACP tool was developed and tested using data and feedback collected from 4 patient focus groups (n = 13), 1 provider focus group (n = 4), and cognitive interviews (n = 22). Patient focus groups helped develop a focused, 4-question PHR communication tool. Cognitive interviews revealed that, while patients felt the framework content and workflow were generally intuitive, minor changes to content and workflow would optimize the framework. A focused framework for electronic ACP communication using a patient portal tethered to the PHR was developed. This framework may provide an efficient way to have ACP conversations in busy outpatient settings.
13 CFR 130.830 - Audits and investigations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Audits and investigations. 130.830 Section 130.830 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS DEVELOPMENT CENTERS § 130.830 Audits and investigations. (a) Access to records. Applicable OMB Circulars set forth the...
13 CFR 130.830 - Audits and investigations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Audits and investigations. 130.830 Section 130.830 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION SMALL BUSINESS DEVELOPMENT CENTERS § 130.830 Audits and investigations. (a) Access to records. Applicable OMB Circulars set forth the...
RESTful Services Guidance for Developers v 1.0
2010-04-01
storing DDMS records. Tagging – Tagging enables the provider of information to associate a set of keywords or “tags” to content. Folksonomies ...or Collaborative Tagging – Folksonomies allow multiple users to attach their own tags to content. Content that gets associated to the same tag by
Methods to approximate reliabilities in single-step genomic evaluation
USDA-ARS?s Scientific Manuscript database
Reliability of predictions from single-step genomic BLUP (ssGBLUP) can be calculated by inversion, but that is not feasible for large data sets. Two methods of approximating reliability were developed based on decomposition of a function of reliability into contributions from records, pedigrees, and...
Ahalt, Cyrus; Binswanger, Ingrid A; Steinman, Michael; Tulsky, Jacqueline; Williams, Brie A
2012-02-01
Incarceration is associated with poor health and high costs. Given the dramatic growth in the criminal justice system's population and associated expenses, inclusion of questions related to incarceration in national health data sets could provide essential data to researchers, clinicians and policy-makers. To evaluate a representative sample of publicly available national health data sets for their ability to be used to study the health of currently or formerly incarcerated persons, and to identify opportunities to improve criminal justice questions in health data sets. DESIGN & APPROACH: We reviewed the 36 data sets from the Society of General Internal Medicine Dataset Compendium related to individual health. Through content analysis using incarceration-related keywords, we identified data sets that could be used to study currently or formerly incarcerated persons, and we identified opportunities to improve the availability of relevant data. While 12 (33%) data sets returned keyword matches, none could be used to study incarcerated persons. Three (8%) could be used to study the health of formerly incarcerated individuals, but only one data set included multiple questions, such as length of incarceration and age at incarceration. Missed opportunities included: (1) data sets that included current prisoners but did not record their status (10, 28%); (2) data sets that asked questions related to incarceration but did not specifically record a subject's status as formerly incarcerated (8, 22%); and (3) longitudinal studies that dropped and/or failed to record persons who became incarcerated during the study (8, 22%). Few health data sets can be used to evaluate the association between incarceration and health. Three types of changes to existing national health data sets could substantially expand the available data: recording incarceration status for study participants who are incarcerated; recording subjects' history of incarceration when such data are already being collected; and expanding incarceration-related questions in studies that already record incarceration history.
Barasz, Kate; John, Leslie K; Keenan, Elizabeth A; Norton, Michael I
2017-10-01
Pseudo-set framing-arbitrarily grouping items or tasks together as part of an apparent "set"-motivates people to reach perceived completion points. Pseudo-set framing changes gambling choices (Study 1), effort (Studies 2 and 3), giving behavior (Field Data and Study 4), and purchase decisions (Study 5). These effects persist in the absence of any reward, when a cost must be incurred, and after participants are explicitly informed of the arbitrariness of the set. Drawing on Gestalt psychology, we develop a conceptual account that predicts what will-and will not-act as a pseudo-set, and defines the psychological process through which these pseudo-sets affect behavior: over and above typical reference points, pseudo-set framing alters perceptions of (in)completeness, making intermediate progress seem less complete. In turn, these feelings of incompleteness motivate people to persist until the pseudo-set has been fulfilled. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Evaluating a scalable model for implementing electronic health records in resource-limited settings.
Were, Martin C; Emenyonu, Nneka; Achieng, Marion; Shen, Changyu; Ssali, John; Masaba, John P M; Tierney, William M
2010-01-01
Current models for implementing electronic health records (EHRs) in resource-limited settings may not be scalable because they fail to address human-resource and cost constraints. This paper describes an implementation model which relies on shared responsibility between local sites and an external three-pronged support infrastructure consisting of: (1) a national technical expertise center, (2) an implementers' community, and (3) a developers' community. This model was used to implement an open-source EHR in three Ugandan HIV clinics. A pre-post time-motion study at one site revealed that primary care providers spent a third less time in direct and indirect care of patients (p<0.001) and 40% more time on personal activities (p=0.09) after EHR implementation. Time spent by previously enrolled patients with non-clinician staff fell by half (p=0.004), and with pharmacy by 63% (p<0.001). Surveyed providers were highly satisfied with the EHR and its support infrastructure. This model offers a viable approach for broadly implementing EHRs in resource-limited settings.
Fadoo, Zehra; Nisar, Muhammad I; Iftikhar, Raza; Ali, Sajida; Mushtaq, Naureen; Sayani, Raza
2015-10-01
Peripherally inserted central venous catheters (PICCs) have been successfully used to provide central access for chemotherapy and frequent transfusions. The purpose of this study was to assess the feasibility of PICCs and determine PICC-related complications in pediatric hematology/oncology patients in a resource-poor setting. All pediatric patients (younger than 16 years) with hematologic and malignant disorders who underwent PICC line insertion at Aga Khan University Hospital from January 2008 to June 2010 were enrolled in the study. Demographic features, primary diagnosis, catheter days, complications, and reasons for removal of the device were recorded. A total of 36 PICC lines were inserted in 32 pediatric patients. A complication rate of 5.29 per 1,000 catheter days was recorded. Our study showed a comparable complication profile in terms of infection rate, occlusion, breakage, and dislodgement. The median catheter life was 69 days. We conclude that PICC lines are feasible in a resource-poor setting and recommend their use for chemotherapy administration and prolonged venous access.
Barnado, April; Casey, Carolyn; Carroll, Robert J; Wheless, Lee; Denny, Joshua C; Crofford, Leslie J
2017-05-01
To study systemic lupus erythematosus (SLE) in the electronic health record (EHR), we must accurately identify patients with SLE. Our objective was to develop and validate novel EHR algorithms that use International Classification of Diseases, Ninth Revision (ICD-9), Clinical Modification codes, laboratory testing, and medications to identify SLE patients. We used Vanderbilt's Synthetic Derivative, a de-identified version of the EHR, with 2.5 million subjects. We selected all individuals with at least 1 SLE ICD-9 code (710.0), yielding 5,959 individuals. To create a training set, 200 subjects were randomly selected for chart review. A subject was defined as a case if diagnosed with SLE by a rheumatologist, nephrologist, or dermatologist. Positive predictive values (PPVs) and sensitivity were calculated for combinations of code counts of the SLE ICD-9 code, a positive antinuclear antibody (ANA), ever use of medications, and a keyword of "lupus" in the problem list. The algorithms with the highest PPV were each internally validated using a random set of 100 individuals from the remaining 5,759 subjects. The algorithm with the highest PPV at 95% in the training set and 91% in the validation set was 3 or more counts of the SLE ICD-9 code, ANA positive (≥1:40), and ever use of both disease-modifying antirheumatic drugs and steroids, while excluding individuals with systemic sclerosis and dermatomyositis ICD-9 codes. We developed and validated the first EHR algorithm that incorporates laboratory values and medications with the SLE ICD-9 code to identify patients with SLE accurately. © 2016, American College of Rheumatology.
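The best-performing rule in the abstract is explicit enough to sketch directly: at least 3 counts of ICD-9 710.0, an ANA titer of at least 1:40, ever-use of both a DMARD and a steroid, and no systemic sclerosis or dermatomyositis codes. The subject-record field names below are illustrative, not the Synthetic Derivative schema, and the exclusion code values (710.1 for systemic sclerosis, 710.3 for dermatomyositis) are the standard ICD-9 assignments rather than values quoted in the abstract:

```python
# Sketch of the validated SLE rule as described in the abstract.
# Field names are hypothetical; a real implementation queries the EHR.
EXCLUSION_CODES = {"710.1", "710.3"}  # systemic sclerosis, dermatomyositis

def flag_sle(subject):
    codes = subject["icd9_codes"]         # list of ICD-9 code strings
    if codes.count("710.0") < 3:          # need >= 3 counts of the SLE code
        return False
    if EXCLUSION_CODES & set(codes):      # exclude mimicking diagnoses
        return False
    if subject["ana_titer"] < 40:         # ANA >= 1:40, stored as 40
        return False
    return subject["ever_dmard"] and subject["ever_steroid"]

case = {"icd9_codes": ["710.0"] * 4, "ana_titer": 160,
        "ever_dmard": True, "ever_steroid": True}
print(flag_sle(case))  # True
```

Requiring repeated codes plus laboratory and medication evidence is what lifts the PPV to 95% over using the billing code alone.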
Supersampling and Network Reconstruction of Urban Mobility.
Sagarra, Oleguer; Szell, Michael; Santi, Paolo; Díaz-Guilera, Albert; Ratti, Carlo
2015-01-01
Understanding human mobility is of vital importance for urban planning, epidemiology, and many other fields that draw policies from the activities of humans in space. Despite the recent availability of large-scale data sets of GPS traces or mobile phone records capturing human mobility, typically only a subsample of the population of interest is represented, giving a possibly incomplete picture of the entire system under study. Methods to reliably extract mobility information from such reduced data and to assess their sampling biases are lacking. To that end, we analyzed a data set of millions of taxi movements in New York City. We first show that, once they are appropriately transformed, mobility patterns are highly stable over long time scales. Based on this observation, we develop a supersampling methodology to reliably extrapolate mobility records from a reduced sample based on an entropy maximization procedure, and we propose a number of network-based metrics to assess the accuracy of the predicted vehicle flows. Our approach provides a well founded way to exploit temporal patterns to save effort in recording mobility data, and opens the possibility to scale up data from limited records when information on the full system is required.
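The extrapolation step can be illustrated in its simplest form: treat sampled origin-destination counts as estimates of trip probabilities and scale them to the full system size. This toy sketch omits the paper's entropy-maximization machinery and network-based accuracy metrics; the data and function are illustrative only.

```python
# Toy sketch of extrapolating full-system trip counts from a sample of
# origin-destination records (the paper's actual method is more involved).
def supersample(sampled_od, full_total):
    """Scale sampled OD counts to an estimated full_total trips."""
    sample_total = sum(sampled_od.values())
    return {od: count / sample_total * full_total
            for od, count in sampled_od.items()}

sample = {("A", "B"): 30, ("B", "A"): 50, ("A", "C"): 20}
estimate = supersample(sample, 10000)
print(estimate)  # each OD pair scaled in proportion to its sampled share
```

The paper's contribution is showing when such extrapolation is reliable, by exploiting the long-term stability of (transformed) mobility patterns and validating predicted flows with network metrics.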
Development of a bird banding recapture database
Tautin, J.; Doherty, P.F.; Metras, L.
2001-01-01
Recaptures (and resightings) constitute the vast majority of post-release data from banded or otherwise marked nongame birds. A powerful suite of contemporary analytical models is available for using recapture data to estimate population size, survival rates and other parameters, and many banders collect recapture data for their project-specific needs. However, despite widely recognized, broader programmatic needs for more and better data, banders' recapture data are not centrally deposited and made available for use by others. To address this need, the US Bird Banding Laboratory, the Canadian Bird Banding Office and the Georgia Cooperative Fish and Wildlife Research Unit are developing a bird banding recapture database. In this poster we discuss the critical steps in developing the database, including: determining exactly which recapture data should be included; developing a standard record format and structure for the database; developing electronic means for collecting, vetting and disseminating the data; and, most importantly, developing metadata descriptions and individual data set profiles to facilitate the user's selection of appropriate analytical models. We provide examples of individual data sets to be included in the database, and we assess the feasibility of developing a prescribed program for obtaining recapture data from banders who do not presently collect them. It is expected that the recapture database eventually will contain millions of records made available publicly for a variety of avian research and management purposes.
Sharing clinical information across care settings: the birth of an integrated assessment system
Gray, Leonard C; Berg, Katherine; Fries, Brant E; Henrard, Jean-Claude; Hirdes, John P; Steel, Knight; Morris, John N
2009-01-01
Background Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training. PMID:19402891
Springall, Fiona
2018-03-21
People with learning disabilities are often marginalised in healthcare, including in hospice settings, and as a result may not receive effective end of life care. Research in hospice settings has identified that many staff lack confidence, skills and knowledge in caring for people with learning disabilities, which can have a negative effect on the care these individuals receive. To address these issues, the author has proposed a service improvement initiative, which she developed as part of her learning disability nursing degree programme. This proposed initiative aimed to enhance end of life care for people with learning disabilities through the implementation of a community learning disability link nurse in the hospice setting. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
Sheridan, Mary; Sandall, Jane
2010-12-01
To pilot the Optimality Index-US (OI-US) for the first time within a UK maternity setting in a sample of women at mixed risk, a multidisciplinary group reviewed the items and evidence base of the OI-US. A pilot study was undertaken to compare the availability and quality of data from maternity records to complete the OI-US. Data were collected from maternity records in a maternity unit of an inner-city teaching hospital in England. Clinical midwives, research midwives, midwifery lecturers and consultant obstetricians (n=10) reviewed the items and evidence base of the OI-US. Data were collected from the maternity records of 97 women receiving caseload care and 103 women receiving standard care. When the multidisciplinary group reviewed the items and evidence base of the OI-US, it was noted that some social and clinical factors should be considered for inclusion as part of the Perinatal Background Index (PBI) and OI. The results suggest that the inclusion of women at higher risk in this sample within the UK maternity setting has not been captured by the OI-US. The following social and clinical factors should be included as part of the PBI and OI for the UK setting: a measure of social deprivation, the woman's ability to speak and understand English in relation to accessing maternity care, mental health problems during pregnancy, and history of domestic violence during pregnancy. Availability of these items in electronic records is poor, and the OI-UK version is recommended as a useful research tool for prospective data collection. The development of an international version would be valuable for comparison of background risk and outcomes across a range of care settings. Copyright © 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Strand, Kari
2005-04-01
The 2300-2600 m thick Palaeoproterozoic East Puolanka Group within the central Fennoscandian Shield records four major transgressions on the cratonic margin within the approximate time period 2.25-2.10 Ga. Stacking of siliciclastic facies in parasequences and parasequence sets provides data to evaluate oscillation of relative sea-level and subsidence on different temporal scales. The lowermost part of the passive margin prism is characterized by alluvial plain to shallow marine sediments deposited in incised valleys. The succeeding highstand period is recorded by ca. 250 m of progradational parasequence sets of predominantly rippled and horizontally laminated sandstones, representing stacked wave-dominated shoreline units in sequence 1, capped by a hiatus or, in some places, by a subaerial lava. As relative sea-level rose again, sand-rich barrier-beach complexes developed with microtidal lagoons and inlets, corresponding to a retrogradational parasequence set. This was followed by a highstand period, with aggradation and progradation of alluvial plain and coastal sediments grading up into wave-tide influenced shoreline deposits in sequence 2. In sequence 3, the succeeding mudstones represent tidal flat deposits in a back-barrier region. With continued transgression, the parasequences stacked retrogradationally, each flooding episode being recorded by increasingly deeper water deposits above low-angle cross-bedded sandstones of the swash zones. The succeeding highstand progradation is represented by alluvial plain deposits. The next transgressive systems tract, overlying an inferred erosional ravinement surface, is recorded by a retrogradational parasequence set dominated by low-angle cross-stratified swash zone deposits in sequence 4. The large-scale trough cross-bed sets in these parasequences represent sand shoals and sheets of the inner shelf system. 
The overall major transgression recorded in the lowermost part of the Palaeoproterozoic cratonic margin succession was related to first- to second-order sea-level changes, probably due to increasing regional thermal subsidence of the lithosphere following partial continental breakup. The stratigraphic evolution can be related to changes of relative sea-level with a frequency of ca. 25 million years, probably propagated by episodic thermal subsidence. The parasequences identified here are related to high-frequency cycles of relative sea-level change due to low-magnitude eustatic oscillations.
Guided Text Search Using Adaptive Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Symons, Christopher T; Senter, James K
This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques that facilitate individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analyst's information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.
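The label-driven re-ranking described above can be illustrated with a much simplified sketch: a bag-of-words centroid built from analyst-labeled relevant records, with unlabeled records sorted by cosine similarity to it. All names and sample records below are invented for illustration; Gryffin's actual semi-supervised algorithms are more sophisticated.

```python
from collections import Counter
import math

def vectorize(text):
    # Bag-of-words term frequencies as a sparse dict.
    return Counter(text.lower().split())

def cosine(a, b):
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def rerank(records, relevant_ids):
    # Build a centroid from the analyst-labeled relevant records,
    # then sort all unlabeled records by similarity to it.
    centroid = Counter()
    for rid in relevant_ids:
        centroid.update(vectorize(records[rid]))
    unlabeled = [r for r in records if r not in relevant_ids]
    return sorted(unlabeled,
                  key=lambda r: cosine(vectorize(records[r]), centroid),
                  reverse=True)

records = {
    "r1": "pipeline outage affecting critical infrastructure",
    "r2": "local bake sale raises funds",
    "r3": "grid infrastructure outage reported",
}
print(rerank(records, ["r1"]))  # → ['r3', 'r2']
```

After the analyst labels "r1" relevant, the infrastructure-related record "r3" moves ahead of the unrelated "r2" in the listing.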
Haile, Michael; Anderson, Kim; Evans, Alex; Crawford, Angela
2012-01-01
In part 1 of this series, we outlined the rationale behind the development of a centralized electronic database used to maintain nonsterile compounding formulation records in the Mission Health System, which is a union of several independent hospitals and satellite and regional pharmacies that form the cornerstone of advanced medical care in several areas of western North Carolina. Hospital providers in many healthcare systems require compounded formulations to meet the needs of their patients (in particular, pediatric patients). Before a centralized electronic compounding database was implemented in the Mission Health System, each satellite or regional pharmacy affiliated with that system had a specific set of formulation records, but no standardized format for those records existed. In this article, we describe the quality control, database platform selection, description, implementation, and execution of our intranet database system, which is designed to maintain, manage, and disseminate nonsterile compounding formulation records in the hospitals and affiliated pharmacies of the Mission Health System. The objectives of that project were to standardize nonsterile compounding formulation records, create a centralized computerized database that would increase healthcare staff members' access to formulation records, establish beyond-use dates based on published stability studies, improve quality control, reduce the potential for medication errors related to compounding medications, and (ultimately) improve patient safety.
NASA Astrophysics Data System (ADS)
Niell, A.
2008-12-01
The next generation geodetic VLBI instrument is being developed with a goal of 1 mm position uncertainty in twenty-four hours. Knowing that spatial and temporal fluctuations in the atmosphere delay are a major component of the error in position determination, the VLBI2010 committee has carried out a large number of simulations to arrive at design goals for the antenna system. These goals are fast slewing antennas and high delay precision per observation. With existing and anticipated data recording capabilities, these translate to an antenna diameter of 12 m or larger and a per-observation delay precision of approximately 4 psec. The major innovation for the VLBI2010 concept that allows the use of relatively small antennas to achieve these goals is the proposal to observe in four frequency bands, instead of the two currently used, in order to gain the higher precision of phase delays compared to the group delay. The other advance that enables the use of small antennas is the significant increase in data acquisition rates that has been made possible by the development of disk-based recorders and digital back ends. To evaluate this concept, a prototype of the feed-to-recorder system has been implemented by the Broadband Development Team* on two antennas, the 5 m MV-3 antenna at Goddard Space Flight Center near Washington, D.C., and the 18 m Westford antenna at Haystack Observatory near Boston. The system includes a broadband feed and low noise amplifiers covering the range approximately 2 GHz to 13 GHz, all cooled to 20K; a newly developed phase calibration generator; a flexible local oscillator (LO) that allows selection of any band in the range of the feed/LNAs; Digital Back End; and a disk-based recorder capable of a sustained rate of 2 gigabits per second (gbps). Four sets of the LO/DBE/recorder chain are used at each antenna to give a total record rate of 8 gbps. The systems have been successfully used in the band 8.5 to 9 GHz with one set of the recorder chain. 
Observations demonstrating the full four-band configuration are planned for October. In this talk the results of these tests, the improvements that are anticipated for the operational VLBI2010 network, and the status of other developments in the next generation of geodetic VLBI systems will be presented. * Bruce Whittier, Mike Titus, Jason SooHoo, Dan Smythe, Alan Rogers, Jay Redmond, Mike Poirier, Chuck Kodak, Alan Hinton, Ed Himwich, Skip Gordon, Mark Evangelista, Irv Diegel, Brian Corey, Tom Clark, Chris Beaudoin (in reverse alphabetical order)
A review of electronic medical record keeping on mobile medical service trips in austere settings.
Dainton, Christopher; Chu, Charlene H
2017-02-01
Electronic medical records (EMRs) may address the need for decision and language support for Western clinicians on mobile medical service trips (MSTs) in low-resource settings abroad, while providing improved access to records and data management. However, there has yet to be a review of this emerging technology used by MSTs in low-resource settings. The aim of this study is to describe EMR systems designed specifically for use by mobile MSTs in remote settings, and accordingly, determine new opportunities for this technology to improve the quality of healthcare provided by MSTs. A MEDLINE, EMBASE, and Scopus/IEEE search and a supplementary Google search were performed for EMR systems specific to mobile MSTs. Information was extracted regarding EMR name, organization, scope of use, platform, open source coding, commercial availability, data integration, and capacity for linguistic and decision support. Missing information was requested by email. After screening of 122 abstracts, two articles remained that discussed deployment of EMR systems in MST settings (iChart, SmartList To Go), and thirteen additional EMR systems were found through the Google search. Of these, three systems (Project Buendia, TEBOW, and the University of Central Florida's internally developed EMR) are based on modified versions of OpenMRS software, while three are smartphone apps (QuickChart EMR, iChart, NotesFirst). Most of the systems use a local network to manage data, while the remaining systems use opportunistic cloud synchronization. Three (TimmyCare, Basil, and Backpack EMR) contain multilingual user interfaces, and only one (QuickChart EMR) contained MST-specific clinical decision support. There have been limited attempts to tailor EMRs to mobile MSTs. Only OpenMRS has a broad user base, and other EMR systems should consider interoperability and data sharing with larger systems as a priority. 
Several systems include tablet compatibility or are specifically designed for smartphones, which may be helpful given the environment and low-resource context. Results from this review may be useful to non-governmental organizations (NGOs) considering modernization of their medical records practices, as EMR use facilitates research, decreases paper administration costs, and improves perceptions of professionalism; however, most MST-specific EMRs remain in their early stages, and further development and research are required before reaching the stage of widespread adoption. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Lord Kelvin's atmospheric electricity measurements
NASA Astrophysics Data System (ADS)
Aplin, Karen; Harrison, R. Giles; Trainer, Matthew; Hough, James
2013-04-01
Lord Kelvin (William Thomson), one of the greatest Victorian scientists, made a substantial but little-recognised contribution to geophysics through his work on atmospheric electricity. He developed sensitive instrumentation for measuring the atmospheric electric field, including invention of a portable electrometer, which made mobile measurements possible for the first time. Kelvin's measurements of the atmospheric electric field in 1859, made during development of the portable electrometer, can be used to deduce the substantial levels of particulate pollution blown over the Scottish island of Arran from the industrial mainland. Kelvin was also testing the electrometer during the largest solar flare ever recorded, the "Carrington event" in the late summer of 1859. Subsequently, Lord Kelvin also developed a water dropper sensor, and employed photographic techniques for "incessant recording" of the atmospheric electric field, which led to the long series of measurements recorded at UK observatories for the remainder of the 19th and much of the 20th century. These data sets have been valuable in both studies of historical pollution and cosmic ray effects on atmospheric processes.
Translation table for DOE/OSTI - COSATI bibliographic records to MARC format records
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gursky, K.; Holtkamp, I.; Landenberger, S.
1985-11-01
This report contains the recommendations of the committee for the conversion of data in OSTI fields to MARC fields. It is intended as a tool for OSTI to use in developing software that would enable DOE libraries to download OSTI records into MARC-based systems. The goal is to transfer as complete a set of data for each record as possible. No attempt was made to incorporate changes in the use of numerical tags that OSTI has made over the years. In addition, there are a few OSTI fields generated for internal OSTI use only, or that cannot be transferred into any MARC field in the Books format; these OSTI fields have been designated as "not converted" in the table.
Using the Electronic Health Record in Nursing Research: Challenges and Opportunities.
Samuels, Joanne G; McGrath, Robert J; Fetzer, Susan J; Mittal, Prashant; Bourgoine, Derek
2015-10-01
Changes in the patient record from the paper to the electronic health record format present challenges and opportunities for the nurse researcher. Current use of data from the electronic health record is in a state of flux. Novel data analytic techniques and massive data sets provide new opportunities for nursing science. Realization of a strong electronic data output future relies on meeting challenges of system use and operability, data presentation, and privacy. Nurse researchers need to rethink aspects of proposal development. Joining ongoing national efforts aimed at creating usable data output is encouraged as a means to affect system design. Working to address challenges and embrace opportunities will help grow the science in a way that answers important patient care questions. © The Author(s) 2015.
The Archival Appraisal of Photographs: A RAMP Study with Guidelines.
ERIC Educational Resources Information Center
Leary, William H.
Prepared for Unesco's Records and Archives Management Programme (RAMP), this study is designed to provide archivists, manuscript and museum curators, and other interested information professionals in both industrialized and developing countries with an understanding of the archival character of photographs, and a set of guidelines for the…
Development of the Prosodic Features of Infants' Vocalizing.
ERIC Educational Resources Information Center
Lane, Harlan; Sheppard, William
Traditional research methods of recording infant verbal behavior, namely, descriptions by a single observer transcribing the utterances of a single infant in a naturalistic setting, have been inadequate to provide data necessary for modern linguistic analyses. The Center for Research on Language and Language Behavior has undertaken to correct this…
Safety and fitness electronic records system (SAFER) : draft master test plan
DOT National Transportation Integrated Search
1995-12-31
The purpose of this plan is to establish a formal set of guidelines and activities to be adhered to and performed by JHU/APL and the developer to ensure that the SAFER System has been tested successfully and is fully compliant with the SAFER System...
DOT National Transportation Integrated Search
2014-01-01
Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to ann...
Magare, Steve; Monda, Jonathan; Kamau, Onesmus; Houston, Stuart; Fraser, Hamish; Powell, John; English, Mike; Paton, Chris
2018-01-01
Background The Kenyan government, working with international partners and local organizations, has developed an eHealth strategy, specified standards, and guidelines for electronic health record adoption in public hospitals and implemented two major health information technology projects: District Health Information Software Version 2, for collating national health care indicators and a rollout of the KenyaEMR and International Quality Care Health Management Information Systems, for managing 600 HIV clinics across the country. Following these projects, a modified version of the Open Medical Record System electronic health record was specified and developed to fulfill the clinical and administrative requirements of health care facilities operated by devolved counties in Kenya and to automate the process of collating health care indicators and entering them into the District Health Information Software Version 2 system. Objective We aimed to present a descriptive case study of the implementation of an open source electronic health record system in public health care facilities in Kenya. Methods We conducted a landscape review of existing literature concerning eHealth policies and electronic health record development in Kenya. Following initial discussions with the Ministry of Health, the World Health Organization, and implementing partners, we conducted a series of visits to implementing sites to conduct semistructured individual interviews and group discussions with stakeholders to produce a historical case study of the implementation. Results This case study describes how consultants based in Kenya, working with developers in India and project stakeholders, implemented the new system into several public hospitals in a county in rural Kenya. The implementation process included upgrading the hospital information technology infrastructure, training users, and attempting to garner administrative and clinical buy-in for adoption of the system. 
The initial deployment was ultimately scaled back due to a complex mix of sociotechnical and administrative issues. Learning from these early challenges, the system is now being redesigned and prepared for deployment in 6 new counties across Kenya. Conclusions Implementing electronic health record systems is a challenging process in high-income settings. In low-income settings, such as Kenya, open source software may offer some respite from the high costs of software licensing, but the familiar challenges of clinical and administration buy-in, the need to adequately train users, and the need for the provision of ongoing technical support are common across the North-South divide. Strategies such as creating local support teams, using local development resources, ensuring end user buy-in, and rolling out in smaller facilities before larger hospitals are being incorporated into the project. These are positive developments to help maintain momentum as the project continues. Further integration with existing open source communities could help ongoing development and implementations of the project. We hope this case study will provide some lessons and guidance for other challenging implementations of electronic health record systems as they continue across Africa. PMID:29669709
1977-04-01
The task of data organization, management, and storage has been given to a select group of specialists. These specialists (the Data Base Administrators, ..., report writers, etc.) ... in a distributed DBMS involves first identifying a set of two or more tasks blocking each other from a collection of shared records. Once the set of
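The fragment above describes identifying a set of two or more tasks blocking each other over shared records, which is classically done by finding a cycle in a wait-for graph. The sketch below is a standard depth-first cycle search, not the report's actual algorithm; the task names are invented.

```python
def find_deadlock(wait_for):
    """Detect a cycle (deadlock) in a wait-for graph, where
    wait_for[t] is the set of tasks that task t is blocked on."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {t: WHITE for t in wait_for}
    stack = []

    def visit(t):
        color[t] = GREY
        stack.append(t)
        for u in wait_for.get(t, ()):
            if color.get(u, WHITE) == GREY:   # back edge: cycle found
                return stack[stack.index(u):]
            if color.get(u, WHITE) == WHITE:
                cycle = visit(u)
                if cycle:
                    return cycle
        color[t] = BLACK
        stack.pop()
        return None

    for t in wait_for:
        if color[t] == WHITE:
            cycle = visit(t)
            if cycle:
                return cycle
    return None

# T1 waits on a record held by T2, T2 on one held by T3, T3 on T1.
graph = {"T1": {"T2"}, "T2": {"T3"}, "T3": {"T1"}}
print(find_deadlock(graph))  # → ['T1', 'T2', 'T3']
```

Once such a cycle is found, a distributed DBMS typically resolves the deadlock by aborting one task in the cycle so the others can proceed.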
Towards large-scale, human-based, mesoscopic neurotechnologies.
Chang, Edward F
2015-04-08
Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.
DRUMS: Disk Repository with Update Management and Select option for high throughput sequencing data.
Nettling, Martin; Thieme, Nils; Both, Andreas; Grosse, Ivo
2014-02-04
New technologies for analyzing biological samples, like next generation sequencing, are producing a growing amount of data together with quality scores. Moreover, software tools (e.g., for mapping sequence reads, calculating transcription factor binding probabilities, estimating epigenetic modification enriched regions, or determining single nucleotide polymorphisms) increase this amount of position-specific DNA-related data even further. Hence, requesting data becomes challenging and expensive and is often implemented using specialised hardware. In addition, picking specific data as fast as possible becomes increasingly important in many fields of science. The general problem of handling big data sets was addressed by developing specialized databases like HBase, HyperTable or Cassandra. However, these database solutions also require specialized or distributed hardware, leading to expensive investments. To the best of our knowledge, there is no database capable of (i) storing billions of position-specific DNA-related records, (ii) performing fast and resource-saving requests, and (iii) running on a single standard computer. Here, we present DRUMS (Disk Repository with Update Management and Select option), satisfying demands (i)-(iii). It tackles the weaknesses of traditional databases while handling position-specific DNA-related data in an efficient manner. DRUMS is capable of storing up to billions of records. Moreover, it focuses on optimizing related single lookups such as range requests, which are needed constantly for computations in bioinformatics. To validate the power of DRUMS, we compare it to the widely used MySQL database. The test setting considers two biological data sets. We use standard desktop hardware as the test environment. DRUMS outperforms MySQL in writing and reading records by a factor of two up to a factor of 10000. Furthermore, it can work with significantly larger data sets. 
Our work focuses on mid-sized data sets up to several billion records without requiring cluster technology. Storing position-specific data is a general problem and the concept we present here is a generalized approach. Hence, it can be easily applied to other fields of bioinformatics.
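The range-request access pattern emphasized above, i.e. contiguous lookups over records keyed by genomic position, can be illustrated with a toy in-memory store: keep records sorted by (chromosome, position) so a range request is two binary searches plus a slice. The class and data are invented for illustration; DRUMS itself is a disk-based repository, not this sketch.

```python
import bisect

class PositionStore:
    """Toy position-indexed store: records kept sorted by
    (chromosome, position) so a range request costs two binary
    searches plus a contiguous slice."""

    def __init__(self, records):
        # records: iterable of (chrom, pos, value) tuples
        self.records = sorted(records)
        self.keys = [(c, p) for c, p, _ in self.records]

    def range_query(self, chrom, start, end):
        # All records with the given chromosome and start <= pos <= end.
        lo = bisect.bisect_left(self.keys, (chrom, start))
        hi = bisect.bisect_right(self.keys, (chrom, end))
        return self.records[lo:hi]

store = PositionStore([
    ("chr1", 100, "A"), ("chr1", 250, "G"),
    ("chr1", 900, "T"), ("chr2", 120, "C"),
])
print(store.range_query("chr1", 200, 1000))
# → [('chr1', 250, 'G'), ('chr1', 900, 'T')]
```

On disk, the same idea means neighbouring positions live in neighbouring blocks, so a range request translates into a single sequential read rather than many scattered lookups.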
Building professional competence in dental hygiene students through a community-based practicum.
Yoon, M N; Compton, S M
2017-11-01
As Canadians age, there is an increased need for oral health professionals specializing in services for this unique population. Dental hygiene students require exposure to this population to develop professional competencies. This study investigated the dimensions of professional competence that were developed through a practicum for dental hygiene students in long-term care settings while working with older adults. Nine dental hygiene students were recruited across two cohorts. All students completed reflective journals describing their practicum experiences. Five students also participated in an audio-recorded focus group and completed a pre-focus group questionnaire. Additionally, the practicum course coordinator completed an audio-recorded interview. Transcripts and journals were coded using a constant comparative approach and themes were identified. Students described developing client-focused skills, such as effective verbal and non-verbal communication with older adults with dementia. Context-based learning was also a large part of the competency development for the practicum students. Understanding the care environment within which these residents lived helped students to understand and empathize with why oral health may not be prioritized. Students also developed an understanding of the work of other health professionals in the settings and improved their abilities to communicate with other healthcare providers. However, students recognized that the utility of those interprofessional skills in private practice may be limited. Dental hygiene students developed personal and ethical competencies during practicum that are highly transferable across professional settings. Exposure of students to older adult populations in long-term care may increase the likelihood of dental hygienists working in this area. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Non-invasive distress evaluation in preterm newborn infants.
Manfredi, C; Bocchi, L; Orlandi, S; Calisti, M; Spaccaterra, L; Donzelli, G P
2008-01-01
With the increased survival of very preterm infants, there is a growing concern for their developmental outcomes. Infant cry characteristics reflect the development and possibly the integrity of the central nervous system. In this paper, relationships between fundamental frequency (F(0)) and vocal tract resonance frequencies (F(1)-F(3)) are investigated for a set of preterm newborns, by means of a multi-purpose voice analysis tool (BioVoice), characterised by high-resolution and tracking capabilities. Also, first results about possible distress occurring during cry in preterm newborn infants, as related to the decrease of central blood oxygenation, are presented. To this aim, a recording system (Newborn Recorder) has been developed, that allows synchronised, non-invasive monitoring of blood oxygenation and audio recordings of newborn infant's cry. The method has been applied to preterm newborns at the Intensive Care Unit, A.Meyer Children Hospital, Firenze, Italy.
ERIC Educational Resources Information Center
American Association of Advertising Agencies Educational Foundation, New York, NY.
This set of papers represents the written record of the 1973 national conference for advertising educators held at Arizona State University in March. The conference focus was on current developments in the practice and teaching of advertising. The purpose of the conference was to bring insights about current advertising developments to the…
Dillahunt-Aspillaga, Christina; Finch, Dezon; Massengale, Jill; Kretzmer, Tracy; Luther, Stephen L.; McCart, James A.
2014-01-01
Objective The purpose of this pilot study is 1) to develop an annotation schema and a training set of annotated notes to support the future development of a natural language processing (NLP) system to automatically extract employment information, and 2) to determine if information about employment status, goals and work-related challenges reported by service members and Veterans with mild traumatic brain injury (mTBI) and post-deployment stress can be identified in the Electronic Health Record (EHR). Design Retrospective cohort study using data from selected progress notes stored in the EHR. Setting Post-deployment Rehabilitation and Evaluation Program (PREP), an in-patient rehabilitation program for Veterans with TBI at the James A. Haley Veterans' Hospital in Tampa, Florida. Participants Service members and Veterans with TBI who participated in the PREP program (N = 60). Main Outcome Measures Documentation of employment status, goals, and work-related challenges reported by service members and recorded in the EHR. Results Two hundred notes were examined and unique vocational information was found indicating a variety of self-reported employment challenges. Current employment status and future vocational goals along with information about cognitive, physical, and behavioral symptoms that may affect return-to-work were extracted from the EHR. The annotation schema developed for this study provides an excellent tool upon which NLP studies can be developed. Conclusions Information related to employment status and vocational history is stored in text notes in the EHR system. Information stored in text does not lend itself to easy extraction or summarization for research and rehabilitation planning purposes. Development of NLP systems to automatically extract text-based employment information provides data that may improve the understanding and measurement of employment in this important cohort. PMID:25541956
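An annotation schema over clinical notes, like the one the study develops, is commonly represented as labeled character spans over the note text. The class name, labels, and example note below are invented for illustration and are not taken from the study's schema.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    start: int   # character offset where the span begins
    end: int     # character offset just past the span
    label: str   # schema category, e.g. "EmploymentStatus"

    def text(self, note):
        # Recover the annotated surface text from the note.
        return note[self.start:self.end]

note = "Veteran reports he is currently unemployed and hopes to return to welding."
gold = [
    Annotation(32, 42, "EmploymentStatus"),  # "unemployed"
    Annotation(66, 73, "VocationalGoal"),    # "welding"
]
for a in gold:
    print(a.label, "->", a.text(note))
```

A training set of such span annotations is exactly what a later NLP system needs: the spans serve as gold labels against which automatic extraction of employment information can be trained and scored.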
van der Meer, Aize Franciscus; Touw, Daniël J; Marcus, Marco A E; Neef, Cornelis; Proost, Johannes H
2012-10-01
Observational data sets can be used for population pharmacokinetic (PK) modeling. However, these data sets are generally less precisely recorded than experimental data sets. This article aims to investigate the influence of erroneous records on population PK modeling and individual maximum a posteriori Bayesian (MAPB) estimation. A total of 1123 patient records of neonates who were administered vancomycin were used for population PK modeling by iterative 2-stage Bayesian (ITSB) analysis. Cut-off values for weighted residuals were tested for exclusion of records from the analysis. A simulation study was performed to assess the influence of erroneous records on population modeling and individual MAPB estimation. The cut-off values for weighted residuals were also tested in the simulation study. Errors in registration have limited influence on the outcomes of population PK modeling but can have detrimental effects on individual MAPB estimation. A population PK model created from a data set with many registration errors has little influence on subsequent MAPB estimates for precisely recorded data. A weighted residual value of 2 for concentration measurements has good discriminative power for identification of erroneous records. ITSB analysis and its individual estimates are hardly affected by most registration errors. Large registration errors can be detected by weighted residuals of concentration measurements.
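The weighted-residual screen described above can be sketched as follows: scale each concentration residual by its expected standard deviation and flag records beyond the cut-off of 2 that the abstract reports. The function names and toy records are illustrative, not from the article.

```python
def weighted_residual(observed, predicted, sd):
    # Residual scaled by the expected standard deviation of the
    # measurement, so values are roughly unit-normal for good records.
    return (observed - predicted) / sd

def flag_erroneous(records, cutoff=2.0):
    """Flag records whose |weighted residual| exceeds the cut-off
    (a value of 2 is reported to discriminate well for
    concentration measurements)."""
    return [r for r in records
            if abs(weighted_residual(r["obs"], r["pred"], r["sd"])) > cutoff]

records = [
    {"id": 1, "obs": 12.1, "pred": 11.8, "sd": 1.5},  # plausible record
    {"id": 2, "obs": 48.0, "pred": 12.0, "sd": 1.5},  # likely mis-recorded
]
print([r["id"] for r in flag_erroneous(records)])  # → [2]
```

Flagged records can then be excluded from (or down-weighted in) the population analysis before individual MAPB estimates are computed.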
Indico central - events organisation, ergonomics and collaboration tools integration
NASA Astrophysics Data System (ADS)
Benito González López, José; Ferreira, José Pedro; Baron, Thomas
2010-04-01
While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on describing the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration e-payment, audiovisual recording, webcast, room booking, and videoconference support).
On the Edge of Life, II: House Officer Struggles Recorded in an Intensive Care Unit Journal
Sekeres, Mikkael A.; Stern, Theodore A.
2002-01-01
Background: In a general hospital, few clinical settings match the intensity of the intensive care unit (ICU) experience. Clinical rotations in ICUs elicit and emphasize the struggles house officers face on a daily basis throughout their training. Method: These struggles were recorded by hundreds of residents in a journal maintained in one Medical ICU for the past 20 years. We systematically reviewed these unsolicited entries to develop categories that define and illustrate common stressors. Results: Stressors for house officers include isolation, insecurity, care for the terminally ill, sleep deprivation, and long work weeks. Conclusion: By placing the struggles of house staff in context, trainees and their residency training programs can be prepared for the intensity of the experience and for work in clinical practice settings that follows completion of training. PMID:15014706
Holographic optical system for aberration corrections in laser Doppler velocimetry
NASA Technical Reports Server (NTRS)
Kim, R. C.; Case, S. K.; Schock, H. J.
1985-01-01
An optical system containing multifaceted holographic optical elements (HOEs) has been developed to correct for aberrations introduced by nonflat windows in laser Doppler velocimetry. The multifacet aberration correction approach makes it possible to record on one plate many sets of adjacent HOEs that address different measurement volume locations. By using 5-mm-diameter facets, it is practical to place 10-20 sets of holograms on one 10 x 12.5-cm plate, so that the procedure of moving the entire optical system to examine different locations may not be necessary. The holograms are recorded in dichromated gelatin and therefore are nonabsorptive and suitable for use with high-power argon laser beams. Low f-number optics coupled with a 90-percent efficient distortion-correcting hologram in the collection side of the system yield high optical efficiency.
Wang, Yang; Zekveld, Adriana A; Wendt, Dorothea; Lunner, Thomas; Naylor, Graham; Kramer, Sophia E
2018-01-01
Pupil light reflex (PLR) has been widely used as a method for evaluating parasympathetic activity. The first aim of the present study is to develop a PLR measurement using a computer screen set-up and compare its results with the PLR generated by a more conventional set-up using a light-emitting diode (LED). The parasympathetic nervous system, which is known to control the 'rest and digest' response of the human body, is considered to be associated with daily life fatigue. However, only a few studies have attempted to test the relationship between self-reported daily fatigue and physiological measurement of the parasympathetic nervous system. Therefore, the second aim of this study was to investigate the relationship between daily-life fatigue, assessed using the Need for Recovery scale, and parasympathetic activity, as indicated by the PLR parameters. A pilot study was conducted first to develop a PLR measurement set-up using a computer screen. PLRs evoked by light stimuli with different characteristics were recorded to confirm the influence of light intensity, flash duration, and color on the PLRs evoked by the system. In the subsequent experimental study, we recorded the PLR of 25 adult participants to light flashes generated by the screen set-up as well as by a conventional LED set-up. PLR parameters relating to parasympathetic and sympathetic activity were calculated from the pupil responses. We tested the split-half reliability across two consecutive blocks of trials, and the relationships between the parameters of PLRs evoked by the two set-ups. Participants rated their need for recovery prior to the PLR recordings. PLR parameters acquired in the screen and LED set-ups showed good reliability for amplitude-related parameters. The PLRs evoked by both set-ups were consistent, but showed systematic differences in the absolute values of all parameters. Additionally, higher need for recovery was associated with faster and larger constriction of the PLR.
This study assessed the PLR generated by a computer screen and the PLR generated by an LED. The good reliability within set-ups and the consistency between the PLRs evoked by the set-ups indicate that both systems provide a valid way to evoke the PLR. A higher need for recovery was associated with faster and larger constricting PLRs, suggesting increased levels of parasympathetic nervous system activity in people experiencing higher levels of need for recovery on a daily basis.
NASA Astrophysics Data System (ADS)
Merchant, C. J.; Llewellyn-Jones, D.; Saunders, R. W.; Rayner, N. A.; Kent, E. C.; Old, C. P.; Berry, D.; Birks, A. R.; Blackmore, T.; Corlett, G. K.; Embury, O.; Jay, V. L.; Kennedy, J.; Mutlow, C. T.; Nightingale, T. J.; O'Carroll, A. G.; Pritchard, M. J.; Remedios, J. J.; Tett, S.
We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991-2007 from the observations of the series of three along-track scanning radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K per decade, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability of sufficient quality to advance questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes development of new, consistent estimators for SST for each of the ATSRs, and detailed analysis of overlap periods. Novel aspects of the approach include generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.
78 FR 17778 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-22
... System,'' paragraph (iii), first sentence, we inadvertently left out ``minimum data set''. We are adding ``minimum data set'' right after the word ``registry''. And under the heading, ``Routine Uses of Records... sentence, insert ``minimum data set,'' immediately after the word ``registry,''. On the same page, renumber...
Developing Air Force Strategic Leadership - A Career Long Process
2012-05-14
experience, the Air Force as an organization is more prone to parochialism, myopia and monistic thinking. 27 Besides providing a historical record of... myopia and monistic thinking.” 120 Worden also claims the uniformity of perspective hindered strategic leaders from understanding or recognizing the...discussions in a virtual setting. While the use of social media is a developing concept, some officers have enjoyed early opportunities for PME or
Assessment of NDE Reliability Data
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Chang, F. H.; Couchman, J. C.; Lemon, G. H.; Packman, P. F.
1976-01-01
Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
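The binomial calculation the abstract refers to, probability of flaw detection (POD) at a stated confidence level, can be sketched briefly. This is an assumed Clopper-Pearson-style lower bound built on the standard library, not the report's actual program:

```python
# Minimal sketch: one-sided lower confidence bound on probability of detection
# given s detections in n inspections, via bisection on the binomial tail.
from math import comb

def binom_sf(s, n, p):
    """P(X >= s) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(s, n + 1))

def pod_lower_bound(s, n, confidence=0.95, tol=1e-9):
    """Lower confidence bound on POD (Clopper-Pearson style)."""
    if s == 0:
        return 0.0
    alpha = 1.0 - confidence
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # P(X >= s) increases with p; close in on the p where it equals alpha.
        if binom_sf(s, n, mid) < alpha:
            lo = mid
        else:
            hi = mid
    return lo

# The classic "29 of 29" case: 29 detections in 29 trials demonstrates
# roughly 90% POD at 95% confidence.
print(round(pod_lower_bound(29, 29, 0.95), 3))  # → 0.902
```

The same function evaluated at 50% confidence reproduces the other curve family the abstract mentions; pooling merged data sets simply means summing their s and n before calling it.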
NASA Astrophysics Data System (ADS)
Schroeder, Marc; Graw, Kathrin; Andersson, Axel; Fennig, Karsten; Bakan, Stephan; Klepp, Christian
2017-04-01
The global water cycle is a key component of the global climate system, as it describes and links many important processes such as evaporation, convection, cloud formation and precipitation. Through latent heat release, it is also closely connected to the global energy cycle and its changes. The difference between precipitation and evaporation yields the freshwater flux, which indicates whether a particular region of the earth receives more water through precipitation than it loses through evaporation, or vice versa. On the global scale and over long time periods, however, the amounts of evaporation and precipitation are balanced. A profound understanding of the water cycle is therefore a key prerequisite for successful climate modelling. The Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite Data (HOAPS) set is a fully satellite-based climatology of precipitation, evaporation and freshwater budget as well as related turbulent heat fluxes and atmospheric state variables over the global ice-free oceans. All geophysical parameters are derived from passive microwave radiometers, except for the SST, which is taken from AVHRR measurements based on thermal emission of the Earth. Starting with release 3.1, the HOAPS climate data record is hosted by the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF), and its further development is shared with the University of Hamburg and the MPI-M. While the HOAPS release 3.2 in 2012 covered the entire record of the passive microwave radiometer SSM/I, the new version of the HOAPS data set, version 4, also includes the SSMIS record up to December 2014 and uncertainty estimates for parameters related to evaporation. These HOAPS data products are available as monthly averages and 6-hourly composites on a regular latitude/longitude grid with a spatial resolution of 0.5° x 0.5° from July 1987 to December 2014 (December 2008 for HOAPS3.2).
Covering nearly 28 years, the new HOAPS data set is highly valuable for climate applications. The data can be retrieved from the CM SAF web user interface http://wui.cmsaf.eu and from http://www.hoaps.org. The presentation will cover details of the HOAPS-4 release, recent enhancements as well as future plans for the further development of the HOAPS data set. For example, a new 1D-Var-based retrieval was developed for the integrated water vapour and near-surface wind speed products. We show the differences between the statistical retrievals used in HOAPS-3.2 and the new HOAPS-4 products, results from comparisons to various satellite-based data records, and results from comparisons to buoy and ship observations. A specific focus is on the assessment of the stability and uncertainties.
NASA Astrophysics Data System (ADS)
Thierens, M.; Browning, E.; Pirlet, H.; Loutre, M.-F.; Dorschel, B.; Huvenne, V. A. I.; Titschack, J.; Colin, C.; Foubert, A.; Wheeler, A. J.
2013-08-01
Through the interplay of a stabilising cold-water coral framework and a dynamic sedimentary environment, cold-water coral carbonate mounds create distinctive centres of bio-geological accumulation in often complex (continental margin) settings. The IODP Expedition 307 drilling of the Challenger Mound (eastern Porcupine Seabight; NE Atlantic) not only retrieved the first complete developmental history of a coral carbonate mound, it also exposed a unique, Early-Pleistocene sedimentary sequence of exceptional resolution along the mid-latitudinal NE Atlantic margin. In this study, a comprehensive assessment of the Challenger Mound as an archive of Quaternary palaeo-environmental change and long-term coral carbonate mound development is presented. New and existing environmental proxy records, including clay mineralogy, planktonic foraminifer and calcareous nannofossil biostratigraphy and assemblage counts, planktonic foraminifer oxygen isotopes and siliciclastic particle-size, are thereby discussed within a refined chronostratigraphic and climatic context. Overall, the development of the Challenger Mound shows a strong affinity to the Plio-Pleistocene evolution of the Northern Hemisphere climate system, albeit not being completely in phase with it. The two major oceanographic and climatic transitions of the Plio-Pleistocene - the Late Pliocene/Early Pleistocene intensification of continental ice-sheet development and the mid-Pleistocene transition to the more extremely variable and more extensively glaciated late Quaternary - mark two major thresholds in Challenger Mound development: its Late Pliocene (>2.74 Ma) origin and its Middle-Late Pleistocene to recent decline. Distinct surface-water perturbations (i.e. water-mass/polar front migrations, productivity changes, melt-water pulses) are identified throughout the sequence, which can be linked to the intensity and extent of ice development on the nearby British-Irish Isles since the earliest Pleistocene. 
Glaciation-induced shifts in surface-water primary productivity are thereby proposed to fundamentally control cold-water coral growth, which in turn influences on-mound sediment accumulation and, hence, coral carbonate mound development throughout the Pleistocene. As local factors, such as proximal ice-sheet dynamics and on-mound changes in cold-water coral density, significantly affected the development of the Challenger Mound, they can potentially explain the nature of its palaeo-record and its offsets with the periodicities of global climate variability. On the other hand, owing to this unique setting, a regionally exceptional, high-resolution palaeo-record of Early Pleistocene (ca 2.6 to 2.1 Ma) environmental change (including early British-Irish ice-sheet development), broadly in phase with the 41 ka-paced global climate system, is preserved in the lower Challenger Mound. All in all, the Challenger Mound record highlights the wider relevance of coral carbonate mound archives and their potential to capture unique records from dynamic (continental margin) environments.
Use of the proGAV shunt valve in normal-pressure hydrocephalus.
Toma, Ahmed K; Tarnaris, Andrew; Kitchen, Neil D; Watkins, Laurence D
2011-06-01
Overdrainage is a common complication associated with shunt insertion in normal-pressure hydrocephalus (NPH) patients. Using adjustable valves with antigravity devices has been shown to reduce its incidence. The optimal starting setting of an adjustable shunt valve in NPH is debatable. Our objective was to audit our single-center practice of setting adjustable valves. We performed a retrospective review of clinical records of all NPH patients treated in our unit between 2006 and 2009 by the insertion of shunts with a proGAV valve, recording demographic and clinical data, shunt complications, and revision rates. Radiological reports of postoperative follow-up computed tomography scans of the brain were reviewed for detected subdural hematomas. A proGAV adjustable valve was inserted in 50 probable NPH patients between July 2006 and November 2009. Mean ± SD age was 76 ± 7 years. Mean follow-up was 15 months. The initial shunt setting was 6 ± 3 cm H2O, and the final setting was 4.9 ± 1.9 cm H2O. Nineteen patients required 24 readjustment procedures (readjustment rate, 38%; readjustment number, 0.48 times per patient). One patient (2%) developed delayed bilateral subdural hematoma after readjustment of his shunt valve setting as an outpatient. Starting with a low opening pressure setting on a proGAV adjustable shunt valve does not increase the chances of overdrainage complications and reduces the need for repeated readjustments.
High-throughput single-molecule force spectroscopy for membrane proteins
NASA Astrophysics Data System (ADS)
Bosshart, Patrick D.; Casagrande, Fabio; Frederix, Patrick L. T. M.; Ratera, Merce; Bippes, Christian A.; Müller, Daniel J.; Palacin, Manuel; Engel, Andreas; Fotiadis, Dimitrios
2008-09-01
Atomic force microscopy-based single-molecule force spectroscopy (SMFS) is a powerful tool for studying the mechanical properties, intermolecular and intramolecular interactions, unfolding pathways, and energy landscapes of membrane proteins. One limiting factor for the large-scale applicability of SMFS on membrane proteins is its low efficiency in data acquisition. We have developed a semi-automated high-throughput SMFS (HT-SMFS) procedure for efficient data acquisition. In addition, we present a coarse filter to efficiently extract protein unfolding events from large data sets. The HT-SMFS procedure and the coarse filter were validated using the proton pump bacteriorhodopsin (BR) from Halobacterium salinarum and the L-arginine/agmatine antiporter AdiC from the bacterium Escherichia coli. To screen for molecular interactions between AdiC and its substrates, we recorded data sets in the absence and in the presence of L-arginine, D-arginine, and agmatine. Altogether ~400 000 force-distance curves were recorded. Application of coarse filtering to this wealth of data yielded six data sets with ~200 (AdiC) and ~400 (BR) force-distance spectra in each. Importantly, the raw data for most of these data sets were acquired in one to two days, opening new perspectives for HT-SMFS applications.
Min, Yul Ha; Park, Hyeoun-Ae; Chung, Eunja; Lee, Hyunsook
2013-12-01
The purpose of this paper is to describe the components of a next-generation electronic nursing records system ensuring full semantic interoperability and integrating evidence into the nursing records system. A next-generation electronic nursing records system based on detailed clinical models and clinical practice guidelines was developed at Seoul National University Bundang Hospital in 2013. This system has two components, a terminology server and a nursing documentation system. The terminology server manages nursing narratives generated from entity-attribute-value triplets of detailed clinical models using a natural language generation system. The nursing documentation system provides nurses with a set of nursing narratives arranged around the recommendations extracted from clinical practice guidelines. An electronic nursing records system based on detailed clinical models and clinical practice guidelines was successfully implemented in a hospital in Korea. The next-generation electronic nursing records system can support nursing practice and nursing documentation, which in turn will improve data quality.
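How a terminology server might render entity-attribute-value (EAV) triplets as nursing narratives can be caricatured in a few lines. The template and triplets below are invented for illustration and are far simpler than a real natural language generation system:

```python
# Toy sketch, not the Seoul National University Bundang Hospital system:
# render EAV triplets from a detailed clinical model as one narrative sentence.
def narrative_from_triplets(triplets):
    """Join (entity, attribute, value) triplets into a single nursing narrative."""
    parts = [f"{entity} {attribute} is {value}" for entity, attribute, value in triplets]
    return "; ".join(parts) + "."

triplets = [("pain", "severity", "moderate"), ("pain", "location", "lower back")]
print(narrative_from_triplets(triplets))
# → pain severity is moderate; pain location is lower back.
```

In the described architecture, such generated narratives would then be arranged around recommendations extracted from clinical practice guidelines rather than emitted directly.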
High-Speed Recording of Test Data on Hard Disks
NASA Technical Reports Server (NTRS)
Lagarde, Paul M., Jr.; Newnan, Bruce
2003-01-01
Disk Recording System (DRS) is a systems-integration computer program for a direct-to-disk (DTD) high-speed data acquisition system (HDAS) that records rocket-engine test data. The HDAS consists partly of equipment originally designed for recording the data on tapes. The tape recorders were replaced with hard-disk drives, necessitating the development of DRS to provide an operating environment that ties two computers, a set of five DTD recorders, and signal-processing circuits from the original tape-recording version of the HDAS into one working system. DRS includes three subsystems: (1) one that generates a graphical user interface (GUI), on one of the computers, that serves as a main control panel; (2) one that generates a GUI, on the other computer, that serves as a remote control panel; and (3) a data-processing subsystem that performs tasks on the DTD recorders according to instructions sent from the main control panel. The software affords capabilities for dynamic configuration to record single or multiple channels from a remote source, remote starting and stopping of the recorders, indexing to prevent overwriting of data, and production of filtered frequency data from an original time-series data file.
Bru, Juan; Berger, Christopher A
2012-01-01
Background: Point-of-care electronic medical records (EMRs) are a key tool to manage chronic illness. Several EMRs have been developed for use in treating HIV and tuberculosis, but their applicability to primary care, technical requirements and clinical functionalities are largely unknown. Objectives: This study aimed to address the needs of clinicians from resource-limited settings without reliable internet access who are considering adopting an open-source EMR. Study eligibility criteria: Open-source point-of-care EMRs suitable for use in areas without reliable internet access. Study appraisal and synthesis methods: The authors conducted a comprehensive search of all open-source EMRs suitable for sites without reliable internet access. The authors surveyed clinician users and technical implementers from a single site and technical developers of each software product. The authors evaluated availability, cost and technical requirements. Results: The hardware and software for all six systems are easily available, but the systems vary considerably in proprietary components, installation requirements and customisability. Limitations: This study relied solely on self-report from informants who developed and who actively use the included products. Conclusions and implications of key findings: Clinical functionalities vary greatly among the systems, and none of the systems yet meet minimum requirements for effective implementation in a primary care resource-limited setting. The safe prescribing of medications is a particular concern with current tools. The dearth of fully functional EMR systems indicates a need for a greater emphasis by global funding agencies to move beyond disease-specific EMR systems and develop a universal open-source health informatics platform. PMID:22763661
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced from various hospital information systems. Until the advent of XML technology, such data could not exist independently of the information system that produced them. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
Patients' experiences when accessing their on-line electronic patient records in primary care.
Pyper, Cecilia; Amery, Justin; Watson, Marion; Crook, Claire
2004-01-01
BACKGROUND: Patient access to on-line primary care electronic patient records is being developed nationally. Knowledge of what happens when patients access their electronic records is poor. AIM: To enable 100 patients to access their electronic records for the first time to elicit patients' views and to understand their requirements. DESIGN OF STUDY: In-depth interviews using semi-structured questionnaires as patients accessed their electronic records, plus a series of focus groups. SETTING: Secure facilities for patients to view their primary care records privately. METHOD: One hundred patients from a randomised group viewed their on-line electronic records for the first time. The questionnaire and focus groups addressed patients' views on the following topics: ease of use; confidentiality and security; consent to access; accuracy; printing records; expectations regarding content; exploitation of electronic records; receiving new information and bad news. RESULTS: Most patients found the computer technology used acceptable. The majority found viewing their record useful and understood most of the content, although medical terms and abbreviations required explanation. Patients were concerned about security and confidentiality, including potential exploitation of records. They wanted the facility to give informed consent regarding access and use of data. Many found errors, although most were not medically significant. Many expected more detail and more information. Patients wanted to add personal information. CONCLUSION: Patients have strong views on what they find acceptable regarding access to electronic records. Working in partnership with patients to develop systems is essential to their success. Further work is required to address legal and ethical issues of electronic records and to evaluate their impact on patients, health professionals and service provision. PMID:14965405
Data-driven approach for creating synthetic electronic medical records.
Buczak, Anna L; Babin, Steven; Moniz, Linda
2010-10-14
New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases. 
The pilot synthetic background records were in the 4-11 year old age group. The adaptations that must be made to the algorithms to produce synthetic background EMRs for other age groups are indicated.
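The three-step method outlined above can be sketched in miniature. Everything here (names, care patterns, fields) is hypothetical; in the described system the care patterns are mined from real EMR data for similar health problems rather than hard-coded:

```python
# Illustrative-only sketch of the three-step synthetic-EMR structure:
# 1) synthetic identity, 2) care patterns from real records, 3) adaptation.
import random

CARE_PATTERNS = {  # step 2: stand-in for patterns identified in real EMR data
    "tularemia": ["clinic visit", "blood culture order", "chest radiograph",
                  "serology result", "follow-up visit"],
    "background": ["well-child visit", "immunization record"],
}

def synthesize_patient(pid, condition, rng):
    # Step 1: synthetic patient identity and basic information.
    patient = {"id": pid,
               "age": rng.randint(4, 11),  # pilot background cohort age range
               "sex": rng.choice(["F", "M"])}
    # Step 3: adapt the care pattern to this synthetic patient.
    events = [{"patient_id": pid, "event": e} for e in CARE_PATTERNS[condition]]
    return patient, events

rng = random.Random(42)
patient, events = synthesize_patient("SYN-0001", "tularemia", rng)
print(len(events))  # → 5
```

A real implementation would also vary timing, laboratory and radiology results, and clinical activity per patient, which is where the expert-validation step described above becomes necessary.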
Electroencephalography in premature and full-term infants. Developmental features and glossary.
André, M; Lamblin, M-D; d'Allest, A M; Curzi-Dascalova, L; Moussalli-Salefranque, F; Nguyen The Tich, S; Vecchierini-Blineau, M-F; Wallois, F; Walls-Esquivel, E; Plouin, P
2010-05-01
Following the pioneering work of C. Dreyfus-Brisac and N. Monod, research into neonatal electroencephalography (EEG) has developed tremendously in France. French neurophysiologists who had been trained in Paris (France) collaborated on a joint project on the introduction, development, and currently available neonatal EEG recording techniques. They assessed the analytical criteria for the different maturational stages and standardized neonatal EEG terminology on the basis of the large amount of data available in the French and the English literature. The results of their work were presented in 1999. Since the first edition, technology has moved towards the widespread use of digitized recordings. Although the data obtained with analog recordings can be applied to digitized EEG tracings, the present edition, including new published data, is illustrated with digitized recordings. Herein, the reader can find a comprehensive description of EEG features and neonatal behavioural states at different gestational ages, and also a definition of the main aspects and patterns of both pathological and normal EEGs, presented in glossary form. In both sections, numerous illustrations have been provided. This precise neonatal EEG terminology should improve homogeneity in the analysis of neonatal EEG recordings, and facilitate the setting up of multicentric studies on certain aspects of normal EEG recordings and various pathological patterns. Copyright 2010 Elsevier Masson SAS. All rights reserved.
Pathfinder aircraft liftoff on altitude record setting flight of 71,500 feet
1997-07-07
The Pathfinder aircraft has set a new unofficial world record for high-altitude flight of over 71,500 feet for solar-powered aircraft at the U.S. Navy's Pacific Missile Range Facility, Kauai, Hawaii. Pathfinder was designed and manufactured by AeroVironment, Inc, of Simi Valley, California, and was operated by the firm under a jointly sponsored research agreement with NASA's Dryden Flight Research Center, Edwards, California. Pathfinder's record-breaking flight occurred July 7, 1997. The aircraft took off at 11:34 a.m. PDT, passed its previous record altitude of 67,350 feet at about 5:45 p.m. and then reached its new record altitude at 7 p.m. The mission ended with a perfect nighttime landing at 2:05 a.m. PDT July 8. The new record is the highest altitude ever attained by a propeller-driven aircraft. Before Pathfinder, the altitude record for propeller-driven aircraft was 67,028 feet, set by the experimental Boeing Condor remotely piloted aircraft.
Children's Language Assessment--Situational Tasks.
ERIC Educational Resources Information Center
Conrad, Eva E.; And Others
The Children's Language Assessment-Situational Tasks (CLA-ST) was developed to collect language samples within a normally operating classroom. The language samples are recorded on a cassette tape recorder placed at the foot of a small table. At this table, in a committee setting, four children are engaged with a teacher in an activity similar to…
ERIC Educational Resources Information Center
Porter, Marclyn D.
2011-01-01
Many alternatively certified teachers, as was the case in this study, are employed as the teacher of record while simultaneously enrolled in education courses. Therefore, experiencing the collaborative, supportive, peer mentoring environmental elements that are present in many traditional "fieldwork" settings is not an option. By…
Outcomes of Adult Learning: Taking the Debate Forward.
ERIC Educational Resources Information Center
Jones, Huw, Ed.; Mace, Jackie, Ed.
The four papers in this collection are intended to stimulate debate in the adult education sector and to set the agenda for further development work. "Learning Outcomes: Towards a Synthesis of Progress" (Peter Lavender) provides a summary of recent efforts to identify, record, and value learning that does not lead to qualifications.…
Budgeting: The Basics and Beyond. Learn at Home.
ERIC Educational Resources Information Center
Prochaska-Cue, Kathy; Sugden, Marilyn
Designed as an at-home course to help users develop a realistic budget plan and set up a workable record-keeping system, these course materials provide practical tips, ideas, and suggestions for budgeting. The course begins with a nine-step budgeting process which emphasizes communicating among family members, considering personal or family…
Getting the Most Out of Progress Files and Personal Development Planning
ERIC Educational Resources Information Center
Croot, David; Gedye, Sharon
2006-01-01
Progress Files have been set by the government as a specific element of all higher education provision in England and "should consist of two elements: a transcript recording student achievement which should follow a common format devised by institutions collectively through their representative bodies; and a means by which students can …
HERO HELPS for Home Economics Related Occupation Coordinators. Volume I.
ERIC Educational Resources Information Center
Northern Arizona Univ., Flagstaff. Center for Vocational Education.
These 25 modules for independent study comprise the first volume of a two-volume set of HERO (Home Economics Related Occupations) HELPS for student use in competency-based professional development. A management system that includes a filing system, testing, record keeping, and scheduling is discussed. A sample contract and other class management…
User's Guide to the Stand Prognosis Model
William R. Wykoff; Nicholas L. Crookston; Albert R. Stage
1982-01-01
The Stand Prognosis Model is a computer program that projects the development of forest stands in the Northern Rocky Mountains. Thinning options allow for simulation of a variety of management strategies. Input consists of a stand inventory, including sample tree records, and a set of option selection instructions. Output includes data normally found in stand, stock,...
Communication with Deaf Pre-School Children Using Cochlear Implants.
ERIC Educational Resources Information Center
Tvingstedt, A. L.; Preisler, G.; Ahlstrom, M.
This study evaluated the communicative, social, and emotional development of 22 deaf Swedish pre-school children with cochlear implants over a 2-year period. Video-recordings (every 3 months) and observations of the children in natural interactional settings at home and school as well as interviews with parents and teachers provided the study…
Neonatal Seizure Detection Using Deep Convolutional Neural Networks.
Ansari, Amir H; Cherian, Perumpillichira J; Caicedo, Alexander; Naulaers, Gunnar; De Vos, Maarten; Van Huffel, Sabine
2018-04-02
Identifying a core set of features is one of the most important steps in the development of an automated seizure detector. In most published studies describing features and seizure classifiers, the features were hand-engineered, which may not be optimal. The main goal of the present paper is to use deep convolutional neural networks (CNNs) and a random forest to automatically optimize feature selection and classification. The input of the proposed classifier is raw multi-channel EEG and the output is the class label: seizure/non-seizure. By training this network, the required features are optimized while a nonlinear classifier is fitted on the features. After training the network with EEG recordings of 26 neonates, the five end layers performing the classification were replaced with a random forest classifier in order to improve the performance. This resulted in a false alarm rate of 0.9 per hour and a seizure detection rate of 77% using a test set of EEG recordings of 22 neonates that also included dubious seizures. The newly proposed CNN classifier outperformed three data-driven feature-based approaches and performed similarly to a previously developed heuristic method.
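The hybrid architecture described above — a trained CNN whose dense end layers are replaced by a random forest operating on the learned features — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the CNN feature-extraction stage is stood in for by a random feature matrix, and all dimensions and labels are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in for the features produced by the CNN's last convolutional layer:
# 200 hypothetical EEG segments, each reduced to a 32-dimensional vector.
features = rng.normal(size=(200, 32))
labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)  # seizure / non-seizure

# Replace the network's dense classification head with a random forest
# fitted on the same feature representation.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[:150], labels[:150])
accuracy = clf.score(features[150:], labels[150:])
print(round(accuracy, 2))
```

The design rationale is that the convolutional layers remain a fixed, learned feature extractor, while the ensemble classifier is cheaper to retrain and often more robust on small clinical datasets.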
AOIPS data base management systems support for GARP data sets
NASA Technical Reports Server (NTRS)
Gary, J. P.
1977-01-01
A data base management system is identified that was developed to provide flexible access to data sets produced by GARP during its data systems tests. The content and coverage of the data base are defined, and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user-specified data subsets, is described. The computer programs developed to provide this capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user-interactive but menu-guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user-selected window defined by time and latitude/longitude boundaries. The DRS also permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.
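The core subsetting operation — selecting records inside a user-chosen time and latitude/longitude window — can be sketched in a few lines. The record fields and values below are illustrative stand-ins, not the actual DRS data format.

```python
from dataclasses import dataclass

@dataclass
class DataRecord:
    time: float   # hours since start of data systems test (assumed unit)
    lat: float    # degrees
    lon: float    # degrees
    value: float  # measured quantity

# Hypothetical inventory standing in for a GARP data tape.
inventory = [
    DataRecord(1.0, 10.0, -40.0, 0.5),
    DataRecord(2.5, 35.0, -60.0, 1.2),
    DataRecord(4.0, 20.0, -50.0, 0.9),
]

def subset(records, t0, t1, lat0, lat1, lon0, lon1):
    """Create a subset data set from a user-selected time/lat-lon window."""
    return [r for r in records
            if t0 <= r.time <= t1
            and lat0 <= r.lat <= lat1
            and lon0 <= r.lon <= lon1]

window = subset(inventory, 0.0, 3.0, 5.0, 30.0, -55.0, -35.0)
print(len(window))  # → 1 (only the record at t=1.0 falls inside the window)
```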
ERIC Educational Resources Information Center
Wiggley, Shirley L.
2011-01-01
Purpose: The purpose of this study was to examine the relationship between the electronic health record system components and patient outcomes in an acute hospital setting, given that the current presidential administration has earmarked nearly $50 billion to the implementation of the electronic health record. The relationship between the…
Eeckhout, Thomas; Gerits, Michiel; Bouquillon, Dries; Schoenmakers, Birgitte
2016-08-01
For many years, teaching and training in communication skills have been cornerstones of the medical education curriculum. Although video recording of real-time consultations is expected to contribute positively to the learning process, research on this topic is scarce. This study focuses on the feasibility and acceptability of video recording during real-time patient encounters performed by general practitioner (GP) trainees. The primary research question addressed the experiences (defined as feasibility and acceptability) of GP trainees in video-recorded vocational training in a general practice. The second research question addressed the appraisal of this training. The procedure of video-recorded training has been developed, refined and validated by the Academic Teaching Practice of Leuven (Faculty of Medicine of the University of Leuven) since 1974. The study was set up as a cross-sectional survey without follow-up. Outcome measures were defined as 'feasibility and acceptability' (experiences of trainees) of the video-recorded training and were assessed by a structured questionnaire with the opportunity to add free-text comments. The studied sample consisted of all first-phase trainees of the GP Master 2011-2012 at the University of Leuven. Almost 70% of the trainees were positive about recording consultations. Nevertheless, over 60% believed that patients felt uncomfortable during the video-recorded encounter. Almost 90% noticed an improvement in their own communication skills through observation and evaluation. Most students (85%) experienced logistical issues as a major barrier to performing video consultations on a regular basis. This study lays the foundation stone for further exploration of video training in real-time consultations. Both students and teachers in the field acknowledge that the power of imaging is underestimated in the training of communication and vocational skills.
Time investment for teachers could be tempered by bringing students up as peer tutors and by accurate scheduling of the video training. The development of supportive material and protocols will lower thresholds. Further research should focus on long-term efficacy and efficiency in terms of learning outcomes and on facilitation of the technical process. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Does the accuracy of blood pressure measurement correlate with hearing loss of the observer?
Song, Soohwa; Lee, Jongshill; Chee, Youngjoon; Jang, Dong Pyo; Kim, In Young
2014-02-01
The auscultatory method is influenced by the hearing level of the observers. If the observer has hearing loss, it is possible to measure blood pressure inaccurately by misreading the Korotkoff sounds at systolic blood pressure (SBP) and diastolic blood pressure (DBP). Because of the potential clinical problems this discrepancy may cause, we used a hearing loss simulator to determine how hearing level affects the accuracy of blood pressure measurements. Two data sets (data set A, 32 Korotkoff sound video clips recorded by the British Hypertension Society; data set B, 28 Korotkoff sound data acquired from the Korotkoff sound recording system developed by Hanyang University) were used and all the data were attenuated to simulate a hearing loss of 5, 10, 15, 20, and 25 dB using the hearing loss simulator. Five observers with normal hearing assessed the blood pressures from these data sets and the differences between the values measured from the original recordings (no attenuation) and the attenuated versions were analyzed. Greater attenuation of the Korotkoff sounds, or greater hearing loss, resulted in larger blood pressure measurement differences when compared with the original data. When measuring blood pressure with hearing loss, the SBP tended to be underestimated and the DBP was overestimated. The mean differences between the original data and the 25 dB hearing loss data for the two data sets combined were 1.55±2.71 and -4.32±4.21 mmHg for SBP and DBP, respectively. This experiment showed that the accuracy of blood pressure measurements using the auscultatory method is affected by observer hearing level. Therefore, to reduce possible error using the auscultatory method, observers' hearing should be tested.
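The attenuation step described above follows directly from the decibel definition: a hearing loss of L dB corresponds to scaling the waveform amplitude by 10^(−L/20). A minimal sketch (the signal values are illustrative, not the study's recordings):

```python
import numpy as np

def attenuate(signal, loss_db):
    """Attenuate a recorded Korotkoff sound to simulate observer hearing loss.

    A loss of L dB scales the waveform amplitude by 10 ** (-L / 20),
    i.e. 20 dB of loss reduces the amplitude to one tenth.
    """
    return signal * 10.0 ** (-loss_db / 20.0)

s = np.array([1.0, -0.5, 0.25])  # illustrative waveform samples
for db in (5, 10, 15, 20, 25):   # the attenuation levels used in the study
    print(db, np.round(attenuate(s, db)[0], 3))
```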
Quantifying Data Quality for Clinical Trials Using Electronic Data Capture
Nahm, Meredith L.; Pieper, Carl F.; Cunningham, Maureen M.
2008-01-01
Background Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. Methods and Principal Findings The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. Conclusions Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks. PMID:18725958
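The error-rate metric used above is simply discrepancies normalized per 10,000 audited fields. A quick sketch (the audit counts below are illustrative numbers chosen to reproduce the cited rate, not the study's raw data):

```python
def error_rate_per_10k(errors_found, fields_audited):
    """Express a source-to-database audit result as errors per 10,000 fields."""
    return 10_000 * errors_found / fields_audited

# Illustrative: 143 discrepancies in an audit of 100,000 fields reproduces
# the 14.3 errors per 10,000 fields reported in the abstract.
print(error_rate_per_10k(143, 100_000))  # → 14.3
```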
Development and pilot study of an essential set of indicators for general surgery services.
Soria-Aledo, Victor; Angel-Garcia, Daniel; Martinez-Nicolas, Ismael; Rebasa Cladera, Pere; Cabezali Sanchez, Roger; Pereira García, Luis Francisco
2016-11-01
At present there is a lack of appropriate quality measures for benchmarking in general surgery units of Spanish National Health System. The aim of this study is to present the selection, development and pilot-testing of an initial set of surgical quality indicators for this purpose. A modified Delphi was performed with experts from the Spanish Surgeons Association in order to prioritize previously selected indicators. Then, a pilot study was carried out in a public hospital encompassing qualitative analysis of feasibility for prioritized indicators and an additional qualitative and quantitative three-rater reliability assessment for medical record-based indicators. Observed inter-rater agreement, prevalence adjusted and bias adjusted kappa and non-adjusted kappa were performed, using a systematic random sample (n=30) for each of these indicators. Twelve out of 13 proposed indicators were feasible: 5 medical record-based indicators and 7 indicators based on administrative databases. From medical record-based indicators, 3 were reliable (observed agreement >95%, adjusted kappa index >0.6 or non-adjusted kappa index >0.6 for composites and its components) and 2 needed further refinement. Currently, medical record-based indicators could be used for comparison purposes, whilst further research must be done for validation and risk-adjustment of outcome indicators from administrative databases. Compliance results in the adequacy of informed consent, diagnosis-to-treatment delay in colorectal cancer, and antibiotic prophylaxis show room for improvement in the pilot-tested hospital. Copyright © 2016 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
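For the reliability analysis above, the prevalence- and bias-adjusted kappa (PABAK) has a closed form in the simplest case. The sketch below shows the two-rater, dichotomous-rating version (PABAK = 2·Po − 1); the study's three-rater setting would require pairwise or generalized variants, so treat this as illustrative only:

```python
def pabak(observed_agreement):
    """Prevalence- and bias-adjusted kappa for two raters and a binary rating.

    PABAK = 2 * Po - 1, where Po is the observed proportion of agreement;
    it corrects ordinary kappa's sensitivity to skewed category prevalence.
    """
    return 2.0 * observed_agreement - 1.0

# At the study's reliability threshold of >95% observed agreement:
print(round(pabak(0.95), 3))  # → 0.9, well above the 0.6 cut-off used
```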
Electronic Detection of Delayed Test Result Follow-Up in Patients with Hypothyroidism.
Meyer, Ashley N D; Murphy, Daniel R; Al-Mutairi, Aymer; Sittig, Dean F; Wei, Li; Russo, Elise; Singh, Hardeep
2017-07-01
Delays in following up abnormal test results are a common problem in outpatient settings. Surveillance systems that use trigger tools to identify delayed follow-up can help reduce missed opportunities in care. To develop and test an electronic health record (EHR)-based trigger algorithm to identify instances of delayed follow-up of abnormal thyroid-stimulating hormone (TSH) results in patients being treated for hypothyroidism. We developed an algorithm using structured EHR data to identify patients with hypothyroidism who had delayed follow-up (>60 days) after an abnormal TSH. We then retrospectively applied the algorithm to a large EHR data warehouse within the Department of Veterans Affairs (VA), on patient records from two large VA networks for the period from January 1, 2011, to December 31, 2011. Identified records were reviewed to confirm the presence of delays in follow-up. During the study period, 645,555 patients were seen in the outpatient setting within the two networks. Of 293,554 patients with at least one TSH test result, the trigger identified 1250 patients on treatment for hypothyroidism with elevated TSH. Of these patients, 271 were flagged as potentially having delayed follow-up of their test result. Chart reviews confirmed delays in 163 of the 271 flagged patients (PPV = 60.1%). An automated trigger algorithm applied to records in a large EHR data warehouse identified patients with hypothyroidism with potential delays in thyroid function test results follow-up. Future prospective application of the TSH trigger algorithm can be used by clinical teams as a surveillance and quality improvement technique to monitor and improve follow-up.
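The trigger logic described above can be sketched as a filter over structured results: flag elevated TSH values with no documented follow-up action within 60 days. The field names, the example records, and the TSH reference limit of 4.0 are all assumptions for illustration, not the VA data-warehouse schema or the study's exact criteria:

```python
from datetime import date, timedelta

# Hypothetical result records; "next_action_date" stands in for whatever
# structured follow-up evidence the real algorithm searched for.
tsh_results = [
    {"patient": "A", "result_date": date(2011, 3, 1), "tsh": 9.2,
     "next_action_date": date(2011, 3, 20)},   # followed up within 19 days
    {"patient": "B", "result_date": date(2011, 5, 1), "tsh": 11.0,
     "next_action_date": None},                # never followed up
    {"patient": "C", "result_date": date(2011, 6, 1), "tsh": 2.1,
     "next_action_date": None},                # TSH not elevated
]

TSH_UPPER_LIMIT = 4.0               # assumed assay upper reference limit
FOLLOW_UP_WINDOW = timedelta(days=60)

def trigger(records):
    """Flag elevated TSH results with no documented action within 60 days."""
    flagged = []
    for r in records:
        if r["tsh"] <= TSH_UPPER_LIMIT:
            continue
        acted = r["next_action_date"]
        if acted is None or acted - r["result_date"] > FOLLOW_UP_WINDOW:
            flagged.append(r["patient"])
    return flagged

print(trigger(tsh_results))  # → ['B']
```

As in the study, the flagged list is only a surveillance candidate set; chart review is still needed to confirm true delays (here, PPV was 60.1%).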
Primary care physicians’ experiences with electronic medical records
Ludwick, Dave; Manca, Donna; Doucette, John
2010-01-01
OBJECTIVE To understand how remuneration and care setting affect the implementation of electronic medical records (EMRs). DESIGN Semistructured interviews were used to elicit descriptions from community-based family physicians (paid on a fee-for-service basis) and from urban, hospital, and academic family physicians (remunerated via alternative payment models or sessional pay for activities pertaining to EMR implementation). SETTING Small suburban community and large urban-, hospital-, and academic-based family medicine clinics in Alberta. All participants were supported by a jurisdictional EMR certification funding mechanism. PARTICIPANTS Physicians who practised in 1 or a combination of the above settings and had experience implementing and using EMRs. METHODS Purposive and maximum variation sampling was used to obtain descriptive data from key informants through individually conducted semistructured interviews. The interview guide, which was developed from key findings of our previous literature review, was used in a previous study of community-based family physicians on this same topic. Field notes were analyzed to generate themes through a comparative immersion approach. MAIN FINDINGS Physicians in urban, hospital, and academic settings leverage professional working relationships to investigate EMRs, a resource not available to community physicians. Physicians in urban, hospital, and academic settings work in larger interdisciplinary teams with a greater need for interdisciplinary care coordination, EMR training, and technical support. These practices were able to support the cost of project management or technical support resources. These physicians followed a planned system rollout approach compared with community physicians who installed their systems quickly and required users to transition to the new system immediately. Electronic medical records did not increase, or decrease, patient throughput. 
Physicians developed ways of including patients in the note-taking process. CONCLUSION We studied physicians’ procurement approaches under various payment models. Our findings do not suggest that one remuneration approach supports EMR adoption any more than another. Rather, this study suggests that stronger physician professional networks used in information gathering, more complete training, and in-house technical support might be more influential than remuneration in facilitating the EMR adoption experience. PMID:20090083
Technical review of SRT-CMA-930058 revalidation studies of Mark 16 experiments: J70
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, R.L.
1993-10-25
This study is a reperformance of a set of MGBS-TGAL criticality safety code validation calculations previously reported by Clark. The reperformance was needed because the records of the previous calculations could not be located in current APG files and records. As noted by the author, preliminary attempts to reproduce the Clark results by direct modeling in MGBS and TGAL were unsuccessful. Consultation with Clark indicated that the MGBS-TGAL (EXPT) option within the KOKO system should be used to set up the MGBS and TGAL input data records. The results of the study indicate that the technique used by Clark has been established and that the technique is now documented for future use. File records of the calculations have also been established in APG files. The review was performed per QAP 11--14 of 1Q34. Since the reviewer was involved in developing the procedural technique used for this study, this review cannot be considered a fully independent review, but should be considered a verification that the document contains adequate information to allow a new user to perform similar calculations, a verification of the procedure by performing several calculations independently with identical results to the reported results, and a verification of the readability of the report.
NASA Astrophysics Data System (ADS)
Wassermann, J. M.; Wietek, A.; Hadziioannou, C.; Igel, H.
2014-12-01
Microzonation, i.e. the estimation of (shear) wave velocity profiles of the uppermost few hundred metres on dense 2D surface grids, is one of the key methods for understanding the variation in seismic hazard caused by ground-shaking events. In this presentation we introduce a novel method for estimating Love-wave phase velocity dispersion using ambient noise recordings. We use the vertical component of rotational motions inherently present in ambient noise and its well-established relation to simultaneous recordings of transverse acceleration. In this relation the frequency-dependent phase velocity of a plane SH (or Love)-type wave acts as a proportionality factor between the anti-correlated amplitudes of both measures. In a first step we used synthetic data sets of increasing complexity to evaluate the proposed technique and the developed algorithm, which extracts the direction and amplitude of the incoming ambient noise wavefield measured at a single site. Since reliable weak-motion rotational sensors are not yet readily available, we apply array-derived rotation measurements in order to test our method. We next use the technique to analyze different real data sets of ambient noise measurements as well as seismic recordings at active volcanoes, and compare these results with findings of the Spatial AutoCorrelation technique applied to the same data set. We demonstrate that the newly developed technique shows results comparable to more classical, strictly array-based methods. Furthermore, we show that as soon as portable weak-motion rotational sensors are available, a single 6C-station approach will be feasible, not only for microzonation but also for general array applications, with performance comparable to more classical techniques. An important advantage, especially in urban environments, is that this approach drastically reduces the number of seismic stations needed.
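The proportionality exploited above is, for a plane SH/Love wave, a_t(t) = −2c · Ω_z(t), where a_t is transverse acceleration, Ω_z the vertical rotation rate, and c the phase velocity. A minimal synthetic sketch of recovering c by least squares (all signal parameters are invented for illustration; a real analysis would estimate c per frequency band):

```python
import numpy as np

rng = np.random.default_rng(1)
c_true = 3000.0                                  # m/s, assumed phase velocity
t = np.linspace(0.0, 10.0, 2000)                 # s
w_z = 1e-6 * np.sin(2.0 * np.pi * 0.5 * t)       # rotation rate, rad/s

# Plane-wave relation a_t = -2 * c * w_z, plus measurement noise.
a_t = -2.0 * c_true * w_z + rng.normal(0.0, 1e-5, t.size)

# Least-squares estimate of c from the anti-correlated amplitudes:
c_est = -np.dot(a_t, w_z) / (2.0 * np.dot(w_z, w_z))
print(round(c_est))
```

With realistic noise the estimate stays close to the true value because the fit averages over the whole record; band-pass filtering both traces first would yield the dispersion curve c(f).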
Towards decadal time series of Arctic and Antarctic sea ice thickness from radar altimetry
NASA Astrophysics Data System (ADS)
Hendricks, S.; Rinne, E. J.; Paul, S.; Ricker, R.; Skourup, H.; Kern, S.; Sandven, S.
2016-12-01
The CryoSat-2 mission has demonstrated the value of radar altimetry to assess the interannual variability and short-term trends of Arctic sea ice over the existing observational record of 6 winter seasons. CryoSat-2 is a particular successful mission for sea ice mass balance assessment due to its novel radar altimeter concept and orbit configuration, but radar altimetry data is available since 1993 from the ERS-1/2 and Envisat missions. Combining these datasets promises a decadal climate data record of sea ice thickness, but inter-mission biases must be taken into account due to the evolution of radar altimeters and the impact of changing sea ice conditions on retrieval algorithm parametrizations. The ESA Climate Change Initiative on Sea Ice aims to extent the list of data records for Essential Climate Variables (ECV's) with a consistent time series of sea ice thickness from available radar altimeter data. We report on the progress of the algorithm development and choices for auxiliary data sets for sea ice thickness retrieval in the Arctic and Antarctic Oceans. Particular challenges are the classification of surface types and freeboard retrieval based on radar waveforms with significantly varying footprint sizes. In addition, auxiliary data sets, e.g. for snow depth, are far less developed in the Antarctic and we will discuss the expected skill of the sea ice thickness ECV's in both hemispheres.
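The freeboard-to-thickness step at the heart of such retrievals assumes hydrostatic equilibrium: h_i = (ρ_w·h_f + ρ_s·h_s) / (ρ_w − ρ_i). The sketch below uses commonly assumed density values, not the CCI algorithm's actual parametrization, and ignores the radar-penetration correction to freeboard:

```python
RHO_WATER = 1024.0  # kg/m^3, sea water (assumed)
RHO_ICE = 917.0     # kg/m^3, sea ice (assumed)
RHO_SNOW = 300.0    # kg/m^3, snow (assumed; poorly constrained in Antarctic)

def thickness_from_freeboard(ice_freeboard_m, snow_depth_m):
    """Sea ice thickness from ice freeboard, assuming hydrostatic equilibrium:

        h_i = (rho_w * h_f + rho_s * h_s) / (rho_w - rho_i)
    """
    return ((RHO_WATER * ice_freeboard_m + RHO_SNOW * snow_depth_m)
            / (RHO_WATER - RHO_ICE))

# 0.3 m of ice freeboard under 0.2 m of snow:
print(round(thickness_from_freeboard(0.3, 0.2), 2))  # → 3.43
```

The small density difference in the denominator is why freeboard and snow-depth errors amplify roughly tenfold in thickness, which is the motivation for the careful auxiliary-data choices discussed above.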
Assessment of Lower Limb Prosthesis through Wearable Sensors and Thermography
Cutti, Andrea Giovanni; Perego, Paolo; Fusca, Marcello C.; Sacchetti, Rinaldo; Andreoni, Giuseppe
2014-01-01
This study aimed to explore the application of infrared thermography in combination with ambulatory wearable monitoring of temperature and relative humidity, to assess the residual limb-to-liner interface in lower-limb prosthesis users. Five male traumatic transtibial amputees were involved, who reported no problems or discomfort while wearing the prosthesis. A thermal imaging camera was used to measure superficial thermal distribution maps of the stump. A wearable system for recording the temperature and relative humidity in up to four anatomical points was developed, tested in vitro and integrated with the measurement set. The parallel application of an infrared camera and wearable sensors provided complementary information. Four main Regions of Interest were identified on the stump (inferior patella, lateral/medial epicondyles, tibial tuberosity), with good inter-subject repeatability. An average increase of 20% in hot areas (P < 0.05) was shown after walking compared to resting conditions. The sensors inside the cuff did not provoke any discomfort during recordings and provided insight into the thermal exchanges while walking, recording a steady-state temperature increase of ∼+1.1 ± 0.7 °C and a more pronounced relative increase in humidity (∼+4.1 ± 2.3%) because of the sweat produced. This study has also begun the development of a reference data set for optimal socket/liner-stump construction. PMID:24618782
de Wet, C; Bowie, P
2009-04-01
A multi-method strategy has been proposed to understand and improve the safety of primary care. The trigger tool is a relatively new method that has shown promise in American and secondary healthcare settings. It involves the focused review of a random sample of patient records using a series of "triggers" that alert reviewers to potential errors and previously undetected adverse events. To develop and test a global trigger tool to detect errors and adverse events in primary-care records. Trigger tool development was informed by previous research and content validated by expert opinion. The tool was applied by trained reviewers who worked in pairs to conduct focused audits of 100 randomly selected electronic patient records in each of five urban general practices in central Scotland. Review of 500 records revealed 2251 consultations and 730 triggers. An adverse event was found in 47 records (9.4%), indicating that harm occurred at a rate of one event per 48 consultations. Of these, 27 were judged to be preventable (42%). A further 17 records (3.4%) contained evidence of a potential adverse event. Harm severity was low to moderate for most patients (82.9%). Error and harm rates were higher in those aged > or =60 years, and most were medication-related (59%). The trigger tool was successful in identifying undetected patient harm in primary-care records and may be the most reliable method for achieving this. However, the feasibility of its routine application is open to question. The tool may have greater utility as a research rather than an audit technique. Further testing in larger, representative study samples is required.
Use of a secure Internet Web site for collaborative medical research.
Marshall, W W; Haley, R W
2000-10-11
Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
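Two of the security measures listed above — user-name/password authentication and an audit trail recording all accesses — can be sketched with modern standard-library tools. This is an illustrative design, not the system the article describes (which predates these APIs); table and function names are hypothetical:

```python
import hashlib
import hmac
import os
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect(":memory:")  # stands in for the study's central database
db.execute("CREATE TABLE users (name TEXT PRIMARY KEY, salt BLOB, pw_hash BLOB)")
db.execute("CREATE TABLE audit (ts TEXT, name TEXT, action TEXT)")

def add_user(name, password):
    """Store a salted password hash rather than the password itself."""
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    db.execute("INSERT INTO users VALUES (?, ?, ?)", (name, salt, pw_hash))

def authenticate(name, password):
    """Check credentials and append every attempt to the audit trail."""
    row = db.execute("SELECT salt, pw_hash FROM users WHERE name = ?",
                     (name,)).fetchone()
    ok = row is not None and hmac.compare_digest(
        hashlib.pbkdf2_hmac("sha256", password.encode(), row[0], 100_000),
        row[1])
    db.execute("INSERT INTO audit VALUES (?, ?, ?)",
               (datetime.now(timezone.utc).isoformat(), name,
                "login-ok" if ok else "login-fail"))
    return ok

add_user("collaborator1", "s3cret")
print(authenticate("collaborator1", "s3cret"),
      authenticate("collaborator1", "wrong"))  # → True False
```

Encryption in transit (the article's TLS-style recommendation) and firewalling sit outside the application code and are omitted here.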
Cornford, Tony; Barber, Nicholas; Avery, Anthony; Takian, Amirhossein; Lichtner, Valentina; Petrakaki, Dimitra; Crowe, Sarah; Marsden, Kate; Robertson, Ann; Morrison, Zoe; Klecun, Ela; Prescott, Robin; Quinn, Casey; Jani, Yogini; Ficociello, Maryam; Voutsina, Katerina; Paton, James; Fernando, Bernard; Jacklin, Ann; Cresswell, Kathrin
2011-01-01
Objectives To evaluate the implementation and adoption of the NHS detailed care records service in “early adopter” hospitals in England. Design Theoretically informed, longitudinal qualitative evaluation based on case studies. Setting 12 “early adopter” NHS acute hospitals and specialist care settings studied over two and a half years. Data sources Data were collected through in depth interviews, observations, and relevant documents relating directly to case study sites and to wider national developments that were perceived to impact on the implementation strategy. Data were thematically analysed, initially within and then across cases. The dataset consisted of 431 semistructured interviews with key stakeholders, including hospital staff, developers, and governmental stakeholders; 590 hours of observations of strategic meetings and use of the software in context; 334 sets of notes from observations, researchers’ field notes, and notes from national conferences; 809 NHS documents; and 58 regional and national documents. Results Implementation has proceeded more slowly, with a narrower scope and substantially less clinical functionality than was originally planned. The national strategy had considerable local consequences (summarised under five key themes), and wider national developments impacted heavily on implementation and adoption. More specifically, delays related to unrealistic expectations about the capabilities of systems; the time needed to build, configure, and customise the software; the work needed to ensure that systems were supporting provision of care; and the needs of end users for training and support. Other factors hampering progress included the changing milieu of NHS policy and priorities; repeatedly renegotiated national contracts; different stages of development of diverse NHS care records service systems; and a complex communication process between different stakeholders, along with contractual arrangements that largely excluded NHS providers. 
There was early evidence that deploying systems resulted in important learning within and between organisations and the development of relevant competencies within NHS hospitals. Conclusions Implementation of the NHS Care Records Service in “early adopter” sites proved time consuming and challenging, with as yet limited discernible benefits for clinicians and no clear advantages for patients. Although our results might not be directly transferable to later adopting sites because the functionalities we evaluated were new and untried in the English context, they shed light on the processes involved in implementing major new systems. The move to increased local decision making that we advocated based on our interim analysis has been pursued and welcomed by the NHS, but it is important that policymakers do not lose sight of the overall goal of an integrated interoperable solution. PMID:22006942
The Web-based CanMEDS Resident Learning Portfolio Project (WEBCAM): how we got started.
Glen, Peter; Balaa, Fady; Momoli, Franco; Martin, Louise; Found, Dorothy; Arnaout, Angel
2016-12-01
The CanMEDS framework is ubiquitous in Canadian postgraduate medical education; however, training programs do not have a universal method of assessing competence. We set out to develop a novel portfolio that allowed trainees to generate a longitudinal record of their training and development within the framework. The portfolio provided an objective means for the residency program director to document and evaluate resident progress within the CanMEDS roles.
Jazayeri, Darius; Teich, Jonathan M; Ball, Ellen; Nankubuge, Patricia Alexandra; Rwebembera, Job; Wing, Kevin; Sesay, Alieu Amara; Kanter, Andrew S; Ramos, Glauber D; Walton, David; Cummings, Rachael; Checchi, Francesco; Fraser, Hamish S
2017-01-01
Background Stringent infection control requirements at Ebola treatment centers (ETCs), which are specialized facilities for isolating and treating Ebola patients, create substantial challenges for recording and reviewing patient information. During the 2014-2016 West African Ebola epidemic, paper-based data collection systems at ETCs compromised the quality, quantity, and confidentiality of patient data. Electronic health record (EHR) systems have the potential to address such problems, with benefits for patient care, surveillance, and research. However, no suitable software was available for deployment when large-scale ETCs opened as the epidemic escalated in 2014. Objective We present our work on rapidly developing and deploying OpenMRS-Ebola, an EHR system for the Kerry Town ETC in Sierra Leone. We describe our experience, lessons learned, and recommendations for future health emergencies. Methods We used the OpenMRS platform and Agile software development approaches to build OpenMRS-Ebola. Key features of our work included daily communications between the development team and ground-based operations team, iterative processes, and phased development and implementation. We made design decisions based on the restrictions of the ETC environment and regular user feedback. To evaluate the system, we conducted predeployment user questionnaires and compared the EHR records with duplicate paper records. Results We successfully built OpenMRS-Ebola, a modular stand-alone EHR system with a tablet-based application for infectious patient wards and a desktop-based application for noninfectious areas. OpenMRS-Ebola supports patient tracking (registration, bed allocation, and discharge); recording of vital signs and symptoms; medication and intravenous fluid ordering and monitoring; laboratory results; clinician notes; and data export. It displays relevant patient information to clinicians in infectious and noninfectious zones. 
We implemented phase 1 (patient tracking; drug ordering and monitoring) after 2.5 months of full-time development. OpenMRS-Ebola was used for 112 patient registrations, 569 prescription orders, and 971 medication administration recordings. We were unable to fully implement phases 2 and 3 as the ETC closed because of a decrease in new Ebola cases. The phase 1 evaluation suggested that OpenMRS-Ebola worked well in the context of the rollout, and the user feedback was positive. Conclusions To our knowledge, OpenMRS-Ebola is the most comprehensive adaptable clinical EHR built for a low-resource setting health emergency. It is designed to address the main challenges of data collection in highly infectious environments that require robust infection prevention and control measures and it is interoperable with other electronic health systems. Although we built and deployed OpenMRS-Ebola more rapidly than typical software, our work highlights the challenges of having to develop an appropriate system during an emergency rather than being able to rapidly adapt an existing one. Lessons learned from this and previous emergencies should be used to ensure that a set of well-designed, easy-to-use, pretested health software is ready for quick deployment in future. PMID:28827211
A Codasyl-Type Schema for Natural Language Medical Records
Sager, N.; Tick, L.; Story, G.; Hirschman, L.
1980-01-01
This paper describes a CODASYL (network) database schema for information derived from narrative clinical reports. The goal of this work is to create an automated process that accepts natural language documents as input and maps this information into a database of a type managed by existing database management systems. The schema described here represents the medical events and facts identified through the natural language processing. This processing decomposes each narrative into a set of elementary assertions, represented as MEDFACT records in the database. Each assertion in turn consists of a subject and a predicate classed according to a limited number of medical event types, e.g., signs/symptoms, laboratory tests, etc. The subject and predicate are represented by EVENT records which are owned by the MEDFACT record associated with the assertion. The CODASYL-type network structure was found to be suitable for expressing most of the relations needed to represent the natural language information. However, special mechanisms were developed for storing the time relations between EVENT records and for recording connections (such as causality) between certain MEDFACT records. This schema has been implemented using the UNIVAC DMS-1100 DBMS.
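The MEDFACT/EVENT ownership structure the abstract describes can be illustrated with a small sketch. This is a loose modern rendering in Python dataclasses, not the authors' DMS-1100 schema; the field names are assumptions:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Event:
    """An EVENT record: the subject or predicate of one elementary assertion."""
    role: str        # "subject" or "predicate"
    text: str
    event_type: str  # e.g. "sign/symptom", "laboratory test"

@dataclass
class MedFact:
    """A MEDFACT record owns the EVENT records of one assertion and may
    be connected (e.g. causally) to another MEDFACT."""
    events: List[Event] = field(default_factory=list)
    connected_to: Optional["MedFact"] = None

    def add_event(self, event):
        self.events.append(event)

# One narrative assertion decomposed into subject and predicate EVENTs.
fact = MedFact()
fact.add_event(Event("subject", "patient", "patient state"))
fact.add_event(Event("predicate", "reports chest pain", "sign/symptom"))
```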
Image retrieval and processing system version 2.0 development work
NASA Technical Reports Server (NTRS)
Slavney, Susan H.; Guinness, Edward A.
1991-01-01
The Image Retrieval and Processing System (IRPS) is a software package developed at Washington University and used by the NASA Regional Planetary Image Facilities (RPIF's). The IRPS combines data base management and image processing components to allow the user to examine catalogs of image data, locate the data of interest, and perform radiometric and geometric calibration of the data in preparation for analysis. Version 1.0 of IRPS was completed in Aug. 1989 and was installed at several RPIF's. Other RPIF's use remote logins via NASA Science Internet to access IRPS at Washington University. Work was begun on designing and populating a catalog of Magellan image products that will be part of IRPS Version 2.0, planned for release by the end of calendar year 1991. With this catalog, a user will be able to search by orbit and by location for Magellan Basic Image Data Records (BIDR's), Mosaicked Image Data Records (MIDR's), and Altimetry-Radiometry Composite Data Records (ARCDR's). The catalog will include the Magellan CD-ROM volume, directory, and file name for each data product. The image processing component of IRPS is based on the Planetary Image Cartography Software (PICS) developed by the U.S. Geological Survey, Flagstaff, Arizona. To augment PICS capabilities, a set of image processing programs was developed that is compatible with PICS-format images. This software includes general-purpose functions that PICS does not have, analysis and utility programs for specific data sets, and programs from other sources that were modified to work with PICS images. Some of the software will be integrated into the Version 2.0 release of IRPS. A table is presented that lists the programs with a brief functional description of each.
2006-01-12
KENNEDY SPACE CENTER, FLA. - Pilot Steve Fossett talks to the media after landing the Virgin Atlantic Airways GlobalFlyer aircraft at NASA Kennedy Space Center’s Shuttle Landing Facility. Standing at left are KSC Spaceport Development Manager Jim Ball, Center Director James Kennedy and Executive Director of Florida Space Authority Winston Scott. The aircraft is being relocated from Salina, Kan., to the Shuttle Landing Facility to begin preparations for an attempt to set a new world record for the longest flight made by any aircraft. An exact takeoff date for the record-setting flight has not been determined and is contingent on weather and jet-stream conditions. The window for the attempt opens in mid-January, making the flight possible anytime between then and the end of February. NASA agreed to let Virgin Atlantic Airways use Kennedy's Shuttle Landing Facility as a takeoff site. The facility use is part of a pilot program to expand runway access for non-NASA activities.
Promoting meaningful use of health information technology in Israel: ministry of health vision.
Gerber, Ayala; Topaz, Maxim Max
2014-01-01
The Ministry of Health (MOH) of Israel has overall responsibility for the healthcare system. In recent years the MOH has developed strong capabilities in the areas of technology assessment and prioritization of new technologies. Israel completed the transition to computerized medical records a decade ago in most care settings; however, the process was spontaneous, without government control or standards setting, so large variations arose among systems and organizations. Currently, the main challenge is to convert the information scattered across different systems into organized, visible information and to make it available at various levels of health management. The MOH's solution is to implement a single information system from a specific vendor at all hospitals and all HMO clinics in order to achieve interoperability. The system will enable access to the patient's medical record history from any location.
Tweya, Hannock; Feldacker, Caryl; Gadabu, Oliver Jintha; Ng'ambi, Wingston; Mumba, Soyapi L; Phiri, Dave; Kamvazina, Luke; Mwakilama, Shawo; Kanyerere, Henry; Keiser, Olivia; Mwafilaso, Johnbosco; Kamba, Chancy; Egger, Matthias; Jahn, Andreas; Simwaka, Bertha; Phiri, Sam
2016-03-05
Implementation of user-friendly, real-time, electronic medical records for patient management may lead to improved adherence to clinical guidelines and improved quality of patient care. We detail the systematic, iterative process that implementation partners, Lighthouse clinic and Baobab Health Trust, employed to develop and implement a point-of-care electronic medical records system in an integrated, public clinic in Malawi that serves HIV-infected and tuberculosis (TB) patients. Baobab Health Trust, the system developers, conducted a series of technical and clinical meetings with Lighthouse and Ministry of Health to determine specifications. Multiple pre-testing sessions assessed patient flow, question clarity, information sequencing, and verified compliance with national guidelines. Final components of the TB/HIV electronic medical records system include: patient demographics; anthropometric measurements; laboratory samples and results; HIV testing; WHO clinical staging; TB diagnosis; family planning; clinical review; and drug dispensing. Our experience suggests that an electronic medical records system can improve patient management, enhance integration of TB/HIV services, and improve provider decision-making. However, despite sufficient funding and motivation, several challenges delayed system launch including: expansion of system components to include HIV testing and counseling services; changes in the national antiretroviral treatment guidelines that required system revision; and low confidence among new healthcare workers in using the system. To ensure a more robust and agile system that met all stakeholder and user needs, our electronic medical records launch was delayed more than a year. Open communication with stakeholders, careful consideration of ongoing provider input, and a well-functioning, backup, paper-based TB registry helped ensure successful implementation and sustainability of the system. 
Additional, on-site, technical support provided reassurance and swift problem-solving during the extended launch period. Even when system users are closely involved in the design and development of an electronic medical record system, it is critical to allow sufficient time for software development, solicitation of detailed feedback from both users and stakeholders, and iterative system revisions to successfully transition from paper to point-of-care electronic medical records. For those in low-resource settings, electronic medical records for integrated care are a feasible and positive innovation.
Muinga, Naomi; Magare, Steve; Monda, Jonathan; Kamau, Onesmus; Houston, Stuart; Fraser, Hamish; Powell, John; English, Mike; Paton, Chris
2018-04-18
The Kenyan government, working with international partners and local organizations, has developed an eHealth strategy, specified standards, and guidelines for electronic health record adoption in public hospitals and implemented two major health information technology projects: District Health Information Software Version 2, for collating national health care indicators and a rollout of the KenyaEMR and International Quality Care Health Management Information Systems, for managing 600 HIV clinics across the country. Following these projects, a modified version of the Open Medical Record System electronic health record was specified and developed to fulfill the clinical and administrative requirements of health care facilities operated by devolved counties in Kenya and to automate the process of collating health care indicators and entering them into the District Health Information Software Version 2 system. We aimed to present a descriptive case study of the implementation of an open source electronic health record system in public health care facilities in Kenya. We conducted a landscape review of existing literature concerning eHealth policies and electronic health record development in Kenya. Following initial discussions with the Ministry of Health, the World Health Organization, and implementing partners, we conducted a series of visits to implementing sites to conduct semistructured individual interviews and group discussions with stakeholders to produce a historical case study of the implementation. This case study describes how consultants based in Kenya, working with developers in India and project stakeholders, implemented the new system into several public hospitals in a county in rural Kenya. The implementation process included upgrading the hospital information technology infrastructure, training users, and attempting to garner administrative and clinical buy-in for adoption of the system. 
The initial deployment was ultimately scaled back due to a complex mix of sociotechnical and administrative issues. Learning from these early challenges, the system is now being redesigned and prepared for deployment in 6 new counties across Kenya. Implementing electronic health record systems is a challenging process in high-income settings. In low-income settings, such as Kenya, open source software may offer some respite from the high costs of software licensing, but the familiar challenges of clinical and administration buy-in, the need to adequately train users, and the need for the provision of ongoing technical support are common across the North-South divide. Strategies such as creating local support teams, using local development resources, ensuring end user buy-in, and rolling out in smaller facilities before larger hospitals are being incorporated into the project. These are positive developments to help maintain momentum as the project continues. Further integration with existing open source communities could help ongoing development and implementations of the project. We hope this case study will provide some lessons and guidance for other challenging implementations of electronic health record systems as they continue across Africa. ©Naomi Muinga, Steve Magare, Jonathan Monda, Onesmus Kamau, Stuart Houston, Hamish Fraser, John Powell, Mike English, Chris Paton. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 18.04.2018.
A method for recording verbal behavior in free-play settings.
Nordquist, V M
1971-01-01
The present study attempted to test the reliability of a new method of recording verbal behavior in a free-play preschool setting. Six children, three normal and three speech impaired, served as subjects. Videotaped records of verbal behavior were scored by two experimentally naive observers. The results suggest that the system provides a means of obtaining reliable records of both normal and impaired speech, even when the subjects exhibit nonverbal behaviors (such as hyperactivity) that interfere with direct observation techniques.
Quantin, Catherine; Jaquet-Chiffelle, David-Olivier; Coatrieux, Gouenou; Benzenine, Eric; Allaert, François-André
2011-02-01
The purpose of our multidisciplinary study was to define a pragmatic and secure alternative to the creation of a national centralised medical record which could gather together the different parts of the medical record of a patient scattered in the different hospitals where he was hospitalised without any risk of breaching confidentiality. We first analyse the reasons for the failure and the dangers of centralisation (i.e. difficulty to define a European patients' identifier, to reach a common standard for the contents of the medical record, for data protection) and then propose an alternative that uses the existing available data on the basis that setting up a safe though imperfect system could be better than continuing a quest for a mythical perfect information system that we have still not found after a search that has lasted two decades. We describe the functioning of Medical Record Search Engines (MRSEs), using pseudonymisation of patients' identity. The MRSE will be able to retrieve and to provide upon an MD's request all the available information concerning a patient who has been hospitalised in different hospitals without ever having access to the patient's identity. The drawback of this system is that the medical practitioner then has to read all of the information and to create his own synthesis and eventually to reject extra data. Faced with the difficulties and the risks of setting up a centralised medical record system, a system that gathers all of the available information concerning a patient could be of great interest. This low-cost pragmatic alternative which could be developed quickly should be taken into consideration by health authorities. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
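The pseudonymisation step at the heart of the proposed Medical Record Search Engines can be sketched with a keyed hash. This is one plausible reading, not the authors' algorithm; the key handling and identity format shown are assumptions:

```python
import hashlib
import hmac

# In practice the key would be held by a trusted third party, never by the MRSE.
SECRET_KEY = b"example-shared-secret"

def pseudonymise(patient_identity):
    """Map a patient identity to a stable pseudonym; the identity itself
    is never stored, yet records from different hospitals still link."""
    return hmac.new(SECRET_KEY, patient_identity.encode(), hashlib.sha256).hexdigest()

# The same identity always yields the same pseudonym across hospitals.
p1 = pseudonymise("DOE JOHN 1970-01-01")
p2 = pseudonymise("DOE JOHN 1970-01-01")
```

Because the mapping is deterministic under the key but irreversible without it, the MRSE can retrieve all of a patient's scattered records without ever learning who the patient is.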
Paired charcoal and tree-ring records of high-frequency Holocene fire from two New Mexico bog sites
Allen, Craig D.; Anderson, R. Scott; Jass, R.B.; Toney, J.L.; Baisan, C.H.
2008-01-01
Two primary methods for reconstructing paleofire occurrence include dendrochronological dating of fire scars and stand ages from live or dead trees (extending back centuries into the past) and sedimentary records of charcoal particles from lakes and bogs, providing perspectives on fire history that can extend back for many thousands of years. Studies using both proxies have become more common in regions where lakes are present and fire frequencies are low, but are rare where high-frequency surface fires dominate and sedimentary deposits are primarily bogs and wetlands. Here we investigate sedimentary and fire-scar records of fire in two small watersheds in northern New Mexico, in settings recently characterised by relatively high-frequency fire where bogs and wetlands (Chihuahueños Bog and Alamo Bog) are more common than lakes. Our research demonstrates that: (1) essential features of the sedimentary charcoal record can be reproduced between multiple cores within a bog deposit; (2) evidence from both fire-scarred trees and charcoal deposits documents an anomalous lack of fire since ~1900, compared with the remainder of the Holocene; (3) sedimentary charcoal records probably underestimate the recurrence of fire events at these high-frequency fire sites; and (4) the sedimentary records from these bogs are complicated by factors such as burning and oxidation of these organic deposits, diversity of vegetation patterns within watersheds, and potential bioturbation by ungulates. We consider a suite of particular challenges in developing and interpreting fire histories from bog and wetland settings in the Southwest. The identification of these issues and constraints with interpretation of sedimentary charcoal fire records does not diminish their essential utility in assessing millennial-scale patterns of fire activity in this dry part of North America. © IAWF 2008.
An, Vadim A.; Ovtchinnikov, Vladimir M.; Kaazik, Pyotr B.; ...
2015-03-27
Seismologists from Kazakhstan, Russia, and the United States have rescued the Soviet-era archive of nuclear explosion seismograms recorded at Borovoye in northern Kazakhstan during the period 1966–1996. The signals had been stored on about 8000 magnetic tapes, which were held at the recording observatory. After hundreds of man-years of work, these digital waveforms together with significant metadata are now available via the project URL, namely http://www.ldeo.columbia.edu/res/pi/Monitoring/Data/ as a modern open database, of use to diverse communities. Three different sets of recording systems were operated at Borovoye, each using several different seismometers and different gain levels. For some explosions, more than twenty different channels of data are available. A first data release, in 2001, contained numerous glitches and lacked many instrument responses, but could still be used for measuring accurate arrival times and for comparison of the strengths of different types of seismic waves. The project URL also links to our second major data release, for nuclear explosions in Eurasia recorded in Borovoye, in which the data have been deglitched, all instrument responses have been included, and recording systems are described in detail. This second dataset consists of more than 3700 waveforms (digital seismograms) from almost 500 nuclear explosions in Eurasia, many of them recorded at regional distances. It is important as a training set for the development and evaluation of seismological methods of discriminating between earthquakes and underground explosions, and can be used for assessment of three-dimensional models of the Earth’s interior structure.
Thrasher, Ashley B.; Walker, Stacy E.; Hankemeier, Dorice A.; Pitney, William A.
2015-01-01
Context: Many newly credentialed athletic trainers gain initial employment as graduate assistants (GAs) in the collegiate setting, yet their socialization into their role is unknown. Exploring the socialization process of GAs in the collegiate setting could provide insight into how that process occurs. Objective: To explore the professional socialization of GAs in the collegiate setting to determine how GAs are socialized and developed as athletic trainers. Design: Qualitative study. Setting: Individual phone interviews. Patients or Other Participants: Athletic trainers (N = 21) who had supervised GAs in the collegiate setting for a minimum of 8 years (16 men [76%], 5 women [24%]; years of supervision experience = 14.6 ± 6.6). Data Collection and Analysis: Data were collected via phone interviews, which were recorded and transcribed verbatim. Data were analyzed by a 4-person consensus team with a consensual qualitative-research design. The team independently coded the data and compared ideas until a consensus was reached, and a codebook was created. Trustworthiness was established through member checks and multianalyst triangulation. Results: Four themes emerged: (1) role orientation, (2) professional development and support, (3) role expectations, and (4) success. Role orientation occurred both formally (eg, review of policies and procedures) and informally (eg, immediate role immersion). Professional development and support consisted of the supervisor mentoring and intervening when appropriate. Role expectations included decision-making ability, independent practice, and professionalism; however, supervisors often expected GAs to function as experienced, full-time staff. Success of the GAs depended on their adaptability and on the proper selection of GAs by supervisors. Conclusions: Supervisors socialize GAs into the collegiate setting by providing orientation, professional development, mentoring, and intervention when necessary. 
Supervisors are encouraged to use these socialization tactics to enhance the professional development of GAs in the collegiate setting. PMID:25347237
PERFORMANCE OF OVID MEDLINE SEARCH FILTERS TO IDENTIFY HEALTH STATE UTILITY STUDIES.
Arber, Mick; Garcia, Sonia; Veale, Thomas; Edwards, Mary; Shaw, Alison; Glanville, Julie M
2017-01-01
This study was designed to assess the sensitivity of three Ovid MEDLINE search filters developed to identify studies reporting health state utility values (HSUVs), to improve the performance of the best performing filter, and to validate resulting search filters. Three quasi-gold standard sets (QGS1, QGS2, QGS3) of relevant studies were harvested from reviews of studies reporting HSUVs. The performance of three initial filters was assessed by measuring their relative recall of studies in QGS1. The best performing filter was then developed further using QGS2. This resulted in three final search filters (FSF1, FSF2, and FSF3), which were validated using QGS3. FSF1 (sensitivity maximizing) retrieved 132/139 records (sensitivity: 95 percent) in the QGS3 validation set. FSF1 had a number needed to read (NNR) of 842. FSF2 (balancing sensitivity and precision) retrieved 128/139 records (sensitivity: 92 percent) with a NNR of 502. FSF3 (precision maximizing) retrieved 123/139 records (sensitivity: 88 percent) with a NNR of 383. We have developed and validated a search filter (FSF1) to identify studies reporting HSUVs with high sensitivity (95 percent) and two other search filters (FSF2 and FSF3) with reasonably high sensitivity (92 percent and 88 percent) but greater precision, resulting in a lower NNR. These seem to be the first validated filters available for HSUVs. The availability of filters with a range of sensitivity and precision options enables researchers to choose the filter which is most appropriate to the resources available for their specific research.
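The filter metrics in this abstract follow directly from the reported counts; as a quick check (the helper names are ours, not the authors'):

```python
def sensitivity(relevant_retrieved, total_relevant):
    """Fraction of all relevant records the filter retrieved."""
    return relevant_retrieved / total_relevant

def number_needed_to_read(total_retrieved, relevant_retrieved):
    """NNR is the reciprocal of precision: records read per relevant record found."""
    return total_retrieved / relevant_retrieved

# FSF1 retrieved 132 of the 139 relevant records in the validation set.
fsf1_sensitivity = sensitivity(132, 139)  # matches the reported 95 percent
```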
Bhatti, Junaid A; Razzak, Junaid A; Lagarde, Emmanuel; Salmi, Louis-Rachid
2011-03-22
Research undertaken in developing countries has assessed discrepancies in police reporting of Road Traffic Injury (RTI) for urban settings only. The objective of this study was to assess differences in RTI reporting across police, ambulance, and hospital Emergency Department (ED) datasets on an interurban road section in Pakistan. The study setting was the 196-km long Karachi-Hala road section. RTIs reported to the police, Edhi Ambulance Service (EAS), and five hospital EDs in Karachi during 2008 (Jan to Dec) were compared in terms of road user involved (pedestrians, motorcyclists, four-wheeled vehicle occupants) and outcome (died or injured). Further, records from these data were matched to assess ascertainment of traffic injuries and deaths by the three datasets. A total of 143 RTIs were reported to the police, 531 to EAS, and 661 to hospital EDs. Fatality per hundred traffic injuries was twice as high in police records (19 per 100 RTIs) as in ambulance (10 per 100 RTIs) and hospital ED records (9 per 100 RTIs). Pedestrian and motorcyclist involvement per hundred traffic injuries was lower in police records (8 per 100 RTIs) than in ambulance (17 per 100 RTIs) and hospital ED records (43 per 100 RTIs). Of the 119 deaths independently identified after matching, police recorded 22.6%, EAS 46.2%, and hospital ED 50.4%. Similarly, police data accounted for 10.6%, EAS 43.5%, and hospital ED 54.9% of the 1,095 independently identified injured patients. Police reporting, particularly of non-fatal RTIs and those involving vulnerable road users, should be improved in Pakistan.
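The per-hundred rates and ascertainment percentages reported in this abstract are simple ratios; a minimal sketch (the absolute death count of 27 in police records is back-calculated from the abstract's figures, so treat it as an assumption):

```python
def per_hundred(events, total):
    """Rate of an outcome per hundred recorded traffic injuries."""
    return 100.0 * events / total

def ascertainment(recorded, independent_total):
    """Share of independently matched cases captured by one dataset, in percent."""
    return 100.0 * recorded / independent_total

# Police reported 143 RTIs; roughly 27 deaths gives the stated ~19 per 100 RTIs.
police_fatality_rate = per_hundred(27, 143)
```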
The 'Seamless Web': the development of the electronic patient record in Aarhus region, Denmark.
Jensen, C B
2003-01-01
The article surveys the organization of the current project to develop an electronic patient record in the Aarhus Region, Denmark. The article is based on various policy documents and reports as well as a number of semi-structured interviews with project managers from the EPR organization in Aarhus County and with participants in the development process at local hospitals. This material is used to present and discuss the framing of the project in a 'discourse coalition'. The stabilization of a specific discourse coalition has been an important factor in ensuring the success of the development project up to the present moment. This coalition became relatively stable by integrating a diverse set of actors in a story-line about the relationships between co-operation, management and technology in the medial sector, and has influenced the modular organization of the project. The successful maintenance of the discourse coalition allows the project to appear 'seamless' from the outside. Conversely, the project is likely to be continually reviewed as successful only to the extent that it is able to flexibly keep the fluctuating set of relevant actors in alignment. If the practical work of keeping a coalition in place remains invisible it becomes easy to imagine an ideal way of planning large socio-technical projects, like developing an ECR. But practical success is more likely to be achieved if one takes seriously the thorough intertwining of discursive, organizational and technical aspects of development projects.
2011-01-01
Background Cardiotocography (CTG) is the most widely used tool for fetal surveillance. The visual analysis of fetal heart rate (FHR) traces largely depends on the expertise and experience of the clinician involved. Several approaches have been proposed for the effective interpretation of FHR. In this paper, a new approach for FHR feature extraction based on empirical mode decomposition (EMD) is proposed, which was used along with support vector machine (SVM) for the classification of FHR recordings as 'normal' or 'at risk'. Methods FHR signals were recorded from 15 subjects at a sampling rate of 4 Hz, and a dataset consisting of 90 randomly selected records of 20 minutes duration was formed from these. All records were labelled as 'normal' or 'at risk' by two experienced obstetricians. A training set was formed from 60 records, with the remaining 30 serving as the testing set. The standard deviations of the EMD components were input as features to an SVM to classify FHR samples. Results For the training set, a five-fold cross validation test resulted in an accuracy of 86%, whereas the overall geometric mean of sensitivity and specificity was 94.8%. The Kappa value for the training set was .923. Application of the proposed method to the testing set (30 records) resulted in a geometric mean of 81.5%. The Kappa value for the testing set was .684. Conclusions Based on the overall performance of the system, it can be stated that the proposed methodology is a promising new approach for the feature extraction and classification of FHR signals. PMID:21244712
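A minimal sketch of the evaluation metrics reported above (sensitivity, specificity, their geometric mean, and Cohen's kappa), assuming a two-class confusion matrix with 'at risk' as the positive class; the counts in the example are made up, not taken from the study.

```python
import math

def binary_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, their geometric mean, and Cohen's kappa
    for a two-class confusion matrix (positive class = 'at risk')."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    gmean = math.sqrt(sens * spec)
    n = tp + fn + tn + fp
    po = (tp + tn) / n                                    # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, gmean, kappa

# Illustrative counts: 9 true positives, 1 false negative, 8 true negatives, 2 false positives.
sens, spec, gmean, kappa = binary_metrics(tp=9, fn=1, tn=8, fp=2)
```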
A 3,000-year quantitative drought record derived from XRF element data from a south Texas playa
NASA Astrophysics Data System (ADS)
Livsey, D. N.; Simms, A.; Hangsterfer, A.; Nisbet, R.; DeWitt, R.
2013-12-01
Recent droughts throughout the central United States highlight the need for a better understanding of the past frequency and severity of drought occurrence. Current records of past drought for the south Texas coast are derived from tree-ring data that span approximately the last 900 years before present (BP). In this study we utilize a supervised learning routine to create a transfer function between X-Ray Fluorescence (XRF)-derived elemental data from Laguna Salada, Texas, core LS10-02 and a locally derived tree-ring drought record. From this transfer function the 900 BP tree-ring drought record was extended to 3,000 BP. The supervised learning routine was trained on the first 100 years of XRF element data and tree-ring drought data to create the transfer function and training data set output. The model was then projected from the XRF elemental data for the next 800 years to create a deployed data set output and to test the transfer function parameters. The coefficients of determination between the model output and observed values are 0.77 and 0.70 for the 100-year training data set and 900-year deployed data set, respectively. Given the relatively high coefficients of determination for both the training data set and deployed data set, we infer that the model parameters are fairly robust and that a high-resolution drought record can be derived from the XRF element data. These results indicate that XRF element data can be used as a quantitative tool to reconstruct past drought records.
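The transfer-function idea above can be illustrated with a simple least-squares line fitted on an initial training window and projected over the rest of the record, scored with R². This is a hedged stand-in (the study's supervised learning routine is not specified in the abstract) and the data are synthetic.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def r_squared(y, yhat):
    """Coefficient of determination between observed and predicted values."""
    my = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - my) ** 2 for a in y)
    return 1 - ss_res / ss_tot

proxy = [float(i) for i in range(10)]       # stand-in for an XRF element series
target = [2.0 * v + 1.0 for v in proxy]     # stand-in for the drought index
slope, intercept = fit_line(proxy[:5], target[:5])    # "training" window
pred = [slope * v + intercept for v in proxy[5:]]     # "deployed" window
score = r_squared(target[5:], pred)         # 1.0 for this noiseless example
```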
Describing Images: A Case Study of Visual Literacy among Library and Information Science Students
ERIC Educational Resources Information Center
Beaudoin, Joan E.
2016-01-01
This paper reports on a study that examined the development of pedagogical methods for increasing the visual literacy skills of a group of library and information science students. Through a series of three assignments, students were asked to provide descriptive information for a set of historical photographs and record reflections on their…
ERIC Educational Resources Information Center
Hatch, C. Richard
A 15- to 20-hour course on materials recycling, teaching junior high school students about environmental problems and solutions, is developed in this set of materials. It attempts to stimulate them to participate in community efforts aimed at improving the environment. Items in the kit include: (1) teacher's manual, with lesson plans enumerating…
USDA-ARS?s Scientific Manuscript database
The long-term goal of the research reported in this review is to develop methodology for assessment of grapevine resistance to sharpshooter inoculation of Xylella fastidiosa (Xf) into healthy grapevines, thereby preventing Xf infection. Such a trait would be quite different from the more common mechani...
Assertiveness: making yourself heard in district nursing.
Lawton, Sally; Stewart, Fiona
2005-06-01
Being assertive is not the same as being aggressive. Assertiveness is a tool for expressing ourselves confidently, and a way of saying 'yes' and 'no' in an appropriate way. This article explores issues concerned with assertiveness in district nurse settings. It outlines helpful techniques to develop assertiveness, such as the broken record, fogging, negative assertion and negative inquiry.
Making Sense of Students' Actions in an Open-Ended Virtual Laboratory Environment
ERIC Educational Resources Information Center
Gal, Ya'akov; Uzan, Oriel; Belford, Robert; Karabinos, Michael; Yaron, David
2015-01-01
A process for analyzing log files collected from open-ended learning environments is developed and tested on a virtual lab problem involving reaction stoichiometry. The process utilizes a set of visualization tools that, by grouping student actions in a hierarchical manner, helps experts make sense of the linear list of student actions recorded in…
Ontology-Based Data Integration of Open Source Electronic Medical Record and Data Capture Systems
ERIC Educational Resources Information Center
Guidry, Alicia F.
2013-01-01
In low-resource settings, the prioritization of clinical care funding is often determined by immediate health priorities. As a result, investment directed towards the development of standards for clinical data representation and exchange is rare and, accordingly, data management systems are often redundant. Open-source systems such as OpenMRS and…
Menne, M. J. [National Climatic Data Center, National Oceanic and Atmospheric Administration; Williams, Jr., C. N. [National Climatic Data Center, National Oceanic and Atmospheric Administration; Vose, R. S. [National Climatic Data Center, National Oceanic and Atmospheric Administration
2016-01-01
The United States Historical Climatology Network (USHCN) is a high-quality data set of daily and monthly records of basic meteorological variables from 1218 observing stations across the 48 contiguous United States. Daily data include observations of maximum and minimum temperature, precipitation amount, snowfall amount, and snow depth; monthly data consist of monthly-averaged maximum, minimum, and mean temperature and total monthly precipitation. Most of these stations are U.S. Cooperative Observing Network stations located generally in rural locations, while some are National Weather Service First-Order stations that are often located in more urbanized environments. The USHCN has been developed over the years at the National Oceanic and Atmospheric Administration's (NOAA) National Climatic Data Center (NCDC) to assist in the detection of regional climate change. Furthermore, it has been widely used in analyzing U.S. climate. The period of record varies for each station. USHCN stations were chosen using a number of criteria including length of record, percent of missing data, number of station moves and other station changes that may affect data homogeneity, and resulting network spatial coverage. Collaboration between NCDC and CDIAC on the USHCN project dates to the 1980s (Quinlan et al. 1987). At that time, in response to the need for an accurate, unbiased, modern historical climate record for the United States, the Global Change Research Program of the U.S. Department of Energy and NCDC chose a network of 1219 stations in the contiguous United States that would become a key baseline data set for monitoring U.S. climate. This initial USHCN data set contained monthly data and was made available free of charge from CDIAC. Since then it has been comprehensively updated several times [e.g., Karl et al. (1990) and Easterling et al. (1996)]. The initial USHCN daily data set was made available through CDIAC via Hughes et al.
(1992) and contained a 138-station subset of the USHCN. This product was updated by Easterling et al. (1999) and expanded to include 1062 stations. In 2009 the daily USHCN dataset was expanded to include all 1218 stations in the USHCN.
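The station-screening criteria described above (length of record, percent of missing data, number of station moves) might be sketched as a simple filter. The thresholds and station entries below are hypothetical; the real USHCN selection also weighed spatial coverage and other homogeneity factors.

```python
def eligible(station, min_years=80, max_missing_pct=10.0, max_moves=2):
    """Illustrative screening rule: long record, few gaps, few station moves."""
    return (station["years"] >= min_years
            and station["missing_pct"] <= max_missing_pct
            and station["moves"] <= max_moves)

# Hypothetical station metadata.
stations = [
    {"id": "A", "years": 95, "missing_pct": 3.2, "moves": 1},
    {"id": "B", "years": 40, "missing_pct": 1.0, "moves": 0},    # record too short
    {"id": "C", "years": 110, "missing_pct": 22.0, "moves": 0},  # too much missing data
]
network = [s["id"] for s in stations if eligible(s)]   # only station A passes
```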
Home page of Arnold Air Force Base
A free-jet engine test at the AEDC facility set a record for free-jet mode engines by achieving transonic speeds.
Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.
Houston, Lauren; Probst, Yasmine; Humphries, Allison
2015-01-01
Health data has long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed, whereby if >5% of data variables were incorrect a second 10% random sample would be extracted from the trial data set. Each variable was coded: correct, incorrect (valid or invalid), not recorded, or not entered. Audit-1 had a total error of 33% and audit-2 36%. The physiological section was the only audit section to have <5% error. Data not recorded to case report forms had the greatest impact on error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether or not data was deemed correct or incorrect. Our study developed a straightforward method to perform an SDV audit. An audit rule was identified and error coding was implemented. Findings demonstrate that monitoring data quality by an SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented for future research.
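The two-stage audit rule described above (verify a 10% random sample of files; if more than 5% of verified variables are in error, draw a second 10% sample) can be sketched as follows; the error codes follow the abstract, but the file contents are illustrative.

```python
import random

# Codes counted as errors in this sketch ("correct" is the only non-error code).
ERROR_CODES = {"incorrect", "not_recorded", "not_entered"}

def audit(files, rng, sample_frac=0.10):
    """Verify a random sample of files; return the observed variable error rate."""
    sample = rng.sample(files, max(1, int(len(files) * sample_frac)))
    variables = [code for f in sample for code in f["codes"]]
    errors = sum(code in ERROR_CODES for code in variables)
    return errors / len(variables)

rng = random.Random(0)
# Hypothetical trial: every file has 10 verified variables, one of them wrong.
files = [{"codes": ["correct"] * 9 + ["incorrect"]} for _ in range(50)]
rate1 = audit(files, rng)                 # 10% error rate in this toy data
needs_second_audit = rate1 > 0.05         # rule fires, so a second sample is drawn
```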
Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets
ERIC Educational Resources Information Center
Kulp, Christopher W.; Sprechini, Gene D.
2016-01-01
A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…
Constructing Topic Models of Internet of Things for Information Processing
Xin, Jie; Cui, Zhiming; Zhang, Shukui; He, Tianxu; Li, Chunhua; Huang, Haojing
2014-01-01
Internet of Things (IoT) is regarded as a remarkable development of the modern information technology. There is abundant digital products data on the IoT, linking with multiple types of objects/entities. Those associated entities carry rich information and usually in the form of query records. Therefore, constructing high quality topic hierarchies that can capture the term distribution of each product record enables us to better understand users' search intent and benefits tasks such as taxonomy construction, recommendation systems, and other communications solutions for the future IoT. In this paper, we propose a novel record entity topic model (RETM) for IoT environment that is associated with a set of entities and records and a Gibbs sampling-based algorithm is proposed to learn the model. We conduct extensive experiments on real-world datasets and compare our approach with existing methods to demonstrate the advantage of our approach. PMID:25110737
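As an illustration of the Gibbs sampling machinery a model like RETM builds on, here is a toy collapsed Gibbs sampler for a plain LDA-style topic model (RETM itself adds record/entity structure not shown here); the corpus, hyperparameters, and sizes are purely illustrative.

```python
import random

def gibbs_lda(docs, num_topics, vocab_size, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Toy collapsed Gibbs sampler for LDA; docs are lists of word ids."""
    rng = random.Random(seed)
    ndk = [[0] * num_topics for _ in docs]               # document-topic counts
    nkw = [[0] * vocab_size for _ in range(num_topics)]  # topic-word counts
    nk = [0] * num_topics                                # topic totals
    z = []
    for d, doc in enumerate(docs):                       # random initialization
        zd = []
        for w in doc:
            k = rng.randrange(num_topics)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
            zd.append(k)
        z.append(zd)
    for _ in range(iters):                               # collapsed Gibbs sweeps
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                              # remove current assignment
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + vocab_size * beta) for t in range(num_topics)]
                k = rng.choices(range(num_topics), weights=weights)[0]
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
                z[d][i] = k
    return ndk, nkw

docs = [[0, 1, 0, 1], [2, 3, 3, 2], [0, 1, 1, 0]]   # word ids per toy "record"
ndk, nkw = gibbs_lda(docs, num_topics=2, vocab_size=4)
```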
Operating manual for the R200 downhole recorder with husky hunter retriever
Johnson, Roy A.; Rorabaugh, James I.
1988-01-01
The R200 Downhole Recorder is a battery-powered device that, when placed in a well casing, monitors water levels for a period of up to 1 year. This instrument measures a 1- to 70-foot range of water levels. These water-level data can be retrieved through use of a commercially available portable microcomputer. The R200 Downhole Recorder was developed at the U.S. Geological Survey's Hydrologic Instrumentation Facility, Stennis Space Center, Mississippi. This operating manual describes the R200 Downhole Recorder, provides initial set-up instructions, and gives directions for on-site operation. Design specifications and routine maintenance steps are included. The R200 data-retriever program is a user-friendly, menu-driven program. The manual guides the user through the procedures required to perform specific operations. Numerous screens are reproduced in the text with a discussion of user input for desired responses. Help is provided for specific problems. (USGS)
Operating manual for the R200 downhole recorder with Tandy 102 retriever
Johnson, Roy A.; Rorabaugh, James I.
1988-01-01
The R200 Downhole Recorder is a battery-powered device that, when placed in a well casing, monitors water levels for a period of up to 1 year. This instrument measures a 1- to 70-ft range of water levels. These water-level data can be retrieved through use of a commercially available portable microcomputer. The R200 Downhole Recorder was developed at the U.S. Geological Survey's Hydrologic Instrumentation Facility, Stennis Space Center, Mississippi. This operating manual describes the R200 Downhole Recorder, provides initial set-up instructions, and gives directions for on-site operation. Design specifications and routine maintenance steps are included. The R200 data-retriever program is a user-friendly, menu-driven program. The manual guides the user through the procedures required to perform specific operations. Numerous screens are reproduced in the text with a discussion of user input for desired responses. Help is provided for specific problems. (USGS)
Botti, F; Alexander, A; Drygajlo, A
2004-12-02
This paper deals with a procedure to compensate for mismatched recording conditions in forensic speaker recognition, using a statistical score normalization. Bayesian interpretation of the evidence in forensic automatic speaker recognition depends on three sets of recordings in order to perform forensic casework: reference (R) and control (C) recordings of the suspect, and a potential population database (P), as well as a questioned recording (QR). The requirement of similar recording conditions between the suspect control database (C) and the questioned recording (QR) is often not satisfied in real forensic cases. The aim of this paper is to investigate a score-normalization procedure, based on an adaptation of the Test-normalization (T-norm) [2] technique used in the speaker verification domain, to compensate for the mismatch. The Polyphone IPSC-02 database and ASPIC (an automatic speaker recognition system developed by EPFL and IPS-UNIL in Lausanne, Switzerland) were used to test the normalization procedure. Experimental results for three different recording condition scenarios are presented using Tippett plots, and the effect of the compensation on the evaluation of the strength of the evidence is discussed.
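The T-norm idea described above normalizes a raw score against the distribution of scores the questioned recording obtains against a cohort of other-speaker (potential population) models; a minimal sketch with made-up scores.

```python
from statistics import mean, stdev

def t_norm(raw_score, cohort_scores):
    """Normalize a raw score by the mean and spread of cohort (impostor) scores."""
    return (raw_score - mean(cohort_scores)) / stdev(cohort_scores)

# Made-up scores of the questioned recording against population models.
cohort = [1.2, 0.8, 1.0, 1.1, 0.9]
normalized = t_norm(2.0, cohort)   # a large positive value: well above the cohort
```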
Use of globally unique identifiers (GUIDs) to link herbarium specimen records to physical specimens.
Nelson, Gil; Sweeney, Patrick; Gilbert, Edward
2018-02-01
With the advent of the U.S. National Science Foundation's Advancing Digitization of Biodiversity Collections program and related worldwide digitization initiatives, the rate of herbarium specimen digitization in the United States has expanded exponentially. As the number of electronic herbarium records proliferates, the importance of linking these records to the physical specimens they represent as well as to related records from other sources will intensify. Although a rich and diverse literature has developed over the past decade that addresses the use of specimen identifiers for facilitating linking across the internet, few implementable guidelines or recommended practices for herbaria have been advanced. Here we review this literature with the express purpose of distilling a specific set of recommendations especially tailored to herbarium specimen digitization, curation, and management. We argue that associating globally unique identifiers (GUIDs) with physical herbarium specimens and including these identifiers in all electronic records about those specimens is essential to effective digital data curation. We also address practical applications for ensuring these associations.
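One simple way to mint and attach GUIDs to specimen records, assuming UUIDs as the chosen identifier scheme (the paper reviews several options); the field names below follow Darwin Core conventions but the record shape is illustrative.

```python
import uuid

def mint_record(catalog_number: str) -> dict:
    """Attach a freshly minted GUID to a minimal specimen record."""
    return {
        "occurrenceID": str(uuid.uuid4()),  # globally unique, stable identifier
        "catalogNumber": catalog_number,    # local, human-readable number
    }

rec = mint_record("HERB-000123")
```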
Measuring Nursing Value from the Electronic Health Record.
Welton, John M; Harper, Ellen M
2016-01-01
We report the findings of a big data nursing value expert group made up of 14 members of the nursing informatics, leadership, academic and research communities within the United States, tasked with 1. defining nursing value, 2. developing a common data model and metrics for nursing care value, and 3. developing nursing business intelligence tools using the nursing value data set. This work is a component of the Big Data and Nursing Knowledge Development conference series sponsored by the University of Minnesota School of Nursing. The panel met by conference call for fourteen 1.5-hour sessions, a total of 21 hours of interaction, from August 2014 through May 2015. Primary deliverables from the big data expert group were: development and publication of definitions and metrics for nursing value; construction of a common data model to extract key data from electronic health records; and measures of nursing costs and finance to provide a basis for developing nursing business intelligence and analysis systems.
Liu, Nehemiah T; Holcomb, John B; Wade, Charles E; Batchinsky, Andriy I; Cancio, Leopoldo C; Darrah, Mark I; Salinas, José
2014-02-01
Accurate and effective diagnosis of actual injury severity can be problematic in trauma patients. Inherent physiologic compensatory mechanisms may prevent accurate diagnosis and mask true severity in many circumstances. The objective of this project was the development and validation of a multiparameter machine learning algorithm and system capable of predicting the need for life-saving interventions (LSIs) in trauma patients. Statistics based on means, slopes, and maxima of various vital sign measurements corresponding to 79 trauma patient records generated over 110,000 feature sets, which were used to develop, train, and implement the system. Comparisons among several machine learning models proved that a multilayer perceptron would best implement the algorithm in a hybrid system consisting of a machine learning component and basic detection rules. Additionally, 295,994 feature sets from 82 h of trauma patient data showed that the system can obtain 89.8 % accuracy within 5 min of recorded LSIs. Use of machine learning technologies combined with basic detection rules provides a potential approach for accurately assessing the need for LSIs in trauma patients. The performance of this system demonstrates that machine learning technology can be implemented in a real-time fashion and potentially used in a critical care environment.
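The feature statistics described above (means, slopes, and maxima over vital-sign streams) can be sketched as a windowed extraction, with the slope taken as an ordinary least-squares fit against sample index; the heart-rate values are synthetic.

```python
def window_features(samples):
    """Mean, least-squares slope (per sample), and maximum of one window."""
    n = len(samples)
    mx = (n - 1) / 2                        # mean of the indices 0..n-1
    my = sum(samples) / n
    slope = (sum((i - mx) * (s - my) for i, s in enumerate(samples))
             / sum((i - mx) ** 2 for i in range(n)))
    return {"mean": my, "slope": slope, "max": max(samples)}

hr_window = [88.0, 91.0, 94.0, 97.0, 100.0]   # synthetic rising heart rate
feats = window_features(hr_window)            # slope of 3.0 beats per sample
```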
Ertmer, David J.; Jung, Jongmin
2012-01-01
Background Evidence of auditory-guided speech development can be heard as the prelinguistic vocalizations of young cochlear implant recipients become increasingly complex, phonetically diverse, and speech-like. In research settings, these changes are most often documented by collecting and analyzing speech samples. Sampling, however, may be too time-consuming and impractical for widespread use in clinical settings. The Conditioned Assessment of Speech Production (CASP; Ertmer & Stoel-Gammon, 2008) is an easily administered and time-efficient alternative to speech sample analysis. The current investigation examined the concurrent validity of the CASP and data obtained from speech samples recorded at the same intervals. Methods Nineteen deaf children who received CIs before their third birthdays participated in the study. Speech samples and CASP scores were gathered at 6, 12, 18, and 24 months post-activation. Correlation analyses were conducted to assess the concurrent validity of CASP scores and data from samples. Results CASP scores showed strong concurrent validity with scores from speech samples gathered across all recording sessions (6-24 months). Conclusions The CASP was found to be a valid, reliable, and time-efficient tool for assessing progress in vocal development during young CI recipients' first 2 years of device experience. PMID:22628109
CONEDEP: COnvolutional Neural network based Earthquake DEtection and Phase Picking
NASA Astrophysics Data System (ADS)
Zhou, Y.; Huang, Y.; Yue, H.; Zhou, S.; An, S.; Yun, N.
2017-12-01
We developed an automatic local earthquake detection and phase picking algorithm based on a fully convolutional neural network (FCN). The FCN algorithm detects and segments certain features (phases) in 3-component seismograms to realize efficient picking. We use the STA/LTA and template matching algorithms to construct the training set from seismograms recorded 1 month before and after the Wenchuan earthquake. Precise P and S phases are identified and labeled to construct the training set. Noise data are produced by combining background noise and artificial synthetic noise to form a noise set equal in scale to the signal set. Training is performed on GPUs to achieve efficient convergence. Our algorithm has significantly improved performance in terms of detection rate and precision in comparison with the STA/LTA and template matching algorithms.
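A minimal sketch of the STA/LTA detector mentioned above as a training-set builder: the ratio of a short-term average to a long-term average of signal amplitude spikes at a phase arrival. Window lengths, threshold, and the trace itself are illustrative.

```python
def sta_lta(signal, nsta, nlta):
    """Short-term / long-term average amplitude ratio at each sample after warm-up."""
    ratios = []
    for i in range(nlta, len(signal)):
        sta = sum(abs(v) for v in signal[i - nsta:i]) / nsta
        lta = sum(abs(v) for v in signal[i - nlta:i]) / nlta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Synthetic trace: flat noise with a burst standing in for a phase arrival.
trace = [0.1] * 40 + [1.0] * 10 + [0.1] * 40
ratios = sta_lta(trace, nsta=5, nlta=20)
triggered = max(ratios) > 2.5   # the burst stands out against the noise floor
```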
2011-12-30
which data sets containing GT0 events (explosions and mine tremors) are available, local crustal structure is well known, and hand-picked arrival times have been obtained. Boomer et al. (2010) describes the development of local criteria for the simple crustal structure of the Archean Kaapvaal Craton in southern Africa. Continuing the development of local criteria in
2013-08-01
...to build predictive models with the hope of improving disease management. It is difficult to find these factors in EMR systems as the...death, surgeries, hospitalizations, etc. Once our model is developed, we hope to apply it to a de-identified data set from an outside institution, specifically the University of...
Smartphone attachment for stethoscope recording.
Thompson, Jeff
2015-01-01
With the ubiquity of smartphones and the rising technology of 3D printing, novel devices can be developed that leverage the "computer in your pocket" and rapid prototyping technologies toward scientific, medical, engineering, and creative purposes. This paper describes such a device: a simple 3D-printed extension for Apple's iPhone that allows the sound from an off-the-shelf acoustic stethoscope to be recorded using the phone's built-in microphone. The attachment's digital 3D files can be easily shared, modified for similar phones and devices capable of recording audio, and in combination with 3D printing technology allow for fabrication of a durable device without need for an entire factory of expensive and specialized machining tools. It is hoped that by releasing this device as an open source set of printable files that can be downloaded and reproduced cheaply, others can make use of these developments where access to cost-prohibitive, specialized medical instruments is not available. Coupled with specialized smartphone software ("apps"), more sophisticated and automated diagnostics may also be possible on-site.
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
Niijima, H; Ito, N; Ogino, S; Takatori, T; Iwase, H; Kobayashi, M
2000-11-01
To put speech recognition technology to practical use in recording forensic autopsies, a language model specialized for the forensic autopsy was developed for the speech recording system. The language model was created by applying a 3-gram model, and an acoustic model for Japanese speech recognition based on a Hidden Markov Model was used in addition to customize the speech recognition engine for forensic autopsy. A forensic vocabulary set of over 10,000 words was compiled and some 300,000 sentence patterns were made to create the forensic language model, which was then mixed in proper proportion with a general language model to attain high accuracy. When tested by dictating autopsy findings, this speech recognition system achieved a recognition rate of about 95%, which appears to reach practical usability as speech recognition software, though room remains for improving its hardware and application-layer software.
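The 3-gram modeling step above can be sketched as maximum-likelihood trigram estimation from domain sentences; the real system mixed a large forensic-vocabulary model with a general model, and the two sentences below are purely illustrative stand-ins.

```python
from collections import defaultdict

def train_trigrams(sentences):
    """Maximum-likelihood trigram model with sentence-boundary padding."""
    counts, context = defaultdict(int), defaultdict(int)
    for s in sentences:
        toks = ["<s>", "<s>"] + s.split() + ["</s>"]
        for a, b, c in zip(toks, toks[1:], toks[2:]):
            counts[(a, b, c)] += 1
            context[(a, b)] += 1
    def prob(a, b, c):
        return counts[(a, b, c)] / context[(a, b)] if context[(a, b)] else 0.0
    return prob

p = train_trigrams([
    "the wound margin is regular",
    "the wound margin is contused",
])
prob_is = p("wound", "margin", "is")         # "is" always follows "wound margin"
prob_regular = p("margin", "is", "regular")  # one of two observed continuations
```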
DRUMS: Disk Repository with Update Management and Select option for high throughput sequencing data
2014-01-01
Background New technologies for analyzing biological samples, like next generation sequencing, are producing a growing amount of data together with quality scores. Moreover, software tools (e.g., for mapping sequence reads, calculating transcription factor binding probabilities, estimating epigenetic-modification-enriched regions, or determining single nucleotide polymorphisms) increase this amount of position-specific DNA-related data even further. Hence, requesting data becomes challenging and expensive and is often implemented using specialised hardware. In addition, picking specific data as fast as possible becomes increasingly important in many fields of science. The general problem of handling big data sets was addressed by developing specialized databases like HBase, HyperTable or Cassandra. However, these database solutions also require specialized or distributed hardware, leading to expensive investments. To the best of our knowledge, there is no database capable of (i) storing billions of position-specific DNA-related records, (ii) performing fast and resource-saving requests, and (iii) running on a single standard computer. Results Here, we present DRUMS (Disk Repository with Update Management and Select option), satisfying demands (i)-(iii). It tackles the weaknesses of traditional databases while handling position-specific DNA-related data in an efficient manner. DRUMS is capable of storing up to billions of records. Moreover, it focuses on optimizing related single lookups as range requests, which are needed constantly for computations in bioinformatics. To validate the power of DRUMS, we compare it to the widely used MySQL database. The test setting considers two biological data sets. We use standard desktop hardware as the test environment. Conclusions DRUMS outperforms MySQL in writing and reading records by a factor of two up to a factor of 10000. Furthermore, it can work with significantly larger data sets.
Our work focuses on mid-sized data sets up to several billion records without requiring cluster technology. Storing position-specific data is a general problem and the concept we present here is a generalized approach. Hence, it can be easily applied to other fields of bioinformatics. PMID:24495746
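The position-keyed range request that DRUMS optimizes can be sketched with two binary searches over records kept sorted by position, followed by a contiguous slice; the record contents below are hypothetical.

```python
from bisect import bisect_left, bisect_right

def range_query(records, start, end):
    """Return records with start <= position <= end; records sorted by position."""
    positions = [r[0] for r in records]
    lo = bisect_left(positions, start)    # first position >= start
    hi = bisect_right(positions, end)     # first position > end
    return records[lo:hi]

# Hypothetical position-keyed records (position, base).
records = [(100, "A"), (205, "C"), (310, "G"), (450, "T"), (700, "A")]
hits = range_query(records, 200, 460)
```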
NASA Snaps Picture of Eastern U.S. in a Record-Breaking "Freezer"
2015-02-20
NASA's Terra satellite captured an image of the snow-covered eastern U.S. that looks like the states have been sitting in a freezer. In addition to the snow cover, Arctic and Siberian air masses have settled in over the Eastern U.S., triggering record low temperatures in many states. On Feb. 19 at 16:40 UTC (11:40 a.m. EST), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Terra satellite captured a picture of the snowy landscape. The snow cover combined with the frosty air mass made the eastern U.S. feel like the inside of a freezer. The MODIS image was created at NASA's Goddard Space Flight Center in Greenbelt, Maryland. On the morning of Feb. 20, NOAA's Weather Prediction Center (WPC) noted, "There were widespread subzero overnight lows Thursday night (Feb. 19) extending from Illinois to western Virginia, and numerous record lows were set. Bitterly-cold arctic air is setting numerous temperature records across the eastern U.S. and will keep temperatures well below normal on Friday (Feb. 20)." In Baltimore, Maryland, a low temperature of 1F broke the record low for coldest morning recorded at the Thurgood Marshall Baltimore Washington-International Airport. In Louisville, Kentucky, temperatures dropped to -6F, breaking the old record low of 0F, according to meteorologist Brian Goode of WAVE-TV. Meanwhile, Richmond, Kentucky bottomed out at a frigid -32F. In North Carolina, a record low temperature was set at Charlotte where the overnight temperature bottomed out at 7F, breaking the old record of 13F in 1896. In Asheville, temperatures dropped to just 4F, breaking the old record of 10F in 1979. Temperature records for Asheville extend back to 1876. Several records were also broken in Georgia, according to Matt Daniel, a meteorologist at WMAZ-TV, Macon, Georgia, who cited data from the National Weather Service.
Daniel said that Macon set a new record low when the temperature dropped to 18F, beating the previous record of 21F set in 1958. Athens set a new record low, too, dropping to 14F and beating the old record of 18F set in 1958/1928. NOAA's WPC noted that "Highs on Friday (Feb. 20) will struggle to get out of the teens from the Ohio Valley to the Mid-Atlantic region. After Friday, temperatures are forecast to moderate and get closer to February averages as a storm system approaches from the west." Image Credit: NASA Goddard MODIS Rapid Response Team. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.
González-Beltrán, Alejandra N; Yong, May Y; Dancey, Gairin; Begent, Richard
2012-01-06
Biology, biomedicine and healthcare have become data-driven enterprises, where scientists and clinicians need to generate, access, validate, interpret and integrate different kinds of experimental and patient-related data. Thus, recording and reporting of data in a systematic and unambiguous fashion is crucial to allow aggregation and re-use of data. This paper reviews the benefits of existing biomedical data standards and focuses on key elements to record experiments for therapy development. Specifically, we describe the experiments performed in molecular, cellular, animal and clinical models. We also provide an example set of elements for a therapy tested in a phase I clinical trial. We introduce the Guidelines for Information About Therapy Experiments (GIATE), a minimum information checklist creating a consistent framework to transparently report the purpose, methods and results of the therapeutic experiments. A discussion on the scope, design and structure of the guidelines is presented, together with a description of the intended audience. We also present complementary resources such as a classification scheme, and two alternative ways of creating GIATE information: an electronic lab notebook and a simple spreadsheet-based format. Finally, we use GIATE to record the details of the phase I clinical trial of CHT-25 for patients with refractory lymphomas. The benefits of using GIATE for this experiment are discussed. While data standards are being developed to facilitate data sharing and integration in various aspects of experimental medicine, such as genomics and clinical data, no previous work focused on therapy development. We propose a checklist for therapy experiments and demonstrate its use in the 131Iodine labeled CHT-25 chimeric antibody cancer therapy. 
As future work, we will expand the set of GIATE tools to continue to encourage its use by cancer researchers, and we will engineer an ontology to annotate GIATE elements and facilitate unambiguous interpretation and data integration.
Gyrocopter-Based Remote Sensing Platform
NASA Astrophysics Data System (ADS)
Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.
2015-04-01
In this paper, the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications utilizing a gyrocopter as a carrier platform is described. The current sensor configuration consists of a high-resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover, a custom-developed thermal imaging system composed of a VIS-PAN camera and a LWIR camera is used for aerial recordings in the thermal infrared range. Furthermore, another custom-developed, highly flexible imaging system for high-resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities, and the precalculated demands with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched to mosaics.
Cimino, James J.; Ayres, Elaine J.; Remennik, Lyubov; Rath, Sachi; Freedman, Robert; Beri, Andrea; Chen, Yang; Huser, Vojtech
2013-01-01
The US National Institutes of Health (NIH) has developed the Biomedical Translational Research Information System (BTRIS) to support researchers’ access to translational and clinical data. BTRIS includes a data repository, a set of programs for loading data from NIH electronic health records and research data management systems, an ontology for coding the disparate data with a single terminology, and a set of user interface tools that provide access to identified data from individual research studies and data across all studies from which individually identifiable data have been removed. This paper reports on unique design elements of the system, progress to date and user experience after five years of development and operation. PMID:24262893
Gage, Barbara; Stineman, Margaret; Deutsch, Anne; Mallinson, Trudy; Heinemann, Allen; Bernard, Shulamit; Constantine, Roberta
2007-12-01
Better measurement of the case-mix complexity of patients receiving rehabilitation services is critical to understanding variations in the outcomes achieved by patients treated in different postacute care (PAC) settings. The Medicare program recognized this issue and is undertaking a major initiative to develop a new patient-assessment instrument that would standardize case-mix measurement in inpatient rehabilitation facilities, long-term care hospitals, skilled nursing facilities, and home health agencies. The new instrument, called the Continuity Assessment Record and Evaluation Tool, builds on the scientific advances in measurement to develop standard measures of medical acuity, functional status, cognitive impairment, and social support related to resource need, outcomes, and continuity of care for use in all PAC settings.
Automated Data Base Implementation Requirements for the Avionics Planning Baseline - Army
1983-07-01
PJRQT PJRSG .... PRJR owns PJRQTR Item EFT A32 A26 In record EFR Item ESFT A36 A40 In record ESFR Item EQPOC ALCPOC A20 In record EQR Item EPHONE LPHONE...USING EF DUPLICATES ARE NOT ALLOWED WITHIN EQSEG. EF TYPE CHARACTER 4. EFT TYPE CHARACTER 32. EG TYPE CHARACTER 4. RECORD NAME IS ESFR LOCATION MODE... ESFR MANDATORY AUTOMATIC LINKED TO OWNER ASCENDING KEY IS ESF DUPLICATES NOT SET SELECTION THRU LOCATION MODE OF OWNER. SET NAME IS ESEQ MODE CHAIN
A method for recording verbal behavior in free-play settings
Nordquist, Vey M.
1971-01-01
The present study attempted to test the reliability of a new method of recording verbal behavior in a free-play preschool setting. Six children, three normal and three speech impaired, served as subjects. Videotaped records of verbal behavior were scored by two experimentally naive observers. The results suggest that the system provides a means of obtaining reliable records of both normal and impaired speech, even when the subjects exhibit nonverbal behaviors (such as hyperactivity) that interfere with direct observation techniques. PMID:16795310
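The inter-observer reliability tested in this study is conventionally summarized with a chance-corrected agreement statistic such as Cohen's kappa. A minimal sketch in Python, using hypothetical category codes (V = verbal, N = nonverbal, S = silence) rather than the study's actual scoring categories:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two observers coding the same intervals."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of intervals where the two observers agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each observer's marginal counts
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical interval codes from two naive observers
obs1 = ["V", "V", "N", "S", "V", "N", "S", "S"]
obs2 = ["V", "N", "N", "S", "V", "N", "S", "V"]
```

A kappa of 1.0 indicates perfect agreement after correcting for chance; values above roughly 0.6 are commonly read as substantial agreement.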
Hydrologic conditions in Florida during Water Year 2008
Verdi, Richard J.; Holt, Sandra L.; Irvin, Ronald B.; Fulcher, David L.
2010-01-01
Record-high and record-low hydrologic conditions occurred during water year 2008 (October 1, 2007-September 30, 2008). Record-low levels were caused by a continuation of the 2007 water year drought conditions into the 2008 water year and persisting until summer rainfall. The gage at the Santa Fe River near Fort White site recorded record-low monthly mean discharges in October and November 2007. The previous records for this site were set in 1956 and 2002, respectively. Record-high conditions in northeast and northwest Florida were caused by the rainfall and runoff associated with Tropical Storm Fay. For example, St. Mary's River near Macclenny recorded a new record-high monthly mean discharge in August 2008. The previous record for this site was set in 1945. Lake Okeechobee in south Florida reached new minimum monthly mean lake levels since monitoring began in 1912 from October to March during the 2008 water year. Some wells throughout northwest and south Florida registered period-of-record lowest daily maximum water levels.
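A "record-low monthly mean discharge" of the kind reported here is simply a monthly value that undercuts every prior value for the same calendar month in the period of record. A minimal sketch, using invented discharge values rather than the gage data described above:

```python
def is_record_low(history, month, value):
    """True if `value` undercuts every prior mean discharge for `month`.

    history: list of (year, month, monthly_mean_discharge) tuples.
    """
    prior = [q for (_, m, q) in history if m == month]
    return bool(prior) and value < min(prior)

# Invented monthly mean discharges (cfs) for October at one gage
history = [(1956, 10, 310.0), (2002, 10, 295.5), (2005, 10, 420.0)]
```

Record highs are detected symmetrically by comparing against the period-of-record maximum for the month.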
Blank, H. Richard; Healy, J.H.; Roller, John; Lamson, Ralph; Fisher, Fred; McClearn, Robert; Allen, Steve
1979-01-01
In February 1978 a seismic deep-refraction profile was recorded by the USGS along a 1000-km line across the Arabian Shield in western Saudi Arabia. The line begins in Paleozoic and Mesozoic cover rocks near Riyadh on the Arabian Platform, leads southwesterly across three major Precambrian tectonic provinces, traverses Cenozoic rocks of the coastal plain near Jizan (Tihamat Asir), and terminates at the outer edge of the Farasan Bank in the southern Red Sea. More than 500 surveyed recording sites were occupied, including 19 in the Farasan Islands. Six shot points were used--five on land, with charges placed mostly below water table in drill holes, and one at sea, with charges placed on the sea floor and fired from a ship. The total charge consumed was slightly in excess of 61 metric tons in 21 discrete firings. Seismic energy was recorded by means of a set of 100 newly developed portable seismic stations. Each station consists of a standard 2-Hz vertical geophone coupled to a self-contained analog recording instrument equipped with a magnetic-tape cassette. The stations were deployed in groups of 20 by five observer teams, each generally consisting of two scientist-technicians and a surveyor-guide. On the day prior to deployment, the instruments were calibrated and programmed for automatic operation by means of a specially designed device called a hand-held tester. At each of ten pre-selected recording time windows on a designated firing day, the instruments were programmed to turn on, stabilize, record internal calibration signals, record the seismic signals at three levels of amplification, and then deactivate. After the final window in the firing sequence, all instruments were retrieved and their data tapes removed for processing. A specially designed field tape-dubbing system was utilized at shot point camps to organize and edit data recorded on the cassette tapes. 
The main functions of this system are to concatenate all data from each shot on any given day onto a single shot tape, and to provide hard copy for monitoring recorder performance so that any problems can be corrected prior to the next deployment. Composite digital record sections were produced from the dubbed tapes for each shot point by a portable processing and plotting system. The heart of this system is a DEC PDP-11V03 computer, which controls a cassette playback unit identical to those used in the recorders and dubbers, a set of discriminators, a time-code translator, a digitizer, and a digital plotter. The system was used to maintain various informational data sets and to produce tabulations and listings of various sorts during the field operations, in addition to its main task of producing digital record sections. Two master clocks, both set to time signals broadcast by the British Broadcasting Corporation, provided absolute time for the recording operations. One was located on the ship and the other was stationed at a base camp on the mainland. The land-based master clock was used to set three additional master clocks located at the other active shot points a few days in advance of each firing, and these clocks were then used to set the internal clocks in the portable seismic stations via the hand-held tester. A master clock signal was also linked to the firing system at each shot point for determination of the absolute shot instant. It is possible to construct a generalized crustal model from examination of the six shot point composite record sections obtained in the field. Such a model rests upon a number of simplifying assumptions and will almost certainly be modified at a later stage of interpretation. The main assumptions are that the crust consists of two homogeneous isotropic layers having no velocity inversion, that the Mohorovicic discontinuity is sharp, and that effects of surface inhomogeneities and elevation changes can be ignored. 
The main characteristics of the tentative model are the following: (1) The thickness of th
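The two-layer crustal model assumed here implies the standard refraction travel-time relations: a direct arrival through the upper layer, a head wave refracted along the Mohorovicic discontinuity, and a crossover distance beyond which the refracted arrival comes first. A sketch with illustrative velocities and thickness (not results from this survey):

```python
import math

def direct_time(x, v1):
    """Direct arrival through the upper layer at range x."""
    return x / v1

def head_wave_time(x, v1, v2, h):
    """Head wave (Moho refraction) travel time for one layer of
    thickness h over a half-space, requiring v2 > v1."""
    theta_c = math.asin(v1 / v2)              # critical angle
    return x / v2 + 2 * h * math.cos(theta_c) / v1

def crossover_distance(v1, v2, h):
    """Range beyond which the refracted arrival is first."""
    return 2 * h * math.sqrt((v2 + v1) / (v2 - v1))

# Illustrative values only: crustal velocity, mantle velocity, thickness
v1, v2, h = 6.0, 8.0, 40.0                    # km/s, km/s, km
```

At the crossover distance the direct and refracted travel times coincide, which gives a quick self-check on any implementation.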
Hatcher, Robert L
2015-05-01
Comments on the article, "Guidelines for competency development and measurement in rehabilitation psychology postdoctoral training," by Stiers et al. (see record 2014-55195-001). Stiers and colleagues have provided a thorough and well-conceived set of guidelines that lay out the competencies expected for graduates of postdoctoral residencies in rehabilitation psychology, accompanied by a set of more specific, observable indicators of the residents' competence level. This work is an important aspect of the broader project of the Rehabilitation Psychology Specialty Council (APA Division 22, the American Board of Rehabilitation Psychology, the Foundation for Rehabilitation Psychology, the Academy of Rehabilitation Psychology, and the Council of Rehabilitation Psychology Postdoctoral Training Programs) to develop overall guidelines for programs providing postdoctoral training in this field (Stiers et al., 2012). (c) 2015 APA, all rights reserved.
Campos, A A; Nathanson, D
1999-10-01
Addition silicones (polyvinyl siloxanes) are universally accepted as accurate and stable impression materials. They have also gained popularity as interocclusal record materials. However, it has not been established whether it is possible to work with polyvinyl siloxanes without changing the recorded maxillomandibular relations. This study examined the compressibility of 2 addition silicones as interocclusal record materials, analyzing the changes of maxillomandibular relations at the condyle region when different compressive forces are used to stabilize articulated casts. Sixteen interocclusal records, obtained from the same patient (8 of each polyvinyl siloxane, Blu-Mousse, Fast Set), were interposed between the patient casts in a new measuring system, obtaining 48 curves of load versus maxillomandibular positional changes in 3 axes (x, y, z). These curves were compared with curves obtained with the casts in maximum intercuspation without interocclusal records (reference curves). Analysis of variance was used to compare maxillomandibular positional changes among the 3 groups (n = 48 each): Blu-Mousse, Fast Set, and control group (maximum intercuspation without interocclusal record). There was no significant change in maxillomandibular relations when forces up to 1 kgf were applied to stabilize the casts related by means of Blu-Mousse and Fast Set addition silicone interocclusal records. It is possible to use these polyvinyl siloxanes as interocclusal record materials without changing the recorded maxillomandibular relations.
Early childhood numeracy in a multiage setting
NASA Astrophysics Data System (ADS)
Wood, Karen; Frid, Sandra
2005-10-01
This research is a case study examining numeracy teaching and learning practices in an early childhood multiage setting with Pre-Primary to Year 2 children. Data were collected via running records, researcher reflection notes, and video and audio recordings. Video and audio transcripts were analysed using a mathematical discourse and social interactions coding system designed by MacMillan (1998), while the running records and reflection notes contributed to descriptions of the children's interactions with each other and with the teachers. Teachers used an `assisted performance' approach to instruction that supported problem solving and inquiry processes in mathematics activities, and this, combined with a child-centred pedagogy and specific values about community learning, created a learning environment designed to stimulate and foster learning. The mathematics discourse analysis showed a use of explanatory language in mathematics discourse, and this language supported scaffolding among children for new mathematics concepts. These and other interactions related to peer sharing, tutoring and regulation also emerged as key aspects of students' learning practices. However, the findings indicated that multiage grouping alone did not support learning. Rather, effective learning was dependent upon the teacher's capacities to develop productive discussion among children, as well as implement developmentally appropriate curricula that addressed the needs of the different children.
Using old technology to implement modern computer-aided decision support for primary diabetes care.
Hunt, D. L.; Haynes, R. B.; Morgan, D.
2001-01-01
BACKGROUND: Implementation rates of interventions known to be beneficial for people with diabetes mellitus are often suboptimal. Computer-aided decision support systems (CDSSs) can improve these rates. The complexity of establishing a fully integrated electronic medical record that provides decision support, however, often prevents their use. OBJECTIVE: To develop a CDSS for diabetes care that can be easily introduced into primary care settings and diabetes clinics. THE SYSTEM: The CDSS uses fax-machine-based optical character recognition software for acquiring patient information. Simple, 1-page paper forms, completed by patients or health practitioners, are faxed to a central location. The information is interpreted and recorded in a database. This initiates a routine that matches the information against a knowledge base so that patient-specific recommendations can be generated. These are formatted and faxed back within 4-5 minutes. IMPLEMENTATION: The system is being introduced into 2 diabetes clinics. We are collecting information on frequency of use of the system, as well as satisfaction with the information provided. CONCLUSION: Computer-aided decision support can be provided in any setting with a fax machine, without the need for integrated electronic medical records or computerized data-collection devices. PMID:11825194
Morineau, Thierry; Chapelain, Pascal; Quinio, Philippe
2016-06-01
Our objective was to develop the analysis of task management skills by proposing a framework classifying task management stages and deficiencies. Few studies of non-technical skills have detailed the components of task management skills through behavioural markers, despite their central role in care delivery. A post hoc qualitative behavioural analysis was performed of recordings made of professional training sessions based upon simulated scenarios. Four recorded sessions in a high-fidelity simulation setting were observed and recorded. Two scenarios were used (cardiac arrest and respiratory failure), and there were two training sessions per scenario. Four types of task management deficiencies were identified with regards to task constraints: constraint relaxation, unsatisfied constraints, additional constraints and constraint transgression. Both equipment and space constraints were also identified. The lack of prerequisite actions when preparing the environment, corequisite actions for equipment and protocol monitoring, or postrequisite actions to restore the environment were associated with task management deficiencies. Deficiencies in task management behaviours can be identified in simulated as well as actual medical emergency settings. This framework opens perspectives for both training caregivers and designing ergonomic work situations. Copyright © 2015 Elsevier Ltd. All rights reserved.
NOAA's Scientific Data Stewardship Program
NASA Astrophysics Data System (ADS)
Bates, J. J.
2004-12-01
The NOAA mission is to understand and predict changes in the Earth's environment and conserve and manage coastal and marine resources to meet the Nation's economic, social and environmental needs. NOAA has responsibility for long-term archiving of the United States environmental data and has recently integrated several data management functions into a concept called Scientific Data Stewardship. Scientific Data Stewardship is a new paradigm in data management consisting of an integrated suite of functions to preserve and exploit the full scientific value of NOAA's, and the world's, environmental data. These functions include careful monitoring of observing system performance for long-term applications, the generation of authoritative long-term climate records from multiple observing platforms, and the proper archival of and timely access to data and metadata. NOAA has developed a conceptual framework to implement the functions of scientific data stewardship. This framework has five objectives: 1) develop real-time monitoring of all satellite observing systems for climate applications, 2) process large volumes of satellite data extending up to decades in length to account for systematic errors and to eliminate artifacts in the raw data (referred to as fundamental climate data records, FCDRs), 3) generate retrieved geophysical parameters from the FCDRs (referred to as thematic climate data records, TCDRs), including combining observations from all sources, 4) conduct monitoring and research by analyzing data sets to uncover climate trends and to provide evaluation and feedback for steps 2) and 3), and 5) provide archives of metadata, FCDRs, and TCDRs, and facilitate distribution of these data to the user community. The term `climate data record' and related terms, such as climate data set, have been used for some time, but the climate community has yet to settle on a consensus definition. 
A recent United States National Academy of Sciences report recommends using the following definition: a climate data record (CDR) is a time series of measurements of sufficient length, consistency, and continuity to determine climate variability and change.
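The FCDR/TCDR distinction above maps onto a two-stage processing pipeline: first calibrate raw sensor counts into a fundamental record, then retrieve a geophysical parameter from it. A minimal sketch, with hypothetical linear calibration and retrieval coefficients standing in for real instrument science:

```python
def to_fcdr(raw_counts, gain, offset):
    """Stage 1: remove systematic instrument effects (hypothetical
    linear calibration) to produce a fundamental climate data record."""
    return [gain * c + offset for c in raw_counts]

def to_tcdr(fcdr, a, b):
    """Stage 2: retrieve a geophysical parameter from the calibrated
    record (hypothetical linear retrieval)."""
    return [a * r + b for r in fcdr]

raw = [100, 102, 101]                           # invented sensor counts
fcdr = to_fcdr(raw, gain=0.5, offset=-10.0)     # calibrated radiances
tcdr = to_tcdr(fcdr, a=2.0, b=200.0)            # e.g. a temperature in K
```

Keeping the stages separate mirrors objectives 2) and 3) above: a reprocessed calibration (new gain/offset) regenerates the FCDR once, and every downstream TCDR can then be rebuilt from it.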
Thrasher, Ashley B; Walker, Stacy E; Hankemeier, Dorice A; Pitney, William A
2015-03-01
Many newly credentialed athletic trainers gain initial employment as graduate assistants (GAs) in the collegiate setting, yet their socialization into their role is unknown. Exploring the socialization process of GAs in the collegiate setting could provide insight into how that process occurs. To explore the professional socialization of GAs in the collegiate setting to determine how GAs are socialized and developed as athletic trainers. Qualitative study. Individual phone interviews. Athletic trainers (N = 21) who had supervised GAs in the collegiate setting for a minimum of 8 years (16 men [76%], 5 women [24%]; years of supervision experience = 14.6 ± 6.6). Data were collected via phone interviews, which were recorded and transcribed verbatim. Data were analyzed by a 4-person consensus team with a consensual qualitative-research design. The team independently coded the data and compared ideas until a consensus was reached, and a codebook was created. Trustworthiness was established through member checks and multianalyst triangulation. Four themes emerged: (1) role orientation, (2) professional development and support, (3) role expectations, and (4) success. Role orientation occurred both formally (eg, review of policies and procedures) and informally (eg, immediate role immersion). Professional development and support consisted of the supervisor mentoring and intervening when appropriate. Role expectations included decision-making ability, independent practice, and professionalism; however, supervisors often expected GAs to function as experienced, full-time staff. Success of the GAs depended on their adaptability and on the proper selection of GAs by supervisors. Supervisors socialize GAs into the collegiate setting by providing orientation, professional development, mentoring, and intervention when necessary. Supervisors are encouraged to use these socialization tactics to enhance the professional development of GAs in the collegiate setting.
Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L
2017-06-14
Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. 
Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often.
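The design described above, with registry membership derived from shared EHR fields and all computed measures landing in one fact table crossing all registries, can be sketched as follows. The names and fields here are hypothetical illustrations, not the authors' actual schema:

```python
from dataclasses import dataclass

@dataclass
class CqmFact:
    """One row of the single EDW fact table shared by all registries."""
    patient_id: str
    registry: str
    measure: str
    value: float

def registry_members(problem_list, condition):
    """Membership by a shared condition recorded on the Problem List."""
    return {pid for pid, cond in problem_list if cond == condition}

# Hypothetical problem-list extract: (patient_id, condition)
problems = [("p1", "diabetes"), ("p2", "asthma"), ("p3", "diabetes")]
facts = [CqmFact(pid, "diabetes", "hba1c_tested", 1.0)
         for pid in sorted(registry_members(problems, "diabetes"))]
```

Because every registry's measures share the one fact-row shape, a new registry only adds membership logic and measure definitions, not new reporting infrastructure, which is the scaling property the abstract emphasizes.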
Kannan, Vaishnavi; Fish, Jason S; Mutz, Jacqueline M; Carrington, Angela R; Lai, Ki; Davis, Lisa S; Youngblood, Josh E; Rauschuber, Mark R; Flores, Kathryn A; Sara, Evan J; Bhat, Deepa G; Willett, DuWayne L
2017-01-01
Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. We adopted as guiding principles to (a) capture data as a byproduct of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed - either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM) - were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined "grains" from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. 
Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week "sprints" for rapid-cycle feedback and refinement. Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized, vetted CQM performance trends. This study suggests that concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. Schattauer GmbH.
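The shared dimensional design above, with a single Fact table of CQM values crossing all registries, can be sketched in miniature. The following Python illustration is hypothetical; the registry, measure and field names are invented and are not the authors' EDW schema:

```python
from dataclasses import dataclass

# Hypothetical fact grain: one row per (registry, measure, patient),
# so a single table serves all registries, as in the shared EDW design.
@dataclass(frozen=True)
class CqmFact:
    registry: str     # membership source, e.g. a Problem List condition
    measure: str      # clinical quality measure identifier
    patient_id: str
    numerator: bool   # True if the patient meets the measure

def cqm_performance(facts, registry, measure):
    """Fraction of a registry's patients meeting a given CQM."""
    rows = [f for f in facts if f.registry == registry and f.measure == measure]
    return sum(f.numerator for f in rows) / len(rows) if rows else None

facts = [
    CqmFact("diabetes", "a1c_lt_8", "p1", True),
    CqmFact("diabetes", "a1c_lt_8", "p2", False),
    CqmFact("asthma", "controller_rx", "p3", True),
]
rate = cqm_performance(facts, "diabetes", "a1c_lt_8")  # 0.5
```

A real implementation would populate such rows via ETL into the warehouse; the point of the sketch is only that one fact grain can serve every registry.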
A Crowdsourcing Framework for Medical Data Sets.
Ye, Cheng; Coco, Joseph; Epishova, Anna; Hajaj, Chen; Bogardus, Henry; Novak, Laurie; Denny, Joshua; Vorobeychik, Yevgeniy; Lasko, Thomas; Malin, Bradley; Fabbri, Daniel
2018-01-01
Crowdsourcing services like Amazon Mechanical Turk allow researchers to pose questions to crowds of workers and quickly receive high-quality labeled responses. However, crowds drawn from the general public are not suitable for labeling sensitive and complex data sets, such as medical records, due to various concerns. Major challenges in building and deploying a crowdsourcing system for medical data include, but are not limited to: managing access rights to sensitive data and ensuring data privacy controls are enforced; identifying workers with the necessary expertise to analyze complex information; and efficiently retrieving relevant information in massive data sets. In this paper, we introduce a crowdsourcing framework to support the annotation of medical data sets. We further demonstrate a workflow for crowdsourcing clinical chart reviews, including (1) the design and decomposition of research questions; (2) the architecture for storing and displaying sensitive data; and (3) the development of tools to support crowd workers in quickly analyzing information from complex data sets.
Development of a tele-stethoscope and its application in pediatric cardiology.
Hedayioglu, F L; Mattos, S S; Moser, L; de Lima, M E
2007-01-01
Over the years, many attempts have been made to develop special stethoscopes for the teaching of auscultation. The objective of this article is to report on the experience with the development and implementation of an electronic stethoscope and a virtual library of cardiac sounds. There were four stages to this project: (1) the building of the prototype to acquire, filter and amplify the cardiac sounds, (2) the development of a software program to record, reproduce and visualize them, (3) the testing of the prototype in a clinical scenario, and (4) the development of an internet site to store and display the sounds collected. The first two stages are now complete. The prototype underwent an initial evaluation in a clinical scenario within the Unit and during virtual out-patient clinical sessions. One hundred auscultations were recorded during these tests. They were reviewed and discussed on-line by a panel of experienced cardiologists during the sessions. Although the sounds were considered "satisfactory" for diagnostic purposes by the cardiology team, they identified some qualitative differences in the electronic recorded auscultations, such as a higher pitch of the recorded sounds. Prospective clinical studies are now being conducted to further evaluate the interference of the electronic device with the physicians' ability to diagnose different cardiac conditions. An internet site (www.caduceusvirtual.com.br/auscultaped) was developed to host these cardiac auscultations. It is set up as a library of cardiac sounds, catalogued by pathologies, and already contains examples from auscultations of the majority of common congenital heart lesions, such as septal defects and valvar lesions.
"Reversed" intraguild predation: red fox cubs killed by pine marten.
Brzeziński, Marcin; Rodak, Lukasz; Zalewski, Andrzej
2014-01-01
Camera traps deployed at a badger Meles meles set in mixed pine forest in north-eastern Poland recorded interspecific killing of red fox Vulpes vulpes cubs by pine marten Martes martes . The vixen and her cubs settled in the set at the beginning of May 2013, and it was abandoned by the badgers shortly afterwards. Five fox cubs were recorded playing in front of the den each night. Ten days after the first recording of the foxes, a pine marten was filmed at the set; it arrived in the morning, made a reconnaissance and returned at night when the vixen was away from the set. The pine marten entered the den several times and killed at least two fox cubs. It was active at the set for about 2 h. This observation proves that red foxes are not completely safe from predation by smaller carnivores, even those considered to be subordinate species in interspecific competition.
Near-field optical recording based on solid immersion lens system
NASA Astrophysics Data System (ADS)
Hong, Tao; Wang, Jia; Wu, Yan; Li, Dacheng
2002-09-01
Near-field optical recording based on a solid immersion lens (SIL) system has attracted great attention in the field of high-density data storage in recent years. The diffraction-limited spot size in optical recording and lithography can be decreased by utilizing the SIL. SIL near-field optical storage has the advantages of high density, mass storage capacity and compatibility with many well-developed technologies. We have set up a SIL near-field static recording system. The recording medium is placed on a 3-D scanning stage with a scanning range of 70 × 70 × 70 μm and sub-nanometer positioning accuracy, which ensures the rigorous separation control in the SIL system and the precise motion of the recording medium. The SIL is mounted on an inverted microscope. The focusing between the long-working-distance objective and the SIL can be monitored by a CCD camera and by eye. The readout signal is collected by a detector. Some experiments have been performed with the SIL near-field recording system. Near-field recording on a photochromic medium has been attempted, and the resolution improvement from the SIL is presented. Factors influencing the SIL near-field recording system are also discussed in the paper.
Improving quality: bridging the health sector divide.
Pringle, Mike
2003-12-01
All too often, quality assurance looks at just one small part of the complex system that is health care. However, each individual patient has a single set of experiences and outcomes, often involving a range of health professionals in a number of settings across multiple sectors. To manage this complexity, we need to establish high-quality electronic recording in each of the settings. In the UK, primary care has been leading the way in adopting information technology and can now use databases for individual clinical care, for quality assurance using significant event and conventional auditing, and for research. Before we can understand and quality-assure the whole health care system, we need electronic patient records in all settings and good communication to build a summary electronic health record for each patient. Such an electronic health record will be under the control of the patient concerned, will be shared with the explicit consent of the patient, and will form the vehicle for quality assurance across all sectors of the health service.
Changing world extreme temperature statistics
NASA Astrophysics Data System (ADS)
Finkel, J. M.; Katz, J. I.
2018-04-01
We use the Global Historical Climatology Network-Daily database to calculate a nonparametric statistic that describes the rate at which all-time daily high and low temperature records have been set in nine geographic regions (continents or major portions of continents) during periods mostly from the mid-20th century to the present. This statistic was defined in our earlier work on temperature records in the 48 contiguous United States. In contrast to this earlier work, we find that in every region except North America all-time high records were set at a rate significantly (at least $3\sigma$) higher than in the null hypothesis of a stationary climate. Except in Antarctica, all-time low records were set at a rate significantly lower than in the null hypothesis. In Europe, North Africa and North Asia the rate of setting new all-time highs increased suddenly in the 1990s, suggesting a change in regional climate regime; in most other regions there was a steadier increase.
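The record-rate comparison rests on a simple null-model fact: in a stationary (i.i.d.) climate, observation k of a series is an all-time record with probability 1/k, so the expected number of records grows only as the harmonic sum. A short Python sketch of this idea (illustrative, not the authors' exact statistic):

```python
def record_count(series):
    """Count how many times a new all-time high is set
    (the first observation counts as a record)."""
    best, n = float("-inf"), 0
    for x in series:
        if x > best:
            best, n = x, n + 1
    return n

def expected_records_stationary(n):
    """Expected number of all-time highs in n i.i.d. observations:
    observation k is a record with probability 1/k."""
    return sum(1.0 / k for k in range(1, n + 1))

annual_maxima = [14.1, 15.0, 13.2, 16.8, 16.9, 15.5, 17.2]  # illustrative
observed = record_count(annual_maxima)                       # 5
expected = expected_records_stationary(len(annual_maxima))   # ~2.59
```

An observed count well above the harmonic expectation is the kind of excess the study quantifies region by region.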
Boeschoten, K H; Folmer, K B; van der Lee, J H; Nollet, F
2007-02-01
To develop an observational instrument that can be used to evaluate the quality of arm and hand skills in daily functional activities in children with obstetric brachial plexus lesion (OBPL). A set of functional activities was constructed and standardized, and the intra-observer reliability of the assessment of this set of activities was studied. Department of Occupational Therapy and Department of Rehabilitation Medicine, VU University Medical Centre. Twenty-six children with OBPL in the age range of 4-6 years. The children were asked to perform 47 bimanual activities, which were recorded on videotape. The videotapes were scored twice by the same occupational therapist. The percentage of agreement in scoring 'hand-use', 'speed' and 'assistance' was over 80% for a substantial number of activities, indicating strong agreement. However, in scoring 'deviations in movements and body posture', the percentage of agreement was insufficient in most activities. This set of activities has good potential for assessment of the performance of functional activities in children with OBPL. This study, however, revealed a number of difficulties in observing and scoring the activities that have to be considered when developing a standardized video observation.
Inducing any virtual two-dimensional movement in humans by applying muscle tendon vibration.
Roll, Jean-Pierre; Albert, Frédéric; Thyrion, Chloé; Ribot-Ciscar, Edith; Bergenheim, Mikael; Mattei, Benjamin
2009-02-01
In humans, tendon vibration evokes an illusory sensation of movement. We developed a model mimicking the muscle afferent patterns corresponding to any two-dimensional movement and checked its validity by inducing illusory writing movements through specific sets of muscle vibrators. Three kinds of illusory movements were compared. The first was induced by vibration patterns copying the responses of muscle spindle afferents previously recorded by microneurography during imposed ankle movements. The two others were generated by the model. Sixteen different vibratory patterns were applied to 20 motionless volunteers in the absence of vision. After each vibration sequence, the participants were asked to name the corresponding graphic symbol and then to reproduce the illusory movement perceived. Results showed that the afferent patterns generated by the model were very similar to those recorded microneurographically during actual ankle movements (r=0.82). The model was also very efficient for generating afferent response patterns at the wrist level, if the preferred sensory directions of the wrist muscle groups were first specified. Using recorded and modeled proprioceptive patterns to pilot sets of vibrators placed at the ankle or wrist levels evoked similar illusory movements, which were correctly identified by the participants in three quarters of the trials. Our proprioceptive model, based on neurosensory data recorded in behaving humans, should therefore be a useful tool in fields of research such as sensorimotor learning, rehabilitation, and virtual reality.
From Many Records to One Graph: Heterogeneity Conflicts in the Linked Data Restructuring Cycle
ERIC Educational Resources Information Center
Tallerås, Kim
2013-01-01
Introduction: During the last couple of years the library community has developed a number of comprehensive metadata standardization projects inspired by the idea of linked data, such as the BIBFRAME model. Linked data is a set of best-practice principles for publishing and exposing data on the Web utilizing a graph-based data model powered with…
African Historical Religions: A Conceptual and Ethnical Foundation for "Western Religions."
ERIC Educational Resources Information Center
Alexander, E. Curtis
This paper attempts to set the record straight with regard to the following assumptions: (1) the Africans of the antiquities of Ethiopia and Egypt were black people; and (2) the same black people developed the foundation that provides the basis for the so-called major Western religions of Judaism, Christianity, and Islam. There are two parts to…
ERIC Educational Resources Information Center
Lerner, Claire; Barr, Rachel
2015-01-01
A robust body of research shows that the most important factor in a child's healthy development is a positive parent-child relationship, characterized by warm, loving interactions in which parents and other caregivers sensitively respond to their child's cues and provide age-appropriate activities that nurture curiosity, exploration, and learning.…
Towards the automated analysis and database development of defibrillator data from cardiac arrest.
Eftestøl, Trygve; Sherman, Lawrence D
2014-01-01
During resuscitation of cardiac arrest victims, a variety of information in electronic format is recorded as part of the documentation of the patient care contact and to be provided for case review for quality improvement. Such review requires considerable effort and resources. There is also the problem of interobserver effects. We show that it is possible to efficiently analyze resuscitation episodes automatically using a minimal set of the available information. A minimal set of variables is defined which describe therapeutic events (compression sequences and defibrillations) and corresponding patient response events (annotated rhythm transitions). From this a state sequence representation of the resuscitation episode is constructed, and an algorithm is developed for reasoning with this representation and extracting review variables automatically. As a case study, the method is applied to the data abstraction process used in the King County EMS. The automatically generated variables are compared to the original ones with accuracies ≥ 90% for 18 variables and ≥ 85% for the remaining four variables. It is possible to use the information present in the CPR process data recorded by the AED, along with rhythm and chest compression annotations, to automate the episode review.
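The state-sequence idea, pairing therapeutic events with patient-response events, lends itself to a compact illustration. The Python sketch below derives a few typical review variables from a toy episode; the event names and variables are hypothetical and are not the King County EMS abstraction set:

```python
# Hypothetical minimal event set: (time_s, kind, detail) tuples.
episode = [
    (0,   "rhythm", "VF"),
    (10,  "compressions_start", None),
    (95,  "compressions_stop", None),
    (100, "shock", None),
    (105, "rhythm", "ROSC"),
]

def review_variables(events):
    """Derive example review variables from a state-sequence representation."""
    shocks = [t for t, kind, _ in events if kind == "shock"]
    rhythms = [d for _, kind, d in events if kind == "rhythm"]
    starts = [t for t, kind, _ in events if kind == "compressions_start"]
    stops = [t for t, kind, _ in events if kind == "compressions_stop"]
    hands_on = sum(b - a for a, b in zip(starts, stops))
    total = events[-1][0] - events[0][0]
    return {
        "n_shocks": len(shocks),
        "time_to_first_shock_s": shocks[0] - events[0][0] if shocks else None,
        "compression_fraction": hands_on / total if total else None,
        "rosc_achieved": "ROSC" in rhythms,
    }

variables = review_variables(episode)
```

The real method reasons over annotated rhythm transitions rather than a ready-made event list, but the derivation of review variables follows this pattern.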
Developing a corpus of spoken language variability
NASA Astrophysics Data System (ADS)
Carmichael, Lesley; Wright, Richard; Wassink, Alicia Beckford
2003-10-01
We are developing a novel, searchable corpus as a research tool for investigating phonetic and phonological phenomena across various speech styles. Five speech styles have been well studied independently in previous work: reduced (casual), careful (hyperarticulated), citation (reading), Lombard effect (speech in noise), and "motherese" (child-directed speech). Few studies to date have collected a wide range of styles from a single set of speakers, and fewer yet have provided publicly available corpora. The pilot corpus includes recordings of (1) a set of speakers participating in a variety of tasks designed to elicit the five speech styles, and (2) casual peer conversations and wordlists to illustrate regional vowels. The data include high-quality recordings and time-aligned transcriptions linked to text files that can be queried. Initial measures drawn from the database provide comparison across speech styles along the following acoustic dimensions: MLU (changes in unit duration); relative intra-speaker intensity changes (mean and dynamic range); and intra-speaker pitch values (minimum, maximum, mean, range). The corpus design will allow for a variety of analyses requiring control of demographic and style factors, including hyperarticulation variety, disfluencies, intonation, discourse analysis, and detailed spectral measures.
A new real-time tsunami detection algorithm
NASA Astrophysics Data System (ADS)
Chierici, F.; Embriaco, D.; Pignagnoli, L.
2016-12-01
Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection based on real-time tide removal and real-time band-pass filtering of sea-bed pressure recordings. The algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability, at low computational cost. The algorithm is designed also to be used in autonomous early warning systems, with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of occurrence of false alarms. Pressure data sets acquired by Bottom Pressure Recorders in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event which occurred at Haida Gwaii on October 28th, 2012, using data recorded by the Bullseye underwater node of Ocean Networks Canada. The algorithm successfully ran for test purposes in year-long missions onboard the GEOSTAR stand-alone multidisciplinary abyssal observatory, deployed in the Gulf of Cadiz during the EC project NEAREST, and on the NEMO-SN1 cabled observatory deployed in the Western Ionian Sea, an operational node of the European research infrastructure EMSO.
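The two signal-processing stages named above, real-time tide removal followed by band-pass filtering of the pressure record, can be approximated with causal moving averages. The sketch below is a simplified stand-in for the authors' algorithm, with illustrative window lengths and threshold:

```python
import numpy as np

def detect_tsunami(p, fs, tide_win_s=3600, smooth_win_s=60, thresh=0.05):
    """Return the index of the first alarm in a sea-bed pressure record p,
    or None. Tide removal and band-pass filtering are approximated with
    causal (real-time capable) moving averages."""
    def causal_ma(x, win):
        win = max(1, int(win))
        c = np.cumsum(np.insert(x, 0, 0.0))
        n = np.arange(1, len(x) + 1)
        lo = np.maximum(0, n - win)
        return (c[n] - c[lo]) / (n - lo)

    tide = causal_ma(p, tide_win_s * fs)            # slow tidal component
    band = causal_ma(p - tide, smooth_win_s * fs)   # smoothed residual
    hits = np.flatnonzero(np.abs(band) > thresh)
    return int(hits[0]) if hits.size else None

# Demo: flat sea level with a 0.2 dbar step at sample 5000 (fs = 1 Hz).
p = np.zeros(7000)
p[5000:] += 0.2
first_alarm = detect_tsunami(p, fs=1)
```

A causal filter only looks backward in time, which is what makes the scheme usable on an autonomous observatory; the published algorithm adds reconfigurable parameters and a proper band-pass design.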
Respiratory rate estimation during triage of children in hospitals.
Shah, Syed Ahmar; Fleming, Susannah; Thompson, Matthew; Tarassenko, Lionel
2015-01-01
Accurate assessment of a child's health is critical for appropriate allocation of medical resources and timely delivery of healthcare in Emergency Departments. The accurate measurement of vital signs is a key step in the determination of the severity of illness, and respiratory rate is currently the most difficult vital sign to measure accurately. Several previous studies have attempted to extract respiratory rate from photoplethysmogram (PPG) recordings. However, the majority have been conducted in controlled settings using PPG recordings from healthy subjects. In many studies, manual selection of clean sections of PPG recordings was undertaken before assessing the accuracy of the signal processing algorithms developed. Such selection procedures are not appropriate in clinical settings. A major limitation of AR modelling, previously applied to respiratory rate estimation, is the appropriate selection of model order. This study developed a novel algorithm that automatically estimates respiratory rate from a median spectrum constructed by applying multiple AR models to processed PPG segments acquired with pulse oximetry using a finger probe. Good-quality sections were identified using a dynamic template-matching technique to assess PPG signal quality. The algorithm was validated on 205 children presenting to the Emergency Department at the John Radcliffe Hospital, Oxford, UK, with reference respiratory rates up to 50 breaths per minute estimated by paediatric nurses. At the time of writing, the authors are not aware of any other study that has validated respiratory rate estimation using data collected from over 200 children in hospitals during routine triage.
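The median-spectrum idea, sidestepping the AR model-order problem by pooling spectra across several candidate orders, can be sketched as follows. This is a simplified illustration with invented parameters, not the validated clinical algorithm (which also performs signal-quality assessment):

```python
import numpy as np

def ar_spectrum(x, order, freqs, fs):
    """Power spectrum 1/|A|^2 of a least-squares AR(order) fit to x."""
    x = np.asarray(x, float) - np.mean(x)
    cols = [x[order - k - 1 : len(x) - k - 1] for k in range(order)]
    a, *_ = np.linalg.lstsq(np.column_stack(cols), x[order:], rcond=None)
    w = 2.0 * np.pi * np.asarray(freqs) / fs
    A = 1.0 - sum(a[k] * np.exp(-1j * w * (k + 1)) for k in range(order))
    return 1.0 / np.abs(A) ** 2

def respiratory_rate(x, fs, orders=(4, 5, 6, 7, 8)):
    """Breaths/min at the peak of the median spectrum across AR orders."""
    freqs = np.linspace(0.1, 1.0, 200)   # search 6-60 breaths per minute
    med = np.median([ar_spectrum(x, p, freqs, fs) for p in orders], axis=0)
    return 60.0 * freqs[int(np.argmax(med))]

# Demo: a clean 0.5 Hz "respiratory" oscillation sampled at 4 Hz.
fs = 4.0
t = np.arange(0, 60, 1.0 / fs)
rate = respiratory_rate(np.sin(2 * np.pi * 0.5 * t), fs)
```

Taking the median across orders means no single mis-chosen order dominates the spectral peak, which is the robustness the abstract is after.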
Arheiam, A; Albadri, S; Brown, S; Burnside, G; Higham, S; Harris, R
2016-11-04
Objectives: Current guidance recommends that dental practitioners should routinely give dietary advice to patients, with diet diaries as a tool to help diet assessment. We explored patients' compliance with diet diary usage in a paediatric clinic within a teaching hospital setting, where remuneration is not an issue. Objectives were to investigate associated factors affecting the diet diary return rate and the information obtained from returned diaries. Methods: A retrospective study of 200 randomly selected clinical records of children aged 5-11 years who had received diet analysis and advice as part of a preventive dental care programme at a dental teaching hospital between 2010 and 2013. Clinical records with a preventive care pro forma were included in the study. Data on social and family history, DMFT-dmft, oral hygiene practices, dental attendance and dietary habits were obtained and compared with information given in completed diet diaries. A deductive content analysis of returned diet diaries was undertaken using a pre-developed coding scheme. Results: Of 174 complete records included in this study, diet diaries were returned in 60 (34.5%). Diet diaries were more likely to be returned by children who reported that they regularly brushed their teeth (P < 0.05) and by those who came from smaller families (P < 0.05). Content analysis of diet diaries enabled the identification of harmful types of foods and drinks in 100% of diaries. General dietary issues, frequency and between-meals intake of sugars were also all captured in the majority of diaries (95.0%, N = 56). Information on sugar amount (53.0%, N = 32), prolonged contact with teeth (57.0%, N = 34) and near-bedtime intakes (17.0%, N = 28) was reported in fewer diaries. Conclusions: The return rate of diet diaries in this setting was low, and associated with patients' demographic and oral health characteristics.
Returned diet diaries were missing a varied range of important dietary information, such as sugar amount, which appears to compromise their validity as a diet assessment tool. Development of a more reliable and acceptable dietary assessment tool for use in the dental setting is needed.
Emotional recognition from the speech signal for a virtual education agent
NASA Astrophysics Data System (ADS)
Tickle, A.; Raghu, S.; Elshaw, M.
2013-06-01
This paper explores the extraction of features from the speech wave to perform intelligent emotion recognition. A feature extraction tool (openSMILE) was used to obtain a baseline set of 998 acoustic features from a set of emotional speech recordings made with a microphone. The initial features were reduced to the most important ones so that recognition of emotions using a supervised neural network could be performed. Given that the future use of virtual education agents lies with making the agents more interactive, developing agents with the capability to recognise and adapt to the emotional state of humans is an important step.
Quantum Hamiltonian identification from measurement time traces.
Zhang, Jun; Sarovar, Mohan
2014-08-22
Precise identification of parameters governing quantum processes is a critical task for quantum information and communication technologies. In this Letter, we consider a setting where system evolution is determined by a parametrized Hamiltonian, and the task is to estimate these parameters from temporal records of a restricted set of system observables (time traces). Based on the notion of system realization from linear systems theory, we develop a constructive algorithm that provides estimates of the unknown parameters directly from these time traces. We illustrate the algorithm and its robustness to measurement noise by applying it to a one-dimensional spin chain model with variable couplings.
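For intuition, a far simpler estimator than the Letter's realization-theory algorithm can already recover a single Hamiltonian parameter from a time trace. Assuming H = (ω/2)σz with the spin prepared along +x, the observable trace is ⟨σx⟩(t) = cos(ωt), and ω can be read off a spectral peak; this toy setup is our own illustration, not the paper's method:

```python
import numpy as np

# Illustration only: for H = (omega/2) * sigma_z with the spin prepared
# along +x, the measured time trace is <sigma_x>(t) = cos(omega * t).
# A spectral peak recovers omega; the realization-theory algorithm of the
# paper handles far more general Hamiltonians and restricted observable sets.
omega_true = 2.0                 # rad/s, the parameter to identify
dt, n = 0.05, 1024
t = dt * np.arange(n)
trace = np.cos(omega_true * t)   # simulated measurement record

spec = np.abs(np.fft.rfft(trace))
f = np.fft.rfftfreq(n, dt)                                # Hz
omega_est = 2 * np.pi * f[int(np.argmax(spec[1:])) + 1]   # skip the DC bin
```

Frequency resolution here is limited to one FFT bin; the system-realization approach instead fits a parametric model to the trace, which is what makes it robust to measurement noise.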
Technical details in the structural development of Rohrbach seaplanes
NASA Technical Reports Server (NTRS)
Mathias, Gotthold; Holzapfel, Adolf
1929-01-01
The recent trial flights and acceptance tests of the Rohrbach "Romar," the largest seaplane in the world, have yielded results fully confirming the principles followed in its development. Its take-off weight of 19,000 kg, its beating the world record for raising the greatest useful load to 2000 m by almost 2500 kg and its remarkable showing in the seaworthiness tests are the results of intelligent researches, the guiding principles of which are briefly set forth in this article.
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.
The science and application of forest carbon projects
Robert J. Moulton
2002-01-01
This year, 1999, now in its 7th month, is well on its way to becoming the hottest year in what has already been documented as the hottest decade on record, both in the United States and worldwide. This makes a good setting for discussing global climate change, a much better setting than, say, the second week of a record-setting cold spell in February, 1999 when some...
Empower, not impose!-Preventing academic procrastination.
Hoppe, Johannes D; Prokop, Philipp; Rau, Renate
2018-01-01
In the context of the goal-setting process between supervisor and student while writing a thesis, it is hypothesized that mutually set goals (participation) and writing down the results of the meeting (recording) can prevent procrastination and increase the engagement of the student. Using a questionnaire relating to the latest written thesis (n = 97, academic sample), the effects of goal-setting characteristics (recording, participation) and task characteristics (ambiguity, control) on engagement and procrastination were examined. Results of a multiple mediation model indicate that recording indirectly influences engagement and procrastination through its effect on ambiguity. Moreover, participation indirectly influences engagement through its effect on control. It is concluded that goal-setting characteristics and task characteristics can affect students' procrastination. Thus, the present research provides criteria for how supervisors can prevent students from procrastinating.
77 FR 65939 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-31
... Technology Architecture (VistA) Records-VA'' (79VA19) as set forth in the Federal Register 75 FR 4454. VA is... Health Information Systems and Technology Architecture (VistA) Records-VA ROUTINE USES OF RECORDS...
Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T
2015-04-30
New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
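The core of any such benchmarking data set is a signal with known ground-truth spike times. The toy generator below superimposes a fixed spike template on Gaussian noise; it is a deliberately crude stand-in for ViSAPy's biophysical forward model, and none of the names below are ViSAPy API:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_benchmark(n_samples=20000, fs=20000.0, rate_hz=5.0, noise_sd=0.1):
    """Single-channel trace with known ground-truth spike times: a fixed
    spike template on Gaussian noise (a toy stand-in for ViSAPy's
    biophysical forward model)."""
    template = -np.exp(-((np.arange(40) - 8) ** 2) / 18.0)  # crude spike shape
    trace = rng.normal(0.0, noise_sd, n_samples)
    n_spikes = int(rate_hz * n_samples / fs)
    starts = np.sort(rng.choice(n_samples - len(template), n_spikes,
                                replace=False))
    for s in starts:
        trace[s : s + len(template)] += template
    return trace, starts  # starts are the ground-truth spike onsets

trace, gt_starts = make_benchmark()
```

A spike sorter run on `trace` can then be scored against `gt_starts`; ViSAPy's value is that it replaces the fixed template and white noise with biophysically realistic waveforms, correlated noise and network-driven activity.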
Design of sEMG assembly to detect external anal sphincter activity: a proof of concept.
Shiraz, Arsam; Leaker, Brian; Mosse, Charles Alexander; Solomon, Eskinder; Craggs, Michael; Demosthenous, Andreas
2017-10-31
Conditional trans-rectal stimulation of the pudendal nerve could provide a viable solution to treat hyperreflexive bladder in spinal cord injury. A set threshold of the amplitude estimate of the external anal sphincter surface electromyography (sEMG) may be used as the trigger signal. The efficacy of such a device should be tested in a large-scale clinical trial. As such, a probe should remain in situ for several hours while patients attend to their daily routine; the recording electrodes should be designed to be large enough to maintain good contact while observing design constraints. The objective of this study was to arrive at a design for intra-anal sEMG recording electrodes for the subsequent clinical trials while deriving the possible recording and processing parameters. With existing solutions in mind and based on theoretical and anatomical considerations, a set of four multi-electrode probes was designed and developed. These were tested in a healthy subject, and the measured sEMG traces were recorded and appropriately processed. It was shown that while comparatively large electrodes record sEMG traces that are not sufficiently correlated with the external anal sphincter contractions, smaller electrodes may not maintain a stable electrode-tissue contact. It was shown that 3 mm wide and 1 cm long electrodes with 5 mm inter-electrode spacing, in agreement with Nyquist sampling, placed 1 cm from the orifice may intra-anally record an sEMG trace sufficiently correlated with external anal sphincter activity. The outcome of this study can be used in any biofeedback, treatment or diagnostic application where the activity of the external anal sphincter sEMG should be detected for an extended period of time.
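The set-threshold trigger on an sEMG amplitude estimate can be sketched as a moving-RMS envelope detector. The window length and threshold below are illustrative, not the study's derived recording and processing parameters:

```python
import numpy as np

def semg_trigger(x, fs, win_s=0.2, thresh=0.1):
    """Return the sample index at which the moving-RMS amplitude estimate
    of an sEMG trace first exceeds a set threshold, or None."""
    w = max(1, int(win_s * fs))
    c = np.cumsum(np.insert(np.asarray(x, float) ** 2, 0, 0.0))
    rms = np.sqrt((c[w:] - c[:-w]) / w)        # RMS over trailing windows
    hits = np.flatnonzero(rms > thresh)
    return int(hits[0]) + w - 1 if hits.size else None  # index into x

# Demo: quiescent baseline, then a sustained 0.3 (arbitrary units)
# "contraction" beginning at sample 1000 (fs = 1 kHz).
x = np.zeros(2000)
x[1000:] = 0.3
trigger_at = semg_trigger(x, fs=1000)
```

In a conditional-stimulation device this kind of trigger would gate pudendal nerve stimulation; the clinical question is whether the intra-anal electrode design keeps the RMS estimate correlated with true sphincter activity.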
Robertson, Ann R R; Fernando, Bernard; Morrison, Zoe; Kalra, Dipak; Sheikh, Aziz
2015-03-27
Globally, diabetes mellitus presents a substantial and increasing burden to individuals, health care systems and society. Structuring and coding of information in the electronic health record underpin attempts to improve sharing and searching for information. Digital records for those with long-term conditions are expected to bring direct and secondary uses benefits, and potentially to support patient self-management. We sought to investigate whether, how and why records for adults with diabetes were structured and coded, and to explore a range of UK stakeholders' perceptions of current practice in the National Health Service. We carried out a qualitative, theoretically informed case study of documenting health care information for diabetes in family practice and hospital settings in England, using semi-structured interviews, observations, systems demonstrations and documentary data. We conducted 22 interviews and four on-site observations. With respect to secondary uses - research, audit, public health and service planning - interviewees clearly articulated the benefits of highly structured and coded diabetes data, and it was believed that benefits would expand through linkage to other datasets. Direct, more marginal, clinical benefits in terms of managing and monitoring diabetes and perhaps encouraging patient self-management were also reported. We observed marked differences in levels of record structuring and/or coding between family practices, where it was high, and the hospital. We found little evidence that structured and coded data were being exploited to improve information sharing between care settings. Using high levels of data structuring and coding in records for diabetes patients has the potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK. A first step would be for hospitals to attain levels of health information technology infrastructure and systems use commensurate with family practices.
Developing an integrated electronic nursing record based on standards.
van Grunsven, Arno; Bindels, Rianne; Coenen, Chel; de Bel, Ernst
2006-01-01
The Radboud University Nijmegen Medical Centre in the Netherlands is developing a multidisciplinary EHR (Electronic Health Record) based on the latest HL7 v3 (Health Level 7 version 3) D-MIM: Care Provision. As part of this process we are trying to establish which nursing diagnoses and activities are minimally required. These NMDS (Nursing Minimal Data Set) items are mapped or translated to ICF (for diagnoses) and CEN 1828 (for activities). The mappings will be the foundation for the development of user interfaces for the registration of nursing activities. A custom-made web-based configuration tool is used to exploit the possibilities of HL7 v3, enabling rapid creation of user interfaces that can accommodate the diversity of health care work processes. The first screens will be developed to support history taking for the nursing chart of the Neurology ward. The screens will contain both Dutch NMDS items and ward-specific information, configured dynamically per ward or group of wards.
A Time Series of Mean Global Sea Surface Temperature from the Along-Track Scanning Radiometers
NASA Astrophysics Data System (ADS)
Veal, Karen L.; Corlett, Gary; Remedios, John; Llewellyn-Jones, David
2010-12-01
A climate data set requires a long time series of consistently processed data with suitably long periods of overlap of different instruments, which allows characterization of any inter-instrument biases. The data obtained from ESA's three Along-Track Scanning Radiometers (ATSRs) together comprise an 18-year record of SST with overlap periods of at least 6 months. The data from all three ATSRs have been consistently processed. These factors, together with the stability of the instruments and the precision of the derived SST, make this data set eminently suitable for the construction of a time series of SST that complies with many of the GCOS requirements for a climate data set. A time series of global and regional average SST anomalies has been constructed from the ATSR version 2 data set. An analysis of the overlap periods of successive instruments was used to remove intra-series biases and align the series to a common reference. An ATSR climatology has been developed and has been used to calculate the SST anomalies. The ATSR-1 time series and the AATSR time series have been aligned to ATSR-2. The largest adjustment is ~0.2 K between ATSR-2 and AATSR, which is suspected to be due to a shift of the 12 μm filter function for AATSR. An uncertainty of 0.06 K is assigned to the relative anomaly record that is derived from the dual three-channel night-time data. A relative uncertainty of 0.07 K is assigned to the dual night-time two-channel record, except in the ATSR-1 period (1994-1996) where it is larger.
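The overlap-based alignment described above can be sketched in miniature. The numbers below are invented, and the bias estimate is reduced to a plain mean difference over the shared period; the actual ATSR procedure is more elaborate:

```python
def overlap_bias(sst_a, sst_b):
    """Mean inter-instrument SST difference over a shared overlap period."""
    diffs = [a - b for a, b in zip(sst_a, sst_b)]
    return sum(diffs) / len(diffs)

def align_to_reference(series, bias):
    """Shift a series onto the reference by removing the estimated bias."""
    return [v - bias for v in series]

# Toy monthly global means: the second instrument reads ~0.2 K warm.
ref   = [290.0, 290.1, 290.2, 290.1, 290.0, 289.9]
other = [290.2, 290.3, 290.4, 290.3, 290.2, 290.1]

bias = overlap_bias(other, ref)
aligned = align_to_reference(other, bias)
```

Once each successor instrument is aligned to its predecessor over the overlap window, anomalies can be computed against a single climatology, which is what makes the merged record usable for trend analysis.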
A Standardized Reference Data Set for Vertebrate Taxon Name Resolution
Zermoglio, Paula F.; Guralnick, Robert P.; Wieczorek, John R.
2016-01-01
Taxonomic names associated with digitized biocollections labels have flooded into repositories such as GBIF, iDigBio and VertNet. The names on these labels are often misspelled, out of date, or present other problems, as they were often captured only once during accessioning of specimens, or have a history of label changes without clear provenance. Before records are reliably usable in research, it is critical that these issues be addressed. However, still missing is an assessment of the scope of the problem, the effort needed to solve it, and a way to improve effectiveness of tools developed to aid the process. We present a carefully human-vetted analysis of 1000 verbatim scientific names taken at random from those published via the data aggregator VertNet, providing the first rigorously reviewed, reference validation data set. In addition to characterizing formatting problems, human vetting focused on detecting misspelling, synonymy, and the incorrect use of Darwin Core. Our results reveal a sobering view of the challenge ahead, as less than 47% of name strings were found to be currently valid. More optimistically, nearly 97% of name combinations could be resolved to a currently valid name, suggesting that computer-aided approaches may provide feasible means to improve digitized content. Finally, we associated names back to biocollections records and fit logistic models to test potential drivers of issues. A set of candidate variables (geographic region, year collected, higher-level clade, and the institutional digitally accessible data volume) and their 2-way interactions all predict the probability of records having taxon name issues, based on model selection approaches. We strongly encourage further experiments to use this reference data set as a means to compare automated or computer-aided taxon name tools for their ability to resolve and improve the existing wealth of legacy data. PMID:26760296
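As a rough illustration of the computer-aided resolution the authors anticipate, a naive fuzzy matcher built on Python's standard `difflib` can catch simple misspellings against a valid-name list. The names and cutoff below are invented for illustration and are far simpler than real taxonomic-name resolvers:

```python
import difflib

VALID_NAMES = ["Puma concolor", "Lynx rufus", "Ursus americanus"]  # illustrative

def resolve(name, valid=VALID_NAMES, cutoff=0.8):
    """Map a verbatim name string to the closest valid name, or None."""
    hits = difflib.get_close_matches(name, valid, n=1, cutoff=cutoff)
    return hits[0] if hits else None

resolve("Puma conclor")   # misspelling resolved -> "Puma concolor"
resolve("Felis badname")  # no plausible match -> None
```

A real pipeline would also need synonymy lookups against an authority database, which string similarity alone cannot provide.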
Surface roughness measurement in the submicrometer range using laser scattering
NASA Astrophysics Data System (ADS)
Wang, S. H.; Quan, Chenggen; Tay, C. J.; Shang, H. M.
2000-06-01
A technique for measuring surface roughness in the submicrometer range is developed. The principle of the method is based on laser scattering from a rough surface. A telecentric optical setup that uses a laser diode as a light source is used to record the light field scattered from the surface of a rough object. The light intensity distribution of the scattered band, which is correlated to the surface roughness, is recorded by a linear photodiode array and analyzed using a single-chip microcomputer. Several sets of test surfaces prepared by different machining processes are measured and a method for the evaluation of surface roughness is proposed.
A student-centred electronic health record system for clinical education.
Elliott, Kristine; Judd, Terry; McColl, Geoff
2011-01-01
Electronic Health Record (EHR) systems are an increasingly important feature of the national healthcare system [1]. However, little research has investigated the impact this will have on medical students' learning. As part of an innovative technology platform for a new masters level program in medicine, we are developing a student-centred EHR system for clinical education. A prototype was trialed with medical students over several weeks during 2010. This paper reports on the findings of the trial, which had the overall aim of assisting our understanding of how trainee doctors might use an EHR system for learning and communication in a clinical setting. In primary care and hospital settings, EHR systems offer potential benefits to medical students' learning: Longitudinal tracking of clinical progress towards established learning objectives [2]; Capacity to search across a substantial body of records [3]; Integration with online medical databases [3]; Development of expertise in creating, accessing and managing high quality EHRs [4]. While concerns have been raised that EHR systems may alter the interaction between teachers and students [3], and may negatively influence physician-patient communication [6], there is general consensus that the EHR is changing the current practice environment and teaching practice needs to respond. Final year medical students on clinical placement at a large university teaching hospital were recruited for the trial. Following a four-week period of use, semi-structured interviews were conducted with 10 participants. Audio-recorded interviews were transcribed and data analysed for emerging themes. Study participants were also surveyed about the importance of EHR systems in general, their familiarity with them, and general perceptions of sharing patient records. 
Medical students in this pilot study identified a number of educational, practical and administrative advantages that the student-centred EHR system offered over their existing ad hoc procedures for recording patient encounters. Findings from this preliminary study point to the need to introduce and instruct students on the use of EHR systems from their earliest clinical encounters, and to closely integrate learning activities based on the EHR system with established learning objectives. Further research is required to evaluate the impact of student-centred EHR systems on learning outcomes.
New control concepts for uncertain water resources systems: 1. Theory
NASA Astrophysics Data System (ADS)
Georgakakos, Aris P.; Yao, Huaming
1993-06-01
A major complicating factor in water resources systems management is handling unknown inputs. Stochastic optimization provides a sound mathematical framework but requires that enough data exist to develop statistical input representations. In cases where data records are insufficient (e.g., extreme events) or atypical of future input realizations, stochastic methods are inadequate. This article presents a control approach where input variables are only expected to belong in certain sets. The objective is to determine sets of admissible control actions guaranteeing that the system will remain within desirable bounds. The solution is based on dynamic programming and derived for the case where all sets are convex polyhedra. A companion paper (Yao and Georgakakos, this issue) addresses specific applications and problems in relation to reservoir system management.
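A one-dimensional interval analogue of this set-membership idea can be sketched as follows. The storage dynamics s' = s + u + w and the reservoir numbers are hypothetical, standing in for the paper's more general convex-polyhedra formulation:

```python
def admissible_controls(s, bounds, w_set):
    """
    One-step set-membership control for storage dynamics s' = s + u + w,
    where the inflow w is only known to lie in the interval w_set.
    Returns the interval of controls u that keeps s' inside bounds for
    EVERY admissible inflow, or None when no such guarantee exists.
    """
    lo, hi = bounds
    w_lo, w_hi = w_set
    u_max = hi - s - w_hi   # worst case: largest inflow, upper bound binds
    u_min = lo - s - w_lo   # worst case: smallest inflow, lower bound binds
    return (u_min, u_max) if u_min <= u_max else None

# Reservoir at 50 units, storage must stay in [20, 100], inflow in [0, 30]:
admissible_controls(50, (20, 100), (0, 30))   # -> (-30, 20)
```

The guarantee holds without any probability distribution on the inflow, which is the point of the approach when data records are too short to support stochastic modeling.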
Assessment of NDE reliability data
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Couchman, J. C.; Chang, F. H.; Packman, D. F.
1975-01-01
Twenty sets of relevant nondestructive test (NDT) reliability data were identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations was formulated, and a model to grade the quality and validity of the data sets was developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, were formulated for each NDE method. A comprehensive computer program was written and debugged to calculate the probability of flaw detection at several confidence limits by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. An example of the calculated reliability of crack detection in bolt holes by an automatic eddy current method is presented.
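The binomial probability-of-detection calculation mentioned above can be sketched as follows. The bisection routine and the 29-trial example are illustrative, not the report's actual program:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def pod_lower_bound(detections, trials, confidence=0.95, tol=1e-7):
    """
    One-sided lower confidence bound on the probability of detection:
    the smallest p at which observing at least `detections` successes
    is still plausible at level 1 - confidence (found by bisection).
    """
    alpha = 1 - confidence
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if 1 - binom_cdf(detections - 1, trials, mid) >= alpha:
            hi = mid    # still consistent with the data; bound lies lower
        else:
            lo = mid
    return hi

# 29 cracks found in 29 inspections: POD exceeds ~0.90 at 95% confidence.
pod_lower_bound(29, 29)
```

For the all-detections case the bound reduces to the closed form alpha**(1/n), which is a useful check on the numerical search.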
5 CFR 1632.10 - Transcripts, recordings, and minutes.
Code of Federal Regulations, 2010 CFR
2010-01-01
... maintain a complete transcript or electronic recording or transcription thereof adequate to record fully.... Transcriptions of recordings will disclose the identity of each speaker. (b) The Board will maintain either such a transcript, recording or transcription thereof, or a set of minutes that will fully and clearly...
5 CFR 1632.10 - Transcripts, recordings, and minutes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... maintain a complete transcript or electronic recording or transcription thereof adequate to record fully.... Transcriptions of recordings will disclose the identity of each speaker. (b) The Board will maintain either such a transcript, recording or transcription thereof, or a set of minutes that will fully and clearly...
Changing Requirements for Archiving Climate Data Records Derived From Remotely Sensed Data
NASA Astrophysics Data System (ADS)
Fleig, A. J.; Tilmes, C.
2007-05-01
With the arrival of long term sets of measurements of remotely sensed data it becomes important to improve the standard practices associated with archival of information needed to allow creation of climate data records, CDRs, from individual sets of measurements. Several aspects of the production of CDRs suggest that there should be changes in standard best practices for archival. A fundamental requirement for understanding long- term trends in climate data is that changes with time shown by the data reflect changes in actual geophysical parameters rather than changes in the measurement system. Even well developed and validated data sets from remotely sensed measurements contain artifacts. If the nature of the measurement and the algorithm is consistent over time, these artifacts may have little impact on trends derived from the data. However data sets derived with different algorithms created with different assumptions are likely to introduce non-physical changes in trend data. Yet technology for making measurements and analyzing data improves with time and this must be accounted for. To do this for an ongoing long term data set based on multiple instruments it is important to understand exactly how the preceding data was produced. But we are reaching the point where the scientists and engineers that developed the initial measurements and algorithms are no longer available to explain and assist in adapting today's systems for use with future measurement systems. In an era where tens to hundreds of man years are involved in calibrating an instrument and producing and validating a set of geophysical measurements from the calibrated data we have long passed the time when it was reasonable to say "just give me the basic measurement and a bright graduate student and I can produce anything I need in a year." 
Examples of problems encountered and alternative solutions will be provided based on developing and reprocessing data sets from long term measurements of atmospheric, land surface and ocean measurements covering, in one case, a series of fifteen instruments currently scheduled to continue from 1978 through several decades into the 2030s. Possible changes in approach for developers of instruments and processing algorithms, archival centers, funding organizations and the climate science community will be suggested. There is a cost in both time and money associated with most of these changes and the hope is that this presentation will prompt further discussion on what should be done.
RESEARCH RECORDS AND THE RESOLUTION OF MISCONDUCT ALLEGATIONS AT RESEARCH UNIVERSITIES
WILSON, KENNETH; SCHREIER, ALAN; GRIFFIN, ANGEL; RESNIK, DAVID
2014-01-01
Accurate record keeping is an important part of the responsible conduct of research. However, there is very little empirical research on scientific record keeping. No one knows the incidence of serious problems with research records, the types of problems that occur, nor their consequences. In this study, we examined the role of research records in the resolution of misconduct allegations as a useful barometer for the incidence and types of problems that occur with records. We interviewed Research Integrity Officers (RIOs) at 90 major research universities and conducted focus groups with active research faculty. RIOs reported problems with research records in 38% of the 553 investigations they conducted. Severe problems with research records often prevented completion of investigations while problems that are more typical lengthened them by 2 to 3 weeks. Five types of poor record keeping practices accounted for 75% of the problems with incomplete/inadequate records being the most common (30%). The focus groups concurred with the findings from the interviews with RIOs, stressed the importance of the research group leader in setting and maintaining record practices, and offered additional insights. While university officials and faculty members have suspected for many years that there are serious problems with research record keeping, our study provides empirical evidence for this belief. By documenting some of the problems with record keeping in university-based research, the results of our study provide information that will be useful for policy development at academic institutions. PMID:17847607
Field, Karl; Bailey, Michele; Foresman, Larry L; Harris, Robert L; Motzel, Sherri L; Rockar, Richard A; Ruble, Gaye; Suckow, Mark A
2007-01-01
Medical records are considered to be a key element of a program of adequate veterinary care for animals used in research, teaching, and testing. However, prior to the release of the public statement on medical records by the American College of Laboratory Animal Medicine (ACLAM), the guidance that was available on the form and content of medical records used for the research setting was not consistent and, in some cases, was considered to be too rigid. To address this concern, ACLAM convened an ad hoc Medical Records Committee and charged the Committee with the task of developing a medical record guideline that was based on both professional judgment and performance standards. The Committee provided ACLAM with a guidance document titled Public Statements: Medical Records for Animals Used in Research, Teaching, and Testing, which was approved by ACLAM in late 2004. The ACLAM public statement on medical records provides guidance on the definition and content of medical records, and clearly identifies the Attending Veterinarian as the individual who is charged with authority and responsibility for oversight of the institution's medical records program. The document offers latitude to institutions in the precise form and process used for medical records but identifies typical information to be included in such records. As a result, the ACLAM public statement on medical records provides practical yet flexible guidelines to assure that documentation of animal health is performed in research, teaching, and testing situations.
'Seed + expand': a general methodology for detecting publication oeuvres of individual researchers.
Reijnhoudt, Linda; Costas, Rodrigo; Noyons, Ed; Börner, Katy; Scharnhorst, Andrea
2014-01-01
The study of science at the individual scholar level requires the disambiguation of author names. The creation of author's publication oeuvres involves matching the list of unique author names to names used in publication databases. Despite recent progress in the development of unique author identifiers, e.g., ORCID, VIVO, or DAI, author disambiguation remains a key problem when it comes to large-scale bibliometric analysis using data from multiple databases. This study introduces and tests a new methodology called seed + expand for semi-automatic bibliographic data collection for a given set of individual authors. Specifically, we identify the oeuvre of a set of Dutch full professors during the period 1980-2011. In particular, we combine author records from a Dutch National Research Information System (NARCIS) with publication records from the Web of Science. Starting with an initial list of 8,378 names, we identify 'seed publications' for each author using five different approaches. Subsequently, we 'expand' the set of publications in three different approaches. The different approaches are compared and resulting oeuvres are evaluated on precision and recall using a 'gold standard' dataset of authors for which verified publications in the period 2001-2010 are available.
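The precision and recall evaluation against the 'gold standard' dataset amounts to simple set arithmetic over publication identifiers; the identifiers below are hypothetical:

```python
def precision_recall(retrieved, gold):
    """Precision and recall of a candidate oeuvre vs a gold-standard set."""
    retrieved, gold = set(retrieved), set(gold)
    tp = len(retrieved & gold)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical author: 4 of 5 collected papers are correct, 1 gold paper missed.
precision_recall(["p1", "p2", "p3", "p4", "p9"],
                 ["p1", "p2", "p3", "p4", "p5"])   # -> (0.8, 0.8)
```

In the seed + expand setting, high-precision seeds keep the left number high, while the expansion steps are what drive recall upward.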
ERIC Educational Resources Information Center
Berry, John N., III; Fialkoff, Francine; Fox, Bette-Lee; Hadro, Josh; Horrocks, Norman; Kuzyk, Raya; Oder, Norman
2009-01-01
Even as libraries face the economic downturn, a record-setting number of people attended the American Library Association (ALA) annual conference in Chicago, July 9-15. The tough economy, however, was felt in the number of exhibitors, which declined from the previous record set in 2007 in Washington, DC, and in anecdotal evidence that suggested…
Early Childhood Numeracy in a Multiage Setting
ERIC Educational Resources Information Center
Wood, Karen; Frid, Sandra
2005-01-01
This research is a case study examining numeracy teaching and learning practices in an early childhood multiage setting with Pre-Primary to Year 2 children. Data were collected via running records, researcher reflection notes, and video and audio recordings. Video and audio transcripts were analysed using a mathematical discourse and social…
32 CFR 286.23 - Initial determinations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... record is denied in whole or in part in accordance with procedures set forth in the FOIA. (c) Denial... a single request, which would otherwise satisfy the unusual circumstances set forth in paragraph (f... record or information (also known as “the submitter” for matters pertaining to proprietary data under 5 U...
32 CFR 286.23 - Initial determinations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... record is denied in whole or in part in accordance with procedures set forth in the FOIA. (c) Denial... a single request, which would otherwise satisfy the unusual circumstances set forth in paragraph (f... record or information (also known as “the submitter” for matters pertaining to proprietary data under 5 U...
NASA Astrophysics Data System (ADS)
Lerot, C.; Danckaert, T.; van Gent, J.; Coldewey-Egbers, M.; Loyola, D. G.; Errera, Q.; Spurr, R. J. D.; Garane, K.; Koukouli, M.; Balis, D.; Verhoelst, T.; Granville, J.; Lambert, J. C.; Van Roozendael, M.
2017-12-01
Total ozone is one of the Essential Climate Variables (ECV) operationally produced within the European Copernicus Climate Change Service (C3S), which aims at providing the geophysical information needed to monitor and study our climate system. The C3S total ozone processing chain relies on algorithmic developments realized over the last six years as part of the ESA's Ozone Climate Change Initiative (Ozone_cci) project. The C3S Climate Data Store currently contains a total ozone record based on observations from the nadir UV-Vis hyperspectral spectrometers GOME/ERS-2, SCIAMACHY/Envisat, GOME-2/Metop-A, GOME-2/Metop-B and OMI/Aura, spanning more than 23 years. Individual level-2 datasets were generated with the retrieval algorithm GODFIT (GOME-type Direct FITting). The retrievals are based on a non-linear least squares adjustment of reflectances simulated with radiative transfer tools from the LIDORT suite, to the measured spectra in the Huggins bands (325-335 nm). The inter-sensor consistency and the time stability of these data sets are significantly enhanced by the application of a soft-calibration procedure to the level-1 reflectances, in which GOME and OMI are used together as a long-term reference. Level-2 data sets are then combined to produce the level-3 GOME-type Total Ozone (GTO-ECV) record consisting of homogenized 1°x1° monthly mean grids. The merging procedure corrects for subsisting inter-satellite biases and temporal drifts. Some developments for minimizing sampling errors have also been recently investigated and will be discussed. Total ozone level-2 and level-3 data sets are regularly verified and validated by independent measurements both from space (independent algorithms and/or instruments) and ground (Brewer/Dobson/SAOZ), and their excellent quality and stability, as well as their consistency with other long-term total ozone data sets, will be illustrated here.
In the future, in addition to being continuously extended in time, the C3S total ozone record will also incorporate new sensors such as OMPS aboard Suomi NPP or TROPOMI/S5p.
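The level-3 step described above, collapsing level-2 retrievals into 1°x1° monthly mean grids, can be sketched as follows. The observation tuples and Dobson-unit values are invented, and the real GTO-ECV merging additionally corrects inter-satellite biases and drifts:

```python
from collections import defaultdict
from math import floor

def grid_monthly_mean(observations):
    """
    Collapse level-2 observations into level-3 monthly means on a 1°x1° grid.
    Each observation is (year, month, lat, lon, value); grid cells are keyed
    by (year, month, floor(lat), floor(lon)).
    """
    sums = defaultdict(lambda: [0.0, 0])
    for year, month, lat, lon, value in observations:
        cell = (year, month, floor(lat), floor(lon))
        sums[cell][0] += value
        sums[cell][1] += 1
    return {cell: total / n for cell, (total, n) in sums.items()}

obs = [
    (2005, 7, 50.2, 4.3, 310.0),   # Dobson units, invented values
    (2005, 7, 50.8, 4.9, 320.0),   # falls in the same 1-degree cell
    (2005, 7, 51.1, 4.3, 300.0),   # neighbouring cell
]
grid_monthly_mean(obs)[(2005, 7, 50, 4)]   # -> 315.0
```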
Storage-based Intrusion Detection: Watching storage activity for suspicious behavior
2002-10-01
password management involves a pair of inter-related files (/etc/passwd and /etc/shadow). The corresponding access patterns seen at the storage...example, consider a UNIX system password file (/etc/passwd), which consists of a set of well-defined records. Records are delimited by a line-break, and...etc/passwd and verify that they conform to a set of basic integrity rules: 7-field records, non-empty password field, legal default shell, legal home
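The integrity rules quoted in this snippet translate directly into a small checker. The shell allow-list below is illustrative, and the "legal home" rule is omitted for brevity:

```python
LEGAL_SHELLS = {"/bin/sh", "/bin/bash", "/sbin/nologin"}  # illustrative allow-list

def check_passwd_record(line):
    """
    Check one /etc/passwd line against basic integrity rules: exactly
    7 colon-separated fields, a non-empty password field, and a shell
    from a known allow-list. Returns a list of violations (empty = ok).
    """
    fields = line.rstrip("\n").split(":")
    if len(fields) != 7:
        return ["expected 7 fields, got %d" % len(fields)]
    name, passwd, uid, gid, gecos, home, shell = fields
    problems = []
    if not passwd:
        problems.append("empty password field")
    if shell not in LEGAL_SHELLS:
        problems.append("illegal shell: %s" % shell)
    return problems

check_passwd_record("root:x:0:0:root:/root:/bin/bash")       # -> []
check_passwd_record("eve::0:0:eve:/home/eve:/tmp/backdoor")  # -> 2 violations
```

A storage-side monitor applying rules like these can flag tampering even when the host's own kernel has been compromised, which is the paper's central argument.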
Tu, Karen; Bevan, Lindsay; Hunter, Katie; Rogers, Jess; Young, Jacqueline; Nesrallah, Gihad
2017-01-01
The detection and management of chronic kidney disease lies within primary care; however, performance measures applicable in the Canadian context are lacking. We sought to develop a set of primary care quality indicators for chronic kidney disease in the Canadian setting and to assess the current state of the disease's detection and management in primary care. We used a modified Delphi panel approach, involving 20 panel members from across Canada (10 family physicians, 7 nephrologists, 1 patient, 1 primary care nurse and 1 pharmacist). Indicators identified from peer-reviewed and grey literature sources were subjected to 3 rounds of voting to develop a set of quality indicators for the detection and management of chronic kidney disease in the primary care setting. The final indicators were applied to primary care electronic medical records in the Electronic Medical Record Administrative data Linked Database (EMRALD) to assess the current state of primary care detection and management of chronic kidney disease in Ontario. Seventeen indicators made up the final list, with 1 under the category Prevalence, Incidence and Mortality; 4 under Screening, Diagnosis and Risk Factors; 11 under Management; and 1 under Referral to a Specialist. In a sample of 139 993 adult patients not on dialysis, 6848 (4.9%) had stage 3 or higher chronic kidney disease, with the average age of patients being 76.1 years (standard deviation [SD] 11.0); 62.9% of patients were female. Diagnosis and screening for chronic kidney disease were poorly performed. Only 27.1% of patients with stage 3 or higher disease had their diagnosis documented in their cumulative patient profile. Albumin-creatinine ratio testing was only performed for 16.3% of patients with a low estimated glomerular filtration rate (eGFR) and for 28.5% of patients with risk factors for chronic kidney disease. 
Family physicians performed relatively better with the management of chronic kidney disease, with 90.4% of patients with stage 3 or higher disease having an eGFR performed in the previous 18 months and 83.1% having a blood pressure recorded in the previous 9 months. We propose a set of measurable indicators to evaluate the quality of the management of chronic kidney disease in primary care. These indicators may be used to identify opportunities to improve current practice in Canada.
Low-cost synchronization of high-speed audio and video recordings in bio-acoustic experiments.
Laurijssen, Dennis; Verreycken, Erik; Geipel, Inga; Daems, Walter; Peremans, Herbert; Steckel, Jan
2018-02-27
In this paper, we present a method for synchronizing high-speed audio and video recordings of bio-acoustic experiments. By embedding a random signal into the recorded video and audio data, robust synchronization of a diverse set of sensor streams can be performed without the need to keep detailed records. The synchronization can be performed using recording devices without dedicated synchronization inputs. We demonstrate the efficacy of the approach in two sets of experiments: behavioral experiments on different species of echolocating bats and the recordings of field crickets. We present the general operating principle of the synchronization method, discuss its synchronization strength and provide insights into how to construct such a device using off-the-shelf components. © 2018. Published by The Company of Biologists Ltd.
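The embedded-random-signal idea can be demonstrated with a plain cross-correlation search for the lag that best aligns two streams. The marker length and 40-sample offset below are invented, and a real deployment would correlate against the known embedded signal rather than a zero-padded copy:

```python
import random

random.seed(0)

def estimate_offset(ref, delayed, max_lag=100):
    """Return the lag (in samples) that maximizes the cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        score = sum(r * delayed[i + lag]
                    for i, r in enumerate(ref)
                    if i + lag < len(delayed))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A shared random marker signal, embedded in the second stream 40 samples late.
marker = [random.gauss(0, 1) for _ in range(500)]
stream_ref = marker + [0.0] * 100
stream_delayed = [0.0] * 40 + marker + [0.0] * 60

estimate_offset(stream_ref, stream_delayed)   # -> 40
```

Because the marker is random, its autocorrelation is sharply peaked at zero lag, which is what makes the alignment robust across heterogeneous recording devices.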
Cold Regions Test of Tracked and Wheeled Vehicles
2015-12-11
with CTIS setting in the Highway setting and Mud, Sand and Snow setting. (7) Conduct the trials a minimum of three times at each speed as stated in...lock brake system. Record the stopping distance data and record any slew from the centerline. Document if the vehicle experiences engine stall...while operating in snow. The TOP includes guidance for snow as well as mud, sand, swamps, and wet clay. Most conventional wheeled vehicles cannot
Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2012-06-01
Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
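The abstract does not spell out how the parallel engine outputs were combined, but an agreement-gated rule like the following would produce exactly the observed trade-off of high positive predictive value against low sensitivity. This is a speculative sketch, not the authors' pipeline:

```python
def combine_engines(nuance_text, leadtools_text):
    """
    Agreement-gated combination of two OCR engine outputs running in
    parallel: accept a field value only when both engines agree, and
    reject it otherwise. Many fields are rejected, which depresses
    sensitivity, but accepted values are rarely wrong, which keeps
    positive predictive value high.
    """
    if nuance_text is not None and nuance_text == leadtools_text:
        return nuance_text
    return None  # disagreement: fall back to manual chart abstraction

combine_engines("20/40", "20/40")   # -> "20/40"
combine_engines("20/40", "20/46")   # -> None
```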
Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2011-01-01
Background Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. Methods We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. Observations The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. Discussion While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline. PMID:21890871
Mack, Heather G; Meng, Ngy; Parsons, Tanya; Schlenther, Gerhard; Murray, Neil; Hart, Richard
2017-08-01
To design and implement a continuing professional development (CPD) program for Cambodian ophthalmologists. Partnering (twinning) between the Royal Australian and New Zealand College of Ophthalmologists (RANZCO) and the Cambodian Ophthalmological Society (COS). Practicing ophthalmologists in Cambodia. A conjoint committee comprising 4 ophthalmologists from RANZCO and 3 ophthalmologists from COS was established, supported by a RANZCO administrative team experienced in CPD administration. CPD requirements and recording were adapted from the RANZCO CPD framework. Cambodian ophthalmologists were surveyed during program implementation and after handover to COS. At the end of the 3-year program, at handover to COS, a CPD program and online recording system were established. All 47 (100%) practicing ophthalmologists in Cambodia were registered for CPD, and 21/47 (45%) were actively participating in the COS CPD program's online recording system. Surveys of attitudes toward CPD demonstrated no significant change. Partnering was moderately effective in establishing a CPD program for Cambodian ophthalmologists. Uptake of CPD may have been limited by the lack of a requirement for CPD for continuing medical licensure in Cambodia. Follow-up will be necessary to demonstrate CPD program longevity. Copyright © 2017 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
McQuade, Kevin; Price, Robert; Liu, Nelson; Ciol, Marcia A
2012-08-30
Examination of articular joints is largely based on subjective assessment of the "end-feel" of the joint in response to manually applied forces at different joint orientations. This technical report aims to describe the development of an objective method to examine joints in general, with specific application to the shoulder, and suitable for clinical use. We adapted existing hardware and developed laptop-based software to objectively record the force/displacement behavior of the glenohumeral joint during three common manual joint examination tests with the arm in six positions. An electromagnetic tracking system recorded three-dimensional positions of sensors attached to a clinician examiner and a patient. A hand-held force transducer recorded manually applied translational forces. The force and joint displacement were time-synchronized and the joint stiffness was calculated as a quantitative representation of the joint "end-feel." A methodology and specific system checks were developed to enhance clinical testing reproducibility and precision. The device and testing protocol were tested on 31 subjects (15 with healthy shoulders, and 16 with a variety of shoulder impairments). Results describe the stiffness responses, and demonstrate the feasibility of using the device and methods in clinical settings.
Whitt, Karen J; Eden, Lacey; Merrill, Katreena Collette; Hughes, Mckenna
2017-01-01
Previous research has linked improper electronic health record configuration and use with adverse patient events. In response to this problem, the US Office of the National Coordinator for Health Information Technology developed the Safety and Assurance Factors for EHR Resilience guides to evaluate electronic health records for optimal use and safety features. During the course of their education, nursing students are exposed to a variety of clinical practice settings and electronic health records. This descriptive study evaluated 108 undergraduate and 51 graduate nursing students' ratings of electronic health record features and safe practices, as well as what they learned from utilizing the computerized provider order entry and clinician communication Safety and Assurance Factors for EHR Resilience guide checklists. More than 80% of the undergraduate and 70% of the graduate students reported that they experienced user problems with electronic health records in the past. More than 50% of the students felt that electronic health records contribute to adverse patient outcomes. Students reported that many of the features assessed were not fully implemented in their electronic health record. These findings highlight areas where electronic health records can be improved to optimize patient safety. The majority of students reported that utilizing the Safety and Assurance Factors for EHR Resilience guides increased their understanding of electronic health record features.
Bokov, Plamen; Mahut, Bruno; Flaud, Patrice; Delclaux, Christophe
2016-03-01
Respiratory diseases in children are a common reason for physician visits. A diagnostic difficulty arises when parents hear wheezing that is no longer present during the medical consultation. Thus, an outpatient objective tool for recognition of wheezing is of clinical value. We developed a wheezing recognition algorithm from respiratory sounds recorded with a smartphone placed near the mouth. A total of 186 recordings were obtained in a pediatric emergency department, mostly in toddlers (mean age 20 months). After exclusion of recordings with artefacts and those with a single clinical operator auscultation, 95 recordings with the agreement of two operators on auscultation diagnosis (27 with wheezing and 68 without) were subjected to a two-phase algorithm (signal analysis and pattern classification using machine learning algorithms) to classify records. The best performance (71.4% sensitivity and 88.9% specificity) was observed with a Support Vector Machine-based algorithm. We further tested the algorithm on a set of 39 recordings having a single operator and found fair agreement (kappa=0.28, 95% CI [0.12, 0.45]) between the algorithm and the operator. The main advantage of such an algorithm is its use with contact-free sound recording, which is valuable in the pediatric population. Copyright © 2016 Elsevier Ltd. All rights reserved.
Implications of the law on video recording in clinical practice.
Henken, Kirsten R; Jansen, Frank Willem; Klein, Jan; Stassen, Laurents P S; Dankelman, Jenny; van den Dobbelsteen, John J
2012-10-01
Technological developments allow for a variety of applications of video recording in health care, including endoscopic procedures. Although the value of video registration is recognized, medicolegal concerns regarding the privacy of patients and professionals are growing. A clear understanding of the legal framework is lacking. Therefore, this research aims to provide insight into the juridical position of patients and professionals regarding video recording in health care practice. Jurisprudence was searched to exemplify legislation on video recording in health care. In addition, legislation was translated for different applications of video in health care found in the literature. Three principles in Western law are relevant for video recording in health care practice: (1) regulations on privacy regarding personal data, which apply to the gathering and processing of video data in health care settings; (2) the patient record, in which video data can be stored; and (3) professional secrecy, which protects the privacy of patients including video data. Practical implementation of these principles in video recording in health care does not exist. Practical regulations on video recording in health care for different specifically defined purposes are needed. Innovations in video capture technology that enable video data to be made anonymous automatically can contribute to protection for the privacy of all the people involved.
Probabilistic Common Spatial Patterns for Multichannel EEG Analysis
Chen, Zhe; Gao, Xiaorong; Li, Yuanqing; Brown, Emery N.; Gao, Shangkai
2015-01-01
Common spatial patterns (CSP) is a well-known spatial filtering algorithm for multichannel electroencephalogram (EEG) analysis. In this paper, we cast the CSP algorithm in a probabilistic modeling setting. Specifically, probabilistic CSP (P-CSP) is proposed as a generic EEG spatio-temporal modeling framework that subsumes the CSP and regularized CSP algorithms. The proposed framework enables us to resolve the overfitting issue of CSP in a principled manner. We derive statistical inference algorithms that can alleviate the issue of local optima. In particular, an efficient algorithm based on eigendecomposition is developed for maximum a posteriori (MAP) estimation in the case of isotropic noise. For more general cases, a variational algorithm is developed for group-wise sparse Bayesian learning for the P-CSP model and for automatically determining the model size. The two proposed algorithms are validated on a simulated data set. Their practical efficacy is also demonstrated by successful applications to single-trial classifications of three motor imagery EEG data sets and by the spatio-temporal pattern analysis of one EEG data set recorded in a Stroop color naming task. PMID:26005228
Marques, Paulo A M; Magalhães, Daniel M; Pereira, Susana F; Jorge, Paulo E
2014-01-01
The preservation of historical and contemporary data safeguards our scientific legacy. Bioacoustic recordings can have historical as well as scientific value and should be assessed for their conservation requirements. Unpreserved bioacoustic recordings are generally not referenced and are frequently at high risk of loss through material degradation and/or misplacement. In this study we investigated the preservation status of sets of natural sound recordings made in Portugal from 1983 until 2010 inclusive. We evaluated the recordings on the basis of their rate of loss, the degree to which unpreserved recordings could be preserved, and their risk of loss. Recordists of animal sounds were surveyed (by questionnaire or interview) to identify sets of recordings and to collect information on their quality and state of preservation. Of the 78 recordists identified, we found that 32% of the recordings have an unclear status and that only 9% of the recordings are lost. Of the c. 6 terabytes of unpreserved sound recordings discovered, an estimated 49% were recoverable. Moreover, 95% of the recoverable sets of recordings were at high risk of loss through misplacement. These risks can be minimized if recordists are persuaded to deposit their material in an institution committed to long-term curation of such data (e.g. sound archives). Overall, the study identified a considerable body of unpreserved animal sound recordings that could contribute to our scientific heritage and knowledge of the biodiversity found in Portugal. It highlights the need to implement effective policies to promote the deposit of recordings for preservation and to reverse the present scenario so that scientific material can be preserved for future generations.
Patient empowerment: a systematic review of questionnaires measuring empowerment in cancer patients.
Eskildsen, Nanna Bjerg; Joergensen, Clara Ruebner; Thomsen, Thora Grothe; Ross, Lone; Dietz, Susanne Malchau; Groenvold, Mogens; Johnsen, Anna Thit
2017-02-01
There is an increased attention to and demand for patient empowerment in cancer treatment and follow-up programs. Patient empowerment has been defined as feeling in control of or having mastery in relation to cancer and cancer care. This calls for properly developed questionnaires assessing empowerment from the user perspective. The aim of this review was to identify questionnaires and subscales measuring empowerment and manifestations of empowerment among cancer patients. We conducted a systematic search of the PubMed, PsycINFO and CINAHL databases. Empowerment and multiple search terms associated with empowerment were included. We included peer-reviewed articles published in English, which described questionnaires measuring empowerment or manifestations of empowerment in a cancer setting. In addition, the questionnaire had to be a patient-reported outcome measure for adult cancer patients. Database searches identified 831 records. Title and abstract screening resulted in 482 records being excluded. The remaining 349 full text articles were retrieved and assessed for eligibility. This led to the inclusion of 33 individual instruments measuring empowerment and manifestations of empowerment. Of these, only four were specifically developed to measure empowerment, and two were originally developed for the cancer setting, whereas the remaining two were developed elsewhere, but adapted to the cancer setting. The other 29 questionnaires were not intended to measure the concept of empowerment, but focused on patient-centered care, patient competence, self-efficacy, etc. However, they were included because part of the instrument (at least five items) was considered to measure empowerment or manifestations of empowerment. Our study provides an overview of the available questionnaires, which can be used by researchers and practitioners who wish to measure the concept of empowerment among cancer patients. 
Very few questionnaires were explicitly developed to explore empowerment, and the review brings to light a significant lack of questionnaires that measure patient empowerment comprehensively.
Rabatin, Joseph S; Lipkin, Mack; Rubin, Alan S; Schachter, Allison; Nathan, Michael; Kalet, Adina
2004-05-01
We describe a specific mentoring approach in an academic general internal medicine setting, documented by audiotaping and transcribing all mentoring sessions over one year. In advance, the mentor recorded his model. During the year, the mentee kept a process journal. Qualitative analysis revealed the development of an intimate relationship based on empathy, trust, and honesty. The mentor's model was explicitly intended to develop independence, initiative, improved thinking, skills, and self-reflection. The mentor's methods included extensive and varied use of questioning, active listening, standard setting, and frequent feedback. During the mentoring, the mentee evolved as a teacher, enhanced the creativity in his teaching, and matured as a person. Specific accomplishments included a national workshop on professional writing, an innovative approach to inpatient attending, a new teaching skills curriculum for a residency program, and this study. A mentoring model stressing safety, intimacy, honesty, setting of high standards, praxis, and detailed planning and feedback was associated with mentee excitement, personal and professional growth and development, concrete accomplishments, and a commitment to teaching.
An inventory of undiscovered Canadian mineral resources
NASA Technical Reports Server (NTRS)
Labovitz, M. L.; Griffiths, J. C.
1982-01-01
Unit regional value (URV) and unit regional weight are area standardized measures of the expected value and quantity, respectively, of the mineral resources of a region. Estimation and manipulation of the URV statistic is the basis of an approach to mineral resource evaluation. Estimates of the kind and value of exploitable mineral resources yet to be discovered in the provinces of Canada are used as an illustration of the procedure. The URV statistic is set within a previously developed model wherein geology, as measured by point counting geologic maps, is related to the historical record of mineral resource production of well-developed regions of the world, such as the 50 states of the U.S.A.; these may be considered the training set. The Canadian provinces are related to this training set using geological information obtained in the same way from geologic maps of the provinces. The desired predictions of yet to be discovered mineral resources in the Canadian provinces arise as a consequence. The implicit assumption is that regions of similar geology, if equally well developed, will produce similar weights and values of mineral resources.
A Prototype of Mathematical Treatment of Pen Pressure Data for Signature Verification.
Li, Chi-Keung; Wong, Siu-Kay; Chim, Lai-Chu Joyce
2018-01-01
A prototype using simple mathematical treatment of the pen pressure data recorded by a digital pen movement recording device was derived. In this study, a total of 48 sets of signature and initial specimens were collected. Pearson's correlation coefficient was used to compare the data of the pen pressure patterns. From the 820 pair comparisons of the 48 sets of genuine signatures, a high degree of matching was found in which 95.4% (782 pairs) and 80% (656 pairs) had rPA > 0.7 and rPA > 0.8, respectively. In the comparison of the 23 forged signatures with their corresponding control signatures, 20 of them (89.2% of pairs) had rPA values < 0.6, showing a lower degree of matching when compared with the results of the genuine signatures. The prototype could be used as a complementary technique to improve the objectivity of signature examination and also has a good potential to be developed as a tool for automated signature identification. © 2017 American Academy of Forensic Sciences.
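The r_PA statistic described above reduces to Pearson's correlation between two pen-pressure traces. A minimal sketch (not the authors' prototype) with invented sample data, showing how a genuine pair exceeds the study's r_PA > 0.8 match level while a forgery falls below 0.6:

```python
# Sketch of comparing two pen-pressure traces with Pearson's correlation
# coefficient (r_PA in the study). Pressure samples below are invented.
import math

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

genuine = [0.2, 0.5, 0.9, 0.7, 0.3]      # hypothetical pressure samples
repeat  = [0.25, 0.45, 0.85, 0.75, 0.30]  # same signer, second signature
forged  = [0.9, 0.2, 0.3, 0.2, 0.8]       # different pressure dynamics

print(pearson_r(genuine, repeat) > 0.8)  # True: genuine pair matches
print(pearson_r(genuine, forged) < 0.6)  # True: forgery scores low
```

Real systems would first resample and align the traces in time; this sketch assumes pre-aligned sequences.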
2006-01-12
KENNEDY SPACE CENTER, FLA. - After the landing of the Virgin Atlantic Airways GlobalFlyer aircraft at NASA Kennedy Space Center’s Shuttle Landing Facility, Center Director James Kennedy (center, in front of the plane) addresses the media. At right is the pilot, Steve Fossett. At left are Jim Ball, KSC Spaceport Development manager, and Winston Scott, executive director of Florida Space Authority. The aircraft is being relocated from Salina, Kan., to the Shuttle Landing Facility to begin preparations for an attempt to set a new world record for the longest flight made by any aircraft. An exact takeoff date for the record-setting flight has not been determined and is contingent on weather and jet-stream conditions. The window for the attempt opens in mid-January, making the flight possible anytime between then and the end of February. NASA agreed to let Virgin Atlantic Airways use Kennedy's Shuttle Landing Facility as a takeoff site. The facility use is part of a pilot program to expand runway access for non-NASA activities.
Kim, Jihoon; Grillo, Janice M; Boxwala, Aziz A; Jiang, Xiaoqian; Mandelbaum, Rose B; Patel, Bhakti A; Mikels, Debra; Vinterbo, Staal A; Ohno-Machado, Lucila
2011-01-01
Our objective is to facilitate semi-automated detection of suspicious access to EHRs. Previously we have shown that a machine learning method can play a role in identifying potentially inappropriate access to EHRs. However, the problem of sampling informative instances to build a classifier still remained. We developed an integrated filtering method leveraging both anomaly detection based on symbolic clustering and signature detection, a rule-based technique. We applied the integrated filtering to 25.5 million access records in an intervention arm, and compared this with 8.6 million access records in a control arm where no filtering was applied. On the training set with cross-validation, the AUC was 0.960 in the control arm and 0.998 in the intervention arm. The difference in false negative rates on the independent test set was significant, P=1.6×10−6. Our study suggests that utilization of integrated filtering strategies to facilitate the construction of classifiers can be helpful. PMID:22195129
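The "integrated filtering" idea above combines rule-based signature detection with anomaly detection. A rough sketch, with invented rules and thresholds rather than those of the study:

```python
# Hedged sketch: flag an EHR access record either by a matching signature
# rule or by an anomalous access volume relative to the user's history.
# All rules, fields, and the z-score threshold are illustrative.
from statistics import mean, stdev

SIGNATURE_RULES = [
    lambda rec: rec["role"] == "billing" and rec["dept"] == "psychiatry",
    lambda rec: rec["patient_is_employee"] and rec["role"] != "hr",
]

def anomaly_score(rec, history_counts):
    """Z-score of today's access count versus the user's own history."""
    mu, sd = mean(history_counts), stdev(history_counts)
    return 0.0 if sd == 0 else (rec["accesses_today"] - mu) / sd

def is_suspicious(rec, history_counts, z_threshold=3.0):
    if any(rule(rec) for rule in SIGNATURE_RULES):          # signature detection
        return True
    return anomaly_score(rec, history_counts) > z_threshold  # anomaly detection

rec = {"role": "nurse", "dept": "oncology",
       "patient_is_employee": False, "accesses_today": 90}
print(is_suspicious(rec, [20, 22, 19, 21, 23]))  # large spike -> True
```

Records passing either filter would be sampled as informative instances for classifier training; the rest are discarded.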
On the predictive ability of mechanistic models for the Haitian cholera epidemic.
Mari, Lorenzo; Bertuzzo, Enrico; Finger, Flavio; Casagrandi, Renato; Gatto, Marino; Rinaldo, Andrea
2015-03-06
Predictive models of epidemic cholera need to resolve at suitable aggregation levels spatial data pertaining to local communities, epidemiological records, hydrologic drivers, waterways, patterns of human mobility and proxies of exposure rates. We address the above issue in a formal model comparison framework and provide a quantitative assessment of the explanatory and predictive abilities of various model settings with different spatial aggregation levels and coupling mechanisms. Reference is made to records of the recent Haiti cholera epidemics. Our intensive computations and objective model comparisons show that spatially explicit models accounting for spatial connections have better explanatory power than spatially disconnected ones for short-to-intermediate calibration windows, while parsimonious, spatially disconnected models perform better with long training sets. On average, spatially connected models show better predictive ability than disconnected ones. We suggest limits and validity of the various approaches and discuss the pathway towards the development of case-specific predictive tools in the context of emergency management. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Charng, Jason; He, Zheng; Bui, Bang; Vingrys, Algis; Ivarsson, Magnus; Fish, Rebecca; Gurrell, Rachel; Nguyen, Christine
2016-06-29
The full-field electroretinogram (ERG) and visual evoked potential (VEP) are useful tools to assess retinal and visual pathway integrity in both laboratory and clinical settings. Currently, preclinical ERG and VEP measurements are performed with anesthesia to ensure stable electrode placements. However, the very presence of anesthesia has been shown to contaminate normal physiological responses. To overcome these anesthesia confounds, we develop a novel platform to assay ERG and VEP in conscious rats. Electrodes are surgically implanted sub-conjunctivally on the eye to assay the ERG and epidurally over the visual cortex to measure the VEP. A range of amplitude and sensitivity/timing parameters are assayed for both the ERG and VEP at increasing luminous energies. The ERG and VEP signals are shown to be stable and repeatable for at least 4 weeks post surgical implantation. This ability to record ERG and VEP signals without anesthesia confounds in the preclinical setting should provide superior translation to clinical data.
Furmaga, Jakub; Wax, Paul; Kleinschmidt, Kurt
2015-09-01
Intravenous N-acetylcysteine (NAC) causes few adverse drug events, with mild anaphylactoid reactions being the most common. Hyponatremia as a complication of hypoosmolar NAC solution has been reported. We describe how a locally constructed electronic medical record (EMR) order set for IV NAC resulted in a seizure from hyponatremia due to excess free water administration. A 13-month-old female with no past medical history presented to a hospital after ingesting an unknown number of acetaminophen 500 mg tablets. The 4-h acetaminophen concentration was 343 mcg/mL, and she was started on IV NAC. At 8.2 h into her 21-h IV NAC protocol, she developed a tonic-clonic seizure. Repeat serum sodium was 124 mEq/L, a decrease from 142 mEq/L at the time of admission. She was treated with hypertonic saline, lorazepam, and levetiracetam and had no further seizures. A brain MRI and EEG were both normal. After the patient was stabilized, the providers noticed that she had received a total of 900 mL of D5W (112.5 mL/kg) in the first 9 h of hospitalization. This was caused by a poorly constructed, restrictive EMR order set that did not allow customization of the IV NAC preparation. Because the 21-h IV NAC administration involves preparation of 3 different doses infused over 3 different time intervals, an order set was developed to reduce ordering errors. However, an error in its construction caused the pharmacist to prepare a solution containing too much free water, decreasing the patient's intravascular sodium and resulting in a seizure. The purposes of our case report are to highlight the dangers of overreliance on EMR order sets and to recognize hyponatremic seizures as an adverse reaction to an inappropriately prepared IV NAC solution.
da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V
2013-01-01
The ability to apply standard and interoperable solutions for implementing and managing medical registries, as well as to aggregate, reproduce, and access data sets from legacy formats and platforms to advanced standard formats and operating systems, is crucial for both clinical healthcare and biomedical research settings. Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. We developed a device registry framework involving the following steps: (1) Data standards definition and representation of the research workflow, (2) Development of electronic case report forms using REDCap (Research Electronic Data Capture), (3) Data collection according to the clinical research workflow, (4) Data augmentation by enriching the registry database with local electronic health records, governmental databases, and linked open data collections, (5) Data quality control, and (6) Data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology/American Heart Association Clinical Data Standards, as well as variables derived from cardiac device randomized trials and the Clinical Data Interchange Standards Consortium. Local interoperability was achieved between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursed values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT), we found 130 clinical trials that are potentially correlated with our pacemaker registry. This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework.
Such an approach has the potential to facilitate data integration between healthcare and research settings and to serve as a useful framework for other biomedical registries.
47 CFR 0.455 - Other locations at which records may be inspected.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ORGANIZATION General Information Public Information and Inspection of Records § 0.455 Other locations at which records may be inspected. Except as provided in §§ 0.453, 0.457, and 0.459, records are routinely... records may be inspected. Examples of the records available from Bureaus and Offices are set forth in...
Archer, Melissa; Proulx, Joshua; Shane-McWhorter, Laura; Bray, Bruce E; Zeng-Treitler, Qing
2014-01-01
While potential medication-to-medication interaction alerting engines exist in many clinical applications, few systems exist to automatically alert on potential medication-to-herbal-supplement interactions. We have developed a preliminary knowledge base and rules alerting engine that detects 259 potential interactions between 9 supplements, 62 cardiac medications, and 19 drug classes. The rules engine takes into consideration 12 patient risk factors and 30 interaction warning signs to help determine which of three different alert levels to assign to each potential interaction. A formative evaluation was conducted with two clinicians to set initial thresholds for each alert level. Additional work is planned to add more supplement interactions, risk factors, and warning signs, as well as to continue to set and adjust the inputs and thresholds for each potential interaction.
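A rules engine of the kind described, grading an interaction into one of three alert levels from risk factors and warning signs, might be sketched as follows. The scoring weights, thresholds, and the example pairing are invented for illustration, not drawn from the cited knowledge base:

```python
# Illustrative three-level alert grading for a potential drug-supplement
# interaction. Severity/weighting scheme is an assumption, not the study's.
from enum import Enum

class Alert(Enum):
    INFO = 1      # informational notice
    CAUTION = 2   # review recommended
    DANGER = 3    # interruptive alert

def grade_interaction(base_severity: int,
                      risk_factors: set[str],
                      warning_signs: set[str]) -> Alert:
    """Escalate a base severity score by matched risks and warning signs."""
    score = base_severity + len(risk_factors) + 2 * len(warning_signs)
    if score >= 6:
        return Alert.DANGER
    if score >= 3:
        return Alert.CAUTION
    return Alert.INFO

# e.g. a supplement-drug pair in a patient on an anticoagulant who
# reports easy bruising: score = 2 + 1 + 2*1 = 5
level = grade_interaction(base_severity=2,
                          risk_factors={"anticoagulant_use"},
                          warning_signs={"easy_bruising"})
print(level.name)  # prints "CAUTION"
```

Clinician-set thresholds (as in the formative evaluation) would replace the hard-coded cutoffs of 3 and 6.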
Testing Electronic Algorithms to Create Disease Registries in a Safety Net System
Hanratty, Rebecca; Estacio, Raymond O.; Dickinson, L. Miriam; Chandramouli, Vijayalaxmi; Steiner, John F.; Havranek, Edward P.
2008-01-01
Electronic disease registries are a critical feature of the chronic disease management programs that are used to improve the care of individuals with chronic illnesses. These registries have been developed primarily in managed care settings; use in safety net institutions—organizations whose mission is to serve the uninsured and underserved—has not been described. We sought to assess the feasibility of developing disease registries from electronic data in a safety net institution, focusing on hypertension because of its importance in minority populations. We compared diagnoses obtained from algorithms utilizing electronic data, including laboratory and pharmacy records, against diagnoses derived from chart review. We found good concordance between diagnoses identified from electronic data and those identified by chart review, suggesting that registries of patients with chronic diseases can be developed outside the setting of closed panel managed care organizations. PMID:18469416
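An electronic case-finding algorithm of the kind validated above can be sketched as a disjunction of criteria over diagnosis, laboratory/vital-sign, and pharmacy data. The codes, drug names, and blood-pressure cutoffs below are illustrative assumptions:

```python
# Hedged sketch: flag a patient for a hypertension registry if any of
# three electronic criteria hold. Code lists and cutoffs are illustrative.
HTN_ICD = {"401.0", "401.1", "401.9"}  # legacy ICD-9 essential hypertension
ANTIHYPERTENSIVES = {"lisinopril", "amlodipine", "hydrochlorothiazide"}

def in_htn_registry(diagnoses, bp_readings, pharmacy_fills):
    """diagnoses: ICD codes; bp_readings: (systolic, diastolic) pairs;
    pharmacy_fills: dispensed drug names."""
    if HTN_ICD & set(diagnoses):                       # coded diagnosis
        return True
    elevated = [1 for s, d in bp_readings if s >= 140 or d >= 90]
    if len(elevated) >= 2:                             # repeated elevated BP
        return True
    return bool(ANTIHYPERTENSIVES & {f.lower() for f in pharmacy_fills})

print(in_htn_registry([], [(150, 95), (144, 88)], []))  # True via BP criterion
```

Concordance of such flags against chart review is what the study measured.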
Smith, Lorraine; Alles, Chehani; Lemay, Kate; Reddel, Helen; Saini, Bandana; Bosnic-Anticevich, Sinthia; Emmerton, Lynne; Stewart, Kay; Burton, Debbie; Krass, Ines; Armour, Carol
2013-01-01
Goal setting was investigated as part of an implementation trial of an asthma management service (PAMS) conducted in 96 Australian community pharmacies. Patients and pharmacists identified asthma-related issues of concern to the patient and collaboratively set goals to address these. Although goal setting is commonly integrated into disease state management interventions, the nature of goals and their contribution to goal attainment and health outcomes are not well understood. The aims were to: 1) identify and describe goals set collaboratively between adult patients with asthma and their pharmacist, 2) describe goal specificity and goal achievement, and 3) describe the relationships between specificity, achievement, asthma control, and asthma-related quality of life. Measures of goal specificity and goal achievement were developed and applied to patient data records. Goals set were thematically analyzed into goal domains. Proportions of goals set, goals achieved, and their specificity were calculated. Correlational and regression analyses were undertaken to determine the relationships between goal specificity, goal achievement, asthma control, and asthma-related quality of life. Data were drawn from 498 patient records. Findings showed that patients set a wide range and number of asthma-related goals (N = 1787) and that the majority (93%) were either achieved or being worked toward by the end of the study. Goal achievement was positively associated with specific and moderately specific goals, but not non-specific goals. However, on closer inspection, an inconsistent pattern of relationships emerged as a function of goal domain. Findings also showed that goal setting was associated with end-of-study asthma control but not with asthma-related quality of life. Pharmacists can help patients to set achievable and specific asthma management goals, and these have the potential to directly impact health outcomes such as asthma control.
Goal specificity appears to be an important feature in the achievement of goals, but other factors may also play a role. Copyright © 2013 Elsevier Inc. All rights reserved.
Global Trends in Chlorophyll Concentration Observed with the Satellite Ocean Colour Data Record
NASA Astrophysics Data System (ADS)
Melin, F.; Vantrepotte, V.; Chuprin, A.; Grant, M.; Jackson, T.; Sathyendranath, S.
2016-08-01
To detect climate change signals in the data records derived from remote sensing of ocean colour, combining data from multiple missions is required, which in turn requires that inter-mission differences be adequately addressed prior to undertaking trend studies. Trend distributions associated with merged products are compared with those obtained from single-mission data sets in order to evaluate their suitability for climate studies. Merged products originally developed for operational applications such as near-real-time distribution (GlobColour) do not appear to be proper climate data records, showing large parts of the ocean with trends significantly different from those obtained with SeaWiFS, MODIS or MERIS. On the other hand, results obtained from the Climate Change Initiative (CCI) data are encouraging, showing good consistency with single-mission products.
Reanalysis of Korean War Anthropological Records to Support the Resolution of Cold Cases.
Wilson, Emily K
2017-09-01
Re-investigation of previously unidentified remains from the Korean War has yielded 55 new identifications, each with corresponding records of prior anthropological analyses. This study compares biological assessments for age at death, stature, and ancestry across (i) anthropological analyses from the 1950s, (ii) recent anthropological analyses of those same sets of remains, and (iii) the reported antemortem biological information for the identified individual. A comparison of long bone measurements from both the 1950s and during reanalysis is also presented. These comparisons demonstrate commonalities and continuing patterns of errors that are useful in refining both research on Korean War cold case records and forensic anthropological analyses performed using methods developed from the 1950s identifications. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
Misra, A; Burke, JF; Ramayya, A; Jacobs, J; Sperling, MR; Moxon, KA; Kahana, MJ; Evans, JJ; Sharan, AD
2014-01-01
Objective: The authors report methods developed for the implantation of micro-wire bundles into mesial temporal lobe structures and subsequent single-neuron recording in epileptic patients undergoing inpatient diagnostic monitoring. This is done with the intention of lowering the perceived barriers to routine single-neuron recording from deep brain structures in the clinical setting. Approach: Over a 15-month period, 11 patients were implanted with platinum micro-wire bundles in mesial temporal structures. Protocols were developed for A) monitoring electrode integrity through impedance testing, B) ensuring continuous 24/7 recording, C) localizing micro-wire position and “splay” pattern, and D) monitoring grounding and referencing to maintain the quality of recordings. Main Results: Five common modes of failure were identified: 1) broken micro-wires from acute tensile force, 2) broken micro-wires from cyclic fatigue at stress points, 3) poor in-vivo micro-electrode separation, 4) motion artifact, and 5) deteriorating ground connection and subsequent drop in common-mode noise rejection. Single neurons have been observed up to 14 days post implantation and on 40% of micro-wires. Significance: Long-term success requires detailed review of each implant by both the clinical and research teams to identify failure modes, and appropriate refinement of techniques while moving forward. This approach leads to reliable unit recordings without prolonging operative times, which will help increase the availability and clinical viability of human single-neuron data. PMID:24608589
Evaluation of identifier field agreement in linked neonatal records.
Hall, E S; Marsolo, K; Greenberg, J M
2017-08-01
To better address barriers arising from missing and unreliable identifiers in neonatal medical records, we evaluated agreement and discordance among traditional and non-traditional linkage fields within a linked neonatal data set. The retrospective, descriptive analysis represents infants born from 2013 to 2015. We linked children's hospital neonatal physician billing records to newborn medical records originating from an academic delivery hospital and evaluated rates of agreement, discordance and missingness for a set of 12 identifier field pairs used in the linkage algorithm. We linked 7293 of 7404 physician billing records (98.5%), all of which were deemed valid upon manual review. Linked records contained a mean of 9.1 matching and 1.6 non-matching identifier pairs. Only 4.8% had complete agreement among all 12 identifier pairs. Our approach to selection of linkage variables and data formatting preparatory to linkage have generalizability, which may inform future neonatal and perinatal record linkage efforts.
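The per-pair agreement, discordance and missingness rates described above can be tabulated straightforwardly. The sketch below is a minimal illustration with hypothetical identifier fields and toy records; it is not the authors' linkage algorithm:

```python
def agreement_rates(record_pairs, fields):
    """Tabulate agreement, discordance and missingness for each identifier
    field pair across a set of linked record pairs.

    record_pairs: list of (rec_a, rec_b) dicts; None/absent means missing.
    Returns {field: {"agree": p, "discord": p, "missing": p}} as proportions.
    """
    stats = {f: {"agree": 0, "discord": 0, "missing": 0} for f in fields}
    for a, b in record_pairs:
        for f in fields:
            va, vb = a.get(f), b.get(f)
            if va is None or vb is None:
                stats[f]["missing"] += 1   # cannot compare; count as missing
            elif va == vb:
                stats[f]["agree"] += 1
            else:
                stats[f]["discord"] += 1
    n = len(record_pairs)
    return {f: {k: v / n for k, v in s.items()} for f, s in stats.items()}

# Two hypothetical linked pairs: one fully concordant, one with a
# discordant date of birth and a missing sex value.
pairs = [
    ({"dob": "2014-03-01", "sex": "F"}, {"dob": "2014-03-01", "sex": "F"}),
    ({"dob": "2014-03-01", "sex": None}, {"dob": "2014-03-02", "sex": "M"}),
]
rates = agreement_rates(pairs, ["dob", "sex"])
```

In a real linkage evaluation the same tabulation would run over all 12 identifier field pairs and thousands of linked records.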
Mining and Integration of Environmental Data
NASA Astrophysics Data System (ADS)
Tran, V.; Hluchy, L.; Habala, O.; Ciglan, M.
2009-04-01
The project ADMIRE (Advanced Data Mining and Integration Research for Europe) is a 7th FP EU ICT project that aims to deliver a consistent and easy-to-use technology for extracting information and knowledge. The project is motivated by the difficulty of extracting meaningful information by data mining combinations of data from multiple heterogeneous and distributed resources. It will also provide an abstract view of data mining and integration, which will give users and developers the power to cope with the complexity and heterogeneity of services, data and processes. The data sets describing phenomena from domains like business, society, and environment often contain spatial and temporal dimensions. Integration of spatio-temporal data from different sources is a challenging task due to those dimensions. Different spatio-temporal data sets contain data at different resolutions (e.g. size of the spatial grid) and frequencies. This heterogeneity is the principal challenge of geo-spatial and temporal data set integration - the integrated data set should hold homogeneous data of the same resolution and frequency. Thus, to integrate heterogeneous spatio-temporal data from distinct sources, transformation of one or more data sets is necessary. The following transformation operations are required: • transformation to a common spatial and temporal representation (e.g. transformation to a common coordinate system), • spatial and/or temporal aggregation - data from a detailed data source are aggregated to match the resolution of other resources involved in the integration process, • spatial and/or temporal record decomposition - records from a source with lower-resolution data are decomposed to match the granularity of the other data source. This operation decreases data quality (e.g. transformation of data from a 50 km grid to a 10 km grid) - data from the lower-resolution data set in the integrated schema are imprecise, but it allows us to preserve higher-resolution data. 
We can decompose spatio-temporal data integration into the following phases: • pre-integration data processing - different data sets can be physically stored in different formats (e.g. relational databases, text files); it might be necessary to pre-process the data sets to be integrated, • identification of the transformation operations necessary to integrate data in the spatio-temporal dimensions, • identification of the transformation operations to be performed on non-spatio-temporal attributes, and • output data schema and set generation - given the prepared data and the set of transformation operations, the final integrated schema is produced. The spatio-temporal dimension brings its specifics also to the problem of mining spatio-temporal data sets. Spatio-temporal relationships exist among records in (s-t) data sets, and those relationships should be considered in mining operations. This means that when analyzing a record in a spatio-temporal data set, the records in its spatial and/or temporal proximity should be taken into account. In addition, the relationships discovered in spatio-temporal data can be different when mining the same data on different scales (e.g. mining the same data sets on a 50 km grid with daily data vs. a 10 km grid with hourly data). To be able to do effective data mining, we first needed to gather a sufficient amount of environmental data covering a similar area and time span. For this purpose we have engaged in cooperation with several organizations working in the environmental domain in Slovakia, some of which are also our partners from previous research efforts. The organizations which volunteered some of their data are the Slovak Hydro-meteorological Institute (SHMU), the Slovak Water Enterprise (SVP), the Soil Science and Conservation Institute (VUPOP), and the Institute of Hydrology of the Slovak Academy of Sciences (UHSAV). We have prepared scenarios from general meteorology, as well as specialized scenarios in hydrology and soil protection.
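The two resolution-matching operations described above (aggregation to a coarser grid, decomposition to a finer one) can be sketched as block operations on a gridded array. This is a minimal sketch under the assumption of a regular grid whose size is an integer multiple of the coarsening factor; it is not the ADMIRE implementation:

```python
import numpy as np

def aggregate(grid, factor):
    """Aggregate a fine grid to a coarser one by block averaging
    (e.g. a 10 km grid to a 50 km grid with factor=5)."""
    r, c = grid.shape
    return grid.reshape(r // factor, factor, c // factor, factor).mean(axis=(1, 3))

def decompose(grid, factor):
    """Decompose coarse records to a finer grid by replication. The
    replicated values are imprecise, but this preserves the higher
    resolution of the other data set in the integrated schema."""
    return np.repeat(np.repeat(grid, factor, axis=0), factor, axis=1)

fine = np.arange(16, dtype=float).reshape(4, 4)
coarse = aggregate(fine, 2)   # 2x2 grid of 2x2-block means
back = decompose(coarse, 2)   # 4x4 grid, block-constant values
```

Temporal aggregation/decomposition (e.g. hourly to daily) follows the same pattern along the time axis.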
Elshehawi, Waleed; Alsaffar, Hani; Roberts, Graham; Lucas, Victoria; McDonald, Fraser; Camilleri, Simon
2016-04-01
The purpose of this study was to develop and validate a Reference Data Set for Dental Age Assessment of the Maltese population and compare the mean Age of Attainment to a UK Caucasian Reference Data Set. The Maltese Reference Data Set was developed from 1593 Dental Panoramic Tomograms of patients aged between 4 and 26 years, taken from the radiographic archives of the Dental Department, Mater Dei Hospital, Malta. Tooth Development Stages were recorded for all 16 maxillary and mandibular permanent teeth on the left side and both permanent third molars on the right, according to Demirjian's staging method. Summary and percentile data were calculated for each Tooth Development Stage, including the mean Age of Attainment. These means were used to estimate the Dental Age of each subject in the study sample using the simple unweighted average method. The estimated Dental Age was compared to the gold standard of the Chronological Age. Comparison of the Maltese and UK Caucasian Reference Data Sets was performed using a series of t-tests, carried out for each paired Tooth Development Stage by gender. The mean Age of Attainment was slightly higher for the Maltese than for the UK Caucasians in both males and females. However, there was no statistically significant difference between the Chronological Age and Dental Age for either sex. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
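The "simple unweighted average method" mentioned above reduces to averaging the reference mean Ages of Attainment for the stages observed in a subject. The sketch below uses hypothetical tooth codes and reference values purely for illustration; the real Reference Data Set tabulates these per tooth, stage and sex:

```python
def estimate_dental_age(observed_stages, age_of_attainment):
    """Simple unweighted average method: the estimated Dental Age is the
    mean of the reference mean Ages of Attainment for the Tooth
    Development Stages observed in the subject's radiograph."""
    ages = [age_of_attainment[ts] for ts in observed_stages]
    return sum(ages) / len(ages)

# Hypothetical reference entries: (tooth, Demirjian stage) -> mean age of
# attainment in years (illustrative values, not from the Maltese data set).
reference = {("UL6", "G"): 9.2, ("LL3", "F"): 8.8, ("LL7", "E"): 9.6}
dental_age = estimate_dental_age(list(reference), reference)
```

Validation then compares this estimate against the Chronological Age across the sample.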
China Report, Red Flag, Number 7, 1 April 1986.
1986-05-30
for 1,000 percent of the foreign exchange retained by Hainan, and approved the import of more than 89,000 automobiles, over 250,000 video recorders...businessmen and earned money from the hinterland. In the end, it was Hainan that suffered. The influx of a large number of automobiles, video recorders...color TV sets and so on has been tragic for Hainan. At present, over 88,000 video recorders, 520,000 color TV sets, and other imported goods, worth a
Shanahan, C W; Sorensen-Alawad, A; Carney, B L; Persand, I; Cruz, A; Botticelli, M; Pressman, K; Adams, W G; Brolin, M; Alford, D P
2014-01-01
The Massachusetts Screening, Brief Intervention and Referral to Treatment (MASBIRT) Program, a substance use screening program in general medical settings, created a web-based, point-of-care (POC) application, the MASBIRT Portal (the "Portal"), to meet program goals. We report on the development and implementation of the Portal. Five-year program process outcomes recorded by an independent evaluator and an anonymous survey of Health Educators' (HEs) adoption, perceptions and Portal use with a modified version of the Technology Readiness Index are described. Specific management team members, selected based on their roles in program leadership, development and implementation of the Portal and supervision of HEs, participated in semi-structured, qualitative interviews. At the conclusion of the program, 73% (24/33) of the HEs completed a survey on their experience using the Portal. HEs reported that the Portal made recording screening information easy (96%); improved planning their workday (83%); facilitated POC data collection (84%); decreased time dedicated to data entry (100%); and improved job satisfaction (59%). The top two barriers to use were "no or limited wireless connectivity" (46%) and "the tablet was too heavy/bulky to carry" (29%). Qualitative management team interviews identified strategies for successful HIT implementation: the importance of engaging HEs in outlining specifications and workflow needs, collaborative testing prior to implementation, and clear agreement on data collection purpose, quality requirements and staff roles. Overall, HEs perceived the Portal favorably with regard to its time-saving ability and improved workflow. Lessons learned included identifying core requirements early during system development and the need for managers to institute and enforce consistent behavioral work norms. Barriers and HEs' views of technology impacted the utilization of the MASBIRT Portal. 
Further research is needed to determine best approaches for HIT system implementation in general medical settings.
Set up and Operation of Video Cassette Recorders or "...How Do I Work This Thing???"
ERIC Educational Resources Information Center
Alaska State Dept. of Education, Juneau.
Designed to assist Alaskans in making optimum use of the LearnAlaska TV transmitter network, this booklet provides instructions for the operation and maintenance of videocassette recorders (VCRs). After a brief introduction, which lists state film library addresses for ordering an accompanying videocassette entitled "Set Up & Operation…
ERIC Educational Resources Information Center
Collett, Peter
Data were collected for this study of the relationship between television watching and family life via a recording device (C-Box) consisting of a television set and a video camera. Designed for the study, this device was installed in 20 homes for one week to record the viewing area in front of the television set together with information on…
A Climate Data Record (CDR) for the global terrestrial water budget: 1984–2010
Zhang, Yu; Pan, Ming; Sheffield, Justin; ...
2018-01-12
Closing the terrestrial water budget is necessary to provide consistent estimates of budget components for understanding water resources and changes over time. Given the lack of in situ observations of budget components at anything but local scale, merging information from multiple data sources (e.g., in situ observation, satellite remote sensing, land surface model, and reanalysis) through data assimilation techniques that optimize the estimation of fluxes is a promising approach. Conditioned on the current limited data availability, a systematic method is developed to optimally combine multiple available data sources for precipitation (P), evapotranspiration (ET), runoff (R), and the total water storage change (TWSC) at 0.5° spatial resolution globally and to obtain water budget closure (i.e., to enforce P - ET - R - TWSC = 0) through a constrained Kalman filter (CKF) data assimilation technique under the assumption that the deviation from the ensemble mean of all data sources for the same budget variable is used as a proxy of the uncertainty in individual water budget variables. The resulting long-term (1984–2010), monthly 0.5° resolution global terrestrial water cycle Climate Data Record (CDR) data set is developed under the auspices of the National Aeronautics and Space Administration (NASA) Earth System Data Records (ESDRs) program. This data set serves to bridge the gap between sparsely gauged regions and the regions with sufficient in situ observations in investigating the temporal and spatial variability in the terrestrial hydrology at multiple scales. The CDR created in this study is validated against in situ measurements like river discharge from the Global Runoff Data Centre (GRDC) and the United States Geological Survey (USGS), and ET from FLUXNET. 
The data set is shown to be reliable and can serve the scientific community in understanding historical climate variability in water cycle fluxes and stores, benchmarking the current climate, and validating models.
NASA Astrophysics Data System (ADS)
Azerêdo, Ana C.; Paul Wright, V.; Mendonça-Filho, João; Cristina Cabral, M.; Duarte, Luís V.
2015-06-01
The unusual occurrence of calcretes and prominent organic matter in the Middle Jurassic (Lower Bathonian, Serra de Aire Formation) of the Lusitanian Basin of western Portugal (Western Iberian Margin) revealed a complex palimpsest exposure record, here interpreted as reflecting hydrological changes caused by phases of emergence and immersion. It serves as a potential model for understanding stratigraphic development at lowstand surfaces in carbonate successions. The exposure-dominated facies association grades upwards into peritidal and lagoonal limestones, and the interval is assigned to the regressive peak of a Transgressive-Regressive Facies Cycle (2nd order) of the thick Middle Jurassic carbonate ramp succession. The Galinha Quarry, Fátima region, NE of Lisbon, a type section for this lowstand assemblage, exhibits varied calcretes, with black-clasts, interbedded with, and grading into: organic-rich marly/clayey seams and lenses, locally with carbonate nodules; carbonates with evaporite traces; microbial laminites; black-clast and fenestral limestones; some lithofacies are dolomitized. The palynofacies contains phytoclasts associated with less refractory, more prone to degradation components, which suggests natural combustion/pyrolysis (wild fires). The lowstand surface represents a low relief landscape with small depressions/ponds bordering a more distal marginal-littoral setting; the partly subaerial and partly subaqueous settings were subjected to lengthy exposure and to fluctuating, very shallow water bodies and water table. Coeval climatic regime was a seasonally dry/wet one, with dry/semi-arid phases dominating over the sub-humid, as shown by the combined record of intense calcrete development, rhizogenic structures, microbial mats, brecciation, desiccation, evaporites and wild fire evidence. However, sea level rise caused changes to shallow, sea-water influenced restricted lagoonal-peritidal settings. 
Comparisons and differences with modern and ancient coastal marginal carbonates are made to provide a guide to the variability in such lowstand deposystems in the stratigraphic record.
A Climate Data Record (CDR) for the global terrestrial water budget: 1984-2010
NASA Astrophysics Data System (ADS)
Zhang, Yu; Pan, Ming; Sheffield, Justin; Siemann, Amanda L.; Fisher, Colby K.; Liang, Miaoling; Beck, Hylke E.; Wanders, Niko; MacCracken, Rosalyn F.; Houser, Paul R.; Zhou, Tian; Lettenmaier, Dennis P.; Pinker, Rachel T.; Bytheway, Janice; Kummerow, Christian D.; Wood, Eric F.
2018-01-01
Closing the terrestrial water budget is necessary to provide consistent estimates of budget components for understanding water resources and changes over time. Given the lack of in situ observations of budget components at anything but local scale, merging information from multiple data sources (e.g., in situ observation, satellite remote sensing, land surface model, and reanalysis) through data assimilation techniques that optimize the estimation of fluxes is a promising approach. Conditioned on the current limited data availability, a systematic method is developed to optimally combine multiple available data sources for precipitation (P), evapotranspiration (ET), runoff (R), and the total water storage change (TWSC) at 0.5° spatial resolution globally and to obtain water budget closure (i.e., to enforce P - ET - R - TWSC = 0) through a constrained Kalman filter (CKF) data assimilation technique under the assumption that the deviation from the ensemble mean of all data sources for the same budget variable is used as a proxy of the uncertainty in individual water budget variables. The resulting long-term (1984-2010), monthly 0.5° resolution global terrestrial water cycle Climate Data Record (CDR) data set is developed under the auspices of the National Aeronautics and Space Administration (NASA) Earth System Data Records (ESDRs) program. This data set serves to bridge the gap between sparsely gauged regions and the regions with sufficient in situ observations in investigating the temporal and spatial variability in the terrestrial hydrology at multiple scales. The CDR created in this study is validated against in situ measurements like river discharge from the Global Runoff Data Centre (GRDC) and the United States Geological Survey (USGS), and ET from FLUXNET. 
The data set is shown to be reliable and can serve the scientific community in understanding historical climate variability in water cycle fluxes and stores, benchmarking the current climate, and validating models.
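The closure step above (enforcing P - ET - R - TWSC = 0) amounts to a linearly constrained update: the budget residual is distributed across the four variables in proportion to their uncertainties. The sketch below is a minimal numerical illustration of that projection, not the authors' full CKF implementation; the covariance values are hypothetical:

```python
import numpy as np

def enforce_closure(x, C):
    """Project budget estimates x = [P, ET, R, TWSC] onto the water-balance
    constraint P - ET - R - TWSC = 0, weighting the adjustments by the
    error covariance C (a constrained Kalman filter style update)."""
    A = np.array([[1.0, -1.0, -1.0, -1.0]])          # constraint: A @ x = 0
    K = C @ A.T @ np.linalg.inv(A @ C @ A.T)         # gain distributing the residual
    return x - (K @ (A @ x[:, None])).ravel()

x = np.array([100.0, 60.0, 30.0, 20.0])   # mm/month; budget residual = -10
C = np.diag([4.0, 1.0, 1.0, 1.0])         # P most uncertain, so adjusted most
x_closed = enforce_closure(x, C)
```

After the update the four components balance exactly, and the largest correction lands on the most uncertain variable (here P).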
NASA Astrophysics Data System (ADS)
De Smedt, Isabelle; Richter, Andreas; Beirle, Steffen; Danckaert, Thomas; Van Roozendael, Michel; Yu, Huan; Bösch, Tim; Hilboll, Andreas; Peters, Enno; Doerner, Steffen; Wagner, Thomas; Wang, Yang; Lorente, Alba; Eskes, Henk; Van Geffen, Jos; Boersma, Folkert
2016-04-01
One of the main goals of the QA4ECV project is to define community best practices for the generation of multi-decadal ECV data records from satellite instruments. QA4ECV will develop retrieval algorithms for the Land ECVs surface albedo, leaf area index (LAI), and fraction of absorbed photosynthetically active radiation (fAPAR), as well as for the Atmosphere ECV ozone and aerosol precursors nitrogen dioxide (NO2), formaldehyde (HCHO), and carbon monoxide (CO). Here we assess best practices and provide recommendations for the retrieval of HCHO. Best practices are established based on (1) a detailed intercomparison exercise between the QA4ECV partners for each specific algorithm processing step, (2) the feasibility of implementation, and (3) the requirement to generate consistent multi-sensor multi-decadal data records. We propose a fitting window covering the 328.5-346 nm spectral interval for the morning sensors (GOME, SCIAMACHY and GOME-2) and an extension to 328.5-359 nm for OMI and GOME-2, allowed by the improved quality of the recorded spectra. A high level of consistency between group algorithms is found when the retrieval settings are carefully aligned. However, the retrieval of slant columns is highly sensitive to any change in the selected settings. The use of a mean background radiance as the DOAS reference spectrum allows for a stabilization of the retrievals. A background correction based on the reference sector method is recommended for implementation in the QA4ECV HCHO algorithm, as it further reduces retrieval uncertainties. HCHO AMFs computed using different radiative transfer codes show good overall consistency when harmonized settings are used. As for NO2, it is proposed to use a priori HCHO profiles from the TM5 model. These are provided on a 1°x1° latitude-longitude grid.
The cost of doing business: cost structure of electronic immunization registries.
Fontanesi, John M; Flesher, Don S; De Guire, Michelle; Lieberthal, Allan; Holcomb, Kathy
2002-10-01
To predict the true cost of developing and maintaining an electronic immunization registry, and to set the framework for developing future cost-effectiveness and cost-benefit analyses. Primary data were collected at three immunization registries located in California, accounting for 90 percent of all immunization records in registries in the state during the study period. A parametric cost analysis compared registry development and maintenance expenditures to registry performance requirements. Data were collected at each registry through interviews and reviews of expenditure records, technical accomplishments, development schedules, and immunization coverage rates. The cost of building immunization registries is predictable and independent of the hardware/software combination employed. The effort requires four man-years of technical effort, or approximately $250,000 in 1998 dollars. Costs for maintaining a registry were approximately $5,100 per end user per three-year period. There is a predictable cost structure for both developing and maintaining immunization registries. The cost structure can be used as a framework for examining the cost-effectiveness and cost-benefits of registries. The greatest factor affecting improvement in coverage rates was ongoing, user-based administrative investment.
Estimation of age in the living: in matters civil and criminal.
Aggrawal, Anil
2009-05-11
Estimation of age is one of the main tasks of a forensic practitioner, especially in third world countries, where many births take place in rural settings without the benefit of the expert supervision of a trained obstetrician. Such births are poorly recorded or more often not recorded at all in terms of exact dates. In many other cases, records are fraudulently falsified for some gain, e.g. to get government jobs or pensions. Developed countries, where ordinarily the birth records are meticulously maintained, are not immune to this problem either. Estimation of the age of living individuals may be needed here for refugees or other persons who arrive without acceptable identification papers. A wide variety of methods are used by the forensic clinician to assess the age of the individual in such cases. This paper discusses and evaluates the most common methods used in India, although the methods can be effectively utilized by medico-legal professionals anywhere in the world.
Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate
Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.
2015-01-01
Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906
NASA Astrophysics Data System (ADS)
Dearing Crampton-Flood, Emily; Peterse, Francien; Munsterman, Dirk; Sinninghe Damsté, Jaap S.
2018-05-01
The Pliocene is often regarded as a suitable analogue for future climate, due to an overall warmer climate (2-3 °C) coupled with atmospheric CO2 concentrations largely similar to present values (∼400 ppmv). Numerous Pliocene sea surface temperature (SST) records are available; however, little is known about climate in the terrestrial realm. Here we generated a Pliocene continental temperature record for Northwestern Europe based on branched glycerol dialkyl glycerol tetraether (brGDGT) membrane lipids stored in a marine sedimentary record from the western Netherlands. The total organic carbon (TOC) content of the sediments and its stable carbon isotopic composition (δ13Corg) indicate a strong transition from primarily marine-derived organic matter (OM) during the Pliocene to predominantly terrestrially derived OM after the transition into the Pleistocene. This trend is supported by the ratio of branched and isoprenoid tetraethers (BIT index). The marine-terrestrial transition indicates a likely change in brGDGT sources in the core, which may complicate the applicability of the brGDGT paleotemperature proxy in this setting. Currently, the application of the brGDGT-based paleothermometer to coastal marine sediments has been hampered by a marine overprint. Here, we propose a method to disentangle terrestrial and marine sources based on the degree of cyclization of tetramethylated brGDGTs (#rings), using a linear mixing model based on the global soil calibration set and a newly developed coastal marine temperature transfer function. Application of this method to our brGDGT record resulted in a 'corrected' terrestrial temperature record (MATterr). This latter record indicates that continental temperatures were ∼12-14 °C during the Early Pliocene, and 10.5-12 °C during the Mid Pliocene, confirming other Pliocene pollen-based terrestrial temperature estimates from Northern and Central Europe. 
Furthermore, two colder (Δ 5-7 °C) periods in the Pliocene MATterr record show that the influence of Pliocene glacials reached well into NW Europe.
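The source-unmixing idea above is a two-end-member linear mixing model: the observed degree of cyclization is treated as a mixture of a terrestrial and a marine end member, and the terrestrial fraction is solved from that balance. The sketch below uses hypothetical end-member values for illustration only; the paper's actual calibration and transfer function are not reproduced here:

```python
def terrestrial_fraction(rings_obs, rings_terr, rings_mar):
    """Two-end-member linear mixing: solve f_terr from
    rings_obs = f_terr * rings_terr + (1 - f_terr) * rings_mar,
    then clamp to the physically meaningful range [0, 1]."""
    f = (rings_obs - rings_mar) / (rings_terr - rings_mar)
    return min(1.0, max(0.0, f))

# Hypothetical #rings end members (illustrative values only)
f_terr = terrestrial_fraction(rings_obs=0.4, rings_terr=0.2, rings_mar=0.8)
```

The estimated f_terr can then weight the terrestrial versus marine temperature calibrations to produce a 'corrected' terrestrial temperature.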
A 184-year record of river meander migration from tree rings, aerial imagery, and cross sections
NASA Astrophysics Data System (ADS)
Schook, Derek M.; Rathburn, Sara L.; Friedman, Jonathan M.; Wolf, J. Marshall
2017-09-01
Channel migration is the primary mechanism of floodplain turnover in meandering rivers and is essential to the persistence of riparian ecosystems. Channel migration is driven by river flows, but short-term records cannot disentangle the effects of land use, flow diversion, past floods, and climate change. We used three data sets to quantify nearly two centuries of channel migration on the Powder River in Montana. The most precise data set came from channel cross sections measured an average of 21 times from 1975 to 2014. We then extended spatial and temporal scales of analysis using aerial photographs (1939-2013) and by aging plains cottonwoods along transects (1830-2014). Migration rates calculated from overlapping periods across data sets mostly revealed cross-method consistency. Data set integration revealed that migration rates have declined since peaking at 5 m/year in the two decades after the extreme 1923 flood (3000 m3/s). Averaged over the duration of each data set, cross section channel migration occurred at 0.81 m/year, compared to 1.52 m/year for the medium-length air photo record and 1.62 m/year for the lengthy cottonwood record. Powder River peak annual flows decreased by 48% (201 vs. 104 m3/s) after the largest flood of the post-1930 gaged record (930 m3/s in 1978). Declining peak discharges led to a 53% reduction in channel width and a 29% increase in sinuosity over the 1939-2013 air photo record. Changes in planform geometry and reductions in channel migration make calculations of floodplain turnover rates dependent on the period of analysis. We found that the intensively studied last four decades do not represent the past two centuries.
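The reported decline in peak annual flows can be checked arithmetically from the values given in the abstract:

```python
# Quick arithmetic check of the reported 48% decline in Powder River
# peak annual flows after the 1978 flood (values from the abstract).
pre_flood_peak = 201.0   # mean peak annual flow before 1978, m^3/s
post_flood_peak = 104.0  # mean peak annual flow after 1978, m^3/s

decline = (pre_flood_peak - post_flood_peak) / pre_flood_peak
print(f"Peak flow decline: {decline:.0%}")  # → 48%
```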
Bergman-Marković, Biserka; Katić, Milica; Kern, Josipa
2007-01-01
Well-organised medical records are the prerequisite for achieving a high level of performance in primary healthcare settings. Recording balanced structured and coded data as well as free text can improve both the quality and the organisation of work in the office. It provides more substantiated support of financial transactions and accountancy, allows better communication with other facilities and institutions, and is a source of valuable scientific research material. This article is the result of individual experience gained in general practice with various programs/systems employed within the family medicine frame, and with the evaluation of available and commonly exploited program solutions. The use of various programs allows for systematic adjustment to the increasingly complex requirements imposed on electronic medical records (EMRs). The experience of a general practitioner, presented in this paper, confirms the assumption that an adequate program to be employed with EMRs should be developed, provided that family medicine practitioners, that is, the final users, are involved in each and every stage of its development, adjustment, implementation and evaluation.
Moving electronic medical records upstream: incorporating social determinants of health.
Gottlieb, Laura M; Tirozzi, Karen J; Manchanda, Rishi; Burns, Abby R; Sandel, Megan T
2015-02-01
Knowledge of the biological pathways and mechanisms connecting social factors with health has increased exponentially over the past 25 years, yet in most clinical settings, screening and intervention around social determinants of health are not part of standard clinical care. Electronic medical records provide new opportunities for assessing and managing social needs in clinical settings, particularly those serving vulnerable populations. To illustrate the feasibility of capturing information and promoting interventions related to social determinants of health in electronic medical records. Three case studies were examined in which electronic medical records have been used to collect data and address social determinants of health in clinical settings. From these case studies, we identified multiple functions that electronic medical records can perform to facilitate the integration of social determinants of health into clinical systems, including screening, triaging, referring, tracking, and data sharing. If barriers related to incentives, training, and privacy can be overcome, electronic medical record systems can improve the integration of social determinants of health into healthcare delivery systems. More evidence is needed to evaluate the impact of such integration on health care outcomes before widespread adoption can be recommended. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Records. 1633.11 Section 1633.11 Commercial... (OPEN FLAME) OF MATTRESS SETS Rules and Regulations § 1633.11 Records. (a) Test and manufacturing records - general. Every manufacturer and any other person initially introducing into commerce mattress...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Records. 1633.11 Section 1633.11 Commercial... (OPEN FLAME) OF MATTRESS SETS Rules and Regulations § 1633.11 Records. (a) Test and manufacturing records - general. Every manufacturer and any other person initially introducing into commerce mattress...
Development of a New Optical Measuring Set-Up
NASA Astrophysics Data System (ADS)
Miroshnichenko, I. P.; Parinov, I. A.
2018-06-01
The paper proposes a description of the developed optical measuring set-up for the contactless recording and processing of measurement results for small spatial (linear and angular) displacements of control surfaces, based on the use of laser technologies and optical interference methods. The proposed set-up is designed to solve all the measurement tasks arising in the study of the physical and mechanical properties of new materials and in the process of diagnosing the state of structural materials by acoustic active methods of nondestructive testing. The structure of the set-up and its constituent parts are described, and the features of its construction and functioning during measurements are discussed. New technical solutions for the implementation of the components of the set-up under consideration are obtained. The purpose and description of the original specialized software are presented; the software supports a priori analysis of measurement results, a posteriori analysis, and correction of measurement results directly during their acquisition, and it determines the influence of internal and external disturbances on the measurement results. The technical solutions used in the set-up are protected by patents of the Russian Federation for inventions, and the software is protected by certificates of state registration of computer programs. The proposed set-up is intended for use in instrumentation, mechanical engineering, shipbuilding, aviation, the energy sector, etc.
An Elaborate Data Set Characterizing the Mechanical Response of the Foot
Erdemir, Ahmet; Sirimamilla, Pavana A.; Halloran, Jason P.; van den Bogert, Antonie J.
2010-01-01
Background Mechanical properties of the foot are responsible for its normal function and play a role in various clinical problems. Specifically, we are interested in quantification of foot mechanical properties to assist the development of computational models for movement analysis and detailed simulations of tissue deformation. Current available data are specific to a foot region and the loading scenarios are limited to a single direction. A data set that incorporates regional response, to quantify individual function of foot components, as well as overall response, to illustrate their combined operation, does not exist. Furthermore, combined three-dimensional loading scenarios while measuring the complete three-dimensional deformation response are lacking. When combined with an anatomical image data set, development of anatomically realistic and mechanically validated models becomes possible. Therefore, the goal of this study was to record and disseminate the mechanical response of a foot specimen, supported by imaging data. Method of Approach Robotic testing was conducted at the rear foot, forefoot, metatarsal heads, and the foot as a whole. Complex foot deformations were induced by single mode loading, e.g. compression, and combined loading, e.g. compression and shear. Small and large indenters were used for heel and metatarsal head loading; an elevated platform was utilized to isolate the rear foot and forefoot; and a full platform compressed the whole foot. Three-dimensional tool movements and reaction loads were recorded simultaneously. Computed tomography scans of the same specimen were collected for anatomical reconstruction a priori. Results Three-dimensional mechanical response of the specimen was nonlinear and viscoelastic. A low stiffness region was observed starting with contact between the tool and foot regions, increasing with loading. Loading and unloading response portrayed hysteresis. 
Loading range ensured capturing the toe and linear regions of the load deformation curves for the dominant loading direction, with the rates approximating those of walking. Conclusion A large data set was successfully obtained to characterize the overall as well as regional mechanical response of an intact foot specimen under single and combined loads. Medical imaging complemented the mechanical testing data to establish the potential relationship between the anatomical architecture and mechanical response, and for further development of foot models that are mechanically realistic and anatomically consistent. This combined data set has been documented and disseminated in the public domain to promote future development in foot biomechanics. PMID:19725699
Kannan, V; Fish, JS; Mutz, JM; Carrington, AR; Lai, K; Davis, LS; Youngblood, JE; Rauschuber, MR; Flores, KA; Sara, EJ; Bhat, DG; Willett, DL
2017-01-01
Summary Background Creation of a new electronic health record (EHR)-based registry often can be a "one-off" complex endeavor: first developing new EHR data collection and clinical decision support tools, followed by developing registry-specific data extractions from the EHR for analysis. Each development phase typically has its own long development and testing time, leading to a prolonged overall cycle time for delivering one functioning registry with companion reporting into production. The next registry request then starts from scratch. Such an approach will not scale to meet the emerging demand for specialty registries to support population health and value-based care. Objective To determine if the creation of EHR-based specialty registries could be markedly accelerated by employing (a) a finite core set of EHR data collection principles and methods, (b) concurrent engineering of data extraction and data warehouse design using a common dimensional data model for all registries, and (c) agile development methods commonly employed in new product development. Methods We adopted as guiding principles to (a) capture data as a by-product of care of the patient, (b) reinforce optimal EHR use by clinicians, (c) employ a finite but robust set of EHR data capture tool types, and (d) leverage our existing technology toolkit. Registries were defined by a shared condition (recorded on the Problem List) or a shared exposure to a procedure (recorded on the Surgical History) or to a medication (recorded on the Medication List). Any EHR fields needed—either to determine registry membership or to calculate a registry-associated clinical quality measure (CQM)—were included in the enterprise data warehouse (EDW) shared dimensional data model. Extract-transform-load (ETL) code was written to pull data at defined “grains” from the EHR into the EDW model. All calculated CQM values were stored in a single Fact table in the EDW crossing all registries. 
Registry-specific dashboards were created in the EHR to display both (a) real-time patient lists of registry patients and (b) EDW-generated CQM data. Agile project management methods were employed, including co-development, lightweight requirements documentation with User Stories and acceptance criteria, and time-boxed iterative development of EHR features in 2-week “sprints” for rapid-cycle feedback and refinement. Results Using this approach, in calendar year 2015 we developed a total of 43 specialty chronic disease registries, with 111 new EHR data collection and clinical decision support tools, 163 new clinical quality measures, and 30 clinic-specific dashboards reporting on both real-time patient care gaps and summarized and vetted CQM measure performance trends. Conclusions This study suggests concurrent design of EHR data collection tools and reporting can quickly yield useful EHR structured data for chronic disease registries, and bodes well for efforts to migrate away from manual abstraction. This work also supports the view that in new EHR-based registry development, as in new product development, adopting agile principles and practices can help deliver valued, high-quality features early and often. PMID:28930362
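The shared dimensional pattern described above can be sketched in miniature. This is an illustrative toy, not the authors' implementation; the registry names, codes, and field names are hypothetical:

```python
from dataclasses import dataclass

# Illustrative sketch (not the authors' implementation) of the pattern
# described above: registry membership is derived from shared EHR lists,
# and every computed CQM value lands in one fact table row shape that
# crosses all registries. Registry names and codes are hypothetical.

@dataclass(frozen=True)
class CqmFact:
    registry: str       # e.g. "diabetes"
    measure: str        # e.g. "HbA1c < 8%"
    patient_id: str
    period: str         # reporting period, e.g. "2015-Q3"
    value: float        # computed measure value

def registry_membership(problem_list, medication_list, rules):
    """Assign a patient to registries by shared condition or medication.

    `rules` maps a registry name to a set of qualifying codes; the patient
    belongs to a registry if any qualifying code appears on either list.
    """
    codes = set(problem_list) | set(medication_list)
    return {name for name, qualifying in rules.items() if codes & qualifying}

rules = {"diabetes": {"E11.9"}, "ckd": {"N18.3"}}
print(registry_membership(["E11.9"], ["metformin"], rules))  # {'diabetes'}
```

Because every registry shares the one fact shape, a new registry only needs new membership rules and measure definitions, which is the scaling point the abstract makes.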
The FORBIO Climate data set for climate analyses
NASA Astrophysics Data System (ADS)
Delvaux, C.; Journée, M.; Bertrand, C.
2015-06-01
In the framework of the interdisciplinary FORBIO Climate research project, the Royal Meteorological Institute of Belgium is in charge of providing high resolution gridded past climate data (i.e. temperature and precipitation). This climate data set will be linked to the measurements on seedlings, saplings and mature trees to assess the effects of climate variation on tree performance. This paper explains how the gridded daily temperature (minimum and maximum) data set was generated from a consistent station network between 1980 and 2013. After station selection, data quality control procedures were developed and applied to the station records to ensure that only valid measurements will be involved in the gridding process. Thereafter, the set of unevenly distributed validated temperature data was interpolated on a 4 km × 4 km regular grid over Belgium. The performance of different interpolation methods has been assessed. The method of kriging with external drift using correlation between temperature and altitude gave the most relevant results.
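The external-drift idea behind the chosen method (regress temperature on altitude, then spatially interpolate the residuals) can be sketched as follows. This is a simplified stand-in for kriging with external drift, using inverse-distance weighting of residuals instead of a geostatistical model, and the station data are hypothetical:

```python
import numpy as np

# Simplified illustration of the external-drift idea: regress temperature
# on altitude (the drift), then interpolate the regression residuals onto
# the grid, here by inverse-distance weighting. True kriging with external
# drift models the residuals geostatistically; stations are hypothetical.

def interpolate_temperature(xy, alt, temp, grid_xy, grid_alt, power=2.0):
    # 1) Linear drift: temp ~ a * altitude + b
    a, b = np.polyfit(alt, temp, 1)
    residuals = temp - (a * alt + b)

    # 2) Inverse-distance weighting of the residuals onto the grid
    d = np.linalg.norm(grid_xy[:, None, :] - xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    w /= w.sum(axis=1, keepdims=True)
    return a * grid_alt + b + w @ residuals

xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # station coords, km
alt = np.array([50.0, 300.0, 600.0])                   # station altitude, m
temp = np.array([10.0, 8.5, 6.5])                      # observed temp, deg C
grid = np.array([[5.0, 5.0]])
print(interpolate_temperature(xy, alt, temp, grid, np.array([200.0])))
```

The estimate at the grid point combines the altitude trend with the locally interpolated residual, which is why the altitude correlation makes this family of methods outperform purely spatial interpolation over hilly terrain.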
Evaluation of setting time and flow properties of self-synthesize alginate impressions
NASA Astrophysics Data System (ADS)
Halim, Calista; Cahyanto, Arief; Sriwidodo; Harsatiningsih, Zulia
2018-02-01
Alginate is an elastic hydrocolloid dental impression material used to obtain a negative reproduction of the oral mucosa, such as recording soft-tissue and occlusal relationships. The aim of the present study was to synthesize alginate and to determine its setting time and flow properties. There were five groups of alginate comprising fifty samples of self-synthesized alginate and a commercial alginate impression product. The fifty samples were divided between two tests, twenty-five samples each for the setting time and flow tests. Setting time was recorded in s; flow was recorded in mm2. The fastest setting time was in group three (148.8 s) and the slowest in group four. The highest flow result was in group three (69.70 mm2) and the lowest in group one (58.34 mm2). Results were analyzed statistically by one-way ANOVA (α = 0.05), which showed a statistically significant difference in setting time, but not in flow properties, between the self-synthesized alginates and the commercial alginate impression product. In conclusion, the alginate impression material was successfully self-synthesized, and variation in composition influences setting time and flow properties. The setting time most resembling the control group was that of group three; the flow most resembling the control group was that of group four.
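The one-way ANOVA used in this comparison can be reproduced in a few lines. The setting-time values below are illustrative, not the study's data:

```python
from scipy.stats import f_oneway

# One-way ANOVA, as used in the study above, on hypothetical setting-time
# measurements (seconds) for three alginate groups. These values are made
# up for illustration and are not the study's data.
group1 = [160.1, 158.9, 161.3, 159.5, 160.7]
group2 = [149.2, 148.1, 150.0, 148.8, 149.6]
group3 = [171.4, 170.2, 172.1, 170.9, 171.7]

f_stat, p_value = f_oneway(group1, group2, group3)
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
if p_value < 0.05:  # alpha = 0.05, as in the study
    print("At least one group mean differs significantly.")
```

A significant F only says that some group differs; identifying which group most resembles the control (as the abstract does) requires a follow-up pairwise comparison.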
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
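The core set-membership test (a measurement is consistent with the healthy model only while it stays inside the predicted interval, inflated by a data-driven noise bound) can be illustrated generically. The model, bounds, and data here are hypothetical, not the aircraft benchmark:

```python
# Minimal sketch of set-membership fault detection as described above:
# a sample is flagged when it falls outside [pred - b, pred + b], where
# b combines a model-based interval half-width with a data-driven noise
# bound. Model, bounds, and data are hypothetical.

def detect_faults(measurements, predictions, model_bound, noise_bound):
    """Return a per-sample flag: True where the measurement is inconsistent."""
    b = model_bound + noise_bound
    return [abs(y - yhat) > b for y, yhat in zip(measurements, predictions)]

pred = [0.0, 0.1, 0.2, 0.3]     # interval predictor center (healthy model)
meas = [0.05, 0.12, 0.9, 0.31]  # third sample jumps abnormally
flags = detect_faults(meas, pred, model_bound=0.1, noise_bound=0.05)
print(flags)  # [False, False, True, False]
```

Tightening the data-driven noise bound is what improves fault sensitivity, while the guaranteed interval keeps false alarms from model uncertainty in check.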
Garrido, Terhilda; Kumar, Sudheen; Lekas, John; Lindberg, Mark; Kadiyala, Dhanyaja; Whippy, Alan; Crawford, Barbara; Weissberg, Jed
2014-01-01
Using electronic health records (EHR) to automate publicly reported quality measures is receiving increasing attention and is one of the promises of EHR implementation. Kaiser Permanente has fully or partly automated six of the 13 Joint Commission measure sets. We describe our experience with automation and the resulting time savings: a reduction by approximately 50% of the abstractor time required for one measure set alone (the Surgical Care Improvement Project). However, our experience illustrates the gap between the current and desired states of automated public quality reporting, which has important implications for measure developers, accrediting entities, EHR vendors, public/private payers, and government. PMID:23831833
NASA Astrophysics Data System (ADS)
Laws, Priscilla W.
2004-05-01
The Workshop Physics Activity Guide is a set of student workbooks designed to serve as the foundation for a two-semester calculus-based introductory physics course. It consists of 28 units that interweave text materials with activities that include prediction, qualitative observation, explanation, equation derivation, mathematical modeling, quantitative experiments, and problem solving. Students use a powerful set of computer tools to record, display, and analyze data, as well as to develop mathematical models of physical phenomena. The design of many of the activities is based on the outcomes of physics education research.
Feng, Rung-Chuang; Tseng, Kuan-Jui; Yan, Hsiu-Fang; Huang, Hsiu-Ya; Chang, Polun
2012-01-01
This study examines the capability of the Clinical Care Classification (CCC) system to represent nursing record data in a medical center in Taiwan. Nursing care records were analyzed using the process of knowledge discovery in data sets. The study data set included all the nursing care plan records from December 1998 to October 2008, totaling 2,060,214 care plan documentation entries. Results show that 75.42% of the documented diagnosis terms could be mapped using the CCC system. A total of 21 established nursing diagnoses were recommended to be added into the CCC system. Results show that one-third of the assessment and care tasks were provided by nursing professionals. This study shows that the CCC system is useful for identifying patterns in nursing practices and can be used to construct a nursing database in the acute setting. PMID:24199066
NASA Astrophysics Data System (ADS)
Barth, A. P.; Tani, K.; Meffre, S.; Wooden, J. L.; Coble, M. A.; Arculus, R. J.; Ishizuka, O.; Shukle, J. T.
2017-10-01
A 1.2 km thick Paleogene volcaniclastic section at International Ocean Discovery Program Site 351-U1438 preserves the deep-marine, proximal record of Izu-Bonin oceanic arc initiation, and volcano evolution along the Kyushu-Palau Ridge (KPR). Pb/U ages and trace element compositions of zircons recovered from volcaniclastic sandstones preserve a remarkable temporal record of juvenile island arc evolution. Pb/U ages ranging from 43 to 27 Ma are compatible with provenance in one or more active arc edifices of the northern KPR. The abundances of selected trace elements with high concentrations provide insight into the genesis of U1438 detrital zircon host melts, and represent useful indicators of both short and long-term variations in melt compositions in arc settings. The Site U1438 zircons span the compositional range between zircons from mid-ocean ridge gabbros and zircons from relatively enriched continental arcs, as predicted for melts in a primitive oceanic arc setting derived from a highly depleted mantle source. Melt zircon saturation temperatures and Ti-in-zircon thermometry suggest a provenance in relatively cool and silicic melts that evolved toward more Th and U-rich compositions with time. Th, U, and light rare earth element enrichments beginning about 35 Ma are consistent with detrital zircons recording development of regional arc asymmetry and selective trace element-enriched rear arc silicic melts as the juvenile Izu-Bonin arc evolved.
Beyer, Maila; Nazareno, Alison G.; Lohmann, Lúcia G.
2017-01-01
Premise of the study: We developed chloroplast microsatellite markers (cpSSRs) to be used to study the patterns of genetic structure and genetic diversity of populations of Stizophyllum riparium (Bignonieae, Bignoniaceae). Methods and Results: We used genomic data obtained through an Illumina HiSeq sequencing platform to develop a set of cpSSRs for S. riparium. A total of 36 primer pairs were developed, of which 28 displayed polymorphisms across 59 individuals from three populations. Two to 12 alleles were recorded, and the unbiased haploid diversity per locus ranged from 0.037 to 0.905. All 28 cpSSRs presented transferability to two closely related species, S. inaequilaterum and S. perforatum. Conclusions: We report a set of 28 cpSSRs for S. riparium. All markers were shown to be variable in S. riparium, indicating that these markers will be valuable for population genetic studies across S. riparium and congeneric species. PMID:29109920
Lansdowne, Krystal; Strauss, David G; Scully, Christopher G
2016-01-01
The cacophony of alerts and alarms produced by medical devices in a hospital results in alarm fatigue. The pulse oximeter is one of the most common sources of alarms. One of the ways to reduce alarm rates is to adjust alarm settings at the bedside. This study aimed to retrospectively examine the effect of individual pulse oximeter alarm settings on alarm rates and on inter- and intra-patient variability. Nine hundred sixty-two previously collected intensive care unit (ICU) patient records were obtained from the Multiparameter Intelligent Monitoring in Intensive Care II Database (Beth Israel Deaconess Medical Center, Boston, MA). Inclusion criteria included patient records that contained SpO2 trend data sampled at 1 Hz for at least 1 h and a matching clinical record. SpO2 alarm rates were simulated by applying a range of thresholds (84, 86, 88, and 90 %) and delay times (10 to 60 s) to the SpO2 data. Patient records with at least 12 h of SpO2 data were examined for variability in alarm rate over time. Decreasing SpO2 thresholds and increasing delay times resulted in decreased alarm rates. A limited number of patient records accounted for most alarms, and this number increased as alarm settings loosened (the top 10 % of patient records were responsible for 57.4 % of all alarms at an SpO2 threshold of 90 % and a 15 s delay, and 81.6 % at an SpO2 threshold of 84 % and a 45 s delay). Alarm rates were not consistent over time for individual patients, with periods of high and low alarms for all alarm settings. Pulse oximeter SpO2 alarm rates are variable between patients and over time, and the alarm rate and the extent of inter- and intra-patient variability can be affected by the alarm settings. Personalized alarm settings for a patient's current status may help to reduce alarm fatigue for nurses.
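The threshold-plus-delay simulation described above can be sketched directly: an alarm fires when the 1 Hz SpO2 trace stays below the threshold continuously for at least the delay time. The trace below is synthetic, not MIMIC-II data:

```python
# Sketch of the alarm simulation described above: an alarm fires when a
# 1 Hz SpO2 trace stays below the threshold for at least `delay_s` seconds.
# The trace below is synthetic, not from the MIMIC-II database.

def count_alarms(spo2_trace, threshold, delay_s):
    """Count alarm events on a 1 Hz SpO2 trace (one sample per second)."""
    alarms, run = 0, 0
    for value in spo2_trace:
        if value < threshold:
            run += 1
            if run == delay_s:  # fire once when the delay is first met
                alarms += 1
        else:
            run = 0  # desaturation ended; reset the delay counter
    return alarms

trace = [95] * 20 + [86] * 30 + [95] * 10 + [86] * 5 + [95] * 10
print(count_alarms(trace, threshold=88, delay_s=10))  # 1
print(count_alarms(trace, threshold=88, delay_s=60))  # 0
```

The second call shows the mechanism behind the study's finding: a longer delay suppresses brief desaturations, so alarm counts fall as the delay increases.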
Pacific northwest vowels: A Seattle neighborhood dialect study
NASA Astrophysics Data System (ADS)
Ingle, Jennifer K.; Wright, Richard; Wassink, Alicia
2005-04-01
According to current literature a large region encompassing nearly the entire west half of the U.S. belongs to one dialect region referred to as Western, which furthermore, according to Labov et al., "... has developed a characteristic but not unique phonology." [http://www.ling.upenn.edu/phono-atlas/NationalMap/NationalMap.html] This paper will describe the vowel space of a set of Pacific Northwest American English speakers native to the Ballard neighborhood of Seattle, Wash. based on the acoustical analysis of high-quality Marantz CDR 300 recordings. Characteristics, such as low back merger and [u] fronting will be compared to findings by other studies. It is hoped that these recordings will contribute to a growing number of corpora of North American English dialects. All participants were born in Seattle and began their residence in Ballard between ages 0-8. They were recorded in two styles of speech: individually reading repetitions of a word list containing one token each of 10 vowels within carrier phrases, and in casual conversation for 40 min with a partner matched in age, gender, and social mobility. The goal was to create a compatible data set for comparison with current acoustic studies. F1 and F2 and vowel duration from LPC spectral analysis will be presented.
A new web-based system to improve the monitoring of snow avalanche hazard in France
NASA Astrophysics Data System (ADS)
Bourova, Ekaterina; Maldonado, Eric; Leroy, Jean-Baptiste; Alouani, Rachid; Eckert, Nicolas; Bonnefoy-Demongeot, Mylene; Deschatres, Michael
2016-05-01
Snow avalanche data in the French Alps and Pyrenees have been recorded for more than 100 years in several databases. The increasing amount of observed data required a more integrative and automated service. Here we report the comprehensive web-based Snow Avalanche Information System newly developed to this end for three important data sets: an avalanche chronicle (Enquête Permanente sur les Avalanches, EPA), an avalanche map (Carte de Localisation des Phénomènes d'Avalanche, CLPA) and a compilation of hazard and vulnerability data recorded on selected paths endangering human settlements (Sites Habités Sensibles aux Avalanches, SSA). These data sets are now integrated into a common database, enabling full interoperability between all different types of snow avalanche records: digitized geographic data, avalanche descriptive parameters, eyewitness reports, photographs, hazard and risk levels, etc. The new information system is implemented through modular components using Java-based web technologies with Spring and Hibernate frameworks. It automates the manual data entry and improves the process of information collection and sharing, enhancing user experience and data quality, and offering new outlooks to explore and exploit the huge amount of snow avalanche data available for fundamental research and more applied risk assessment.
Gough, Karen; Magness, Laura; Winstanley, Julia
2012-07-01
This study is an audit of the Somerset Court Advice and Assessment Service (CAAS) throughout its first year of implementation. It reports that the service successfully met the six desired objectives as set out in its Service Level Agreement. Further to this, it reports that the use of National Health Service electronic patient records within a court setting facilitated the provision of apposite and timely information to the court. Specific findings were that deliberate self-harm/suicidal ideation and mood disorders were the primary reasons for a person requiring CAAS involvement. Violence against the person, breach of orders and theft were the most prevalent categories of offending within this referred group. The prevalence of previous psychiatric history was significantly higher than found in comparable audits. It is likely that this is due to the efficacy of proactive and in vivo utilization of electronic patient records. Conclusions include the need to work in partnership with drug and alcohol agencies and the importance of recognizing that these services have significant clinical benefits for defendants with mental health problems, and the court system in terms of financial savings. We suggest ongoing audit is necessary to guide the development of other schemes in this pioneering service area.
[Shared electronic health record in Catalonia, Spain].
Marimon-Suñol, Santiago; Rovira-Barberà, María; Acedo-Anta, Mateo; Nozal-Baldajos, Montserrat A; Guanyabens-Calvet, Joan
2010-02-01
Under the law adopted by its Parliament, the Government of Catalonia has developed an electronic medical record system for its National Health System (NHS). The model is governed by the following principles: 1) The citizen as owner of the data: direct access to his data and the right to exercise his right of opposition; 2) Generating confidence in the system: strong security and confidentiality; 3) A shared model of information management: a publishing system and access to organized and structured information, keeping in mind that the NHS of Catalonia is formally an "Integrated system of healthcare public use" (Catalan acronym: SISCAT) with a wide variety of legal structures within its healthcare institutions; 4) Use of communication standards and catalogs as a requirement for technological and functional integration. In summary: a single system of medical records shared between different actors, using interoperability tools and developed according to the legislation applicable in Catalonia and within its healthcare system. The result has been the establishment of a set of components and relation rules, among which we highlight the following: 1) Display of information that collects sociodemographic data of the citizen; documents or reports (radiology, laboratory, therapeutic procedures, hospital discharge, emergency room); diagnoses; prescriptions and immunizations; plus a summary screen with the most recent and relevant references; 2) A set of tools helping the user, and direct messaging between professionals to facilitate their cooperation; 3) A model designed for supranational connections, which will later allow adding, with ad hoc rules, clinical data provided by the private health sector or by the citizens themselves. 2010 Elsevier España S.L. All rights reserved.
A Pharmacy Blueprint for Electronic Medical Record Implementation Success
Bach, David S.; Risko, Kenneth R.; Farber, Margo S.; Polk, Gregory J.
2015-01-01
Objective: Implementation of an integrated, electronic medical record (EMR) has been promoted as a means of improving patient safety and quality. While there are a few reports of such processes that incorporate computerized prescriber order entry, pharmacy verification, an electronic medication administration record (eMAR), point-of-care barcode scanning, and clinical decision support, there are no published reports on how a pharmacy department can best participate in implementing such a process across a multihospital health care system. Method: This article relates the experience of the design, build, deployment, and maintenance of an integrated EMR solution from the pharmacy perspective. It describes a 9-month planning and build phase and the subsequent rollout at 8 hospitals over the following 13 months. Results: Key components of success are identified, as well as a set of guiding principles that proved invaluable in decision making and dispute resolution. Labor/personnel requirements for the various stages of the process are discussed, as are issues involving medication workflow analysis, drug database considerations, the development of clinical order sets, and incorporation of bar-code scanning of medications. Recommended implementation and maintenance strategies are presented, and the impact of EMR implementation on the pharmacy practice model and on revenue analysis is examined. Conclusion: Adherence to the principles and practices outlined in this article can assist pharmacy administrators and clinicians during all medication-related phases of the development, implementation, and maintenance of an EMR solution. Furthermore, review and incorporation of some or all of the practices presented may help ease the process and ensure its success. PMID:26405340
Code of Federal Regulations, 2014 CFR
2014-01-01
... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Records. 313.9 Section 313.9... ADJUSTMENT ASSISTANCE Administrative Provisions § 313.9 Records. Communities that receive assistance under this part are subject to the records requirements set out in § 302.14 of this chapter. ...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Records. 313.9 Section 313.9... ADJUSTMENT ASSISTANCE Administrative Provisions § 313.9 Records. Communities that receive assistance under this part are subject to the records requirements set out in § 302.14 of this chapter. ...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Records. 313.9 Section 313.9... ADJUSTMENT ASSISTANCE Administrative Provisions § 313.9 Records. Communities that receive assistance under this part are subject to the records requirements set out in § 302.14 of this chapter. ...
HomeBank: An Online Repository of Daylong Child-Centered Audio Recordings
VanDam, Mark; Warlaumont, Anne S.; Bergelson, Elika; Cristia, Alejandrina; Soderstrom, Melanie; De Palma, Paul; MacWhinney, Brian
2017-01-01
HomeBank is introduced here. It is a public, permanent, extensible, online database of daylong audio recorded in naturalistic environments. HomeBank serves two primary purposes. First, it is a repository for raw audio and associated files: one database requires special permissions, and another redacted database allows unrestricted public access. Associated files include metadata such as participant demographics and clinical diagnostics, automated annotations, and human-generated transcriptions and annotations. Many recordings use the child-perspective LENA recorders (LENA Research Foundation, Boulder, Colorado, United States), but various recordings and metadata can be accommodated. The HomeBank database can have both vetted and unvetted recordings, with different levels of accessibility. Additionally, HomeBank is an open repository for processing and analysis tools for HomeBank or similar data sets. HomeBank is flexible for users and contributors, making primary data available to researchers, especially those in child development, linguistics, and audio engineering. HomeBank facilitates researchers’ access to large-scale data and tools, linking the acoustic, auditory, and linguistic characteristics of children’s environments with a variety of variables including socioeconomic status, family characteristics, language trajectories, and disorders. Automated processing applied to daylong home audio recordings is now becoming widely used in early intervention initiatives, helping parents to provide richer speech input to at-risk children. PMID:27111272
A prototype of a computerized patient record.
Adelhard, K; Eckel, R; Hölzel, D; Tretter, W
1995-01-01
Computerized medical record systems (CPRS) should present user- and problem-oriented views of the patient file. Problem lists, clinical course, medication profiles and results of examinations have to be recorded in a computerized patient record (CPR). Patient review screens should give a synopsis of the patient data to inform the user whenever the patient record is opened. Several different types of data have to be stored in a patient record; qualitative and quantitative measurements, narratives and images are examples. Therefore, a CPR must also be able to handle these different data types. New methods and concepts appear frequently in medicine, so a CPRS must be flexible enough to cope with coming demands. We developed a prototype of a computer-based patient record with a graphical user interface on a SUN workstation. The basis of the system is a dynamic data dictionary, an interpreter language and a large set of basic functions. This approach gives the system optimal flexibility. Many different data types are already supported, and extensions are easily possible. There is also almost no limit on the number of medical concepts that can be handled by our prototype. Several applications were built on this platform; some of them are presented to exemplify the patient- and problem-oriented handling of the CPR.
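The dynamic data dictionary described in this abstract lets new medical concepts be registered at runtime rather than fixed in a schema. A minimal sketch of that idea, with all class and concept names hypothetical (the actual prototype used an interpreter language on a SUN workstation):

```python
# Sketch of a data-dictionary-driven patient record: concepts and their
# types are registered at runtime, so new medical concepts need no schema
# change. Names and fields are illustrative, not the prototype's.
from datetime import date

class DataDictionary:
    def __init__(self):
        self.concepts = {}  # concept name -> expected Python type

    def register(self, name, value_type):
        self.concepts[name] = value_type

class PatientRecord:
    def __init__(self, dictionary):
        self.dictionary = dictionary
        self.entries = []  # chronological list of (date, concept, value)

    def record(self, when, concept, value):
        expected = self.dictionary.concepts.get(concept)
        if expected is None:
            raise KeyError(f"unknown concept: {concept}")
        if not isinstance(value, expected):
            raise TypeError(f"{concept} expects {expected.__name__}")
        self.entries.append((when, concept, value))

    def problem_list(self):
        # patient-review synopsis: latest value per concept
        latest = {}
        for when, concept, value in sorted(self.entries):
            latest[concept] = value
        return latest

dd = DataDictionary()
dd.register("serum glucose (mmol/L)", float)   # quantitative measurement
dd.register("chief complaint", str)            # narrative
rec = PatientRecord(dd)
rec.record(date(1995, 1, 10), "chief complaint", "polyuria")
rec.record(date(1995, 1, 10), "serum glucose (mmol/L)", 11.2)
print(rec.problem_list())
```

Registering a new concept is a single call, which is the flexibility the abstract attributes to the dictionary-based design.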
NASA Astrophysics Data System (ADS)
St Jacques, J.; Cumming, B. F.; Sauchyn, D.; Vanstone, J. R.; Dickenson, J.; Smol, J. P.
2013-12-01
A vital component of paleoclimatology is the validation of paleoclimatological reconstructions. Unfortunately, there is scant instrumental data prior to the 20th century available for this. Hence, typically, we can only do long-term validation using other proxy-inferred climate reconstructions. Minnesota, USA, with its long military fort climate records beginning in 1820 and early dense network of climate stations, offers a rare opportunity for proxy validation. We compare a high-resolution (4-year), millennium-scale, pollen-inferred paleoclimate record derived from varved Lake Mina in central Minnesota to early military fort records and dendroclimatological records. When inferring a paleoclimate record from a pollen record, we rely upon the pollen-climate relationship being constant in time. However, massive human impacts have significantly altered vegetation; and the relationship between modern instrumental climate data and the modern pollen rain becomes altered from what it was in the past. In the Midwest, selective logging, fire suppression, deforestation and agriculture have strongly influenced the modern pollen rain since Euro-American settlement in the mid-1800s. We assess the signal distortion introduced by using the conventional method of modern post-settlement pollen and climate calibration sets to infer climate at Lake Mina from pre-settlement pollen data. Our first February and May temperature reconstructions are based on a pollen dataset contemporaneous with early settlement to which corresponding climate data from the earliest instrumental records has been added to produce a 'pre-settlement' calibration set. The second February and May temperature reconstructions are based on a conventional 'modern' pollen-climate dataset from core-top pollen samples and modern climate normals. 
The temperature reconstructions are then compared to the earliest instrumental records from Fort Snelling, Minnesota, and it is shown that the reconstructions based on the pre-settlement calibration set give much more credible reconstructions. We then compare the temperature reconstructions based upon the two calibration sets for AD 1116-2002. Significant signal flattening and bias exist when using the conventional modern pollen-climate calibration set rather than the pre-settlement pollen-climate calibration set, resulting in an overestimation of Little Ice Age monthly mean temperatures of 0.5-1.5 °C. Therefore, regional warming from anthropogenic global warming is significantly underestimated when using the conventional method of building pollen-climate calibration sets. We also compare the Lake Mina pollen-inferred effective moisture record to early 19th century climate data and to a four-century tree-ring inferred moisture reconstruction based upon sites in Minnesota and the Dakotas. This comparison shows that regional tree-ring reconstructions are biased towards dry conditions and record wet periods poorly relative to high-resolution pollen reconstructions, giving a false impression of regional aridity. It also suggests that varve chronologies should be based upon cross-dating to ensure a more accurate chronology.
Mann, Devin M; Lin, Jenny J
2012-01-23
Studies have shown that lifestyle behavior changes are most effective to prevent onset of diabetes in high-risk patients. Primary care providers are charged with encouraging behavior change among their patients at risk for diabetes, yet the practice environment and training in primary care often do not support effective provider counseling. The goal of this study is to develop an electronic health record-embedded tool to facilitate shared patient-provider goal setting to promote behavioral change and prevent diabetes. The ADAPT (Avoiding Diabetes Thru Action Plan Targeting) trial leverages an innovative system that integrates evidence-based interventions for behavioral change with already-existing technology to enhance primary care providers' effectiveness to counsel about lifestyle behavior changes. Using principles of behavior change theory, the multidisciplinary design team utilized in-depth interviews and in vivo usability testing to produce a prototype diabetes prevention counseling system embedded in the electronic health record. The core element of the tool is a streamlined, shared goal-setting module within the electronic health record system. The team then conducted a series of innovative, "near-live" usability testing simulations to refine the tool and enhance workflow integration. The system also incorporates a pre-encounter survey to elicit patients' behavior-change goals to help tailor patient-provider goal setting during the clinical encounter and to encourage shared decision making. Lastly, the patients interact with a website that collects their longitudinal behavior data and allows them to visualize their progress over time and compare their progress with other study members. The finalized ADAPT system is now being piloted in a small randomized control trial of providers using the system with prediabetes patients over a six-month period. 
The ADAPT system combines the influential powers of shared goal setting and feedback, tailoring, modeling, contracting, reminders, and social comparisons to integrate evidence-based behavior-change principles into the electronic health record to maximize provider counseling efficacy during routine primary care clinical encounters. If successful, the ADAPT system may represent an adaptable and scalable technology-enabled behavior-change tool for all primary care providers. ClinicalTrials.gov Identifier NCT01473654.
Detection and analysis of radio frequency lightning emissions
NASA Technical Reports Server (NTRS)
Jalali, F.
1982-01-01
Assessing the feasibility of detecting lightning discharges from a geosynchronous satellite requires adequate ground-based information on emission characteristics. In this investigation, a measurement system for collecting S-band emission data was set up and calibrated, and operating procedures for rapid data collection during storm activity were developed. The system collects emission data in two modes: a digitized, high-resolution, short-duration record stored in solid-state memory, and a continuous long-duration record on magnetic tape. Representative lightning flash data are shown. Preliminary results indicate appreciable RF emissions at 2 GHz from both the leader and return-stroke portions of the cloud-to-ground discharge, with strong peaks associated with the return strokes.
Method for extracting long-equivalent wavelength interferometric information
NASA Technical Reports Server (NTRS)
Hochberg, Eric B. (Inventor)
1991-01-01
A process for extracting long-equivalent wavelength interferometric information from a two-wavelength polychromatic or achromatic interferometer. The process comprises the steps of simultaneously recording a non-linear sum of two different frequency visible light interferograms on a high resolution film and then placing the developed film in an optical train for Fourier transformation, low pass spatial filtering and inverse transformation of the film image to produce low spatial frequency fringes corresponding to a long-equivalent wavelength interferogram. The recorded non-linear sum irradiance derived from the two-wavelength interferometer is obtained by controlling the exposure so that the average interferogram irradiance is set at either the noise level threshold or the saturation level threshold of the film.
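The transform, low-pass filter, inverse-transform chain described above can be imitated numerically. In the sketch below (fringe frequencies, the squaring nonlinearity standing in for the film's response, and the cutoff are all illustrative assumptions), filtering in the Fourier domain leaves only the difference-frequency fringe, i.e. the long-equivalent-wavelength interferogram:

```python
import numpy as np

# Two fringe patterns of slightly different spatial frequency, recorded
# through a nonlinearity (here squaring, a stand-in for the film's
# nonlinear exposure response), then Fourier transformed, low-pass
# filtered, and inverse transformed.
x = np.linspace(0.0, 1.0, 4096, endpoint=False)
f1, f2 = 200.0, 210.0                      # fringe frequencies, cycles per unit length
irradiance = 2 + np.cos(2 * np.pi * f1 * x) + np.cos(2 * np.pi * f2 * x)
recorded = irradiance ** 2                 # nonlinear recording creates a term at f2 - f1

spectrum = np.fft.rfft(recorded)
freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])
spectrum[freqs > 50.0] = 0.0               # low-pass spatial filter
fringes = np.fft.irfft(spectrum, n=x.size)

# The surviving fringe sits at the difference frequency f2 - f1 = 10 cycles,
# the low-spatial-frequency fringe of the long-equivalent-wavelength interferogram.
peak = freqs[np.argmax(np.abs(np.fft.rfft(fringes - fringes.mean())))]
print(peak)  # -> 10.0
```

The patent performs the same three steps optically, with lenses and a physical stop in place of the FFT and the masked spectrum.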
Rohrer Vitek, Carolyn R; Abul-Husn, Noura S; Connolly, John J; Hartzler, Andrea L; Kitchner, Terrie; Peterson, Josh F; Rasmussen, Luke V; Smith, Maureen E; Stallings, Sarah; Williams, Marc S; Wolf, Wendy A; Prows, Cynthia A
2017-01-01
Ten organizations within the Electronic Medical Records and Genomics Network developed programs to implement pharmacogenomic sequencing and clinical decision support into clinical settings. Recognizing the importance of informed prescribers, a variety of strategies were used to incorporate provider education to support implementation. Education experiences with pharmacogenomics are described within the context of each organization's prior involvement, including the scope and scale of implementation specific to their Electronic Medical Records and Genomics projects. We describe common and distinct education strategies, provide exemplars and share challenges. Lessons learned inform future perspectives. Future pharmacogenomics clinical implementation initiatives need to include funding toward implementing provider education and evaluating outcomes. PMID:28639489
Oza, Shefali; Jazayeri, Darius; Teich, Jonathan M; Ball, Ellen; Nankubuge, Patricia Alexandra; Rwebembera, Job; Wing, Kevin; Sesay, Alieu Amara; Kanter, Andrew S; Ramos, Glauber D; Walton, David; Cummings, Rachael; Checchi, Francesco; Fraser, Hamish S
2017-08-21
Stringent infection control requirements at Ebola treatment centers (ETCs), which are specialized facilities for isolating and treating Ebola patients, create substantial challenges for recording and reviewing patient information. During the 2014-2016 West African Ebola epidemic, paper-based data collection systems at ETCs compromised the quality, quantity, and confidentiality of patient data. Electronic health record (EHR) systems have the potential to address such problems, with benefits for patient care, surveillance, and research. However, no suitable software was available for deployment when large-scale ETCs opened as the epidemic escalated in 2014. We present our work on rapidly developing and deploying OpenMRS-Ebola, an EHR system for the Kerry Town ETC in Sierra Leone. We describe our experience, lessons learned, and recommendations for future health emergencies. We used the OpenMRS platform and Agile software development approaches to build OpenMRS-Ebola. Key features of our work included daily communications between the development team and ground-based operations team, iterative processes, and phased development and implementation. We made design decisions based on the restrictions of the ETC environment and regular user feedback. To evaluate the system, we conducted predeployment user questionnaires and compared the EHR records with duplicate paper records. We successfully built OpenMRS-Ebola, a modular stand-alone EHR system with a tablet-based application for infectious patient wards and a desktop-based application for noninfectious areas. OpenMRS-Ebola supports patient tracking (registration, bed allocation, and discharge); recording of vital signs and symptoms; medication and intravenous fluid ordering and monitoring; laboratory results; clinician notes; and data export. It displays relevant patient information to clinicians in infectious and noninfectious zones. 
We implemented phase 1 (patient tracking; drug ordering and monitoring) after 2.5 months of full-time development. OpenMRS-Ebola was used for 112 patient registrations, 569 prescription orders, and 971 medication administration recordings. We were unable to fully implement phases 2 and 3 as the ETC closed because of a decrease in new Ebola cases. The phase 1 evaluation suggested that OpenMRS-Ebola worked well in the context of the rollout, and the user feedback was positive. To our knowledge, OpenMRS-Ebola is the most comprehensive adaptable clinical EHR built for a low-resource setting health emergency. It is designed to address the main challenges of data collection in highly infectious environments that require robust infection prevention and control measures and it is interoperable with other electronic health systems. Although we built and deployed OpenMRS-Ebola more rapidly than typical software, our work highlights the challenges of having to develop an appropriate system during an emergency rather than being able to rapidly adapt an existing one. Lessons learned from this and previous emergencies should be used to ensure that a set of well-designed, easy-to-use, pretested health software is ready for quick deployment in future. ©Shefali Oza, Darius Jazayeri, Jonathan M Teich, Ellen Ball, Patricia Alexandra Nankubuge, Job Rwebembera, Kevin Wing, Alieu Amara Sesay, Andrew S Kanter, Glauber D Ramos, David Walton, Rachael Cummings, Francesco Checchi, Hamish S Fraser. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.08.2017.
Code of Federal Regulations, 2014 CFR
2014-04-01
... standard for biometric data specifications for personal identity verification. Operating point means a... records on its servers. Audit trail means a record showing who has accessed an information technology... information on a local server or hard drive. Certificate policy means a named set of rules that sets forth the...
Code of Federal Regulations, 2011 CFR
2011-04-01
... standard for biometric data specifications for personal identity verification. Operating point means a... records on its servers. Audit trail means a record showing who has accessed an information technology... information on a local server or hard drive. Certificate policy means a named set of rules that sets forth the...
Code of Federal Regulations, 2013 CFR
2013-04-01
... standard for biometric data specifications for personal identity verification. Operating point means a... records on its servers. Audit trail means a record showing who has accessed an information technology... information on a local server or hard drive. Certificate policy means a named set of rules that sets forth the...
Acoustic classification of multiple simultaneous bird species: a multi-instance multi-label approach
F. Briggs; B. Lakshminarayanan; L. Neal; X.Z. Fern; R. Raich; S.F. Hadley; A.S. Hadley; M.G. Betts
2012-01-01
Although field-collected recordings typically contain multiple simultaneously vocalizing birds of different species, acoustic species classification in this setting has received little study so far. This work formulates the problem of classifying the set of species present in an audio recording using the multi-instance multi-label (MIML) framework for machine learning...
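A common baseline for the MIML formulation is to reduce each bag of instances (e.g., one feature vector per detected syllable) to a single bag-level vector and train one binary classifier per species (binary relevance). The sketch below uses synthetic data and a nearest-centroid classifier; these are illustrative stand-ins, not the features or classifiers used in the paper:

```python
import numpy as np

# MIML reduction sketch: each recording is a bag of instance feature
# vectors; its label is the SET of species present. Map each bag to its
# instance mean, then train one binary classifier per species.
def bag_features(bag):
    return np.mean(bag, axis=0)

class CentroidClassifier:
    """Predict a label as present if the bag feature is nearer the positive centroid."""
    def fit(self, X, y):
        self.pos = X[y == 1].mean(axis=0)
        self.neg = X[y == 0].mean(axis=0)
        return self
    def predict(self, X):
        d_pos = np.linalg.norm(X - self.pos, axis=1)
        d_neg = np.linalg.norm(X - self.neg, axis=1)
        return (d_pos < d_neg).astype(int)

rng = np.random.default_rng(0)
bags, labels = [], []
for _ in range(200):
    present = rng.integers(0, 2, size=2)      # which of two species vocalize
    n = rng.integers(3, 8)                    # instances (syllables) in the bag
    # each species shifts the (synthetic) feature distribution along its own axis
    centers = present[0] * np.array([3.0, 0.0]) + present[1] * np.array([0.0, 3.0])
    bags.append(rng.normal(centers, 1.0, size=(n, 2)))
    labels.append(present)

X = np.array([bag_features(b) for b in bags])
Y = np.array(labels)
clfs = [CentroidClassifier().fit(X, Y[:, j]) for j in range(2)]
pred = np.column_stack([c.predict(X) for c in clfs])
print("exact-set accuracy:", np.mean((pred == Y).all(axis=1)))
```

The MIML framework proper learns from the bag structure directly rather than collapsing it; this reduction is only the simplest point of comparison.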
13 CFR 106.402 - What provisions must be set forth in a Non-Fee Based Record?
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What provisions must be set forth in a Non-Fee Based Record? 106.402 Section 106.402 Business Credit and Assistance SMALL BUSINESS... endorsement by SBA of the Donor, or the Donor's products or services. ...
Required number of records for ASCE/SEI 7 ground-motion scaling procedure
Reyes, Juan C.; Kalkan, Erol
2011-01-01
The procedures and criteria in the 2006 IBC (International Council of Building Officials, 2006) and 2007 CBC (International Council of Building Officials, 2007) for the selection and scaling of ground motions for use in nonlinear response history analysis (RHA) of structures are based on ASCE/SEI 7 provisions (ASCE, 2005, 2010). According to ASCE/SEI 7, earthquake records should be selected from events whose magnitudes, fault distances, and source mechanisms are consistent with the maximum considered earthquake, and then scaled so that the average value of the 5-percent-damped response spectra for the set of scaled records is not less than the design response spectrum over the period range from 0.2Tn to 1.5Tn (where Tn is the fundamental vibration period of the structure). If at least seven ground motions are analyzed, the design values of engineering demand parameters (EDPs) are taken as the average of the EDPs determined from the analyses. If fewer than seven ground motions are analyzed, the design values of EDPs are taken as the maximum values of the EDPs. ASCE/SEI 7 requires a minimum of three ground motions. These limits on the number of records in the ASCE/SEI 7 procedure are based on engineering experience rather than on a comprehensive evaluation. This study statistically examines the required number of records for the ASCE/SEI 7 procedure, such that the scaled records provide accurate, efficient, and consistent estimates of "true" structural responses. Based on elastic-perfectly-plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI 7 scaling procedure is applied to 480 sets of ground motions. The number of records in these sets varies from three to ten. The records in each set were selected either (i) randomly, (ii) considering their spectral shapes, or (iii) considering their spectral shapes and design spectral-acceleration value, A(Tn).
As compared to benchmark (that is, "true") responses from unscaled records using a larger catalog of ground-motions, it is demonstrated that the ASCE/SEI 7 scaling procedure is overly conservative if fewer than seven ground-motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the EDPs accompanied by reduced record-to-record variability of the responses. Consistency in accuracy and efficiency is achieved only if records are selected on the basis of their spectral shape and A(Tn).
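The scaling rule and the seven-record threshold summarized above can be expressed compactly. The sketch below uses synthetic stand-in spectra (the spectral shapes and values are illustrative, not from the study's 480 sets):

```python
import numpy as np

# Sketch of the ASCE/SEI 7 amplitude-scaling check and the EDP design-value
# rule described in the abstract. Spectra are hypothetical stand-ins for
# 5%-damped response spectra.
def asce7_scale_factor(periods, record_spectra, design_spectrum, Tn):
    """Smallest single factor s such that the average of the scaled record
    spectra is not less than the design spectrum over 0.2*Tn to 1.5*Tn."""
    mask = (periods >= 0.2 * Tn) & (periods <= 1.5 * Tn)
    avg = np.mean(record_spectra, axis=0)
    return np.max(design_spectrum[mask] / avg[mask])

def design_edp(edps):
    """ASCE/SEI 7: average of the EDPs if at least seven records, else the maximum."""
    edps = np.asarray(edps, dtype=float)
    return edps.mean() if edps.size >= 7 else edps.max()

periods = np.linspace(0.05, 3.0, 60)
Tn = 1.0
design = 1.0 / (1.0 + periods)              # hypothetical design spectrum
records = np.array([0.8 / (1.0 + periods),  # two hypothetical record spectra
                    0.6 / (1.0 + periods)])

s = asce7_scale_factor(periods, records, design, Tn)
print(round(s, 3))                         # average spectrum is 0.7x design -> s = 1/0.7
print(design_edp([2.0, 3.0, 4.0]))         # 3 records -> maximum
print(design_edp([2, 2, 2, 2, 2, 2, 9]))   # 7 records -> average
```

The study's question is how the record count feeds the mean-versus-maximum rule in `design_edp`, since a small set forces the conservative maximum.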
Algorithm for automatic analysis of electro-oculographic data
2013-01-01
Background: Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. Methods: The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. Results: The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. Conclusion: The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement. PMID:24160372
Algorithm for automatic analysis of electro-oculographic data.
Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti
2013-10-25
Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement.
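The auto-calibration idea, deriving amplitude thresholds from features of the recorded signal itself, can be sketched as follows. The threshold statistic (median plus a multiple of the MAD of the first differences), the constant k, and the synthetic signal are illustrative assumptions, not the authors' actual feature set:

```python
import numpy as np

# Auto-calibrating event detection sketch: estimate an amplitude threshold
# from the signal's own statistics, then flag samples whose velocity
# (first difference) exceeds it. All values are synthetic.
def auto_threshold(signal, k=6.0):
    d = np.abs(np.diff(signal))
    mad = np.median(np.abs(d - np.median(d)))  # robust spread estimate
    return np.median(d) + k * mad

def detect_events(signal, threshold):
    """Indices where the first difference exceeds the auto-calibrated threshold."""
    return np.where(np.abs(np.diff(signal)) > threshold)[0]

rng = np.random.default_rng(1)
eog = rng.normal(0.0, 1.0, 2000)       # baseline noise
eog[500:520] += 80.0                   # a blink-like deflection
eog[1200:1210] += 40.0                 # a saccade-like step

thr = auto_threshold(eog)
events = detect_events(eog, thr)
# the rising and falling edges of both synthetic events are flagged
print(sorted({499, 519, 1199, 1209} & set(events.tolist())))
```

Because the threshold is computed from robust statistics of the recording, no per-subject manual calibration is needed, which is the property the abstract highlights.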
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano
A few automated data acquisition and processing systems operate on mainframes; some run on UNIX-based workstations, and others on personal computers equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems), some of which use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and their real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure, MSA (multi-station analysis), for signal detection, phase grouping and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network.
For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of S-waves. The on-line application on the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
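The multi-station analysis (MSA) idea, declaring an event only when detections group across stations, can be sketched with a simple STA/LTA trigger and a coincidence check. All window lengths, thresholds, and the synthetic traces below are illustrative, not PC-Seism's actual parameters:

```python
import numpy as np

# Sketch of multi-station event identification: a short-term/long-term
# average (STA/LTA) trigger per station, then an event is declared only
# when enough stations trigger within a coincidence window.
def sta_lta_trigger(trace, sta=10, lta=100, ratio=4.0):
    """Return sample indices where the STA/LTA energy ratio exceeds `ratio`."""
    energy = trace ** 2
    triggers = []
    for i in range(lta, len(trace) - sta):
        s = energy[i:i + sta].mean()
        l = energy[i - lta:i].mean()
        if l > 0 and s / l > ratio:
            triggers.append(i)
    return triggers

def coincidence(trigger_lists, window=50, min_stations=3):
    """Event declared if >= min_stations have a trigger within `window` samples."""
    for trigs in trigger_lists:
        for t in trigs:
            n = sum(1 for other in trigger_lists
                    if any(abs(t - u) <= window for u in other))
            if n >= min_stations:
                return True
    return False

rng = np.random.default_rng(2)
onset = 600
stations = []
for delay in (0, 12, 25, 400):       # moveout; the last arrival is far outside the window
    trace = rng.normal(0, 1, 1500)
    trace[onset + delay:onset + delay + 150] += rng.normal(0, 8, 150)
    stations.append(sta_lta_trigger(trace))

print(coincidence(stations))
```

Grouping detections across stations this way is what suppresses single-station false triggers before phase grouping and location are attempted.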
Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T
2012-12-01
Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and export a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). 
We have developed a relational database system to integrate the quantitative results of MAG3 image processing with clinical records obtained from the hospital information system. We also have developed a methodology for formatting clinical history for review by physicians and export to a decision support system. We identified several pitfalls, including the fact that important textual information extracted from the hospital information system by knowledgeable transcribers can show substantial interobserver variation, particularly when record retrieval is based on the narrative clinical records.
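The match-and-export step described above, joining records present in both the quantitative (Q2) and clinical databases on a shared patient key and writing the result as XML for the decision support system, can be sketched as follows. The field names and keys are hypothetical illustrations, not the actual 342-field schema:

```python
import xml.etree.ElementTree as ET

# Join records present in both databases on a shared patient key, then
# serialize the merged rows as XML. All field names are illustrative.
q2 = [
    {"patient_id": "P001", "relative_uptake_left": 48.0, "t_half_min": 9.5},
    {"patient_id": "P002", "relative_uptake_left": 31.0, "t_half_min": 22.0},
]
clinical = [
    {"patient_id": "P002", "hydronephrosis": "yes", "ureteral_stent": "no"},
    {"patient_id": "P003", "hydronephrosis": "no", "ureteral_stent": "no"},
]

clin_by_id = {row["patient_id"]: row for row in clinical}
matched = [dict(q, **clin_by_id[q["patient_id"]])
           for q in q2 if q["patient_id"] in clin_by_id]

root = ET.Element("renal_studies")
for row in matched:
    study = ET.SubElement(root, "study", patient_id=row["patient_id"])
    for field, value in row.items():
        if field != "patient_id":
            ET.SubElement(study, field).text = str(value)

print(ET.tostring(root, encoding="unicode"))
```

Only patients present in both sources survive the join, mirroring how the CLINICAL database is a subset of Q2 in the abstract.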
Development of Virtual Auditory Interfaces
2001-03-01
The abstract text for this record is garbled by two-column extraction; only fragments are recoverable. These mention using a reference to compare sound in the virtual environment (VE) with real-world experience; lessons from the entertainment industry, including a system called "Fantasound"; a data set including sound recordings and sound measurements; and a first evaluation system using a portable Sony TCD-D8 DAT audio recorder.
A Crowdsourcing Framework for Medical Data Sets
Ye, Cheng; Coco, Joseph; Epishova, Anna; Hajaj, Chen; Bogardus, Henry; Novak, Laurie; Denny, Joshua; Vorobeychik, Yevgeniy; Lasko, Thomas; Malin, Bradley; Fabbri, Daniel
2018-01-01
Crowdsourcing services like Amazon Mechanical Turk allow researchers to ask questions to crowds of workers and quickly receive high quality labeled responses. However, crowds drawn from the general public are not suitable for labeling sensitive and complex data sets, such as medical records, due to various concerns. Major challenges in building and deploying a crowdsourcing system for medical data include, but are not limited to: managing access rights to sensitive data and ensuring data privacy controls are enforced; identifying workers with the necessary expertise to analyze complex information; and efficiently retrieving relevant information in massive data sets. In this paper, we introduce a crowdsourcing framework to support the annotation of medical data sets. We further demonstrate a workflow for crowdsourcing clinical chart reviews including (1) the design and decomposition of research questions; (2) the architecture for storing and displaying sensitive data; and (3) the development of tools to support crowd workers in quickly analyzing information from complex data sets. PMID:29888085
USDA-ARS?s Scientific Manuscript database
Average yields of peanut in the U.S. set an all-time record of 4,695 kg ha-1 in 2012. This far exceeded the previous record yield of 3,837 kg ha-1 in 2008. Favorable weather conditions undoubtedly contributed to the record yields in 2012; however, these record yields would not have been achievable...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Records. § 1633.11 Section § 1633.11... FLAMMABILITY (OPEN FLAME) OF MATTRESS SETS Rules and Regulations § 1633.11 Records. (a) Test and manufacturing records - general. Every manufacturer and any other person initially introducing into commerce mattress...
12 CFR 261b.11 - Transcripts, recordings, and minutes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... minutes. (a) The agency will maintain a complete transcript or electronic recording or transcription... § 261b.5 of this part. Transcriptions of recordings will disclose the identity of each speaker. (b) The agency will maintain either such a transcript, recording or transcription thereof, or a set of minutes...
Development of a pseudo/anonymised primary care research database: Proof-of-concept study.
MacRury, Sandra; Finlayson, Jim; Hussey-Wilson, Susan; Holden, Samantha
2016-06-01
General practice records present a comprehensive source of data that could form a variety of anonymised or pseudonymised research databases to aid identification of potential research participants regardless of location. A proof-of-concept study was undertaken to extract data from general practice systems in 15 practices across the region to form pseudonymised and anonymised research data sets. Two feasibility studies and a disease surveillance study compared, respectively, the numbers of potential study participants identified and the accuracy of disease prevalence estimates. There was a marked reduction in screening time and an increase in the number of potential study participants identified with the research repository compared with conventional methods. Accurate disease prevalence was established and enhanced with the addition of selective text mining. This study confirms the potential for development of a national anonymised research database from general practice records, in addition to improving data collection for local or national audits and epidemiological projects. © The Author(s) 2014.
Nutritional Requirements for Space Station Freedom Crews
NASA Technical Reports Server (NTRS)
Lane, Helen W.; Rice, Barbara L.; Wogan, Christine F. (Editor)
1992-01-01
The purpose of this report was to set preliminary nutritional requirements for crewmembers flying 90- to 180-day missions on Space Station Freedom. Specific recommendations included providing crewmembers with in-flight feedback on nutritional intake, weight, and strength, and incorporating issues of energy intake, body weight, body composition, strength, and protein intake into the flight medicine program. Exercise must be considered an integral part of any plan to maintain nutritional status, especially modes that stress the skeleton and maintain body weight. Nutrient intake, amount of exercise, and drugs ingested must be recorded daily; high priority should be given to the development of fully automated record systems that minimize astronauts' effort. A system of nutritional supplements should be developed to provide a method for reducing intake deficits that become apparent. Finally, postflight monitoring should include bone density, muscle mass and function, and iron status at three and six months after landing.
Microcomputer Nurse-Practitioner Protocols
Way, Anthony B.; Rowley, Blair A.; White, Melanie A.
1982-01-01
We have developed a set of protocols on a microcomputer to assist in the management of a geographically isolated nurse practitioner. If a mid-level practitioner is supervised by a physician, some system is needed to ensure that approved care is being provided. The currently available paper-based protocols do not adequately serve all the needs for training, auditing, and record keeping. Conversely, adequate systems based on large computers are not feasible for small clinics. We have therefore developed a microcomputer-based system of protocols for a small rural nurse-practitioner's clinic. Our programs are designed for direct use by the practitioners while the patient is in the clinic. The user is given immediate feedback about any errors. The supervisor is later provided with a summary of protocol uses and errors, and a copy of any erroneous records. The system appears easy for the nurse practitioner to use. The protocols are quickly learned, and auditing is facilitated.
Orlova, Anna O; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven
2005-01-01
Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway, including the development of Regional Health Information Exchanges (RHIEs), Regional Health Information Organizations (RHIOs), and the National Health Information Network (NHIN). These initiatives must include public health as a full partner in the emerging transformation of our nation's healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e., hospital or ambulatory care settings, to multiple customized public health systems, including a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry, and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model - a principal element of a potential technical architecture for a NHIN.
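The HL7-based transfer the prototype relies on can be illustrated with a toy message builder. This is a hedged sketch, not the prototype's actual interface: the application names, registry identifier, patient data, and field contents are invented, and a production interface would follow the full HL7 v2 specification for the target registry.

```python
# Toy HL7 v2.x-style pipe-delimited message builder (illustrative only;
# names and field contents are invented, not taken from the EHR-PH prototype).
def build_hl7(sending_app, receiving_app, message_type, control_id,
              patient_id, patient_name):
    # MSH: message header. "|" is the field separator, "^~\&" the encoding chars.
    msh = "|".join(["MSH", "^~\\&", sending_app, "", receiving_app, "",
                    "20050101120000", "", message_type, control_id, "P", "2.3.1"])
    # PID: patient identification segment (a minimal subset of its fields).
    pid = "|".join(["PID", "1", "", patient_id, "", patient_name])
    return "\r".join([msh, pid])  # HL7 v2 separates segments with carriage returns

msg = build_hl7("EHR", "NBS_REGISTRY", "ORU^R01", "0001", "12345", "DOE^JANE")
```

In practice an interface engine would construct and route such messages from the EHR to each registry's listener.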
Semi-automated Data Set Submission Work Flow for Archival with the ORNL DAAC
NASA Astrophysics Data System (ADS)
Wright, D.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Eby, P.; Heinz, S. L.; Hook, L. A.; McMurry, B. F.; Shanafield, H. A.; Sill, D.; Santhana Vannan, S.; Wei, Y.
2013-12-01
The ORNL DAAC archives and publishes, free of charge, data and information relevant to biogeochemical, ecological, and environmental processes. The ORNL DAAC primarily archives data produced by NASA's Terrestrial Ecology Program; however, any data that are pertinent to the biogeochemical and ecological community are of interest. The data set submission process at the ORNL DAAC has recently been updated and semi-automated to provide a consistent data provider experience and to create a uniform data product. Data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. If the ORNL DAAC is the appropriate archive for a data set, the data provider is sent an email with several URL links that guide them through the submission process. The data provider is asked to fill out a short online form to help the ORNL DAAC staff better understand the data set. The questions cover information about the data set, a description of the data set, its temporal and spatial characteristics, and how the data were prepared and delivered. The questionnaire is generic and has been designed to gather input on the diverse data sets the ORNL DAAC archives. A data upload module and a metadata editor further guide the data provider through the submission process. For submission purposes, a complete data set includes data files, document(s) describing the data, supplemental files, metadata record(s), and the online form. The ORNL DAAC performs five major functions during the process of archiving data: 1) ingestion, the ORNL DAAC side of submission, in which data are checked, metadata records are compiled, and files are converted to archival formats; 2) metadata records and data set documentation are made searchable, and the data set is given a permanent URL; 3) the data set is published, assigned a DOI, and advertised; 4) the data set receives long-term post-project support; and 5) stewardship ensures the data are stored on state-of-the-art computer systems with reliable backups.
Comparative study of mobile Raman instrumentation for art analysis.
Vandenabeele, P; Castro, K; Hargreaves, M; Moens, L; Madariaga, J M; Edwards, H G M
2007-04-04
In archaeometry, one of the main concerns is to extract information from an art object without damaging it. Raman spectroscopy is being applied in this research field, with recent developments in mobile instrumentation facilitating more routine analysis. This research paper evaluates the performance of five mobile Raman instruments (Renishaw RA100, Renishaw Portable Raman Analyser RX210, Ocean Optics RSL-1, Delta Nu Inspector Raman, and the Mobile Art Analyser, MArtA) in three different laboratories. A set of samples was collected to obtain information on the spectral performance of the instruments, including spectral resolution, calibration, laser cut-off, the ability to record spectra of organic and inorganic pigments through varnish layers, and the possibility of identifying biomaterials. Spectra were recorded from predefined regions on a canvas painting to simulate the investigation of artworks, and the capability to record spectra from areas that are difficult to access was evaluated.
Vital sign documentation in electronic records: The development of workarounds.
Stevenson, Jean E; Israelsson, Johan; Nilsson, Gunilla; Petersson, Goran; Bath, Peter A
2018-06-01
Workarounds are commonplace in healthcare settings. An increase in the use of electronic health records has led to an escalation of workarounds as healthcare professionals cope with systems that are inadequate for their needs. Closely related to this, the documentation of vital signs in electronic health records has been problematic. The accuracy and completeness of vital sign documentation have a direct impact on the recognition of deterioration in a patient's condition. We examined workflow processes to identify workarounds related to vital signs in a 372-bed hospital in Sweden. In three clinical areas, a qualitative study was performed with data collected during observations and interviews and analysed through thematic content analysis. We identified paper workarounds in the form of handwritten notes and a total of eight pre-printed paper observation charts. Our results suggest that nurses created workarounds to allow a smooth workflow and to ensure patient safety.
NASA Astrophysics Data System (ADS)
Gronewold, A.; Bruxer, J.; Smith, J.; Hunter, T.; Fortin, V.; Clites, A. H.; Durnford, D.; Qian, S.; Seglenieks, F.
2015-12-01
Resolving and projecting the water budget of the North American Great Lakes basin (Earth's largest lake system) requires aggregation of data from a complex array of in situ monitoring and remote sensing products that cross an international border (leading to potential sources of bias and other inconsistencies), and are relatively sparse over the surfaces of the lakes themselves. Data scarcity over the surfaces of the lakes is a particularly significant problem because, unlike Earth's other large freshwater basins, the Great Lakes basin water budget is (on annual scales) composed of relatively equal contributions from runoff, over-lake precipitation, and over-lake evaporation. Consequently, understanding drivers behind changes in regional water storage and water levels requires a data management framework that can reconcile uncertainties associated with data scarcity and bias, and propagate those uncertainties into regional water budget projections and historical records. Here, we assess the development of a historical hydrometeorological database for the entire Great Lakes basin with records dating back to the late 1800s, and describe improvements that are specifically intended to differentiate hydrological, climatological, and anthropogenic drivers behind recent extreme changes in Great Lakes water levels. Our assessment includes a detailed analysis of the extent to which extreme cold winters in central North America in 2013-2014 (caused by the anomalous meridional upper air flow, commonly referred to in the public media as the "polar vortex" phenomenon) altered the thermal and hydrologic regimes of the Great Lakes and led to a record-setting surge in water levels between January 2014 and December 2015.
Kyratzis, Angelos C; Skarlatos, Dimitrios P; Menexes, George C; Vamvakousis, Vasileios F; Katsiotis, Andreas
2017-01-01
There is growing interest in using Spectral Vegetation Indices (SVIs) derived from Unmanned Aerial Vehicle (UAV) imagery as a fast and cost-efficient tool for plant phenotyping. The development of such tools is of paramount importance for continued progress in plant breeding, especially in the Mediterranean basin, where climate change is expected to further increase yield uncertainty. In the present study, the Normalized Difference Vegetation Index (NDVI), Simple Ratio (SR) and Green Normalized Difference Vegetation Index (GNDVI) derived from UAV imagery were calculated for two consecutive years in a set of twenty durum wheat varieties grown under a water-limited and heat-stressed environment. Statistically significant differences between genotypes were observed for the SVIs. GNDVI explained more variability than NDVI and SR when recorded at booting. GNDVI was significantly correlated with grain yield when recorded at booting and anthesis during the 1st and 2nd year, respectively, while NDVI was correlated with grain yield when recorded at booting, but only for the 1st year. These results suggest that GNDVI has better discriminating efficiency and can be a better predictor of yield when recorded at early reproductive stages. The predictive ability of the SVIs was affected by plant phenology. Correlations of grain yield with SVIs were stronger where the correlations of SVIs with heading were weaker or not significant. NDVI values recorded at the experimental site were significantly correlated with grain yield of the same set of genotypes grown in other environments. Both positive and negative correlations were observed, indicating that the environmental conditions during grain filling can affect the sign of the correlations. These findings highlight the potential of SVIs derived from UAV imagery for durum wheat phenotyping under low-yielding Mediterranean conditions.
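The three indices compared in the study are simple per-pixel (or per-plot) ratios of band reflectances. A minimal sketch, using toy reflectance values rather than data from the trial:

```python
# Standard definitions of the three spectral vegetation indices named in the
# abstract. The reflectance values below are invented, not study data.
def ndvi(nir, red):
    return (nir - red) / (nir + red)        # Normalized Difference Vegetation Index

def simple_ratio(nir, red):
    return nir / red                        # Simple Ratio (SR)

def gndvi(nir, green):
    return (nir - green) / (nir + green)    # Green NDVI

plot = {"nir": 0.60, "red": 0.10, "green": 0.15}   # hypothetical plot-mean reflectances
v_ndvi = ndvi(plot["nir"], plot["red"])            # (0.60-0.10)/(0.60+0.10) ≈ 0.714
v_sr = simple_ratio(plot["nir"], plot["red"])      # 0.60/0.10 = 6.0
v_gndvi = gndvi(plot["nir"], plot["green"])        # (0.60-0.15)/(0.60+0.15) = 0.6
```

In a UAV workflow these functions would be applied to whole band arrays and averaged over each plot before correlating with yield.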
Dennehy, Patricia; White, Mary P; Hamilton, Andrew; Pohl, Joanne M; Tanner, Clare; Onifade, Tiffiani J
2011-01-01
Objective: To present a partnership-based and community-oriented approach designed to ease provider anxiety and facilitate the implementation of electronic health records (EHR) in resource-limited primary care settings. Materials and Methods: The approach, referred to as the partnership model, was developed and iteratively refined through the research team's previous work on implementing health information technology (HIT) in over 30 safety net practices. This paper uses two case studies to illustrate how the model was applied to help two nurse-managed health centers (NMHC), a particularly vulnerable primary care setting, implement EHR and prepare to meet the meaningful use criteria. Results: The model's strong focus on continuous quality improvement led to eventual implementation success at both sites, despite difficulties encountered during the initial stages of the project. Discussion: There has been a lack of research, particularly in resource-limited primary care settings, on strategies for abating provider anxiety and preparing providers to manage the complex changes associated with EHR uptake. The partnership model described in this paper may provide useful insights into the work shepherded by HIT regional extension centers dedicated to supporting resource-limited communities disproportionately affected by EHR adoption barriers. Conclusion: NMHC, similar to other primary care settings, are often poorly resourced, understaffed, and lack the necessary expertise to deploy EHR and integrate its use into their day-to-day practice. This study demonstrates that implementation of EHR, a prerequisite to meaningful use, can be successfully achieved in this setting, and that partnership efforts extending far beyond the initial software deployment stage may be the key. PMID:21828225
Methodologies, Models and Algorithms for Patients Rehabilitation.
Fardoun, H M; Mashat, A S
2016-01-01
This editorial is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The objective of this focus theme is to present current solutions, by means of technologies and human factors, related to the use of Information and Communication Technologies (ICT) for improving patient rehabilitation. The focus theme examines distinctive dimensions of methodologies, models and algorithms for disabled people in terms of rehabilitation and health care, and explores the extent to which ICT is a useful tool in this process. The focus theme presents a set of solutions for ICT systems developed to improve the rehabilitation process of disabled people and to help them carry out their daily lives. The development and subsequent deployment of computer systems for patients' rehabilitation is an area of continuing interest and growth.
Harper, Marvin B; Longhurst, Christopher A; McGuire, Troy L; Tarrago, Rod; Desai, Bimal R; Patterson, Al
2014-03-01
The study aimed to develop a core set of pediatric drug-drug interaction (DDI) pairs for which electronic alerts should be presented to prescribers during the ordering process. A clinical decision support working group composed of Children's Hospital Association (CHA) members was convened; CHA pharmacists and chief medical information officers participated. Consensus was reached on a core set of 19 DDI pairs that should be presented to pediatric prescribers during the ordering process. We provide this core list of 19 high-value drug pairs recommended for inclusion as electronic drug-drug interaction alerts in prescriber order entry software used with a pediatric patient population. We believe this list represents the most important pediatric drug interactions for practical implementation within computerized prescriber order entry systems.
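The alerting logic such a core list supports is straightforward to sketch. The pairs below are placeholders, since the abstract does not enumerate the 19 CHA pairs; order-independence of each pair is handled with frozensets:

```python
# Illustrative DDI screening against a core pair list. The pairs are
# placeholders, NOT the CHA consensus list, which the abstract does not give.
CORE_DDI_PAIRS = {frozenset(p) for p in [
    ("drug_a", "drug_b"),   # placeholder interaction pair
    ("drug_c", "drug_d"),   # placeholder interaction pair
]}

def ddi_alerts(new_drug, active_meds):
    """Return active medications whose pairing with new_drug is on the core list."""
    return [med for med in active_meds
            if frozenset((new_drug, med)) in CORE_DDI_PAIRS]
```

At order entry, a non-empty return value would trigger an interruptive alert to the prescriber.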
Enhancing Worker Health Through Clinical Decision Support (CDS): An Introduction to a Compilation.
Filios, Margaret S; Storey, Eileen; Baron, Sherry; Luensman, Genevieve B; Shiffman, Richard N
2017-11-01
This article outlines an approach to developing clinical decision support (CDS) for conditions related to work and health. When incorporated in electronic health records, such CDS will assist primary care providers (PCPs) in caring for working patients. Three groups of Subject Matter Experts (SMEs) identified relevant clinical practice guidelines and best practices, and reviewed published literature concerning work-related asthma, return-to-work, and management of diabetes at work. The SMEs developed one recommendation per topic that could be supported by electronic CDS. Reviews with PCPs, staff, and health information system implementers in five primary care settings confirmed that the approach was important and operationally sound. This compendium is intended to stimulate a dialogue between occupational health specialists and PCPs that will enhance the use of work information about patients in the primary care setting.
Climate Data Bases of the People's Republic of China 1841-1988 (TR-055)
Kaiser, Dale [Oak Ridge National Laboratory, Oak Ridge, TN (USA); Carbon Dioxide Information Analysis Center (CDIAC)]; Tao, Shiyan [Chinese Academy of Sciences (CAS), Beijing (China)]; Fu, Congbin [Chinese Academy of Sciences (CAS), Beijing (China)]; Zeng, Zhaomei [Chinese Academy of Sciences (CAS), Beijing (China)]; Zhang, Qingyun [Chinese Academy of Sciences (CAS), Beijing (China)]; Wang, Wei-Chyung [University at Albany, State University of New York, Albany, NY (USA); Atmospheric Science Research Center]; Karl, Thomas [National Oceanic and Atmospheric Administration, Asheville, NC (USA); Global Climate Laboratory, National Climatic Data Center]
1993-01-01
A data base containing meteorological observations from the People's Republic of China (PRC) is described. These data were compiled in accordance with a joint research agreement signed by the U.S. Department of Energy and the PRC Chinese Academy of Sciences (CAS) on August 19, 1987. CAS's Institute of Atmospheric Physics (Beijing, PRC) has provided records from 296 stations, organized into five data sets: (1) a 60-station data set containing monthly measurements of barometric pressure, surface air temperature, precipitation amount, relative humidity, sunshine duration, cloud amount, wind direction and speed, and number of days with snow cover; (2) a 205-station data set containing monthly mean temperatures and monthly precipitation totals; (3) a 40-station subset of the 205-station data set containing monthly mean maximum and minimum temperatures and monthly extreme maximum and minimum temperatures; (4) a 180-station data set containing daily precipitation totals; and (5) a 147-station data set containing 10-day precipitation totals. Sixteen stations from these data sets (13 from the 60-station set and 3 from the 205-station set) have temperature and/or precipitation records that begin prior to 1900, whereas the remaining stations began observing in the early to mid-1900s. Records from most stations extend through 1988. (Note: Users interested in the TR055 60-station data set should acquire expanded and updated data from CDIAC's NDP-039, Two Long-Term Instrumental Climatic Data Bases of the People's Republic of China)
Ver Donck, L; Lammers, W J E P; Moreaux, B; Smets, D; Voeten, J; Vekemans, J; Schuurkes, J A J; Coulie, B
2006-03-01
Myoelectric recordings from the intestines in conscious animals have been limited to a few electrode sites with relatively large inter-electrode distances. The aim of this project was to increase the number of recording sites to allow high-resolution reconstruction of the propagation of myoelectrical signals. Sets of six unipolar electrodes, positioned in a 3x2 array, were constructed. A silver ring close to each set served as the reference electrode. Inter-electrode distances varied from 4 to 8 mm. Electrode sets, to a maximum of 4, were implanted in various configurations, allowing recording from 24 sites simultaneously. Four sets of 6 electrodes each were implanted successfully in 11 female Beagles. Implantation sites evaluated were the upper small intestine (n=10), the lower small intestine (n=4) and the stomach (n=3). The implants remained functional for 7.2 months (median; range 1.4-27.3 months). Recorded signals showed slow waves at regular intervals and spike potentials. In addition, when the sets were positioned close together, it was possible to reconstruct the propagation of individual slow waves, to determine their direction of propagation, and to calculate their propagation velocity. No signs or symptoms of interference with normal GI function were observed in the tested animals. With this approach, it is possible to implant 24 extracellular electrodes on the serosal surface of the intestines without interfering with normal physiology, making it possible to study the electrical activities of the GI system at high resolution in vivo in the conscious animal.
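One quantity the closely spaced electrode sets enable, propagation velocity from the inter-electrode delay, can be sketched in a few lines. The distances and arrival times below are hypothetical, not recorded data:

```python
# Slow-wave propagation velocity from the arrival-time delay between two
# electrodes a known distance apart. Numbers are illustrative only.
def propagation_velocity_mm_s(distance_mm, t_first_s, t_second_s):
    """Velocity in mm/s given the slow-wave arrival times at two electrodes."""
    return distance_mm / (t_second_s - t_first_s)

# e.g. electrodes 8 mm apart, slow wave arriving 0.5 s later at the second
v = propagation_velocity_mm_s(8.0, 10.00, 10.50)   # 16.0 mm/s
```

Repeating this over many electrode pairs in the array yields both the direction and the speed of each propagating wave.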
NASA Astrophysics Data System (ADS)
Kremer, Katrina; Reusch, Anna; Wirth, Stefanie B.; Anselmetti, Flavio S.; Girardclos, Stéphanie; Strasser, Michael
2016-04-01
Intraplate settings are characterized by low deformation rates and recurrence intervals of strong earthquakes that often exceed the time span covered by instrumental records. Switzerland, as an example of such settings, shows a low instrumentally recorded seismicity, in contrast to strong earthquakes (e.g. the 1356 Basel earthquake, Mw=6.6, and the 1601 Unterwalden earthquake, Mw=5.9) mentioned in the historical archives. As such long recurrence rates do not allow for instrumental identification of the sources of these strong events, and as intense geomorphologic alteration prevents preservation of surface expressions of faults, knowledge of active faults is very limited. Lake sediments are sensitive to seismic shaking and thus can be used to extend the regional earthquake catalogue if the sedimentary deposits or deformation structures can be linked to an earthquake. Single lake records allow estimating local intensities of shaking, while multiple lake records can furthermore be used to compare the temporal and spatial distribution of earthquakes. In this study, we compile a large dataset of dated sedimentary event deposits recorded in Swiss lakes, available from peer-reviewed publications and unpublished master's theses. We combine these data in order to detect large prehistoric regional earthquake events or periods of intense shaking that might have affected multiple lake settings. In a second step, using empirical seismic attenuation equations, we test whether lake records can be used to reconstruct magnitudes and epicentres of identified earthquakes.
How many records should be used in ASCE/SEI-7 ground motion scaling procedure?
Reyes, Juan C.; Kalkan, Erol
2012-01-01
U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of "true" structural responses. Based on elastic-perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. As compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides more accurate estimates of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.
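The underlying statistical question, how the accuracy of a mean-response estimate improves as more records are averaged, can be sketched with synthetic data. The lognormal "responses" below are invented stand-ins, not the study's analysis results:

```python
# Monte Carlo sketch: error of a mean-response estimate versus the number of
# records averaged. Synthetic lognormal draws stand in for structural responses.
import random
random.seed(0)

population = [random.lognormvariate(0.0, 0.4) for _ in range(10000)]
true_mean = sum(population) / len(population)   # the "true" benchmark response

def mean_abs_error(n_records, trials=2000):
    """Average absolute error of the n-record sample mean over many trials."""
    err = 0.0
    for _ in range(trials):
        sample = random.sample(population, n_records)
        err += abs(sum(sample) / n_records - true_mean)
    return err / trials

# Averaging seven records gives a noticeably smaller error than three.
e3, e7 = mean_abs_error(3), mean_abs_error(7)
```

This reproduces only the generic sampling effect; the study's conclusion additionally accounts for the bias introduced by the code's take-the-maximum-or-mean rules.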
2011 Souris River flood—Will it happen again?
Nustad, Rochelle A.; Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.
2016-09-29
The Souris River Basin is a 61,000-square-kilometer basin in the provinces of Saskatchewan and Manitoba and the state of North Dakota. Record-setting rains in May and June of 2011 led to record flooding, with peak annual streamflow (762 cubic meters per second [m3/s]) more than twice that of any previously recorded peak streamflow and more than five times the estimated 100-year postregulation streamflow (142 m3/s) at the U.S. Geological Survey (USGS) streamflow-gaging station above Minot, North Dakota. Upstream from Minot, N. Dak., the Souris River is regulated by three reservoirs in Saskatchewan (Rafferty, Boundary, and Alameda) and by Lake Darling in North Dakota. During the 2011 flood, the city of Minot, N. Dak., experienced devastating damage, with more than 4,000 homes flooded and 11,000 people evacuated. As a result, the Souris River Basin Task Force recommended that the U.S. Geological Survey (in cooperation with the North Dakota State Water Commission) develop a model for estimating the probabilities of future flooding and drought. The model that was developed had four parts: (1) analyzing past climate, (2) predicting future climate, (3) developing a streamflow model in response to certain climatic variables, and (4) combining future climate estimates with the streamflow model to predict future streamflow events. Taking into consideration the historical climate record and trends in basin response to various climatic conditions, it was determined that flood risk will remain high in the Souris River Basin until the wet climate state ends.
13 CFR 106.302 - What provisions must be set forth in a Fee Based Record?
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What provisions must be set forth in a Fee Based Record? 106.302 Section 106.302 Business Credit and Assistance SMALL BUSINESS... does not constitute or imply an endorsement by SBA of the Donor or the Donor's products or services. ...
Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos
2016-01-01
Depending mostly on voluntarily submitted spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework in which EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model, at the core of this set, acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information. PMID:27123451
Does synchronization reflect a true interaction in the cardiorespiratory system?
Toledo, E; Akselrod, S; Pinhas, I; Aravot, D
2002-01-01
Cardiorespiratory synchronization, studied within the framework of phase synchronization, has recently raised interest as one of the interactions in the cardiorespiratory system. In this work, we present a quantitative approach to the analysis of this nonlinear phenomenon. Our primary aim is to determine whether synchronization between heart rate (HR) and respiration rate is a real phenomenon or a random one. First, we developed an algorithm that detects epochs of synchronization automatically and objectively. The algorithm was applied to recordings of respiration and HR obtained from 13 normal subjects and 13 heart transplant patients. Surrogate data sets were constructed from the original recordings, specifically lacking the coupling between HR and respiration. The statistical properties of synchronization in the two data sets and in their surrogates were compared. Synchronization was observed in all groups: in normal subjects, in heart transplant patients, and in the surrogates. Interestingly, synchronization was less abundant in normal subjects than in the transplant patients, indicating that the unique physiological condition of the latter promotes cardiorespiratory synchronization. The duration of synchronization epochs was significantly shorter in the surrogate data of both data sets, suggesting that at least some of the synchronization epochs are real. In view of these results, cardiorespiratory synchronization, although not a major feature of cardiorespiratory interaction, seems to be a real phenomenon rather than an artifact.
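The quantity behind such phase-synchronization analyses can be sketched as a phase-locking index. This is a generic formulation with synthetic phases, not the authors' detection algorithm: the index |mean(exp(i*dphi))| is near 1 for locked phases and near 0 when coupling is destroyed, as in surrogate data.

```python
# Phase-locking index of the phase difference between two signals:
# 1 = perfect locking, ~0 = no coupling. Phases here are synthetic.
import cmath
import math
import random
random.seed(1)

def phase_locking_index(phase_a, phase_b):
    vectors = [cmath.exp(1j * (a - b)) for a, b in zip(phase_a, phase_b)]
    return abs(sum(vectors) / len(vectors))

n = 1000
locked = [(2 * math.pi * 0.01 * k) % (2 * math.pi) for k in range(n)]
shifted = [p + 0.3 for p in locked]                              # constant phase lag
surrogate = [random.uniform(0, 2 * math.pi) for _ in range(n)]   # coupling destroyed

idx_locked = phase_locking_index(locked, shifted)        # exactly 1 here
idx_surrogate = phase_locking_index(locked, surrogate)   # near 0
```

Comparing the index (or epoch durations) between original and surrogate recordings is what lets the study call an epoch of synchronization "real" rather than chance.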
A proposed minimum data set for international primary care optometry: a modified Delphi study.
Davey, Christopher J; Slade, Sarah V; Shickle, Darren
2017-07-01
To identify a minimum list of metrics of international relevance to public health, research and service development which can be extracted from practice management systems and electronic patient records in primary optometric practice. A two stage modified Delphi technique was used. Stage 1 categorised metrics that may be recorded as being part of a primary eye examination by their importance to research using the results from a previous survey of 40 vision science and public health academics. Delphi stage 2 then gauged the opinion of a panel of seven vision science academics and achieved consensus on contentious metrics and methods of grading/classification. A consensus regarding inclusion and response categories was achieved for nearly all metrics. A recommendation was made of 53 metrics which would be appropriate in a minimum data set. This minimum data set should be easily integrated into clinical practice yet allow vital data to be collected internationally from primary care optometry. It should not be mistaken for a clinical guideline and should not add workload to the optometrist. A pilot study incorporating an additional Delphi stage prior to implementation is advisable to refine some response categories. © 2017 The Authors. Ophthalmic and Physiological Optics published by John Wiley & Sons Ltd on behalf of College of Optometrists.
Smartphone-coupled rhinolaryngoscopy at the point of care
NASA Astrophysics Data System (ADS)
Mink, Jonah; Bolton, Frank J.; Sebag, Cathy M.; Peterson, Curtis W.; Assia, Shai; Levitz, David
2018-02-01
Rhinolaryngoscopy remains difficult to perform in resource-limited settings due to the high cost of purchasing and maintaining equipment as well as the need for specialists to interpret exam findings. While the lack of expertise can be obviated by adopting telemedicine-based approaches, the capture, storage, and sharing of images/video is not a common native functionality of medical devices. Most rhinolaryngoscopy systems consist of an endoscope that interfaces with the patient's naso/oropharynx, and a tower of modules that record video/images. However, these expensive and bulky modules can be replaced by a smartphone that can fulfill the same functions but at a lower cost. To demonstrate this, a commercially available rhinolaryngoscope was coupled to a smartphone using a 3D-printed adapter. Software developed for other clinical applications was repurposed for ENT use, including an application that controls image and video capture, a HIPAA-compliant image/video storage and transfer cloud database, and customized software features developed to improve practitioner competency. Audio recording capabilities to assess speech pathology were also integrated into the smartphone rhinolaryngoscope system. The illumination module coupled onto the endoscope remained unchanged. The spatial resolution of the rhinolaryngoscope system was defined by the fiber diameter of endoscope fiber bundle, rather than the smartphone camera. The mobile rhinolaryngoscope system was used with appropriate patients by a general practitioner in an office setting. The general practitioner then consulted with an ENT specialist via the HIPAA compliant cloud database and workflow modules on difficult cases. These results suggest the smartphone-based rhinolaryngoscope holds promise for use in low-resource settings.
Zhao, Tian; Yang, Huifang; Sui, Huaxin; Salvi, Satyajeet Sudhir; Wang, Yong; Sun, Yuchun
2016-01-01
Objective: Developments in digital technology have permitted researchers to study mandibular movements. Here, the accuracy of a real-time, computerized, binocular, three-dimensional (3D) trajectory-tracking device for recording functional mandibular movements was evaluated. Methods: An occlusal splint without the occlusal region was created based on a plaster cast of the lower dentition. The splint was rigidly connected with a target on its labial side and seated on the cast. The cast was then rigidly attached to the stage of a high-precision triaxial electronic translator, which was used to move the target-cast-stage complex. Half-circular movements (5.00-mm radius) in three planes (XOY, XOZ, YOZ) and linear movements along the x-axis were performed at 5.00 mm/s. All trajectory points were recorded with the binocular 3D trajectory-tracking device and fitted to arcs or lines, respectively, with the Imageware software. To analyze the accuracy of the trajectory-tracking device, the mean distances between the trajectory points and the fitted arcs or lines were measured, and the mean differences between the lengths of the fitted arcs' radii and a set value (5.00 mm) were then calculated. A one-way analysis of variance was used to evaluate the spatial consistency of the recording accuracy in the three different planes. Results: The mean distances between the trajectory points and the fitted arcs or lines were 0.076 ± 0.033 mm and 0.089 ± 0.014 mm, respectively. The mean difference between the lengths of the fitted arcs' radii and the set value (5.00 mm) was 0.025 ± 0.071 mm. A one-way ANOVA showed that the recording errors in the three different planes were not significantly different. Conclusion: These results suggest that the device can record certain movements at 5.00 mm/s, which is similar to the speed of functional mandibular movements. In addition, the recordings had an error of <0.1 mm and good spatial consistency. Thus, the device meets some of the requirements necessary for recording human mandibular movements. PMID:27701462
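The arc-fitting accuracy analysis above can be sketched in a few lines. The study used the Imageware software; the snippet below is an assumed stand-in: an algebraic (Kasa) least-squares circle fit applied to a simulated noisy half-circular trajectory of 5.00-mm radius, with point-to-arc residuals computed in the spirit of the paper's accuracy metric. The noise level is invented.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit to 2-D points.
    Solves x^2 + y^2 + a*x + b*y + c = 0 for a, b, c."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = -a / 2.0, -b / 2.0           # circle center
    r = np.sqrt(cx**2 + cy**2 - c)        # circle radius
    return (cx, cy), r

# Simulated half-circular trajectory: 5.00-mm radius plus 0.05-mm noise
rng = np.random.default_rng(1)
theta = np.linspace(0, np.pi, 200)
pts = np.column_stack([5.0 * np.cos(theta), 5.0 * np.sin(theta)])
pts += rng.normal(0, 0.05, pts.shape)

center, radius = fit_circle(pts)
# Point-to-arc distances, analogous to the paper's accuracy measure
residuals = np.abs(np.hypot(pts[:, 0] - center[0],
                            pts[:, 1] - center[1]) - radius)
```

With sub-0.1-mm noise the fitted radius recovers the 5.00-mm set value closely, which is the kind of comparison the study reports.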
A Long-Term BCI Study With ECoG Recordings in Freely Moving Rats.
Costecalde, Thomas; Aksenova, Tetiana; Torres-Martinez, Napoleon; Eliseyev, Andriy; Mestais, Corinne; Moro, Cecile; Benabid, Alim Louis
2018-02-01
Brain Computer Interface (BCI) studies are being performed in an increasing number of applications. Questions are raised about electrodes, data processing, and effectors, and experiments are needed to resolve these issues. Our aim was to develop a simple BCI set-up to facilitate studies that improve the mathematical tools used to process ECoG signals for effector control. We designed a simple BCI using transcranial electrodes (17 screws, three mechanically linked to create a common reference, 14 used as recording electrodes) to record electrocorticographic (ECoG) neuronal activities in rodents. The data processing is based on an online, self-paced, non-supervised (asynchronous) BCI paradigm. An N-way partial least squares algorithm together with continuous wavelet transformation of the ECoG recordings detects signatures related to motor activities. Signature detection in freely moving rats may activate external effectors during a behavioral task, which involved pushing a lever to obtain a reward. After routine training, we showed that peak brain activity preceding a lever push (LP) to obtain food reward was located mostly in the cerebellar cortex, with a higher correlation coefficient suggesting a strong postural component, and also in the occipital cerebral cortex. Analysis of brain activities provided a stable signature in the high gamma band (∼180 Hz) occurring within 1500 msec before the lever push, approximately around -400 msec to -500 msec. Detection of the signature from a single cerebellar cortical electrode triggered the effector with high efficiency (68% offline and 30% online) and rare false positives per minute (∼2, both online and offline) in sessions lasting about 30 minutes and up to one hour. In summary, our results are original compared to the rest of the literature, which rarely involves rodents: a simple BCI set-up has been developed in rats, and the data show for the first time long-term, up to one year, unsupervised online control of an effector. © 2017 International Neuromodulation Society.
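A hedged sketch of the signature-detection idea: the paper's N-way PLS model on continuous wavelet features is not reproduced here; instead, a simple band-limited envelope threshold illustrates how a ∼180 Hz high-gamma burst can be flagged in a noisy trace. The sampling rate, burst timing, amplitudes, and threshold are all invented for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_band_events(sig, fs, band=(150.0, 200.0), thresh_sd=3.0):
    """Band-limited envelope detector: flags samples where the high-gamma
    envelope exceeds mean + thresh_sd * SD. An illustrative stand-in for
    the paper's NPLS/wavelet signature detector."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, sig)))   # band-passed envelope
    return env > env.mean() + thresh_sd * env.std()

fs = 1000.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(2)
ecog = rng.normal(0, 1, t.size)                  # background "ECoG" noise
burst = (t > 2.0) & (t < 2.2)                    # simulated 180 Hz burst
ecog[burst] += 5 * np.sin(2 * np.pi * 180 * t[burst])

events = detect_band_events(ecog, fs)
```

In a real asynchronous BCI, a detection like this would trigger the effector; the paper's approach additionally learns the signature from training data rather than using a fixed band and threshold.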
Bohbot, Albert
2010-01-01
Preliminary results were measured by electromyography monitoring (electromyoscan) on three subjects suffering from spinal cord injury who underwent a double therapy. The aim of this study was to evaluate regained voluntary activity below the injury in subjects who received a double therapy: 1) an olfactory ensheathing glia (OEG) transplantation using procedures developed by Dr. Hongyun Huang at the Xishan Hospital and Rehabilitation Centre, Beijing, China, and 2) LASERPONCTURE, developed by Albert Bohbot, Laboratoire de Recherches sur le LASERPONCTURE, La Chapelle Montlinard, France. Materials used were the LASERPONCTURE device developed by Albert Bohbot; the PROCOMP5 equipment with the BIOGRAPH INFINITI 5 and REHAB SUITE software; and the sensors MYOSCAN-PRO EMG (SA9401M-50), to record muscle activity, and FLEX/PRO-SA9309M, to record skin conductance, which were fixed on the skin. An infrared laser, whose frequencies and power settings cannot be disclosed due to their proprietary nature, was applied after an OEG injection performed according to Dr. Hongyun Huang's procedures. Three cases, two males and one female, were selected for this study. Presentation and comments on the graph recordings of voluntary muscle activity below the injury are provided. This preliminary study suggests that the double therapy restores some voluntary muscle activity as measured by electromyography monitoring.
Electronic medical records for otolaryngology office-based practice.
Chernobilsky, Boris; Boruk, Marina
2008-02-01
Pressure is mounting on physicians to adopt electronic medical records. The field of health information technology is evolving rapidly with innovations and policies often outpacing science. We sought to review research and discussions about electronic medical records from the past year to keep abreast of these changes. Original scientific research, especially from otolaryngologists, is lacking in this field. Adoption rates are slowly increasing, but more of the burden is shouldered by physicians despite policy efforts and the clear benefits to third-party payers. Scientific research from the past year suggests lack of improvements and even decreasing quality of healthcare with electronic medical record adoption in the ambulatory care setting. The increasing prevalence and standardization of electronic medical record systems results in a new set of problems including rising costs, audits, difficulties in transition and public concerns about security of information. As major players in healthcare continue to push for adoption, increased effort must be made to demonstrate actual improvements in patient care in the ambulatory care setting. More scientific studies are needed to demonstrate what features of electronic medical records actually improve patient care. Otolaryngologists should help each other by disseminating research about improvement in patient outcomes with their systems since current adoption and outcomes policies do not apply to specialists.
Corticospinal signals recorded with MEAs can predict the volitional forearm forces in rats.
Guo, Yi; Mesut, Sahin; Foulds, Richard A; Adamovich, Sergei V
2013-01-01
We set out to investigate whether volitional components in the descending tracts of the spinal cord white matter can be accessed with a multi-electrode array (MEA) recording technique. Rats were trained to press a lever connected to a haptic device with force feedback to receive sugar pellets. A flexible-substrate multi-electrode array was chronically implanted into the dorsal column of the cervical spinal cord. Field potentials and multi-unit activities were recorded from the descending axons of the corticospinal tract while the rat performed a lever-pressing task. Forelimb forces, recorded with the sensor attached to the lever, were reconstructed using the hand position data and the neural signals through multiple trials over three weeks. The regression coefficients found from the trial set were cross-validated on the other trials recorded on the same day. Approximately 30 trials of at least 2 seconds were required for accurate model estimation. The maximum correlation coefficient between the actual and predicted force was 0.7 in the test set. Positional information and its interaction with neural signals improved the correlation coefficient by 0.1 to 0.15. These results suggest that the volitional information contained in the corticospinal tract can be extracted with multi-channel neural recordings made with parenchymal electrodes.
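The force-reconstruction step can be sketched as cross-validated linear regression. This is an illustrative stand-in with simulated data, not the authors' actual model: the trial count, noise level, and feature construction are assumed, loosely echoing the 14 recording electrodes and the ∼30-trial requirement mentioned above.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_channels = 60, 14
X = rng.normal(size=(n_trials, n_channels))        # per-trial neural features
w_true = rng.normal(size=n_channels)               # hidden channel weights
y = X @ w_true + 0.5 * rng.normal(size=n_trials)   # simulated forelimb force

# Fit on the first half of trials, validate on the held-out half
train, test = np.arange(30), np.arange(30, 60)
coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
pred = X[test] @ coef
r = np.corrcoef(pred, y[test])[0, 1]               # test-set correlation
```

The held-out correlation coefficient plays the same role as the 0.7 figure reported in the abstract, though the simulated value here depends entirely on the assumed noise level.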
NASA Astrophysics Data System (ADS)
Melnichenko, O.; Hacker, P. W.; Wentz, F. J.; Meissner, T.; Maximenko, N. A.; Potemra, J. T.
2016-12-01
To address the need for a consistent, continuous, long-term, high-resolution sea surface salinity (SSS) dataset for ocean research and applications, a trial SSS analysis is produced in the eastern tropical Pacific from multi-satellite observations. The new SSS data record is a synergy of data from two satellite missions. The beginning segment, covering the period from September 2011 to June 2015, utilizes Aquarius SSS data and is based on the optimum interpolation analysis developed at the University of Hawaii. The analysis is produced on a 0.25-degree grid and uses a dedicated bias-correction algorithm to correct the satellite retrievals for large-scale biases with respect to in-situ data. The time series is continued with the Soil Moisture Active Passive (SMAP) satellite-based SSS data provided by Remote Sensing Systems (RSS). To ensure consistency and continuity in the data record, SMAP SSS fields are adjusted using a set of optimally designed spatial filters and in-situ, primarily Argo, data to: (i) remove large-scale satellite biases, and (ii) reduce small-scale noise, while preserving the high spatial and temporal resolution of the data set. The consistency between the two sub-sets of the data record is evaluated during their overlapping period in April-June 2015. Verification studies show that SMAP SSS has a very good agreement with the Aquarius SSS, noting that SMAP SSS can provide better spatial resolution. The 5-yr-long time series of SSS in the SPURS-2 domain (125°W, 10°N) shows fresher than normal SSS during last year's El Niño event. The year-mean difference is about 0.5 psu. The annual cycle during the El Niño year also appears to be much weaker than in a normal year.
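The optimum-interpolation analysis mentioned above can be sketched in one dimension. This is a minimal, assumed implementation: Gaussian background-error covariances, two synthetic "in-situ" observations, and a constant first-guess salinity field; the operational analysis uses real covariance models, a dedicated bias correction, and a 0.25-degree grid, none of which is reproduced here.

```python
import numpy as np

def optimum_interpolation(grid, obs_loc, obs_val, background,
                          L=2.0, sigma_b=0.3, sigma_o=0.1):
    """1-D optimum interpolation: correct a background field toward
    observations using Gaussian background-error covariances."""
    def cov(a, b):
        return sigma_b**2 * np.exp(-(a[:, None] - b[None, :])**2 / (2 * L**2))
    B_go = cov(grid, obs_loc)                  # grid-to-obs covariance
    B_oo = cov(obs_loc, obs_loc)               # obs-to-obs covariance
    R = sigma_o**2 * np.eye(len(obs_loc))      # obs-error covariance
    innov = obs_val - np.interp(obs_loc, grid, background)
    return background + B_go @ np.linalg.solve(B_oo + R, innov)

grid = np.linspace(0.0, 10.0, 101)
background = np.full(grid.size, 35.0)          # first-guess salinity (psu)
obs_loc = np.array([3.0, 7.0])                 # synthetic Argo-like locations
obs_val = np.array([35.4, 34.8])

analysis = optimum_interpolation(grid, obs_loc, obs_val, background)
```

The analysis is pulled toward each observation near its location and relaxes back to the background far away, with the observation-to-background weight set by the assumed error variances.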
NASA Technical Reports Server (NTRS)
Crutcher, H. L.; Falls, L. W.
1976-01-01
Sets of experimentally determined or routinely observed data provide information about the past, present and, hopefully, future sets of similarly produced data. An infinite set of statistical models exists which may be used to describe the data sets. The normal distribution is one model. If it serves at all, it serves well. If a data set, or a transformation of the set, representative of a larger population can be described by the normal distribution, then valid statistical inferences can be drawn. There are several tests which may be applied to a data set to determine whether the univariate normal model adequately describes the set. The chi-square test based on Pearson's work in the late nineteenth and early twentieth centuries is often used. Like all tests, it has some weaknesses which are discussed in elementary texts. Extension of the chi-square test to the multivariate normal model is provided. Tables and graphs permit easier application of the test in the higher dimensions. Several examples, using recorded data, illustrate the procedures. Tests of maximum absolute differences, mean sum of squares of residuals, runs and changes of sign are included in these tests. Dimensions one through five with selected sample sizes 11 to 101 are used to illustrate the statistical tests developed.
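The univariate chi-square test described above can be sketched with equal-probability binning. This is an illustrative implementation (the bin count and sample sizes are arbitrary, and the report's multivariate extension and tables are not reproduced); scipy supplies the normal quantiles and the chi-square tail probability.

```python
import numpy as np
from scipy import stats

def chi_square_normality(data, n_bins=10):
    """Pearson chi-square goodness-of-fit test against a normal distribution
    fitted to the data. Bins have equal probability under the fitted normal;
    df = n_bins - 1 - 2 because two parameters are estimated."""
    mu, sigma = data.mean(), data.std(ddof=1)
    # Interior bin edges at the fitted normal's equal-probability quantiles
    interior = stats.norm.ppf(np.linspace(0, 1, n_bins + 1)[1:-1], mu, sigma)
    observed = np.bincount(np.searchsorted(interior, data), minlength=n_bins)
    expected = len(data) / n_bins
    chi2 = ((observed - expected) ** 2 / expected).sum()
    p = stats.chi2.sf(chi2, df=n_bins - 1 - 2)
    return chi2, p

rng = np.random.default_rng(4)
chi2_normal, p_normal = chi_square_normality(rng.normal(50.0, 5.0, size=1000))
chi2_uniform, p_uniform = chi_square_normality(rng.uniform(0.0, 1.0, size=1000))
```

As expected, a genuinely normal sample yields a small chi-square statistic, while a uniform sample tested against a fitted normal is strongly rejected.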
U.S. Maternally Linked Birth Records May Be Biased for Hispanics and Other Population Groups
LEISS, JACK K.; GILES, DENISE; SULLIVAN, KRISTIN M.; MATHEWS, RAHEL; SENTELLE, GLENDA; TOMASHEK, KAY M.
2010-01-01
Purpose To advance understanding of linkage error in U.S. maternally linked datasets, and how the error may affect results of studies based on the linked data. Methods North Carolina birth and fetal death records for 1988-1997 were maternally linked (n=1,030,029). The maternal set probability, defined as the probability that all records assigned to the same maternal set do in fact represent events to the same woman, was used to assess differential maternal linkage error across race/ethnic groups. Results Maternal set probabilities were lower for records specifying Asian or Hispanic race/ethnicity, suggesting greater maternal linkage error. The lower probabilities for Hispanics were concentrated in women of Mexican origin who were not born in the United States. Conclusions Differential maternal linkage error may be a source of bias in studies using U.S. maternally linked datasets to make comparisons between Hispanics and other groups or among Hispanic subgroups. Methods to quantify and adjust for this potential bias are needed. PMID:20006273
Gargon, Elizabeth; Williamson, Paula R; Young, Bridget
2017-06-01
The objective of the study was to explore core outcome set (COS) developers' experiences of their work to inform methodological guidance on COS development and identify areas for future methodological research. Semistructured, audio-recorded interviews with a purposive sample of 32 COS developers. Analysis of transcribed interviews was informed by the constant comparative method and framework analysis. Developers found COS development to be challenging, particularly in relation to patient participation and accessing funding. Their accounts raised fundamental questions about the status of COS development and whether it is consultation or research. Developers emphasized how the absence of guidance had affected their work and identified areas where guidance or evidence about COS development would be useful including, patient participation, ethics, international development, and implementation. They particularly wanted guidance on systematic reviews, Delphi, and consensus meetings. The findings raise important questions about the funding, status, and process of COS development and indicate ways that it could be strengthened. Guidance could help developers to strengthen their work, but over specification could threaten quality in COS development. Guidance should therefore highlight common issues to consider and encourage tailoring of COS development to the context and circumstances of particular COS. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Average yields of peanut in the U.S. set an all-time record of 4,695 kg ha-1 in 2012. This far exceeded the previous record yield of 3,837 kg ha-1 in 2008. Favorable weather conditions undoubtedly contributed to the record yields in 2012; however, these record yields would not have been achievable...
Managing Dental Office Records. Student's Manual [and] Instructor's Guide.
ERIC Educational Resources Information Center
Graf, Sandra Kovacs
The student's manual of this set consists of materials for use by individuals enrolled in an extension course in managing dental office records. Addressed in the individual units of the course are the following topics: clinical records, dental insurance, recall systems, inventory control, and financial records. Each unit contains some or all of…
30 CFR 250.1619 - Well records.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Well records. 250.1619 Section 250.1619 Mineral... Well records. (a) Complete and accurate records for each well and all well operations shall be retained... if cored and analyzed; the kind, weight, size, grade, and setting depth of casing; all well logs and...
Status of the Dobson total ozone data set
NASA Technical Reports Server (NTRS)
Planet, Walter G.; Hudson, Robert D.
1994-01-01
During deliberations of the International Ozone Trends Panel (IOTP) it became obvious that satellite determinations of global ozone amounts by themselves could not provide the necessary confidence in the measured trends. During the time of the deliberations of the IOTP, Bojkov re-examined the records of several North American Dobson stations and Degorska re-examined the records of the Belsk station. They were able to improve the quality of the data sets, thus improving the precision of their total ozone data sets. These improvements showed the greater potential of the world-wide Dobson total ozone data set in two primary areas. Firstly, they showed that the existing data set, once re-evaluated, will become more valuable for comparisons with satellite determinations of total ozone. Secondly, the Dobson data set covers a greater period of time than the satellite data sets, thus offering the possibility of extending improved information on ozone trends further back in time. An International Dobson Workshop was convened in September 1991, under the auspices of the NOAA Climate and Global Change Program, as part of the Information Management element of the C&GC Program; it was considered a 'data archaeology' project under that element. Clearly, if the existing Dobson data set can be improved by re-evaluating all data records, we will be able to uncover the 'true' or 'best' data and fulfill the role of archaeologists.
Roman, Lara A; Fristensky, Jason P; Eisenman, Theodore S; Greenfield, Eric J; Lundgren, Robert E; Cerwinka, Chloe E; Hewitt, David A; Welsh, Caitlin C
2017-12-01
Many municipalities are setting ambitious tree canopy cover goals to increase the extent of their urban forests. A historical perspective on urban forest development can help cities strategize how to establish and achieve appropriate tree cover targets. To understand how long-term urban forest change occurs, we examined the history of trees on an urban college campus: the University of Pennsylvania in Philadelphia, PA. Using a mixed methods approach, including qualitative assessments of archival records (1870-2017), complemented by quantitative analysis of tree cover from aerial imagery (1970-2012), our analysis revealed drastic canopy cover increase in the late 20th and early 21st centuries along with the principal mechanisms of that change. We organized the historical narrative into periods reflecting campus planting actions and management approaches; these periods are also connected to broader urban greening and city planning movements, such as City Beautiful and urban sustainability. University faculty in botany, landscape architecture, and urban design contributed to the design of campus green spaces, developed comprehensive landscape plans, and advocated for campus trees. A 1977 Landscape Development Plan was particularly influential, setting forth design principles and planting recommendations that enabled the dramatic canopy cover gains we observed, and continue to guide landscape management today. Our results indicate that increasing urban tree cover requires generational time scales and systematic management coupled with a clear urban design vision and long-term commitments. With the campus as a microcosm of broader trends in urban forest development, we conclude with a discussion of implications for municipal tree cover planning.
Utilising reinforcement learning to develop strategies for driving auditory neural implants.
Lee, Geoffrey W; Zambetta, Fabio; Li, Xiaodong; Paolini, Antonio G
2016-08-01
In this paper we propose a novel application of reinforcement learning to the area of auditory neural stimulation. We aim to develop a simulation environment based on real neurological responses to auditory and electrical stimulation in the cochlear nucleus (CN) and inferior colliculus (IC) of an animal model. Using this simulator, we implement closed-loop reinforcement learning algorithms to determine which methods are most effective at learning effective acoustic neural stimulation strategies. By recording a comprehensive set of acoustic frequency presentations and neural responses from a set of animals, we created a large database of neural responses to acoustic stimulation. Extensive electrical stimulation in the CN and the recording of neural responses in the IC provides a mapping of how the auditory system responds to electrical stimuli. The combined dataset is used as the foundation for the simulator, which is used to implement and test learning algorithms. Reinforcement learning, utilising a modified n-armed bandit solution, is implemented to demonstrate the model's function. We show the ability to effectively learn stimulation patterns which mimic the cochlea's ability to convert acoustic frequencies to neural activity. Learning effective replication using neural stimulation takes less than 20 min under continuous testing. These results show the utility of reinforcement learning in the field of neural stimulation, and can be coupled with existing sound processing technologies to develop new auditory prosthetics that are adaptable to the recipient's current auditory pathway. The same process can theoretically be abstracted to other sensory and motor systems to develop similar electrical replication of neural signals.
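The paper's modified n-armed bandit is not specified in the abstract, but a standard epsilon-greedy bandit conveys the idea: each arm stands for a candidate stimulation pattern, and the reward reflects how well the evoked neural response matches the acoustic target. The reward function and arm values below are hypothetical.

```python
import numpy as np

def epsilon_greedy_bandit(reward_fn, n_arms, n_steps, eps=0.1, seed=0):
    """Epsilon-greedy n-armed bandit: explore a random arm with
    probability eps, otherwise exploit the best running estimate."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_arms)
    values = np.zeros(n_arms)                 # running mean reward per arm
    for _ in range(n_steps):
        if rng.random() < eps:
            arm = int(rng.integers(n_arms))   # explore
        else:
            arm = int(np.argmax(values))      # exploit
        r = reward_fn(arm, rng)
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]  # incremental mean
    return values, counts

# Hypothetical reward: arm 3 best mimics the target neural response
true_means = np.array([0.2, 0.4, 0.3, 0.9, 0.1])
reward = lambda arm, rng: true_means[arm] + 0.05 * rng.normal()

values, counts = epsilon_greedy_bandit(reward, n_arms=5, n_steps=2000)
best = int(np.argmax(values))
```

After a few thousand pulls the agent concentrates on the arm with the highest expected reward, analogous to converging on an effective stimulation pattern during continuous testing.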
Digital Geological Mapping for Earth Science Students
NASA Astrophysics Data System (ADS)
England, Richard; Smith, Sally; Tate, Nick; Jordan, Colm
2010-05-01
This SPLINT (SPatial Literacy IN Teaching) supported project is developing pedagogies for the introduction of teaching of digital geological mapping to Earth Science students. Traditionally students are taught to make geological maps on a paper basemap with a notebook to record their observations. Learning to use a tablet pc with GIS based software for mapping and data recording requires emphasis on training staff and students in specific GIS and IT skills and beneficial adjustments to the way in which geological data is recorded in the field. A set of learning and teaching materials are under development to support this learning process. Following the release of the British Geological Survey's Sigma software we have been developing generic methodologies for the introduction of digital geological mapping to students that already have experience of mapping by traditional means. The teaching materials introduce the software to the students through a series of structured exercises. The students learn the operation of the software in the laboratory by entering existing observations, preferably data that they have collected. Through this the students benefit from being able to reflect on their previous work, consider how it might be improved and plan new work. Following this they begin fieldwork in small groups using both methods simultaneously. They are able to practise what they have learnt in the classroom and review the differences, advantages and disadvantages of the two methods, while adding to the work that has already been completed. Once the field exercises are completed students use the data that they have collected in the production of high quality map products and are introduced to the use of integrated digital databases which they learn to search and extract information from. 
The relatively recent development of the technologies which underpin digital mapping also means that many academic staff also require training before they are able to deliver the course materials. Consequently, a set of staff training materials are being developed in parallel to those for the students. These focus on the operation of the software and an introduction to the structure of the exercises. The presentation will review the teaching exercises and student and staff responses to their introduction.
Code of Federal Regulations, 2014 CFR
2014-10-01
... responsible for monitoring the security standards set forth in this regulation. (b) A designated official... records at all times and for insuring that such records are secured in appropriate containers whenever not...
Code of Federal Regulations, 2011 CFR
2011-10-01
... responsible for monitoring the security standards set forth in this regulation. (b) A designated official... records at all times and for insuring that such records are secured in appropriate containers whenever not...
Code of Federal Regulations, 2013 CFR
2013-10-01
... responsible for monitoring the security standards set forth in this regulation. (b) A designated official... records at all times and for insuring that such records are secured in appropriate containers whenever not...
Code of Federal Regulations, 2012 CFR
2012-10-01
... responsible for monitoring the security standards set forth in this regulation. (b) A designated official... records at all times and for insuring that such records are secured in appropriate containers whenever not...
Code of Federal Regulations, 2010 CFR
2010-10-01
... responsible for monitoring the security standards set forth in this regulation. (b) A designated official... records at all times and for insuring that such records are secured in appropriate containers whenever not...
An anesthesia information system for monitoring and record keeping during surgical anesthesia.
Klocke, H; Trispel, S; Rau, G; Hatzky, U; Daub, D
1986-10-01
We have developed an anesthesia information system (AIS) that supports the anesthesiologist in monitoring and recording during a surgical operation. In development of the system, emphasis was placed on providing an anesthesiologist-computer interface that can be adapted to typical situations during anesthesia and to individual user behavior. One main feature of this interface is the integration of the input and output of information. The only device for interaction between the anesthesiologist and the AIS is a touch-sensitive, high-resolution color display screen. The anesthesiologist enters information by touching virtual function keys displayed on the screen. A data window displays all data generated over time, such as automatically recorded vital signs, including blood pressure, heart rate, and rectal and esophageal temperatures, and manually entered variables, such as administered drugs, and ventilator settings. The information gathered by the AIS is presented on the cathode ray tube in several pages. A main distributor page gives an overall view of the content of every work page. A one-page record of the anesthesia is automatically plotted on a multicolor digital plotter during the operation. An example of the use of the AIS is presented from a field test of the system during which it was evaluated in the operating room without interfering with the ongoing operation. Medical staff who used the AIS imitated the anesthesiologist's recording and information search behavior but did not have responsibility for the conduct of the anesthetic.
Martin, Caroline J Hollins; Kenney, Laurence; Pratt, Thomas; Granat, Malcolm H
2015-01-01
There is limited understanding of the type and extent of maternal postures that midwives should encourage or support during labor. The aims of this study were to identify a set of postures and movements commonly seen during labor, to develop an activity monitoring system for use during labor, and to validate this system design. Volunteer student midwives simulated maternal activity during labor in a laboratory setting. Participants (N = 15) wore monitors adhered to the left thigh and left shank, and adopted 13 common postures of laboring women for 3 minutes each. Simulated activities were recorded using a video camera. Postures and movements were coded from the video, and statistical analysis of agreement between the coded video data and the outputs of the activity monitoring system was conducted. Excellent agreement between the 2 raters of the video recordings was found (Cohen's κ = 0.95). Both sensitivity and specificity of the activity monitoring system were greater than 80% for standing, lying, kneeling, and sitting (legs dangling). This validated system can be used to measure elected activity of laboring women and report on the effects of postures on length of first stage, pain experience, birth satisfaction, and neonatal condition. This validated maternal posture-monitoring system is available as a reference and for use by researchers who wish to develop research in this area. © 2015 by the American College of Nurse-Midwives.
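The agreement statistics reported above (Cohen's κ between the two video raters, and per-posture sensitivity and specificity against the coded video) can be computed as in this minimal sketch; the posture labels in the usage are illustrative, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Inter-rater agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal label counts.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

def sensitivity_specificity(truth, predicted, positive):
    """Per-class sensitivity and specificity, treating one posture
    label as the positive class (one-vs-rest)."""
    tp = sum(t == positive and p == positive for t, p in zip(truth, predicted))
    fn = sum(t == positive and p != positive for t, p in zip(truth, predicted))
    tn = sum(t != positive and p != positive for t, p in zip(truth, predicted))
    fp = sum(t != positive and p == positive for t, p in zip(truth, predicted))
    return tp / (tp + fn), tn / (tn + fp)
```

For example, two raters coding the sequence `["stand", "sit", "stand", "lie"]` and `["stand", "sit", "lie", "lie"]` agree on 3 of 4 labels, giving κ ≈ 0.64 after the chance correction.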
Lahlou, Saadi; Boesen-Mariani, Sabine; Franks, Bradley; Guelinckx, Isabelle
2015-01-01
On average, children and adults in developed countries consume too little water, which can lead to negative health consequences. In a one-year longitudinal field experiment in Poland, we compared the impact of three home-based interventions on helping children and their parents/caregivers to develop sustainable increased plain-water consumption habits. Fluid consumption of 334 children and their caregivers was recorded over one year using an online fluid-specific dietary record. Participants were initially randomly allocated to one of the three following conditions: Control, Information (child and carer received information on the health benefits of water), or Placement (in addition to information, free small bottles of still water were delivered at home for a limited time period). After three months, half of the non-controls were randomly assigned to Community (child and caregiver engaged in an online community forum providing support on water consumption). All conditions significantly increased the water consumption of children (by 21.9-56.7%) and of adults (by 22-89%). Placement + Community generated the largest effects. Community enhanced the impact of Placement for children and parents, as well as the impact of Information for parents but not children. The results suggest that the family setting offers considerable scope for successful installation of interventions encouraging children and caregivers to develop healthier consumption habits in mutually reinforcing ways. Combining information, affordances, and social influence gives the best, and most sustainable, results. © 2015 S. Karger AG, Basel.
Hough, Augustus; Vartan, Christine M; Groppi, Julie A; Reyes, Sonia; Beckey, Nick P
2013-07-01
The development of an electronic tool to quantify and characterize the interventions made by clinical pharmacy specialists (CPSs) in a primary care setting is described. An electronic clinical tool was developed to document the clinical pharmacy interventions made by CPSs at the Veterans Affairs Medical Center in West Palm Beach, Florida. The tool, embedded into the electronic medical record, utilizes a novel reminder dialogue to complete pharmacotherapy visit encounters and allows CPSs to document interventions made during patient care visits. Interventions are documented using specific electronic health factors so that the type and number of interventions made for both disease-specific and other pharmacotherapy interventions can be tracked. These interventions were assessed and analyzed to evaluate the impact of CPSs in the primary care setting. From February 2011 through January 2012, a total of 16,494 pharmacotherapy interventions (therapeutic changes and goals attained) were recorded. The average numbers of interventions documented per patient encounter were 0.96 for the management of diabetes mellitus, hypertension, dyslipidemia, and heart failure and 1.36 for non-disease-specific interventions. A clinical reminder tool developed to quantify and characterize the interventions provided by CPSs found that for every visit with a CPS, approximately one disease-specific intervention and one additional pharmacotherapy intervention were made, independent of those interventions being made by the primary physician or other members of the primary care team.
The Development of Clinical Document Standards for Semantic Interoperability in China
Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping
2011-01-01
Objectives: This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods: The Diagnostics knowledge framework, the related approaches taken from Health Level Seven (HL7), Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP), and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Results: Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. Conclusions: This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722
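The reusable building-block idea behind the DGs can be sketched as plain data structures: typed data elements grouped into header, section, and entry groups that compose a document. The identifiers, kinds, and element names below are hypothetical illustrations, not values from the Chinese standard.

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str
    data_type: str                     # e.g. "ST" (string), "CD" (coded), "TS" (timestamp)
    value_set: list = field(default_factory=list)  # allowed codes, if coded

@dataclass
class DataGroup:
    identifier: str                    # stable DG id, reused across documents
    kind: str                          # "header" | "section" | "entry"
    elements: list = field(default_factory=list)

def build_document(groups):
    """Assemble one clinical document from reusable DGs, keyed by
    identifier, so the same DG definition can appear in many documents."""
    return {g.identifier: g for g in groups}
```

Because each DG carries its own data types and value sets, two systems exchanging a document built from the same DGs agree on both structure and meaning, which is the semantic-interoperability goal described above.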
Accurate blood pressure recording: is it difficult?
Bhalla, A; Singh, R; D'cruz, S; Lehl, S S; Sachdev, A
2005-11-01
Blood pressure (BP) measurement is a routine procedure, but errors are frequently committed during BP recording. AIMS AND SETTINGS: The aim of the study was to examine the prevalent BP-recording practices in the institute. The study was conducted in the Medicine Department of Government Medical College, Chandigarh, a teaching institute for MBBS students. A prospective, observational study was performed among 80 doctors in a tertiary care hospital. All of them were observed during the act of BP recording by a single observer who was well versed in the guidelines issued by the British Hypertension Society (BHS); deviations from the standard set of BHS guidelines were noted and defined as errors. The results were recorded as the percentage of doctors committing each error. In our study, 90% used a mercury-type sphygmomanometer. Zero error of the apparatus and hand dominance were not noted by anyone. Everyone used the standard BP cuff for recording BP. 70% did not let the patient rest before recording BP, 80% did not remove the clothing from the arm, and none recorded BP in both arms. In the outpatient setting, 80% recorded BP in the sitting position and 14% in the supine position. In all patients whose BP was recorded in the sitting position, the BP apparatus was below the level of the heart, and 20% did not have the arm supported. 60% did not use the palpatory method to establish systolic BP, and 70% did not raise the pressure 30-40 mm Hg above the systolic level before checking the BP by auscultation. 80% deflated the cuff at a rate of more than 2 mm Hg/s, and 60% rounded off the BP to the nearest 5-10 mm Hg. 70% recorded BP only once, and 90% of the rest reinflated the cuff without completely deflating it and allowing rest before a second reading was obtained. The practice of recording BP in our hospital varies from the standard guidelines issued by the BHS.
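As an illustration only, the protocol deviations tallied above could be screened with a simple rule set. The observation keys and thresholds mirror the steps the abstract describes (rest, bared arm, both arms, cuff level, palpatory estimate, 30-40 mm Hg inflation margin, deflation at no more than 2 mm Hg/s, repeated readings) but are otherwise invented, not part of the study's instrument.

```python
def bp_protocol_deviations(obs):
    """Return the names of measurement-protocol rules an observed
    reading failed. `obs` records what the observer saw; key names
    are illustrative, not from the study."""
    rules = {
        "no_rest_before_reading": not obs.get("patient_rested", False),
        "arm_not_bared":          not obs.get("arm_bared", False),
        "single_arm_only":        not obs.get("both_arms_measured", False),
        "cuff_below_heart_level": obs.get("cuff_below_heart", False),
        "no_palpatory_estimate":  not obs.get("palpatory_systolic", False),
        "low_inflation_margin":   obs.get("inflation_above_systolic_mmHg", 0) < 30,
        "deflation_too_fast":     obs.get("deflation_mm_hg_per_s", 0) > 2,
        "single_reading_only":    obs.get("readings_taken", 1) < 2,
    }
    return sorted(name for name, failed in rules.items() if failed)
```

A reading that follows every step returns an empty list; each skipped step adds one named deviation, so the per-doctor error percentages reported above amount to averaging these flags across observations.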
Guaraldi, F; Gori, D; Beccuti, G; Prencipe, N; Giordano, R; Mints, Y; Di Giacomo, V S; Berton, A; Lorente, M; Gasco, V; Ghigo, E; Salvatori, R; Grottoli, S
2016-11-01
To determine the validity of a self-administered questionnaire (Acro-CQ) developed to systematically assess the presence, type, and time of onset of acromegaly comorbidities. In this cross-sectional study, 105 acromegaly patients and 147 controls with other types of pituitary adenoma, referred to a specialized Italian center, completed the Acro-CQ independently in an outpatient clinical setting. To test its reliability in a different setting, the Acro-CQ was administered by mail to 78 patients with acromegaly and 100 with other pituitary adenomas referred to a specialized US center. Data obtained from questionnaires in both settings were compared with medical records (gold standard). Demographics of patients and controls from both countries were similar. In both settings, >95% of the questionnaires were completely filled in; in the remainder, only one item was missed. Concordance with the medical record was excellent (k > 0.85) for most items, independent of the mode of administration, patient age, gender, nationality, pituitary adenoma type, and disease activity. The Acro-CQ is an inexpensive, reliable tool that is highly accepted by patients and is recommended to expedite the systematic collection of relevant clinical data in acromegaly at diagnosis, to be repeated at follow-ups. This tool may guide targeted, cost-effective management of complications. Moreover, it could be applied to retrieve data for survey studies in both acromegaly and other pituitary adenomas, as the information is easily and rapidly accessible for statistical analysis.
Dungey, Sheena; Glew, Simon; Heyes, Barbara; Macleod, John; Tate, A Rosemary
2016-09-01
Electronic healthcare records provide information about patient care over time, which not only affords the opportunity to improve patient care directly through effective monitoring and identification of care requirements but also offers a unique platform for both clinical and service-model research essential to the longer-term development of the health service. The quality of the recorded data can, however, be variable and can compromise the validity of data use for both primary and secondary purposes. In order to explore the challenges, benefits, and approaches to recording high-quality primary care electronic records, a Clinical Practice Research Datalink (CPRD)-sponsored workshop was held at the Society of Academic Primary Care (SAPC) conference in 2014 with the aim of engaging GPs and other data users. The workshop was held as a structured discussion, led by an expert panel and focused around three questions: (1) What are the data quality priorities for clinicians and researchers? How do these priorities differ or overlap? (2) What challenges might GPs face in the provision of good data quality, both for treating their patients and for research? Do these aims conflict? (3) What tools (such as data metrics and visualisations or software components) could assist the GP in improving data quality and patient management, and could this tie in with analytical processes occurring at the research stage? The discussion highlighted both overlap and differences in the perceived data quality priorities and challenges for different user groups. Five key areas of focus were agreed upon, and recommendations were determined for moving forward in improving quality. The importance of high-quality electronic healthcare records was set forth, along with the need for a practical, user-considered, and collaborative approach to their improvement.
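One concrete data-quality metric that serves both clinicians (is the record usable at the point of care?) and researchers (is the field usable for analysis?) is field completeness across a record set. A minimal sketch follows; the record and field names are hypothetical, and real metrics of this kind would also handle coded "unknown" values and date-range checks.

```python
def field_completeness(records, fields):
    """Fraction of records with a non-missing value for each field.
    Treats None and the empty string as missing; `records` is a list
    of dicts, as might be exported from an EHR extract."""
    n = len(records)
    if n == 0:
        return {f: 0.0 for f in fields}
    return {f: sum(r.get(f) not in (None, "") for r in records) / n
            for f in fields}
```

A per-practice report of these fractions is the kind of simple metric or visualisation the workshop's question (3) envisages handing back to GPs.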
2015-03-01
data against previous published outcomes in AP and Chronic Pancreatitis (CP). This served as useful validation of our data set before entering the...These patients can develop multiple complications from their disease. In addition, the treatments for CD (both medical and surgical) can impose...years of diagnosis. The treatment for CD can sometimes involve very expensive medications with potentially serious side effects, as well as surgical
The U.S.-India Relationship: Cross-Sector Collaboration to Promote Sustainable Development
2014-09-01
Development—Rationale for the Workshop and Overview of the Volume .....1 Michael J. Fratantuono PART I: WORKSHOP PAPERS AND DISCUSSANTS’ COMMENTS...time the leading expert at the SSI in the area of South Asia, who indicated his willingness to write a paper, to participate in the workshop, and...take to record the workshop proceedings effectively. Mr. Ryan Burke, Web Development Specialist, helped us set up the workshop website that we
Tank waste remediation system baseline tank waste inventory estimates for fiscal year 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shelton, L.W., Westinghouse Hanford
1996-12-06
A set of tank-by-tank waste inventories is derived from historical waste models, flowsheet records, and analytical data to support the Tank Waste Remediation System flowsheet and retrieval sequence studies. Enabling assumptions and methodologies used to develop the inventories are discussed. These provisional inventories conform to previously established baseline inventories and are meant to serve as an interim basis until standardized inventory estimates are made available.
Historic Properties Report: Stratford Army Engine Plant, Connecticut.
1984-07-01
aircraft, Pan American began flights to Argentina, Hawaii, and New Zealand, and by August 1934 the Sikorsky S-42 airplane had set world records for...For a lengthy discussion of the Corsair, see William Green, Famous Fighters of the Second World War (Garden City, New York: Doubleday), pp. 79-92...manufacture the Corsair fighter plane. Presently, the Avco Lycoming Division uses the facility to develop and manufacture gas turbine engines. There are
ERIC Educational Resources Information Center
Virginia Polytechnic Inst. and State Univ., Blacksburg. Div. of Vocational-Technical Education.
This self-instructional module on business records is the ninth in a set of twelve modules designed for small business owner-managers. Competencies for this module are (1) identify the records required for business operations and (2) describe the important uses of business records. Provided are information sections (reasons for records, parts of a…
Fryer-Edwards, Kelly; Arnold, Robert M; Baile, Walter; Tulsky, James A; Petracca, Frances; Back, Anthony
2006-07-01
Small-group teaching is particularly suited for complex skills such as communication. Existing work has identified the basic elements of small-group teaching, but few descriptions of higher-order teaching practices exist in the medical literature. Thus the authors developed an empirically driven and theoretically grounded model for small-group communication-skills teaching. Between 2002 and 2005, teaching observations were collected over 100 hours of direct contact time between four expert facilitators and 120 medical oncology fellows participating in Oncotalk, a semiannual, four-day retreat focused on end-of-life communication skills. The authors conducted small-group teaching observations, semistructured interviews with faculty participants, video or audio recording with transcript review, and evaluation of results by faculty participants. Teaching skills observed during the retreats included a linked set of reflective, process-oriented teaching practices: identifying a learning edge, proposing and testing hypotheses, and calibrating learner self-assessments. Based on observations and debriefings with facilitators, the authors developed a conceptual model of teaching that illustrates an iterative loop of teaching practices aimed at enhancing learners' engagement and self-efficacy. Through longitudinal, empirical observations, this project identified a set of specific teaching skills for small-group settings with applicability to other clinical teaching settings. This study extends current theory and teaching practice prescriptions by describing specific teaching practices required for effective teaching. These reflective teaching practices, while developed for communication skills training, may be useful for teaching other challenging topics such as ethics and professionalism.
NASA Technical Reports Server (NTRS)
Mitchell, B. Greg; Kahru, Mati; Marra, John (Technical Monitor)
2002-01-01
Support for this project was used to develop satellite ocean color and temperature indices (SOCTI) for the California Current System (CCS) using the historic record of the CZCS West Coast Time Series (WCTS), OCTS, SeaWiFS, and AVHRR SST. The ocean color satellite data have been evaluated in relation to CalCOFI data sets for chlorophyll (CZCS) and for ocean spectral reflectance and chlorophyll (OCTS and SeaWiFS). New algorithms for the three missions have been implemented based on in-water algorithm data sets or, in the case of CZCS, by comparing retrieved pigments with ship-based observations. New algorithms for absorption coefficients, diffuse attenuation coefficients, and primary production have also been evaluated. Satellite retrievals are being evaluated based on our large data set of pigments and optics from CalCOFI.
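Empirical chlorophyll algorithms of the kind tuned here against in-water data sets commonly take a polynomial band-ratio form: the log of chlorophyll is modeled as a polynomial in the log of a blue-to-green reflectance ratio. The sketch below uses OC4-style coefficients purely for illustration; operational values come from each mission's own calibration against ship data, as described above.

```python
import math

def band_ratio_chlorophyll(rrs_blue_max, rrs_green,
                           coeffs=(0.366, -3.067, 1.930, 0.649, -1.532)):
    """Polynomial band-ratio chlorophyll estimate (mg m^-3).
    `rrs_blue_max` is the greatest of the blue-band remote-sensing
    reflectances (e.g. 443/490/510 nm) and `rrs_green` the green-band
    reflectance (e.g. 555 nm). The default coefficients are
    illustrative OC4-style values, not any specific mission's tuning."""
    r = math.log10(rrs_blue_max / rrs_green)
    log_chl = sum(c * r ** i for i, c in enumerate(coeffs))
    return 10 ** log_chl
```

Because the leading polynomial term is negative, a higher blue-to-green ratio (clearer, bluer water) yields a lower chlorophyll estimate, which matches the physical intuition behind these retrievals.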