Sample records for handling data

  1. Ground data handling for Landsat-D [for the Thematic Mapper]

    NASA Technical Reports Server (NTRS)

    Lynch, T. J.

    1977-01-01

    The present plans for the Landsat-D ground data handling are described in relation to the mission objectives and the planned spacecraft system. The end-to-end data system is presented with particular emphasis on the data handling plans for the new instrument, the Thematic Mapper. This instrument generates ten times as much data per scene as the present Multispectral Scanner; the resulting data rate and volume are discussed, as well as possible new techniques, such as image compression, to handle them.

  2. Ground data handling for LANDSAT-D

    NASA Technical Reports Server (NTRS)

    Lynch, T. J.

    1976-01-01

    The present plans for the LANDSAT-D ground data handling are described in relation to the mission objectives and the planned spacecraft system. The end-to-end data system is presented with particular emphasis on the data handling plans for the new instrument, the Thematic Mapper. This instrument generates ten times as much data per scene as the present Multispectral Scanner, and the resulting data rate and volume are discussed, as well as possible new techniques, such as image compression, to handle them.

  3. MVAPACK: A Complete Data Handling Package for NMR Metabolomics

    PubMed Central

    2015-01-01

    Data handling in the field of NMR metabolomics has historically been reliant on either in-house mathematical routines or long chains of expensive commercial software. Thus, while the relatively simple biochemical protocols of metabolomics maintain a low barrier to entry, new practitioners of metabolomics experiments are forced to either purchase expensive software packages or craft their own data handling solutions from scratch. This inevitably complicates the standardization and communication of data handling protocols in the field. We report a newly developed open-source platform for complete NMR metabolomics data handling, MVAPACK, and describe its application on an example metabolic fingerprinting data set. PMID:24576144

  4. Earth resources sensor data handling system: NASA JSC version

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design of the NASA JSC data handling system is presented. Data acquisition parameters, computer display formats, and the flow of image data through the system are discussed, along with recommendations for improving system efficiency and modifications to existing data handling procedures that will allow the use of data duplication techniques and the accurate identification of imagery.

  5. Best practices for missing data management in counseling psychology.

    PubMed

    Schlomer, Gabriel L; Bauman, Sheri; Card, Noel A

    2010-01-01

    This article urges counseling psychology researchers to recognize and report how missing data are handled, because consumers of research cannot accurately interpret findings without knowing the amount and pattern of missing data or the strategies that were used to handle those data. Patterns of missing data are reviewed, and some of the common strategies for dealing with them are described. The authors provide an illustration in which data were simulated and evaluate 3 methods of handling missing data: mean substitution, multiple imputation, and full information maximum likelihood. Results suggest that mean substitution is a poor method for handling missing data, whereas both multiple imputation and full information maximum likelihood are recommended alternatives to this approach. The authors suggest that researchers fully consider and report the amount and pattern of missing data and the strategy for handling those data in counseling psychology research and that editors advise researchers of this expectation.
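    The weakness of mean substitution that the authors report can be illustrated with a small simulation. This is a minimal sketch in plain Python with invented numbers, not the authors' code: deleting values completely at random and filling them with the observed mean roughly preserves the mean but artificially shrinks the variance, which distorts any analysis that depends on spread.

```python
import random
import statistics

random.seed(42)

# Simulate a complete variable, then delete ~30% of values completely at random.
complete = [random.gauss(50, 10) for _ in range(10_000)]
observed = [x for x in complete if random.random() > 0.30]

# Mean substitution: replace every missing value with the observed mean.
n_missing = len(complete) - len(observed)
imputed = observed + [statistics.mean(observed)] * n_missing

# The mean is roughly preserved, but the standard deviation is clearly
# understated, because every imputed value sits exactly at the center.
print(statistics.pstdev(complete))  # close to 10
print(statistics.pstdev(imputed))   # noticeably smaller
```

    Multiple imputation avoids this by drawing several plausible values per missing cell and propagating the extra uncertainty, which is why the article recommends it (or full information maximum likelihood) over mean substitution.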

  6. Teaching Data Handling in Foundation Phase: Teachers' Experiences

    ERIC Educational Resources Information Center

    Naidoo, Jayaluxmi; Mkhabela, Nokuphiwa

    2017-01-01

    Data handling plays an important role within mathematics education since it encompasses real-world situations and assists in developing critical thinking skills in learners. However, international assessments disclose that learners worldwide are not performing well in data handling. This article explores foundation phase South African teachers'…

  7. A Review of Missing Data Handling Methods in Education Research

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb R.

    2014-01-01

    Missing data are a common occurrence in survey-based research studies in education, and the way missing values are handled can significantly affect the results of analyses based on such data. Despite known problems with performance of some missing data handling methods, such as mean imputation, many researchers in education continue to use those…

  8. 32 CFR 37.855 - How should I handle protected data?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 1 2012-07-01 2012-07-01 false How should I handle protected data? 37.855 Section 37.855 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND... Intellectual Property § 37.855 How should I handle protected data? Prior to releasing or disclosing data marked...

  9. 32 CFR 37.855 - How should I handle protected data?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 1 2014-07-01 2014-07-01 false How should I handle protected data? 37.855 Section 37.855 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND... Intellectual Property § 37.855 How should I handle protected data? Prior to releasing or disclosing data marked...

  10. 32 CFR 37.855 - How should I handle protected data?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 1 2013-07-01 2013-07-01 false How should I handle protected data? 37.855 Section 37.855 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND... Intellectual Property § 37.855 How should I handle protected data? Prior to releasing or disclosing data marked...

  11. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  12. A reference model for space data system interconnection services

    NASA Astrophysics Data System (ADS)

    Pietras, John; Theis, Gerhard

    1993-03-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous by the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  13. A reference model for space data system interconnection services

    NASA Technical Reports Server (NTRS)

    Pietras, John; Theis, Gerhard

    1993-01-01

    The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous by the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  14. A compilation and analysis of helicopter handling qualities data. Volume 2: Data analysis

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.

    1979-01-01

    A compilation and an analysis of helicopter handling qualities data are presented. Multiloop manual control methods are used to analyze the descriptive data, stability derivatives, and transfer functions for a six degrees of freedom, quasi static model. A compensatory loop structure is applied to coupled longitudinal, lateral and directional equations in such a way that key handling qualities features are examined directly.

  15. Infant handling in bonobos (Pan paniscus): Exploring functional hypotheses and the relationship to oxytocin.

    PubMed

    Boose, Klaree; White, Frances; Brand, Colin; Meinelt, Audra; Snodgrass, Josh

    2018-05-09

    Infant handling describes interactions between infants and non-maternal group members and is widespread across mammalian taxa. The expression of infant handling behaviors, defined as any affiliative or agonistic interaction between a group member and an infant, varies considerably among primate species. Several functional hypotheses may explain the adaptive value of infant handling, including the Kin Selection hypothesis, which describes handling as a mechanism through which indirect fitness is increased and predicts a bias in handling behaviors directed toward related (genetic) infants; the Alliance Formation hypothesis, which describes handling as a social commodity and predicts females with infants will support handlers during conflict; and the Learning-to-Mother hypothesis, which describes handling as a mechanism through which handlers learn species-specific maternal behaviors and predicts that handling will occur most frequently in immature and nulliparous females. The purpose of this study was to use behavioral observation and data on urinary oxytocin, a neuropeptide hormone known to modulate maternal care and social bonds in mammals, to describe the pattern of infant handling in bonobos (Pan paniscus) and to explore the proposed functional hypotheses. Data show that related infant-handler dyads occurred significantly more frequently than unrelated infant-handler dyads during some of the study period and that handling was positively correlated with support during conflict. Data also showed that immature and nulliparous females handled infants significantly more than other age-sex categories and exhibited higher post-handling oxytocin values than other age-sex classes.
The trends identified in this data set provide insight into the role oxytocin may play in facilitating care-giving behaviors in young female bonobos and help to narrow the focus of future research efforts, particularly those associated with the Kin Selection, Alliance Formation, and Learning-to-Mother functional hypotheses. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Structural Elements in a Persistent Identifier Infrastructure and Resulting Benefits for the Earth Science Community

    NASA Astrophysics Data System (ADS)

    Weigel, T.; Toussaint, F.; Stockhause, M.; Höck, H.; Kindermann, S.; Lautenschlager, M.; Ludwig, T.

    2012-12-01

    We propose a wide adoption of structural elements (typed links, collections, trees) in the Handle System to improve identification and access of scientific data, metadata and software, as well as traceability of data provenance. Typed links target the issue of data provenance as a means to assess the quality of scientific data. Data provenance is viewed here as a directed acyclic graph with nodes representing data and edges representing derivative operations (Moreau 2010). Landing pages can allow a human user to explore the provenance graph back to the primary unprocessed data, thereby also giving credit to the original data producer. As no single infrastructure in Earth System Modeling covers the complete data lifecycle, we propose to split the problem domain in two parts. Project-specific infrastructures such as the German project C3-Grid or the Earth System Grid Federation (ESGF) for CMIP5 data are aware of data and data operations (Toussaint et al. 2012) and can thus detect and accumulate individual nodes and edges in the provenance graph, assigning Handles to data, metadata and software. With a common schema for typed links, the provenance graph is established as downstream infrastructures refer to incoming Handles. Data in this context is, for example, hierarchically structured Earth System model output data, which receives DataCite DOIs only for the most coarse-grained elements. Using Handle tree structures, the lower levels of the hierarchy can also receive Handles, allowing authors to identify more precisely the data they used (Lawrence et al. 2011). For example, we can define a DOI for just the 2m-temperature variable across many CMIP5 experiments, or a DOI for model and observational data coming from different sources. The structural elements should be implemented through Handle values at the Handle infrastructure level for two reasons. First, Handle values are more durable than downstream websites or databases, so the provenance chain does not break if individual links become unavailable. Second, a single service cannot interpret links if downstream solutions differ in their implementation schemas. Emerging efforts driven by the European Persistent Identifier Consortium (EPIC) aim to establish a default mechanism for structural elements at the Handle level. We propose making applications that take part in the data lifecycle aware of data-derivation provenance and letting them contribute additional elements to the provenance graph. Since DataCite DOIs are themselves Handles, they can act as a cornerstone and provide an entry point for discovering the provenance graph. References: B. Lawrence, C. Jones, B. Matthews, S. Pepler, and S. Callaghan, "Citation and peer review of data: Moving towards formal data publication," Int. J. of Digital Curation, vol. 6, no. 2, 2011. L. Moreau, "The foundations for provenance on the web," Foundations and Trends in Web Science, vol. 2, no. 2-3, pp. 99-241, 2010. F. Toussaint, T. Weigel, H. Thiemann, H. Höck, M. Stockhause, "Application Examples for Handle System Usage," submitted to AGU 2012 session IN009.
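    The structure the abstract describes, a directed acyclic provenance graph that a user can walk back to the primary unprocessed data, can be sketched as a toy traversal. The handle names and the `derived_from` mapping below are invented for illustration; this is not an actual Handle System API.

```python
from collections import deque

# Toy provenance graph: each entry maps a handle to the handles it was
# derived from (a "derivedFrom" typed link). All names are hypothetical.
derived_from = {
    "hdl:21.x/plot-2m-temp": ["hdl:21.x/cmip5-2m-temp"],
    "hdl:21.x/cmip5-2m-temp": ["hdl:21.x/model-run-A", "hdl:21.x/model-run-B"],
    "hdl:21.x/model-run-A": [],
    "hdl:21.x/model-run-B": [],
}

def primary_sources(handle):
    """Walk the provenance DAG back to primary unprocessed data,
    i.e. nodes with no outgoing 'derivedFrom' links."""
    seen, queue, primaries = set(), deque([handle]), []
    while queue:
        h = queue.popleft()
        if h in seen:
            continue
        seen.add(h)
        parents = derived_from.get(h, [])
        if not parents:
            primaries.append(h)  # reached original data: credit its producer
        queue.extend(parents)
    return sorted(primaries)

print(primary_sources("hdl:21.x/plot-2m-temp"))
# ['hdl:21.x/model-run-A', 'hdl:21.x/model-run-B']
```

    A landing page performing this walk is what lets a derived product give credit back to the original data producers, as the abstract proposes.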

  17. Information Science Panel joint meeting with Imaging Science Panel

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Specific activity in information extraction science (taken to include data handling) is needed to: help identify the bounds of practical missions; identify potential data handling and analysis scenarios; identify the required enabling technology; and identify the requirements for a design data base to be used by the disciplines in determining potential parameters for future missions. It was defined that specific analysis topics were a function of the discipline involved, and therefore no attempt was made to define any specific analysis developments required. Rather, it was recognized that a number of generic data handling requirements exist whose solutions cannot be typically supported by the disciplines. The areas of concern were therefore defined as: data handling aspects of system design considerations; enabling technology for data handling, with specific attention to rectification and registration; and enabling technology for analysis. Within each of these areas, the following topics were addressed: state of the art (current status and contributing factors); critical issues; and recommendations for research and/or development.

  18. TDRSS data handling and management system study. Ground station systems for data handling and relay satellite control

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of a two-phase study of the Data Handling and Management System (DHMS) are presented. An original baseline DHMS is described. Its estimated costs are presented in detail. The DHMS automates the Tracking and Data Relay Satellite System (TDRSS) ground station's functions and handles both the forward and return link user and relay satellite data passing through the station. Direction of the DHMS is effected via a TDRSS Operations Control Central (OCC) that is remotely located. A composite ground station system, a modified DHMS (MDHMS), was conceptually developed. The MDHMS performs both the DHMS and OCC functions. Configurations and costs are presented for systems using minicomputers and midicomputers. It is concluded that a MDHMS should be configured with a combination of the two computer types. The midicomputers provide the system's organizational direction and computational power, and the minicomputers (or interface processors) perform repetitive data handling functions that relieve the midicomputers of these burdensome tasks.

  19. 10 CFR 1016.24 - Special handling of classified material.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Special handling of classified material. 1016.24 Section 1016.24 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA Physical Security § 1016.24 Special handling of classified material. When the Restricted Data contained in material...

  20. 10 CFR 1016.24 - Special handling of classified material.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Special handling of classified material. 1016.24 Section 1016.24 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA Physical Security § 1016.24 Special handling of classified material. When the Restricted Data contained in material...

  21. When and how should multiple imputation be used for handling missing data in randomised clinical trials - a practical guide with flowcharts.

    PubMed

    Jakobsen, Janus Christian; Gluud, Christian; Wetterslev, Jørn; Winkel, Per

    2017-12-06

    Missing data may seriously compromise inferences from randomised clinical trials, especially if missing data are not handled appropriately. The potential bias due to missing data depends on the mechanism causing the data to be missing, and the analytical methods applied to address the missingness. Therefore, the analysis of trial data with missing values requires careful planning and attention. The authors had several meetings and discussions considering optimal ways of handling missing data to minimise the bias potential. We also searched PubMed (key words: missing data; randomi*; statistical analysis) and reference lists of known studies for papers (theoretical papers; empirical studies; simulation studies; etc.) on how to deal with missing data when analysing randomised clinical trials. Handling missing data is an important, yet difficult and complex task when analysing results of randomised clinical trials. We consider how to optimise the handling of missing data during the planning stage of a randomised clinical trial and recommend analytical approaches which may prevent bias caused by unavoidable missing data. We consider the strengths and limitations of using best-worst and worst-best sensitivity analyses, multiple imputation, and full information maximum likelihood. We also present practical flowcharts on how to deal with missing data and an overview of the steps that always need to be considered during the analysis stage of a trial. We present a practical guide and flowcharts describing when and how multiple imputation should be used to handle missing data in randomised clinical trials.

  22. Information science team

    NASA Technical Reports Server (NTRS)

    Billingsley, F.

    1982-01-01

    Concerns are expressed about the data handling aspects of system design and about enabling technology for data handling and data analysis. The status, contributing factors, critical issues, and recommendations for investigations are listed for data handling, rectification and registration, and information extraction. Potential support for individual P.I. research tasks, systematic data system design, and system operation is outlined. The need for an airborne spectrometer class instrument for fundamental research in high spectral and spatial resolution is indicated. Geographic information system formatting and labelling techniques, very large scale integration, and methods for providing multitype data sets must also be developed.

  23. Earth resources ground data handling systems for the 1980's

    NASA Technical Reports Server (NTRS)

    Vanvleck, E. M.; Sinclair, K. F.; Pitts, S. W.; Slye, R. E.

    1973-01-01

    The system requirements of an operational data handling system for earth resources in the decade of the 1980's are investigated. Attention is drawn to problems encountered in meeting the stringent agricultural user requirements of that time frame. Such an understanding of requirements is essential not only in designing the ground system that will ultimately handle the data, but also in design studies of the earth resources platform, sensors, and data relay satellites which may be needed.

  24. Remote-handled/special case TRU waste characterization summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.A.

    1984-02-27

    Remote-handled wastes are stored at Los Alamos, Hanford, Oak Ridge, and the Idaho National Engineering Laboratory. A site-by-site discussion of RH waste handling, placement, and container data follows, along with a series of data tables compiled in the TRU Waste Systems Office. These tables are a compendium of the most up-to-date and accurate data available. 2 figures, 10 tables.

  25. Data handling for the modular observatory

    NASA Technical Reports Server (NTRS)

    Taber, J. E.

    1975-01-01

    The current paper summarizes work undertaken at TRW for the EOS satellite and related missions, and it presents conclusions that lead to a flexible and low-cost overall system implementation. It shows how the usual communication and data handling functions must be altered to meet the modularization ground rules, and it demonstrates the modularization that is possible in the handling of wideband payload data both on board and on the ground.

  26. 32 CFR 37.855 - How should I handle protected data?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Award Terms Related to Other Administrative Matters Intellectual Property § 37.855 How should I handle protected data? Prior to releasing or disclosing data marked...

  27. 32 CFR 37.855 - How should I handle protected data?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Award Terms Related to Other Administrative Matters Intellectual Property § 37.855 How should I handle protected data? Prior to releasing or disclosing data marked...

  28. A new approach to handling incoming verifications.

    PubMed

    Luizzo, Anthony; Roy, Bill; Luizzo, Philip

    2016-10-01

    Outside requests for data on current or former employees are handled in different ways by healthcare organizations and present considerable liability risks if a corporate policy for handling such risks is not in place. In this article, the authors present a strategy for responsible handling of sensitive information.

  29. AVIRIS onboard data handling and control

    NASA Technical Reports Server (NTRS)

    Steinkraus, Ronald E.; Hickok, Roger W.

    1987-01-01

    The timing and flow of detector and ancillary data for the Airborne Visible/Infrared imaging spectrometer (AVIRIS) are controlled within the instrument by its digital electronics assembly. In addition to providing detector and signal chain timing, the digital electronics receives, formats, and rate-buffers digitized science data; collects and formats ancillary (calibration and engineering) data; and merges both into a single tape record. Overall AVIRIS data handling is effected by a combination of dedicated digital electronics to control instrument timing, image data flow, and data rate buffering and a microcomputer programmed to handle real-time control of instrument mechanisms and the coordinated preparation of ancillary data.

  30. Standards should be applied in the prevention and handling of missing data for patient-centered outcomes research: a systematic review and expert consensus.

    PubMed

    Li, Tianjing; Hutfless, Susan; Scharfstein, Daniel O; Daniels, Michael J; Hogan, Joseph W; Little, Roderick J A; Roy, Jason A; Law, Andrew H; Dickersin, Kay

    2014-01-01

    To recommend methodological standards in the prevention and handling of missing data for primary patient-centered outcomes research (PCOR). We searched National Library of Medicine Bookshelf and Catalog as well as regulatory agencies' and organizations' Web sites in January 2012 for guidance documents that had formal recommendations regarding missing data. We extracted the characteristics of included guidance documents and recommendations. Using a two-round modified Delphi survey, a multidisciplinary panel proposed mandatory standards on the prevention and handling of missing data for PCOR. We identified 1,790 records and assessed 30 as having relevant recommendations. We proposed 10 standards as mandatory, covering three domains. First, the single best approach is to prospectively prevent missing data occurrence. Second, use of valid statistical methods that properly reflect multiple sources of uncertainty is critical when analyzing missing data. Third, transparent and thorough reporting of missing data allows readers to judge the validity of the findings. We urge researchers to adopt rigorous methodology and promote good science by applying best practices to the prevention and handling of missing data. Developing guidance on the prevention and handling of missing data for observational studies and studies that use existing records is a priority for future research. Copyright © 2014 Elsevier Inc. All rights reserved.

  31. A study to identify research issues in the area of electromagnetic measurements and signal handling of remotely sensed data

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Research issues in the area of electromagnetic measurements and signal handling of remotely sensed data are identified. The following seven issues are discussed; platform/sensor system position and velocity, platform/sensor attitudes and attitude rates, optics and antennas, detectors and associated electronics, sensor calibration, signal handling, and system design.

  32. Investigation of a high speed data handling system for use with multispectral aircraft scanners

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Meredith, B. D.

    1978-01-01

    A buffer memory data handling technique for use with multispectral aircraft scanners is presented which allows digital data generated at high data rates to be recorded on magnetic tape. A digital memory is used to temporarily store the data for subsequent recording at slower rates during the passive time of the scan line, thereby increasing the maximum data rate recording capability over real-time recording. Three possible implementations are described and the maximum data rate capability is defined in terms of the speed capability of the key hardware components. The maximum data rates can be used to define the maximum ground resolution achievable by a multispectral aircraft scanner using conventional data handling techniques.
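    The rate-buffering idea in this abstract reduces to simple arithmetic: data arrives in bursts during the active part of each scan line, the recorder drains continuously at a slower rate, and the buffer absorbs the difference until the passive time catches up. The sketch below uses invented numbers, not the scanner parameters from the report.

```python
def buffer_feasible(burst_rate_bps, record_rate_bps, active_s, passive_s):
    """Check whether a scan-line burst can be smoothed onto a slower
    recorder, and return the minimum buffer size in bits.

    Assumes the recorder drains the buffer continuously, including
    during the active (burst) portion of the scan line.
    """
    bits_in = burst_rate_bps * active_s                   # generated per scan line
    bits_out = record_rate_bps * (active_s + passive_s)   # drained per scan line
    # The buffer must hold whatever accumulates while input exceeds output.
    min_buffer = bits_in - record_rate_bps * active_s
    return bits_out >= bits_in, min_buffer

# Hypothetical scanner: 30 Mbit/s bursts for 20 ms of a 100 ms scan line,
# recorded continuously at 10 Mbit/s.
ok, size = buffer_feasible(30e6, 10e6, 0.020, 0.080)
print(ok, size)  # feasible; a 400 kbit buffer suffices
```

    The feasibility condition (average input rate not exceeding the record rate) is exactly what bounds the maximum ground resolution mentioned at the end of the abstract: finer resolution raises the burst data volume per scan line until the inequality fails.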

  33. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.

  34. NASA/ESA CV-990 Spacelab simulation. Appendixes: C, data-handling: Planning and implementation; D, communications; E, mission documentation

    NASA Technical Reports Server (NTRS)

    Reller, J. O., Jr.

    1976-01-01

    Data handling, communications, and documentation aspects of the ASSESS mission are described. Most experiments provided their own data handling equipment, although some used the airborne computer for backup, and one experiment required real-time computations. Communications facilities were set up to simulate those to be provided between Spacelab and the ground, including a downlink TV system. Mission documentation was kept to a minimum and proved sufficient. Examples are given of the basic documents of the mission.

  35. Data handling and representation of freeform surfaces

    NASA Astrophysics Data System (ADS)

    Steinkopf, Ralf; Dick, Lars; Kopf, Tino; Gebhardt, Andreas; Risse, Stefan; Eberhardt, Ramona

    2011-10-01

    Freeform surfaces enable innovative optics. Because they are not constrained by axial symmetry, they are almost unrestricted in design, and they are used to reduce installation space and enhance the performance of optical elements. State-of-the-art optical design tools use powerful algorithms to simulate freeform surfaces, and new mathematical approaches are under development /1/. In consequence, new optical designs /2/ are pushing the development of manufacturing processes, and novel types of datasets have to proceed through the process chain /3/. The complexity of these data poses a major challenge for data handling. Because freeform surfaces are asymmetrical and three-dimensional, large data volumes have to be created, trimmed, extended, and fitted, and all of these operations must be performed without losing the accuracy of the original design data. Additionally, the manifold types of geometries result in different mathematical representations of freeform surfaces, and the CAD/CAM tools in use deal with a variety of spatial transport formats. For all these reasons, manufacture-oriented approaches to freeform data handling are not yet sufficiently developed. This paper suggests a classification of freeform surfaces based on the manufacturing methods offered by diamond machining. The different manufacturing technologies, ranging from servo turning to shaping, require a differentiated approach to the data handling process. The use of analytical descriptions in the form of splines and polynomials, as well as discrete descriptions such as point clouds, is shown in relation to this classification. Advantages and disadvantages of the freeform representations are discussed, aspects of data handling between different process steps are pointed out, and suitable exchange formats for freeform data are proposed. The described approach offers a path to efficient data handling from optical design through to novel optical systems.
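
    The trade-off between analytic and discrete freeform descriptions can be illustrated with a small sketch: a bivariate polynomial (one of the analytic forms mentioned above) is fitted to a point cloud by linear least squares. The surface, sampling, and all names are invented for illustration and are not from the cited work.

```python
import numpy as np

# Fit an analytic description (a full quadratic z(x, y)) to a discrete
# point-cloud description by linear least squares. Invented surface.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
z = 0.5 + 0.1 * x - 0.2 * y + 0.3 * x**2 + 0.05 * x * y - 0.15 * y**2

# Design matrix for a full quadratic in (x, y)
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)

# Because the invented data are exactly quadratic, the fit reproduces the
# surface to rounding error, i.e. the representation change loses no
# design accuracy.
max_residual = np.max(np.abs(A @ coeffs - z))
```

    Real freeform data would carry measurement noise and higher-order terms, so the residual would then quantify how much accuracy the chosen analytic representation sacrifices.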

  16. CTEPP STANDARD OPERATING PROCEDURE FOR HANDLING MISSING SAMPLES AND DATA (SOP-2.24)

    EPA Science Inventory

    This SOP describes the method for handling missing samples or data. Missing samples or data are to be identified as soon as possible during field sampling. The SOP provides guidance on collecting the missing sample or data and on documenting the reason for the absence.

  17. Best Practices for Missing Data Management in Counseling Psychology

    ERIC Educational Resources Information Center

    Schlomer, Gabriel L.; Bauman, Sheri; Card, Noel A.

    2010-01-01

    This article urges counseling psychology researchers to recognize and report how missing data are handled, because consumers of research cannot accurately interpret findings without knowing the amount and pattern of missing data or the strategies that were used to handle those data. Patterns of missing data are reviewed, and some of the common…

  18. Flight simulator for hypersonic vehicle and a study of NASP handling qualities

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.; Park, Eui H.; Deeb, Joseph M.; Kim, Jung H.

    1992-01-01

    The research goal of the Human-Machine Systems Engineering Group was to study the existing handling quality studies in aircraft with sonic to supersonic speeds and power in order to understand information requirements needed for a hypersonic vehicle flight simulator. This goal falls within the NASA task statements: (1) develop flight simulator for hypersonic vehicle; (2) study NASP handling qualities; and (3) study effects of flexibility on handling qualities and on control system performance. Following the above statement of work, the group has developed three research strategies. These are: (1) to study existing handling quality studies and the associated aircraft and develop flight simulation data characterization; (2) to develop a profile for flight simulation data acquisition based on objective statement no. 1 above; and (3) to develop a simulator and an embedded expert system platform which can be used in handling quality experiments for hypersonic aircraft/flight simulation training.

  19. The Impact of Different Missing Data Handling Methods on DINA Model

    ERIC Educational Resources Information Center

    Sünbül, Seçil Ömür

    2018-01-01

    In this study, it was aimed to investigate the impact of different missing data handling methods on DINA model parameter estimation and classification accuracy. In the study, simulated data were used and the data were generated by manipulating the number of items and sample size. In the generated data, two different missing data mechanisms…

  20. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups rather than as individuals and are key tools for assessing interventions in health research where treatment contamination is likely or individual randomization is not feasible. Two major potential pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and the statistical analysis of the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and the method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) reported some missing outcome data. Of those reporting missing data, the median percentage of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should use missing data methods that are valid under plausible assumptions, in order to increase statistical power in trials and reduce the possibility of bias. Sensitivity analyses with weakened assumptions about the missing data mechanism should also be performed to explore the robustness of the results reported in the primary analysis.
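
    The reporting baseline the review asks for (the amount and pattern of missing outcome data, and the size of the complete-case sample) can be computed as in this sketch; the CRT-style dataset is invented.

```python
import numpy as np

# Invented CRT-style data: 4 clusters of 25 participants each, with
# roughly 20% of outcomes deleted at random.
rng = np.random.default_rng(1)
cluster = np.repeat(np.arange(4), 25)
outcome = rng.normal(size=100)
outcome[rng.random(100) < 0.2] = np.nan

# Overall and per-cluster missingness, plus the size of the sample that a
# complete case analysis would silently keep.
missing = np.isnan(outcome)
overall_pct = 100 * missing.mean()
per_cluster_pct = {c: 100 * missing[cluster == c].mean() for c in range(4)}
n_complete = int((~missing).sum())
```

    Reporting these three quantities costs a few lines of code but lets readers judge how much information a complete case analysis discards and whether missingness is concentrated in particular clusters.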

  1. Space shuttle data handling and communications considerations.

    NASA Technical Reports Server (NTRS)

    Stoker, C. J.; Minor, R. G.

    1971-01-01

    Operational and development flight instrumentation, data handling subsystems, and communication requirements of the space shuttle orbiter are discussed. Emphasis is placed on data gathering methods, crew display data, computer processing, recording, and telemetry by means of a digital data bus. Also considered are conceptual aspects of the overall communication system and design features allowing a proper specification of telemetry encoders and instrumentation recorders. An adaptive bit rate concept is proposed to handle the telemetry bit rates, which vary with the amount of operational and experimental data to be transmitted. A split-phase encoding technique is proposed for telemetry to cope with the excessive bit jitter and low bit transition density which may affect television performance.

  2. Handling Missing Data in Educational Research Using SPSS

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb

    2012-01-01

    This study looked at the effect of a number of factors such as the choice of analytical method, the handling method for missing data, sample size, and proportion of missing data, in order to evaluate the effect of missing data treatment on accuracy of estimation. In order to accomplish this a methodological approach involving simulated data was…

  3. Missing Data and Institutional Research

    ERIC Educational Resources Information Center

    Croninger, Robert G.; Douglas, Karen M.

    2005-01-01

    Many do not consider the effect that missing data have on their survey results nor do they know how to handle missing data. This chapter offers strategies for handling item-missing data and provides a practical example of how these strategies may affect results. The chapter concludes with recommendations for preventing and dealing with missing…

  4. Trade-off analysis of modes of data handling for earth resources (ERS), volume 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Data handling requirements are reviewed for earth observation missions along with likely technology advances. Parametric techniques for synthesizing potential systems are developed. Major tasks include: (1) review of the sensors under development and extensions of or improvements in these sensors; (2) development of mission models for missions spanning land, ocean, and atmosphere observations; (3) summary of data handling requirements including the frequency of coverage, timeliness of dissemination, and geographic relationships between points of collection and points of dissemination; (4) review of data routing to establish ways of getting data from the collection point to the user; (5) on-board data processing; (6) communications link; and (7) ground data processing. A detailed synthesis of three specific missions is included.

  5. Exploratory piloted simulator study of the effects of winglets on handling qualities of a representative agricultural airplane

    NASA Technical Reports Server (NTRS)

    Ogburn, M. E.; Brown, P. W.

    1980-01-01

    The effects on handling qualities of adding winglets to a representative agricultural aircraft configuration during swath-run maneuvering were evaluated. Aerodynamic data used in the simulation were based on low-speed wind tunnel tests of a full scale airplane and a subscale model. The Cooper-Harper handling qualities rating scale, supplementary pilot comments, and pilot vehicle performance data were used to describe the handling qualities of the airplane with the different wing-tip configurations. Results showed that the lateral-directional handling qualities of the airplane were greatly affected by the application of winglets and winglet cant angle. The airplane with winglets canted out 20 deg exhibited severely degraded lateral directional handling qualities in comparison to the basic airplane. When the winglets were canted inward 10 deg, the flying qualities of the configuration were markedly improved over those of the winglet-canted-out configuration or the basic configuration without winglets, indicating that proper tailoring of the winglet design may afford a potential benefit in the area of handling qualities.

  6. Teaching Missing Data Methodology to Undergraduates Using a Group-Based Project within a Six-Week Summer Program

    ERIC Educational Resources Information Center

    Marron, Megan M.; Wahed, Abdus S.

    2016-01-01

    Missing data mechanisms, methods of handling missing data, and the potential impact of missing data on study results are usually not taught until graduate school. However, the appropriate handling of missing data is fundamental to biomedical research and should be introduced earlier on in a student's education. The Summer Institute for Training in…

  7. The Advanced Orbiting Systems Testbed Program: Results to date

    NASA Technical Reports Server (NTRS)

    Otranto, John F.; Newsome, Penny A.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Recommendations for Packet Telemetry (PT) and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's (GSFC's) AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations.

  8. Ergonomics of disposable handles for minimally invasive surgery.

    PubMed

    Büchel, D; Mårvik, R; Hallabrin, B; Matern, U

    2010-05-01

    The ergonomic deficiencies of currently available minimally invasive surgery (MIS) instrument handles have been addressed in many studies. In this study, a new ergonomic pistol handle concept, realized as a prototype, and two disposable ring handles were investigated according to ergonomic properties set by new European standards. In this study, 25 volunteers performed four practical tasks to evaluate the ergonomics of the handles used in standard operating procedures (e.g., measuring a suture and cutting to length, precise maneuvering and targeting, and dissection of a gallbladder). Moreover, 20 participants underwent electromyography (EMG) tests to measure the muscle strain they experienced while carrying out the basic functions (grasp, rotate, and maneuver) in the x, y, and z axes. The data measured included the number of errors, the time required for task completion, perception of pressure areas, and EMG data. The values for usability in the test were effectiveness, efficiency, and user satisfaction. Surveys relating to the subjective rating were completed after each task for each of the three handles tested. Each handle except the new prototype caused pressure areas and pain. Extreme differences in muscle strain could not be observed for any of the three handles. Experienced surgeons worked more quickly with the prototype when measuring and cutting a suture (approximately 20%) and during precise maneuvering and targeting (approximately 20%). On the other hand, they completed the dissection task faster with the handle manufactured by Ethicon. Fewer errors were made with the prototype in dissection of the gallbladder. In contrast to the handles available on the market, the prototype was always rated as positive by the volunteers in the subjective surveys. None of the handles could fulfil all of the requirements with top scores. Each handle had its advantages and disadvantages. 
In contrast to the ring handles, the volunteers could complete most of the tasks more efficiently using the prototype handle, without any notable pressure areas, cramps, or pain.

  9. Individual Information-Centered Approach for Handling Physical Activity Missing Data

    ERIC Educational Resources Information Center

    Kang, Minsoo; Rowe, David A.; Barreira, Tiago V.; Robinson, Terrance S.; Mahar, Matthew T.

    2009-01-01

    The purpose of this study was to validate individual information (II)-centered methods for handling missing data, using data samples of 118 middle-aged adults and 91 older adults equipped with Yamax SW-200 pedometers and Actigraph accelerometers for 7 days. We used a semisimulation approach to create six data sets: three physical activity outcome…

  10. Handling Missing Data in Structural Equation Models in R: A Replication Study for Applied Researchers

    ERIC Educational Resources Information Center

    Wolgast, Anett; Schwinger, Malte; Hahnel, Carolin; Stiensmeier-Pelster, Joachim

    2017-01-01

    Introduction: Multiple imputation (MI) is one of the most highly recommended methods for replacing missing values in research data. The scope of this paper is to demonstrate missing data handling in SEM by analyzing two modified data examples from educational psychology, and to give practical recommendations for applied researchers. Method: We…

  11. [Prevention and handling of missing data in clinical trials].

    PubMed

    Jiang, Zhi-wei; Li, Chan-juan; Wang, Ling; Xia, Jie-lai

    2015-11-01

    Missing data are a common and often unavoidable issue in clinical trials. They not only lower the trial's power but also introduce bias into the results. Therefore, on the one hand, missing data handling methods are employed in data analysis; on the other hand, it is vital to prevent missing data in the trials, and prevention should take first place. From the data perspective, measures should first be taken at the stages of protocol design, data collection, and data checking to enhance patient compliance and reduce unnecessary missing data. Second, the causes of confirmed missing data in the trials should be noted and recorded in detail; these are very important for determining the missing data mechanism and choosing suitable handling methods, e.g., last observation carried forward (LOCF), multiple imputation (MI), or the mixed-effect model repeated measures (MMRM) approach.
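
    A minimal sketch of last observation carried forward (LOCF), one of the handling methods named above, applied to an invented subjects-by-visits matrix in which NaN marks a missed visit:

```python
import numpy as np

# LOCF sketch on invented repeated-measures data (rows = subjects,
# columns = visits); assumes the baseline visit is always observed.
data = np.array([
    [4.0, 3.5, np.nan, 3.0],
    [5.0, np.nan, np.nan, 4.2],
])

locf = data.copy()
for row in locf:                  # rows are views, so locf is filled in place
    for j in range(1, row.size):
        if np.isnan(row[j]):
            row[j] = row[j - 1]   # carry the last observed value forward
```

    LOCF is simple but implicitly assumes no change after dropout, which is one reason the abstract stresses recording why each value is missing before choosing a method.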

  12. A Look at Technologies Vis-a-vis Information Handling Techniques.

    ERIC Educational Resources Information Center

    Swanson, Rowena W.

    The paper examines several ideas for information handling implemented with new technologies that suggest directions for future development. These are grouped under the topic headings: Handling Large Data Banks, Providing Personalized Information Packages, Providing Information Specialist Services, and Expanding Man-Machine Interaction. Guides in…

  13. CTEPP STANDARD OPERATING PROCEDURE FOR HANDLING SAMPLE AND DATA CUSTODY (SOP-2.26)

    EPA Science Inventory

    This SOP describes the method for handling sample custody. A standardized Chain-of-Custody (CoC) Record is used to document the sample/data custody. Each participant is assigned one CoC Record for the samples/data collected at their home and/or day care center.

  14. Comparison of Two Approaches for Handling Missing Covariates in Logistic Regression

    ERIC Educational Resources Information Center

    Peng, Chao-Ying Joanne; Zhu, Jin

    2008-01-01

    For the past 25 years, methodological advances have been made in missing data treatment. Most published work has focused on missing data in dependent variables under various conditions. The present study seeks to fill the void by comparing two approaches for handling missing data in categorical covariates in logistic regression: the…

  15. Early Fuel Cell Market Demonstrations | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    Material Handling Equipment Data Collection and Analysis: 2015 Report, DOE Hydrogen and Fuel Cells Program Annual Progress Report (December 2015); Material Handling Equipment Data Collection and Analysis: 2015 Review, DOE Technical Report (March 2015); 2014 Forklift and Backup Power Data Collection and Analysis: 2014 Report, DOE

  16. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert B.

    1994-01-01

    Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules, for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK-related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer, including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries, along with on-line documentation, are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

  17. Performance of the Magnetospheric Multiscale central instrument data handling

    NASA Astrophysics Data System (ADS)

    Klar, Robert A.; Miller, Scott A.; Brysch, Michael L.; Bertrand, Allison R.

    In order to study the fundamental physical processes of magnetic reconnection, particle acceleration, and turbulence, the Magnetospheric Multiscale (MMS) mission employs a constellation of four identically configured observatories, each with a suite of complementary science instruments. Southwest Research Institute® (SwRI®) developed the Central Instrument Data Processor (CIDP) to handle the large data volume associated with these instruments. The CIDP is an integrated access point between the instruments and the spacecraft. It provides synchronization pulses, relays telecommands, and gathers instrument housekeeping telemetry. It collects science data from the instruments and stores it in mass memory for later playback to a ground station. This paper retrospectively examines the data handling performance realized by the CIDP implementation. It elaborates on some of the constraints on the hardware and software designs and the resulting effects on performance. For the hardware, it discusses the limitations of the front-end electronics input/output (I/O) architecture and the associated mass memory buffering. For the software, it discusses the limitations of the Consultative Committee for Space Data Systems (CCSDS) File Delivery Protocol (CFDP) implementation and the data structure choices for file management. It also describes design changes that improve data handling performance in newer designs.
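
    For background, instrument telemetry of the kind the CIDP relays is framed as CCSDS space packets. The sketch below parses the standard 6-byte primary header defined by the Space Packet Protocol; the sample packet bytes are invented and this is not code from the cited system.

```python
import struct

# Parse a CCSDS space packet primary header: version (3 bits), type (1),
# secondary-header flag (1), APID (11), sequence flags (2), sequence
# count (14), and packet data length (16, = data field octets minus one).
def parse_primary_header(packet: bytes) -> dict:
    w0, seq, length = struct.unpack(">HHH", packet[:6])
    return {
        "version": (w0 >> 13) & 0x7,
        "type": (w0 >> 12) & 0x1,      # 0 = telemetry, 1 = telecommand
        "sec_hdr_flag": (w0 >> 11) & 0x1,
        "apid": w0 & 0x7FF,
        "seq_flags": (seq >> 14) & 0x3,
        "seq_count": seq & 0x3FFF,
        "data_length": length,
    }

# Invented sample: APID 0x123, unsegmented (flags = 3), count 7,
# 4-byte data field (so the length field holds 3).
pkt = struct.pack(">HHH", (0 << 13) | (0 << 12) | (0 << 11) | 0x123,
                  (3 << 14) | 7, 3) + b"\x01\x02\x03\x04"
hdr = parse_primary_header(pkt)
```

    The APID and sequence count are what an onboard data processor uses to route packets to per-instrument files and to detect gaps before playback.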

  18. Missing Data in the Field of Otorhinolaryngology and Head & Neck Surgery: Need for Improvement.

    PubMed

    Netten, Anouk P; Dekker, Friedo W; Rieffe, Carolien; Soede, Wim; Briaire, Jeroen J; Frijns, Johan H M

    Clinical studies often face missing data. Data can be missing for various reasons: for example, patients moved, certain measurements are administered only in high-risk groups, or patients are unable to attend clinic because of their health status. There are various ways to handle these missing data (e.g., complete case analysis, mean substitution). Each of these techniques potentially influences both the analyses and the results of a study. The first aim of this structured review was to analyze how often researchers in the field of otorhinolaryngology/head & neck surgery report missing data. The second aim was to systematically describe how researchers handle missing data in their analyses. The third aim was to provide a solution for dealing with missing data by means of the multiple imputation technique. With this review, we aim to contribute to a higher quality of reporting in otorhinolaryngology research. Clinical studies among the 398 most recently published research articles in three major journals in the field of otorhinolaryngology/head & neck surgery were analyzed based on how researchers reported and handled missing data. Of the 316 clinical studies, 85 reported some form of missing data. Of those 85, only a small number (12 studies, 3.8% of all clinical studies) actively handled the missingness in their data. The majority of researchers exclude incomplete cases, which results in biased outcomes and a drop in statistical power. Within otorhinolaryngology research, missing data are largely ignored and underreported and, consequently, handled inadequately. This has a major impact on the results and conclusions drawn from this research. Based on the outcomes of this review, we provide solutions for dealing with missing data. To illustrate, we clarify the use of multiple imputation techniques, which recently became widely available in standard statistical programs.
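
    The multiple imputation approach recommended above ends with a pooling step (Rubin's rules): the analysis is run once per completed dataset, then the m estimates are averaged and the within- and between-imputation variances are combined. A minimal sketch with invented estimates:

```python
import numpy as np

# Rubin's rules for pooling multiply imputed analyses: given a point
# estimate and its variance from each of m completed datasets, return the
# pooled estimate and its total variance W + (1 + 1/m) * B.
def pool(estimates, variances):
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = estimates.size
    q_bar = estimates.mean()       # pooled point estimate
    w = variances.mean()           # within-imputation variance
    b = estimates.var(ddof=1)      # between-imputation variance
    return q_bar, w + (1 + 1 / m) * b

# Invented results from m = 3 imputed datasets
q, t = pool([1.9, 2.1, 2.0], [0.04, 0.05, 0.045])
```

    The (1 + 1/m) factor inflates the variance to reflect imputation uncertainty, which is exactly what single imputation and mean substitution fail to do.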

  19. Quantifying the effect of editor-author relations on manuscript handling times.

    PubMed

    Sarigöl, Emre; Garcia, David; Scholtes, Ingo; Schweitzer, Frank

    2017-01-01

    In this article we study to what extent the academic peer review process is influenced by social relations between the authors of a manuscript and the editor handling the manuscript. Taking the open access journal PlosOne as a case study, our analysis is based on a data set of more than 100,000 articles published between 2007 and 2015. Using available data on handling editor, submission and acceptance time of manuscripts, we study the question whether co-authorship relations between authors and the handling editor affect the manuscript handling time, i.e. the time taken between the submission and acceptance of a manuscript. Our analysis reveals (1) that editors handle papers co-authored by previous collaborators significantly more often than expected at random, and (2) that such prior co-author relations are significantly related to faster manuscript handling. Addressing the question whether these shorter manuscript handling times can be explained by the quality of publications, we study the number of citations and downloads which accepted papers eventually accumulate. Moreover, we consider the influence of additional (social) factors, such as the editor's experience, the topical similarity between authors and editors, as well as reciprocal citation relations between authors and editors. Our findings show that, even when correcting for other factors like time, experience, and performance, prior co-authorship relations have a large and significant influence on manuscript handling times, speeding up the editorial decision on average by 19 days.

  20. Proposal for a Web Encoding Service (WES) for Spatial Data Transactions

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    Web service utilization in Spatial Data Infrastructure (SDI) is well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available services in the OGC Web Services (OWS) suite and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g., possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.

  1. Statistical primer: how to deal with missing data in scientific research?

    PubMed

    Papageorgiou, Grigorios; Grant, Stuart W; Takkenberg, Johanna J M; Mokhles, Mostafa M

    2018-05-10

    Missing data are a common challenge encountered in research and can compromise the results of statistical inference when not handled appropriately. This paper aims to introduce basic concepts of missing data to a non-statistical audience, to list and compare some of the most popular approaches for handling missing data in practice, and to provide guidelines and recommendations for dealing with and reporting missing data in scientific research. Complete case analysis and single imputation are simple approaches for handling missing data and are popular in practice; however, in most cases they are not guaranteed to provide valid inferences. Multiple imputation is a robust and general alternative that is appropriate for data missing at random and overcomes the disadvantages of the simpler approaches, but it should always be conducted with care. The aforementioned approaches are illustrated and compared in an example application using Cox regression.
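
    The distinction the primer draws between missingness mechanisms can be made concrete with a small simulation on invented data: under MCAR the probability that a value is missing is constant, so complete case analysis is roughly unbiased; under MAR it depends on an observed covariate, so complete case analysis is biased.

```python
import numpy as np

# Invented data: blood pressure rises with age. Outcomes are then deleted
# under the two mechanisms.
rng = np.random.default_rng(2)
n = 10_000
age = rng.uniform(20, 80, n)
bp = 100 + 0.5 * age + rng.normal(0, 5, n)

mcar = rng.random(n) < 0.3              # MCAR: constant 30% probability
mar = rng.random(n) < (age - 20) / 100  # MAR: older patients drop out more

# Complete-case means: under MAR the remaining sample is younger than the
# population, so the mean is biased low; under MCAR it is not.
cc_mcar = bp[~mcar].mean()
cc_mar = bp[~mar].mean()
```

    Multiple imputation handles the MAR case because age, which drives the missingness, is observed and can inform the imputed values.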

  2. GENFAS - Decentralised PUS-Based Data Handling Software Using SOIS and SpaceWire

    NASA Astrophysics Data System (ADS)

    Fowell, Stuart D.; Wheeler, Simon; Mendham, Peter; Gasti, Wahida

    2011-08-01

    This paper describes GenFAS, a decentralised PUS-based onboard data handling software architecture based on the SOIS and SpaceWire communication specifications. GenFAS was initially developed for and deployed on the MARC system under an ESA GSTP contract.

  3. Ruminating on Coursework

    ERIC Educational Resources Information Center

    Povey, Hilary

    2006-01-01

    GCSE (General Certificate of Secondary Education) Handling Data coursework should be a golden opportunity to engage students in meaningful "real-life" mathematics. The marking criteria for GCSE handling data coursework put emphasis on students' ability to plan and carry out a statistical project and to make meaningful analysis of the…

  4. Understanding Skill in EVA Mass Handling. Volume 2; Empirical Investigation

    NASA Technical Reports Server (NTRS)

    Riccio, Gary; McDonald, Vernon; Peters, Brian; Layne, Charles; Bloomberg, Jacob

    1997-01-01

    In this report we describe the details of our empirical protocol for investigating skill in extravehicular mass handling using NASA's principal mass handling simulator, the precision air bearing floor. The contents of this report include a description of the necessary modifications to the mass handling simulator, the choice of task, and the description of an operationally relevant protocol. Our independent variables are presented in the context of the specific operational issues they were designed to simulate. The explanation of our dependent variables focuses on the specific data processing procedures used to transform data from common laboratory instruments into measures that are relevant to a special class of nested control systems (discussed in Volume 1): manual interactions between an individual and the substantial environment. The data reduction is explained in the context of the theoretical foundation described in Volume 1. Finally, as a preface to the presentation of the empirical data in Volume 3 of this report series, a set of detailed hypotheses is presented.

  5. The Effect of Missing Data Handling Methods on Goodness of Fit Indices in Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Köse, Alper

    2014-01-01

    The primary objective of this study was to examine the effect of missing data on goodness-of-fit statistics in confirmatory factor analysis (CFA). For this aim, four missing data handling methods, namely listwise deletion, full information maximum likelihood, regression imputation, and expectation maximization (EM) imputation, were examined in terms of…
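
    Regression imputation, one of the four methods compared, can be sketched in a few lines: a regression is fitted on the complete cases, and each missing value is replaced by its prediction. The data below are invented for illustration.

```python
import numpy as np

# Regression imputation sketch: fit y ~ x on complete cases, then fill
# missing y values with the fitted predictions.
rng = np.random.default_rng(3)
x = rng.normal(size=50)
y = 1.0 + 2.0 * x + rng.normal(0, 0.1, 50)
y[::5] = np.nan                       # every fifth outcome is missing

obs = ~np.isnan(y)
A = np.column_stack([np.ones(obs.sum()), x[obs]])
(b0, b1), *_ = np.linalg.lstsq(A, y[obs], rcond=None)

y_imp = np.where(obs, y, b0 + b1 * x)  # predictions fill the gaps
```

    A known drawback, relevant to fit statistics, is that imputed values lie exactly on the regression line, which understates the variance of the completed variable; EM and full information maximum likelihood avoid this.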

  6. LACIE data-handling techniques

    NASA Technical Reports Server (NTRS)

    Waits, G. H. (Principal Investigator)

    1979-01-01

    Techniques implemented to facilitate the processing of LANDSAT multispectral data between 1975 and 1978 are described. The data handled during the Large Area Crop Inventory Experiment (LACIE) and the storage mechanisms used for the various types of data are defined. The overall data flow, from the placing of the LANDSAT orders through the actual analysis of the data set, is discussed. An overview is provided of the status and tracking system that was developed and of the database maintenance and operational tasks. The archiving of the LACIE data is explained.

  7. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs under the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data; (2) a demand for new data to supplement existing benchmark criticals data; (3) increased reliance on (or need for) computational benchmarks that supplement (or replace) experimental measurements in critical assemblies; and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  8. 40 CFR 75.10 - General operating requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... continuous emission monitoring system and a flow monitoring system with an automated data acquisition and handling system for measuring and recording SO2 concentration (in ppm), volumetric gas flow (in scfh), and... emission monitoring system and a flow monitoring system with an automated data acquisition and handling...

  9. Care and Handling of Computer Magnetic Storage Media.

    ERIC Educational Resources Information Center

    Geller, Sidney B.

    Intended for use by data processing installation managers, operating personnel, and technical staff, this publication provides a comprehensive set of care and handling guidelines for the physical/chemical preservation of computer magnetic storage media--principally computer magnetic tapes--and their stored data. Emphasis is placed on media…

  10. Introducing Python tools for magnetotellurics: MTpy

    NASA Astrophysics Data System (ADS)

    Krieger, L.; Peacock, J.; Inverarity, K.; Thiel, S.; Robertson, K.

    2013-12-01

    Within the framework of geophysical exploration techniques, the magnetotelluric (MT) method is relatively immature: it is still not as widespread as other geophysical methods like seismology, and its processing schemes and data formats are not thoroughly standardized. As a result, the file handling and processing software within the academic community is mainly based on a loose collection of codes, which are sometimes highly adapted to the respective local specifications. Although tools for the estimation of the frequency-dependent MT transfer function, as well as inversion and modelling codes, are available, the standards and software for handling MT data are generally not unified throughout the community. To overcome problems that arise from missing standards, and to simplify the general handling of MT data, we have developed the software package "MTpy", which allows the handling, processing, and imaging of magnetotelluric data sets. It is written in Python and the code is open source. The setup of this package follows the modular approach of successful software packages like GMT or ObsPy. It contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides pure Python classes and functions, MTpy provides wrappers and convenience scripts to call external software, e.g. modelling and inversion codes. Even though still under development, MTpy already contains ca. 250 functions that work on raw and preprocessed data. However, as our aim is not to produce a static collection of software, we rather introduce MTpy as a flexible framework, which will be dynamically extended in the future. It then has the potential to help standardise processing procedures and at the same time be a versatile supplement for existing algorithms.
We introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing utilising MTpy on an example data set collected over a geothermal exploration site in South Australia. Figure caption: Workflow of MT data processing. Within the structural diagram, the MTpy sub-packages are shown in red (time series data processing), green (handling of EDI files and impedance tensor data), yellow (connection to modelling/inversion algorithms), black (impedance tensor interpretation, e.g. by Phase Tensor calculations), and blue (generation of visual representations, e.g. pseudo sections or resistivity models).
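MTpy's public API is not reproduced in this record, but the core transfer-function arithmetic such a package automates can be sketched in plain Python. The following is a minimal illustration, assuming impedance elements in the customary MT field units (mV/km/nT), for which apparent resistivity reduces to rho_a = 0.2 |Z|^2 / f; the `zxy` dictionary is hypothetical example data, not MTpy output.

```python
import cmath
import math

def apparent_resistivity(z_field_units: complex, freq_hz: float) -> float:
    """Apparent resistivity (ohm-m) from one impedance tensor element.

    Assumes Z is given in MT field units (mV/km/nT), for which the
    standard conversion reduces to rho_a = 0.2 * |Z|**2 / f.
    """
    return 0.2 * abs(z_field_units) ** 2 / freq_hz

def impedance_phase_deg(z: complex) -> float:
    """Impedance phase in degrees."""
    return math.degrees(cmath.phase(z))

# Hypothetical per-frequency Zxy estimates, as might be read from an
# EDI transfer-function file: {frequency in Hz: complex impedance}.
zxy = {100.0: complex(5.0, 5.0), 10.0: complex(2.0, 1.0)}

# Sounding curve: frequency -> (apparent resistivity, phase).
curve = {f: (apparent_resistivity(z, f), impedance_phase_deg(z))
         for f, z in zxy.items()}
```

A real workflow would loop this over all four tensor components and feed the resulting curves to plotting or inversion modules.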

  11. IEEE 1393 Spaceborne Fiber Optic Data Bus: A Standard Approach to On-Board Payload Data Handling Networks for the AIAA Space Technology Conference and Exposition "Partnering in the 21st Century"

    NASA Technical Reports Server (NTRS)

    Andrucyk, Dennis J.; Orlando, Fred J.; Chalfant, Charles H.

    1999-01-01

    The Spaceborne Fiber Optic Data Bus (SFODB) is the next generation in on-board data handling networks. It will do for high speed payloads what SAE 1773 has done for on-board command and telemetry systems. That is, it will significantly reduce the cost of payload development, integration, and test through interface standardization. As defined in IEEE 1393, SFODB is a 1 Gb/s, fiber optic network specifically designed to support the real-time, on-board data handling requirements of remote sensing spacecraft. The network is highly reliable, fault tolerant, and capable of withstanding the rigors of launch and the harsh space environment. SFODB achieves this operational and environmental performance while maintaining the small size, light weight, and low power necessary for spaceborne applications. SFODB was developed jointly by DoD and NASA GSFC to meet the on-board data handling needs of remote sensing satellites. This jointly funded project produced a complete set of flight transmitters, receivers, and protocol ASICs; a complete Development & Evaluation System; and the IEEE 1393 standard.

  12. Provision of Information to the Research Staff.

    ERIC Educational Resources Information Center

    Williams, Martha E.

    The Information Sciences section at Illinois Institute of Technology Research Institute (IITRI) is now operating a Computer Search Center (CSC) for handling numerous machine-readable data bases. The computer programs are generalized in the sense that they will handle any incoming data base. This is accomplished by means of a preprocessor system…

  13. 40 CFR 65.161 - Continuous records and monitoring system data handling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Routing to a Fuel Gas System or a Process § 65.161 Continuous records and monitoring system data handling...) Monitoring system breakdowns, repairs, preventive maintenance, calibration checks, and zero (low-level) and... section unless an alternative monitoring or recordkeeping system has been requested and approved under...

  14. 40 CFR 65.161 - Continuous records and monitoring system data handling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Routing to a Fuel Gas System or a Process § 65.161 Continuous records and monitoring system data handling...) Monitoring system breakdowns, repairs, preventive maintenance, calibration checks, and zero (low-level) and... section unless an alternative monitoring or recordkeeping system has been requested and approved under...

  15. An EEG Data Investigation Using Only Artifacts

    DTIC Science & Technology

    2017-02-22

    A mediation approach, called artifact separation, was developed to enable the consumer of the EEG data to decide how to handle artifacts. ... Having the spectral results flagged as containing an artifact means that the consumer of the data has the freedom to decide how to...

  16. Study design and data analysis considerations for the discovery of prognostic molecular biomarkers: a case study of progression free survival in advanced serous ovarian cancer.

    PubMed

    Qin, Li-Xuan; Levine, Douglas A

    2016-06-10

    Accurate discovery of molecular biomarkers that are prognostic of a clinical outcome is an important yet challenging task, partly due to the combination of the typically weak genomic signal for a clinical outcome and the frequently strong noise due to microarray handling effects. Effective strategies to resolve this challenge are in dire need. We set out to assess the use of careful study design and data normalization for the discovery of prognostic molecular biomarkers. Taking progression free survival in advanced serous ovarian cancer as an example, we conducted empirical analysis on two sets of microRNA arrays for the same set of tumor samples: arrays in one set were collected using careful study design (that is, uniform handling and randomized array-to-sample assignment) and arrays in the other set were not. We found that (1) handling effects can confound the clinical outcome under study as a result of chance even with randomization, (2) the level of confounding handling effects can be reduced by data normalization, and (3) good study design cannot be replaced by post-hoc normalization. In addition, we provided a practical approach to define positive and negative control markers for detecting handling effects and assessing the performance of a normalization method. Our work showcased the difficulty of finding prognostic biomarkers for a clinical outcome of weak genomic signals, illustrated the benefits of careful study design and data normalization, and provided a practical approach to identify handling effects and select a beneficial normalization method. Our work calls for careful study design and data analysis for the discovery of robust and translatable molecular biomarkers.

  17. The Effect of Gentle Handling on Depressive-Like Behavior in Adult Male Mice: Considerations for Human and Rodent Interactions in the Laboratory

    PubMed Central

    Lane, Christina; Torres, Julio; Flinn, Jane

    2018-01-01

    Environmental factors play a significant role in well-being of laboratory animals. Regulations and guidelines recommend, if not require, that stressors such as bright lighting, smells, and noises are eliminated or reduced to maximize animal well-being. A factor that is often overlooked is handling and how researchers interact with their animals. Researchers, lab assistants, and husbandry staff in animal facilities may use inconsistent handling methods when interacting with rodents, but humans should be considered a part of the animal's social environment. This study examined the effects of different handling techniques on depressive-like behavior, measured by the Porsolt forced swim test, in adult C57BL/6J male mice. The same two researchers handled the mice in a gentle, aggressive, or minimal (control) fashion over approximately two weeks prior to testing. The results demonstrated a beneficial effect of gentle handling: gentle handling reduced swimming immobility in the forced swim test compared to mice that were aggressively or minimally handled. We argue that gentle handling, rather than methodical handling, can foster a better relationship between the handlers and rodents. Although handling is not standardized across labs, consistent gentle handling allows for less challenging behavioral testing, better data collection, and overall improved animal welfare. PMID:29692869

  18. The Effect of Gentle Handling on Depressive-Like Behavior in Adult Male Mice: Considerations for Human and Rodent Interactions in the Laboratory.

    PubMed

    Neely, Caroline; Lane, Christina; Torres, Julio; Flinn, Jane

    2018-01-01

    Environmental factors play a significant role in well-being of laboratory animals. Regulations and guidelines recommend, if not require, that stressors such as bright lighting, smells, and noises are eliminated or reduced to maximize animal well-being. A factor that is often overlooked is handling and how researchers interact with their animals. Researchers, lab assistants, and husbandry staff in animal facilities may use inconsistent handling methods when interacting with rodents, but humans should be considered a part of the animal's social environment. This study examined the effects of different handling techniques on depressive-like behavior, measured by the Porsolt forced swim test, in adult C57BL/6J male mice. The same two researchers handled the mice in a gentle, aggressive, or minimal (control) fashion over approximately two weeks prior to testing. The results demonstrated a beneficial effect of gentle handling: gentle handling reduced swimming immobility in the forced swim test compared to mice that were aggressively or minimally handled. We argue that gentle handling, rather than methodical handling, can foster a better relationship between the handlers and rodents. Although handling is not standardized across labs, consistent gentle handling allows for less challenging behavioral testing, better data collection, and overall improved animal welfare.

  19. Laboratory Activity on Sample Handling and Maintaining a Laboratory Notebook through Simple pH Measurements

    ERIC Educational Resources Information Center

    Erdmann, Mitzy A.; March, Joe L.

    2016-01-01

    Sample handling and laboratory notebook maintenance are necessary skills but can seem abstract if not presented to students in context. An introductory exercise focusing on proper sample handling, data collection and laboratory notebook keeping for the general chemistry laboratory was developed to emphasize the importance of keeping an accurate…

  20. Data handling and analysis for the 1971 corn blight watch experiment.

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.; Phillips, T. L.; Landgrebe, D. A.

    1972-01-01

    Review of the data handling and analysis methods used in the near-operational test of remote sensing systems provided by the 1971 corn blight watch experiment. The general data analysis techniques and, particularly, the statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data are described. Some of the results obtained are examined, and the implications of the experiment for future data communication requirements of earth resource survey systems are discussed.

  1. Handling Practices of Fresh Leafy Greens in Restaurants: Receiving and Training†

    PubMed Central

    COLEMAN, ERIK; DELEA, KRISTIN; EVERSTINE, KAREN; REIMANN, DAVID; RIPLEY, DANNY

    2015-01-01

    Multiple foodborne illness outbreaks have been associated with the consumption of fresh produce. Investigations have indicated that microbial contamination throughout the farm-to-fork continuum often contributed to these outbreaks. Researchers have hypothesized that handling practices for leafy greens in restaurants may support contamination by and proliferation and amplification of pathogens that cause foodborne illness outbreaks. However, limited data are available on how workers handle leafy greens in restaurants. The purpose of this study was to collect descriptive data on handling practices of leafy greens in restaurants, including restaurant characteristics, types of leafy greens used, produce receipt, and food safety training and certification. As a federal collaborative partner with the Environmental Health Specialists Network (EHS-Net) of the Centers for Disease Control and Prevention, the U.S. Food and Drug Administration (FDA) recommended that EHS-Net participants survey handling practices for leafy greens in restaurants. The recommendations in the FDA’s Guide to Minimize Microbial Food Safety Hazards of Leafy Greens are significant to this study for comparison of the results. The survey revealed that appropriate handling procedures assist in the mitigation of other unsafe handling practices for leafy greens. These results are significant because the FDA guidance for the safe handling of leafy greens was not available until 2009, after the survey had been completed. The information provided from this study can be used to promote additional efforts that will assist in developing interventions to prevent future foodborne illness outbreaks associated with leafy greens. PMID:24290691

  2. CFDP Evolutions and File Based Operations

    NASA Astrophysics Data System (ADS)

    Valverde, Alberto; Taylor, Chris; Magistrati, Giorgio; Maiorano, Elena; Colombo, Cyril; Haddow, Colin

    2015-09-01

    The complexity of the scientific ESA missions in terms of data handling requirements has been steadily increasing in recent years. The availability of high speed telemetry links to ground, the increase in on-board data storage capacity, and the processing performance of the spacecraft avionics have enabled this process. Nowadays, it is common to find missions with hundreds of gigabytes of daily on-board generated data, with terabytes of on-board mass memory, and with downlinks of several hundred megabits per second. These technological trends push an upgrade of the spacecraft data handling and operations concept: smarter solutions are needed to sustain such high data rates and volumes while improving on-board autonomy and easing operations. This paper describes the different activities carried out to adapt to the new data handling scenario. It contains an analysis of the proposed operations concept for file-based spacecraft, including the updates on the PUS and CFDP standards.

  3. An application of the Multi-Purpose System Simulation /MPSS/ model to the Monitor and Control Display System /MACDS/ at the National Aeronautics and Space Administration /NASA/ Goddard Space Flight Center /GSFC/

    NASA Technical Reports Server (NTRS)

    Mill, F. W.; Krebs, G. N.; Strauss, E. S.

    1976-01-01

    The Multi-Purpose System Simulator (MPSS) model was used to investigate whether the current and projected configurations of the Monitor and Control Display System (MACDS) at the Goddard Space Flight Center could process and display launch data adequately. MACDS consists of two interconnected mini-computers with associated terminal input and display output equipment and a disk-stored data base. Three configurations of MACDS were evaluated via MPSS and their performances ascertained. First, the current version of MACDS was found inadequate to handle projected launch data loads because of unacceptable data backlogging. Second, the current MACDS hardware with enhanced software was capable of handling two times the anticipated data loads. Third, an upgraded hardware ensemble combined with the enhanced software was capable of handling four times the anticipated data loads.

  4. A new compact and low cost Langmuir Probe and associated onboard data handling system for CubeSat

    NASA Astrophysics Data System (ADS)

    Muralikrishna, Polinaya; Domingos, Sinval; Paredes, Andres; Abrahão Dos Santos, Walter

    2016-07-01

    A new compact and low cost Langmuir Probe and associated onboard data handling system are being developed at Instituto Nacional de Pesquisas Espaciais for launching on board one of the future 2U CubeSat missions. The system is a simplified and compacted version of the Langmuir Probe payloads launched on board several Brazilian SONDA III rockets and also developed for the Brazilian scientific satellites SACI-1 and SACI-2. The onboard data handling system will have the dual functions of preprocessing the data collected by the Langmuir Probe and acting as the interface between the experiment and the on-board computer. The Langmuir Probe sensor, in the form of two rectangular stainless steel strips with a total surface area of approximately 80 cm², will be deployed soon after the injection of the CubeSat into orbit. A sweep voltage varying linearly from 0 V to 3.0 V in about 1.5 seconds and then remaining fixed at 3.0 V for 1 second will be applied to the LP sensor to obtain both the electron density and electron temperature. A high sensitivity preamplifier will be used to convert the sensor current, expected to be in the range of a few nanoamperes to a few microamperes, into a varying potential. In order to cover the large dynamic range of the expected sensor current, the preamplifier output will be further amplified by a logarithmic amplifier before being sampled and sent to the data handling system. The data handling system is designed to handle 8 analog channels and 4 digital words of 8 bits each. The incoming data will be stored in a RAM and later sent to the on-board computer using a serial RS-422 communication protocol. The interface unit will process the telecommands received from the on-board computer. The interface is also designed to perform FFT analysis of the LP sensor data and send the averaged FFT spectral amplitudes in place of the original unprocessed data. The system details are presented here.
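The logarithmic compression and sweep timing described above can be sketched in a few lines. This is a minimal model, not flight code: the 1 nA reference current and 0.5 V-per-decade gain are assumed values chosen to map the stated nanoampere-to-microampere range into a small voltage span; only the 1.5 s ramp plus 1 s hold comes from the abstract.

```python
import math

I_REF = 1e-9           # assumed reference current: 1 nA (lower end of range)
K_V_PER_DECADE = 0.5   # assumed log-amp gain: 0.5 V per decade of current

def logamp_encode(current_a: float) -> float:
    """Model of the logarithmic amplifier: sensor current -> output voltage."""
    return K_V_PER_DECADE * math.log10(current_a / I_REF)

def logamp_decode(volts: float) -> float:
    """Ground processing: recover sensor current from the telemetered voltage."""
    return I_REF * 10 ** (volts / K_V_PER_DECADE)

def sweep_voltage(t_s: float) -> float:
    """Sweep waveform: 0 V -> 3 V linear ramp over 1.5 s, then held at
    3 V for 1 s, repeating every 2.5 s (timing as stated in the abstract)."""
    t = t_s % 2.5
    return 3.0 * t / 1.5 if t < 1.5 else 3.0
```

With these assumed constants, the full 1 nA to 10 µA range compresses into 0 V to 2 V, comfortably inside a typical ADC input span.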

  5. Converting Constant Volume, Multizone Air Handling Systems to Energy Efficient Variable Air Volume Multizone Systems

    DTIC Science & Technology

    2017-10-26

    Final report: Converting Constant Volume, Multizone Air Handling Systems to Energy Efficient Variable Air Volume Multizone Systems. Figure 30: Energy Information Agency natural gas price data for the residential, commercial, and industrial market sectors.

  6. On-line computer system for use with low-energy nuclear physics experiments is reported

    NASA Technical Reports Server (NTRS)

    Gemmell, D. S.

    1969-01-01

    Computer program handles data from low-energy nuclear physics experiments which utilize the ND-160 pulse-height analyzer and the PHYLIS computing system. The program allows experimenters to choose from about 50 different basic data-handling functions and to prescribe the order in which these functions will be performed.

  7. 77 FR 51930 - Approval and Promulgation of Air Quality Implementation Plans; Pennsylvania; Attainment Plan for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ... for the Philadelphia Area. While the monitoring data that show the Philadelphia Area attained the 1997... sources (such as cargo handling equipment) at ports. Activity data for land-based sources collected from... emission estimates. EPA also verified that land-based sources for cargo handling equipment, such as...

  8. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.

  9. A hybrid group method of data handling with discrete wavelet transform for GDP forecasting

    NASA Astrophysics Data System (ADS)

    Isa, Nadira Mohamed; Shabri, Ani

    2013-09-01

    This study proposes the application of a hybrid model combining the Group Method of Data Handling (GMDH) and the Discrete Wavelet Transform (DWT) in time series forecasting. The objective of this paper is to examine the flexibility of the hybrid GMDH in time series forecasting using Gross Domestic Product (GDP). A time series data set is used in this study to demonstrate the effectiveness of the forecasting model. These data are used to forecast through an application aimed at handling real-life time series. This experiment compares the performance of the hybrid model with that of single models: Wavelet-Linear Regression (WR), Artificial Neural Network (ANN), and conventional GMDH. It is shown that the proposed model can provide a promising alternative technique in GDP forecasting.
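The wavelet stage of such a hybrid can be illustrated with a one-level Haar DWT: the series is split into approximation and detail sub-bands, a forecaster (GMDH in the paper) is trained on each sub-band separately, and forecasts are recombined via the inverse transform. The sketch below shows only the transform pair; the GDP figures are invented example values, and Haar is chosen as the simplest wavelet, not necessarily the one used by the authors.

```python
def haar_dwt(x):
    """One-level Haar DWT: returns (approximation, detail) coefficients.

    Assumes len(x) is even; each coefficient pair summarizes two
    consecutive samples (scaled pairwise sum and difference).
    """
    a = [(x[2 * i] + x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar DWT: perfectly reconstructs the input."""
    x = []
    for ai, di in zip(a, d):
        x.append((ai + di) / 2 ** 0.5)
        x.append((ai - di) / 2 ** 0.5)
    return x

# Invented quarterly GDP-like series (index values).
gdp = [100.0, 102.0, 101.0, 105.0, 107.0, 110.0, 108.0, 113.0]
approx, detail = haar_dwt(gdp)
recon = haar_idwt(approx, detail)
```

In the hybrid scheme, `approx` carries the smooth trend and `detail` the short-term fluctuations, so each sub-band presents the forecaster with a simpler signal than the raw series.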

  10. Medical-Information-Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, Sidney; Friedman, Carl A.; Frankowski, James W.

    1989-01-01

    Medical Information Management System (MIMS) computer program interactive, general-purpose software system for storage and retrieval of information. Offers immediate assistance where manipulation of large data bases is required. User quickly and efficiently extracts, displays, and analyzes data. Used in management of medical data and handling all aspects of data related to care of patients. Other applications include management of data on occupational safety in public and private sectors, handling judicial information, systemizing purchasing and procurement systems, and analyses of cost structures of organizations. Written in Microsoft FORTRAN 77.

  11. Landsat 7 Science Data Processing: An Overview

    NASA Technical Reports Server (NTRS)

    Schweiss, Robert J.; Daniel, Nathaniel E.; Derrick, Deborah K.

    2000-01-01

    The Landsat 7 Science Data Processing System, developed by NASA for the Landsat 7 Project, provides the science data handling infrastructure used at the Earth Resources Observation Systems (EROS) Data Center (EDC) Landsat Data Handling Facility (DHF) of the United States Department of Interior, United States Geological Survey (USGS) located in Sioux Falls, South Dakota. This paper presents an overview of the Landsat 7 Science Data Processing System and details of the design, architecture, concept of operation, and management aspects of systems used in the processing of the Landsat 7 Science Data.

  12. Alternative Methods for Handling Attrition

    PubMed Central

    Foster, E. Michael; Fang, Grace Y.

    2009-01-01

    Using data from the evaluation of the Fast Track intervention, this article illustrates three methods for handling attrition. Multiple imputation and ignorable maximum likelihood estimation produce estimates that are similar to those based on listwise-deleted data. A panel selection model that allows for selective dropout reveals that highly aggressive boys accumulate in the treatment group over time and produces a larger estimate of treatment effect. In contrast, this model produces a smaller treatment effect for girls. The article's conclusion discusses the strengths and weaknesses of the alternative approaches and outlines ways in which researchers might improve their handling of attrition. PMID:15358906

  13. Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane

    NASA Technical Reports Server (NTRS)

    Gera, Joseph; Bosworth, John T.

    1987-01-01

    Novel flight test and analysis techniques in the flight dynamics and handling qualities area are described. These techniques were utilized at NASA Ames-Dryden during the initial flight envelope clearance of the X-29A aircraft. It is shown that the open-loop frequency response of an aircraft with highly relaxed static stability can be successfully computed on the ground from telemetry data. Postflight closed-loop frequency response data were obtained from pilot-generated frequency sweeps and it is found that the current handling quality requirements for high-maneuverability aircraft are generally applicable to the X-29A.

  14. Achieving accurate compound concentration in cell-based screening: validation of acoustic droplet ejection technology.

    PubMed

    Grant, Richard John; Roberts, Karen; Pointon, Carly; Hodgson, Clare; Womersley, Lynsey; Jones, Darren Craig; Tang, Eric

    2009-06-01

    Compound handling is a fundamental and critical step in compound screening throughout the drug discovery process. Although most compound-handling processes within compound management facilities use 100% DMSO solvent, conventional methods of manual or robotic liquid-handling systems in screening workflows often perform dilutions in aqueous solutions to maintain solvent tolerance of the biological assay. However, the use of aqueous media in these applications can lead to suboptimal data quality due to compound carryover or precipitation during the dilution steps. In cell-based assays, this effect is worsened by the unpredictable physical characteristics of compounds and the low DMSO tolerance within the assay. In some cases, the conventional approaches using manual or automated liquid handling resulted in variable IC50 dose responses. This study examines the cause of this variability and evaluates the accuracy of screening data in these case studies. A number of liquid-handling options have been explored to address the issues and establish a generic compound-handling workflow to support cell-based screening across our screening functions. The authors discuss the validation of the Labcyte Echo reformatter as an effective noncontact solution for generic compound-handling applications against diverse compound classes using triple-quad liquid chromatography/mass spectrometry. The successful validation and implementation challenges of this technology for direct dosing onto cells in cell-based screening is discussed.

  15. A Bit-Encoding Based New Data Structure for Time and Memory Efficient Handling of Spike Times in an Electrophysiological Setup.

    PubMed

    Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin

    2018-04-01

    Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating these developments we here set out to construct a scalable software architecture that would enable near-future massive parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide, in real time (< 25 ms), feedback based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it also has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the bit-encoding proposed has the additional advantage of allowing an extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future Brain Machine Interface requirements.
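The paper's specific encoding is not given in this record, but the general idea of bit-encoding spike data can be sketched with Python's arbitrary-precision integers: one bit per unit per time bin, so that population overlap queries reduce to a bitwise AND plus a popcount. Class and method names below are illustrative, not from the paper.

```python
class SpikeBitFrame:
    """One time bin of population spike data: bit i is set iff unit i fired."""

    def __init__(self, n_units: int):
        self.n_units = n_units
        self.bits = 0  # arbitrary-precision integer used as a bit field

    def set_spike(self, unit: int) -> None:
        """Record that the given unit fired in this time bin."""
        if not 0 <= unit < self.n_units:
            raise IndexError(unit)
        self.bits |= 1 << unit

    def fired(self, unit: int) -> bool:
        """Did the given unit fire in this time bin?"""
        return bool((self.bits >> unit) & 1)

    def coactive(self, other: "SpikeBitFrame") -> int:
        """Number of units that fired in both frames (AND + popcount)."""
        return bin(self.bits & other.bits).count("1")

# One million units per frame, matching the scale quoted in the abstract.
f1 = SpikeBitFrame(1_000_000)
f2 = SpikeBitFrame(1_000_000)
f1.set_spike(3); f1.set_spike(999_999)
f2.set_spike(3); f2.set_spike(10)
```

Because a whole frame is a single integer, spatiotemporal pattern matching across frames becomes a sequence of machine-level bitwise operations rather than per-unit loops, which is the speed advantage the abstract alludes to.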

  16. DOE's Remote-Handled TRU Waste Characterization Program: Implementation Plan

    EPA Pesticide Factsheets

    Remote-handled (RH) transuranic (TRU) waste characterization, which involves obtaining chemical, radiological, and physical data, is a primary component of ensuring compliance of the Waste Isolation Pilot Plant (WIPP) with regulatory requirements.

  17. [Nursing workers' perceptions regarding the handling of hazardous chemical waste].

    PubMed

    Costa, Taiza Florêncio; Felli, Vanda Elisa Andres; Baptista, Patrícia Campos Pavan

    2012-12-01

    The objectives of this study were to identify the perceptions of nursing workers regarding the handling of hazardous chemical waste at the University of São Paulo University Hospital (HU-USP), and develop a proposal to improve safety measures. This study used a qualitative approach and a convenience sample consisting of eighteen nursing workers. Data collection was performed through focal groups. Thematic analysis revealed four categories that gave evidence of training deficiencies in terms of the stages of handling waste. Difficulties that emerged included a lack of knowledge regarding exposure and its impact, the utilization of personal protective equipment versus collective protection, and suggestions regarding measures to be taken by the institution and workers for the safe handling of hazardous chemical waste. The present data allowed for recommending proposals regarding the safe management of hazardous chemical waste by the nursing staff.

  18. A Software Suite for Testing SpaceWire Devices and Networks

    NASA Astrophysics Data System (ADS)

    Mills, Stuart; Parkes, Steve

    2015-09-01

    SpaceWire is a data-handling network for use on-board spacecraft, which connects together instruments, mass-memory, processors, downlink telemetry, and other on-board sub-systems. SpaceWire is simple to implement and has some specific characteristics that help it support data-handling applications in space: high-speed, low-power, simplicity, relatively low implementation cost, and architectural flexibility making it ideal for many space missions. SpaceWire provides high-speed (2 Mbits/s to 200 Mbits/s), bi-directional, full-duplex data-links, which connect together SpaceWire enabled equipment. Data-handling networks can be built to suit particular applications using point-to-point data-links and routing switches. STAR-Dundee’s STAR-System software stack has been designed to meet the needs of engineers designing and developing SpaceWire networks and devices. This paper describes the aims of the software and how those needs were met.

  19. Substructure analysis techniques and automation. [to eliminate logistical data handling and generation chores

    NASA Technical Reports Server (NTRS)

    Hennrich, C. W.; Konrath, E. J., Jr.

    1973-01-01

    A basic automated substructure analysis capability for NASTRAN is presented which eliminates most of the logistical data handling and generation chores that are currently associated with the method. Rigid formats are proposed which will accomplish this using three new modules, all of which can be added to level 16 with a relatively small effort.

  20. Early Childhood Inservice and Preservice Teachers' Perceived Levels of Preparedness to Handle Stress in Their Students

    ERIC Educational Resources Information Center

    Onchwari, Jacqueline

    2010-01-01

    This article reports a study that investigated preservice and inservice early childhood teachers' perceived levels of preparedness to handle stress in early childhood and elementary education students. A survey that included vignettes was used to collect data. Data were analyzed both qualitatively and statistically, using one-way ANOVA, "t"-test,…

  1. A Primer for Handling Missing Values in the Analysis of Education and Training Data

    ERIC Educational Resources Information Center

    Gemici, Sinan; Bednarz, Alice; Lim, Patrick

    2012-01-01

    Quantitative research in vocational education and training (VET) is routinely affected by missing or incomplete information. However, the handling of missing data in published VET research is often sub-optimal, leading to a real risk of generating results that can range from being slightly biased to being plain wrong. Given that the growing…

  2. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--NHEXAS FILTER HANDLING, WEIGHING AND ARCHIVING PROCEDURES FOR AEROSOL SAMPLES (RTI/ACS-AP-209-011)

    EPA Science Inventory

    This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...

  3. The Sample Handling System for the Mars Icebreaker Life Mission: from Dirt to Data

    NASA Technical Reports Server (NTRS)

    Dave, Arwen; Thompson, Sarah J.; McKay, Christopher P.; Stoker, Carol R.; Zacny, Kris; Paulsen, Gale; Mellerowicz, Bolek; Glass, Brian J.; Wilson, David; Bonaccorsi, Rosalba; hide

    2013-01-01

    The Mars Icebreaker Life mission will search for subsurface life on Mars. It consists of three payload elements: a drill to retrieve soil samples from approximately 1 meter below the surface, a robotic sample handling system to deliver the sample from the drill to the instruments, and the instruments themselves. This paper will discuss the robotic sample handling system.

  4. Some propulsion system noise data handling conventions and computer programs used at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Montegani, F. J.

    1974-01-01

    Methods of handling one-third-octave band noise data originating from the outdoor full-scale fan noise facility and the engine acoustic facility at the Lewis Research Center are presented. Procedures for standardizing, retrieving, extrapolating, and reporting these data are explained. Computer programs are given which are used to accomplish these and other noise data analysis tasks. This information is useful as background for interpretation of data from these facilities appearing in NASA reports and can aid data exchange by promoting standardization.
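The extrapolation mentioned above typically involves distance corrections. As a minimal illustration of the idea (ours, not the Lewis Research Center programs), lossless spherical spreading adjusts a measured sound pressure level by 20·log10 of the distance ratio:

```python
import math

def extrapolate_spl(spl_ref, r_ref, r_new):
    """Extrapolate a sound pressure level (dB) measured at distance r_ref
    to distance r_new, assuming lossless spherical spreading
    (inverse-square law); atmospheric absorption is ignored here."""
    return spl_ref - 20.0 * math.log10(r_new / r_ref)

# Doubling the distance lowers the level by about 6 dB.
print(round(extrapolate_spl(100.0, 30.0, 60.0), 2))  # 93.98
```

Real facility data reduction would also account for atmospheric absorption and ground effects; this sketch shows only the geometric term.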

  5. Maternal handling during pregnancy reduces DMBA-induced mammary tumorigenesis among female offspring.

    PubMed Central

    Hilakivi-Clarke, L.

    1997-01-01

    The present study investigated whether handling of pregnant rats would affect mammary tumorigenesis in their female offspring. Pregnant Sprague-Dawley rats were injected daily with 0.05 ml of vehicle between days 14 and 20 of gestation or were left undisturbed. Handling did not have any effects on pregnancy or early development of the offspring. The female offspring were administered 10 mg of 7,12-dimethylbenz(a)anthracene (DMBA) at the age of 55 days. The rats whose mothers were handled during pregnancy had a significantly reduced mammary tumour incidence compared with the offspring of non-handled mothers: on week 18 after DMBA exposure, 15% of the handled offspring had developed mammary tumours, whereas 44% of the non-handled offspring had tumours. No significant differences in the latency to tumour appearance, the size of the tumours or their growth rates were noted. Daily handling between post-natal days 5 and 20 produced results similar to those obtained with prenatal handling; on week 18 after DMBA exposure, the mammary tumour incidence was 22% among the post-natally handled rats and 44% among the non-handled rats. Possible deviations in hormonal parameters were also studied in adult female rats exposed in utero to handling. The onset of puberty tended to occur later among the handled offspring, but no differences in uterine wet weights or serum oestradiol levels between the groups were noted. In conclusion, maternal handling reduced the offspring's risk of developing mammary tumours, and this effect was independent of the oestrogenic environment in adulthood. We propose that handling of a pregnant rat reduces mammary tumorigenesis in her offspring by changing the morphology of the mammary gland, the pattern of expression of specific genes and/or immune functions. PMID:9231913

  6. Experience of Data Handling with IPPM Payload

    NASA Astrophysics Data System (ADS)

    Errico, Walter; Tosi, Pietro; Ilstad, Jorgen; Jameux, David; Viviani, Riccardo; Collantoni, Daniele

    2010-08-01

    A simplified On-Board Data Handling system has been developed by CAEN AURELIA SPACE and ABSTRAQT as a PUS-over-SpaceWire demonstration platform for the Onboard Payload Data Processing laboratory at ESTEC. The system is composed of three Leon2-based IPPM (Integrated Payload Processing Module) computers that play the roles of Instrument, Payload Data Handling Unit and Satellite Management Unit. Two PCs complete the test set-up, simulating an external Memory Management Unit and the Ground Control Unit. Communication among units takes place primarily through SpaceWire links; the RMAP[2] protocol is used for configuration and housekeeping. A limited implementation of the ECSS-E-70-41B Packet Utilisation Standard (PUS)[1] over CANbus and MIL-STD-1553B has also been realized. The open-source RTEMS real-time operating system runs on the IPPM's AT697E CPU.

  7. Family caregivers' experience of activities of daily living handling in older adult with stroke: a qualitative research in the Iranian context.

    PubMed

    Hesamzadeh, Ali; Dalvandi, Asghar; Bagher Maddah, Sadat; Fallahi Khoshknab, Masoud; Ahmadi, Fazlollah; Mosavi Arfa, Nazila

    2017-09-01

    Patients with stroke require additional support from family to live independently in terms of activities of daily living (ADL). Family members are usually the main caregivers of stroke patients, yet a comprehensive account of ADL handling from the family caregivers' perspective is lacking. This study explores and describes family caregivers' experiences of the strategies used to handle the ADL dependency of elderly patients with stroke in the Iranian context. Nineteen family caregivers were recruited from multiple physiotherapy clinics in Sari, Iran, between September 2013 and May 2014. Data were generated through in-depth interviews, and qualitative content analysis was used to analyse the data and determine themes. The findings show that family caregivers manage the ADL dependency of their elderly stroke patients through seven strategies: encouraging physical movement, providing personal hygiene, attending to nutrition, facilitating religious activities, filling leisure time, facilitating transfers and assisting with financial issues. Family has an important role in handling elderly stroke patients' ADL dependency. Health practitioners can draw on these findings to help stroke families play a more active role in handling their patients' ADL dependency after stroke. © 2016 Nordic College of Caring Science.

  8. Systems thinking applied to safety during manual handling tasks in the transport and storage industry.

    PubMed

    Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter

    2014-07-01

    Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Development of longitudinal handling qualities criteria for large advanced supersonic aircraft

    NASA Technical Reports Server (NTRS)

    Sudderth, R. W.; Bohn, J. G.; Caniff, M. A.; Bennett, G. R.

    1975-01-01

    Longitudinal handling qualities criteria in terms of airplane response characteristics were developed. The criteria cover high speed cruise maneuvering, landing approach, and stall recovery. Data substantiating the study results are reported.

  10. Architectures and methodologies for future deployment of multi-site Zettabyte-Exascale data handling platforms

    NASA Astrophysics Data System (ADS)

    Acín, V.; Bird, I.; Boccali, T.; Cancio, G.; Collier, I. P.; Corney, D.; Delaunay, B.; Delfino, M.; dell'Agnello, L.; Flix, J.; Fuhrmann, P.; Gasthuber, M.; Gülzow, V.; Heiss, A.; Lamanna, G.; Macchi, P.-E.; Maggi, M.; Matthews, B.; Neissner, C.; Nief, J.-Y.; Porto, M. C.; Sansum, A.; Schulz, M.; Shiers, J.

    2015-12-01

    Several scientific fields, including Astrophysics, Astroparticle Physics, Cosmology, Nuclear and Particle Physics, and Research with Photons, estimate that by the 2020 decade they will require data handling systems with data volumes approaching a zettabyte, distributed amongst as many as 10^18 individually addressable data objects (Zettabyte-Exascale systems). It may be convenient or necessary to deploy such systems using multiple physical sites. This paper describes the findings of a working group composed of experts from several

  11. A review of the handling of missing longitudinal outcome data in clinical trials

    PubMed Central

    2014-01-01

    The aim of this review was to establish the frequency with which trials take into account missingness, and to discover what methods trialists use for adjustment in randomised controlled trials with longitudinal measurements. Failing to address the problems that can arise from missing outcome data can result in misleading conclusions. Missing data should be addressed through a sensitivity analysis of the complete case analysis results. One hundred publications of randomised controlled trials with longitudinal measurements were selected randomly from trial publications from the years 2005 to 2012. Information was extracted from these trials, including whether reasons for dropout were reported, what methods were used for handling the missing data, whether there was any explanation of the methods for missing data handling, and whether a statistician was involved in the analysis. The main focus of the review was on missing data post dropout rather than missing interim data. Of all the papers in the study, 9 (9%) had no missing data. More than half of the papers included in the study failed to make any attempt to explain the reasons for their choice of missing data handling method. Of the papers with clear missing data handling methods, 44 papers (50%) used adequate methods of missing data handling, whereas 30 (34%) of the papers used missing data methods which may not have been appropriate. In the remaining 17 papers (19%), it was difficult to assess the validity of the methods used. An imputation method was used in 18 papers (20%). Multiple imputation methods were introduced in 1987 and are an efficient way of accounting for missing data in general, and yet only 4 papers used these methods. Out of the 18 papers which used imputation, only 7 displayed the results as a sensitivity analysis of the complete case analysis results, and 61% explained the reasons for their chosen method. 
Just under a third of the papers made no reference to reasons for missing outcome data. There was little consistency in reporting of missing data within longitudinal trials. PMID:24947664
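As a concrete illustration of the multiple imputation approach discussed in the review, the sketch below (simulated data, with a deliberately simplistic imputation model that ignores covariates) generates several completed data sets and pools the estimates with Rubin's rules:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated outcome with ~30% of values missing completely at random.
y = rng.normal(loc=5.0, scale=2.0, size=500)
missing = rng.random(500) < 0.3
y_obs = y[~missing]

m = 20  # number of imputed data sets
estimates, variances = [], []
for _ in range(m):
    # Draw imputations from the observed-data distribution (a toy
    # imputation model; real analyses would condition on covariates).
    imputed = y.copy()
    imputed[missing] = rng.normal(y_obs.mean(), y_obs.std(ddof=1), missing.sum())
    estimates.append(imputed.mean())
    variances.append(imputed.var(ddof=1) / len(imputed))

# Rubin's rules: pool the point estimates and combine the within- and
# between-imputation components of variance.
q_bar = np.mean(estimates)            # pooled estimate of the mean
w = np.mean(variances)                # within-imputation variance
b = np.var(estimates, ddof=1)         # between-imputation variance
total_var = w + (1 + 1 / m) * b
print(round(q_bar, 2), round(total_var, 5))
```

The pooled estimate recovers the true mean (5.0) up to sampling error, while `total_var` properly reflects the extra uncertainty introduced by imputation.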

  12. Severe storms observing satellite study

    NASA Technical Reports Server (NTRS)

    Iwens, R. P.; Stern, D. A.

    1976-01-01

    Payload distribution and the attitude control system for the multi-mission modular spacecraft/StormSat configuration are discussed. The design of the advanced atmospheric sounder and imaging radiometer (AASIR) gimbal drive and its servomechanism is described. Onboard data handling, data downlink communications, and ground data handling systems are developed. Additional topics covered include: magnetic unloading at synchronous altitude, north-south stationkeeping, and the feasibility and impact of flying the microwave atmospheric sounding radiometer (MASR) as an additional payload.

  13. Missing binary data extraction challenges from Cochrane reviews in mental health and Campbell reviews with implications for empirical research.

    PubMed

    Spineli, Loukia M

    2017-12-01

    To report challenges encountered during the extraction process from Cochrane reviews in mental health and Campbell reviews, and to indicate their implications for the empirical performance of different methods to handle missingness. We used a collection of meta-analyses on binary outcomes collated from previous work on missing outcome data. To evaluate the accuracy of their extraction, we developed specific criteria pertaining to the reporting of missing outcome data in systematic reviews. Using the most popular methods to handle missing binary outcome data, we investigated the implications of the accuracy of the extracted meta-analyses for the random-effects meta-analysis results. Of 113 meta-analyses from Cochrane reviews, 60 (53%) were judged as "unclearly" extracted (ie, no information on the outcome of completers but available information on how missing participants were handled) and 42 (37%) as "unacceptably" extracted (ie, no information on the outcome of completers as well as no information on how missing participants were handled). For the remaining meta-analyses, it was judged that data were "acceptably" extracted (ie, information on the completers' outcome was provided for all trials). Overall, "unclear" extraction overestimated the magnitude of the summary odds ratio and the between-study variance, and additionally inflated the uncertainty of both meta-analytical parameters. The only eligible Campbell review was judged as "unclear." Depending on the extent of missingness, the reporting quality of systematic reviews can greatly affect the accuracy of the extracted meta-analyses and, by extension, the empirical performance of different methods to handle missingness. Copyright © 2017 John Wiley & Sons, Ltd.
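To make concrete why the completers' outcomes matter during extraction, the sketch below (hypothetical counts, not data from the review) contrasts two common ways of handling missing binary outcome data: complete-case analysis and imputing all dropouts as failures ("ICA-0"):

```python
# Hypothetical two-arm trial counts (invented for illustration):
# responders among completers, completers, and randomized per arm.
arms = {
    "treatment": {"responders": 30, "completers": 80, "randomized": 100},
    "control":   {"responders": 20, "completers": 85, "randomized": 100},
}

def odds_ratio(r1, n1, r0, n0):
    """Odds ratio of response, treatment vs control."""
    return (r1 / (n1 - r1)) / (r0 / (n0 - r0))

# Complete-case analysis: drop participants with missing outcomes.
cc = odds_ratio(30, 80, 20, 85)

# ICA-0 ("missing = failure"): all dropouts counted as non-responders,
# so the denominator is everyone randomized.
ica0 = odds_ratio(30, 100, 20, 100)

print(round(cc, 3), round(ica0, 3))  # 1.95 1.714
```

The two conventions give visibly different odds ratios from the same trial, which is why an extracted meta-analysis that lacks the completers' outcomes (the "unclear" and "unacceptable" categories above) cannot be reanalysed reliably.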

  14. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    NASA Technical Reports Server (NTRS)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling and analysis. The algorithms developed thus far are adequate and have proven successful in several preliminary and fundamental applications, such as software interfacing capabilities, probability distributions, grey level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration and ground scene classification. A description is provided of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control.

  15. Motion-base simulator results of advanced supersonic transport handling qualities with active controls

    NASA Technical Reports Server (NTRS)

    Feather, J. B.; Joshi, D. S.

    1981-01-01

    Handling qualities of the unaugmented advanced supersonic transport (AST) are deficient in the low-speed, landing approach regime. Consequently, improvement in handling with active control augmentation systems has been achieved using implicit model-following techniques. Extensive fixed-base simulator evaluations were used to validate these systems prior to tests with full motion and visual capabilities on a six-axis motion-base simulator (MBS). These tests compared the handling qualities of the unaugmented AST with several augmented configurations to ascertain the effectiveness of these systems. Cooper-Harper ratings, tracking errors, and control activity data from the MBS tests have been analyzed statistically. The results show the fully augmented AST handling qualities have been improved to an acceptable level.

  16. Building a framework for ergonomic research on laparoscopic instrument handles.

    PubMed

    Li, Zheng; Wang, Guohui; Tan, Juan; Sun, Xulong; Lin, Hao; Zhu, Shaihong

    2016-06-01

    Laparoscopic surgery carries the advantage of minimal invasiveness, but ergonomic design of the instruments used has progressed slowly. Previous studies have demonstrated that the handle of laparoscopic instruments is vital for both surgical performance and surgeon's health. This review provides an overview of the sub-discipline of handle ergonomics, including an evaluation framework, objective and subjective assessment systems, data collection and statistical analyses. Furthermore, a framework for ergonomic research on laparoscopic instrument handles is proposed to standardize work on instrument design. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.

  17. Novice Interpretations of Visual Representations of Geosciences Data

    NASA Astrophysics Data System (ADS)

    Burkemper, L. K.; Arthurs, L.

    2013-12-01

    Past cognition research on individuals' perception and comprehension of bar and line graphs is substantive enough to have resulted in the generation of graph design principles and graph comprehension theories; however, gaps remain in our understanding of how people process visual representations of data, especially of geologic and atmospheric data. This pilot project serves to build on others' prior research and begin filling the existing gaps. The primary objectives of this pilot project are to: (i) design a novel data collection protocol based on a combination of paper-based surveys, think-aloud interviews, and eye-tracking tasks to investigate student data handling skills with simple to complex visual representations of geologic and atmospheric data, (ii) demonstrate that the protocol yields results that shed light on student data handling skills, and (iii) generate preliminary findings upon which tentative but perhaps helpful recommendations can be made on how to more effectively present these data to the non-scientist community and teach essential data handling skills. An effective protocol for the combined use of paper-based surveys, think-aloud interviews, and computer-based eye-tracking tasks for investigating the cognitive processes involved in perceiving, comprehending, and interpreting visual representations of geologic and atmospheric data is instrumental to future research in this area. The outcomes of this pilot study provide the foundation upon which future, more in-depth and scaled-up investigations can build. 
Furthermore, findings of this pilot project are sufficient for making at least tentative recommendations that can help inform (i) the design of physical attributes of visual representations of data, especially more complex representations, that may aid in improving students' data handling skills and (ii) instructional approaches that have the potential to aid students in more effectively handling visual representations of geologic and atmospheric data that they might encounter in a course, television news, newspapers and magazines, and websites. Such recommendations would also be subjects for future investigation, and could influence how data are presented to the public as well as instructional strategies, not only in geoscience courses but also in other science, technology, engineering, and mathematics (STEM) courses.

  18. Remote-handled/special case TRU waste characterization summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.A.

    1984-03-30

    TRU wastes are those (other than high level waste) contaminated with specified quantities of certain alpha-emitting radionuclides of long half-life and high specific radiotoxicity. TRU waste is defined as ²²⁶Ra isotopic sources and those other materials that, without regard to source or form, are contaminated with transuranic elements with half-lives greater than 20 years, and have TRU alpha contamination greater than 100 nCi/g. RH TRU waste has high beta and gamma radiation levels, up to 30,000 R/hr, and thermal output may be a few hundred watts per container. The radiation levels in most of this remotely handled (RH) TRU waste, however, are below 100 R/hr. Remote-handled wastes are stored at Los Alamos, Hanford, Oak Ridge, and the Idaho National Engineering Laboratory. This report presents a site by site discussion of RH waste handling, placement, and container data. This is followed by a series of data tables that were compiled in the TRU Waste Systems Office. These tables are a compendium of the most up-to-date and accurate data available today. 10 tables.

  19. The 'last mile' of data handling: Fermilab's IFDH tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, Adam L.; Mengel, Marc W.

    2014-01-01

    IFDH (Intensity Frontier Data Handling) is a suite of tools for data movement tasks for Fermilab experiments and is an important part of the FIFE[2] (Fabric for Intensity Frontier [1] Experiments) initiative described at this conference. IFDH encompasses moving input data from caches or storage elements to compute nodes (the 'last mile' of data movement) and moving output data, potentially to those caches, as part of the journey back to the user. IFDH also involves throttling and locking to ensure that large numbers of jobs do not cause data movement bottlenecks. IFDH is realized as an easy-to-use layer that users call in their job scripts (e.g. 'ifdh cp'), hiding the low-level data movement tools. One advantage of this layer is that the underlying low-level tools can be selected or changed without the need for the user to alter their scripts. Logging and performance monitoring can also be added easily. This system will be presented in detail, as well as its impact on the ease of data handling at Fermilab experiments.
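The layering idea behind 'ifdh cp' can be sketched in a few lines of shell. This is a hypothetical illustration of the dispatch pattern only; the function name, path patterns, and tool choices are invented for the example and are not IFDH's actual internals:

```shell
#!/bin/sh
# Hypothetical sketch of an abstraction layer over low-level copy tools:
# the job script calls one stable command, and the layer dispatches to
# whatever transfer tool fits the source. (Illustrative only.)
my_ifdh_cp() {
    src="$1"; dst="$2"
    case "$src" in
        /pnfs/*)  tool="dcache-door" ;;   # hypothetical mass-storage path
        root://*) tool="xrdcp" ;;         # xrootd-style URL
        *)        tool="cp" ;;            # plain local copy
    esac
    echo "dispatching to $tool: $src -> $dst"
}

my_ifdh_cp /tmp/input.dat /tmp/output.dat
```

Because callers only ever invoke the wrapper, the `case` table can be changed (or instrumented with logging and throttling) without touching any job script, which is the advantage the abstract describes.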

  20. An Investigation of Large Tilt-Rotor Short-Term Attitude Response Handling Qualities Requirements in Hover

    NASA Technical Reports Server (NTRS)

    Malcipa, Carlos; Decker, William A.; Theodore, Colin R.; Blanken, Christopher L.; Berger, Tom

    2010-01-01

    A piloted simulation investigation was conducted using the NASA Ames Vertical Motion Simulator to study the impact of pitch, roll and yaw attitude bandwidth and phase delay on handling qualities of large tilt-rotor aircraft. Multiple bandwidth and phase delay pairs were investigated for each axis. The simulation also investigated the effect that the pilot offset from the center of gravity has on handling qualities. While pilot offset does not change the dynamics of the vehicle, it does affect the proprioceptive and visual cues and it can have an impact on handling qualities. The experiment concentrated on two primary evaluation tasks: a precision hover task and a simple hover pedal turn. Six pilots flew over 1400 data runs with evaluation comments and objective performance data recorded. The paper will describe the experiment design and methodology, discuss the results of the experiment and summarize the findings.

  1. Factors influencing oncology nurses' use of hazardous drug safe-handling precautions.

    PubMed

    Polovich, Martha; Clark, Patricia C

    2012-05-01

    To examine relationships among factors affecting nurses' use of hazardous drug (HD) safe-handling precautions, identify factors that promote or interfere with HD precaution use, and determine managers' perspectives on the use of HD safe-handling precautions. Cross-sectional, mixed methods; mailed survey to nurses who handle chemotherapy and telephone interviews with managers. Mailed invitation to oncology centers across the United States. 165 nurses who reported handling chemotherapy and 20 managers of nurses handling chemotherapy. Instruments measured the use of HD precautions and individual and organizational factors believed to influence precaution use. Data analysis included descriptive statistics and hierarchical regression. Manager interview data were analyzed using content analysis. Chemotherapy exposure knowledge, self-efficacy, perceived barriers, perceived risk, interpersonal influences, and workplace safety climate. Nurses were well educated, experienced, and certified in oncology nursing. The majority worked in outpatient settings and administered chemotherapy to an average of 6.8 patients per day. Exposure knowledge, self-efficacy for using personal protective equipment, and perceived risk of harm from HD exposure were high; total precaution use was low. Nurse characteristics did not predict HD precaution use. Fewer barriers, better workplace safety climate, and fewer patients per day were independent predictors of higher HD precaution use. HD handling policies were present, but many did not reflect current recommendations. Few managers formally monitored nurses' HD precaution use. Circumstances in the workplace interfere with nurses' use of HD precautions. Interventions should include fostering a positive workplace safety climate, reducing barriers, and providing appropriate nurse-patient ratios.

  2. Invention activities as preparation for learning laboratory data handling skills

    NASA Astrophysics Data System (ADS)

    Day, James

    2012-10-01

    Undergraduate physics laboratories are often driven by a mix of goals, usually enough of them to cause cognitive overload for the student. Our recent findings align well with studies indicating that students often exit a physics lab without having properly learned how to handle real data. Having students explore the underlying structure of a problem before being able to solve it has been shown to be an effective way to ready them for learning. Borrowing on findings from the fields of education and cognitive psychology, we use "invention activities" to precede direct instruction and bolster learning. In this talk I will show some of what we have learned about students' data handling skills, explain how an invention activity works, and share some observations of successful transfer.

  3. Using generalized estimating equations and extensions in randomized trials with missing longitudinal patient reported outcome data.

    PubMed

    Bell, Melanie L; Horton, Nicholas J; Dhillon, Haryana M; Bray, Victoria J; Vardy, Janette

    2018-05-26

    Patient reported outcomes (PROs) are important in oncology research; however, missing data can pose a threat to the validity of results. Psycho-oncology researchers should be aware of the statistical options for handling missing data robustly. One rarely used set of methods, which includes extensions for handling missing data, is generalized estimating equations (GEEs). Our objective was to demonstrate the use of GEEs to analyze PROs with missing data in randomized trials with assessments at fixed time points. We introduce GEEs and show, with a worked example, how to use GEEs that account for missing data: inverse probability weighted GEEs and multiple imputation with GEE. We use data from an RCT evaluating a web-based brain training program for cancer survivors reporting cognitive symptoms after chemotherapy treatment. The primary outcome for this demonstration is the binary outcome of cognitive impairment. Several methods are used, and the results are compared. We demonstrate that estimates can vary depending on the choice of analytical approach, with odds ratios for no cognitive impairment ranging from 2.04 to 5.74. While most of these estimates were statistically significant (P < 0.05), a few were not. Researchers using PROs should use statistical methods that handle missing data in a way that yields unbiased estimates. GEE extensions are analytic options for handling dropouts in longitudinal RCTs, particularly if the outcome is not continuous. Copyright © 2018 John Wiley & Sons, Ltd.
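The inverse probability weighting idea used by the weighted-GEE extension can be shown in a minimal numerical sketch (simulated data only; the dropout model is taken as known here, whereas in practice, as in the paper's worked example, it is estimated, e.g. by logistic regression on observed covariates):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000

# Baseline score and a correlated binary follow-up outcome.
baseline = rng.normal(size=n)
followup = (baseline + rng.normal(size=n) > 0).astype(float)

# Dropout depends on the observed baseline (missing at random):
# higher baseline -> more likely to remain in the study.
p_observed = 1 / (1 + np.exp(-(0.5 + 1.0 * baseline)))
observed = rng.random(n) < p_observed

# Complete-case mean is biased because completers differ systematically.
cc_mean = followup[observed].mean()

# Inverse probability weighting: each completer is weighted by
# 1 / P(observed), so completers stand in for similar dropouts.
w = 1 / p_observed[observed]
ipw_mean = np.sum(w * followup[observed]) / np.sum(w)

true_mean = followup.mean()
print(round(cc_mean, 3), round(ipw_mean, 3), round(true_mean, 3))
```

The complete-case estimate overshoots the full-sample mean, while the weighted estimate recovers it; the same reweighting applied inside the GEE estimating equations yields the inverse probability weighted GEE.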

  4. Application of the Hyper-Poisson Generalized Linear Model for Analyzing Motor Vehicle Crashes.

    PubMed

    Khazraee, S Hadi; Sáez-Castillo, Antonio Jose; Geedipally, Srinivas Reddy; Lord, Dominique

    2015-05-01

    The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model. © 2014 Society for Risk Analysis.
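The hyper-Poisson distribution discussed above has a closed-form probability mass function in terms of the confluent hypergeometric function 1F1: P(X = x) = λ^x / ((γ)_x · 1F1(1; γ; λ)), where (γ)_x is the Pochhammer symbol, γ > 1 gives overdispersion, γ < 1 underdispersion, and γ = 1 recovers the Poisson. A brief sketch (ours, not the paper's code) that also verifies the Poisson special case:

```python
import numpy as np
from scipy.special import hyp1f1, poch

def hyper_poisson_pmf(x, lam, gamma):
    """pmf of the hyper-Poisson distribution:
    P(X = x) = lam**x / ((gamma)_x * 1F1(1; gamma; lam)),
    where (gamma)_x is the Pochhammer symbol."""
    return lam**x / (poch(gamma, x) * hyp1f1(1.0, gamma, lam))

# With gamma = 1 the pmf reduces to the ordinary Poisson, since
# (1)_x = x! and 1F1(1; 1; lam) = exp(lam).
x = np.arange(0, 60)
print(float(hyper_poisson_pmf(3, 2.0, 1.0)))          # Poisson(2) pmf at x = 3
print(float(hyper_poisson_pmf(x, 2.0, 1.5).sum()))    # overdispersed pmf sums to ~1
print(float(hyper_poisson_pmf(x, 2.0, 0.7).sum()))    # underdispersed pmf sums to ~1
```

The 1F1 term is exactly the normalizing constant (the pmf numerators sum to 1F1(1; γ; λ)), which is why the same formula handles both the over- and underdispersed cases the study fits.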

  5. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a proposed draft of a standardized test suite, and flight-qualifiable components.

  6. Employee and customer handling of nicotine-containing e-liquids in vape shops.

    PubMed

    Garcia, Robert; Allem, Jon Patrick; Baezconde-Garbanati, Lourdes; Unger, Jennifer Beth; Sussman, Steve

    2016-01-01

    Vape shops sell electronic cigarettes and related products such as e-liquids, which may contain nicotine. Direct contact with nicotine can lead to adverse health effects, and few regulations exist on how nicotine is handled in vape shops. This study examined how customers and employees come into contact with, and handle, nicotine-containing e-liquids in vape shops with the goal of informing potential future regulation of nicotine handling in vape shops. Data were collected from 77 vape shops in the Los Angeles basin. Characteristics of the shops were documented by employee interviews and in store observations. Data collection was focused on shops located in areas with high concentrations of communities of interest; 20 shops from African-American communities, 17 from Hispanic communities, 18 from Korean communities, and 22 from non-Hispanic White communities. Half of the vape shops allowed customers to sample e-liquids with nicotine. Most of the shops (83%) provided self-service sampling stations for customers. A majority of shop employees (72%) reported that spills of e-liquids containing nicotine had occurred in the past. While 64% of the shops provided safety equipment, only 34% provided equipment for proper nicotine handling. Furthermore, 62% of shop employees reported handling nicotine without gloves or other safety equipment. Regulation on the handling of nicotine by customers and vape shop employees is important to prevent unsafe practices and subsequent injury. The frequent occurrence of spills and limited availability of safety equipment in vape shops highlights the need for the creation and enforcement of regulations to protect employees and customers. Appropriate safety training and equipment should be provided to employees to prevent accidental exposure to nicotine. Information on ways to safely handle nicotine should be communicated to vape shop employees and customers.

  7. Employee and customer handling of nicotine-containing e-liquids in vape shops

    PubMed Central

    Garcia, Robert; Allem, Jon Patrick; Baezconde-Garbanati, Lourdes; Unger, Jennifer Beth; Sussman, Steve

    2017-01-01

    INTRODUCTION Vape shops sell electronic cigarettes and related products such as e-liquids, which may contain nicotine. Direct contact with nicotine can lead to adverse health effects, and few regulations exist on how nicotine is handled in vape shops. This study examined how customers and employees come into contact with, and handle, nicotine-containing e-liquids in vape shops with the goal of informing potential future regulation of nicotine handling in vape shops. METHODS Data were collected from 77 vape shops in the Los Angeles basin. Characteristics of the shops were documented by employee interviews and in store observations. Data collection was focused on shops located in areas with high concentrations of communities of interest; 20 shops from African-American communities, 17 from Hispanic communities, 18 from Korean communities, and 22 from non-Hispanic White communities. RESULTS Half of the vape shops allowed customers to sample e-liquids with nicotine. Most of the shops (83%) provided self-service sampling stations for customers. A majority of shop employees (72%) reported that spills of e-liquids containing nicotine had occurred in the past. While 64% of the shops provided safety equipment, only 34% provided equipment for proper nicotine handling. Furthermore, 62% of shop employees reported handling nicotine without gloves or other safety equipment. CONCLUSIONS Regulation on the handling of nicotine by customers and vape shop employees is important to prevent unsafe practices and subsequent injury. The frequent occurrence of spills and limited availability of safety equipment in vape shops highlights the need for the creation and enforcement of regulations to protect employees and customers. Appropriate safety training and equipment should be provided to employees to prevent accidental exposure to nicotine. Information on ways to safely handle nicotine should be communicated to vape shop employees and customers. PMID:28660255

  8. Agricultural Handling and Processing Industries; Data Pertinent to an Evaluation of Overtime Exemptions Available under the Fair Labor Standards Act. Volume I.

    ERIC Educational Resources Information Center

    Wage and Labor Standards Administration (DOL), Washington, DC.

    This report covers the major agricultural handling and processing industries qualifying for partial overtime exemption under the Fair Labor Standards Act and evaluates the need for such exemptions. Questionnaires which were sent to firms in various processing industries provide data on nearly 4,000 processors. The results show that existing…

  9. Report on a Highly Used Computer Aid for the Professor in his Grade and Record Keeping Tasks.

    ERIC Educational Resources Information Center

    Brockmeier, Richard

    SPARS is a computer data base management system designed to aid the college professor in handling his students' grades and other classroom data. It can handle multiple sections and labs, and allows the professor to combine and separate these components in a variety of ways. SPARS seeks to meet the sometimes competing goals of simplicity of use and…

  10. Effects of side-stick controllers on rotorcraft handling qualities for terrain flight

    NASA Technical Reports Server (NTRS)

    Aiken, E. W.

    1985-01-01

    Pertinent fixed- and rotary-wing feasibility studies and handling-qualities research programs are reviewed, and the effects of certain controller characteristics on handling qualities for specific rotorcraft flight tasks are summarized. The effects of the controller force-deflection relationship and the number of controlled axes that are integrated in a single controller are examined. Simulation studies were conducted which provide a significant part of the available handling-qualities data. The studies demonstrate the feasibility of using a single, properly designed, limited-displacement, multiaxis controller for certain relatively routine flight tasks in a two-crew rotorcraft, provided that nominal levels of stability and control augmentation with a high degree of reliability are incorporated; otherwise, separated three- or two-axis controller configurations are required for acceptable handling qualities.

  11. Flight-testing and frequency-domain analysis for rotorcraft handling qualities

    NASA Technical Reports Server (NTRS)

    Ham, Johnnie A.; Gardner, Charles K.; Tischler, Mark B.

    1995-01-01

    A demonstration of frequency-domain flight-testing techniques and analysis was performed on a U.S. Army OH-58D helicopter in support of the OH-58D Airworthiness and Flight Characteristics Evaluation and of the Army's development and ongoing review of Aeronautical Design Standard 33C, Handling Qualities Requirements for Military Rotorcraft. Hover and forward flight (60 kn) tests were conducted in 1 flight hour by Army experimental test pilots. Further processing of the hover data generated a complete database of velocity, angular-rate, and acceleration-frequency responses to control inputs. A joint effort was then undertaken by the Airworthiness Qualification Test Directorate and the U.S. Army Aeroflightdynamics Directorate to derive handling-quality information from the frequency-domain database using a variety of approaches. This report documents numerous results that have been obtained from the simple frequency-domain tests; in many areas, these results provide more insight into the aircraft dynamics that affect handling qualities than do traditional flight tests. The handling-quality results include ADS-33C bandwidth and phase-delay calculations, vibration spectral determinations, transfer-function models to examine single-axis results, and a six-degree-of-freedom fully coupled state-space model. The ability of this model to accurately predict responses was verified using data from pulse inputs. This report also documents the frequency-sweep flight-test technique and data analysis used to support the tests.

  12. Technological survey of tellurium and its compounds

    NASA Technical Reports Server (NTRS)

    Steindler, M. J.; Vissers, D. R.

    1968-01-01

    Review includes data on the chemical and physical properties of tellurium, its oxides, and fluorides, pertinent to the process problem of handling fission product tellurium in fluoride form. The technology of tellurium handling in nonaqueous processing of nuclear fuels is also reviewed.

  13. Working paper : national costs of the metropolitan ITS infrastructure : updated with 2002 deployment data

    DOT National Transportation Integrated Search

    1995-02-01

    This paper addresses the relationship of truck size and weight (TS&W) policy, vehicle handling and stability, and safety. Handling and stability are the primary mechanisms relating vehicle characteristics and safety. Vehicle characteristics may also ...

  14. X.400: The Standard for Message Handling Systems.

    ERIC Educational Resources Information Center

    Swain, Leigh; Tallim, Paula

    1990-01-01

    Profiles X.400, the Open Systems Interconnection (OSI) Application layer standard that supports interpersonal electronic mail services, facsimile transfer, electronic data interchange, electronic funds transfer, electronic publishing, and electronic invoicing. Also discussed are an electronic directory to support message handling, compatibility…

  15. X-Windows Widget for Image Display

    NASA Technical Reports Server (NTRS)

    Deen, Robert G.

    2011-01-01

    XvicImage is a high-performance XWindows (Motif-compliant) user interface widget for displaying images. It handles all aspects of low-level image display. The fully Motif-compliant image display widget handles the following tasks: (1) Image display, including dithering as needed (2) Zoom (3) Pan (4) Stretch (contrast enhancement, via lookup table) (5) Display of single-band or color data (6) Display of non-byte data (ints, floats) (7) Pseudocolor display (8) Full overlay support (drawing graphics on image) (9) Mouse-based panning (10) Cursor handling, shaping, and planting (disconnecting cursor from mouse) (11) Support for all user interaction events (passed to application) (12) Background loading and display of images (doesn't freeze the GUI) (13) Tiling of images.

  16. Empirical evaluation of data normalization methods for molecular classification.

    PubMed

    Huang, Huei-Chung; Qin, Li-Xuan

    2018-01-01

    Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers-an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in an independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.
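
    As one concrete instance of the post hoc adjustment discussed above, quantile normalization forces every sample to share the same empirical distribution, removing sample-wise location and scale artifacts. It is named here only for illustration; the abstract does not specify which three methods were compared.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization of a (features x samples) matrix: replace
    each column's sorted values with the mean of the sorted values across
    columns (ties ignored for brevity)."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # per-column ranks
    mean_sorted = np.sort(X, axis=0).mean(axis=1)       # reference distribution
    return mean_sorted[ranks]

# Simulated handling effect: each sample (column) gets its own location shift
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5)) + rng.normal(size=5)
Xn = quantile_normalize(X)
```

    After normalization, every column has an identical set of values (only their ordering differs), which is exactly why such adjustment can both remove handling effects and, if applied carelessly, distort genuine biological signal.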

  17. Stream processing health card application.

    PubMed

    Polat, Seda; Gündem, Taflan Imre

    2012-10-01

    In this paper, we propose a data stream management system embedded to a smart card for handling and storing user specific summaries of streaming data coming from medical sensor measurements and/or other medical measurements. The data stream management system that we propose for a health card can handle the stream data rates of commonly known medical devices and sensors. It incorporates a type of context awareness feature that acts according to user specific information. The proposed system is cheap and provides security for private data by enhancing the capabilities of smart health cards. The stream data management system is tested on a real smart card using both synthetic and real data.

  18. MetAlign: interface-driven, versatile metabolomics tool for hyphenated full-scan mass spectrometry data preprocessing.

    PubMed

    Lommen, Arjen

    2009-04-15

    Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-use automation tool that aids in the data analysis is very important in handling such a data stream. MetAlign software, as described in this manuscript, handles a broad range of accurate mass and nominal mass GC/MS and LC/MS data. It is capable of automatic format conversions, accurate mass calculations, baseline corrections, peak-picking, saturation and mass-peak artifact filtering, as well as alignment of up to 1000 data sets. A 100- to 1000-fold data reduction is achieved. MetAlign software output is compatible with most multivariate statistics programs.

  19. PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities.

    PubMed

    Troshin, Peter V; Postis, Vincent Lg; Ashworth, Denise; Baldwin, Stephen A; McPherson, Michael J; Barton, Geoffrey J

    2011-03-07

    Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/.

  20. SaaS Platform for Time Series Data Handling

    NASA Astrophysics Data System (ADS)

    Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail

    2018-02-01

    The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.
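
    Two of the analysis methods listed above, the direct Fourier transform and PCA, can be sketched with numpy on synthetic multichannel data. The sampling rate, channel count, and source frequencies are assumptions for illustration, not MathBrain's actual interface.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250.0                        # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)       # 4 s of data

# Two latent oscillatory sources mixed into 8 channels, MEG-like
s1 = np.sin(2 * np.pi * 10 * t)   # 10 Hz rhythm
s2 = np.sin(2 * np.pi * 17 * t)   # 17 Hz rhythm
A = rng.normal(size=(8, 2))       # random mixing matrix
X = A @ np.vstack([s1, s2]) + 0.1 * rng.normal(size=(8, t.size))

# Direct Fourier transform: amplitude spectrum of channel 0
freqs = np.fft.rfftfreq(t.size, 1 / fs)
amp = np.abs(np.fft.rfft(X[0])) / t.size
peak_hz = freqs[np.argmax(amp[1:]) + 1]   # skip the DC bin

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)   # variance explained per component
```

    With two latent sources, the first two principal components capture nearly all the variance, and the spectral peak falls on one of the source frequencies.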

  1. A multi-component patient-handling intervention improves attitudes and behaviors for safe patient handling and reduces aggression experienced by nursing staff: A controlled before-after study.

    PubMed

    Risør, Bettina Wulff; Casper, Sven Dalgas; Andersen, Lars Louis; Sørensen, Jan

    2017-04-01

    This study evaluated an intervention for patient-handling equipment aimed at improving nursing staff's use of patient-handling equipment and their general health, and at reducing musculoskeletal problems, aggressive episodes, days of absence and work-related accidents. As a controlled before-after study, questionnaire data were collected at baseline and 12-month follow-up among nursing staff at intervention and control wards at two hospitals. At 12-month follow-up, the intervention group had more positive attitudes towards patient-handling equipment and increased use of specific patient-handling equipment. In addition, a lower proportion of nursing staff in the intervention group had experienced physically aggressive episodes. No significant change was observed in general health status, musculoskeletal problems, days of absence or work-related accidents. The intervention resulted in more positive attitudes and behaviours for safe patient handling and fewer physically aggressive episodes. However, this did not translate into improved health of the staff during the 12-month study period. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF

    NASA Astrophysics Data System (ADS)

    Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.

    2014-12-01

    Climate observations and model simulations are collecting and generating vast amounts of climate data, and these ever-increasing data are accumulating at a rapid pace. Effectively managing and analyzing these data are essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention for dealing with the Big Data challenge. The maturity of the Infrastructure as a Service (IaaS) model of cloud computing further accelerates the adoption of Hadoop in solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system to enable parallel big data processing with MapReduce, as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing or converting the data into other file formats, or loading them to HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.
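
    The chunk-parallel pattern the middleware applies to array data can be sketched in pure Python. This shows the MapReduce idea only, not Hadoop or the middleware itself; the random array stands in for a NetCDF variable (e.g. time x lat x lon temperature).

```python
import numpy as np
from functools import reduce

# Stand-in for an array variable read from a NetCDF file, split along
# time into chunks that independent mappers could process in parallel.
data = np.random.default_rng(0).normal(loc=288.0, scale=5.0, size=(365, 90, 180))
chunks = np.array_split(data, 8, axis=0)

def mapper(chunk):
    # Emit partial (sum, count) per grid cell rather than a mean, so
    # results from different chunks can be combined exactly.
    return chunk.sum(axis=0), chunk.shape[0]

def reducer(a, b):
    return a[0] + b[0], a[1] + b[1]

total, count = reduce(reducer, map(mapper, chunks))
chunked_mean = total / count   # annual mean per grid cell
```

    Emitting associative partial results (sums and counts) instead of per-chunk means is what lets the reduce step combine chunks in any order without approximation.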

  3. A Review of Toxicity and Use and Handling Considerations for Guanidine, Guanidine Hydrochloride, and Urea.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ertell, Katherine GB

    2006-03-27

    This is a technical report prepared for Oregon Sustainable Energy, LLC, under Agreement 06-19 with PNNL's Office of Small Business Programs. The request was to perform a review of the toxicity and safe handling of guanidine. The request was later amended to add urea. This report summarizes the toxicity data available in the scientific literature and provides an interpretation of the results and recommendations for handling these compounds.

  4. The future point-of-care detection of disease and its data capture and handling.

    PubMed

    Lopez-Barbosa, Natalia; Gamarra, Jorge D; Osma, Johann F

    2016-04-01

    Point-of-care detection is a widely studied area that attracts effort and interest from a large number of fields and companies. However, there is also increased interest from the general public in this type of device, which has driven enormous changes in the design and conception of these developments and the way data is handled. Therefore, future point-of-care detection has to include communication with front-end technology, such as smartphones and networks, automation of manufacture, and the incorporation of concepts like the Internet of Things (IoT) and cloud computing. Three key examples, based on different sensing technology, are analyzed in detail on the basis of these items to highlight a route for the future design and development of point-of-care detection devices and their data capture and handling.

  5. Handling qualities effects of display latency

    NASA Technical Reports Server (NTRS)

    King, David W.

    1993-01-01

    Display latency is the time delay between aircraft response and the corresponding response of the cockpit displays. Currently, there is no explicit specification for allowable display lags to ensure acceptable aircraft handling qualities in instrument flight conditions. This paper examines the handling qualities effects of display latency between 70 and 400 milliseconds for precision instrument flight tasks of the V-22 Tiltrotor aircraft. Display delay effects on the pilot control loop are analytically predicted through a second order pilot crossover model of the V-22 lateral axis, and handling qualities trends are evaluated through a series of fixed-base piloted simulation tests. The results show that the effects of display latency for flight path tracking tasks are driven by the stability characteristics of the attitude control loop. The data indicate that the loss of control damping due to latency can be simply predicted from knowledge of the aircraft's stability margins, control system lags, and required control bandwidths. Based on the relationship between attitude control damping and handling qualities ratings, latency design guidelines are presented. In addition, this paper presents a design philosophy, supported by simulation data, for using flight director display augmentation to suppress the effects of display latency for delays up to 300 milliseconds.
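
    The claim that damping loss is predictable from stability margins follows from the fact that a pure transport delay e^(-τs) subtracts ωτ radians of phase at every frequency, so the phase margin at the crossover frequency ω_c erodes by ω_c·τ. A small sketch with an assumed crossover frequency, not the paper's V-22 value:

```python
import numpy as np

def phase_margin_loss_deg(tau_s, crossover_hz):
    """Phase margin eroded by a pure transport delay tau at the loop
    crossover frequency: delta_phi = omega_c * tau radians."""
    omega_c = 2 * np.pi * crossover_hz
    return np.degrees(omega_c * tau_s)

# Latencies spanning the 70-400 ms range studied, at an assumed
# 0.5 Hz attitude-loop crossover frequency
for tau_ms in (70, 200, 400):
    loss = phase_margin_loss_deg(tau_ms / 1000.0, 0.5)
    print(f"{tau_ms} ms delay -> {loss:.1f} deg phase margin loss")
```

    The linear growth of phase loss with both delay and required control bandwidth is why the paper can predict damping degradation from the aircraft's stability margins and control system lags alone.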

  6. A survey of manufacturing and handling practices for monoclonal antibodies by pharmacy, nursing and medical personnel.

    PubMed

    Alexander, M; King, J; Lingaratnam, S; Byrne, J; MacMillan, K; Mollo, A; Kirsa, S; Green, M

    2016-04-01

    There is a paucity of data available to assess the occupational health and safety risk associated with exposure to monoclonal antibodies. Industry standards and published guidelines are conflicting or outdated. Guidelines offer contrary recommendations based on an array of methodological approaches. This survey aimed to describe current practices, beliefs and attitudes relating to the handling of monoclonal antibodies by Australian medical, nursing and pharmacy clinicians. An electronic survey was distributed between June and September 2013. Respondents were surveyed on three focus areas: institutional guideline availability and content, current practices and attitudes. Demographic data relating to respondent and primary place of practice were also collected. A total of 222 clinicians completed the survey, with representation from all targeted professional groups and from a variety of geographic locations. 92% of respondents reported that their institution prepared or administered monoclonal antibodies, with 87% specifically handling anti-cancer monoclonal antibodies. Monoclonal antibodies were mostly prepared onsite (84-90%) and mostly within pharmacy clean-rooms (75%) and using cytotoxic cabinets (61%). 43% of respondents reported access to institutional monoclonal antibody handling guidelines with risk reduction strategies including training and education (71%), spill and waste management (71%), procedures for transportation (57%) and restricted handling (50%). Nurses had a stronger preference towards pharmacy manufacturing than both doctors and pharmacists for a range of clinical scenarios. 95% of all respondents identified that professional or regulatory body guidelines are an important resource when considering handling practices. Monoclonal antibodies are most commonly handled according to cytotoxic drug standards and often in the absence of formal guidelines. © The Author(s) 2014.

  7. Notification: Audit of Security Categorization for EPA Systems That Handle Hazardous Material Information

    EPA Pesticide Factsheets

    Project #OA-FY18-0089, January 8, 2018. The OIG plans to begin preliminary research to determine whether the EPA classified the sensitivity of data for systems that handle hazardous waste material information as prescribed by NIST.

  8. Adaptive handling of Rayleigh and Raman scatter of fluorescence data based on evaluation of the degree of spectral overlap

    NASA Astrophysics Data System (ADS)

    Hu, Yingtian; Liu, Chao; Wang, Xiaoping; Zhao, Dongdong

    2018-06-01

    At present, general scatter-handling methods are unsatisfactory when scatter and fluorescence seriously overlap in the excitation-emission matrix. In this study, an adaptive method for scatter handling of fluorescence data is proposed. First, the Raman scatter was corrected by subtracting the baseline of deionized water, which was collected in each experiment to adapt to intensity fluctuations. Then, the degree of spectral overlap between Rayleigh scatter and fluorescence was classified into three categories based on the distance between the spectral peaks. The corresponding algorithms, including setting to zero and fitting on one or both sides, were applied after evaluating the degree of overlap for each emission spectrum. The proposed method minimizes the number of fitting and interpolation processes, which reduces complexity, saves time, avoids overfitting, and, most importantly, preserves the authenticity of the data. Furthermore, the effectiveness of this procedure for subsequent PARAFAC analysis was assessed and compared to Delaunay interpolation in experiments with four typical organic chemicals and real water samples. Using this method, we conducted long-term monitoring of tap water and river water near a dyeing and printing plant. This method can be used to improve adaptability and accuracy in the scatter handling of fluorescence data.
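
    The overlap-dependent branching described above can be sketched for a single emission spectrum. The thresholds, peak-finding rule, and linear interpolation below are illustrative stand-ins for the paper's calibrated classification and side-fitting, not its actual algorithm.

```python
import numpy as np

def handle_rayleigh(em, intensity, ex, scatter_halfwidth=10.0, overlap_gap=25.0):
    """Treat the Rayleigh band (em within +-halfwidth of ex, in nm) in one
    emission spectrum according to how close the fluorescence peak sits.
    Thresholds are illustrative, not the paper's calibrated values."""
    out = intensity.copy()
    band = np.abs(em - ex) <= scatter_halfwidth
    # Fluorescence peak located outside the scatter band
    peak = em[np.argmax(np.where(band, -np.inf, intensity))]
    if abs(peak - ex) > scatter_halfwidth + overlap_gap:
        out[band] = 0.0   # no overlap: simply zero the scatter band
    else:
        # overlap: bridge the band from the surrounding spectrum
        out[band] = np.interp(em[band], em[~band], intensity[~band])
    return out

em = np.linspace(250, 550, 301)                 # emission axis, nm
ex = 350.0                                      # excitation wavelength, nm
fluor = np.exp(-((em - 450) / 40.0) ** 2)       # fluorescence far from scatter
ray = 5.0 * np.exp(-((em - ex) / 4.0) ** 2)     # Rayleigh line at ex
cleaned = handle_rayleigh(em, fluor + ray, ex)
```

    Because the fluorescence peak here lies far from the excitation line, the band is zeroed; moving the peak toward the scatter region would instead trigger the interpolation branch, mirroring the paper's case-by-case treatment.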

  9. Missing data handling in non-inferiority and equivalence trials: A systematic review.

    PubMed

    Rabe, Brooke A; Day, Simon; Fiero, Mallorie H; Bell, Melanie L

    2018-05-25

    Non-inferiority (NI) and equivalence clinical trials test whether a new treatment is therapeutically no worse than, or equivalent to, an existing standard of care. Missing data in clinical trials have been shown to reduce statistical power and potentially bias estimates of effect size; however, in NI and equivalence trials, they present additional issues. For instance, they may decrease sensitivity to differences between treatment groups and bias toward the alternative hypothesis of NI (or equivalence). Our primary aim was to review the extent of and methods for handling missing data (model-based methods, single imputation, multiple imputation, complete case), the analysis sets used (Intention-To-Treat, Per-Protocol, or both), and whether sensitivity analyses were used to explore departures from assumptions about the missing data. We conducted a systematic review of NI and equivalence trials published between May 2015 and April 2016 by searching the PubMed database. Articles were reviewed primarily by 2 reviewers, with 6 articles reviewed by both reviewers to establish consensus. Of 109 selected articles, 93% reported some missing data in the primary outcome. Among those, 50% reported complete case analysis, and 28% reported single imputation approaches for handling missing data. Only 32% reported conducting analyses of both intention-to-treat and per-protocol populations. Only 11% conducted any sensitivity analyses to test assumptions with respect to missing data. Missing data are common in NI and equivalence trials, and they are often handled by methods which may bias estimates and lead to incorrect conclusions. Copyright © 2018 John Wiley & Sons, Ltd.

  10. Evaluating the Impact of Handling and Logger Attachment on Foraging Parameters and Physiology in Southern Rockhopper Penguins

    PubMed Central

    Ludynia, Katrin; Dehnhard, Nina; Poisbleau, Maud; Demongin, Laurent; Masello, Juan F.; Quillfeldt, Petra

    2012-01-01

    Logger technology has revolutionised our knowledge of the behaviour and physiology of free-living animals but handling and logger attachments may have negative effects on the behaviour of the animals and their welfare. We studied southern rockhopper penguin (Eudyptes chrysocome) females during the guard stage in three consecutive breeding seasons (2008/09−2010/11) to evaluate the effects of handling and logger attachment on foraging trip duration, dive behaviour and physiological parameters. Smaller dive loggers (TDRs) were used in 2010/11 for comparison to larger GPS data loggers used in all three seasons and we included two categories of control birds: handled controls and PIT control birds that were previously marked with passive integrative transponders (PITs), but which had not been handled during this study. Increased foraging trip duration was only observed in GPS birds during 2010/11, the breeding season in which we also found GPS birds foraging further away from the colony and travelling longer distances. Compared to previous breeding seasons, 2010/11 may have been a period with less favourable environmental conditions, which would enhance the impact of logger attachments. A comparison between GPS and TDR birds showed a significant difference in dive depth frequencies with birds carrying larger GPS data loggers diving shallower. Mean and maximum dive depths were similar between GPS and TDR birds. We measured little impact of logger attachments on physiological parameters (corticosterone, protein, triglyceride levels and leucocyte counts). Overall, handling and short-term logger attachments (1–3 days) showed limited impact on the behaviour and physiology of the birds but care must be taken with the size of data loggers on diving seabirds. Increased drag may alter their diving behaviour substantially, thus constraining them in their ability to catch prey. 
Results obtained in this study indicate that data recorded may also not represent their normal dive behaviour. PMID:23185623

  11. Long-term effects of good handling practices during the pre-weaning period of crossbred dairy heifer calves.

    PubMed

    Silva, Luciana Pontes; Sant'Anna, Aline Cristina; Silva, Lívia Carolina Magalhães; Paranhos da Costa, Mateus José Rodrigues

    2017-01-01

The aim of this study was to determine whether applying good practices of handling during the pre-weaning period has long-term effects on behavioral and physiological indicators, health status, and average daily gain (ADG) of crossbred Bos taurus × Bos indicus heifer calves. During the pre-weaning period, 98 crossbred Holstein × Gir heifer calves were allotted into three treatments: (1) good practices of handling + brushing (GPB; n = 25), (2) good practices of handling (GP; n = 25), and (3) control (n = 48). Every 2 months, four evaluation periods (EV1 to EV4) were conducted to record data. Behavioral indicators comprised time to drive (TD), flight speed (FS), flight distance (FD), and composite reactivity score (CRS). Physiological indicators of acute stress during handling comprised respiratory and heart rates. Health status comprised data regarding occurrence of the most common diseases (i.e., pneumonia and anaplasmosis). Collected data were analyzed by using a linear mixed model for repeated measures, Tukey's test, and chi-squared procedures. Treatment influenced (P < 0.05) TD, FS, and FD but not CRS (P = 0.78). From EV1 to EV3, the control calves had the lowest TD. The GPB group had lower FS than the control but did not differ from GP. The GPB group had lower FD means than the other two groups in EV2, EV3, and EV4. No differences (P > 0.05) due to treatment were observed on heart and respiratory rates, ADG, or occurrence of pneumonia and anaplasmosis. It was concluded that adoption of good practices of handling during the pre-weaning period may lead to long-term positive effects.

  12. Database integration for investigative data visualization with the Temporal Analysis System

    NASA Astrophysics Data System (ADS)

    Barth, Stephen W.

    1997-02-01

This paper describes an effort to provide mechanisms for integration of existing law enforcement databases with the Temporal Analysis System (TAS) -- an application for analysis and visualization of military intelligence data. Such integration mechanisms are essential for bringing advanced military intelligence data handling software applications to bear on the analysis of data used in criminal investigations. Our approach involved applying a software application for intelligence message handling to the problem of database conversion. This application provides mechanisms for distributed processing and delivery of converted data records to an end-user application. It also provides a flexible graphic user interface for development and customization in the field.

  13. A distributed computing system for magnetic resonance imaging: Java-based processing and binding of XML.

    PubMed

    de Beer, R; Graveron-Demilly, D; Nastase, S; van Ormondt, D

    2004-03-01

Recently we have developed a Java-based heterogeneous distributed computing system for the field of magnetic resonance imaging (MRI). It is a software system for embedding the various image reconstruction algorithms that we have created for handling MRI data sets with sparse sampling distributions. Since these data sets may result from multi-dimensional MRI measurements, our system has to control the storage and manipulation of large amounts of data. In this paper we describe how we have employed the extensible markup language (XML) to realize this data handling in a highly structured way. To that end we have used Java packages, recently released by Sun Microsystems, to process XML documents and to compile pieces of XML code into Java classes. We have effectuated a flexible storage and manipulation approach for all kinds of data within the MRI system, such as data describing and containing multi-dimensional MRI measurements, data configuring image reconstruction methods and data representing and visualizing the various services of the system. We have found that the object-oriented approach, possible with the Java programming environment, combined with the XML technology is a convenient way of describing and handling various data streams in heterogeneous distributed computing systems.

  14. Aircraft Handling Qualities Data

    DTIC Science & Technology

    1972-12-01

    Handling Qualities, F-104A, Lockheed Rept. No. LR 10794, 12 Dec. 19…; Andrews, William H., and Herman A. Rediess, Flight-Determined Stability and …

  15. Airborne asbestos take-home exposures during handling of chrysotile-contaminated clothing following simulated full shift workplace exposures.

    PubMed

    Sahmel, Jennifer; Barlow, Christy A; Gaffney, Shannon; Avens, Heather J; Madl, Amy K; Henshaw, John; Unice, Ken; Galbraith, David; DeRose, Gretchen; Lee, Richard J; Van Orden, Drew; Sanchez, Matthew; Zock, Matthew; Paustenbach, Dennis J

    2016-01-01

    The potential for para-occupational, domestic, or take-home exposures from asbestos-contaminated work clothing has been acknowledged for decades, but historically has not been quantitatively well characterized. A simulation study was performed to measure airborne chrysotile concentrations associated with laundering of contaminated clothing worn during a full shift work day. Work clothing fitted onto mannequins was exposed for 6.5 h to an airborne concentration of 11.4 f/cc (PCME) of chrysotile asbestos, and was subsequently handled and shaken. Mean 5-min and 15-min concentrations during active clothes handling and shake-out were 3.2 f/cc and 2.9 f/cc, respectively (PCME). Mean airborne PCME concentrations decreased by 55% 15 min after clothes handling ceased, and by 85% after 30 min. PCM concentrations during clothes handling were 11-47% greater than PCME concentrations. Consistent with previously published data, daily mean 8-h TWA airborne concentrations for clothes-handling activity were approximately 1.0% of workplace concentrations. Similarly, weekly 40-h TWAs for clothes handling were approximately 0.20% of workplace concentrations. Estimated take-home cumulative exposure estimates for weekly clothes handling over 25-year working durations were below 1 f/cc-year for handling work clothes contaminated in an occupational environment with full shift airborne chrysotile concentrations of up to 9 f/cc (8-h TWA).
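
    The TWA and cumulative-exposure arithmetic in this record can be reproduced from the reported figures. A minimal back-of-the-envelope sketch, assuming one 15-minute handling task per day or week and zero exposure for the rest of the shift (the study's own protocol details may differ):

    ```python
    # Mean airborne concentration during clothes handling, from the abstract
    handling_conc = 2.9          # f/cc (PCME) over a 15-min handling period
    handling_min = 15.0

    twa_8h = handling_conc * handling_min / (8 * 60)     # daily 8-h time-weighted average
    twa_40h = handling_conc * handling_min / (40 * 60)   # weekly 40-h TWA (one task per week)
    cumulative = twa_40h * 25                            # f/cc-year over a 25-year duration

    print(round(twa_8h, 3))      # 0.091 f/cc, about 0.8% of the 11.4 f/cc workplace level
    print(round(twa_40h, 4))     # 0.0181 f/cc, about 0.16% of the workplace level
    print(round(cumulative, 2))  # 0.45 f/cc-year, below 1 as the study reports
    ```

    These ratios are consistent with the abstract's approximate 1.0% (daily) and 0.20% (weekly) figures.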

  16. Implications for patient safety in the use of safe patient handling equipment: a national survey.

    PubMed

    Elnitsky, Christine A; Lind, Jason D; Rugs, Deborah; Powell-Cope, Gail

    2014-12-01

    The prevalence of musculoskeletal injuries among nursing staff has been high due to patient handling and movement. Internationally, healthcare organizations are integrating technological equipment into patient handling and movement to improve safety. Although evidence shows that safe patient handling programs reduce work-related musculoskeletal injuries in nursing staff, it is not clear how safe these new programs are for patients. The objective of this study was to explore adverse patient events associated with safe patient handling programs and preventive approaches in US Veterans Affairs medical centers. The study surveyed a convenience sample of safe patient handling program managers from 51 US Department of Veterans Affairs medical centers to collect data on skin-related and fall-related adverse patient events. Both skin- and fall-related adverse patient events associated with safe patient handling occurred at VA Medical centers. Skin-related events included abrasions, contusions, pressure ulcers and lacerations. Fall-related events included sprains and strains, fractures, concussions and bleeding. Program managers described contextual factors in these adverse events and ways of preventing the events. The use of safe patient handling equipment can pose risks for patients. This study found that organizational factors, human factors and technology factors were associated with patient adverse events. The findings have implications for how nursing professionals can implement safe patient handling programs in ways that are safe for both staff and patients. Published by Elsevier Ltd.

  17. Application research of 3D additive manufacturing technology in the nail shell

    NASA Astrophysics Data System (ADS)

    Xiao, Shanhua; Yan, Ruiqiang; Song, Ning

    2018-04-01

Based on an analysis of hierarchical slicing algorithms, a 3D scan of an enterprise's nail-handle case product was carried out, point-cloud data processing was performed on the source file, and surface modeling and an innovative redesign of the nail-handle case were completed. Layered samples were then 3D printed on a MakerBot Replicator 2X printer, providing reverse-modeling and rapid-prototyping support for the development of new nail products.

  18. A preliminary study of the effects of handling type on horses' emotional reactivity and the human-horse relationship.

    PubMed

    Fureix, Carole; Pagès, Magali; Bon, Richard; Lassalle, Jean-Michel; Kuntz, Philippe; Gonzalez, Georges

    2009-10-01

Handling is a crucial component of the human-horse relationship. Here, we report data from an experiment conducted to assess and compare the effect of two training methods. Two groups of six Welsh mares were trained during four sessions of 50 min, one handled with traditional exercises (halter leading, grooming/brushing, lifting feet, lunging and pseudo-saddling using only girth and saddle pad) and the second group with natural horsemanship exercises (desensitization, yielding to body pressure, lunging and free-lunging). Emotional reactivity (ER) and the human-horse relationship (HHR) were assessed both prior to and following handling. A social isolation test, a neophobia test and a bridge test were used to assess ER. HHR was assessed through tests of spontaneous approach to, and forced approach by, an unknown human. Horses' ER decreased after both types of handling as indicated by decreases in the occurrence of whinnying during stressful situations. Head movement (jerk/shake) was the most sensitive variable to handling type. In the spontaneous approach tests, horses in the traditional handling group showed higher latencies to approach a motionless person after handling than did the natural horsemanship group. Our study suggests that natural horsemanship exercises could be more efficient than traditional exercises for improving horses' HHR.

  19. Empirical evaluation of data normalization methods for molecular classification

    PubMed Central

    Huang, Huei-Chung

    2018-01-01

    Background Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers—an increasingly important application of microarrays in the era of personalized medicine. Methods In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. Results In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in an independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Conclusion Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy. PMID:29666754
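
    For readers unfamiliar with post hoc data adjustment, a minimal numpy-only sketch of one common method (quantile normalization, which forces every sample to share the same empirical distribution) gives the flavor of what is being evaluated. This is illustrative only; the three methods compared in the paper may differ:

    ```python
    import numpy as np

    def quantile_normalize(X):
        """Quantile-normalize samples (columns) of X so that every column
        shares the same empirical distribution (the mean quantile profile)."""
        ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each value within its column
        means = np.sort(X, axis=0).mean(axis=1)             # mean value at each rank across columns
        return means[ranks]

    # Two "samples" measuring the same features, with a location/scale handling effect
    rng = np.random.default_rng(0)
    base = rng.normal(size=100)
    X = np.column_stack([base + rng.normal(0, 0.1, 100),
                         2.0 * base + 5.0 + rng.normal(0, 0.1, 100)])
    Xn = quantile_normalize(X)
    # After normalization the two columns have identical sorted values
    assert np.allclose(np.sort(Xn[:, 0]), np.sort(Xn[:, 1]))
    ```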

  20. Data Handling and Processing Unit for Alphabus/Alphasat TDP-8

    NASA Astrophysics Data System (ADS)

    Habinc, Sandi; Martins, Rodolfo; Costa Pinto, Joao; Furano, Gianluca

    2011-08-01

    ESA's and Inmarsat's ARTES 8 Alphabus/Alphasat is a specific programme dedicated to the development and deployment of Alphasat. It encompasses several technology demonstration payloads (TDPs), of which the TDP8 is an Environment effects facility to monitor the GEO radiation environment and its effects on electronic components and sensors. This paper will discuss the rapid development of the processor and board for TDP8's data handling and processing unit.

  1. Maintaining data integrity in a rural clinical trial.

    PubMed

    Van den Broeck, Jan; Mackay, Melanie; Mpontshane, Nontobeko; Kany Kany Luabeya, Angelique; Chhagan, Meera; Bennish, Michael L

    2007-01-01

Clinical trials conducted in rural resource-poor settings face special challenges in ensuring quality of data collection and handling. The variable nature of these challenges, ways to overcome them, and the resulting data quality are rarely reported in the literature. Our objectives were to provide a detailed example of establishing local data handling capacity for a clinical trial conducted in a rural area, to highlight challenges and solutions in establishing such capacity, and to report the data quality obtained by the trial. We provide a descriptive case study of a data system for biological samples and questionnaire data, and the problems encountered during its implementation. To determine the quality of data, we analyzed test-retest studies using Kappa statistics of inter- and intra-observer agreement on categorical data. We calculated Technical Errors of Measurement for anthropometric measurements, performed audit trail analysis to assess error correction rates, and calculated residual error rates by database-to-source document comparison. Initial difficulties included the unavailability of experienced research nurses, programmers and data managers in this rural area and the difficulty of designing new software tools and a complex database while making them error-free. National and international collaboration and external monitoring helped ensure good data handling and implementation of good clinical practice. Data collection, fieldwork supervision and query handling depended on streamlined transport over large distances. The involvement of a community advisory board was helpful in addressing cultural issues and establishing community acceptability of data collection methods. Data accessibility for safety monitoring required special attention. Kappa values and Technical Errors of Measurement showed acceptable values. Residual error rates in key variables were low. 
The article describes the experience of a single-site trial and does not address challenges particular to multi-site trials. Obtaining and maintaining data integrity in rural clinical trials is feasible, can result in acceptable data quality and can be used to develop capacity in developing country sites. It does, however, involve special challenges and requirements.
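
    The inter-observer agreement measure mentioned above (Kappa statistics on categorical data) is simple to compute. A minimal sketch of Cohen's kappa for two raters, illustrative rather than the trial's actual analysis code:

    ```python
    import numpy as np

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two raters' categorical codings of the same items:
        observed agreement corrected for agreement expected by chance."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        po = np.mean(r1 == r2)                                            # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)       # chance agreement
        return (po - pe) / (1.0 - pe)

    # Example: 10 items coded into 3 categories by two observers
    a = [1, 1, 2, 2, 3, 3, 1, 2, 3, 1]
    b = [1, 1, 2, 2, 3, 1, 1, 2, 3, 3]
    print(round(cohens_kappa(a, b), 3))   # 0.697
    ```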

  2. Flight testing and frequency domain analysis for rotorcraft handling qualities characteristics

    NASA Technical Reports Server (NTRS)

    Ham, Johnnie A.; Gardner, Charles K.; Tischler, Mark B.

    1993-01-01

A demonstration of frequency domain flight testing techniques and analyses was performed on a U.S. Army OH-58D helicopter in support of the OH-58D Airworthiness and Flight Characteristics Evaluation and the Army's development and ongoing review of Aeronautical Design Standard 33C, Handling Qualities Requirements for Military Rotorcraft. Hover and forward flight (60 knots) tests were conducted in 1 flight hour by Army experimental test pilots. Further processing of the hover data generated a complete database of velocity, angular rate, and acceleration frequency responses to control inputs. A joint effort was then undertaken by the Airworthiness Qualification Test Directorate (AQTD) and the U.S. Army Aeroflightdynamics Directorate (AFDD) to derive handling qualities information from the frequency response database. A significant amount of information could be extracted from the frequency domain database using a variety of approaches. This report documents numerous results that have been obtained from the simple frequency domain tests; in many areas, these results provide more insight into the aircraft dynamics that affect handling qualities than do traditional flight tests. The handling qualities results include ADS-33C bandwidth and phase delay calculations, vibration spectral determinations, transfer function models to examine single axis results, and a six degree of freedom fully coupled state space model. The ability of this model to accurately predict aircraft responses was verified using data from pulse inputs. This report also documents the frequency-sweep flight test technique and data analysis used to support the tests.

  3. Guidance Counsellor Strategies for Handling Bullying

    ERIC Educational Resources Information Center

    Power-Elliott, Michleen; Harris, Gregory E.

    2012-01-01

    The purpose of this exploratory-descriptive study was to examine how guidance counsellors in the province of Newfoundland and Labrador would handle a specific verbal-relational bullying incident. Also of interest was guidance counsellor involvement and training in bullying programmes and Positive Behaviour Supports. Data for this study was…

  4. IPAD applications to the design, analysis, and/or machining of aerospace structures. [Integrated Program for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.

    1981-01-01

A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and ability to handle larger and more varied design problems is also presented.

  5. Facilitating hydrological data analysis workflows in R: the RHydro package

    NASA Astrophysics Data System (ADS)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, an HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. 
Lastly, we discuss some of the design challenges of the RHydro package, including integration with big data technologies, web technologies, and emerging data models in hydrology.

  6. Quinone-induced protein handling changes: Implications for major protein handling systems in quinone-mediated toxicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Rui; Siegel, David; Ross, David, E-mail: david.ross@ucdenver.edu

    2014-10-15

Para-quinones such as 1,4-benzoquinone (BQ) and menadione (MD), and ortho-quinones including the oxidation products of catecholamines, are derived from xenobiotics as well as endogenous molecules. The effects of quinones on major protein handling systems in cells (the 20/26S proteasome, the ER stress response, autophagy, chaperone proteins and aggresome formation) have not been investigated in a systematic manner. Both BQ and aminochrome (AC) inhibited proteasomal activity and activated the ER stress response and autophagy in rat dopaminergic N27 cells. AC also induced aggresome formation while MD had little effect on any protein handling systems in N27 cells. The effect of NQO1 on quinone-induced protein handling changes and toxicity was examined using N27 cells stably transfected with NQO1 to generate an isogenic NQO1-overexpressing line. NQO1 protected against BQ-induced apoptosis but led to a potentiation of AC- and MD-induced apoptosis. Modulation of quinone-induced apoptosis in N27 and NQO1-overexpressing cells correlated only with changes in the ER stress response and not with changes in other protein handling systems. These data suggested that NQO1 modulated the ER stress response to potentiate toxicity of AC and MD, but protected against BQ toxicity. We further demonstrated that NQO1-mediated reduction to unstable hydroquinones and subsequent redox cycling was important for the activation of the ER stress response and toxicity for both AC and MD. In summary, our data demonstrate that quinone-specific changes in protein handling are evident in N27 cells and the induction of the ER stress response is associated with quinone-mediated toxicity. Highlights: Unstable hydroquinones contributed to quinone-induced ER stress and toxicity.

  7. Tickling, a Technique for Inducing Positive Affect When Handling Rats.

    PubMed

    Cloutier, Sylvie; LaFollette, Megan R; Gaskill, Brianna N; Panksepp, Jaak; Newberry, Ruth C

    2018-05-08

    Handling small animals such as rats can lead to several adverse effects. These include the fear of humans, resistance to handling, increased injury risk for both the animals and the hands of their handlers, decreased animal welfare, and less valid research data. To minimize negative effects on experimental results and human-animal relationships, research animals are often habituated to being handled. However, the methods of habituation are highly variable and often of limited effectiveness. More potently, it is possible for humans to mimic aspects of the animals' playful rough-and-tumble behavior during handling. When applied to laboratory rats in a systematic manner, this playful handling, referred to as tickling, consistently gives rise to positive behavioral responses. This article provides a detailed description of a standardized rat tickling technique. This method can contribute to future investigations into positive affective states in animals, make it easier to handle rats for common husbandry activities such as cage changing or medical/research procedures such as injection, and be implemented as a source of social enrichment. It is concluded that this method can be used to efficiently and practicably reduce rats' fearfulness of humans and improve their welfare, as well as reliably model positive affective states.

  8. Brief early handling increases morphine dependence in adult rats.

    PubMed

    Vazquez, Vincent; Penit-Soria, Jacqueline; Durand, Claudette; Besson, Marie-Jo; Giros, Bruno; Daugé, Valérie

    2006-06-30

Short early manipulations of the rodent postnatal environment may trigger long-term effects on neurobiological and behavioural phenotypes in adulthood. However, little is known about such effects of handling on the vulnerability to develop drug dependence. The present study aimed to analyze the long-term effects of a brief handling (1 min) on morphine and ethanol dependence and on preproenkephalin (PPE) mRNA and mu opioid receptor levels. Handled rats showed a significant increase in morphine (25 mg/l) but not ethanol (10%) consumption and preference after 7 weeks and no difference in morphine (2 and 5 mg/kg) conditioned place preference. No difference in preproenkephalin mRNA and mu opioid receptor levels was detected in the mesolimbic system between the two groups. These data emphasize that brief handling by humans, which can lead to the development of morphine dependence, constitutes in itself an experimental treatment and not a control condition.

  9. Handling of thermal paper: Implications for dermal exposure to bisphenol A and its alternatives

    PubMed Central

    Bernier, Meghan R.

    2017-01-01

    Bisphenol A (BPA) is an endocrine disrupting chemical used in a wide range of consumer products including photoactive dyes used in thermal paper. Recent studies have shown that dermal absorption of BPA can occur when handling these papers. Yet, regulatory agencies have largely dismissed thermal paper as a major source of BPA exposure. Exposure estimates provided by agencies such as the European Food Safety Authority (EFSA) are based on assumptions about how humans interact with this material, stating that ‘typical’ exposures for adults involve only one handling per day for short periods of time (<1 minute), with limited exposure surfaces (three fingertips). The objective of this study was to determine how individuals handle thermal paper in one common setting: a cafeteria providing short-order meals. We observed thermal paper handling in a college-aged population (n = 698 subjects) at the University of Massachusetts’ dining facility. We find that in this setting, individuals handle receipts for an average of 11.5 min, that >30% of individuals hold thermal paper with more than three fingertips, and >60% allow the paper to touch their palm. Only 11% of the participants we observed were consistent with the EFSA model for time of contact and dermal surface area. Mathematical modeling based on handling times we measured and previously published transfer coefficients, concentrations of BPA in paper, and absorption factors indicate the most conservative estimated intake from handling thermal paper in this population is 51.1 ng/kg/day, similar to EFSA’s estimates of 59 ng/kg/day from dermal exposures. Less conservative estimates, using published data on concentrations in thermal paper and transfer rates to skin, indicate that exposures are likely significantly higher. Based on our observational data, we propose that the current models for estimating dermal BPA exposures are not consistent with normal human behavior and should be reevaluated. PMID:28570582

  10. Handling of thermal paper: Implications for dermal exposure to bisphenol A and its alternatives.

    PubMed

    Bernier, Meghan R; Vandenberg, Laura N

    2017-01-01

    Bisphenol A (BPA) is an endocrine disrupting chemical used in a wide range of consumer products including photoactive dyes used in thermal paper. Recent studies have shown that dermal absorption of BPA can occur when handling these papers. Yet, regulatory agencies have largely dismissed thermal paper as a major source of BPA exposure. Exposure estimates provided by agencies such as the European Food Safety Authority (EFSA) are based on assumptions about how humans interact with this material, stating that 'typical' exposures for adults involve only one handling per day for short periods of time (<1 minute), with limited exposure surfaces (three fingertips). The objective of this study was to determine how individuals handle thermal paper in one common setting: a cafeteria providing short-order meals. We observed thermal paper handling in a college-aged population (n = 698 subjects) at the University of Massachusetts' dining facility. We find that in this setting, individuals handle receipts for an average of 11.5 min, that >30% of individuals hold thermal paper with more than three fingertips, and >60% allow the paper to touch their palm. Only 11% of the participants we observed were consistent with the EFSA model for time of contact and dermal surface area. Mathematical modeling based on handling times we measured and previously published transfer coefficients, concentrations of BPA in paper, and absorption factors indicate the most conservative estimated intake from handling thermal paper in this population is 51.1 ng/kg/day, similar to EFSA's estimates of 59 ng/kg/day from dermal exposures. Less conservative estimates, using published data on concentrations in thermal paper and transfer rates to skin, indicate that exposures are likely significantly higher. Based on our observational data, we propose that the current models for estimating dermal BPA exposures are not consistent with normal human behavior and should be reevaluated.

  11. Should multiple imputation be the method of choice for handling missing data in randomized trials?

    PubMed Central

    Sullivan, Thomas R; White, Ian R; Salter, Amy B; Ryan, Philip; Lee, Katherine J

    2016-01-01

    The use of multiple imputation has increased markedly in recent years, and journal reviewers may expect to see multiple imputation used to handle missing data. However in randomized trials, where treatment group is always observed and independent of baseline covariates, other approaches may be preferable. Using data simulation we evaluated multiple imputation, performed both overall and separately by randomized group, across a range of commonly encountered scenarios. We considered both missing outcome and missing baseline data, with missing outcome data induced under missing at random mechanisms. Provided the analysis model was correctly specified, multiple imputation produced unbiased treatment effect estimates, but alternative unbiased approaches were often more efficient. When the analysis model overlooked an interaction effect involving randomized group, multiple imputation produced biased estimates of the average treatment effect when applied to missing outcome data, unless imputation was performed separately by randomized group. Based on these results, we conclude that multiple imputation should not be seen as the only acceptable way to handle missing data in randomized trials. In settings where multiple imputation is adopted, we recommend that imputation is carried out separately by randomized group. PMID:28034175

  13. Handling Data Skew in MapReduce Cluster by Using Partition Tuning

    PubMed

    Gao, Yufei; Zhou, Yanjie; Zhou, Bing; Shi, Lei; Zhang, Jiacai

    2017-01-01

The healthcare industry has generated large amounts of data, and analyzing these has emerged as an important problem in recent years. The MapReduce programming model has been successfully used for big data analytics. However, data skew invariably occurs in big data analytics and seriously affects efficiency. To overcome the data skew problem in MapReduce, we previously proposed a data processing algorithm called Partition Tuning-based Skew Handling (PTSH). In comparison with the one-stage partitioning strategy used in the traditional MapReduce model, PTSH uses a two-stage strategy and the partition tuning method to disperse key-value pairs in virtual partitions and recombines each partition in case of data skew. The robustness and efficiency of the proposed algorithm were tested on a wide variety of simulated datasets and real healthcare datasets. The results showed that the PTSH algorithm can handle data skew in MapReduce efficiently and improve the performance of MapReduce jobs in comparison with native Hadoop, Closer, and locality-aware and fairness-aware key partitioning (LEEN). We also found that the time needed for rule extraction can be reduced significantly by adopting the PTSH algorithm, since it is more suitable for association rule mining (ARM) on healthcare data. © 2017 Yufei Gao et al.
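The two-stage idea behind PTSH can be sketched in a few lines of Python. This is not the authors' implementation, only a simplified illustration: stage 1 disperses each key's pairs round-robin over many small virtual partitions so a hot key cannot overload any single one, and stage 2 ("partition tuning") greedily recombines virtual partitions onto reducers to balance load.

```python
import hashlib
from collections import Counter

def stable_hash(key):
    # deterministic hash (Python's built-in str hash is salted per process)
    return int(hashlib.md5(str(key).encode()).hexdigest(), 16)

def disperse(pair_counts, num_virtual):
    """Stage 1: spread each key's pairs round-robin over virtual partitions."""
    loads = Counter()
    for key, n in pair_counts.items():
        base = stable_hash(key)
        q, r = divmod(n, num_virtual)
        for vp in range(num_virtual):
            loads[(base + vp) % num_virtual] += q + (1 if vp < r else 0)
    return loads

def recombine(virtual_loads, num_reducers):
    """Stage 2: greedily pack virtual partitions onto reducers, largest first."""
    reducer_load = [0] * num_reducers
    assignment = {}
    for vp, load in sorted(virtual_loads.items(), key=lambda kv: -kv[1]):
        r = min(range(num_reducers), key=reducer_load.__getitem__)
        assignment[vp] = r
        reducer_load[r] += load
    return assignment, reducer_load

# skewed workload: one hot key carries 90% of the records
counts = {"hot": 9000, **{f"k{i}": 10 for i in range(100)}}

one_stage = [0, 0]                  # classic MapReduce: hash(key) % reducers
for key, n in counts.items():
    one_stage[stable_hash(key) % 2] += n

_, two_stage = recombine(disperse(counts, num_virtual=32), num_reducers=2)
```

With one-stage hashing, the reducer that receives the hot key processes at least 9,000 records; the two-stage split keeps the reducers near an even share of the total.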

  16. Classified and clustered data constellation: An efficient approach of 3D urban data management

    NASA Astrophysics Data System (ADS)

    Azri, Suhaibah; Ujang, Uznir; Castro, Francesc Antón; Rahman, Alias Abdul; Mioc, Darka

    2016-03-01

The growth of urban areas has resulted in massive urban datasets and difficulties in handling and managing them. Huge datasets can degrade data retrieval and information analysis performance. In addition, the urban environment is very difficult to manage because it involves various types of data, such as multiple zoning themes in the case of urban mixed-use development. Thus, a special technique for efficient handling and management of urban data is necessary. This paper proposes a structure called Classified and Clustered Data Constellation (CCDC) for urban data management. CCDC operates on the basis of two filters: classification and clustering. To boost the performance of information retrieval, CCDC offers a minimal percentage of overlap among nodes and coverage area to avoid repetitive data entry and multipath queries. The results of tests conducted on several urban mixed-use development datasets using CCDC verify that it efficiently retrieves their semantic and spatial information. Further, comparisons conducted between CCDC and existing clustering and data constellation techniques, from the aspect of preservation of minimal overlap and coverage, confirm that the proposed structure is capable of preserving the minimum overlap and coverage area among nodes. Our overall results indicate that CCDC is efficient in handling and managing urban data, especially for urban mixed-use development applications.
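Why minimizing node overlap speeds up retrieval can be seen with a toy bounding-box example (illustrative coordinates, not the CCDC implementation): when objects are grouped by theme and location, index nodes cover disjoint regions and a point query descends into one node; an arbitrary split produces overlapping nodes, forcing multipath queries.

```python
def bbox(points):
    xs, ys = zip(*points)
    return min(xs), min(ys), max(xs), max(ys)

def overlap_area(a, b):
    # intersection of two axis-aligned boxes (xmin, ymin, xmax, ymax)
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

# two thematically classified, spatially coherent groups of objects
residential = [(0, 0), (1, 2), (2, 1), (1, 1)]
commercial = [(8, 8), (9, 10), (10, 9), (9, 9)]
clustered = overlap_area(bbox(residential), bbox(commercial))

# an arbitrary split of the same objects mixes the zones, so the two
# index nodes cover overlapping regions
mixed_a = residential[:2] + commercial[:2]
mixed_b = residential[2:] + commercial[2:]
arbitrary = overlap_area(bbox(mixed_a), bbox(mixed_b))
```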

  17. Brunn: an open source laboratory information system for microplates with a graphical plate layout design process.

    PubMed

    Alvarsson, Jonathan; Andersson, Claes; Spjuth, Ola; Larsson, Rolf; Wikberg, Jarl E S

    2011-05-20

Compound profiling and drug screening generate large amounts of data and are generally based on microplate assays. Current information systems used for handling these data are mainly commercial, closed source, expensive, and heavyweight, and there is a need for a flexible, lightweight, open system for handling plate design, validation, and preparation of data. A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step plate layout point-and-click interface was implemented inside Bioclipse. The system contains a data validation step, where outliers can be removed, and finally a plate report with all relevant calculated data, including dose-response curves. Brunn is capable of handling the data from microplate assays. It can create dose-response curves and calculate IC50 values. Using a system of this sort facilitates work in the laboratory. Being able to reuse already constructed plates and plate layouts by starting out from an earlier step in the plate layout design process saves time and cuts down on error sources.
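The IC50 values that Brunn computes have a simple underlying definition: the dose at which the response falls to 50% of the untreated control. A minimal numpy illustration of that quantity follows, using log-linear interpolation rather than the full four-parameter logistic fit a real analysis would use; the doses and responses are made up.

```python
import numpy as np

def ic50(doses, responses):
    """Estimate IC50 by interpolating response against log10(dose):
    the dose at which the response crosses 50% of control."""
    order = np.argsort(doses)
    d = np.asarray(doses, float)[order]
    r = np.asarray(responses, float)[order]
    # responses decrease with dose, so reverse both arrays to give
    # np.interp the increasing x-axis it requires
    return 10 ** np.interp(50.0, r[::-1], np.log10(d)[::-1])

doses = [0.01, 0.1, 1.0, 10.0, 100.0]       # e.g. concentration in uM
responses = [95.0, 80.0, 50.0, 20.0, 5.0]   # % of untreated control
```

Here the response crosses 50% at a dose of 1.0, so `ic50(doses, responses)` returns 1.0.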

  18. Handling of Adolescent Rats Improves Learning and Memory and Decreases Anxiety

    PubMed Central

    Costa, Rafaela; Tamascia, Mariana L; Nogueira, Marie D; Casarini, Dulce E; Marcondes, Fernanda K

    2012-01-01

    Some environmental interventions can result in physiologic and behavioral changes in laboratory animals. In this context, the handling of adolescent or adult rodents has been reported to influence exploratory behavior and emotionality. Here we examined the effects of handling on memory and anxiety levels of adolescent rats. Male Sprague–Dawley rats (age, 60 d) were divided into a control group and a handled group, which were handled for 5 min daily, 5 d per week, for 6 wk. During handling bouts, the rat was removed from its cage, placed in the experimenter's lap or on the top of a table, and had its neck and back gently stroked by the experimenter's fingers. During week 6, each rat's anxiety level was evaluated in the elevated plus-maze (EPM) test. Learning and memory were evaluated 48 h later, by measuring escape latency in the elevated plus-maze test. Plasma corticosterone and catecholamine levels were measured also. Norepinephrine levels were lower in the handled rats compared with control animals, with no differences in epinephrine and corticosterone. As compared with the control rats, the handled rats showed increases in the percentage of time spent in the open arms of the test apparatus, percentage of entries into open arms, and number of visits to the end of the open arms and decreases in the latency of the first open arm entry. Escape latency was lower in the handled rats compared with control rats in both the first and second trials. The data obtained suggest that handling decreases anxiety levels and improves learning skills and memory in rats. PMID:23312082

  19. Theoretical Insights for Practical Handling of Pressurized Fluids

    ERIC Educational Resources Information Center

    Aranda, Alfonso; Rodriguez, Maria del Prado

    2006-01-01

The practical scenarios discussed in a chemistry or chemical engineering course that use solid or liquid reactants are presented. Important ideas to be considered when handling pressurized fluids are provided, and three typical examples are described to enable students to develop secondary skills such as the selective search of data, identification of…

  20. LIMITATIONS ON THE USES OF MULTIMEDIA EXPOSURE MEASUREMENTS FOR MULTIPATHWAY EXPOSURE ASSESSMENT - PART I: HANDLING OBSERVATIONS BELOW DETECTION LIMITS

    EPA Science Inventory

    Multimedia data from two probability-based exposure studies were investigated in terms of how censoring of non-detects affected estimation of population parameters and associations. Appropriate methods for handling censored below-detection-limit (BDL) values in this context were...
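One common (if crude) way to handle censored below-detection-limit values is to substitute a fraction of the detection limit. The toy sketch below uses made-up numbers to show why simply discarding non-detects biases a mean upward, while DL/2 substitution can land much closer to the truth; more principled choices include maximum-likelihood and Kaplan-Meier estimators.

```python
detection_limit = 1.0
true_values = [0.2, 0.5, 0.8, 1.5, 2.0, 3.0]
observed = [v if v >= detection_limit else None for v in true_values]

# discarding non-detects keeps only the high values, biasing the mean upward
detects = [v for v in observed if v is not None]
mean_dropped = sum(detects) / len(detects)

# simple substitution: replace each non-detect with DL/2
substituted = [v if v is not None else detection_limit / 2 for v in observed]
mean_substituted = sum(substituted) / len(substituted)

mean_true = sum(true_values) / len(true_values)
```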

  1. The current practice of handling and reporting missing outcome data in eight widely used PROMs in RCT publications: a review of the current literature.

    PubMed

    Rombach, Ines; Rivero-Arias, Oliver; Gray, Alastair M; Jenkinson, Crispin; Burke, Órlaith

    2016-07-01

    Patient-reported outcome measures (PROMs) are designed to assess patients' perceived health states or health-related quality of life. However, PROMs are susceptible to missing data, which can affect the validity of conclusions from randomised controlled trials (RCTs). This review aims to assess current practice in the handling, analysis and reporting of missing PROMs outcome data in RCTs compared to contemporary methodology and guidance. This structured review of the literature includes RCTs with a minimum of 50 participants per arm. Studies using the EQ-5D-3L, EORTC QLQ-C30, SF-12 and SF-36 were included if published in 2013; those using the less commonly implemented HUI, OHS, OKS and PDQ were included if published between 2009 and 2013. The review included 237 records (4-76 per relevant PROM). Complete case analysis and single imputation were commonly used in 33 and 15 % of publications, respectively. Multiple imputation was reported for 9 % of the PROMs reviewed. The majority of publications (93 %) failed to describe the assumed missing data mechanism, while low numbers of papers reported methods to minimise missing data (23 %), performed sensitivity analyses (22 %) or discussed the potential influence of missing data on results (16 %). Considerable discrepancy exists between approved methodology and current practice in handling, analysis and reporting of missing PROMs outcome data in RCTs. Greater awareness is needed for the potential biases introduced by inappropriate handling of missing data, as well as the importance of sensitivity analysis and clear reporting to enable appropriate assessments of treatment effects and conclusions from RCTs.

  2. Results of Database Studies in Spine Surgery Can Be Influenced by Missing Data.

    PubMed

    Basques, Bryce A; McLynn, Ryan P; Fice, Michael P; Samuel, Andre M; Lukasiewicz, Adam M; Bohl, Daniel D; Ahn, Junyoung; Singh, Kern; Grauer, Jonathan N

    2017-12-01

    National databases are increasingly being used for research in spine surgery; however, one limitation of such databases that has received sparse mention is the frequency of missing data. Studies using these databases often do not emphasize the percentage of missing data for each variable used and do not specify how patients with missing data are incorporated into analyses. This study uses the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database to examine whether different treatments of missing data can influence the results of spine studies. (1) What is the frequency of missing data fields for demographics, medical comorbidities, preoperative laboratory values, operating room times, and length of stay recorded in ACS-NSQIP? (2) Using three common approaches to handling missing data, how frequently do those approaches agree in terms of finding particular variables to be associated with adverse events? (3) Do different approaches to handling missing data influence the outcomes and effect sizes of an analysis testing for an association with these variables with occurrence of adverse events? Patients who underwent spine surgery between 2005 and 2013 were identified from the ACS-NSQIP database. A total of 88,471 patients undergoing spine surgery were identified. The most common procedures were anterior cervical discectomy and fusion, lumbar decompression, and lumbar fusion. Demographics, comorbidities, and perioperative laboratory values were tabulated for each patient, and the percent of missing data was noted for each variable. These variables were tested for an association with "any adverse event" using three separate multivariate regressions that used the most common treatments for missing data. In the first regression, patients with any missing data were excluded. 
In the second regression, missing data were treated as a negative or "reference" value; for continuous variables, the mean of each variable's reference range was computed and imputed. In the third regression, any variables with > 10% rate of missing data were removed from the regression; among variables with ≤ 10% missing data, individual cases with missing values were excluded. The results of these regressions were compared to determine how the different treatments of missing data could affect the results of spine studies using the ACS-NSQIP database. Of the 88,471 patients, as many as 4441 (5%) had missing elements among demographic data, 69,184 (72%) among comorbidities, 70,892 (80%) among preoperative laboratory values, and 56,551 (64%) among operating room times. Considering the three different treatments of missing data, we found different risk factors for adverse events. Of 44 risk factors found to be associated with adverse events in any analysis, only 15 (34%) of these risk factors were common among the three regressions. The second treatment of missing data (assuming "normal" value) found the most risk factors (40) to be associated with any adverse event, whereas the first treatment (deleting patients with missing data) found the fewest associations at 20. Among the risk factors associated with any adverse event, the 10 with the greatest effect size (odds ratio) by each regression were ranked. Of the 15 variables in the top 10 for any regression, six of these were common among all three lists. Differing treatments of missing data can influence the results of spine studies using the ACS-NSQIP. The current study highlights the importance of considering how such missing data are handled. Until there are better guidelines on the best approaches to handle missing data, investigators should report how missing data were handled to increase the quality and transparency of orthopaedic database research. 
Readers of large database studies should note whether handling of missing data was addressed and consider potential bias with high rates or unspecified or weak methods for handling missing data.
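The three treatments of missing data compared in the study above are easy to state concretely. The pandas sketch below applies them to a small hypothetical table (column names and values are invented for illustration, not taken from ACS-NSQIP): complete-case exclusion, reference-value imputation, and dropping variables with more than 10% missingness before excluding remaining incomplete rows.

```python
import numpy as np
import pandas as pd

# hypothetical patient table with different rates of missingness per column
df = pd.DataFrame({
    "age":       [60, 55, np.nan, 70, 65, 58, 62, 67, 59, 61],          # 10% missing
    "albumin":   [3.9, np.nan, np.nan, np.nan, np.nan, 3.5, np.nan,
                  4.1, np.nan, np.nan],                                  # 70% missing
    "asa_class": [2, 3, 2, np.nan, 3, 2, 2, 3, 2, 2],                    # 10% missing
})

# Treatment 1: complete-case analysis: drop any row with missing data
cc = df.dropna()

# Treatment 2: impute a reference value (here the column mean stands in
# for the midpoint of a normal reference range)
ref = df.fillna(df.mean())

# Treatment 3: drop variables with >10% missingness, then drop rows still
# missing among the remaining variables
keep = df.columns[df.isna().mean() <= 0.10]
hybrid = df[keep].dropna()
```

The three analysis datasets differ sharply: complete-case analysis keeps only 3 of 10 rows, reference imputation keeps all 10, and the hybrid keeps 8 rows but loses the high-missingness variable entirely, which is exactly why the three regressions in the study can disagree on risk factors.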

  3. DoD Electronic Data Interchange (EDI) Convention: ASC X12 Transaction Set 832 Price Sales Catalog (Version 003030)

    DTIC Science & Technology

    1992-12-01

DATA DES. ELEMENT NAME ATTRIBUTES Conditional TD401 152 Special Handling Code C ID 2/3 Code specifying special transportation handling instructions. HAN... Executive Agent for Electronic Commerce/Electronic Data Interchange/Protection of Logistics Unclassified/Sensitive Systems. Executive Agent for EC/EDI... PRICE/SALES CATALOG ANSI ASC X12 VERSION/RELEASE 003030 DOD 7 Communications Transport Protocol ISA / Interchange Control Header GS / Functional Group Header

  4. Payload/orbiter signal-processing and data-handling system evaluation

    NASA Technical Reports Server (NTRS)

    Teasdale, W. E.; Polydoros, A.

    1980-01-01

Incompatibilities between orbiter subsystems and payload communication systems are identified to assure that acceptable end-to-end system performance will be achieved. The potential incompatibilities are associated with either payloads in the cargo bay or detached payloads communicating with the orbiter via an RF link. The payload signal processing and data handling systems are assessed by investigating interface problems experienced between the inertial upper stage and the orbiter, since similar problems are expected for other payloads.

  5. Recent progress towards predicting aircraft ground handling performance

    NASA Technical Reports Server (NTRS)

    Yager, T. J.; White, E. J.

    1981-01-01

    Capability implemented in simulating aircraft ground handling performance is reviewed and areas for further expansion and improvement are identified. Problems associated with providing necessary simulator input data for adequate modeling of aircraft tire/runway friction behavior are discussed and efforts to improve tire/runway friction definition, and simulator fidelity are described. Aircraft braking performance data obtained on several wet runway surfaces are compared to ground vehicle friction measurements. Research to improve methods of predicting tire friction performance are discussed.

  6. Preprocessing Structured Clinical Data for Predictive Modeling and Decision Support

    PubMed Central

    Oliveira, Mónica Duarte; Janela, Filipe; Martins, Henrique M. G.

    2016-01-01

    Summary Background EHR systems have high potential to improve healthcare delivery and management. Although structured EHR data generates information in machine-readable formats, their use for decision support still poses technical challenges for researchers due to the need to preprocess and convert data into a matrix format. During our research, we observed that clinical informatics literature does not provide guidance for researchers on how to build this matrix while avoiding potential pitfalls. Objectives This article aims to provide researchers a roadmap of the main technical challenges of preprocessing structured EHR data and possible strategies to overcome them. Methods Along standard data processing stages – extracting database entries, defining features, processing data, assessing feature values and integrating data elements, within an EDPAI framework –, we identified the main challenges faced by researchers and reflect on how to address those challenges based on lessons learned from our research experience and on best practices from related literature. We highlight the main potential sources of error, present strategies to approach those challenges and discuss implications of these strategies. Results Following the EDPAI framework, researchers face five key challenges: (1) gathering and integrating data, (2) identifying and handling different feature types, (3) combining features to handle redundancy and granularity, (4) addressing data missingness, and (5) handling multiple feature values. Strategies to address these challenges include: cross-checking identifiers for robust data retrieval and integration; applying clinical knowledge in identifying feature types, in addressing redundancy and granularity, and in accommodating multiple feature values; and investigating missing patterns adequately. Conclusions This article contributes to literature by providing a roadmap to inform structured EHR data preprocessing. 
It may advise researchers on potential pitfalls and implications of methodological decisions in handling structured data, so as to avoid biases and help realize the benefits of the secondary use of EHR data. PMID:27924347
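Challenge (5) above, handling multiple feature values per patient, is the step that most often surprises newcomers to structured EHR data. A small pandas sketch (with invented field names and values) shows one common resolution: keep the most recent observation per patient and feature, then pivot the long-format extract into the one-row-per-patient matrix predictive models expect.

```python
import pandas as pd

# long-format EHR extract: one row per observation (hypothetical fields)
obs = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "feature":    ["glucose", "glucose", "sodium", "glucose", "sodium"],
    "value":      [5.1, 6.3, 140.0, 7.8, 138.0],
    "time":       pd.to_datetime(["2020-01-01", "2020-01-03", "2020-01-02",
                                  "2020-01-01", "2020-01-01"]),
})

# resolve multiple values per (patient, feature) by keeping the most recent,
# then pivot to a feature matrix with one row per patient
latest = (obs.sort_values("time")
             .groupby(["patient_id", "feature"], as_index=False).last())
matrix = latest.pivot(index="patient_id", columns="feature", values="value")
```

Other aggregations (first, mean, worst value in a window) encode different clinical assumptions, which is why the choice should be made with clinical knowledge rather than by default.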

  7. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben

    2014-01-01

    This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend analysis to beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is to how to 'close the loop' with an overall design process, and options for the exploration of how to feedback handling qualities results to a conceptual design process are proposed for future work.

  8. Differences in Muscle Activity During Cable Resistance Training Are Influenced by Variations in Handle Types.

    PubMed

    Rendos, Nicole K; Heredia Vargas, Héctor M; Alipio, Taislaine C; Regis, Rebeca C; Romero, Matthew A; Signorile, Joseph F

    2016-07-01

    Rendos, NK, Heredia Vargas, HM, Alipio, TC, Regis, RC, Romero, MA, and Signorile, JF. Differences in muscle activity during cable resistance training are influenced by variations in handle types. J Strength Cond Res 30(7): 2001-2009, 2016-There has been a recent resurgence in the use of cable machines for resistance training allowing movements that more effectively simulate daily activities and sports-specific movements. By necessity, these devices require a machine/human interface through some type of handle. Considerable data from material handling, industrial engineering, and exercise training studies indicate that handle qualities, especially size and shape, can significantly influence force production and muscular activity, particularly of the forearm muscles, which affect the critical link in activities that require object manipulation. The purpose for this study was to examine the influence of three different handle conditions: standard handle (StandH), ball handle with the cable between the index and middle fingers (BallIM), and ball handle with the cable between the middle and ring fingers (BallMR), on activity levels (rmsEMG) of the triceps brachii lateral and long heads (TriHLat, TriHLong), brachioradialis (BR), flexor carpi radialis (FCR), extensor carpi ulnaris, and extensor digitorum (ED) during eight repetitions of standing triceps pushdown performed from 90° to 0° elbow flexion at 1.5 s per contractile stage. Handle order was randomized. No significant differences were seen for triceps or BR rmsEMG across handle conditions; however, relative patterns of activation did vary for the forearm muscles by handle condition, with more coordinated activation levels for the FCR and ED during the ball handle conditions. In addition, the rmsEMG for the ED was significantly higher during the BallIM than any other condition and during the BallMR than the StandH. 
These results indicate that the use of ball handles with the cable passing between different fingers can vary the utilization patterns of selected forearm muscles and may therefore be advantageous for coaches, personal trainers, therapists, or bodybuilders for targeted training or rehabilitation of these muscles.

  9. Rotorcraft handling-qualities design criteria development

    NASA Technical Reports Server (NTRS)

    Aiken, Edwin W.; Lebacqz, J. Victor; Chen, Robert T. N.; Key, David L.

    1988-01-01

Joint NASA/Army efforts at the Ames Research Center to develop rotorcraft handling-qualities design criteria began in earnest in 1975. Notable results were the UH-1H VSTOLAND variable stability helicopter, the VFA-2 camera-and-terrain-board simulator visual system, and the generic helicopter real-time mathematical model, ARMCOP. An initial series of handling-qualities studies was conducted to assess the effects of rotor design parameters, interaxis coupling, and various levels of stability and control augmentation. The ability to conduct in-flight handling-qualities research was enhanced by the development of the NASA/Army CH-47 variable-stability helicopter. Research programs conducted using this vehicle include vertical-response investigations, hover augmentation systems, and the effects of control-force characteristics. The handling-qualities data base was judged to be sufficient to allow an update of the military helicopter handling-qualities specification, MIL-H-8501. These efforts included not only the in-house experimental work but also contracted research and collaborative programs performed under the auspices of various international agreements. The report concludes by reviewing the topics that are currently most in need of work, and the plans for addressing these topics.

  10. The relationship between emotional intelligence competencies and preferred conflict-handling styles.

    PubMed

    Morrison, Jeanne

    2008-11-01

The purpose of this study was to determine if a relationship exists between emotional intelligence (EI) and preferred conflict-handling styles of registered nurses. Conflict cannot be eliminated from the workplace; therefore, learning appropriate conflict-handling skills is important. Ninety-four registered nurses working in three south Mississippi healthcare facilities participated in this quantitative study. Ninety-two valid sets of data instruments were collected for this study. Higher levels of EI positively correlated with collaborating and negatively with accommodating. The issue of occupational stress and conflict among nurses is a major concern. It is imperative that nurses learn how to effectively handle conflict in the work environment. Developing the competencies of EI and understanding how to effectively handle conflict is necessary for nurses working in a highly stressful occupation. Effective leadership management includes conflict management and collaboration. The art of relationship management is necessary when handling other people's emotions. When conflict is approached with high levels of EI, it creates an opportunity for learning effective interpersonal skills. Understanding how EI levels and conflict skills correlate can be used to improve interpersonal relationships in a healthcare facility.

  11. Summary of the effects of engine throttle response on airplane formation-flying qualities

    NASA Technical Reports Server (NTRS)

    Walsh, Kevin R.

    1993-01-01

    A flight evaluation was conducted to determine the effect of engine throttle response characteristics on precision formation-flying qualities. A variable electronic throttle control system was developed and flight-tested on a TF-104G airplane with a J79-11B engine at the NASA Dryden Flight Research Facility. This airplane was chosen because of its known, very favorable thrust response characteristics. Ten research flights were flown to evaluate the effects of throttle gain, time delay, and fuel control rate limiting on engine handling qualities during a demanding precision wing formation task. Handling quality effects of lag filters and lead compensation time delays were also evaluated. The Cooper and Harper Pilot Rating Scale was used to assign levels of handling quality. Data from pilot ratings and comments indicate that throttle control system time delays and rate limits cause significant degradations in handling qualities. Threshold values for satisfactory (level 1) and adequate (level 2) handling qualities of these key variables are presented. These results may provide engine manufacturers with guidelines to assure satisfactory handling qualities in future engine designs.

  12. Factors affecting vaccine handling and storage practices among immunization service providers in Ibadan, Oyo State, Nigeria.

    PubMed

    Dairo, David M; Osizimete, Oyarebu E

    2016-06-01

Improper handling has been identified as one of the major reasons for the decline in vaccine potency at the time of administration. Loss of potency becomes evident when immunised individuals contract the diseases the vaccines were meant to prevent. This cross-sectional study assessed the factors associated with vaccine handling and storage practices. Three-stage sampling was used to recruit 380 vaccine handlers from 273 health facilities in 11 Local Government Areas in Ibadan. Data were analysed using SPSS version 16. Seventy-three percent were aware of vaccine handling and storage guidelines, with 68.4% having ever read such guidelines. Only 15.3% had read a guideline less than 1 month prior to the study. About 65.0% had received training on vaccine management. Incorrect handling practices reported included storing injections with vaccines (13.7%) and maintaining vaccine temperature using ice blocks (7.6%). About 43.0% had good knowledge of vaccine management, while 66.1% had good vaccine management practices. Respondents who had good knowledge of vaccine handling and storage [OR=10.0, 95%CI (5.28 - 18.94), p < 0.001] and had received formal training on vaccine management [OR=5.3, 95%CI (2.50 - 11.14), p < 0.001] were more likely to have good vaccine handling and storage practices. Regular training is recommended to enhance vaccine handling and storage practices.

  13. Physiological and subjective measures of workload when shovelling with a conventional and two-handled ('levered') shovel.

    PubMed

    Bridger, R S; Cabion, N; Goedecke, J; Rickard, S; Schabort, E; Westgarth-Taylor, C; Lambert, M I

    1997-11-01

Previous studies have suggested that the two-handled (levered) shovel has a biomechanical advantage over the conventional spade. The aim of this experiment was to determine whether less energy was consumed while shovelling a load of sand with this shovel compared with a conventional tool. Accordingly, an experiment was designed in which subjects (n = 10) shovelled 1815 kg of sand under laboratory conditions using either a conventional or a levered shovel. Heart rate and oxygen consumption were measured continuously during the trial, and subjective data on perceived exertion, general fatigue and body discomfort were recorded after the trial. Although total energy expenditure was similar under both conditions (120 +/- 20 and 125 +/- 25 kcal; conventional versus two-handled spade), average heart rate was 4% higher when the two-handled shovel was used (p < 0.05). In addition, the mass of sand per scoop was 4% less with the two-handled shovel (p < 0.05). In conclusion, subjects expended similar energy to shovel 1815 kg of sand with the conventional shovel and the two-handled tool despite the lower mass of sand per scoop with the latter. This can be explained by the fact that the increased mass of the additional handle compensated for the lower mass of sand per scoop. The higher average heart rate while shovelling with the two-handled shovel can be explained by the more erect posture.

  14. Framing moving and handling as a complex healthcare intervention within the acute care of older people with osteoporosis: a qualitative study.

    PubMed

    Smith, Margaret Coulter; O'May, Fiona; Tropea, Savina; Berg, Jackie

    2016-10-01

To investigate healthcare staff's views and experiences of caring for older hospitalised adults (aged 60+) with osteoporosis focusing on moving and handling. Specific objectives were to explore the composition of manual handling risk assessments and interventions in osteoporosis. Osteoporosis is a skeletal disease that reduces bone density and causes increased fracture risk. Incidence rises with age and osteoporotic fractures cause increased morbidity and mortality. It is a major global health problem. In the UK older hospitalised adults are normally screened for falls risk but not necessarily for osteoporosis. As presentation of osteoporosis is normally silent until fractures are evident, it is frequently undiagnosed. Healthcare staff's knowledge of osteoporosis is often suboptimal and specific manual handling implications are under-researched. An exploratory qualitative content analysis research design informed by critical realism was used. The purposive sample comprised 26 nursing and allied health professionals. Semi-structured interviews addressed topics including knowledge of osteoporosis, implications for acute care, moving and handling and clinical guidelines. Data were analysed using qualitative content analysis. Awareness of osteoporosis prevalence in older populations varies and implications for nursing are indistinct to nonspecialists. In-hospital fractures potentially linked to suboptimal moving and handling seemed rare, but prospective studies are needed. Categories of 'Understanding moving and handling as routine care or as a healthcare intervention', with further categories 'healthcare practitioners' capacities and capabilities for dealing with people with osteoporosis' and 'the structural and organisational context for moving and handling' are reported alongside safety, frailty and dependency dimensions. This study informs moving and handling in higher risk groups such as osteoporosis. 
Clinical knowledge/expertise is required when adapting generic manual handling guidelines to specific patients/contexts. Patients' experiences of moving and handling have received limited attention. Increased focus on musculoskeletal conditions and moving and handling implications is required. © 2016 The Authors. Journal of Clinical Nursing Published by John Wiley & Sons Ltd.

  15. Effects of early human handling on the pain sensitivity of young lambs.

    PubMed

    Guesgen, Mirjam J; Beausoleil, Ngaio J; Stewart, Mairi

    2013-01-01

    Pain sensitivity of lambs changes over the first weeks of life. However, the effects of early treatments such as human handling on pain sensitivity are unknown for this species. This study investigated the effects of regular early gentle human handling on the pain sensitivity of lambs, indicated by their behavioural responses to tail docking. Prospective part-blinded experimental study. Twenty-nine singleton Coopworth lambs (females n=14, males n=15). Starting at one day of age, lambs were either handled twice daily for 2 weeks (Handled), were kept in the presence of lambs who were being handled but were not handled themselves (Presence), or were exposed to a human only during routine feeding and care (Control). At 3 weeks of age, all lambs were tail docked using rubber rings. Changes in behaviour due to docking were calculated and change data were analyzed using two-way ANOVA with treatment and test pen as main factors. All lambs showed significant increases in the frequency and duration of behaviours indicative of pain, including 'abnormal' behaviours, and decreases in the frequency and duration of 'normal' behaviours after docking. Handled lambs showed a smaller increase in the time spent lying abnormally after docking than did Control lambs (mean transformed change in proportion of 30 minutes spent±SE: Control 0.55±0.04; Handled 0.38±0.03; Presence 0.48±0.03; C versus H t=3.45, p=0.007). These results provide some evidence that handling early in life may reduce subsequent pain sensitivity in lambs. While the behavioural effects of handling on pain behaviour were subtle, the results suggest, at the very least, that early handling does not increase pain sensitivity in lambs and suggests there is still flexibility postnatally in the pain processing system of a precocial species. © 2012 The Authors. Veterinary Anaesthesia and Analgesia. © 2012 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesiologists.

  16. Social isolation and chronic handling alter endocannabinoid signaling and behavioral reactivity to context in adult rats

    PubMed Central

    Sciolino, Natale R.; Bortolato, Marco; Eisenstein, Sarah A.; Fu, Jin; Oveisi, Fariba; Hohmann, Andrea G.; Piomelli, Daniele

    2010-01-01

    Social deprivation in early life disrupts emotionality and attentional processes in humans. Rearing rats in isolation reproduces some of these abnormalities, which are attenuated by daily handling. However, the neurochemical mechanisms underlying these responses remain poorly understood. We hypothesized that post-weaning social isolation alters the endocannabinoid system, a neuromodulatory system that controls emotional responding. We characterized behavioral consequences of social isolation and evaluated whether handling would reverse social isolation-induced alterations in behavioral reactivity to context and the endocannabinoid system. At weaning, pups were single or group housed and concomitantly handled or not handled daily until adulthood. Rats were tested in emotionality- and attentional-sensitive behavioral assays (open field, elevated plus maze, startle and prepulse inhibition). Cannabinoid receptor densities and endocannabinoid levels were quantified in a separate group of rats. Social isolation negatively altered behavioral responding. Socially isolated rats that were handled showed fewer deficits in the open field, elevated plus maze, and prepulse inhibition tests. Social isolation produced site-specific alterations (supraoptic nucleus, ventrolateral thalamus, rostral striatum) in cannabinoid receptor densities compared to group rearing. Handling altered the endocannabinoid system in neural circuitry controlling emotional expression. Handling altered endocannabinoid content (prefrontal and piriform cortices, nucleus accumbens) and cannabinoid receptor densities (lateral globus pallidus, cingulate and piriform cortices, hippocampus) in a region-specific manner. Some effects of social isolation on the endocannabinoid system were moderated by handling. 
Isolates were unresponsive to handling-induced increases in cannabinoid receptor densities (caudal striatum, anterior thalamus), but were sensitive to handling-induced increases in endocannabinoid content (piriform cortex), compared to group-reared rats. Our findings suggest alterations in the endocannabinoid system may contribute to the abnormal isolate phenotype. Handling modifies the endocannabinoid system and behavioral reactivity to context, but surmounts only some effects of social isolation. These data implicate a pivotal role for the endocannabinoid system in stress adaptation and emotionality-related disturbances. PMID:20394803

  17. A Data Model Framework for the Characterization of a Satellite Data Handling Software

    NASA Astrophysics Data System (ADS)

    Camatto, Gianluigi; Tipaldi, Massimo; Bothmer, Wolfgang; Ferraguto, Massimo; Bruenjes, Bernhard

    2014-08-01

    This paper describes an approach for modelling the characterization and configuration data yielded when developing a Satellite Data Handling Software (DHSW). The model can then be used as an input for the preparation of the logical and physical representation of the Satellite Reference Database (SRDB) contents and the related SW suite, an essential product that not only allows information to be transferred between the different system stakeholders but also produces part of the DHSW documentation and artefacts. Special attention is given to the shaping of the general Parameter concept, which is shared by a number of different entities within a Space System.

  18. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level.

    PubMed

    Savalei, Victoria; Rhemtulla, Mijke

    2017-08-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data, that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study.
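The item-level problem the authors address can be sketched with a toy example: when scale scores are built from items, listwise scoring discards any participant who is missing even one item, while a prorated available-case score keeps them. This is a minimal illustration of the problem setting only, not of the TSML estimator itself; the data and function names are invented.

```python
# Illustrative only: shows why item-level missingness is awkward for scale
# scores; it does not implement the TSML estimator from the article.
items = [
    [4, 5, None, 3],   # participant with one missing item
    [2, 2, 3, 2],      # fully observed participant
    [None, None, 4, 5],
    [5, 4, 4, None],
]

def scale_score_listwise(row):
    """Scale score only if every item is present; otherwise drop the case."""
    return sum(row) if None not in row else None

def scale_score_prorated(row, min_items=2):
    """Available-case alternative: mean of the observed items, rescaled to
    the full scale length, if at least `min_items` items were answered."""
    obs = [x for x in row if x is not None]
    return len(row) * sum(obs) / len(obs) if len(obs) >= min_items else None

listwise = [scale_score_listwise(r) for r in items]
prorated = [scale_score_prorated(r) for r in items]
print(listwise)  # only the fully observed participant gets a score
print(prorated)  # every participant with >= 2 observed items gets one
```

Under listwise scoring, three of the four participants are lost entirely; the prorated score retains them, at the cost of an implicit missing-at-random assumption within each scale.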

  19. PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities

    PubMed Central

    2011-01-01

    Background Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. Results The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. Conclusions PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/. PMID:21385349

  20. On the derivation of a full life table from mortality data recorded in five-year age groups.

    PubMed

    Pollard, J H

    1989-01-01

    Mortality data are often gathered using 5-year age groups rather than individual years of life. Furthermore, it is common practice to use a large open-ended interval (such as 85 and over) for mortality data at the older ages. These limitations of the data pose problems for the actuary or demographer who wishes to compile a full and accurate life table using individual years of life. The author devises formulae which handle these problems. He also devises methods for handling mortality during the 1st year of life and for dealing with other technical problems which arise in the compilation of the full life table from grouped data.
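Pollard's actual formulae are not reproduced in the abstract, but the core task can be illustrated with a much simpler device: splitting a 5-year death probability into single-year probabilities under a constant-force-of-mortality assumption. This is a hypothetical simplification for illustration, not the author's method.

```python
import math

def split_five_year_q(q5):
    """Split a 5-year death probability into five equal single-year
    probabilities, assuming a constant force of mortality over the
    interval (an illustrative simplification, not Pollard's formulae)."""
    mu = -math.log(1.0 - q5) / 5.0       # constant hazard over the 5 years
    return [1.0 - math.exp(-mu)] * 5

# Surviving the five single years must reproduce the 5-year probability.
qs = split_five_year_q(0.10)
surv = 1.0
for q in qs:
    surv *= 1.0 - q
print(round(1.0 - surv, 10))  # -> 0.1
```

Real abridged-to-complete life table methods replace the constant-hazard assumption with smoother interpolation across adjacent age groups and special treatment of infancy and the open-ended interval, which is precisely what the paper's formulae address.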

  1. Teletesting at IABG - Technical Features and Security Issues

    NASA Astrophysics Data System (ADS)

    Goerner, E.

    2004-08-01

    In the space simulation department at IABG, data handling systems are used to collect, evaluate and present all data gathered from different test chambers during thermal vacuum tests. In the year 2000, a redesign of the existing data handling systems gave us the opportunity to add features such as Ethernet-based client/server systems and the internet protocol TCP/IP. The result was a set of state-of-the-art, internet-ready data handling systems. Building on this, we started in mid-2002 a new project called teletesting to give our customers remote access to test data. For its realisation, TCO (Total Cost of Ownership), QoS (Quality of Service), data confidentiality, restricted access to test data and a plain and simple user interface built from standard components, i.e. normal PC hardware and software, were mandatory. As a result of this project, our customers now have online access to their test data in CSV/EXCEL format, in display mode in either numerical or graphical form, and through DynaWorks. ISDN teletesting is already used by our customers; internet teletesting is in test mode, but some parts have already been approved and used. Although an extension to teleoperation is implemented in the control systems (WIN CC) of our test chambers, it is not yet in use.

  2. On the Treatment of Field Quantities and Elemental Continuity in FEM Solutions.

    PubMed

    Jallepalli, Ashok; Docampo-Sanchez, Julia; Ryan, Jennifer K; Haimes, Robert; Kirby, Robert M

    2018-01-01

    As the finite element method (FEM) and the finite volume method (FVM), both traditional and high-order variants, continue their proliferation into various applied engineering disciplines, it is important that the visualization techniques and corresponding data analysis tools that act on the results produced by these methods faithfully represent the underlying data. To state this in another way: the interpretation of data generated by simulation needs to be consistent with the numerical schemes that underpin the specific solver technology. As the verifiable visualization literature has demonstrated: visual artifacts produced by the introduction of either explicit or implicit data transformations, such as data resampling, can sometimes distort or even obfuscate key scientific features in the data. In this paper, we focus on the handling of elemental continuity, which is often only continuous or piecewise discontinuous, when visualizing primary or derived fields from FEM or FVM simulations. We demonstrate that traditional data handling and visualization of these fields introduce visual errors. In addition, we show how the use of the recently proposed line-SIAC filter provides a way of handling elemental continuity issues in an accuracy-conserving manner with the added benefit of casting the data in a smooth context even if the representation is element discontinuous.

  3. Fifteen-Year-Old Pupils' Variable Handling Performance in the Context of Scientific Investigations.

    ERIC Educational Resources Information Center

    Donnelly, J. F.

    1987-01-01

    Reports findings on variable-handling aspects of pupil performance in investigatory tasks, using data from the British Assessment of Performance Unit (APU) national survey program. Discusses the significance of these findings for assessment methodology and for understanding of 15-year-olds' approaches to the variable-based logic of investigation.…

  4. Psychosocial Determinants of Conflict-Handling Behaviour of Workers in Oil Sector in Nigeria

    ERIC Educational Resources Information Center

    Bankole, Akanji Rafiu

    2011-01-01

    The study examined the joint and relative influence of three psychosocial factors: Emotional intelligence, communication skill and interpersonal skill on conflict-handling behaviour of oil workers in Nigeria. Survey research design was adopted and a sample of 610 workers was randomly selected from oil companies across the country. Data were…

  5. Managing Data From Signal-Propagation Experiments

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1989-01-01

    Computer programs generate characteristic plots from amplitudes and phases. Software system enables minicomputer to process data on amplitudes and phases of signals received during experiments in ground-mobile/satellite radio propagation. Takes advantage of file-handling capabilities of UNIX operating system and C programming language. Interacts with user, under whose guidance programs in FORTRAN language generate plots of spectra or other curves of types commonly used to characterize signals. FORTRAN programs used to process file-handling outputs into any of several useful forms.

  6. Systems definition summary. Earth Observatory Satellite system definition study (EOS)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A standard spacecraft bus for performing a variety of earth orbit missions in the late 1970's and 1980's is defined. Emphasis is placed on a low-cost, multimission capability, benefitting from the space shuttle system. The subjects considered are as follows: (1) performance requirements, (2) internal interfaces, (3) redundancy and reliability, (4) communications and data handling module design, (5) payload data handling, (6) application of the modular design to various missions, and (7) the verification concept.

  7. The State-of-the-Art in Natural Language Understanding.

    DTIC Science & Technology

    1981-01-28

    [OCR fragments:] ...driven text analysis. If we know a story is about a restaurant, we expect that we may encounter a waitress, a menu, a table, a bill, food, and other... Front ends for Data Bases: During the 70's a number of natural language data base front ends appeared: LUNAR (Woods et al., 1972) has already been briefly... [illegible] ...handling of novel language, especially metaphor; ...[und]erstanding systems: the handling of

  8. Replaceable Inserts for Ship’s Line Handling Chocks

    DTIC Science & Technology

    2006-10-30

    [OCR fragments from the report cover pages:] Nelson Engineering Co., Contract # N6553806M0156, Replaceable Inserts for Ship's Line Handling Chocks, Phase I Final Report, Topic N06-059. Nelson Engineering Co., 3655 Sells Arbor Circle, Titusville, FL 32760; (321) 269-1113; Fax (321) 269-0506; www.NelsonEngrCo.com. 2006, C. S. Seringer, Vice President, POC. Expiration of SBIR Data Rights Period: October 30, 2011. Nelson Engineering Co. exerts its data rights.

  9. Earth Observatory Satellite system definition study. Report 5: System design and specifications. Volume 3: General purpose spacecraft segment and module specifications

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The specifications for the Earth Observatory Satellite (EOS) general purpose spacecraft segment are presented. The satellite is designed to provide attitude stabilization, electrical power, and a communications data handling subsystem which can support various mission peculiar subsystems. The various specifications considered include the following: (1) structures subsystem, (2) thermal control subsystem, (3) communications and data handling subsystem module, (4) attitude control subsystem module, (5) power subsystem module, and (6) electrical integration subsystem.

  10. Data Handling Recording System (DHRS).

    DTIC Science & Technology

    1980-07-01

    [OCR fragments:] ...The final technical report submitted by Harris Corporation contains a brief synopsis of the... is several hours, plenty of time for enemy aircraft, tanks, ships, and convoys to relocate. The Harris/WEC DHRS allows real-time target reporting and... [Report citation: AD-A089 952, Harris Corp., Melbourne, FL; Data Handling Recording System (DHRS), July 1980; V. E. Taylor; Contract F30602-79-C-0268; Unclassified; RADC-TR-80-198.]

  11. High rate science data handling on Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Masline, Richard C.

    1990-01-01

    A study by NASA's User Information System Working Group for Space Station Freedom (SSF) has determined that the proposed onboard Data Management System, as initially configured, will be incapable of handling the data-generation rates typical of numerous scientific sensor payloads; many of these generate data at rates in excess of 10 Mbps, and there are at least four cases of rates in excess of 300 Mbps. The SSF Working Group has accordingly suggested an alternative conceptual architecture based on technology expected to achieve space-qualified status by 1995. The architecture encompasses recorders with rapid data-ingest capabilities and massive storage capabilities, optical delay lines allowing the recording of only the phenomena of interest, and data flow-compressing image processors.

  12. Consumer food handling in the home: a review of food safety studies.

    PubMed

    Redmond, Elizabeth C; Griffith, Christopher J

    2003-01-01

    Epidemiological data from Europe, North America, Australia, and New Zealand indicate that a substantial proportion of foodborne disease is attributable to improper food preparation practices in consumers' homes. International concern about consumer food safety has prompted considerable research to evaluate domestic food-handling practices. The majority of consumer food safety studies in the last decade have been conducted in the United Kingdom and Northern Ireland (48%) and in the United States (42%). Surveys (questionnaires and interviews), the most frequent means of data collection, were used in 75% of the reviewed studies. Focus groups and observational studies have also been used. One consumer food safety study examined the relationship between pathogenic microbial contamination from raw chicken and observed food-handling behaviors, and the results of this study indicated extensive Campylobacter cross-contamination during food preparation sessions. Limited information about consumers' attitudes and intentions with regard to safe food-handling behaviors has been obtained, although a substantial amount of information about consumer knowledge and self-reported practices is available. Observation studies suggest that substantial numbers of consumers frequently implement unsafe food-handling practices. Knowledge, attitudes, intentions, and self-reported practices did not correspond to observed behaviors, suggesting that observational studies provide a more realistic indication of the food hygiene actions actually used in domestic food preparation. An improvement in consumer food-handling behavior is likely to reduce the risk and incidence of foodborne disease. The need for the development and implementation of food safety education strategies to improve specific food safety behaviors is reviewed in this paper.

  13. Reducing security risk using data loss prevention technology.

    PubMed

    Beeskow, John

    2015-11-01

    Data loss/leakage protection (DLP) technology seeks to improve data security by answering three fundamental questions: Where are confidential data stored? Who is accessing the information? How are data being handled?

  14. TROPHI: development of a tool to measure complex, multi-factorial patient handling interventions.

    PubMed

    Fray, Mike; Hignett, Sue

    2013-01-01

    Patient handling interventions are complex and multi-factorial. It has been difficult to make comparisons across different strategies due to the lack of a comprehensive outcome measurement method. The Tool for Risk Outstanding in Patient Handling Interventions (TROPHI) was developed to address this gap by measuring outcomes and comparing performance across interventions. Focus groups were held with expert patient handling practitioners (n = 36) in four European countries (Finland, Italy, Portugal and the UK) to identify preferred outcomes to be measured for interventions. A systematic literature review identified 598 outcome measures; these were critically appraised and the most appropriate measurement tool was selected for each outcome. TROPHI was evaluated in the four EU countries (eight sites) and by an expert panel (n = 16) from the European Panel of Patient Handling Ergonomics for usability and practical application. This final stage added external validity to the research by exploring transferability potential and presenting the data and analysis to allow respondent (participant) validation.

  15. Women of low socioeconomic status living with diabetes: Becoming adept at handling a disease.

    PubMed

    Boonsatean, Wimonrut; Dychawy Rosner, Irena; Carlsson, Anna; Östman, Margareta

    2015-01-01

    The objective of this study was to explore how Thai women of low socioeconomic status handle their type 2 diabetes. A qualitative interpretative method was used to study 19 women with type 2 diabetes in a suburban community in Thailand. Data were collected via semi-structured interviews and were analysed using inductive and constructive processes. Participants' lives underwent many changes between their initial diagnoses and later stages when they became adept at handling diabetes. Two themes emerged, which involved (1) the transition to handling diabetes and (2) the influences of the social environment. The first theme encompassed confronting the disease, reaching a turning point in the process of adaptation and developing expertise in handling diabetes. The second theme involved threats of loss of status and empowerment by families. These findings showed that becoming adept at handling diabetes required significant changes in women's behaviours and required taking advantage of influences from the social environment. The process of developing expertise in handling diabetes was influenced by both inner and outer factors that required adjustment to learn to live with diabetes. Furthermore, the reductions found in women's social statuses when they become patients in the healthcare system might pose a barrier to women of low socioeconomic status becoming adept at handling diabetes. However, the experiences of empowerment received from the women's families acted as a powerful strategy to strengthen their handling of the disease. To develop accessible and sensitive health care for this population, it is important to pay attention to these findings.

  16. The Gaia On-Board Scientific Data Handling

    NASA Astrophysics Data System (ADS)

    Arenou, F.; Babusiaux, C.; Chéreau, F.; Mignot, S.

    2005-01-01

    Because Gaia will perform a continuous all-sky survey at a medium (Spectro) or very high (Astro) angular resolution, the on-board processing needs to cope with a high variety of objects and densities, which calls for generic and adaptive algorithms at the detection level and beyond. Consequently, the Pyxis scientific algorithms developed for the on-board data handling cover a wide range of applications: detection and confirmation of astronomical objects, background sky estimation, classification of detected objects, onboard detection of Near-Earth Objects, and window selection and positioning. Very dense fields, where the real-time computing requirements should remain within fixed bounds, are particularly challenging. Another constraint stems from the limited telemetry bandwidth, and an additional compromise has to be found between scientific requirements and constraints in terms of the mass, volume and power budgets of the satellite. The rationale for the on-board data handling procedure is described here, together with the developed algorithms, the main issues and the expected scientific performances in the Astro and Spectro instruments.

  17. Nurses' Attitudes Regarding the Safe Handling of Patients Who Are Morbidly Obese: Instrument Development and Psychometric Analysis.

    PubMed

    Bejciy-Spring, Susan; Vermillion, Brenda; Morgan, Sally; Newton, Cheryl; Chucta, Sheila; Gatens, Cindy; Zadvinskis, Inga; Holloman, Christopher; Chipps, Esther

    2016-12-01

    Nurses' attitudes play an important role in the consistent practice of safe patient handling behaviors. The purposes of this study were to develop and assess the psychometric properties of a newly developed instrument measuring attitudes of nurses related to the care and safe handling of patients who are obese. Phases of instrument development included (a) item generation, (b) content validity assessment, (c) reliability assessment, (d) cognitive interviewing, and (e) construct validity assessment through factor analysis. The final data from the exploratory factor analysis produced a 26-item multidimensional instrument that contains 9 subscales. Based on the factor analysis, a 26-item instrument can be used to examine nurses' attitudes regarding patients who are morbidly obese and related safe handling practices.

  18. Accelerated speckle imaging with the ATST visible broadband imager

    NASA Astrophysics Data System (ADS)

    Wöger, Friedrich; Ferayorni, Andrew

    2012-09-01

    The Advanced Technology Solar Telescope (ATST), a 4 meter class telescope for observations of the solar atmosphere currently in its construction phase, will generate data at rates on the order of 10 TB/day with its state-of-the-art instrumentation. The high-priority ATST Visible Broadband Imager (VBI) instrument alone will create two data streams with a bandwidth of 960 MB/s each. Because of the related data handling issues, these data will be post-processed with speckle interferometry algorithms in near-real time at the telescope using the cost-effective Graphics Processing Unit (GPU) technology that is supported by the ATST Data Handling System. In this contribution, we lay out the VBI-specific approach to its image processing pipeline, put this into the context of the underlying ATST Data Handling System infrastructure, and finally describe the details of how the algorithms were redesigned to exploit data parallelism in the speckle image reconstruction. An algorithm redesign is often required to efficiently speed up an application using GPU technology; we have chosen NVIDIA's CUDA language as the basis for our implementation. We present preliminary results of the algorithm's performance using our test facilities and, based on these results, give a conservative estimate of the requirements of a full system that could achieve near-real-time performance at the ATST.

  19. A guide to missing data for the pediatric nephrologist.

    PubMed

    Larkins, Nicholas G; Craig, Jonathan C; Teixeira-Pinto, Armando

    2018-03-13

    Missing data is an important and common source of bias in clinical research. Readers should be alert to and consider the impact of missing data when reading studies. Beyond preventing missing data in the first place, through good study design and conduct, there are different strategies available to handle data containing missing observations. Complete case analysis is often biased unless data are missing completely at random. Better methods of handling missing data include multiple imputation and models using likelihood-based estimation. With advancing computing power and modern statistical software, these methods are within the reach of clinician-researchers under the guidance of a biostatistician. As clinicians reading papers, we need to continue updating our understanding of statistical methods, so that we understand the limitations of these techniques and can critically interpret the literature.
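
    The point about complete case analysis can be made concrete with a small simulation: when missingness depends on an observed covariate (missing at random, not completely at random), dropping incomplete cases biases even a simple mean, while a model that uses the observed covariate recovers it. A hedged NumPy sketch with entirely invented numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two correlated measurements: x always observed, y missing more often
# when x is large (missing at random, MAR -- not MCAR).
x = rng.normal(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)            # true mean of y is 0
miss = rng.random(n) < 1 / (1 + np.exp(-2 * x))  # P(missing) rises with x
y_obs = np.where(miss, np.nan, y)

# Complete-case estimate of E[y]: biased, because the retained cases
# over-represent small x.
cc_mean = np.nanmean(y_obs)

# Simple regression imputation using the fully observed x: fit y ~ x on
# complete cases, predict for the missing ones, then average everything.
obs = ~miss
slope, intercept = np.polyfit(x[obs], y[obs], 1)
y_imp = np.where(miss, intercept + slope * x, y)
imp_mean = y_imp.mean()

print(f"complete-case mean: {cc_mean:+.3f}")   # well below the true 0
print(f"imputation mean:    {imp_mean:+.3f}")  # close to 0
```

    Multiple imputation repeats the imputation step with added noise across several completed data sets and pools the results, which also gives honest standard errors; the single-imputation sketch above only illustrates the bias argument.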

  20. Integrated Data Base Program: a status report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Notz, K.J.; Klein, J.A.

    1984-06-01

    The Integrated Data Base (IDB) Program provides official Department of Energy (DOE) data on spent fuel and radioactive waste inventories, projections, and characteristics. The accomplishments of FY 1983 are summarized for three broad areas: (1) upgrading and issuing of the annual report on spent fuel and radioactive waste inventories, projections, and characteristics, including ORIGEN2 applications and a quality assurance plan; (2) creation of a summary data file in user-friendly format for use on a personal computer and enhancing user access to program data; and (3) optimizing and documenting the data handling methodology used by the IDB Program and providing direct support to other DOE programs and sites in data handling. Plans for future work in these three areas are outlined. 23 references, 11 figures.

  1. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  2. Handling-induced stress and mortalities in African wild dogs (Lycaon pictus).

    PubMed

    de Villiers, M S; Meltzer, D G; van Heerden, J; Mills, M G; Richardson, P R; van Jaarsveld, A S

    1995-11-22

    Recently it was suggested that the handling of wild dogs (Lycaon pictus) by researchers in the Serengeti ecosystem created stress, resulting in the reactivation of latent rabies viruses in carrier animals. We present data from ongoing studies on free-ranging and captive wild dogs elsewhere in Africa which do not support this hypothesis. Cortisol profiles suggest that immobilization of wild dogs does not cause the chronic stress required for stress-reactivation of latent viruses. Furthermore, there is no evidence of handling-related mortalities in wild dogs: the survivorship of unhandled and handled free-ranging wild dogs did not differ and no captive animals died within a year of handling (immobilization and/or vaccination against rabies). We suggest that the mortalities observed in Tanzania were due to an outbreak of a disease which rabies vaccination was unable to prevent. Intensive monitoring and active management research programmes on wild dogs are essential as without these, critically endangered wild dog populations have little hope of survival.

  3. Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level

    PubMed Central

    Savalei, Victoria; Rhemtulla, Mijke

    2017-01-01

    In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data—that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study. PMID:29276371
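
    The motivation for item-level handling can be seen with a toy calculation: modest item-level missingness makes a much larger fraction of composite scale scores incomplete, and that larger fraction is what scale-level methods then discard. A sketch with invented data (the scale length and missingness rate are assumptions for illustration):

```python
import numpy as np

# One missing item deletes the whole scale score, so scale-level methods
# lose far more information than was actually missing. Made-up 5-item
# scale, 1000 respondents, 5% of item responses missing at random.
rng = np.random.default_rng(4)
items = rng.normal(size=(1000, 5))
items[rng.random(items.shape) < 0.05] = np.nan

scale = items.sum(axis=1)             # NaN if any item is NaN
item_missing = np.isnan(items).mean()
scale_missing = np.isnan(scale).mean()

print(f"item-level missing:  {item_missing:.1%}")   # around 5%
print(f"scale-score missing: {scale_missing:.1%}")  # around 1 - 0.95**5, ~23%
```

    Item-level MI, or the analytic TSML variant the article proposes, works with the roughly 5% of genuinely missing item responses instead of discarding roughly a quarter of the scale scores.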

  4. Missing data exploration: highlighting graphical presentation of missing pattern.

    PubMed

    Zhang, Zhongheng

    2015-12-01

    Functions shipped with R base can fulfill many tasks of missing data handling. However, because the data volume of electronic medical record (EMR) systems is always very large, more sophisticated methods may be helpful in data management. The article focuses on missing data handling by using advanced techniques. There are three types of missing data: missing completely at random (MCAR), missing at random (MAR), and not missing at random (NMAR). This classification system depends on how missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore missing data patterns. In particular, the VIM package is especially helpful in visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missing data on other variables. Such information is useful in subsequent imputations.
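
    The workflow in the abstract is R-based (MICE, VIM), but the two core ideas, tabulating the missing-data pattern and checking whether missingness depends on other variables, can be sketched in pandas. The toy EMR-like table below is invented purely for illustration:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500

# Toy EMR-like table: lactate is missing far more often for non-ICU
# stays, so its missingness depends on an observed variable (MAR).
df = pd.DataFrame({
    "age": rng.integers(20, 90, n),
    "icu": rng.random(n) < 0.3,
})
lactate = rng.normal(2.0, 0.5, n)
p_miss = np.where(df["icu"], 0.05, 0.60)
df["lactate"] = np.where(rng.random(n) < p_miss, np.nan, lactate)

# Missing-data pattern: one row per distinct observed/missing
# combination with its frequency (a rough textual analogue of VIM's
# aggregation plot).
pattern = df.isna().value_counts()
print(pattern)

# Dependence of missingness on another variable: compare the lactate
# missingness rate across ICU status (the correlation analysis step).
rate_by_icu = df["lactate"].isna().groupby(df["icu"]).mean()
print(rate_by_icu)
```

    A large gap between the two group rates, as here, is exactly the signal that an MCAR assumption is untenable and that the grouping variable should enter the imputation model.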

  5. Assessing and Addressing Safe Food Handling Knowledge, Attitudes, and Behaviors of College Undergraduates

    ERIC Educational Resources Information Center

    Stein, Susan E.; Dirks, Brian P.; Quinlan, Jennifer J.

    2010-01-01

    The authors determined the food safety knowledge, attitudes, and behaviors of undergraduates (n = 1122) on an urban college campus using a previously piloted survey tool. Data obtained found that while students reported high levels of confidence in their ability to engage in safe food handling practices, their knowledge and self-reported behaviors…

  6. A shuttle and space station manipulator system for assembly, docking, maintenance, cargo handling and spacecraft retrieval (preliminary design). Volume 3: Concept analysis. Part 1: Technical

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Information backing up the key features of the manipulator system concept and detailed technical information on the subsystems are presented. Space station assembly and shuttle cargo handling tasks are emphasized in the concept analysis because they involve shuttle berthing, transferring the manipulator boom between shuttle and station, station assembly, and cargo handling. Emphasis is also placed on maximizing commonality in the system areas of manipulator booms, general purpose end effectors, control and display, data processing, telemetry, dedicated computers, and control station design.

  7. Command and data handling of science signals on Spacelab

    NASA Technical Reports Server (NTRS)

    Mccain, H. G.

    1975-01-01

    The Orbiter Avionics and the Spacelab Command and Data Management System (CDMS) combine to provide a relatively complete command, control, and data handling service to the instrument complement during a Shuttle Sortie Mission. The Spacelab CDMS services the instruments and the Orbiter in turn services the Spacelab. The CDMS computer system includes three computers, two I/O units, a mass memory, and a variable number of remote acquisition units. Attention is given to the CDMS high rate multiplexer, CDMS tape recorders, closed circuit television for the visual monitoring of payload bay and cabin area activities, methods of science data acquisition, questions of transmission and recording, CDMS experiment computer usage, and experiment electronics.

  8. A big-data model for multi-modal public transportation with application to macroscopic control and optimisation

    NASA Astrophysics Data System (ADS)

    Faizrahnemoon, Mahsa; Schlote, Arieh; Maggi, Lorenzo; Crisostomi, Emanuele; Shorten, Robert

    2015-11-01

    This paper describes a Markov-chain-based approach to modelling multi-modal transportation networks. An advantage of the model is the ability to accommodate complex dynamics and handle huge amounts of data. The transition matrix of the Markov chain is built and the model is validated using the data extracted from a traffic simulator. A realistic test-case using multi-modal data from the city of London is given to further support the ability of the proposed methodology to handle big quantities of data. Then, we use the Markov chain as a control tool to improve the overall efficiency of a transportation network, and some practical examples are described to illustrate the potential of the approach.
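
    The core construction, estimating a row-stochastic transition matrix from observed trip counts and reading long-run network load off its stationary distribution, can be sketched as follows. The counts are invented for a three-node toy network, not the London data:

```python
import numpy as np

# Toy network with three nodes A, B, C. Row i of `counts` holds observed
# trips from node i to each node, as a traffic simulator or ticketing
# data set would provide; these numbers are made up.
counts = np.array([
    [10., 80., 10.],
    [40., 20., 40.],
    [25., 25., 50.],
])

# Row-normalize to a stochastic transition matrix P.
P = counts / counts.sum(axis=1, keepdims=True)

# Stationary distribution pi with pi P = pi: the dominant left
# eigenvector of P, interpretable as long-run relative load per node.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print("stationary load:", np.round(pi, 3))
```

    A macroscopic control action (rerouting, pricing, schedule changes) perturbs selected rows of P; recomputing pi then predicts how the long-run load redistributes, which is the control use of the chain described in the paper.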

  9. Effects of multiple concurrent stressors on rectal temperature, blood acid-base status, and longissimus muscle glycolytic potential in market-weight pigs.

    PubMed

    Ritter, M J; Ellis, M; Anderson, D B; Curtis, S E; Keffaber, K K; Killefer, J; McKeith, F K; Murphy, C M; Peterson, B A

    2009-01-01

    Sixty-four market-weight (130.0 +/- 0.65 kg) barrows (n = 16) and gilts (n = 48) were used in a split-plot design with a 2 x 2 x 2 factorial arrangement of treatments: 1) handling intensity (gentle vs. aggressive), 2) transport floor space (0.39 vs. 0.49 m(2)/pig), and 3) distance moved during handling (25 vs. 125 m) to determine the effects of multiple concurrent stressors on metabolic responses. For the handling intensity treatment, pigs were moved individually approximately 50 m through a handling course with either 0 (gentle) or 8 (aggressive) shocks from an electric goad. Pigs were loaded onto a trailer and transported for approximately 1 h at floor spaces of either 0.39 or 0.49 m(2)/pig. After transport, pigs were unloaded, and the distance moved treatment was applied; pigs were moved 25 or 125 m through a handling course using livestock paddles. Rectal temperature was measured, and blood samples (to measure blood acid-base status) were collected 2 h before the handling intensity treatment was applied and immediately after the distance moved treatment was applied. A LM sample to measure glycolytic potential was collected after the distance moved treatments on a subset of 32 pigs. There were handling intensity x distance moved interactions (P < 0.05) for several blood acid-base measurements. In general, there was no effect of distance moved on these traits when pigs were previously handled gently. However, when pigs were previously handled aggressively, pigs moved 125 compared with 25 m had greater (P < 0.05) blood lactate and less (P < 0.05) blood pH, bicarbonate, and base-excess. Pigs transported at 0.39 compared with 0.49 m(2)/pig had a greater (P < 0.01) increase in creatine kinase values; however, transport floor space did not affect any other measurements. Data were analyzed by the number of stressors (the aggressive handling, restricted transport floor space, and 125-m distance moved treatments) experienced by each pig (0, 1, 2, or 3). 
As the number of stressors experienced by the pig increased, rectal temperature, blood lactate, and LM lactate increased linearly (P

  10. Incidents/accidents classification and reporting in Statoil.

    PubMed

    Berentsen, Rune; Holmboe, Rolf H

    2004-07-26

    Based on requirements in the new petroleum regulations from the Norwegian Petroleum Directorate (NPD) and the realisation of a need to improve and rationalise the routines for reporting and following up incidents, Statoil Exploration & Production Norway (Statoil E&P Norway) has formulated a new strategy and process for the handling of incidents/accidents. The following past experiences served as the basis for the changes made to incident reporting in Statoil E&P Norway: too many resources were spent on comprehensive handling and analysis of a vast number of incidents of little importance to the safety level, taking the focus away from the more severe and important issues at hand; and the assessment of "Risk Factor", i.e. the combination of recurrence frequency and consequence, was difficult to use. The high degree of subjectivity involved in determining the "Risk Factor" (in particular the estimation of the recurrence frequency) resulted in poor data quality and a lack of consistency in the data material. The new system for categorisation and handling of undesirable incidents was established in January 2002. The intention was to achieve a higher degree of focus on serious incidents (injuries, damages, loss and near misses), with thorough handling and follow-up. This is reflected throughout the handling of serious incidents, all the way from immediate notification of the incident, through investigation, to follow-up of corrective and preventive actions. Simultaneously, it was also an objective to rationalise and simplify the handling of less serious incidents. These incidents are, however, subjected to analyses twice a year in order to utilize the learning opportunity that they also provide.
A year after the introduction of this new system for categorisation and follow-up of undesirable incidents, Statoil's experiences are predominantly good: the intention to achieve a higher degree of focus on serious incidents (injuries, damages, loss and near misses) has been met; the data quality for the more serious incidents (5% of the total number of incidents registered) has improved; the improved handling of incidents has contributed to more reliable and accurate HSE indicators at a corporate level; more user-friendly codes are in place for incident registration (based on MTO methodology); and the revised matrix gives distinct criteria with respect to which investigation level is to be initiated for a specific incident. All activities related to the handling of undesirable incidents have been summarised and illustrated on a two-sided plastic form, incorporating both the categorisation matrix and the activity flowchart (see Figs. 1 and 4).

  11. Patient complaints in healthcare services in Vietnam’s health system

    PubMed Central

    Thi Thu Ha, Bui; Mirzoev, Tolib; Morgan, Rosemary

    2015-01-01

    Background: There is growing recognition of patient rights in health sectors around the world. Patients’ right to complain in hospitals, often visible in legislative and regulatory protocols, can be an important information source for service quality improvement and achievement of better health outcomes. However, empirical evidence on complaint processes is scarce, particularly in the developing countries. To contribute in addressing this gap, we investigated patients’ complaint handling processes and the main influences on their implementation in public hospitals in Vietnam. Methods: The study was conducted in two provinces of Vietnam. We focused specifically on the implementation of the Law on Complaints and Denunciations and the Ministry of Health regulation on resolving complaints in the health sector. The data were collected using document review and in-depth interviews with key respondents. Framework approach was used for data analysis, guided by a conceptual framework and aided by qualitative data analysis software. Results: Five steps of complaint handling were implemented, which varied in practice between the provinces. Four groups of factors influenced the procedures: (1) insufficient investment in complaint handling procedures; (2) limited monitoring of complaint processes; (3) patients’ low awareness of, and perceived lack of power to change, complaint procedures and (4) autonomization pressures on local health facilities. While the existence of complaint handling processes is evident in the health system in Vietnam, their utilization was often limited. Different factors which constrained the implementation and use of complaint regulations included health system–related issues as well as social and cultural influences. Conclusion: The study aimed to contribute to improved understanding of complaint handling processes and the key factors influencing these processes in public hospitals in Vietnam. 
Specific policy implications for improving these processes were proposed, which include improving accountability of service providers and better utilization of information on complaints. PMID:26770804

  12. Distributed and parallel approach for handle and perform huge datasets

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Big Data refers to the dynamic, large, and disparate volumes of data that come from many different sources (tools, machines, sensors, mobile devices) and are uncorrelated with each other. It requires new, innovative, and scalable technology to collect, host, and analytically process the vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting and handling valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture is also proposed that could be used to solve the analyzed problem of storing and processing Big Data.
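
    The MapReduce contract that the paper compares against a parallel DBMS can be illustrated with a serial, in-process sketch: map emits key/value pairs, a shuffle groups them by key, and reduce folds each group. Hadoop's contribution is distributing exactly these three phases across machines; the toy word-count below runs them in one process for clarity:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):           # word count: emit (word, 1) per word
    for word in record.split():
        yield word.lower(), 1

def shuffle(pairs):              # group emitted values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):   # fold each group to a single result
    return key, sum(values)

records = ["Big Data", "big data big pipelines"]
pairs = chain.from_iterable(map_phase(r) for r in records)
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result)  # {'big': 3, 'data': 2, 'pipelines': 1}
```

    The DBMS side of the comparison would express the same computation declaratively (a GROUP BY with an aggregate) and let the query planner parallelize it, which is the trade-off the paper examines.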

  13. Effects of surgically implanted transmitters on reproduction and survival in mallards

    USGS Publications Warehouse

    Sheppard, Jennifer; Arnold, Todd W.; Amundson, Courtney L.; Klee, David

    2017-01-01

    Abdominally implanted radiotransmitters have been widely used in studies of waterbird ecology; however, the longer handling times and invasiveness of surgical implantation raise important concerns about animal welfare and potential effects on data quality. Although it is difficult to assess effects of handling and marking wild animals by comparing them with unmarked controls, insights can often be obtained by evaluating variation in handling or marking techniques. Here, we used data from 243 female mallards (Anas platyrhynchos) and mallard–grey duck hybrids (A. platyrhynchos × A. superciliosa) equipped with fully encapsulated abdominally implanted radiotransmitters from 2 study sites in New Zealand during 2014–2015 to assess potential marking effects. We evaluated survival, dispersal, and reproductive effort (e.g., breeding propensity, nest initiation date, clutch size) in response to 3 different attributes of handling duration and procedures: 1) processing time, including presurgery banding, measurements, and blood sampling of unanaesthetized birds; 2) surgery time from initiation to cessation of anesthetic; and 3) total holding time from first capture until release. We found no evidence that female survival, dispersal probability, or reproductive effort were negatively affected by holding, processing, or surgery time and concluded that we collected reliable data without compromising animal welfare. Our results support previous research that techniques using fully encapsulated abdominal-implant radiotransmitters are suitable to enable researchers to obtain reliable estimates of reproductive performance and survival. 

  14. Summary of the effects of engine throttle response on airplane formation-flying qualities

    NASA Technical Reports Server (NTRS)

    Walsh, Kevin R.

    1992-01-01

    A flight evaluation was conducted to determine the effect of engine throttle response characteristics on precision formation-flying qualities. A variable electronic throttle control system was developed and flight-tested on a TF-104G airplane with a J79-11B engine at the NASA Dryden Flight Research Facility. Ten research flights were flown to evaluate the effects of throttle gain, time delay, and fuel control rate limiting on engine handling qualities during a demanding precision wing formation task. Handling quality effects of lag filters and lead compensation time delays were also evaluated. Data from pilot ratings and comments indicate that throttle control system time delays and rate limits cause significant degradations in handling qualities. Threshold values for satisfactory (level 1) and adequate (level 2) handling qualities of these key variables are presented.

  15. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned between the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, a need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential in meeting the 1987 technology readiness date for the space station.

  16. Explicating the Conditions Under Which Multilevel Multiple Imputation Mitigates Bias Resulting from Random Coefficient-Dependent Missing Longitudinal Data.

    PubMed

    Gottfredson, Nisha C; Sterba, Sonya K; Jackson, Kristina M

    2017-01-01

    Random coefficient-dependent (RCD) missingness is a non-ignorable mechanism through which missing data can arise in longitudinal designs. RCD, which we cannot test for, is a problematic form of missingness that occurs if subject-specific random effects correlate with propensity for missingness or dropout. Particularly when covariate missingness is a problem, investigators typically handle missing longitudinal data by using single-level multiple imputation procedures implemented with long-format data, which ignores within-person dependency entirely, or implemented with wide-format (i.e., multivariate) data, which ignores some aspects of within-person dependency. When either of these standard approaches to handling missing longitudinal data is used, RCD missingness leads to parameter bias and incorrect inference. We explain why multilevel multiple imputation (MMI) should alleviate bias induced by a RCD missing data mechanism under conditions that contribute to stronger determinacy of random coefficients. We evaluate our hypothesis with a simulation study. Three design factors are considered: intraclass correlation (ICC; ranging from .25 to .75), number of waves (ranging from 4 to 8), and percent of missing data (ranging from 20 to 50%). We find that MMI greatly outperforms the single-level wide-format (multivariate) method for imputation under a RCD mechanism. For the MMI analyses, bias was most alleviated when the ICC was high, there were more waves of data, and there was less missing data. Practical recommendations for handling longitudinal missing data are suggested.

  17. Syndrome of transient headache and neurological deficits with cerebrospinal fluid lymphocytosis (HaNDL) in a patient with confusional symptoms, diffuse EEG abnormalities, and bilateral vasospasm in transcranial Doppler ultrasound: A case report and literature review.

    PubMed

    Hidalgo de la Cruz, M; Domínguez Rubio, R; Luque Buzo, E; Díaz Otero, F; Vázquez Alén, P; Orcajo Rincón, J; Prieto Montalvo, J; Contreras Chicote, A; Grandas Pérez, F

    2017-04-17

    HaNDL syndrome (transient headache and neurological deficits with cerebrospinal fluid lymphocytosis) is characterised by one or more episodes of headache and transient neurological deficits associated with cerebrospinal fluid lymphocytosis. To date, few cases of HaNDL manifesting with confusional symptoms have been described. Likewise, very few patients with HaNDL and confusional symptoms have been evaluated with transcranial Doppler ultrasound (TCD). TCD data from patients with focal involvement reveal changes consistent with vasomotor alterations. We present the case of a 42-year-old man who experienced headache and confusional symptoms and displayed pleocytosis, diffuse slow activity on EEG, increased blood flow velocity in both middle cerebral arteries on TCD, and single-photon emission computed tomography (SPECT) findings suggestive of diffuse involvement, especially in the left hemisphere. To our knowledge, this is the first description of a patient with HaNDL, confusional symptoms, diffuse slow activity on EEG, and increased blood flow velocity in TCD. Our findings suggest a relationship between cerebral vasomotor changes and the pathophysiology of HaNDL. TCD may be a useful tool for early diagnosis of HaNDL. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  18. Curriculum in Food Handling and Distribution; a Guide for Experimentation in High School and Post High School Vocational Training.

    ERIC Educational Resources Information Center

    Stiles, Philip G.; And Others

    The project developed an experimental curriculum guide for training persons at the high school and post-high school levels in food handling and distribution. Data were gathered through interviews with over 200 food industries in Connecticut. Courses and curriculums were obtained from six secondary schools and seven post-secondary schools. Some of…

  19. Tape tracking and handling for magnetic tape recorders. [aboard spacecraft

    NASA Technical Reports Server (NTRS)

    Paroby, W.; Disilvestre, R.

    1975-01-01

    One of the critical performance- and life-limiting elements of a spacecraft tape recorder instrumentation system, and one which has received little attention in the technical literature, is magnetic tape tracking and handling technology. This technology is required to understand how to gently transfer tape from one reel to another with proper alignment and a desirable uniform velocity at the read and write transducer heads. The increased demand for high data rates (i.e., multi-track spacecraft recording instrumentation systems), coupled with performance under extreme environmental conditions, requires a thorough knowledge of the various parameters which establish an optimally designed tape tracking and handling system. Stress analysis techniques, substantiated with tape tracking test data, are required to evaluate these parameters and to show the effect of each parameter on a tape recorder instrumentation tracking system. The technology is applicable to ground-type tape recorders where the detrimental effects of edge guidance can be eliminated.

  20. Graphical overview and navigation of electronic health records in a prototyping environment using Google Earth and openEHR archetypes.

    PubMed

    Sundvall, Erik; Nyström, Mikael; Forss, Mattias; Chen, Rong; Petersson, Håkan; Ahlfeldt, Hans

    2007-01-01

    This paper describes selected earlier approaches to graphically relating events to each other and to time; some new combinations are also suggested. These are then combined into a unified prototyping environment for visualization and navigation of electronic health records. Google Earth (GE) is used for handling display and interaction of clinical information stored using openEHR data structures and 'archetypes'. The strength of the approach comes from GE's sophisticated handling of detail levels, from coarse overviews to fine-grained details, which has been combined with linear, polar and region-based views of clinical events related to time. The system should be easy to learn since all the visualization styles can use the same navigation. The structured and multifaceted approach to handling time that is possible with archetyped openEHR data lends itself well to visualization, and integration with openEHR components is provided in the environment.

  1. Magnetic Field Investigation for ISEE mother and daughter spacecraft. [magnetometer design, capability, and results

    NASA Technical Reports Server (NTRS)

    Russell, C. T.

    1981-01-01

    Highlights of the design and fabrication of fluxgate magnetometers for the ISEE A and B satellites, which were launched from a single launch vehicle into the same highly elliptical orbit, are presented. The instrument consisted of four basic assemblies: the sensors, the drive and sense electronics, the data handling unit, and the flipper. The digital data handling assembly contained a digital filter that maintained a uniform transfer function for all three axes of both spacecraft. Initial studies centered on the bow shock and the magnetopause and show that both boundaries are in rapid motion. The bow shock was found to be very thin, close to an ion inertial length in thickness, but the magnetopause was much thicker than expected, about 400 to 1000 km on average. The magnetometers have each logged over 3 2/3 years of continuous operation.

  2. Implementation and adoption of mechanical patient lift equipment in the hospital setting: The importance of organizational and cultural factors.

    PubMed

    Schoenfisch, Ashley L; Myers, Douglas J; Pompeii, Lisa A; Lipscomb, Hester J

    2011-12-01

    Work focused on understanding implementation and adoption of interventions designed to prevent patient-handling injuries in the hospital setting is lacking in the injury literature and may be more insightful than more traditional evaluation measures. Data from focus groups with health care workers were used to describe barriers and promoters of the adoption of patient lift equipment and a shift to a "minimal-manual lift environment" at two affiliated hospitals. Several factors influencing the adoption of the lift equipment and patient-handling policy were noted: time, knowledge/ability, staffing, patient characteristics, and organizational and cultural aspects of work. The adoption process was complex, and considerable variability by hospital and across units was observed. The use of qualitative data can enhance the understanding of factors that influence implementation and adoption of interventions designed to prevent patient-handling injuries among health care workers. Copyright © 2011 Wiley Periodicals, Inc.

  3. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains with the goal to eventually make it usable in a clinical setting. PMID:27081299
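
The multiplicative MLEM update at the heart of the reconstruction can be sketched in a few lines of serial Python; the Spark/GraphX parallelization is omitted, and the system matrix and measured counts below are invented toy values, not data from the paper:

```python
import numpy as np

# Toy serial MLEM (maximum likelihood expectation maximization) update,
# the iterative reconstruction scheme named in the abstract. The system
# matrix A, counts y, and problem sizes are illustrative only.
def mlem(A, y, n_iter=50):
    x = np.ones(A.shape[1])           # uniform initial image estimate
    sens = A.T @ np.ones(A.shape[0])  # sensitivity (column sums of A)
    for _ in range(n_iter):
        proj = A @ x                  # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x = x / sens * (A.T @ ratio)  # multiplicative MLEM update
    return x

# Tiny example: 2 detector bins, 2 pixels, exactly invertible system.
A = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([3.0, 5.0])
print(mlem(A, y))  # converges toward [3, 5]
```

With the identity system matrix the fixed point is reached after one iteration; realistic SPECT matrices are large and sparse, which is what motivates the graph/sparse-linear-algebra formulation on GraphX.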

  4. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation.

    PubMed

    Lee, Jae H; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2014-11-01

    The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains with the goal to eventually make it usable in a clinical setting.

  5. Archive data base and handling system for the Orbiter flying qualities experiment program

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Dimarco, R.; Magdaleno, R. E.; Aponso, B. L.

    1986-01-01

    The OFQ archive data base and handling system assembled as part of the Orbiter Flying Qualities (OFQ) research of the Orbiter Experiments Program (OEX) are described. The purpose of the OFQ archives is to preserve and document shuttle flight data relevant to vehicle dynamics, flight control, and flying qualities in a form that permits maximum use by qualified users. In their complete form, the OFQ archives contain descriptive text (general information about the flight, signal descriptions and units) as well as numerical time history data. Since the shuttle program is so complex, the official data base contains thousands of signals, and very complex entries are required to obtain data. The OFQ archives are intended to provide flight-phase-oriented data subsets with relevant signals which are easily identified for flying qualities research.

  6. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    NASA Astrophysics Data System (ADS)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically, but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise, the measuring accuracy also improved by a factor of two. This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are "user friendly". They guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to let routine measurements be done by much less skilled assistants. It also removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automation of MTF measurements as described in the foregoing enhances the usefulness of MTF results as well as reducing the cost of MTF measurements.

  7. Missing data exploration: highlighting graphical presentation of missing pattern

    PubMed Central

    2015-01-01

    Functions shipped with base R can fulfill many tasks of missing data handling. However, because the data volume of an electronic medical record (EMR) system is typically very large, more sophisticated methods may be helpful in data management. The article focuses on missing data handling using advanced techniques. There are three types of missing data: missing completely at random (MCAR), missing at random (MAR) and not missing at random (NMAR). This classification depends on how the missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore the missing data pattern. In particular, the VIM package is especially helpful for visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missingness on other variables. Such information is useful in subsequent imputations. PMID:26807411
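
The article's workflow is R-based (MICE, VIM); a rough pandas analogue of the same missing-pattern inspection, using a made-up data frame, might look like this:

```python
import numpy as np
import pandas as pd

# A rough pandas analogue of the missing-pattern inspection the article
# performs with R's VIM package; the clinical variables are invented.
df = pd.DataFrame({
    "age":  [34, np.nan, 51, 29, np.nan],
    "bp":   [120, 135, np.nan, 118, 140],
    "chol": [np.nan, 210, 190, np.nan, 250],
})

# Per-variable missingness counts (the numbers behind VIM's aggr plot).
print(df.isna().sum())

# Missingness pattern matrix: each distinct row is one missing pattern.
patterns = df.isna().drop_duplicates()
print(len(patterns), "distinct missing patterns")

# Correlation between missingness indicators hints at MAR-type dependence.
print(df.isna().astype(int).corr())
```

The indicator-correlation step mirrors the article's final point: if missingness in one variable correlates with other variables, that information should feed the subsequent imputation model.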

  8. 40 CFR 68.160 - Registration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... purposes of correcting minor clerical errors, updating administrative information, providing missing data... substances handled in covered processes. (b) The registration shall include the following data: (1...

  9. 40 CFR 68.160 - Registration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... purposes of correcting minor clerical errors, updating administrative information, providing missing data... substances handled in covered processes. (b) The registration shall include the following data: (1...

  10. 40 CFR 68.160 - Registration.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... purposes of correcting minor clerical errors, updating administrative information, providing missing data... substances handled in covered processes. (b) The registration shall include the following data: (1...

  11. Gwyscan: a library to support non-equidistant scanning probe microscope measurements

    NASA Astrophysics Data System (ADS)

    Klapetek, Petr; Yacoot, Andrew; Grolich, Petr; Valtr, Miroslav; Nečas, David

    2017-03-01

    We present a software library and related methodology for enabling easy integration of adaptive step (non-equidistant) scanning techniques into metrological scanning probe microscopes, or scanning probe microscopes where individual x, y position data are recorded during measurements. Scanning with adaptive steps can reduce the amount of data collected in SPM measurements, thereby leading to faster data acquisition, a smaller amount of data collection for a specific analytical task and less sensitivity to mechanical and thermal drift. Implementation of adaptive scanning routines in a custom-built microscope is not normally an easy task: regular data are much easier to handle for previewing (e.g. levelling) and storage. We present an environment to make implementation of adaptive scanning easier for an instrument developer, specifically taking into account data acquisition approaches that are used in high accuracy microscopes such as those developed by National Metrology Institutes. This includes a library with algorithms written in C and LabVIEW for handling data storage, regular mesh preview generation and planning the scan path on the basis of different assumptions. A set of modules for the Gwyddion open source software for handling these data and for their further analysis is presented. Using this combination of data acquisition and processing tools, one can implement adaptive scanning relatively easily in an instrument that previously measured on a regular grid. The performance of the presented approach is shown and general non-equidistant data processing steps are discussed.
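
As a minimal sketch of the adaptive-step idea (not Gwyscan's actual C/LabVIEW API), a 1-D scan can shrink its step wherever the surface changes quickly; the surface function, step sizes and slope threshold here are all invented:

```python
import numpy as np

# Minimal 1-D adaptive-step (non-equidistant) scan sketch: a small probe
# step estimates the local slope, and the scan refines only on steep
# regions. Parameters and the test surface are illustrative only.
def adaptive_scan(surface, x0, x_end, big=0.1, small=0.02, thresh=0.5):
    xs, zs = [x0], [surface(x0)]
    while xs[-1] < x_end:
        # Probe a small step ahead to estimate the local slope.
        trial = surface(xs[-1] + small)
        slope = abs(trial - zs[-1]) / small
        step = small if slope > thresh else big  # refine where it is steep
        x = min(xs[-1] + step, x_end)
        xs.append(x)
        zs.append(surface(x))
    return np.array(xs), np.array(zs)

# A step-like feature: dense sampling near x = 0.5, coarse elsewhere.
xs, zs = adaptive_scan(lambda x: np.tanh(10 * (x - 0.5)), 0.0, 1.0)
print(len(xs))  # far fewer samples than a uniform grid at the fine pitch
```

A uniform grid at the fine pitch would need 51 points over this interval; the adaptive path spends its points only where the feature is, which is the data-reduction argument made above.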

  12. Flight test results for several light, canard-configured airplanes

    NASA Technical Reports Server (NTRS)

    Brown, Philip W.

    1987-01-01

    Brief flight evaluations of two different, light, composite constructed, canard and winglet configured airplanes were performed to assess their handling qualities; one airplane was a single engine, pusher design and the other a twin engine, push-pull configuration. An emphasis was placed on the slow speed/high angle of attack region for both airplanes and on the engine-out regime for the twin. Mission suitability assessment included cockpit and control layout, ground and airborne handling qualities, and turbulence response. Very limited performance data was taken. Stall/spin tests and the effects of laminar flow loss on performance and handling qualities were assessed on an extended range, single engine pusher design.

  13. In Flight Evaluation of Active Inceptor Force-Feel Characteristics and Handling Qualities

    NASA Technical Reports Server (NTRS)

    Lusardi, Jeff A.; Blanken, Chris L.; Ott, Carl Raymond; Malpica, Carlos A.; von Gruenhagen, Wolfgang

    2012-01-01

    The effect of inceptor feel-system characteristics on piloted handling qualities has been a research topic of interest for many years. Most of the research efforts have focused on advanced fly-by-wire fixed-wing aircraft, with only a few studies investigating the effects on rotorcraft. Consequently, only limited guidance is available on how cyclic force-feel characteristics should be set to obtain optimal handling qualities for rotorcraft. To study this effect, the U.S. Army Aeroflightdynamics Directorate, working with the DLR Institute of Flight Systems in Germany under Task X of the U.S.-German Memorandum of Understanding, has been conducting flight test evaluations. In the U.S., five experimental test pilots have completed evaluations of two Mission Task Elements (MTEs) from ADS-33E-PRF and two command/response types for a matrix of center-stick cyclic force-feel characteristics at Moffett Field. In Germany, three experimental test pilots have conducted initial evaluations of the two MTEs with two command/response types for a parallel matrix of side-stick cyclic force-feel characteristics at WTD-61 in Manching. The resulting data set is used to correlate the effect of changes in natural frequency and damping ratio of the cyclic inceptor on the piloted handling qualities. Existing criteria in ADS-33E and a proposed Handling Qualities Sensitivity Function that includes the effects of the cyclic force-feel characteristics are also evaluated against the data set and discussed.

  14. Privacy Policy Enforcement for Ambient Ubiquitous Services

    NASA Astrophysics Data System (ADS)

    Oyomno, Were; Jäppinen, Pekka; Kerttula, Esa

    Ubiquitous service providers leverage miniaturised computing terminals equipped with wireless capabilities to offer new service models. These models are pivoted on personal and inexpensive terminals to customise services to individual preferences. Portability, small size and compact keyboards are a few of the features popularising mobile terminals. These features enable the storing and carrying of ever increasing amounts of personal data, and the ability to use them in service adaptations. Ubiquitous services automate deeper soliciting of personal data transparently, without the need for user interactions. Transparent solicitation, acquisition and handling of personal data give rise to legitimate privacy concerns regarding disclosure, retention and re-use of the data. This study presents a policy enforcement for ubiquitous services that safeguards the handling of users' personal data and monitors adherence to stipulated privacy policies. Enforcement structures oriented towards usability and scalability are presented.
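
A toy sketch of the kind of enforcement gate described, in which a service's request for personal attributes is checked against the user's stipulated policy and logged for monitoring; the policy structure and attribute names are invented, not the study's design:

```python
from dataclasses import dataclass, field

# Hypothetical policy gate: only attributes the user permits are
# disclosed, and every request is logged for adherence monitoring.
@dataclass
class Policy:
    allowed: set = field(default_factory=set)  # attributes the user permits
    log: list = field(default_factory=list)    # audit trail for monitoring

    def request(self, service, attrs, data):
        granted = {a: data[a] for a in attrs if a in self.allowed}
        denied = [a for a in attrs if a not in self.allowed]
        self.log.append((service, sorted(granted), denied))
        return granted, denied

policy = Policy(allowed={"age_range", "language"})
data = {"age_range": "30-39", "language": "fi", "location": "downtown"}
granted, denied = policy.request("ad-service", ["language", "location"], data)
print(granted, denied)  # only policy-permitted attributes are released
```

The audit log is the "monitoring adherence" half of the abstract: disclosures can be reviewed after the fact, not just blocked up front.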

  15. sbtools: A package connecting R to cloud-based data for collaborative online research

    USGS Publications Warehouse

    Winslow, Luke; Chamberlain, Scott; Appling, Alison P.; Read, Jordan S.

    2016-01-01

    The adoption of high-quality tools for collaboration and reproducible research such as R and Github is becoming more common in many research fields. While Github and other version management systems are excellent resources, they were originally designed to handle code and scale poorly to large text-based or binary datasets. A number of scientific data repositories are coming online and are often focused on dataset archival and publication. To handle collaborative workflows using large scientific datasets, there is increasing need to connect cloud-based online data storage to R. In this article, we describe how the new R package sbtools enables direct access to the advanced online data functionality provided by ScienceBase, the U.S. Geological Survey’s online scientific data storage platform.

  16. Data reduction programs for a laser radar system

    NASA Technical Reports Server (NTRS)

    Badavi, F. F.; Copeland, G. E.

    1984-01-01

    The listing and description of software routines which were used to analyze the analog data obtained from the LIDAR system are given. All routines are written in FORTRAN IV on an HP-1000/F minicomputer which serves as the heart of the data acquisition system for the LIDAR program. This particular system has 128 kilobytes of high-speed memory and is equipped with a Vector Instruction Set (VIS) firmware package, which is used in all the routines to handle quick execution of long loops. The system handles floating point arithmetic in hardware in order to enhance the speed of execution. This computer is a 2177 C/F series version of the HP-1000 RTE-IVB data acquisition computer system, which is designed for real-time data capture/analysis and a disk/tape mass storage environment.

  17. Space biology initiative program definition review. Trade study 1: Automation costs versus crew utilization

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.

    1989-01-01

    A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and the automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers. Additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots' were investigated and found to be quite appropriate for adaptation to the SBI program. There are other in-house NASA efforts that provide technology that may be appropriate for the SBI program. Important data is believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because the medical labs are not affected by the power and space constraints that Space Station medical equipment is faced with. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.

  18. HAZARDOUS SUBSTANCES DATA BANK (HSDB)

    EPA Science Inventory

    Hazardous Substances Data Bank (HSDB) is a factual, non-bibliographic data bank focusing upon the toxicology of potentially hazardous chemicals. It is enhanced with data from such related areas as emergency handling procedures, environmental fate, human exposure, detection method...

  19. Data mining in soft computing framework: a survey.

    PubMed

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.
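
To make the fuzzy-set point above concrete, a graded membership function handles borderline or noisy values without a hard cutoff; the triangular shape and the thresholds below are illustrative choices, not from the survey:

```python
# Tiny illustration of why fuzzy sets suit noisy/incomplete data: a
# value near a category boundary gets a graded membership instead of a
# crisp yes/no. The triangular function and limits are invented.
def tri_membership(x, lo, peak, hi):
    # Triangular fuzzy membership: 0 outside [lo, hi], 1 at the peak.
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

# "Around 37" as a fuzzy concept rather than a crisp interval.
for value in (36.0, 37.0, 38.5):
    print(value, tri_membership(value, 35.0, 37.0, 39.0))
```

A crisp rule would flip from 0 to 1 at an arbitrary threshold; the graded version degrades smoothly, which is what lets fuzzy pattern descriptions tolerate measurement noise.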

  20. Considerations of multiple imputation approaches for handling missing data in clinical trials.

    PubMed

    Quan, Hui; Qi, Li; Luo, Xiaodong; Darchy, Loic

    2018-07-01

    Missing data exist in all clinical trials, and the missing data issue is a serious one for the interpretability of trial results. There is no universally applicable solution for all missing data problems. Methods used for handling missing data depend on the circumstances, particularly the assumptions on the missing data mechanisms. In recent years, if the missing-at-random mechanism cannot be assumed, conservative approaches such as the control-based and returning-to-baseline multiple imputation approaches are applied for dealing with the missing data. In this paper, we focus on the variability in data analysis of these approaches. As demonstrated by examples, the choice of the variability can impact the conclusion of the analysis. Besides the methods for continuous endpoints, we also discuss methods for binary and time-to-event endpoints as well as considerations for non-inferiority assessment. Copyright © 2018. Published by Elsevier Inc.
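
A schematic of control-based imputation, one of the conservative approaches the paper discusses: dropouts in the treatment arm are imputed from the control arm's distribution. The data, the simple normal model and the seed are invented, and a real analysis would use covariate-adjusted imputation with full Rubin's-rules variance:

```python
import numpy as np

# Schematic control-based multiple imputation: missing treatment-arm
# values are drawn from the control arm's estimated distribution, the
# conservative assumption described above. All numbers are invented.
rng = np.random.default_rng(0)

control = np.array([1.0, 1.4, 0.8, 1.2, 1.1])          # observed control outcomes
treat_obs = np.array([2.0, 2.2, np.nan, 1.9, np.nan])  # treatment arm, 2 dropouts

m = 20  # number of imputations
mu, sd = control.mean(), control.std(ddof=1)
estimates = []
for _ in range(m):
    filled = treat_obs.copy()
    miss = np.isnan(filled)
    filled[miss] = rng.normal(mu, sd, miss.sum())      # control-based draws
    estimates.append(filled.mean() - control.mean())   # treatment effect
point = np.mean(estimates)               # pooled point estimate
between_var = np.var(estimates, ddof=1)  # between-imputation variability
print(point, between_var)
```

Note how the pooled effect is pulled below the complete-case estimate: imputing dropouts as if they behaved like controls is exactly the conservatism, and the between-imputation variance is one component of the variability the paper examines.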

  1. A new technique in reference based DNA sequence compression algorithm: Enabling partial decompression

    NASA Astrophysics Data System (ADS)

    Banerjee, Kakoli; Prasad, R. A.

    2014-10-01

    The whole gamut of genetic data is increasing exponentially. The human genome in its base format occupies almost thirty terabytes of data, and such data is doubling in size every two and a half years. It is well known that computational resources are limited. The most important resource which genetic data requires in its collection, storage and retrieval is storage space. Storage is limited, and computational performance is also dependent on storage and execution time. Transmission capabilities are likewise directly dependent on the size of the data. Hence data compression techniques become an issue of utmost importance when we confront the task of handling such gigantic databases as GenBank. Decompression is also an issue when such huge databases are being handled. This paper is intended not only to provide genetic data compression but also to enable partial decompression of the genetic sequences.
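
One simple way to obtain the partial-decompression property the paper targets is block-indexed packing, where each fixed-size block decodes independently; the 2-bit codes and tiny block size below are arbitrary illustrative choices, and the reference-based differencing of the actual algorithm is omitted:

```python
# Minimal block-indexed 2-bit DNA packing: each fixed-size block can be
# decoded on its own, which is the "partial decompression" property the
# paper targets. Block size and codes are arbitrary; reference-based
# differencing is omitted.
CODE = {"A": 0, "C": 1, "G": 2, "T": 3}
BASE = "ACGT"
BLOCK = 4  # bases per independently decodable block

def compress(seq):
    blocks = []
    for i in range(0, len(seq), BLOCK):
        chunk = seq[i:i + BLOCK]
        packed = 0
        for b in chunk:
            packed = (packed << 2) | CODE[b]  # 2 bits per base
        blocks.append((packed, len(chunk)))
    return blocks

def decompress_block(blocks, k):
    packed, n = blocks[k]  # decode only block k: no full-sequence scan
    out = []
    for _ in range(n):
        out.append(BASE[packed & 3])
        packed >>= 2
    return "".join(reversed(out))

blocks = compress("ACGTTGCAAC")
print(decompress_block(blocks, 1))  # -> "TGCA", without touching other blocks
```

Because each `(packed, length)` entry is self-contained, a reader can jump straight to the region of interest, the same access pattern a reference-based scheme needs to support to be useful for random queries into GenBank-scale data.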

  2. Uvf - Unified Volume Format: A General System for Efficient Handling of Large Volumetric Datasets.

    PubMed

    Krüger, Jens; Potter, Kristin; Macleod, Rob S; Johnson, Christopher

    2008-01-01

    With the continual increase in computing power, volumetric datasets with sizes ranging from only a few megabytes to petascale are generated thousands of times per day. Such data may come from an ordinary source such as simple everyday medical imaging procedures, while larger datasets may be generated from cluster-based scientific simulations or measurements of large scale experiments. In computer science an incredible amount of work worldwide is put into the efficient visualization of these datasets. As researchers in the field of scientific visualization, we often have to face the task of handling very large data from various sources. This data usually comes in many different data formats. In medical imaging, the DICOM standard is well established; however, most research labs use their own data formats to store and process data. To simplify the task of reading the many different formats used with all of the different visualization programs, we present a system for the efficient handling of many types of large scientific datasets (see Figure 1 for just a few examples). While primarily targeted at structured volumetric data, UVF can store just about any type of structured and unstructured data. The system is composed of a file format specification with a reference implementation of a reader. It is not only a common, easy-to-implement format but also allows for efficient rendering of most datasets without the need to convert the data in memory.
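
The core idea, reading a sub-region of a structured volume without loading the whole dataset, can be illustrated with a plain numpy memmap standing in for the real UVF container; the sizes and synthetic data below are invented:

```python
import os
import tempfile
import numpy as np

# Sketch of the access pattern behind formats like UVF: a structured
# volume on disk from which a sub-brick is read without loading the
# whole file. A raw numpy memmap stands in for the real container.
shape = (64, 64, 64)
path = os.path.join(tempfile.mkdtemp(), "volume.raw")

vol = np.memmap(path, dtype=np.uint8, mode="w+", shape=shape)
vol[:] = np.arange(64, dtype=np.uint8)[None, None, :]  # synthetic ramp data
vol.flush()

# Reopen read-only and pull one brick; only the touched pages are read.
ro = np.memmap(path, dtype=np.uint8, mode="r", shape=shape)
brick = ro[8:16, 8:16, 8:16]
print(brick.shape, int(brick[0, 0, 0]))
```

A real container adds what a bare memmap lacks, e.g. a self-describing header and multiresolution bricks, but the no-copy sub-region read is the property that makes rendering possible "without the need to convert the data in memory".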

  3. Hearing aid handling skills: relationship with satisfaction and benefit.

    PubMed

    Campos, Patrícia Danieli; Bozza, Amanda; Ferrari, Deborah Viviane

    2014-01-01

    To evaluate hearing aid handling skills for new and experienced users and to assess if such skills influence user's benefit and satisfaction. Seventy four participants (mean age of 70.43), experienced (n=37) or new hearing aid users (n=37) performed the tasks of "Practical Hearing Aid Skills Test" (PHAST), which were scored on a five-point Likert scale - higher scores indicate better hearing aid handling skills. Experienced users answered the International Outcome Inventory for Hearing Aids (IOI-HA) and the hearing aid benefit for handicap reduction was calculated by the hearing handicap inventory (HHIA/HHIE). Medians for PHAST total scores of 79 and 71% were obtained for experienced and new users, respectively - there were no significant difference between groups. Lower PHAST scores were observed for the tasks of volume control manipulation and telephone usage. Moderate correlations were obtained between IOI benefit and quality of life items and the PHAST scores. There was no correlation between the results of PHAST and demographic data of the participants. There was no difference in handling skills between new and experienced hearing aid users. Handling skills affected hearing aid benefit.

  4. Helicopter roll control effectiveness criteria program summary

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Bourne, Simon M.; Mnich, Marc A.

    1988-01-01

    A study of helicopter roll control effectiveness is summarized for the purpose of defining military helicopter handling qualities requirements. The study is based on an analysis of pilot-in-the-loop task performance of several basic maneuvers. This is extended by a series of piloted simulations using the NASA Ames Vertical Motion Simulator and selected flight data. The main results cover roll control power and short-term response characteristics. In general the handling qualities requirements recommended are set in conjunction with desired levels of flight task and maneuver response which can be directly observed in actual flight. An important aspect of this, however, is that vehicle handling qualities need to be set with regard to some quantitative aspect of mission performance. Specific examples of how this can be accomplished include a lateral unmask/remask maneuver in the presence of a threat and an air tracking maneuver which recognizes the kill probability enhancement connected with decreasing the range to the target. Conclusions and recommendations address not only the handling qualities recommendations, but also the general use of flight simulators and the dependence of mission performance on handling qualities.

  5. Assessing acute effects of trapping, handling, and tagging on the behavior of wildlife using GPS telemetry: a case study of the common brushtail possum.

    PubMed

    Dennis, Todd E; Shah, Shabana F

    2012-01-01

    Trapping, handling, and deployment of tracking devices (tagging) are essential aspects of many research and conservation studies of wildlife. However, often these activities place nonhuman animals under considerable physical or psychological distress, which disrupts normal patterns of behavior and may ultimately result in deleterious effects on animal welfare and the validity of research results. Thus, knowledge of how trapping, handling, and tagging alter the behavior of research animals is essential if measures to ameliorate stress-related effects are to be developed and implemented. This article describes how time-stamped location data obtained by global-positioning-system telemetry can be used to retrospectively characterize acute behavioral responses to trapping, handling, and tagging in free-ranging animals used for research. Methods are demonstrated in a case study of the common brushtail possum, a semiarboreal phalangerid marsupial native to Australia. The study discusses possible physiological causes of observed effects and offers general suggestions regarding simple means to reduce trapping-handling-and-tagging-related stress in field studies of vertebrates.
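
The retrospective analysis described reduces time-stamped GPS fixes to step lengths and speeds, so depressed movement in the hours after release can be quantified; a minimal sketch with invented coordinates in a local metric frame:

```python
import math

# Reduce time-stamped GPS fixes to per-interval speeds, the kind of
# retrospective behavioral metric described above. Fixes are invented
# (t in seconds, x and y in metres in a local frame).
def step_speeds(fixes):
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(fixes, fixes[1:]):
        d = math.hypot(x1 - x0, y1 - y0)  # straight-line step length
        speeds.append(d / (t1 - t0))      # metres per second
    return speeds

# Ten-minute fix interval; the first post-release step is much shorter.
fixes = [(0, 0.0, 0.0), (600, 3.0, 4.0), (1200, 33.0, 44.0)]
print(step_speeds(fixes))
```

Comparing the speed distribution in the hours after release against later baseline behavior is how a tagging-and-handling effect would show up in such data.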

  6. Exception handling for sensor fusion

    NASA Astrophysics Data System (ADS)

    Chavez, G. T.; Murphy, Robin R.

    1993-08-01

    This paper presents a control scheme for handling sensing failures (sensor malfunctions, significant degradations in performance due to changes in the environment, and errant expectations) in sensor fusion for autonomous mobile robots. The advantages of the exception handling mechanism are that it emphasizes a fast response to sensing failures, is able to use only a partial causal model of sensing failure, and leads to a graceful degradation of sensing if the sensing failure cannot be compensated for. The exception handling mechanism consists of two modules: error classification and error recovery. The error classification module in the exception handler attempts to classify the type and source(s) of the error using a modified generate-and-test procedure. If the source of the error is isolated, the error recovery module examines its cache of recovery schemes, which either repair or replace the current sensing configuration. If the failure is due to an error in expectation or cannot be identified, the planner is alerted. Experiments using actual sensor data collected by the CSM Mobile Robotics/Machine Perception Laboratory's Denning mobile robot demonstrate the operation of the exception handling mechanism.
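
A schematic generate-and-test loop in the spirit of the error classification module; the hypotheses, tests and sensor readings are invented, not the laboratory's actual causal model:

```python
# Schematic generate-and-test error classification for sensing failures,
# in the spirit of the module described above: candidate causes from a
# partial causal model are tested in order against the evidence.
# Hypotheses, thresholds, and readings are invented.
def classify(readings, expected):
    hypotheses = [
        ("sensor malfunction",
         lambda r: r["variance"] > 10 * expected["variance"]),
        ("environment change",
         lambda r: abs(r["mean"] - expected["mean"]) > 3.0),
        ("errant expectation", lambda r: True),  # fallback: alert the planner
    ]
    for name, test in hypotheses:  # generate, then test each candidate
        if test(readings):
            return name
    return "unclassified"

expected = {"mean": 5.0, "variance": 0.2}
print(classify({"mean": 5.1, "variance": 4.0}, expected))  # sensor malfunction
print(classify({"mean": 9.5, "variance": 0.2}, expected))  # environment change
```

The ordering matters: a partial causal model only has to rank the cheap, discriminating tests first, with the planner-alert fallback catching anything the model cannot isolate, which matches the graceful-degradation goal stated above.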

  7. Documentation forms for monitoring occupational surveillance of healthcare workers who handle cytotoxic drugs.

    PubMed

    Parillo, V L

    1994-01-01

    To develop a procedure for medical surveillance of healthcare workers who handle cytotoxic drugs. Literature review and guidelines published by the Occupational Safety and Health Administration and the National Institute for Occupational Safety and Health. INFORMATION SELECTION: Studies of possible exposure screening tests, congenital defects in offspring, and case studies. Some degree of risk exists in handling cytotoxic drugs, but no reliable screening test for cytotoxic drug exposure has been developed. Reproductive hazards are possible when protective equipment is not used. Areas to be addressed when devising surveillance procedures include who to cover, what baseline data to gather, what periodic monitoring will be necessary (and at what interval it will be conducted), how to handle exposure incidents, and what documentation system will be used. A procedure using a baseline risk factor form and a yearly monitoring questionnaire was devised and implemented. Forms contain documentation of worker teaching. Most often, nurses are the healthcare workers who handle cytotoxic drugs. A consistent approach to monitoring healthcare workers is facilitated by using a defined procedure and standardized forms.

  8. Hand-handle interface force and torque measurement system for pneumatic assembly tool operations: suggested enhancement to ISO 6544.

    PubMed

    Lin, Jia-Hua; McGorry, Raymond W; Chang, Chien-Chi

    2007-05-01

    A hand-handle interface force and torque measurement system is introduced to fill the void acknowledged in the international standard ISO 6544, which governs pneumatic, assembly tool reaction torque and force measurement. This system consists of an instrumented handle with a sensor capable of measuring grip force and reaction hand moment when threaded, fastener-driving tools are used by operators. The handle is rigidly affixed to the tool in parallel to the original tool handle allowing normal fastener-driving operations with minimal interference. Demonstration of this proposed system was made with tools of three different shapes: pistol grip, right angle, and in-line. During tool torque buildup, the proposed system measured operators exerting greater grip force on the soft joint than on the hard joint. The system also demonstrated that the soft joint demanded greater hand moment impulse than the hard joint. The results demonstrate that the measurement system can provide supplemental data useful in exposure assessment with power hand tools as proposed in ISO 6544.

  9. Structural dynamic model obtained from flight for use with piloted simulation and handling qualities analysis

    NASA Technical Reports Server (NTRS)

    Powers, Bruce G.

    1996-01-01

    A technique for using flight data to determine an aircraft model with structural dynamic effects, suitable for piloted simulation and handling qualities analysis, has been developed. This technique was demonstrated using SR-71 flight test data. For the SR-71 aircraft, the most significant structural response is the longitudinal first-bending mode. This mode was modeled as a second-order system, and the other higher order modes were modeled as a time delay. The distribution of the modal response at various fuselage locations was developed using a uniform beam solution, which can be calibrated using flight data. This approach was compared to the mode shape obtained from the ground vibration test, and the general form of the uniform beam solution was found to be a good representation of the mode shape in the areas of interest. To calibrate the solution, pitch-rate and normal-acceleration instrumentation is required at a minimum of two locations. With the resulting structural model incorporated into the simulation, a good representation of the flight characteristics was provided for handling qualities analysis and piloted simulation.
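    The mode-plus-delay idea above can be sketched compactly. Below is a minimal, hypothetical Python illustration (not the SR-71 simulation itself; the mode frequency, damping ratio, and delay length are invented for the example) of a second-order bending mode driven by a unit step, with the lumped higher-order modes represented as a pure transport delay:

```python
import math

def second_order_step_response(wn, zeta, dt, n_steps):
    """Unit-step response of a second-order mode
    x'' + 2*zeta*wn*x' + wn**2*x = wn**2*u, integrated with Euler steps."""
    x, v = 0.0, 0.0
    out = []
    for _ in range(n_steps):
        a = wn * wn * (1.0 - x) - 2.0 * zeta * wn * v  # u = 1 (unit step)
        v += a * dt
        x += v * dt
        out.append(x)
    return out

def transport_delay(signal, n_samples):
    """Lump the remaining higher-order modes into a pure time delay."""
    if n_samples <= 0:
        return list(signal)
    return [0.0] * n_samples + signal[:-n_samples]

# Invented numbers: a lightly damped 2 Hz first-bending mode, 20 ms delay.
resp = second_order_step_response(wn=2.0 * math.pi * 2.0, zeta=0.05,
                                  dt=0.001, n_steps=5000)
delayed = transport_delay(resp, 20)
```

With flight-calibrated frequency and damping, such a mode model can be summed with the rigid-body response at each sensor location.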

  10. A versatile system for the rapid collection, handling and graphics analysis of multidimensional data

    NASA Astrophysics Data System (ADS)

    O'Brien, P. M.; Moloney, G.; O'Connor, A.; Legge, G. J. F.

    1993-05-01

    The aim of this work was to provide a versatile system for handling multiparameter data that may arise from a variety of experiments: nuclear, AMS, microprobe elemental analysis, 3D microtomography, etc. Some of the most demanding requirements arise in the application of microprobes to quantitative elemental mapping and to microtomography. A system to handle data from such experiments has been under continuous development and use at MARC for the past 15 years, and it has now been made adaptable to the needs of multiparameter (or single-parameter) experiments in general. The original system has been rewritten, greatly expanded, and made much more powerful and faster through the use of modern computer technology: a VME bus computer with a real-time operating system and a RISC workstation running Unix and the X Window System. This provides the necessary (i) power, speed and versatility, (ii) expansion and updating capabilities, (iii) standardisation and adaptability, (iv) coherent modular programming structures, (v) ability to interface to other programs and (vi) transparent operation at several levels, involving the use of menus, programmed function keys and powerful macro programming facilities.

  11. 3D Boolean operations in virtual surgical planning.

    PubMed

    Charton, Jerome; Laurentjoye, Mathieu; Kim, Youngjun

    2017-10-01

    Boolean operations in computer-aided design or computer graphics are a set of operations (e.g. intersection, union, subtraction) between two objects (e.g. a patient model and an implant model) that are important in performing accurate and reproducible virtual surgical planning. They require accurate and robust techniques that can handle various types of data, such as surfaces extracted from volumetric data, synthetic models, and 3D scan data. This article compares the performance of the proposed method (Boolean operations by a robust, exact, and simple method between two colliding shells (BORES)) and an existing method based on the Visualization Toolkit (VTK). In all tests presented in this article, BORES could handle complex configurations as well as report impossible configurations of the input. In contrast, the VTK implementations were unstable, did not handle singular edges and coplanar collisions, and produced several defects. The proposed method of Boolean operations, BORES, is efficient and appropriate for virtual surgical planning. Moreover, it is simple and easy to implement. In future work, we will extend the proposed method to handle non-colliding components.

  12. Are special read alignment strategies necessary and cost-effective when handling sequencing reads from patient-derived tumor xenografts?

    PubMed

    Tso, Kai-Yuen; Lee, Sau Dan; Lo, Kwok-Wai; Yip, Kevin Y

    2014-12-23

    Patient-derived tumor xenografts in mice are widely used in cancer research and have become important in developing personalized therapies. When these xenografts are subject to DNA sequencing, the samples could contain various amounts of mouse DNA. It has been unclear how the mouse reads would affect data analyses. We conducted comprehensive simulations to compare three alignment strategies at different mutation rates, read lengths, sequencing error rates, human-mouse mixing ratios and sequenced regions. We also sequenced a nasopharyngeal carcinoma xenograft and a cell line to test how the strategies work on real data. We found the "filtering" and "combined reference" strategies performed better than aligning reads directly to the human reference in terms of alignment and variant calling accuracies. The combined reference strategy was particularly good at reducing false negative variant calls without significantly increasing the false positive rate. In some scenarios the performance gain of these two special handling strategies was too small for special handling to be cost-effective, but it was found crucial when false non-synonymous SNVs should be minimized, especially in exome sequencing. Our study systematically analyzes the effects of mouse contamination in the sequencing data of human-in-mouse xenografts. Our findings provide information for designing data analysis pipelines for these data.
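    The gist of the combined-reference strategy can be shown with a toy sketch (the read IDs and alignment scores below are invented; a real pipeline aligns against a concatenated human+mouse reference and compares full alignments and mapping qualities, not bare scores):

```python
def assign_reads(scores):
    """Combined-reference triage: send each read to the species whose
    reference it aligns to best; drop ambiguous (tied) reads."""
    human, mouse, ambiguous = [], [], []
    for read_id, (human_score, mouse_score) in scores.items():
        if human_score > mouse_score:
            human.append(read_id)
        elif mouse_score > human_score:
            mouse.append(read_id)
        else:
            ambiguous.append(read_id)
    return human, mouse, ambiguous

# Hypothetical alignment scores: (score vs human ref, score vs mouse ref).
example = {"read1": (60, 20), "read2": (10, 55), "read3": (40, 40)}
human_reads, mouse_reads, ambiguous_reads = assign_reads(example)
```

Only the human-assigned reads would then enter variant calling, which is how removing mouse contamination reduces false positive calls.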

  13. A prototype of a computerized patient record.

    PubMed

    Adelhard, K; Eckel, R; Hölzel, D; Tretter, W

    1995-01-01

    Computerized medical record systems (CPRS) should present user- and problem-oriented views of the patient file. Problem lists, clinical course, medication profiles and results of examinations have to be recorded in a computerized patient record. Patient review screens should give a synopsis of the patient data whenever the patient record is opened. Several different types of data have to be stored in a patient record; qualitative and quantitative measurements, narratives and images are examples. Therefore, a CPR must also be able to handle these different data types. New methods and concepts appear frequently in medicine, so a CPRS must be flexible enough to cope with coming demands. We developed a prototype of a computer-based patient record with a graphical user interface on a SUN workstation. The basis of the system is a dynamic data dictionary, an interpreter language and a large set of basic functions. This approach gives the system optimal flexibility. Many different data types are already supported, and extensions are easily possible. There is also almost no limit on the number of medical concepts that can be handled by our prototype. Several applications were built on this platform; some of them are presented to exemplify the patient- and problem-oriented handling of the CPR.

  14. RSV (Research Safety Vehicle) test monitoring and data publication-results of European performance and handling test on the Calspan RSV. Final report, Apr-May 79

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, N.; Davis, S.

    1979-07-01

    Performance and handling tests on the Calspan RSV were performed in Italy by the Istituto Sperimentale Auto E Motori (ISAM) and in West Germany by Volkswagenwerk AG Wolfsburg. The ISAM tests evaluated the Calspan RSV in the areas of fuel economy, vehicle response, braking and handling, and driver environment. The Volkswagen tests evaluated the Calspan RSV in the areas of braking, steering, handling, and overturning immunity. The ISAM tests are unlike any previously used to evaluate American vehicles. Therefore, the Calspan RSV results are compared to those of ten European cars which had undergone identical tests. The Volkswagen test procedures were identical to those specified in the Research Safety Vehicle program. The Calspan RSV results are compared to the RSV specifications for these tests.

  15. Contact Analog/Compressed Symbology Heading Tape Assessment

    NASA Technical Reports Server (NTRS)

    Shively, R. Jay; Atencio, Adolph; Turpin, Terry; Dowell, Susan

    2002-01-01

    A simulation assessed the performance, handling qualities and workload associated with a contact-analog, world-referenced heading tape as implemented on the Comanche Helmet Integrated Display Sight System (HIDSS), compared with a screen-fixed, compressed heading tape. Six pilots (four active-duty Army aviators and two civilians) flew three ADS-33 maneuvers and a traffic pattern in the Ames Vertical Motion Simulation facility. Small but statistically significant advantages were found for the compressed symbology in handling qualities, workload, and some of the performance measures. It should be noted, however, that the level of performance and handling qualities for both symbology sets fell within the acceptable tolerance levels. Both symbology sets yielded satisfactory handling qualities and performance in velocity stabilization mode and adequate handling qualities in the automatic flight control mode. Pilot comments about the contact-analog symbology highlighted the lack of useful rate-of-change information in the heading tape and "blurring" due to the rapid movement of the heading tape. These issues warrant further study. Care must be taken in interpreting the operational significance of these results: the symbology sets yielded categorically similar data, i.e., acceptable handling qualities and adequate performance, so while the results point to the need for further study, their operational significance has yet to be determined.

  16. A Survey of Rabbit Handling Methods Within the United Kingdom and the Republic of Ireland.

    PubMed

    Oxley, James Andrew; Ellis, Clare Frances; McBride, E Anne; McCormick, Wanda Denise

    2018-04-25

    Rabbits are commonly kept in a variety of settings, including homes, laboratories, and veterinary clinics. Despite the popularity of keeping this prey species, little research has investigated current methods of handling. The aim of this study was to examine the experience of caregivers (owners and keepers) in using five handling methods commonly referred to in books written for companion animal (pet) owners and veterinary and/or laboratory personnel. An online survey was completed by 2644 respondents, representing all three of these groups, and breeders. Data were acquired to determine sources that participants used to gain knowledge of different handling methods, the methods they used and for what purposes they used them, and their perceptions of any associated difficulties or welfare concerns. Results indicated that participants most frequently used the method of supporting a rabbit's body against a person's chest, which was considered the easiest and most welfare-friendly method of the handling methods explored. "Scruffing with rear support" was the least used method and was considered to be distressing and painful for the rabbit. As rabbits are a terrestrial prey species, being picked up is likely an innately stressful experience. Additional research is encouraged to explore the experience of rabbits during handling to identify methods that can be easily used with the fewest welfare compromises.

  17. Shoulder torques resulting from luggage handling tasks in non-inertial frames.

    PubMed

    Shippen, James; May, Barbara

    2018-05-18

    This paper reports on the torques developed in the shoulder joint experienced by occupants of moving vehicles during manual handling tasks. Handling heavy weights can cause musculoskeletal injuries, especially if handling is done with arms extended or at high levels. The aim of the study was to measure the longitudinal and lateral accelerations in a variety of passenger vehicles together with the postures of subjects lifting luggage onto storage shelves. These data enabled the application of inverse dynamics methods in a non-inertial reference frame to calculate the shoulder joint torques. The subjects lifted 3 pieces of luggage with masses of 5 kg, 10 kg and 14 kg onto shelving at heights of 1.2 m, 1.6 m and 1.8 m. The movement of subjects was measured using a 12-camera, 3-dimensional optical tracking system. The subjects stood on force plates to measure the ground reaction forces. Sixty-three trials were completed, although 9 trials were aborted because subjects felt unable to complete the task. It was found that the shoulder torques exceeded the levels recommended by the UK Health and Safety Executive for manual handling. A lift assistance device is suggested to reduce the shoulder torques required for luggage handling.
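    A much-simplified, quasi-static version of the non-inertial torque calculation can illustrate why vehicle acceleration matters. The sketch below uses a point-mass arm and invented distances; the paper's full inverse-dynamics model instead uses measured kinematics and ground reaction forces:

```python
G = 9.81  # gravitational acceleration, m/s^2

def shoulder_torque(mass_kg, horiz_arm_m, vert_arm_m, vehicle_accel):
    """Quasi-static shoulder torque (N*m) for a point mass held away from
    the shoulder: gravity acts about the horizontal moment arm, while the
    inertial (pseudo) force from vehicle acceleration acts about the
    vertical one."""
    torque_gravity = mass_kg * G * horiz_arm_m
    torque_inertial = mass_kg * vehicle_accel * vert_arm_m
    return torque_gravity + torque_inertial

# Invented example: a 14 kg case held 0.4 m forward of and 0.5 m below the
# shoulder while the vehicle accelerates at 2 m/s^2.
static = shoulder_torque(14.0, 0.4, 0.5, 0.0)
accelerating = shoulder_torque(14.0, 0.4, 0.5, 2.0)
```

Even this crude model shows the torque growing with vehicle acceleration, which is the effect the non-inertial-frame analysis quantifies properly.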

  18. Exploring Data: Euclid's Way.

    ERIC Educational Resources Information Center

    Brinkworth, Peter

    1998-01-01

    Introduces handling data as conceived by Euclid, which provides some interesting possibilities for students to investigate fundamental geometrical ideas as well as relating some elementary geometry with elementary trigonometry. (ASK)

  19. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. ?? 1988 International Association for Mathematical Geology.

  20. Design and evaluation of a new ergonomic handle for instruments in minimally invasive surgery.

    PubMed

    Sancibrian, Ramon; Gutierrez-Diez, María C; Torre-Ferrero, Carlos; Benito-Gonzalez, Maria A; Redondo-Figuero, Carlos; Manuel-Palazuelos, Jose C

    2014-05-01

    Laparoscopic surgery techniques have been demonstrated to provide massive benefits to patients. However, surgeons face demanding working conditions because of the poor ergonomic design of the instruments. In this article, a new ergonomic handle design is presented. This handle is designed using ergonomic principles, trying to provide both more intuitive manipulation of the instrument and a shape that reduces the high-pressure zones in the contact with the surgeon's hand. The ergonomic characteristics of the new handle were evaluated using objective and subjective studies. The experimental evaluation was performed using 28 volunteers by means of the comparison of the new handle with the ring-handle (RH) concept in an instrument available on the market. The volunteers' muscle activation and motions of the hand, wrist, and arm were studied while they performed different tasks. The data measured in the experiment include electromyography and goniometry values. The results obtained from the subjective analysis reveal that most volunteers (64%) preferred the new prototype to the RH, reporting less pain and less difficulty to complete the tasks. The results from the objective study reveal that the hyperflexion of the wrist required for the manipulation of the instrument is strongly reduced. The new ergonomic handle not only provides important ergonomic advantages but also improves the efficiency when completing the tasks. Compared with RH instruments, the new prototype reduced the high-pressure areas and the extreme motions of the wrist. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. FPGA based data processing in the ALICE High Level Trigger in LHC Run 2

    NASA Astrophysics Data System (ADS)

    Engel, Heiko; Alt, Torsten; Kebschull, Udo; ALICE Collaboration

    2017-10-01

    The ALICE High Level Trigger (HLT) is a computing cluster dedicated to the online compression, reconstruction and calibration of experimental data. The HLT receives detector data via serial optical links into FPGA-based readout boards that pre-process the data at a per-link level inside the FPGA and provide it to the host machines connected with a data transport framework. FPGA-based data pre-processing is enabled for the biggest detector of ALICE, the Time Projection Chamber (TPC), with a hardware cluster finding algorithm. This algorithm was ported to the Common Read-Out Receiver Card (C-RORC) as used in the HLT for Run 2. It was improved to handle double the input bandwidth and adjusted to the upgraded TPC Readout Control Unit (RCU2). A flexible firmware implementation in the HLT handles both the old and the new TPC data formats and link rates transparently. Extended protocol and data error detection, error handling and the enhanced RCU2 data ordering scheme provide improved physics performance of the cluster finder. The performance of the cluster finder was verified against large sets of reference data, both in terms of throughput and algorithmic correctness. Comparisons with a software reference implementation confirm significant savings in CPU processing power using the hardware implementation. The C-RORC hardware with the cluster finder for RCU1 data has been in use in the HLT since the start of Run 2. The extended hardware cluster finder implementation for the RCU2, with doubled throughput, has been active since the upgrade of the TPC readout electronics in early 2016.
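    The production cluster finder is FPGA firmware, but the essence of charge-cluster finding can be shown with a one-dimensional toy in Python (the charges and threshold are invented): consecutive above-threshold samples are grouped into a run and reduced to a centroid position and a total charge.

```python
def find_clusters(samples, threshold):
    """Group consecutive above-threshold charge samples into clusters and
    return (centroid, total_charge) for each one."""
    clusters = []
    run = []  # (index, charge) pairs of the current above-threshold run
    for i, q in enumerate(list(samples) + [0.0]):  # sentinel closes a run
        if q > threshold:
            run.append((i, q))
        elif run:
            total = sum(charge for _, charge in run)
            centroid = sum(idx * charge for idx, charge in run) / total
            clusters.append((centroid, total))
            run = []
    return clusters

# Invented ADC samples with two charge deposits.
adc = [0.0, 0.0, 2.0, 8.0, 2.0, 0.0, 0.0, 5.0, 5.0, 0.0]
clusters = find_clusters(adc, threshold=1.0)
```

The real algorithm does this in two dimensions across TPC pad rows and time bins, at link rate, which is why a hardware implementation saves so much CPU.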

  2. Measuring the Association Between Body Mass Index and All-Cause Mortality in the Presence of Missing Data: Analyses From the Scottish National Diabetes Register.

    PubMed

    Read, Stephanie H; Lewis, Steff C; Halbesma, Nynke; Wild, Sarah H

    2017-04-15

    Incorrectly handling missing data can lead to imprecise and biased estimates. We describe the effect of applying different approaches to handling missing data in an analysis of the association between body mass index and all-cause mortality among people with type 2 diabetes. We used data from the Scottish diabetes register that were linked to hospital admissions data and death registrations. The analysis was based on people diagnosed with type 2 diabetes between 2004 and 2011, with follow-up until May 31, 2014. The association between body mass index and mortality was investigated using Cox proportional hazards models. Findings were compared using 4 different missing-data methods: complete-case analysis, 2 multiple-imputation models, and nearest-neighbor imputation. There were 124,451 cases of type 2 diabetes, among which there were 17,085 deaths during 787,275 person-years of follow-up. Patients with missing data (24.8%) had higher mortality than those without missing data (adjusted hazard ratio = 1.36, 95% confidence interval: 1.31, 1.41). A U-shaped relationship between body mass index and mortality was observed, with the lowest hazard ratios occurring among moderately obese people, regardless of the chosen approach for handling missing data. Missing data may affect absolute and relative risk estimates differently and should be considered in analyses of routinely collected data. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
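    A toy example makes the core hazard concrete. When missingness is related to the unobserved value (here, hypothetically, the heaviest patients go unmeasured), a complete-case mean is biased, and single mean imputation simply propagates that bias; the multiple-imputation models compared in the paper aim to do better by drawing several plausible fills from a model of the observed data:

```python
def complete_case_mean(values):
    """Mean over the observed values only (complete-case analysis)."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present)

def mean_impute(values):
    """Single mean imputation: fill every gap with the observed mean."""
    fill = complete_case_mean(values)
    return [fill if v is None else v for v in values]

# Invented BMI data: the two heaviest patients are missing a measurement.
true_bmi = [22.0, 24.0, 26.0, 28.0, 35.0, 38.0]
observed = [22.0, 24.0, 26.0, 28.0, None, None]

true_mean = sum(true_bmi) / len(true_bmi)   # about 28.8
biased_mean = complete_case_mean(observed)  # 25.0, an underestimate
```

This is the mechanism behind the paper's warning that patients with missing data had systematically higher mortality: dropping them does not just lose power, it shifts the estimate.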

  3. Long-term efficacy of an ergonomics program that includes patient-handling devices on reducing musculoskeletal injuries to nursing personnel.

    PubMed

    Garg, Arun; Kapellusch, Jay M

    2012-08-01

    The aim of this study was to evaluate long-term efficacy of an ergonomics program that included patient-handling devices in six long-term care facilities (LTC) and one chronic care hospital (CCH). Patient handling is recognized as a major source of musculoskeletal disorders (MSDs) among nursing personnel, and several studies have demonstrated effectiveness of patient-handling devices in reducing those MSDs. However, most studies have been conducted in a single facility, for a short period, and/or without a comprehensive ergonomics program. Patient-handling devices along with a comprehensive ergonomics program was implemented in six LTC facilities and one CCH. Pre- and postintervention injury data were collected for 38.9 months (range = 29 to 54 months) and 51.2 months (range = 36 to 60 months), respectively. Postintervention patient-handling injuries decreased by 59.8% (rate ratio [RR] = 0.36, 95% confidence interval [CI] [0.28, 0.49], p < .001), lost workdays by 86.7% (RR = 0.16, 95% CI [0.13, 0.18], p < .001), modified-duty days by 78.8% (RR = 0.25, 95% CI [0.22, 0.28], p < .001), and workers' compensation costs by 90.6% (RR = 0.12, 95% CI [0.09, 0.15], p < .001). Perceived stresses to low back and shoulders among nursing staff were fairly low. A vast majority of patients found the devices comfortable and safe. Longer transfer times with the use of devices was not an issue. Implementation of patient-handling devices along with a comprehensive program can be effective in reducing MSDs among nursing personnel. Strategies to expand usage of patient-handling devices in most health care settings should be explored.

  4. Factors affecting food handling Practices among food handlers of Dangila town food and drink establishments, North West Ethiopia

    PubMed Central

    2014-01-01

    Background Food borne diseases are major health problems in developed and developing countries, including Ethiopia. The problem is more noticeable in developing countries due to prevailing poor food handling and sanitation practices, inadequate food safety laws, weak regulatory systems, lack of financial resources to invest in safer equipment, and lack of education for food handlers. Methods The objective of this study was to assess food handling practice and associated factors among food handlers working in food and drinking establishments of Dangila town, North West Ethiopia. A cross-sectional quantitative study was conducted among 406 food handlers working in 105 food and drink establishments from July to August 2013 in Dangila town. Data were collected using face-to-face interviews with a pretested structured questionnaire and physical observation. Results The mean age of the respondents was 22.7 ± 4.2 years, and 62.8% of the food handlers were females. Two hundred thirteen (52.5%) of the food handlers had good food handling practices. Marital status (AOR = 7.52, 95% CI, 1.45-38.97), monthly income (AOR = 0.395, 95% CI, 0.25-0.62), knowledge about food handling (AOR = 1.69, 95% CI, 1.05-2.73), existence of a shower facility (AOR = 1.89, 95% CI, 1.12-3.21) and a separate dressing room (AOR = 1.97, 95% CI, 1.11-3.49) were found to be significantly associated with good food handling practices. Conclusion Over half of the food handlers had good food handling practices. Marital status, monthly income, knowledge status, existence of a shower facility, existence of a separate dressing room and presence of insects and rodents were factors associated with food handling practices. PMID:24908104

  5. Persistence of touch DNA on burglary-related tools.

    PubMed

    Pfeifer, Céline M; Wiegand, Peter

    2017-07-01

    Experts are increasingly concerned by issues regarding the activity level of DNA stains. A case from our burglary-related casework pointed out the need for experiments on the persistence of DNA when more than one person has touched a tool handle. We performed short tandem repeat (STR) analyses for three groups of tools: (1) personal and mock-owned tools; (2) tools which were first "owned" by a first user and then handled in a burglary action by a second user; and (3) tools which were first owned by a first user and then handled in a moderate action. At least three types of tool handles were included in each of the groups. Every second user handled the tool both with and without gloves. In total, 234 samples were analyzed regarding profile completeness of the first and second user as well as properties such as detectable major profile or mixture attributes. When second users simulated a burglary by using a tool bare-handed, we could not detect the first user as a major component on the handle, but could attribute him to the stain in 1 of 40 cases. When second users carried out the simulated burglary using gloves, the first user matched the DNA handle profile in 37% of the cases. Moderate use of mock-borrowed tools demonstrated a material-dependent persistence. In total, we observed that the outcome depends mainly on the nature of contact, the handle material, and user-specific characteristics. This study intends to supplement present knowledge about the persistence of touch DNA, with a special emphasis on burglary-related cases with two consecutive users, and to act as experimental data for an evaluation of the relevance of alleged hypotheses, when such is needed in a court hearing.

  6. The Space Telescope SI C&DH system. [Scientific Instrument Control and Data Handling Subsystem

    NASA Technical Reports Server (NTRS)

    Gadwal, Govind R.; Barasch, Ronald S.

    1990-01-01

    The Hubble Space Telescope Scientific Instrument Control and Data Handling Subsystem (SI C&DH) is designed to interface with the five scientific instruments of the Space Telescope to provide ground and autonomous control and collect health and status information using the Standard Telemetry and Command Components (STACC) multiplex data bus. It also formats high-throughput science data into packets. The packetized data is interleaved, Reed-Solomon encoded for error correction, and pseudo-random encoded. An inner convolutional coding combined with the outer Reed-Solomon coding provides excellent error correction capability. The subsystem is designed with the capacity for orbital replacement in order to meet a mission life of fifteen years. The spacecraft computer and the SI C&DH computer coordinate the activities of the spacecraft and the scientific instruments to achieve the mission objectives.

  7. The National Aeronautics and Space Administration (NASA) Tracking and Data Relay Satellite System (TDRSS) program Economic and programmatic, considerations

    NASA Technical Reports Server (NTRS)

    Aller, R. O.

    1985-01-01

    The Tracking and Data Relay Satellite System (TDRSS) represents the principal element of a new space-based tracking and communication network which will support NASA spaceflight missions in low earth orbit. In its complete configuration, the TDRSS network will include a space segment consisting of three highly specialized communication satellites in geosynchronous orbit and a ground segment consisting of an earth terminal and associated data handling and control facilities. The objective of the TDRSS network is to provide communication and data relay services between earth-orbiting spacecraft and their ground-based mission control and data handling centers. The first TDRSS spacecraft has now been in service for two years. The present paper examines the TDRSS experience from the perspective of the various programmatic and economic considerations which relate to the program.

  8. Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce

    NASA Astrophysics Data System (ADS)

    Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani

    Handling huge amounts of data scalably has long been a matter of concern, and the same is true for semantic web data. Current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema for storing RDF data in the Hadoop Distributed File System, and we present our algorithms for answering a SPARQL query. We make use of Hadoop's MapReduce framework to actually answer the queries. Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries fast enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
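    The map/reduce split described above can be sketched in a few lines of Python (the triples are invented; a real deployment runs the mapper and reducer as separate Hadoop tasks over HDFS blocks): the mapper emits matches for one SPARQL-style triple pattern keyed by subject, and the reducer groups them.

```python
def map_phase(triples, pattern):
    """Mapper: emit (subject, (predicate, object)) for every triple that
    matches the (predicate, object) pattern; None acts as a wildcard."""
    want_p, want_o = pattern
    for s, p, o in triples:
        if (want_p is None or p == want_p) and (want_o is None or o == want_o):
            yield s, (p, o)

def reduce_phase(mapped):
    """Reducer: group the emitted values by their subject key."""
    groups = {}
    for key, value in mapped:
        groups.setdefault(key, []).append(value)
    return groups

# Invented RDF triples in (subject, predicate, object) form.
triples = [
    ("alice", "rdf:type", "foaf:Person"),
    ("alice", "foaf:knows", "bob"),
    ("bob", "rdf:type", "foaf:Person"),
]
people = reduce_phase(map_phase(triples, ("rdf:type", "foaf:Person")))
```

Because the mapper touches each triple independently, the work partitions cleanly across commodity nodes, which is the scalability claim the paper tests.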

  9. APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data

    NASA Astrophysics Data System (ADS)

    Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García T.

    2018-04-01

    APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars the software has to handle in an enormous number of frames, it is optimized to automatically find the best parameter values for carrying out the photometry, such as the mask size for the aperture, the size of the window for extracting a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
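    The two core operations named above, thresholding for detection and summing counts inside an aperture mask, can be sketched as a pure-Python toy on an invented 5x5 frame (APPHi itself operates on FITS/HDF5 frames and tunes these parameters automatically):

```python
def detect_stars(image, threshold):
    """Naive detection: return (x, y) of pixels brighter than threshold."""
    return [(x, y)
            for y, row in enumerate(image)
            for x, counts in enumerate(row)
            if counts > threshold]

def aperture_flux(image, cx, cy, radius):
    """Sum the counts of all pixels inside a circular aperture at (cx, cy)."""
    r_squared = radius * radius
    total = 0.0
    for y, row in enumerate(image):
        for x, counts in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r_squared:
                total += counts
    return total

# A 5x5 toy frame: one bright star at (2, 2) with faint wings.
image = [[0.0] * 5 for _ in range(5)]
image[2][2] = 100.0
for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
    image[2 + dy][2 + dx] = 10.0
```

Differential photometry then divides each star's aperture flux by that of nearby comparison stars to cancel atmospheric variation.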

  10. Application of the WHO keys of safer food to improve food handling practices of food vendors in a poor resource community in Ghana.

    PubMed

    Donkor, Eric S; Kayang, Boniface B; Quaye, Jonathan; Akyeh, Moses L

    2009-11-01

    Data were collected from food vendors in a poor-resource community in Ghana, which showed that the vendors constituted an important source of oro-faecal transmission. Following this, the WHO five keys to safer food were utilized in an evidence-based training programme for the vendors to improve their food handling practices. Impact assessment of the food safety training showed that 67.6% of the vendors had acquired some knowledge from the workshop and were putting it into practice. Lack of food safety equipment was a major hindrance to behavioral change among the vendors as far as food handling practices are concerned.

  11. Parental handling of fear in children with cancer; caring in the best interests of the child.

    PubMed

    Anderzén-Carlsson, Agneta; Kihlgren, Mona; Svantesson, Mia; Sorlie, Venke

    2010-10-01

    The aim of this study was to gain a deeper understanding of how parents of children with cancer handle the fear in their children. Fifteen parents of 11 children participated in focus-group interviews. Data were analyzed by a phenomenological hermeneutical method. The results suggest that the parents' handling was equivalent with caring in the best interests of the child. This included striving for the security and well-being of the child up to a certain point where the parents instead used their authority to maintain the child's physical health rather than trying to prevent or relieve the child's fear. Copyright © 2010 Elsevier Inc. All rights reserved.

  12. A pilot modeling technique for handling-qualities research

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  13. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis

    2014-06-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy-to-use and well-documented format. FIFE-jobsub automates the tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into directed acyclic graphs (Condor DAGs) is well supported and made easier through the use of input flags and parameters.

  14. Processing techniques for software based SAR processors

    NASA Technical Reports Server (NTRS)

    Leung, K.; Wu, C.

    1983-01-01

    Software SAR processing techniques defined to treat Shuttle Imaging Radar-B (SIR-B) data are reviewed. Algorithms are devised for data processing procedure selection, SAR correlation function implementation, utilization of multiple array processors, corner turning, variable-reference-length azimuth processing, and range migration handling. The Interim Digital Processor (IDP), originally implemented for handling Seasat SAR data, has been adapted for the SIR-B and offers a resolution of 100 km using a processing procedure based on the fast Fourier transform fast correlation approach. Peculiarities of the Seasat SAR data processing requirements are reviewed, along with modifications introduced for the SIR-B. An Advanced Digital SAR Processor (ADSP) is under development for use with the SIR-B in the 1986 time frame as an upgrade for the IDP, which will be in service in 1984-85.

  15. Atmospheric Pressure Corrections in Geodesy and Oceanography: a Strategy for Handling Air Tides

    NASA Technical Reports Server (NTRS)

    Ponte, Rui M.; Ray, Richard D.

    2003-01-01

    Global pressure data are often needed for processing or interpreting modern geodetic and oceanographic measurements. The most common source of these data is the analysis or reanalysis products of various meteorological centers. Tidal signals in these products can be problematic for several reasons, including potentially aliased sampling of the semidiurnal solar tide as well as the presence of various modeling or timing errors. Building on the work of Van den Dool and colleagues, we lay out a strategy for handling atmospheric tides in (re)analysis data. The procedure also offers a method to account for ocean loading corrections in satellite altimeter data that are consistent with standard ocean-tide corrections. The proposed strategy has immediate application to the on-going Jason-1 and GRACE satellite missions.

  16. A new approach for handling longitudinal count data with zero-inflation and overdispersion: poisson geometric process model.

    PubMed

    Wan, Wai-Yin; Chan, Jennifer S K

    2009-08-01

    For time series of count data, correlated measurements, clustering, and excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might lead to misleading treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally developed for modelling positive continuous data and has been extended to handle count data. These models are motivated by evaluating the trend development of new tumour counts for bladder cancer patients and by identifying useful covariates which affect the count level. The models are implemented using Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms and are assessed using the deviance information criterion (DIC).
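
    As a toy illustration of the data these models target (simulated values with an assumed mixture weight and Poisson rate, not the authors' GMPGP/ZMPGP implementation), the following sketch generates zero-inflated counts and shows the excess of zeros relative to a plain Poisson fit:

```python
import numpy as np

# Simulate zero-inflated counts: a structural-zero component mixed with a
# Poisson component (mixture weight and rate are assumed for illustration).
rng = np.random.default_rng(7)
n, pi_zero, lam = 10_000, 0.4, 3.0

structural_zero = rng.random(n) < pi_zero
counts = np.where(structural_zero, 0, rng.poisson(lam, size=n))

# A plain Poisson(lam) model badly under-predicts the zero fraction.
print(f"observed zero fraction  : {(counts == 0).mean():.3f}")
print(f"Poisson({lam}) predicts : {np.exp(-lam):.3f}")
```

    A model that ignores the structural-zero component would under-predict zeros by roughly a factor of ten here, which is the kind of misfit the zero-altered formulation addresses.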

  17. Advanced Information Processing System - Fault detection and error handling

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1985-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles, including tactical and transport aircraft, and manned and autonomous spacecraft. A proof-of-concept (POC) system is now in the detailed design and fabrication phase. This paper gives an overview of a preliminary fault detection and error handling philosophy in AIPS.

  18. The effects of baseball bat mass properties on swing mechanics, ground reaction forces, and swing timing.

    PubMed

    Laughlin, Walter A; Fleisig, Glenn S; Aune, Kyle T; Diffendaffer, Alek Z

    2016-01-01

    Swing trajectory and ground reaction forces (GRF) of 30 collegiate baseball batters hitting a pitched ball were compared between a standard bat, a bat with extra weight about its barrel, and a bat with extra weight in its handle. It was hypothesised that when compared to a standard bat, only a handle-weighted bat would produce equivalent bat kinematics. It was also hypothesised that hitters would not produce equivalent GRFs for each weighted bat, but would maintain equivalent timing when compared to a standard bat. Data were collected utilising a 500 Hz motion capture system and 1,000 Hz force plate system. Data between bats were considered equivalent when the 95% confidence interval of the difference was contained entirely within ±5% of the standard bat mean value. The handle-weighted bat had equivalent kinematics, whereas the barrel-weighted bat did not. Both weighted bats had equivalent peak GRF variables. Neither weighted bat maintained equivalence in the timing of bat kinematics and some peak GRFs. The ability to maintain swing kinematics with a handle-weighted bat may have implications for swing training and warm-up. However, altered timings of kinematics and kinetics require further research to understand the implications on returning to a conventionally weighted bat.

  19. A simulation investigation of the effects of engine-and thrust-response characteristics on helicopter handling qualities

    NASA Technical Reports Server (NTRS)

    Corless, L. D.; Blanken, C. L.

    1983-01-01

    A multi-phase program is being conducted to study, in a generic sense and through ground simulation, the effects of engine response, rotor inertia, rpm control, excess power, and vertical damping on specific maneuvers included in nap-of-the-Earth (NOE) operations. A helicopter configuration with an rpm-governed gas-turbine engine is considered. Handling-qualities-criteria data are considered in light of aspects peculiar to rotary-wing and NOE operations. The results of three moving-base piloted simulation studies are summarized, and the frequency characteristics of the helicopter thrust response that set it apart from other VTOL types are explained. Power-system response is affected by both the engine-governor response and the level of rotor inertia. However, results indicate that with unlimited power, variations in engine response can have a significant effect on pilot rating, whereas changes in rotor inertia, in general, do not. The results also show that any pilot interaction required to maintain proper control can significantly degrade handling qualities. Data for variations in vertical damping and collective sensitivity are compared with existing handling-qualities specifications, MIL-F-83300 and AGARD 577, and show a need for higher minimums for both damping and sensitivity for the bob-up task. Results for cases of limited power are also shown.

  20. Estimating time available for sensor fusion exception handling

    NASA Astrophysics Data System (ADS)

    Murphy, Robin R.; Rogers, Erika

    1995-09-01

    In previous work, we have developed a generate, test, and debug methodology for detecting, classifying, and responding to sensing failures in autonomous and semi-autonomous mobile robots. An important issue has arisen from these efforts: how much time is there available to classify the cause of the failure and determine an alternative sensing strategy before the robot mission must be terminated? In this paper, we consider the impact of time for teleoperation applications where a remote robot attempts to autonomously maintain sensing in the presence of failures yet has the option to contact the local for further assistance. Time limits are determined by using evidential reasoning with a novel generalization of Dempster-Shafer theory. Generalized Dempster-Shafer theory is used to estimate the time remaining until the robot behavior must be suspended because of uncertainty; this becomes the time limit on autonomous exception handling at the remote. If the remote cannot complete exception handling in this time or needs assistance, responsibility is passed to the local, while the remote assumes a `safe' state. An intelligent assistant then facilitates human intervention, either directing the remote without human assistance or coordinating data collection and presentation to the operator within time limits imposed by the mission. The impact of time on exception handling activities is demonstrated using video camera sensor data.
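
    The paper relies on a generalization of Dempster-Shafer theory for its time estimates. As background, a minimal sketch of the standard Dempster rule of combination (not the authors' generalization; the failure-cause labels and mass values are hypothetical) looks like this:

```python
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: multiply masses over set intersections and
    renormalise away the mass assigned to conflicting (empty) pairs."""
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in fused.items()}

# Frame of discernment: hypothetical causes of a sensing failure.
LENS, POWER = frozenset({"lens"}), frozenset({"power"})
ANY = LENS | POWER
m_camera = {LENS: 0.6, ANY: 0.4}  # evidence from a camera self-test
m_power = {POWER: 0.3, ANY: 0.7}  # evidence from a power log

print(combine(m_camera, m_power))
```

    Combining the two bodies of evidence shifts most belief to the lens hypothesis while retaining some mass on the full frame, which is the kind of graded uncertainty the paper converts into a time limit.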

  1. Small values in big data: The continuing need for appropriate metadata

    USGS Publications Warehouse

    Stow, Craig A.; Webster, Katherine E.; Wagner, Tyler; Lottig, Noah R.; Soranno, Patricia A.; Cha, YoonKyung

    2018-01-01

    Compiling data from disparate sources to address pressing ecological issues is increasingly common. Many ecological datasets contain left-censored data – observations below an analytical detection limit. Studies from single and typically small datasets show that common approaches for handling censored data — e.g., deletion or substituting fixed values — result in systematic biases. However, no studies have explored the degree to which the documentation and presence of censored data influence outcomes from large, multi-sourced datasets. We describe left-censored data in a lake water quality database assembled from 74 sources and illustrate the challenges of dealing with small values in big data, including detection limits that are absent, range widely, and show trends over time. We show that substitutions of censored data can also bias analyses using ‘big data’ datasets, that censored data can be effectively handled with modern quantitative approaches, but that such approaches rely on accurate metadata that describe treatment of censored data from each source.
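
    A small sketch (hypothetical data, not the authors' lake water quality database) makes the substitution bias concrete: simulated concentrations are censored at a detection limit and then summarized after two common fixed-value substitutions:

```python
import numpy as np

# Simulated "true" concentrations (lognormal) with a detection limit;
# the distribution and limit are invented for illustration.
rng = np.random.default_rng(42)
true_vals = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
dl = 1.0  # analytical detection limit

censored = true_vals < dl  # observations reported only as "< DL"

# Two common fixed-value substitutions for left-censored observations
sub_zero = np.where(censored, 0.0, true_vals)
sub_half = np.where(censored, dl / 2, true_vals)

print(f"true mean            : {true_vals.mean():.3f}")
print(f"zero-substituted mean: {sub_zero.mean():.3f}")  # biased low
print(f"DL/2-substituted mean: {sub_half.mean():.3f}")  # also biased
```

    Both substitutions systematically underestimate the true mean here, and the size of the bias depends on the (often undocumented) detection limit, which is why the metadata the authors call for matters.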

  2. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization: Human Factors in Streaming Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey R.

    Real-world systems change continuously, and across domains like traffic monitoring and cyber security such changes occur within short time scales. This leads to a streaming data problem and produces unique challenges for the human in the loop, as analysts have to ingest and make sense of dynamic patterns in real time. In this paper, our goal is to study how the state-of-the-art in streaming data visualization handles these challenges and to reflect on the gaps and opportunities. To this end, we have three contributions: i) problem characterization for identifying domain-specific goals and challenges for handling streaming data, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space, and iii) reflections on the perceptually motivated design challenges and potential research directions for addressing them.

  3. Relational Databases and Biomedical Big Data.

    PubMed

    de Silva, N H Nisansa D

    2017-01-01

    In various biomedical applications that collect, handle, and manipulate data, the amounts of data tend to build up and venture into the range identified as big data. In such occurrences, a design decision has to be taken as to what type of database should be used to handle the data. More often than not, the default and classical solution in the biomedical domain, according to past research, is relational databases. While this was the norm for a long while, there is an evident trend to move away from relational databases in favor of other types and paradigms of databases. However, it remains of paramount importance to understand the interrelation that exists between biomedical big data and relational databases. This chapter reviews the pros and cons of using relational databases to store biomedical big data, as discussed and used in previous research.

  4. Parallel processing spacecraft communication system

    NASA Technical Reports Server (NTRS)

    Bolotin, Gary S. (Inventor); Donaldson, James A. (Inventor); Luong, Huy H. (Inventor); Wood, Steven H. (Inventor)

    1998-01-01

    An uplink controlling assembly speeds data processing using a special parallel codeblock technique. A correct start sequence initiates processing of a frame. Two possible start sequences can be used, and the one which is used determines whether data polarity is inverted or non-inverted. Processing continues until uncorrectable errors are found. The frame ends by intentionally sending a block with an uncorrectable error. Each of the codeblocks in the frame has a channel ID. Each channel ID can be separately processed in parallel. This obviates the problem of waiting for error correction processing. If the channel number is zero, however, it indicates that the frame of data represents a critical command only. That data is handled in a special way, independent of the software. Otherwise, the processed data are further handled using special double-buffering techniques to avoid problems from overrun. When overrun does occur, the system takes action to lose only the oldest data.

  5. Female parity, maternal kinship, infant age and sex influence natal attraction and infant handling in a wild colobine (Colobus vellerosus).

    PubMed

    Bădescu, Iulia; Sicotte, Pascale; Ting, Nelson; Wikberg, Eva C

    2015-04-01

    Primate females often inspect, touch and groom others' infants (natal attraction) and they may hold and carry these infants in a manner resembling maternal care (infant handling). While natal attraction and infant handling occur in most wild colobines, little is known about the factors influencing the expression of these behaviors. We examined the effects of female parity, kinship, and dominance rank, as well as infant age and sex in wild Colobus vellerosus at Boabeng-Fiema Monkey Sanctuary, Ghana. We collected data via focal sampling of females in 2008 and 2009 (N = 61) and of infants in 2010 (N = 12). Accounting for the individuals who interacted with our focal subjects, this study includes 74 females and 66 infants in 8 groups. We recorded female agonistic interactions ad libitum to determine dominance ranks. We used partial pedigree information and genotypes at 17 short tandem repeat loci to determine kinship. We knew female parity, infant age and sex from demographic records. Nulliparous females showed more natal attraction and infant handling than parous females, which may suggest that interactions with infants are more adaptive for nulliparous females because they learn mothering skills through these behaviors. Compared to non-kin, maternal kin were more likely to handle infants. Maternal kin may be permitted greater access to infants because mothers are most familiar with them. Handlers may incur inclusive fitness benefits from infant handling. Dominance rank did not affect female interactions with infants. The youngest infants received the most natal attraction and infant handling, and male infants were handled more than female infants. The potential benefits of learning to mother and inclusive fitness, in combination with the relatively low costs of natal attraction and infant handling, may explain the high rates of these behaviors in many colobines. © 2014 Wiley Periodicals, Inc.

  6. Processes and outcomes of the veterans health administration safe patient handling program: study protocol.

    PubMed

    Rugs, Deborah; Toyinbo, Peter; Patel, Nitin; Powell-Cope, Gail; Hahm, Bridget; Elnitsky, Christine; Besterman-Dahan, Karen; Campbell, Robert; Sutton, Bryce

    2013-11-18

    Health care workers, such as nurses, nursing aides, orderlies, and attendants, who manually move patients, are consistently listed in the top professions for musculoskeletal injuries (MSIs) by the Bureau of Labor Statistics. These MSIs are typically caused by high-risk patient caregiving activities. In 2008, a safe patient handling (SPH) program was implemented in all 153 Veterans Administration Medical Centers (VAMCs) throughout the United States to reduce patient handling injuries. The goal of the present study is to evaluate the effects associated with the national implementation of a comprehensive SPH program. The primary objectives of the research were to determine the effectiveness of the SPH program in improving direct care nursing outcomes and to provide a context for understanding variations in program results across sites over time. Secondary objectives of the present research were to evaluate the effectiveness of the program in reducing direct and indirect costs associated with patient handling, to explore the potential mediating and moderating mechanisms, and to identify unintended consequences of implementing the program. This 3-year longitudinal study used mixed methods of data collection at 6- to 9-month intervals. The analyses will include data from surveys, administrative databases, individual and focus group interviews, and nonparticipant observations. For this study, a 3-tiered measurement plan was used. For Tier 1, the unit of analysis was the facility, the data source was the facility coordinator or administrative data, and all 153 VAMCs participated. For Tier 2, frontline caregivers and program peer leaders at 17 facilities each completed different surveys. For Tier 3, six facilities completed qualitative site visits, which included individual interviews, focus groups, and nonparticipant observations. Multiple regression models were proposed to test the effects of SPH components on nursing outcomes related to patient handling. 
Content analysis and constant comparative analysis were proposed for qualitative data analysis to understand the context of implementation and to triangulate quantitative data. All three tiers of data for this study have been collected. We are now in the analyses and writing phase of the project, with the possibility for extraction of additional administrative data. The focus of this paper is to describe the SPH program, its evaluation study design, and its data collection procedures. This study evaluates the effects associated with the national implementation of a comprehensive SPH program that was implemented in all 153 VAMCs throughout the United States to reduce patient handling injuries. To our knowledge, this is the largest evaluation of an SPH program in the United States. A major strength of this observational study design is that all VAMCs implemented the program and were included in Tier 1 of the study; therefore, population sampling bias is not a concern. Although the design lacks a comparison group for testing program effects, this longitudinal field study design allows for capturing program dose-response effects within a naturalistic context. Implementation of the VA-wide SPH program afforded the opportunity for rigorous evaluation in a naturalistic context. Findings will guide VA operations for policy and decision making about resources, and will be useful for health care, in general, outside of the VA, in implementation and impact of an SPH program.

  7. Early life influences on emotional reactivity: evidence that social enrichment has greater effects than handling on anxiety-like behaviors, neuroendocrine responses to stress and central BDNF levels.

    PubMed

    Cirulli, Francesca; Berry, Alessandra; Bonsignore, Luca Tommaso; Capone, Francesca; D'Andrea, Ivana; Aloe, Luigi; Branchi, Igor; Alleva, Enrico

    2010-05-01

    During the early post-natal phases the brain is experience-seeking and endowed with considerable plasticity, which allows a fine tuning between the external environment and the developing organism. Since the early work of Seymour Levine, an impressive amount of research has clearly shown that stressful experiences exert powerful effects on brain and body development. These effects can last throughout the entire life span, influencing brain function and increasing the risk for depression and anxiety disorders. The mechanisms underlying the effects of early stress on the developing organism have been widely studied in rodents through experimental manipulations of the post-natal environment, such as handling, which have been shown to exert important effects on the emotional phenotype and the response to stress. In the present paper we review the relevant literature and present some original data indicating that, compared to handling, which imposes an external manipulation on the mother-infant relationship, social enrichment in the form of communal rearing in mice has very profound effects on the animal's emotionality and the response to stress. These effects are also accompanied by important changes in central levels of brain-derived neurotrophic factor. The present data indicate that communal rearing has more pervasive effects than handling, strengthening previous data suggesting that it is a good animal model of reduced susceptibility to depression-like behavior. Overall, the availability of ever more sophisticated animal models represents a fundamental tool to translate basic research data into appropriate interventions for humans raised under traumatic or impoverished situations. (c) 2010 Elsevier Ltd. All rights reserved.

  8. Furan in coffee: pilot studies on formation during roasting and losses during production steps and consumer handling.

    PubMed

    Guenther, H; Hoenicke, K; Biesterveld, S; Gerhard-Rieben, E; Lantz, I

    2010-03-01

    The occurrence of furan in some food products has already been known for a few decades, and it has been reconfirmed in more recent investigations that furan is present in a variety of foodstuffs. This list of products includes roasted coffee, which has been shown to generate furan as a result of the heat treatment at roasting which is applied to achieve the desired aroma and flavour profile of a roasted coffee. The objective of this study is to provide data to allow a better understanding of the available data of furan in coffee, the kinetics of furan generated during roasting, and to estimate the reduction of furan levels afterwards due to subsequent processing steps and consumer handling. Finally, the study is meant as a contribution to establish exposure data on the basis of scientific data at the stage of coffee consumption. This paper shows that the formation of furan during roasting is dependent on roasting conditions and is, therefore, directly linked to achieving targeted flavour profiles. Furthermore, it is demonstrated that modifications in process conditions potentially to reduce furan levels may have the opposite effect on other undesired reaction products of the roasting chemistry such as, for example, acrylamide. Due to the high volatility of furan, any subsequent processing step or consumer handling has an impact on the level of furan. As a guidance from this study and in consideration of the identified losses of each process and handling step on the basis of the trial conditions, it is estimated that only approximately 10% of the initially generated furan during roasting gets into the cup of coffee for consumption.

  9. Copenhagen Airport Cohort: air pollution, manual baggage handling and health

    PubMed Central

    Møller, Karina Lauenborg; Brauer, Charlotte; Mikkelsen, Sigurd; Loft, Steffen; Simonsen, Erik B; Koblauch, Henrik; Bern, Stine Hvid; Alkjær, Tine; Hertel, Ole; Becker, Thomas; Larsen, Karin Helweg; Bonde, Jens Peter; Thygesen, Lau Caspar

    2017-01-01

    Purpose Copenhagen Airport Cohort 1990–2012 presents a unique data source for studies of health effects of occupational exposure to air pollution (ultrafine particles) and manual baggage handling among airport employees. We describe the extent of information in the cohort and in the follow-up based on data linkage to the comprehensive Danish nationwide health registers. In the cohort, all information is linked to the personal identification number that is also used in Statistics Denmark's demographic and socioeconomic databases and in the nationwide health registers. Participants The cohort covers 69 175 men in unskilled positions. The exposed cohort includes men in unskilled jobs employed at Copenhagen Airport in the period 1990–2012 either as baggage handlers or in other outdoor work. The reference cohort includes men in unskilled jobs working in the greater Copenhagen area. Findings to date The cohort includes environmental Global Positioning System (GPS) measurements in Copenhagen Airport, information on job function/task for each calendar year of employment between 1990 and 2012, exposure to air pollution at residence, average weight of baggage lifted per day and lifestyle. By linkage to registers, we retrieved socioeconomic and demographic data and data on healthcare contacts, drug prescriptions, incident cancer and mortality. Future plans The size of the cohort and the completeness of the register-based follow-up allow a more accurate assessment of the possible health risks of occupational exposure to ultrafine particles and manual baggage handling at airports than in previous studies. We plan to follow the cohort for the incidence of ischaemic heart diseases, cerebrovascular disease, lung and bladder cancer, asthma and chronic obstructive pulmonary disease, and further for associations between heavy manual baggage handling and musculoskeletal disorders. Trial registration number 2012–41–0199. PMID:28478397

  10. A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data

    NASA Astrophysics Data System (ADS)

    Li, Z.; Hodgson, M.; Li, W.

    2016-12-01

    Light detection and ranging (LiDAR) technologies have proven efficient for quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for scientific discovery in the Earth and ecological sciences and for natural-disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to both data intensity and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, the framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools, and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe the proposed framework provides a valuable reference for developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
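
    The tile-based indexing idea can be sketched in a few lines (a simplified illustration under an assumed tile size, not the framework's actual code): each point is mapped to a tile key so that points can be grouped and dispatched tile-by-tile:

```python
from collections import defaultdict

TILE_SIZE = 500.0  # metres; an assumed tile width

def tile_key(x: float, y: float, tile_size: float = TILE_SIZE) -> tuple:
    """Map a point's planar coordinates to its (column, row) tile index."""
    return (int(x // tile_size), int(y // tile_size))

# Toy point cloud: (x, y) coordinates only
points = [(120.5, 80.2), (950.0, 40.0), (120.9, 81.0)]
tiles = defaultdict(list)
for p in points:
    tiles[tile_key(*p)].append(p)

# Points 0 and 2 share tile (0, 0); point 1 falls in tile (1, 0),
# so each tile's bucket can be processed independently in parallel.
print(dict(tiles))
```

    In a distributed setting the tile key would serve as the grouping key for map-reduce-style tasks, so workers can process tiles independently.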

  11. MR-MOOSE: an advanced SED-fitting tool for heterogeneous multi-wavelength data sets

    NASA Astrophysics Data System (ADS)

    Drouart, G.; Falkendal, T.

    2018-07-01

    We present the public release of MR-MOOSE, a fitting procedure that is able to perform multi-wavelength and multi-object spectral energy distribution (SED) fitting in a Bayesian framework. This procedure is able to handle a large variety of cases, from an isolated source to blended multi-component sources from a heterogeneous data set (i.e. a range of observation sensitivities and spectral/spatial resolutions). Furthermore, MR-MOOSE handles upper limits during the fitting process in a continuous way, allowing models to become gradually less probable as upper limits are approached. The aim is to propose a simple-to-use, yet highly versatile fitting tool for handling increasing source complexity when combining multi-wavelength data sets with fully customisable filter/model databases. The complete control of the user is one advantage, which avoids the traditional problems related to the `black box' effect, where parameter or model tunings are impossible and can lead to overfitting and/or over-interpretation of the results. Also, while a basic knowledge of PYTHON and statistics is required, the code aims to be sufficiently user-friendly for non-experts. We demonstrate the procedure on three cases: two artificially generated data sets and a previous result from the literature. In particular, the most complex case (inspired by a real source, combining Herschel, ALMA, and VLA data) in the context of extragalactic SED fitting makes MR-MOOSE a particularly attractive SED fitting tool when dealing with partially blended sources, without the need for data deconvolution.
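
    The continuous treatment of upper limits can be illustrated with a toy likelihood term (an assumed Gaussian-CDF form, not MR-MOOSE's actual implementation): the model's weight stays near 1 well below the limit and decays smoothly as the predicted flux approaches and exceeds it:

```python
import math

def upper_limit_logprob(model_flux: float, limit: float, sigma: float) -> float:
    """Smooth log-weight for an upper limit (assumed Gaussian-CDF form):
    near 0 when the model sits well below the limit, increasingly
    negative as the predicted flux exceeds it."""
    z = (limit - model_flux) / sigma
    return math.log(0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

# Predicted fluxes tested against a limit of 1.0 (arbitrary units)
for flux in (0.1, 0.9, 1.0, 1.5):
    print(f"flux={flux:.1f}  logprob={upper_limit_logprob(flux, 1.0, 0.1):.2f}")
```

    Compared with a hard cutoff, this smooth penalty keeps the posterior well behaved for samplers, since models are not abruptly rejected at the limit.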

  12. Common Data Model to Handle PDS3 and PDS4 Data

    NASA Astrophysics Data System (ADS)

    Saiz, J.; Macfarlane, A.; Docasal, R.; Rios, C.; Barbarisi, I.; Vallejo, F.; Besse, S.; Vallat, C.; Arviset, C.

    2017-06-01

    European Space Agency's (ESA) planetary missions following either the PDS3 or the PDS4 standards preserve their data in the Planetary Science Archive (PSA). A common data model has been developed to provide transparency to all PSA services.

  13. 75 FR 74146 - Release of Waybill Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-30

    ... DEPARTMENT OF TRANSPORTATION Surface Transportation Board Release of Waybill Data The Surface... Montana (WB10-069(1)), for permission to use certain data from the Board's 2006 through 2009 (when... handling this waybill sample request. The waybill sample contains confidential railroad and shipper data...

  14. Minicomputer front end. [Modcomp II/CP as buffer between CDC 6600 and PDP-9 at graphics stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, J.A.

    1976-01-01

    Sandia Labs developed an Interactive Graphics System (SIGS) that was established on a CDC 6600 using a communication scheme based on the Control Data Corporation product IGS. As implemented at Sandia, the graphics station consists primarily of a PDP-9 with a Vector General display. A system is being developed which uses a minicomputer (Modcomp II/CP) as the buffer machine for the graphics stations. The original SIGS required a dedicated peripheral processor (PP) on the CDC 6600 to handle the communication with the stations; however, with the Modcomp handling the actual communication protocol, the PP is only assigned as needed to handle data transfer within the CDC 6600 portion of SIGS. The new system will thus support additional graphics stations with less impact on the CDC 6600. This paper discusses the design philosophy of the system, and the hardware and software used to implement it. 1 figure.

  15. Missing data in FFQs: making assumptions about item non-response.

    PubMed

    Lamb, Karen E; Olstad, Dana Lee; Nguyen, Cattram; Milte, Catherine; McNaughton, Sarah A

    2017-04-01

    FFQs are a popular method of capturing dietary information in epidemiological studies and may be used to derive dietary exposures such as nutrient intake or overall dietary patterns and diet quality. As FFQs can involve large numbers of questions, participants may fail to respond to all questions, leaving researchers to decide how to deal with missing data when deriving intake measures. The aim of the present commentary is to discuss the current practice for dealing with item non-response in FFQs and to propose a research agenda for reporting and handling missing data in FFQs. Single imputation techniques, such as zero imputation (assuming no consumption of the item) or mean imputation, are commonly used to deal with item non-response in FFQs. However, single imputation methods make strong assumptions about the missing data mechanism and do not reflect the uncertainty created by the missing data. This can lead to incorrect inference about associations between diet and health outcomes. Although the use of multiple imputation methods in epidemiology has increased, these have seldom been used in the field of nutritional epidemiology to address missing data in FFQs. We discuss methods for dealing with item non-response in FFQs, highlighting the assumptions made under each approach. Researchers analysing FFQs should ensure that missing data are handled appropriately and clearly report how missing data were treated in analyses. Simulation studies are required to enable systematic evaluation of the utility of various methods for handling item non-response in FFQs under different assumptions about the missing data mechanism.
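    As a minimal illustration of why the single imputation techniques criticised in this commentary understate uncertainty, the sketch below uses hypothetical FFQ data (pure Python, invented parameters): mean imputation preserves the sample mean but shrinks the spread, because every imputed value sits exactly at the centre.

    ```python
    import random
    import statistics

    random.seed(0)

    # Hypothetical FFQ item: weekly servings reported by 200 participants,
    # with roughly 20% item non-response (None), generated completely at
    # random here for simplicity.
    complete = [random.gauss(7, 2) for _ in range(200)]
    observed = [x if random.random() > 0.2 else None for x in complete]

    responded = [x for x in observed if x is not None]
    mean_obs = statistics.mean(responded)

    # Two single imputation strategies discussed in the commentary:
    zero_imputed = [0.0 if x is None else x for x in observed]       # assume no consumption
    mean_imputed = [mean_obs if x is None else x for x in observed]  # assume average consumption

    # Mean imputation leaves the mean unchanged but understates the spread.
    print(round(statistics.stdev(responded), 2))
    print(round(statistics.stdev(mean_imputed), 2))  # smaller: uncertainty not reflected
    ```

    Multiple imputation addresses exactly this: by drawing several plausible values per gap and pooling results, it propagates the uncertainty that single imputation hides.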

  16. Methods for Handling Missing Secondary Respondent Data

    ERIC Educational Resources Information Center

    Young, Rebekah; Johnson, David

    2013-01-01

    Secondary respondent data are underutilized because researchers avoid using these data in the presence of substantial missing data. The authors reviewed, evaluated, and tested solutions to this problem. Five strategies of dealing with missing partner data were reviewed: (a) complete case analysis, (b) inverse probability weighting, (c) correction…

  17. Missing data imputation: focusing on single imputation.

    PubMed

    Zhang, Zhongheng

    2016-01-01

Complete case analysis is widely used for handling missing data, and it is the default method in many statistical packages. However, this method may introduce bias, and some useful information will be omitted from the analysis. Therefore, many imputation methods have been developed to fill the gap. The present article focuses on single imputation. Imputation with the mean, median or mode is simple but, like complete case analysis, can bias estimates of the mean and standard deviation. Furthermore, these methods ignore relationships with other variables. Regression imputation can preserve the relationship between missing values and other variables. Many more sophisticated methods exist to handle missing values in longitudinal data. This article focuses primarily on how to implement R code to perform single imputation, while avoiding complex mathematical calculations.
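    The regression imputation idea mentioned in the abstract can be sketched as follows (in Python rather than the article's R, with made-up data): a missing value is replaced by its prediction from a least-squares fit on the complete cases, so the imputed value respects the covariate relationship instead of collapsing to the mean.

    ```python
    # Hypothetical data: y grows roughly linearly with x; one y is missing.
    xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    ys = [2.1, 3.9, 6.2, 8.1, None, 12.0]  # y for x = 5.0 is missing

    # Fit a least-squares line on the complete cases only.
    pairs = [(x, y) for x, y in zip(xs, ys) if y is not None]
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    slope = sum((x - mx) * (y - my) for x, y in pairs) / sum((x - mx) ** 2 for x, _ in pairs)
    intercept = my - slope * mx

    # Replace the gap with the fitted prediction at its x value.
    ys_imputed = [intercept + slope * x if y is None else y for x, y in zip(xs, ys)]
    print(round(ys_imputed[4], 2))  # close to 10, following the linear trend
    ```

    Note that plain regression imputation still understates variability (all imputed points lie exactly on the line); stochastic regression imputation adds a random residual to mitigate this.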

  18. Missing data imputation: focusing on single imputation

    PubMed Central

    2016-01-01

Complete case analysis is widely used for handling missing data, and it is the default method in many statistical packages. However, this method may introduce bias, and some useful information will be omitted from the analysis. Therefore, many imputation methods have been developed to fill the gap. The present article focuses on single imputation. Imputation with the mean, median or mode is simple but, like complete case analysis, can bias estimates of the mean and standard deviation. Furthermore, these methods ignore relationships with other variables. Regression imputation can preserve the relationship between missing values and other variables. Many more sophisticated methods exist to handle missing values in longitudinal data. This article focuses primarily on how to implement R code to perform single imputation, while avoiding complex mathematical calculations. PMID:26855945

  19. Command and data handling for Atmosphere Explorer satellite

    NASA Technical Reports Server (NTRS)

    Fuldner, W. V.

    1974-01-01

    The command and data-handling subsystem of the Atmosphere Explorer satellite provides the necessary controls for the instrumentation and telemetry, and also controls the satellite attitude and trajectory. The subsystem executes all command information within the spacecraft, either in real time (as received over the S-band command transmission link) or remote from the command site (as required by the orbit operations schedule). Power consumption in the spacecraft is optimized by suitable application and removal of power to various instruments; additional functions include control of magnetic torquers and of the orbit-adjust propulsion subsystem. Telemetry data from instruments and the spacecraft equipment are formatted into a single serial bit stream. Attention is given to command types, command formats, decoder operation, and command processing functions.

  20. Design of a feedback-feedforward steering controller for accurate path tracking and stability at the limits of handling

    NASA Astrophysics Data System (ADS)

    Kapania, Nitin R.; Gerdes, J. Christian

    2015-12-01

    This paper presents a feedback-feedforward steering controller that simultaneously maintains vehicle stability at the limits of handling while minimising lateral path tracking deviation. The design begins by considering the performance of a baseline controller with a lookahead feedback scheme and a feedforward algorithm based on a nonlinear vehicle handling diagram. While this initial design exhibits desirable stability properties at the limits of handling, the steady-state path deviation increases significantly at highway speeds. Results from both linear and nonlinear analyses indicate that lateral path tracking deviations are minimised when vehicle sideslip is held tangent to the desired path at all times. Analytical results show that directly incorporating this sideslip tangency condition into the steering feedback dramatically improves lateral path tracking, but at the expense of poor closed-loop stability margins. However, incorporating the desired sideslip behaviour into the feedforward loop creates a robust steering controller capable of accurate path tracking and oversteer correction at the physical limits of tyre friction. Experimental data collected from an Audi TTS test vehicle driving at the handling limits on a full length race circuit demonstrates the improved performance of the final controller design.
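    The lookahead feedback scheme described for the baseline controller can be illustrated with a toy steering law (the gain, lookahead distance, and exact form below are illustrative assumptions, not the paper's implementation, which also includes the feedforward terms):

    ```python
    # Hypothetical lookahead feedback steering law: steer in proportion to
    # the lateral error projected a fixed distance ahead of the vehicle.
    k_p = 0.05   # steering gain, rad per metre of projected error (assumed)
    x_la = 15.0  # lookahead distance, m (assumed)

    def steer_feedback(e_lat, d_psi):
        """e_lat: lateral path error (m); d_psi: heading error (rad)."""
        e_lookahead = e_lat + x_la * d_psi  # error projected ahead of the car
        return -k_p * e_lookahead           # steering correction (rad)

    # Half a metre of lateral error plus a small heading error
    # produces a correction back toward the path.
    print(round(steer_feedback(0.5, 0.02), 3))
    ```

    Projecting the error ahead of the vehicle is what gives this scheme its damping: heading error is corrected before it accumulates into path deviation.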

  1. High-Alpha Handling Qualities Flight Research on the NASA F/A-18 High Alpha Research Vehicle

    NASA Technical Reports Server (NTRS)

    Wichman, Keith D.; Pahle, Joseph W.; Bahm, Catherine; Davidson, John B.; Bacon, Barton J.; Murphy, Patrick C.; Ostroff, Aaron J.; Hoffler, Keith D.

    1996-01-01

    A flight research study of high-angle-of-attack handling qualities has been conducted at the NASA Dryden Flight Research Center using the F/A-18 High Alpha Research Vehicle (HARV). The objectives were to create a high-angle-of-attack handling qualities flight database, develop appropriate research evaluation maneuvers, and evaluate high-angle-of-attack handling qualities guidelines and criteria. Using linear and nonlinear simulations and flight research data, the predictions from each criterion were compared with the pilot ratings and comments. Proposed high-angle-of-attack nonlinear design guidelines and proposed handling qualities criteria and guidelines developed using piloted simulation were considered. Recently formulated time-domain Neal-Smith guidelines were also considered for application to high-angle-of-attack maneuvering. Conventional envelope criteria were evaluated for possible extension to the high-angle-of-attack regime. Additionally, the maneuvers were studied as potential evaluation techniques, including a limited validation of the proposed standard evaluation maneuver set. This paper gives an overview of these research objectives through examples and summarizes result highlights. The maneuver development is described briefly, the criteria evaluation is emphasized with example results given, and a brief discussion of the database form and content is presented.

  2. Safe meat-handling knowledge, attitudes and practices of private and government meat processing plants' workers: implications for future policy.

    PubMed

    Adesokan, H K; Raji, A O Q

    2014-03-01

Food-borne disease outbreaks remain a major global health challenge, and cross-contamination from raw meat due to poor handling is a major cause in developing countries. Adequate knowledge among meat handlers is important in limiting these outbreaks. This study evaluated and compared the safe meat-handling knowledge, attitudes and practices (KAP) of private (PMPP) and government meat processing plants' (GMPP) workers in south-western Nigeria. This cross-sectional study comprised 190 meat handlers (PMPP = 55; GMPP = 135). Data concerning their safe meat-handling knowledge, attitudes and practices, as well as their socio-demographic characteristics, such as age, gender and work experience, were collected. A significant association was observed between the type of meat processing plant and workers' knowledge (p = 0.000), attitudes (p = 0.000) and practices (p = 0.000) of safe meat-handling. Meat handlers in the GMPP were, respectively, about 17 times (OR = 0.060, 95% CI: 0.018-0.203), 57 times (OR = 0.019, 95% CI: 0.007-0.054) and 111 times (OR = 0.009, 95% CI: 0.001-0.067) less likely to attain good knowledge, attitude and practice levels of safe meat-handling than those from the PMPP. Further, KAP levels were significantly associated with age group, education and work experience (p < 0.05). Study findings suggest the need for future policy in the food industry in developing countries to accommodate increased involvement of the private sector for improved food safety and quality delivery. Public health education on safe food handling and hygiene should be on the front burner among food handlers in general.
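    For readers translating the reported odds ratios into the "times less likely" phrasing, a small sketch shows the arithmetic; the 2x2 counts below are invented for illustration, not the study's data.

    ```python
    # Invented 2x2 table: knowledge level by plant type (NOT the study's counts).
    good_pmpp, poor_pmpp = 45, 10    # private-plant workers
    good_gmpp, poor_gmpp = 29, 106   # government-plant workers

    odds_pmpp = good_pmpp / poor_pmpp   # odds of good knowledge, private
    odds_gmpp = good_gmpp / poor_gmpp   # odds of good knowledge, government
    odds_ratio = odds_gmpp / odds_pmpp  # OR < 1: lower odds in GMPP

    # An OR like the abstract's 0.060 is read as "about 17 times less likely"
    # by taking the reciprocal: 1 / 0.060 is roughly 16.7.
    print(round(odds_ratio, 3), round(1 / odds_ratio, 1))
    ```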

  3. Musculoskeletal injuries resulting from patient handling tasks among hospital workers.

    PubMed

    Pompeii, Lisa A; Lipscomb, Hester J; Schoenfisch, Ashley L; Dement, John M

    2009-07-01

    The purpose of this study was to evaluate musculoskeletal injuries and disorders resulting from patient handling prior to the implementation of a "minimal manual lift" policy at a large tertiary care medical center. We sought to define the circumstances surrounding patient handling injuries and to identify potential preventive measures. Human resources data were used to define the cohort and their time at work. Workers' compensation records (1997-2003) were utilized to identify work-related musculoskeletal claims, while the workers' description of injury was used to identify those that resulted from patient handling. Adjusted rate ratios were generated using Poisson regression. One-third (n = 876) of all musculoskeletal injuries resulted from patient handling activities. Most (83%) of the injury burden was incurred by inpatient nurses, nurses' aides and radiology technicians, while injury rates were highest for nurses' aides (8.8/100 full-time equivalent, FTEs) and smaller workgroups including emergency medical technicians (10.3/100 FTEs), patient transporters (4.3/100 FTEs), operating room technicians (3.1/100 FTEs), and morgue technicians (2.2/100 FTEs). Forty percent of injuries due to lifting/transferring patients may have been prevented through the use of mechanical lift equipment, while 32% of injuries resulting from repositioning/turning patients, pulling patients up in bed, or catching falling patients may not have been prevented by the use of lift equipment. The use of mechanical lift equipment could significantly reduce the risk of some patient handling injuries but additional interventions need to be considered that address other patient handling tasks. Smaller high-risk workgroups should not be neglected in prevention efforts.

  4. Developing and Testing SpaceWire Devices and Networks

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Mills, Stuart

    2014-08-01

SpaceWire is a data-handling network for use on-board spacecraft, which connects together instruments, mass-memory, processors, downlink telemetry, and other on-board sub-systems [1]. SpaceWire is simple to implement and has some specific characteristics that help it support data-handling applications in space: high speed, low power, simplicity, relatively low implementation cost, and architectural flexibility, making it ideal for many space missions. SpaceWire provides high-speed (2 Mbits/s to 200 Mbits/s), bi-directional, full-duplex data-links, which connect together SpaceWire enabled equipment. Data-handling networks can be built to suit particular applications using point-to-point data-links and routing switches. Since the SpaceWire standard was published in January 2003, it has been adopted by ESA, NASA, JAXA and RosCosmos for many missions and is being widely used on scientific, Earth observation, commercial and other spacecraft. High-profile missions using SpaceWire include: Gaia, ExoMars rover, BepiColombo, James Webb Space Telescope, GOES-R, Lunar Reconnaissance Orbiter and Astro-H. The development and testing of the SpaceWire links and networks used on these and many other spacecraft currently under development requires a comprehensive array of test equipment. In this paper the requirements for test equipment fulfilling key test functions are outlined and then equipment that meets these requirements is described. Finally, the all-important software that operates with the test equipment is introduced.

  5. Integrated Payload Data Handling Systems Using Software Partitioning

    NASA Astrophysics Data System (ADS)

    Taylor, Alun; Hann, Mark; Wishart, Alex

    2015-09-01

An integrated Payload Data Handling System (I-PDHS) is one in which multiple instruments share a central payload processor for their on-board data processing tasks. This offers a number of advantages over the conventional decentralised architecture. Savings in payload mass and power can be realised because the total processing resource is matched to the requirements, whereas in the decentralised architecture the processing resource is in effect the sum of all the applications. Overall development cost can be reduced using a common processor. At individual instrument level the potential benefits include a standardised application development environment, and the opportunity to run the instrument data handling application on a fully redundant and more powerful processing platform [1]. This paper describes a joint program by SCISYS UK Limited, Airbus Defence and Space, Imperial College London and RAL Space to implement a realistic demonstration of an I-PDHS using engineering models of flight instruments (a magnetometer and camera) and a laboratory demonstrator of a central payload processor which is functionally representative of a flight design. The objective is to raise the Technology Readiness Level of the centralised data processing technique by addressing the key areas of task partitioning to prevent fault propagation and the use of a common development process for the instrument applications. The project is supported by a UK Space Agency grant awarded under the National Space Technology Program SpaceCITI scheme.

  6. PCACE-Personal-Computer-Aided Cabling Engineering

    NASA Technical Reports Server (NTRS)

    Billitti, Joseph W.

    1987-01-01

PCACE computer program developed to provide inexpensive, interactive system for learning and using engineering approach to interconnection systems. Basically a database system that stores information as files of individual connectors and handles wiring information in circuit groups stored as records. Directly emulates typical manual engineering methods of handling data, thus making interface between user and program very natural. Apple version written in P-Code Pascal; IBM PC version of PCACE written in TURBO Pascal 3.0.

  7. The Importance of Keyboarding and Data Handling Skills In Automated Data Processing

    ERIC Educational Resources Information Center

    Wood, Merle W.

    1977-01-01

    Business educators are aware of the keyboarding skills needed by data entry employees but are not aware of the skills needed by those who work with printed output; these skills should be covered in data processing programs and courses. (TA)

  8. NASA Webworldwind: Multidimensional Virtual Globe for Geo Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Hogan, P.; Prestifilippo, G.; Zamboni, G.

    2016-06-01

In this paper, we present a web application created using the NASA WebWorldWind framework. The application is capable of visualizing n-dimensional data using a Voxel model. In this case study, we handled social media data and Call Detail Records (CDR) of telecommunication networks, retrieved from the "BigData Challenge 2015" of Telecom Italia. We focused on the visualization process to find a suitable way to show this geo-data in a 3D environment, incorporating more than three dimensions. This engenders an interactive way to browse the data in their real context and understand them quickly. Users will be able to handle several varieties of data, import their dataset using a particular data structure, and then mash them up in the WebWorldWind virtual globe. A broad range of the public can use this tool for diverse purposes, without much experience in the field, thanks to the intuitive user interface of this web app.

  9. Real-time analysis of healthcare using big data analytics

    NASA Astrophysics Data System (ADS)

    Basco, J. Antony; Senthilkumar, N. C.

    2017-11-01

Big Data Analytics (BDA) provides a tremendous advantage where there is a need for revolutionary performance in handling large amounts of data covering the four characteristics of volume, velocity, variety and veracity. BDA has the ability to handle such dynamic data, providing operational effectiveness and exceptionally beneficial output in several day-to-day applications for various organizations. Healthcare is one of the sectors which generate data constantly, covering all four characteristics with outstanding growth. There are several challenges in processing patient records, which involve a variety of structured and unstructured formats. Introducing BDA into healthcare (HBDA) means dealing with sensitive patient-driven information, mostly in unstructured formats comprising prescriptions, reports, data from imaging systems, etc.; these challenges will be overcome by big data with enhanced efficiency in fetching and storing of data. In this project, datasets resembling Electronic Medical Records (EMR) produced by numerous medical devices and mobile applications will be loaded into MongoDB using the Hadoop framework, with an improved processing technique to improve the outcome of processing patient records.

  10. Algorithm Design of CPCI Backboard's Interrupts Management Based on VxWorks' Multi-Tasks

    NASA Astrophysics Data System (ADS)

    Cheng, Jingyuan; An, Qi; Yang, Junfeng

    2006-09-01

This paper begins with a brief introduction to the embedded real-time operating system VxWorks and the CompactPCI standard, then gives the programming interfaces for Peripheral Controller Interface (PCI) configuration, interrupt handling and multi-task programming under VxWorks; emphasis is then placed on a software framework for CPCI interrupt management based on multiple tasks. This method is sound in design and easy to adapt, and it ensures that all possible interrupts are handled in time, which makes it suitable for multi-channel, high-data-rate, hard real-time data acquisition systems in high energy physics.
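    The core of such a framework is deferring interrupt work from the interrupt service routine to dedicated handler tasks through message queues. A rough sketch of that pattern, with Python threads standing in for VxWorks tasks (the actual msgQLib/taskLib calls are not reproduced here):

    ```python
    import queue
    import threading

    events = queue.Queue()  # stand-in for a VxWorks message queue
    handled = []

    def isr(irq_number):
        # At interrupt level only the minimum is done: post an event and return.
        events.put(irq_number)

    def handler_task():
        # A dedicated task drains the queue, so bursts of interrupts are not lost.
        while True:
            irq = events.get()
            if irq is None:      # sentinel used to stop this demo task
                break
            handled.append(irq)  # stand-in for the real device servicing

    worker = threading.Thread(target=handler_task)
    worker.start()
    for irq in range(5):         # a burst of five back-to-back interrupts
        isr(irq)
    events.put(None)
    worker.join()
    print(handled)               # all five serviced, in arrival order
    ```

    Keeping the ISR short and letting a prioritised task do the real work is what lets the design guarantee that no interrupt is dropped under load.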

  11. Maximum-likelihood estimation of parameterized wavefronts from multifocal data

    PubMed Central

    Sakamoto, Julia A.; Barrett, Harrison H.

    2012-01-01

    A method for determining the pupil phase distribution of an optical system is demonstrated. Coefficients in a wavefront expansion were estimated using likelihood methods, where the data consisted of multiple irradiance patterns near focus. Proof-of-principle results were obtained in both simulation and experiment. Large-aberration wavefronts were handled in the numerical study. Experimentally, we discuss the handling of nuisance parameters. Fisher information matrices, Cramér-Rao bounds, and likelihood surfaces are examined. ML estimates were obtained by simulated annealing to deal with numerous local extrema in the likelihood function. Rapid processing techniques were employed to reduce the computational time. PMID:22772282
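    As a toy illustration of the simulated-annealing strategy the authors use to escape local extrema (the objective below is an invented multimodal function, not the paper's wavefront likelihood):

    ```python
    import math
    import random

    random.seed(1)

    def log_like(a):
        # Invented multimodal objective standing in for a likelihood surface.
        return -(a - 2.0) ** 2 + 1.5 * math.cos(5.0 * a)

    a = 0.0
    best_a = a
    temp = 2.0
    for _ in range(20000):
        cand = a + random.gauss(0.0, 0.3)  # random perturbation
        delta = log_like(cand) - log_like(a)
        # Accept uphill moves always; downhill moves with odds set by temperature.
        if delta > 0 or random.random() < math.exp(delta / temp):
            a = cand
        if log_like(a) > log_like(best_a):
            best_a = a
        temp = max(1e-3, temp * 0.9995)    # geometric cooling schedule

    print(round(best_a, 2))  # near the global maximum, around a = 2.5 here
    ```

    At high temperature the sampler roams across local maxima; as the temperature cools it settles into the best basin found, which is how numerous local extrema in a likelihood surface can be negotiated.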

  12. Data Management System

    NASA Technical Reports Server (NTRS)

    1997-01-01

    CENTRA 2000 Inc., a wholly owned subsidiary of Auto-trol technology, obtained permission to use software originally developed at Johnson Space Center for the Space Shuttle and early Space Station projects. To support their enormous information-handling needs, a product data management, electronic document management and work-flow system was designed. Initially, just 33 database tables comprised the original software, which was later expanded to about 100 tables. This system, now called CENTRA 2000, is designed for quick implementation and supports the engineering process from preliminary design through release-to-production. CENTRA 2000 can also handle audit histories and provides a means to ensure new information is distributed. The product has 30 production sites worldwide.

  13. Modeling strength data for CREW CHIEF

    NASA Technical Reports Server (NTRS)

    Mcdaniel, Joe W.

    1990-01-01

The Air Force has developed CREW CHIEF, a computer-aided design (CAD) tool for simulating and evaluating aircraft maintenance to determine if the required activities are feasible. CREW CHIEF gives the designer the ability to simulate maintenance activities with respect to reach, accessibility, strength, hand tool operation, and materials handling. While developing CREW CHIEF, extensive research was performed to describe workers' strength capabilities for using hand tools and manual handling of objects. More than 100,000 strength measures were collected and modeled for CREW CHIEF. These measures involved both male and female subjects in the 12 maintenance postures included in CREW CHIEF. The data collection and modeling effort are described.

  14. [A guide to good practice for information security in the handling of personal health data by health personnel in ambulatory care facilities].

    PubMed

    Sánchez-Henarejos, Ana; Fernández-Alemán, José Luis; Toval, Ambrosio; Hernández-Hernández, Isabel; Sánchez-García, Ana Belén; Carrillo de Gea, Juan Manuel

    2014-04-01

    The appearance of electronic health records has led to the need to strengthen the security of personal health data in order to ensure privacy. Despite the large number of technical security measures and recommendations that exist to protect the security of health data, there is an increase in violations of the privacy of patients' personal data in healthcare organizations, which is in many cases caused by the mistakes or oversights of healthcare professionals. In this paper, we present a guide to good practice for information security in the handling of personal health data by health personnel, drawn from recommendations, regulations and national and international standards. The material presented in this paper can be used in the security audit of health professionals, or as a part of continuing education programs in ambulatory care facilities. Copyright © 2013 Elsevier España, S.L. All rights reserved.

  15. Recent Progress Towards Predicting Aircraft Ground Handling Performance

    NASA Technical Reports Server (NTRS)

    Yager, T. J.; White, E. J.

    1981-01-01

The significant progress which has been achieved in the development of aircraft ground handling simulation capability is reviewed and additional improvements in software modeling are identified. The problem associated with providing the necessary simulator input data for adequate modeling of aircraft tire/runway friction behavior is discussed, and efforts to improve this complex model, and hence simulator fidelity, are described. Aircraft braking performance data obtained on several wet runway surfaces are compared to ground vehicle friction measurements and, by use of empirically derived methods, good agreement between actual and estimated aircraft braking friction from ground vehicle data is shown. The performance of a relatively new friction measuring device, the friction tester, showed great promise in providing data applicable to aircraft friction performance. Additional research efforts to improve methods of predicting tire friction performance are discussed, including use of an instrumented tire test vehicle to expand the tire friction data bank and a study of surface texture measurement techniques.

  16. SEED 2: a user-friendly platform for amplicon high-throughput sequencing data analyses.

    PubMed

    Vetrovský, Tomáš; Baldrian, Petr; Morais, Daniel; Berger, Bonnie

    2018-02-14

Modern molecular methods have increased our ability to describe microbial communities. Along with the advances brought by new sequencing technologies, we now require intensive computational resources to make sense of the large numbers of sequences continuously produced. The software tools developed by the scientific community to address this demand, although very useful, require experience of the command-line environment and extensive training, and have steep learning curves, limiting their use. We created SEED 2, a graphical user interface for handling high-throughput amplicon-sequencing data under Windows operating systems. SEED 2 is the only sequence visualizer that empowers users with tools to handle amplicon-sequencing data of microbial community markers. It is suitable for any marker gene sequences obtained through Illumina, IonTorrent or Sanger sequencing. SEED 2 allows the user to process raw sequencing data, identify specific taxa, produce OTU tables, create sequence alignments and construct phylogenetic trees. Standard dual-core laptops with 8 GB of RAM can handle ca. 8 million Illumina PE 300 bp sequences (ca. 4 GB of data). SEED 2 was implemented in Object Pascal and uses internal functions and external software for amplicon data processing. SEED 2 is freeware, available at http://www.biomed.cas.cz/mbu/lbwrf/seed/ as a self-contained file, including all the dependencies, and does not require installation. Supplementary data contain a comprehensive list of supported functions. daniel.morais@biomed.cas.cz. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  17. A comparison of model-based imputation methods for handling missing predictor values in a linear regression model: A simulation study

    NASA Astrophysics Data System (ADS)

    Hasan, Haliza; Ahmad, Sanizah; Osman, Balkish Mohd; Sapri, Shamsiah; Othman, Nadirah

    2017-08-01

    In regression analysis, missing covariate data has been a common problem. Many researchers use ad hoc methods to overcome this problem due to the ease of implementation. However, these methods require assumptions about the data that rarely hold in practice. Model-based methods such as Maximum Likelihood (ML) using the expectation maximization (EM) algorithm and Multiple Imputation (MI) are more promising when dealing with difficulties caused by missing data. Then again, inappropriate methods of missing value imputation can lead to serious bias that severely affects the parameter estimates. The main objective of this study is to provide a better understanding regarding missing data concept that can assist the researcher to select the appropriate missing data imputation methods. A simulation study was performed to assess the effects of different missing data techniques on the performance of a regression model. The covariate data were generated using an underlying multivariate normal distribution and the dependent variable was generated as a combination of explanatory variables. Missing values in covariate were simulated using a mechanism called missing at random (MAR). Four levels of missingness (10%, 20%, 30% and 40%) were imposed. ML and MI techniques available within SAS software were investigated. A linear regression analysis was fitted and the model performance measures; MSE, and R-Squared were obtained. Results of the analysis showed that MI is superior in handling missing data with highest R-Squared and lowest MSE when percent of missingness is less than 30%. Both methods are unable to handle larger than 30% level of missingness.
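    The missing-at-random (MAR) mechanism used in such simulation studies can be sketched in a few lines (variable names, coefficients and missingness probabilities below are invented for illustration): missingness in one covariate depends only on another, fully observed covariate, never on the unseen value itself.

    ```python
    import random

    random.seed(42)

    # Generate covariates from correlated normals and a linear outcome.
    n = 1000
    data = []
    for _ in range(n):
        x1 = random.gauss(0, 1)             # always observed
        x2 = 0.5 * x1 + random.gauss(0, 1)  # covariate that will go missing
        y = 1.0 + 2.0 * x1 + 3.0 * x2 + random.gauss(0, 1)
        data.append((x1, x2, y))

    # MAR: the chance that x2 is missing depends only on the observed x1
    # (about 30% missing overall for these probabilities).
    def missing(x1):
        return random.random() < (0.45 if x1 >= 0 else 0.15)

    masked = [(x1, None if missing(x1) else x2, y) for x1, x2, y in data]
    frac = sum(x2 is None for _, x2, _ in masked) / n
    print(round(frac, 2))  # roughly 0.30
    ```

    Because the missingness probability is a function of observed data only, likelihood-based methods and multiple imputation can recover unbiased estimates under this mechanism, which is the premise the study's comparison rests on.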

  18. Efficient halal bleeding, animal handling, and welfare: A holistic approach for meat quality.

    PubMed

    Aghwan, Z A; Bello, A U; Abubakar, A A; Imlan, J C; Sazili, A Q

    2016-11-01

    Traditional halal slaughter and other forms of religious slaughter are still an issue of debate. Opposing arguments related to pre-slaughter handling, stress and pain associated with restraint, whether the incision is painful or not, and the onset of unconsciousness have been put forward, but no consensus has been achieved. There is a need to strike a balance between halal bleeding in the light of science and animal welfare. There is a paucity of scientific data with respect to animal welfare, particularly the use of restraining devices, animal handling, and efficient halal bleeding. However, this review found that competent handling of animals, proper use of restraining devices, and the efficient bleeding process that follows halal slaughter maintains meat eating quality. In conclusion, halal bleeding, when carried out in accordance with recommended animal welfare procedures, will not only maintain the quality and wholesomeness of meat but could also potentially reduce suffering and pain. Maintained meat quality increases consumer satisfaction and food safety. Copyright © 2016. Published by Elsevier Ltd.

  19. Mobile retroreflectivity best practices handbook.

    DOT National Transportation Integrated Search

    2009-07-01

    This handbook documents best practices related to proper use of the mobile retroreflectometer, sampling of : sites for data collection, and handling of mobile retroreflectivity data. The best practices described in this : handbook are derived from th...

  20. Test procedures and data input techniques for skid testing.

    DOT National Transportation Integrated Search

    1974-01-01

    The purpose of this report is to describe the system for obtaining and handling skid data, including skid testing procedures and data input procedures. While all testing devices used in Virginia are covered (other than the British portable tester), t...

  1. Free-Space Optical Interconnect Employing VCSEL Diodes

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.; Savich, Gregory R.; Torres, Heidi

    2009-01-01

Sensor signal processing is widely used on aircraft and spacecraft. The scheme employs multiple input/output (I/O) nodes for data acquisition and CPU (central processing unit) nodes for data processing. To connect I/O nodes and CPU nodes, scalable interconnections such as backplanes are desired because the number of nodes depends on the requirements of each mission. An optical backplane consisting of vertical-cavity surface-emitting lasers (VCSELs), VCSEL drivers, photodetectors, and transimpedance amplifiers is the preferred approach since it can handle several hundred megabits per second of data throughput. The next generation of satellite-borne systems will require transceivers and processors that can handle several Gb/s of data. Optical interconnects have been praised for both their speed and functionality, with hopes that light can relieve the electrical bottleneck predicted for the near future. Optoelectronic interconnects provide a factor of ten improvement over electrical interconnects.

  2. Evaluation of techniques for handling missing cost-to-charge ratios in the USA Nationwide Inpatient Sample: a simulation study.

    PubMed

    Yu, Tzy-Chyi; Zhou, Huanxue

    2015-09-01

    Evaluate performance of techniques used to handle missing cost-to-charge ratio (CCR) data in the USA Healthcare Cost and Utilization Project's Nationwide Inpatient Sample. Four techniques to replace missing CCR data were evaluated: deleting discharges with missing CCRs (complete case analysis), reweighting as recommended by the Healthcare Cost and Utilization Project, reweighting by adjustment cells, and hot deck imputation by adjustment cells. Bias and root mean squared error of these techniques on hospital cost were evaluated in five disease cohorts. Similar mean cost estimates would be obtained with any of the four techniques when the percentage of missing data is low (<10%). When total cost is the outcome of interest, a reweighting technique should be adopted to avoid underestimation from dropping observations with missing data.
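
    The hot-deck-by-adjustment-cells technique evaluated above can be sketched in a few lines: each missing value is replaced by a value drawn from an observed donor in the same cell. This is a toy illustration with hypothetical discharge records, not the HCUP implementation:

```python
import random
from collections import defaultdict

def hot_deck_impute(records, cell_key, value_key, rng=None):
    """Fill missing values by drawing a donor from the same adjustment cell."""
    rng = rng or random.Random(0)
    # Group observed (non-missing) values by adjustment cell.
    donors = defaultdict(list)
    for r in records:
        if r[value_key] is not None:
            donors[r[cell_key]].append(r[value_key])
    # Replace each missing value with a random donor from its cell
    # (cells with no observed donors are left missing).
    for r in records:
        if r[value_key] is None and donors[r[cell_key]]:
            r[value_key] = rng.choice(donors[r[cell_key]])
    return records

# Hypothetical example: impute missing cost-to-charge ratios within bed-size cells.
discharges = [
    {"bedsize": "small", "ccr": 0.42},
    {"bedsize": "small", "ccr": None},
    {"bedsize": "large", "ccr": 0.31},
    {"bedsize": "large", "ccr": None},
]
hot_deck_impute(discharges, "bedsize", "ccr")
```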

  3. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Box, Dennis

    2014-01-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGs) is well supported and made easier through the use of input flags and parameters.

  4. External quality-assurance programs managed by the U.S. Geological Survey in support of the National Atmospheric Deposition Program/National Trends Network

    USGS Publications Warehouse

    Latysh, Natalie E.; Wetherbee, Gregory A.

    2005-01-01

    The U.S. Geological Survey, Branch of Quality Systems, operates the external quality-assurance programs for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN). Beginning in 1978, six different programs have been implemented: the intersite-comparison program, the blind-audit program, the sample-handling evaluation program, the field-audit program, the interlaboratory-comparison program, and the collocated-sampler program. Each program was designed to measure error contributed by specific components in the data-collection process. The intersite-comparison program, which was discontinued in 2004, was designed to assess the accuracy and reliability of field pH and specific-conductance measurements made by site operators. The blind-audit and sample-handling evaluation programs, which also were discontinued in 2002 and 2004, respectively, assessed contamination that may result from sampling equipment and routine handling and processing of the wet-deposition samples. The field-audit program assesses the effects of sample handling, processing, and field exposure. The interlaboratory-comparison program evaluates bias and precision of analytical results produced by the contract laboratory for NADP, the Illinois State Water Survey, Central Analytical Laboratory, and compares its performance with the performance of international laboratories. The collocated-sampler program assesses the overall precision of wet-deposition data collected by NADP/NTN. This report documents historical operations and the operating procedures for each of these external quality-assurance programs. USGS quality-assurance information allows NADP/NTN data users to discern between actual environmental trends and inherent measurement variability.

  5. Principled Missing Data Treatments.

    PubMed

    Lang, Kyle M; Little, Todd D

    2018-04-01

    We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
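
    Multiple imputation, one of the principled treatments discussed, analyzes each of the m completed data sets and then combines the per-imputation results with Rubin's rules. A minimal sketch (the `pool_estimates` helper and the numbers are illustrative, not from the paper):

```python
from statistics import mean, variance

def pool_estimates(estimates, variances):
    """Combine m imputed-data point estimates and their variances via Rubin's rules."""
    m = len(estimates)
    q_bar = mean(estimates)              # pooled point estimate
    u_bar = mean(variances)              # within-imputation variance
    b = variance(estimates)              # between-imputation variance
    total_var = u_bar + (1 + 1 / m) * b  # total variance of the pooled estimate
    return q_bar, total_var

# Three hypothetical imputed-data estimates of the same regression coefficient.
q, v = pool_estimates([2.0, 2.2, 1.8], [0.10, 0.12, 0.11])
```

    The between-imputation term is what propagates the uncertainty due to the missing data itself, which listwise deletion and single imputation both ignore.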

  6. On-the-fly form generation and on-line metadata configuration--a clinical data management Web infrastructure in Java.

    PubMed

    Beck, Peter; Truskaller, Thomas; Rakovac, Ivo; Cadonna, Bruno; Pieber, Thomas R

    2006-01-01

    In this paper we describe the approach to build a web-based clinical data management infrastructure on top of an entity-attribute-value (EAV) database which provides for flexible definition and extension of clinical data sets as well as efficient data handling and high performance query execution. A "mixed" EAV implementation provides a flexible and configurable data repository and at the same time utilizes the performance advantages of conventional database tables for rarely changing data structures. A dynamically configurable data dictionary contains further information for data validation. The online user interface can also be assembled dynamically. A data transfer object which encapsulates data together with all required metadata is populated by the backend and directly used to dynamically render frontend forms and handle incoming data. The "mixed" EAV model enables flexible definition and modification of clinical data sets while reducing performance drawbacks of pure EAV implementations to a minimum. The system currently is in use in an electronic patient record with focus on flexibility and a quality management application (www.healthgate.at) with high performance requirements.
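
    The "mixed" EAV idea — conventional tables for rarely changing structures, entity-attribute-value rows for flexible clinical items — can be sketched with SQLite; the table and column names here are hypothetical, not those of the system described:

```python
import sqlite3

# Toy "mixed" EAV schema: stable demographics live in a conventional table,
# while variable clinical items go into a generic entity-attribute-value table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT, birth_year INTEGER);
CREATE TABLE eav (patient_id INTEGER, attribute TEXT, value TEXT);
""")
con.execute("INSERT INTO patient VALUES (1, 'Doe', 1970)")
con.executemany("INSERT INTO eav VALUES (?, ?, ?)",
                [(1, "hba1c", "6.8"), (1, "weight_kg", "82")])

# Pivot the EAV rows back into a single record, e.g. to render a frontend form.
rows = con.execute("SELECT attribute, value FROM eav WHERE patient_id = 1").fetchall()
record = dict(rows)
```

    New attributes can be added by inserting rows rather than altering the schema, which is the flexibility the abstract attributes to EAV; the conventional `patient` table keeps fast, typed queries for the stable part of the data set.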

  7. Master control data handling program uses automatic data input

    NASA Technical Reports Server (NTRS)

    Alliston, W.; Daniel, J.

    1967-01-01

    General purpose digital computer program is applicable for use with analysis programs that require basic data and calculated parameters as input. It is designed to automate input data preparation for flight control computer programs, but it is general enough to permit application in other areas.

8. ASCEM Data Browser (ASCEMDB) v0.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ROMOSAN, ALEXANDRU

    Data management tool designed for the Advanced Simulation Capability for Environmental Management (ASCEM) framework. Distinguishing features of this gateway include: (1) handling of complex geometry data, (2) an advanced selection mechanism, (3) state-of-the-art rendering of spatiotemporal data records, and (4) seamless integration with a distributed workflow engine.

  9. Using R for large spatiotemporal data sets

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer

    2017-04-01

    Writing and sharing scientific software is a means to communicate scientific ideas for finding scientific consensus, no more and no less than writing and sharing scientific papers is. Important factors for successful communication are adopting an open source environment, and using a language that is understood by many. For many scientists, R's combination of rich data abstraction and highly exposed data structures makes it an attractive communication tool. This paper discusses the development of spatial and spatiotemporal data handling and analysis with R since 2000, and will point to some of R's strengths and weaknesses in a historical perspective. We will also discuss a new, S3-based package for feature data ("Simple Features for R"), and point to a way forward into the data science realm, where pipeline-based workflows are assumed. Finally, we will discuss how, in a similar vein, massive satellite or climate model data sets, potentially held in a cloud environment, can be handled and analyzed with R.

  10. OAST Space Theme Workshop. Volume 3: Working group summary. 2: Data handling, communications (E-2). A. Statement. B. Technology needs (form 1). C. Priority assessment (form 2)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Technologies required to support the stated OAST thrust to increase information return by X1000, while reducing costs by a factor of 10, are identified. The most significant driver is the need for an overall end-to-end data system management technology. Maximum use of LSI component technology and trade-offs between hardware and software are manifest in almost all considerations of technology needs. By far, the greatest need for data handling technology was identified for the space Exploration and Global Services themes. Major advances are needed in NASA's ability to provide cost effective mass reduction of space data, and automated assessment of earth looking imagery, with a concomitant reduction in cost per useful bit. A combined approach embodying end-to-end system analysis, with onboard data set selection, onboard data processing, highly parallel image processing (both ground and space), low cost, high capacity memories, and low cost user data distribution systems would be necessary.

  11. Qualification Standards for Personnel Responsible for Hazardous or Noxious Chemicals in Bulk. Volume I.

    DTIC Science & Technology

    1976-05-01

    relate to qualifications and training of chemical handling personnel aboard tankships and tank barges for two cargo containment systems (i.e., ambient... Transportation by Water; Human Factors; Functional Job Analysis; Tank Ship; Tank Barge; Chemical Tankerman; Educational Curriculum; Personnel... for safe handling of hazardous chemicals transported in bulk by tankships and tank barges. One of the results of this study is a data bank of tasks

  12. High performance interconnection between high data rate networks

    NASA Technical Reports Server (NTRS)

    Foudriat, E. C.; Maly, K.; Overstreet, C. M.; Zhang, L.; Sun, W.

    1992-01-01

    The bridge/gateway system needed to interconnect a wide range of computer networks to support a wide range of user quality-of-service requirements is discussed. The bridge/gateway must handle a wide range of message types including synchronous and asynchronous traffic, large, bursty messages, short, self-contained messages, time critical messages, etc. It is shown that messages can be classified into three basic classes: synchronous messages and large and small asynchronous messages. The first two require call setup so that packet identification, buffer handling, etc. can be supported in the bridge/gateway. Identification enables resequencing across differences in packet size. The third class is for messages which do not require call setup. Resequencing hardware designed to handle two types of resequencing problems is presented. The first is for a virtual parallel circuit which can scramble channel bytes. The second system is effective in handling both synchronous and asynchronous traffic between networks with highly differing packet sizes and data rates. The two other major needs for the bridge/gateway are congestion and error control. A dynamic, lossless congestion control scheme which can easily support effective error correction is presented. Results indicate that the congestion control scheme provides close to optimal capacity under congested conditions. Under conditions where errors may develop due to intervening networks which are not lossless, intermediate error recovery and correction takes 1/3 less time than equivalent end-to-end error correction under similar conditions.
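
    A resequencing buffer of the kind the abstract describes can be sketched as a priority queue keyed on sequence number, releasing packets only once all predecessors have arrived. This is a generic software illustration, not the hardware design presented in the paper:

```python
import heapq

class Resequencer:
    """Release packets in send order even when they arrive scrambled
    (e.g., over a virtual parallel circuit that reorders channel bytes)."""

    def __init__(self):
        self.next_seq = 0  # sequence number we must deliver next
        self.heap = []     # min-heap of (seq, payload) packets held back

    def receive(self, seq, payload):
        """Buffer an arriving packet; return all packets now deliverable in order."""
        heapq.heappush(self.heap, (seq, payload))
        out = []
        while self.heap and self.heap[0][0] == self.next_seq:
            out.append(heapq.heappop(self.heap)[1])
            self.next_seq += 1
        return out

r = Resequencer()
r.receive(1, "b")              # held back: packet 0 not yet seen
delivered = r.receive(0, "a")  # releases packet 0, then the buffered packet 1
```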

  13. Data Handling and Communication

    NASA Astrophysics Data System (ADS)

    Hemmer, Frédéric; Innocenti, Pier Giorgio

    The following sections are included: * Introduction * Computing Clusters and Data Storage: The New Factory and Warehouse * Local Area Networks: Organizing Interconnection * High-Speed Worldwide Networking: Accelerating Protocols * Detector Simulation: Events Before the Event * Data Analysis and Programming Environment: Distilling Information * World Wide Web: Global Networking * References

  14. Medical Information Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Hipkins, K. R.; Friedman, C. A.

    1979-01-01

    On-line interactive information processing system easily and rapidly handles all aspects of data management related to patient care. General purpose system is flexible enough to be applied to other data management situations found in areas such as occupational safety data, judicial information, or personnel records.

  15. Data Validation and Sharing in a Large Research Program

    EPA Science Inventory

    Appropriate data handling practices are important in the support of large research teams with shifting and competing priorities. Determining those best practices is an ongoing effort for the US EPA’s National Aquatic Resource Surveys. We focus on the well understood data ...

  16. 7 CFR 800.99 - Checkweighing sacked grain.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the instructions. (c) Protecting samples and data. Official personnel and other employees of an agency or the Service shall protect official weight samples and data from manipulation, substitution, and improper and careless handling which might deprive the samples and sample data of their representativeness...

  17. 7 CFR 800.99 - Checkweighing sacked grain.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the instructions. (c) Protecting samples and data. Official personnel and other employees of an agency or the Service shall protect official weight samples and data from manipulation, substitution, and improper and careless handling which might deprive the samples and sample data of their representativeness...

  18. UK to train 100 PhD students in data science

    NASA Astrophysics Data System (ADS)

    Allen, Michael

    2017-12-01

    A new PhD programme to develop techniques to handle the vast amounts of data being generated by experiments and facilities has been launched by the UK's Science and Technology Facilities Council (STFC).

  19. Using HST: From proposal to science

    NASA Technical Reports Server (NTRS)

    Shames, P.

    1991-01-01

    The following subject areas are covered: a short history; uses of networks at STScI (general communication, science collaboration, functional activities, internal data management, and external data access); proposal/observation handling; DMF access; and future uses and requirements.

  20. Earth and Space Science Informatics: Raising Awareness of the Scientists and the Public

    NASA Astrophysics Data System (ADS)

    Messerotti, M.; Cobabe-Ammann, E.

    2009-04-01

    The recent developments in Earth and Space Science Informatics led to the availability of advanced tools for data search, visualization and analysis through e.g. the Virtual Observatories or distributed data handling infrastructures. Such facilities are accessible via web interfaces and allow refined data handling to be carried out. Nevertheless, to date these facilities are not widely exploited by the scientific community, for a variety of reasons that we analyze in this work while considering viable strategies to overcome the issue. Similarly, such facilities are powerful tools for teaching and for popularization, provided that e-learning programs for teachers and for science communicators, respectively, are made available. In this context we consider the present activities and projects, stressing the role and the legacy of the Electronic Geophysical Year.

  1. LANDSAT-D data format control book. Volume 5: (Payload)

    NASA Technical Reports Server (NTRS)

    Andrew, H.

    1981-01-01

    The LANDSAT-D flight segment payload is the thematic mapper and the multispectral scanner. Narrative and visual descriptions of the LANDSAT-D payload data handling hardware and data flow paths from the sensing instruments through to the GSFC LANDSAT-D data management system are provided. Key subsystems are examined.

  2. Reporting Capabilities and Management of the DSN Energy Data Base

    NASA Technical Reports Server (NTRS)

    Hughes, R. D.; Boyd, S. T.

    1981-01-01

    The DSN Energy Data Base is a collection of computer files developed and maintained by DSN Engineering. The energy consumption data must be updated monthly and summarized and displayed in printed output as desired. The methods used to handle the data and perform these tasks are described.

  3. Mouse handling limits the impact of stress on metabolic endpoints.

    PubMed

    Ghosal, Sriparna; Nunley, Amanda; Mahbod, Parinaz; Lewis, Alfor G; Smith, Eric P; Tong, Jenny; D'Alessio, David A; Herman, James P

    2015-10-15

    Studies focused on end-points that are confounded by stress are best performed under minimally stressful conditions. The objective of this study was to demonstrate the impact of handling designed to reduce animal stress on measurements of glucose tolerance. A cohort of mice (CD1.C57BL/6) naïve to any specific handling was subjected to either a previously described "cup" handling method, or a "tail-picked" method in which the animals were picked up by the tail (as is common for metabolic studies). Following training, an elevated plus maze (EPM) test was performed followed by measurement of blood glucose and plasma corticosterone. A second cohort (CD1.C57BL/6) was rendered obese by exposure to a high fat diet, handled with either the tail-picked or cup method and subjected to an intraperitoneal glucose tolerance test. A third cohort of C57BL/6 mice was exposed to a cup regimen that included a component of massage and was subjected to tests of anxiety-like behavior, glucose homeostasis, and corticosterone secretion. We found that the cup mice showed reduced anxiety-like behaviors in the EPM coupled with a reduction in blood glucose levels compared to mice handled by the tail-picked method. Additionally, cup mice on the high fat diet exhibited improved glucose tolerance compared to tail-picked controls. Finally, we found that the cup/massage group showed lower glucose levels following an overnight fast, and decreased anxiety-like behaviors associated with lower stress-induced plasma corticosterone concentration compared to tail-picked controls. These data demonstrate that application of handling methods that reduce anxiety-like behaviors in mice mitigates the confounding contribution of stress to interpretation of metabolic endpoints (such as glucose tolerance). Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Mouse Handling Limits the Impact of Stress on Metabolic Endpoints

    PubMed Central

    Ghosal, Sriparna; Nunley, Amanda; Mahbod, Parinaz; Lewis, Alfor G.; Smith, Eric P.; Tong, Jenny; D’Alessio, David A.; Herman, James P.

    2015-01-01

    Studies focused on end-points that are confounded by stress are best performed under minimally stressful conditions. The objective of this study was to demonstrate the impact of handling designed to reduce animal stress on measurements of glucose tolerance. A cohort of mice (CD1.C57BL/6) naïve to any specific handling was subjected to either a previously described “cup” handling method, or a “tail-picked” method in which the animals were picked up by the tail (as is common for metabolic studies). Following training, an elevated plus maze (EPM) test was performed followed by measurement of blood glucose and plasma corticosterone. A second cohort (CD1.C57BL/6) was rendered obese by exposure to a high fat diet, handled with either the tail-picked or cup method and subjected to an intraperitoneal glucose tolerance test. A third cohort of C57BL/6 mice was exposed to a cup regimen that included a component of massage and was subjected to tests of anxiety-like behavior, glucose homeostasis, and corticosterone secretion. We found that the cup mice showed reduced anxiety-like behaviors in the EPM coupled with a reduction in blood glucose levels compared to mice handled by the tail-picked method. Additionally, cup mice on the high fat diet exhibited improved glucose tolerance compared to tail-picked controls. Finally, we found that the cup/massage group showed lower glucose levels following an overnight fast, and decreased anxiety-like behaviors associated with lower stress-induced plasma corticosterone concentration compared to tail-picked controls. These data demonstrate that application of handling methods that reduce anxiety-like behaviors in mice mitigates the confounding contribution of stress to interpretation of metabolic endpoints (such as glucose tolerance). PMID:26079207

  5. Identification of unique food handling practices that could represent food safety risks for minority consumers.

    PubMed

    Henley, Shauna C; Stein, Susan E; Quinlan, Jennifer J

    2012-11-01

    Foodborne illness caused by Salmonella and Campylobacter is a concern for consumers, and there is evidence that minority racial-ethnic populations experience greater rates of illness because of these pathogens. The limited body of research concerning food safety knowledge and practices among minority consumers has focused more on general food safety knowledge than on culturally specific food handling practices. The purpose of the research reported here was to explore food handling behaviors of minority racial-ethnic consumers through in-depth discussions in focus group settings. In this way, we hoped to identify potential unique, previously unidentified food handling practices among these consumers. Nine focus groups were held in Philadelphia, PA. Three focus groups were conducted with African American consumers, three with Hispanic consumers, and three with Asian consumers. In all, 56 consumers participated. Data were recorded, transcribed, and analyzed for unique and potentially unsafe food handling behaviors. Potentially unsafe food handling practices identified among all three groups included extended time to transport food from retail to home and washing of raw poultry. Culturally unique behaviors within groups included (i) using hot water (Asian, Hispanic) or acidic solutions (African American, Hispanic) to clean raw poultry, (ii) purchasing live poultry (Asian, Hispanic), (iii) cooking poultry overnight (African American), and (iv) preparing bite-size pieces of meat prior to cooking (Asian, Hispanic). Because focus groups include a limited number of participants and rely on nonrandom sampling, these themes and trends cannot be extrapolated to represent food mishandling among these populations in general. Results presented here allow modification of an existing food safety survey to identify the prevalence of these food handling practices among consumers of different demographics.

  6. A prospective three-step intervention study to prevent medication errors in drug handling in paediatric care.

    PubMed

    Niemann, Dorothee; Bertsche, Astrid; Meyrath, David; Koepf, Ellen D; Traiser, Carolin; Seebald, Katja; Schmitt, Claus P; Hoffmann, Georg F; Haefeli, Walter E; Bertsche, Thilo

    2015-01-01

    To prevent medication errors in drug handling in a paediatric ward. One in five preventable adverse drug events in hospitalised children is caused by medication errors. Errors in drug prescription have been studied frequently, but data regarding drug handling, including drug preparation and administration, are scarce. A three-step intervention study including a monitoring procedure was used to detect and prevent medication errors in drug handling. After approval by the ethics committee, pharmacists monitored drug handling by nurses on an 18-bed paediatric ward in a university hospital prior to and following each intervention step. They also conducted a questionnaire survey aimed at identifying knowledge deficits. Each intervention step targeted different causes of errors. The handout mainly addressed knowledge deficits, the training course addressed errors caused by rule violations and slips, and the reference book addressed knowledge-, memory- and rule-based errors. The number of patients who were subjected to at least one medication error in drug handling decreased from 38/43 (88%) to 25/51 (49%) following the third intervention, and the overall frequency of errors decreased from 527 errors in 581 processes (91%) to 116/441 (26%). Issuing the handout reduced medication errors caused by knowledge deficits regarding, for instance, the correct 'volume of solvent for IV drugs' from 49% to 25%. Paediatric drug handling is prone to errors. A three-step intervention effectively decreased the high frequency of medication errors by addressing the diversity of their causes. Worldwide, nurses are in charge of drug handling, which constitutes an error-prone but often-neglected step in drug therapy. Detection and prevention of errors in daily routine is necessary for a safe and effective drug therapy. Our three-step intervention reduced errors and is suitable to be tested in other wards and settings. © 2014 John Wiley & Sons Ltd.

  7. Cubic map algebra functions for spatio-temporal analysis

    USGS Publications Warehouse

    Mennis, J.; Viger, R.; Tomlin, C.D.

    2005-01-01

    We propose an extension of map algebra to three dimensions for spatio-temporal data handling. This approach yields a new class of map algebra functions that we call "cube functions." Whereas conventional map algebra functions operate on data layers representing two-dimensional space, cube functions operate on data cubes representing two-dimensional space over a third-dimensional period of time. We describe the prototype implementation of a spatio-temporal data structure and selected cube function versions of conventional local, focal, and zonal map algebra functions. The utility of cube functions is demonstrated through a case study analyzing the spatio-temporal variability of remotely sensed, southeastern U.S. vegetation character over various land covers and during different El Niño/Southern Oscillation (ENSO) phases. Like conventional map algebra, cube functions may demand significant data preprocessing when integrating diverse data sets, and they are subject to limitations related to data storage and algorithm performance. Solutions to these issues include extending data compression and computing strategies for calculations on very large data volumes to spatio-temporal data handling.
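
    The local and zonal cube functions described can be illustrated on a tiny (time, rows, cols) array; the function names and data are illustrative, not the authors' implementation:

```python
import numpy as np

def cube_local_mean(cube):
    """A 'local' cube function: the per-cell mean through time.
    cube has shape (time, rows, cols); the result collapses the time axis."""
    return cube.mean(axis=0)

def cube_zonal_mean(cube, zones):
    """A 'zonal' cube function: the mean over all times and cells of each zone id."""
    return {z: cube[:, zones == z].mean() for z in np.unique(zones)}

# Two time steps over a 2x2 grid, partitioned into two zones (e.g., land covers).
cube = np.array([[[1.0, 2.0], [3.0, 4.0]],
                 [[3.0, 4.0], [5.0, 6.0]]])
zones = np.array([[0, 0], [1, 1]])
local = cube_local_mean(cube)          # 2x2 grid of temporal means
zonal = cube_zonal_mean(cube, zones)   # one summary value per zone
```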

  8. Practical Applications of Data Processing to School Purchasing.

    ERIC Educational Resources Information Center

    California Association of School Business Officials, San Diego. Imperial Section.

    Electronic data processing provides a fast and accurate system for handling large volumes of routine data. If properly employed, computers can perform myriad functions for purchasing operations, including purchase order writing; equipment inventory control; vendor inventory; and equipment acquisition, transfer, and retirement. The advantages of…

  9. PaleoMac: A Macintosh™ application for treating paleomagnetic data and making plate reconstructions

    NASA Astrophysics Data System (ADS)

    Cogné, J. P.

    2003-01-01

    This brief note provides an overview of a new Macintosh™ application, PaleoMac (MacOS 8.0 or later, 15 MB RAM required), which permits rapid processing of paleomagnetic data, from the demagnetization data acquired in the laboratory, to the treatment of paleomagnetic poles, plate reconstructions, finite rotation computations on a sphere, and characterization of relative plate motions. Capabilities of PaleoMac include (1) high interactivity between the user and data displayed on screen, which provides a fast and easy way to handle, add and remove data or contours, perform computations on subsets of points, change projections, sizes, etc.; (2) performance of all standard principal component analysis and statistical processing on a sphere ([, 1953], etc.); (3) output of high quality plots, compatible with graphic programs such as Adobe Illustrator, and output of numerical results as ASCII files. Beyond its usefulness in treating paleomagnetic data, its ability to handle plate motion computations should be of broad interest to the Earth science community.
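
    The core spherical computation behind such tools — the vector-sum mean of a set of directions, from which Fisher statistics follow — can be sketched as follows (a generic illustration, not PaleoMac code):

```python
import math

def mean_direction(decs_incs):
    """Vector-sum mean of paleomagnetic directions given as
    (declination, inclination) pairs in degrees; returns the mean
    declination, mean inclination, and resultant length R."""
    x = y = z = 0.0
    for dec, inc in decs_incs:
        d, i = math.radians(dec), math.radians(inc)
        x += math.cos(i) * math.cos(d)
        y += math.cos(i) * math.sin(d)
        z += math.sin(i)
    r = math.sqrt(x * x + y * y + z * z)
    dec_m = math.degrees(math.atan2(y, x)) % 360.0
    inc_m = math.degrees(math.asin(z / r))
    return dec_m, inc_m, r  # R feeds the Fisher precision estimate k ~ (N-1)/(N-R)

# Three hypothetical specimen directions clustered near north, steeply down.
dirs = [(10.0, 45.0), (350.0, 40.0), (0.0, 50.0)]
dec_m, inc_m, r = mean_direction(dirs)
```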

  10. SpaceWire Data Handling Demonstration System

    NASA Astrophysics Data System (ADS)

    Mills, S.; Parkes, S. M.; O'Gribin, N.

    2007-08-01

    The SpaceWire standard was published in 2003 with the aim of providing a standard for onboard communications, defining the physical and data link layers of an interconnection, in order to improve reusability, reliability and to reduce the cost of mission development. The many benefits which it provides mean that it has already been used in a number of missions, both in Europe and throughout the world. Recent work by the SpaceWire community has included the development of higher level protocols for SpaceWire, such as the Remote Memory Access Protocol (RMAP) which can be used for many purposes, including the configuration of SpaceWire devices. Although SpaceWire has become very popular, the various ways in which it can be used are still being discovered, as are the most efficient ways to use it. At the same time, some in the space industry are not even aware of SpaceWire's existence. This paper describes the SpaceWire Data Handling Demonstration System that has been developed by the University of Dundee. This system simulates an onboard data handling network based on SpaceWire. It uses RMAP for all communication, and so demonstrates how SpaceWire and standardised higher level protocols can be used onboard a spacecraft. The system is not only a good advert for those who are unfamiliar with the benefits of SpaceWire, it is also a useful tool for those using SpaceWire to test ideas.

  11. Capuchin monkeys, Cebus apella fail to understand a cooperative task

    PubMed

    Chalmeau; Visalberghi; Gallo

    1997-11-01

    We investigated whether capuchin monkeys cooperate to solve a task and to what extent they take into account the behaviour of another individual when cooperating. Two groups of capuchin monkeys (N=5 and 6) were tested in a task whose solution required simultaneous pulling of two handles which were too far from one another to be pulled by one monkey. Before carrying out the cooperation study, individual monkeys were trained to pull one handle (training phase 1) and to pull two handles simultaneously (training phase 2) for a food reward. Nine subjects were successful in training phase 1, and five in training phase 2. In the cooperation study seven subjects were successful, that is, pulled one handle while a companion pulled the other. Further analyses revealed that capuchins did not increase their pulling actions when a partner was close to or at the other handle, that is, when cooperation might occur. These data suggest that capuchin monkeys acted together at the task and got the reward without understanding the role of the partner and without taking its behaviour into consideration. Social tolerance, as well as their tendency to explore and their manual dexterity, were the major factors accounting for the capuchins' success. Copyright 1997 The Association for the Study of Animal Behaviour.

  12. Physical load handling and listening comprehension effects on balance control.

    PubMed

    Qu, Xingda

    2010-12-01

    The purpose of this study was to determine the physical load handling and listening comprehension effects on balance control. A total of 16 young and 16 elderly participants were recruited in this study. The physical load handling task required holding a 5-kg load in each hand with arms at sides. The listening comprehension task involved attentive listening to a short conversation. Three short questions were asked regarding the conversation right after the testing trial to test the participants' attentiveness during the experiment. Balance control was assessed by centre of pressure-based measures, which were calculated from the force platform data when the participants were quietly standing upright on a force platform. Results from this study showed that both physical load handling and listening comprehension adversely affected balance control. Physical load handling had a more deleterious effect on balance control under the listening comprehension condition vs. no-listening comprehension condition. Based on the findings from this study, interventions for the improvement of balance could be focused on avoiding exposures to physically demanding tasks and cognitively demanding tasks simultaneously. STATEMENT OF RELEVANCE: Findings from this study can aid in better understanding how humans maintain balance, especially when physical and cognitive loads are applied. Such information is useful for developing interventions to prevent fall incidents and injuries in occupational settings and daily activities.

  13. Land characteristics data on CD-ROM

    USGS Publications Warehouse

    1996-01-01

    The U.S. Geological Survey (USGS) publishes land characteristics data on CD-ROM. The following lists the types of products and their contents and specifications. The discs cost $32 each, plus a $5.00 handling fee per order mailed.

  14. Efficient accesses of data structures using processing near memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayasena, Nuwan S.; Zhang, Dong Ping; Diez, Paula Aguilera

    Systems, apparatuses, and methods for implementing efficient queues and other data structures. A queue may be shared among multiple processors and/or threads without using explicit software atomic instructions to coordinate access to the queue. System software may allocate an atomic queue and corresponding queue metadata in system memory and return, to the requesting thread, a handle referencing the queue metadata. Any number of threads may utilize the handle for accessing the atomic queue. The logic for ensuring the atomicity of accesses to the atomic queue may reside in a management unit in the memory controller coupled to the memory where the atomic queue is allocated.
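
    The handle-based access pattern described above can be sketched in software. In the design summarized here the serialization logic lives in a management unit inside the memory controller; the minimal Python sketch below approximates that unit with a lock, and all names are invented for illustration.

```python
import threading
from collections import deque

class AtomicQueueManager:
    """Stand-in for the memory-side management unit: it alone serializes
    access to each queue, so client threads need no atomic instructions."""
    def __init__(self):
        self._queues = {}          # handle -> (lock, deque) "queue metadata"
        self._next_handle = 0

    def allocate(self):
        """Mimics system software allocating a queue and returning a handle."""
        handle = self._next_handle
        self._next_handle += 1
        self._queues[handle] = (threading.Lock(), deque())
        return handle

    def enqueue(self, handle, item):
        lock, q = self._queues[handle]
        with lock:                 # the management unit's serialization point
            q.append(item)

    def dequeue(self, handle):
        lock, q = self._queues[handle]
        with lock:
            return q.popleft() if q else None
```

    Any number of threads can share the returned handle; because coordination happens in one place, the clients themselves never issue atomic instructions, mirroring the division of labour described in the abstract.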

  15. pyres: a Python wrapper for electrical resistivity modeling with R2

    NASA Astrophysics Data System (ADS)

    Befus, Kevin M.

    2018-04-01

    A Python package, pyres, was written to handle common as well as specialized input and output tasks for the R2 electrical resistivity (ER) modeling program. Input steps, including handling field data, creating quadrilateral or triangular meshes, and data filtering, allow repeatable and flexible ER modeling within a programming environment. pyres includes non-trivial routines and functions for locating and constraining specific known or separately-parameterized regions in both quadrilateral and triangular meshes. Three basic examples of how to run forward and inverse models with pyres are provided. The importance of testing mesh convergence and model sensitivity is also addressed with higher-level examples that show how pyres can facilitate future research-grade ER analyses.

  16. Flexible automated approach for quantitative liquid handling of complex biological samples.

    PubMed

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  17. Graph Structured Program Evolution: Evolution of Loop Structures

    NASA Astrophysics Data System (ADS)

    Shirakawa, Shinichi; Nagao, Tomoharu

    Recently, numerous automatic programming techniques have been developed and applied in various fields. A typical example is genetic programming (GP), and various extensions and representations of GP have been proposed thus far. Complex programs and hand-written programs, however, may contain several loops and handle multiple data types. In this chapter, we propose a new method called Graph Structured Program Evolution (GRAPE). The representation of GRAPE is a graph structure; therefore, it can represent branches and loops using this structure. Each program is constructed as an arbitrary directed graph of nodes and a data set. The GRAPE program handles multiple data types using the data set for each type, and the genotype of GRAPE takes the form of a linear string of integers. We apply GRAPE to three test problems, factorial, exponentiation, and list sorting, and demonstrate that the optimum solution in each problem is obtained by the GRAPE system.
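
    The key GRAPE idea, a program represented as an arbitrary directed graph whose back-edges form loops, can be illustrated with a small interpreter. The node set and semantics below are invented for this sketch and are not the authors' actual node library; a hand-written factorial graph stands in for an evolved one.

```python
# Each node: (operation, successor indices). Because the program is an
# arbitrary directed graph, a back-edge gives us a loop -- the feature
# GRAPE adds over tree-based GP. Node semantics invented for illustration.
def run_graph(nodes, n, max_steps=10_000):
    data = {"acc": 1, "i": 1, "n": n}    # the typed "data set" (ints only here)
    node = 0
    for _ in range(max_steps):
        op, succ = nodes[node]
        if op == "mul_acc":              # acc *= i
            data["acc"] *= data["i"]
            node = succ[0]
        elif op == "inc_i":              # i += 1
            data["i"] += 1
            node = succ[0]
        elif op == "branch_le":          # follow succ[0] while i <= n
            node = succ[0] if data["i"] <= data["n"] else succ[1]
        elif op == "output":
            return data["acc"]
    raise RuntimeError("step limit reached")

# A hand-written genotype for factorial: branch -> mul -> inc -> back to branch.
factorial_graph = [
    ("branch_le", (1, 3)),
    ("mul_acc", (2,)),
    ("inc_i", (0,)),
    ("output", ()),
]
```

    Because node 2 points back to node 0, the graph encodes a loop directly, something a tree-structured GP genotype cannot do without dedicated loop primitives.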

  18. Monitoring of the infrastructure and services used to handle and automatically produce Alignment and Calibration conditions at CMS

    NASA Astrophysics Data System (ADS)

    Sipos, Roland; Govi, Giacomo; Franzoni, Giovanni; Di Guida, Salvatore; Pfeiffer, Andreas

    2017-10-01

    The CMS experiment at the CERN LHC has a dedicated infrastructure to handle the alignment and calibration data. This infrastructure is composed of several services, which take on various data management tasks required for the consumption of the non-event data (also referred to as condition data) in the experiment's activities. The criticality of these tasks imposes tight requirements on the availability and reliability of the services executing them. In this scope, a comprehensive monitoring and alarm-generating system has been developed. The system is based on Nagios, the open-source industry standard for monitoring and alerting, and monitors the database back-end, the hosting nodes and key heart-beat functionalities for all the services involved. This paper describes the design, implementation and operational experience with the monitoring system developed and deployed at CMS in 2016.

  19. Tourism forecasting using modified empirical mode decomposition and group method of data handling

    NASA Astrophysics Data System (ADS)

    Yahya, N. A.; Samsudin, R.; Shabri, A.

    2017-09-01

    In this study, a hybrid model using modified Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed for tourism forecasting. This approach reconstructs the intrinsic mode functions (IMFs) produced by EMD using a trial-and-error method. The new component and the remaining IMFs are then predicted separately using GMDH models. Finally, the forecasts for each component are aggregated to construct an ensemble forecast. The data used in this experiment are monthly time series of tourist arrivals from China, Thailand and India to Malaysia from 2000 to 2016. The performance of the model is evaluated using Root Mean Square Error (RMSE) and Mean Absolute Percentage Error (MAPE), with the conventional GMDH model and the EMD-GMDH model as benchmarks. Empirical results show that the proposed model produces better forecasts than the benchmark models.
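
    The decompose-predict-aggregate structure of such hybrids can be sketched as follows. This is a schematic only: a moving-average split stands in for EMD, and a persistence forecast stands in for the per-component GMDH models.

```python
def decompose(y, window=6):
    """Crude stand-in for EMD: split the series into a smooth component
    (a centred moving average) and a residual 'IMF'. Real EMD extracts
    several IMFs by sifting; one split keeps the sketch short."""
    half = window // 2
    trend = []
    for i in range(len(y)):
        lo, hi = max(0, i - half), min(len(y), i + half + 1)
        trend.append(sum(y[lo:hi]) / (hi - lo))
    residual = [a - b for a, b in zip(y, trend)]
    return [residual, trend]

def naive_forecast(component, horizon):
    """Stand-in for a per-component GMDH model: persistence forecast."""
    return [component[-1]] * horizon

def hybrid_forecast(y, horizon=3):
    # Forecast each component separately, then aggregate by summation --
    # the decompose-predict-aggregate structure of the EMD-GMDH hybrid.
    components = decompose(y)
    forecasts = [naive_forecast(c, horizon) for c in components]
    return [sum(vals) for vals in zip(*forecasts)]
```

    In the actual method each reconstructed IMF gets its own trained GMDH network, but the aggregation by summation is the same.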

  20. SeqCompress: an algorithm for biological sequence compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz; Bajwa, Hassan

    2014-10-01

    The growth of Next Generation Sequencing technologies presents significant research challenges, specifically to design bioinformatics tools that handle massive amounts of data efficiently. Biological sequence data storage cost has become a noticeable proportion of the total cost of sequence generation and analysis. In particular, the increase in DNA sequencing rate is significantly outstripping the rate of increase in disk storage capacity, and may eventually exceed available storage. It is essential to develop algorithms that handle large data sets via better memory management. This article presents a DNA sequence compression algorithm, SeqCompress, that copes with the space complexity of biological sequences. The algorithm is based on lossless data compression and uses a statistical model as well as arithmetic coding to compress DNA sequences. The proposed algorithm is compared with recent specialized compression tools for biological sequences. Experimental results show that the proposed algorithm achieves better compression gain than other existing algorithms. Copyright © 2014 Elsevier Inc. All rights reserved.
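
    SeqCompress itself combines a statistical model with arithmetic coding; as a much simpler illustration of why DNA text compresses well below 8 bits per symbol, the sketch below packs the four-letter alphabet at 2 bits per base. This is not the paper's algorithm, just a lower bound on the same redundancy.

```python
CODE = {"A": 0, "C": 1, "G": 2, "T": 3}
BASES = "ACGT"

def pack(seq):
    """Pack a DNA string into bytes, 4 bases per byte (2 bits per base)."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        group = seq[i:i + 4]
        byte = 0
        for base in group:
            byte = (byte << 2) | CODE[base]
        byte <<= 2 * (4 - len(group))   # left-align a final partial group
        out.append(byte)
    return bytes(out), len(seq)

def unpack(packed, length):
    """Recover the original sequence; lossless, as required above."""
    seq = []
    for byte in packed:
        for shift in (6, 4, 2, 0):
            seq.append(BASES[(byte >> shift) & 0b11])
    return "".join(seq[:length])
```

    A statistical model with arithmetic coding, as in SeqCompress, goes further by exploiting that the four bases are not equally likely in real genomes.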

  1. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

    With each new rover mission to Mars, rovers are traveling significantly longer distances. This distance increase raises not only the opportunities for science data collection, but also amplifies the amount of environment and rover state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling and execution techniques can be used onboard a rover to autonomously generate and execute rover activities and in particular to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.

  3. A pattern-mixture model approach for handling missing continuous outcome data in longitudinal cluster randomized trials.

    PubMed

    Fiero, Mallorie H; Hsu, Chiu-Hsieh; Bell, Melanie L

    2017-11-20

    We extend the pattern-mixture approach to handle missing continuous outcome data in longitudinal cluster randomized trials, which randomize groups of individuals to treatment arms rather than the individuals themselves. Individuals who drop out at the same time point are grouped into the same dropout pattern. We approach extrapolation of the pattern-mixture model by applying multilevel multiple imputation, which imputes missing values while appropriately accounting for the hierarchical data structure found in cluster randomized trials. To assess parameters of interest under various missing data assumptions, imputed values are multiplied by a sensitivity parameter, k, which increases or decreases them. Using simulated data, we show that estimates of parameters of interest can vary widely under differing missing data assumptions. We conduct a sensitivity analysis using real data from a cluster randomized trial by increasing k until the treatment effect inference changes. By performing a sensitivity analysis for missing data, researchers can assess whether certain missing data assumptions are reasonable for their cluster randomized trial. Copyright © 2017 John Wiley & Sons, Ltd.
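
    The k-scan can be sketched in miniature. The sketch below handles one arm, one imputed data set and no clustering, whereas the actual method applies the multiplier within multilevel multiple imputation; all names are illustrative.

```python
import statistics

def sensitivity_scan(observed, imputed, k_values):
    """For each sensitivity multiplier k, scale the imputed outcomes and
    recompute the completed-data mean -- the core of the k-scan described
    in the abstract (here for one arm, one imputation, no clustering)."""
    results = {}
    for k in k_values:
        completed = observed + [k * v for v in imputed]
        results[k] = statistics.mean(completed)
    return results
```

    In the full analysis one would scan k upward (or downward) until the treatment effect inference changes, and report that tipping point.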

  4. Big Data breaking barriers - first steps on a long trail

    NASA Astrophysics Data System (ADS)

    Schade, S.

    2015-04-01

    Most data sets and streams have a geospatial component. Some people even claim that about 80% of all data is related to location. In the era of Big Data this number might even be underestimated, as data sets interrelate and initially non-spatial data becomes indirectly geo-referenced. The optimal treatment of Big Data thus requires advanced methods and technologies for handling the geospatial aspects in data storage, processing, pattern recognition, prediction, visualisation and exploration. On the one hand, our work draws on the earth and environmental sciences for existing interoperability standards, and on the foundational data structures, algorithms and software that are required to meet these geospatial information handling tasks. On the other hand, we are concerned with the arising needs to combine human analysis capacities (intelligence augmentation) with machine power (artificial intelligence). This paper provides an overview of the emerging landscape and outlines our (Digital Earth) vision for addressing the upcoming issues. We particularly call for the projection and re-use of existing environmental, earth observation and remote sensing expertise in other sectors, i.e. for breaking down the barriers of all of these silos by investigating integrated applications.

  5. Bayesian correction for covariate measurement error: A frequentist evaluation and comparison with regression calibration.

    PubMed

    Bartlett, Jonathan W; Keogh, Ruth H

    2018-06-01

    Bayesian approaches for handling covariate measurement error are well established and yet arguably are still relatively little used by researchers. For some this is likely due to unfamiliarity or disagreement with the Bayesian inferential paradigm. For others a contributory factor is the inability of standard statistical packages to perform such Bayesian analyses. In this paper, we first give an overview of the Bayesian approach to handling covariate measurement error, and contrast it with regression calibration, arguably the most commonly adopted approach. We then argue why the Bayesian approach has a number of statistical advantages compared to regression calibration and demonstrate that implementing the Bayesian approach is usually quite feasible for the analyst. Next, we describe the closely related maximum likelihood and multiple imputation approaches and explain why we believe the Bayesian approach to generally be preferable. We then empirically compare the frequentist properties of regression calibration and the Bayesian approach through simulation studies. The flexibility of the Bayesian approach to handle both measurement error and missing data is then illustrated through an analysis of data from the Third National Health and Nutrition Examination Survey.

  6. [The risk of manual handling loads in the hotel sector].

    PubMed

    Muraca, G; Martino, L Barbaro; Abbate, A; De Pasquale, D; Barbuzza, O; Brecciaroli, R

    2007-01-01

    The aim of our study was to evaluate the manual handling risk and the incidence of musculoskeletal disorders in the hotel sector. The study was conducted on 264 hotel workers. The sample was divided by job role into the following groups: porters (both floor and kitchen); floor waiters; and services staff (gardeners and maintenance workers). Tasks were evaluated according to the NIOSH method. The presence of musculoskeletal disorders was verified on the basis of reported symptoms, objective clinical findings and examination reports. The data were compared with a control group. Application of the NIOSH method showed an elevated lifting index (> 3) for every job profile; for porters the index was 5. The clinical data showed a high incidence of spinal disorders, especially of the lumbar spine, with the highest prevalence among male porters. In conclusion, we believe that manual load handling represents a particularly significant risk for workers in the hotel sector.
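
    The NIOSH method referred to above scores a lift via the revised lifting equation, RWL = LC x HM x VM x DM x AM x FM x CM, and the lifting index LI = load / RWL. A sketch in metric units follows; the frequency (FM) and coupling (CM) multipliers come from the NIOSH tables and are simply passed in here.

```python
def recommended_weight_limit(H, V, D, A, FM, CM):
    """Revised NIOSH lifting equation (metric form):
    RWL = LC * HM * VM * DM * AM * FM * CM, with load constant LC = 23 kg.
    H (horizontal), V (vertical), D (travel distance) in cm; A (asymmetry)
    in degrees; FM and CM are table lookups supplied by the caller."""
    LC = 23.0
    HM = 25.0 / H if H > 25 else 1.0        # horizontal multiplier
    VM = 1 - 0.003 * abs(V - 75)            # vertical multiplier
    DM = 0.82 + 4.5 / D if D > 25 else 1.0  # distance multiplier
    AM = 1 - 0.0032 * A                     # asymmetric multiplier
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl):
    """LI > 1 indicates elevated risk; the porters' tasks above scored 5."""
    return load_kg / rwl
```

    With H = 25 cm, V = 75 cm, D = 25 cm, A = 0 and ideal frequency/coupling, every multiplier is 1 and the RWL equals the 23 kg load constant; an index above 3, as reported above, means the handled load is more than three times the recommended limit.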

  7. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel II. Distribution functions and moments.

    PubMed

    Langenbucher, Frieder

    2003-01-01

    MS Excel is a useful tool to handle in vitro/in vivo correlation (IVIVC) distribution functions, with emphasis on the Weibull and the biexponential distribution, which are most useful for the presentation of cumulative profiles, e.g. release in vitro or urinary excretion in vivo, and differential profiles such as the plasma response in vivo. The discussion includes moments (AUC and mean) as summarizing statistics, and data-fitting algorithms for parameter estimation.
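
    As an illustration of the two profile types mentioned, the commonly used Weibull form for a cumulative release profile and its first moment (mean dissolution time) can be computed directly. The parameter values in the sketch are arbitrary examples, not taken from the paper.

```python
import math

def weibull_release(t, f_inf=100.0, tau=2.0, beta=1.5):
    """Cumulative release (%) at time t under the Weibull distribution
    F(t) = F_inf * (1 - exp(-(t/tau)**beta)), the form commonly fitted to
    cumulative profiles such as release in vitro or urinary excretion."""
    return f_inf * (1.0 - math.exp(-((t / tau) ** beta)))

def mean_dissolution_time(tau, beta):
    """First moment of the Weibull distribution, MDT = tau * Gamma(1 + 1/beta):
    one of the summarizing statistics ('moments') discussed above."""
    return tau * math.gamma(1.0 + 1.0 / beta)
```

    The same formulas can of course be entered as Excel worksheet functions, which is the approach the paper takes; beta = 1 recovers the single-exponential case, for which the MDT is simply tau.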

  8. 1984 CRC (Coordinating Research Council, Inc.) Octane Number Requirement Rating Workshop.

    DTIC Science & Technology

    1985-06-01

    Richard J. Tither, Mobil Oil Corporation; Sam D. Vallas, Amoco Oil Company; Douglas A. Voss, Chevron Research Company; Andy Vukovic, Shell Canada; Dave G... Instrumentation; Preparation of Test Fuels: Procurement of Fuels and Cans, and Coordination of On-Site Handling; Data Handling and Analysis... Doug McCorkell, Union Oil Company of California; James D. Merritt, Amoco Oil Company; Michael J. Mlotkowski, Mobil Oil Corporation; John Pandosh, Sun Tech

  9. Laboratory identification of arthropod ectoparasites.

    PubMed

    Mathison, Blaine A; Pritt, Bobbi S

    2014-01-01

    The collection, handling, identification, and reporting of ectoparasitic arthropods in clinical and reference diagnostic laboratories are discussed in this review. Included are data on ticks, mites, lice, fleas, myiasis-causing flies, and bed bugs. The public health importance of these organisms is briefly discussed. The focus is on the morphological identification and proper handling and reporting of cases involving arthropod ectoparasites, particularly those encountered in the United States. Other arthropods and other organisms not of public health concern, but routinely submitted to laboratories for identification, are also briefly discussed.

  10. Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane

    NASA Technical Reports Server (NTRS)

    Gera, Joseph; Bosworth, John T.

    1987-01-01

    This paper describes some novel flight tests and analysis techniques in the flight dynamics and handling qualities area. These techniques were utilized during the initial flight envelope clearance of the X-29A aircraft and were largely responsible for the completion of the flight controls clearance program without any incidents or significant delays. The resulting open-loop and closed-loop frequency responses and the time history comparison using flight and linear simulation data are discussed.

  11. A helicopter handling-qualities study of the effects of engine response characteristics, height-control dynamics, and excess power on nap-of-the-Earth operations

    NASA Technical Reports Server (NTRS)

    Corliss, L. D.

    1982-01-01

    The helicopter configuration with an rpm-governed gas-turbine engine was examined. A wide range of engine response time, vehicle damping and sensitivity, and excess power levels was studied. The data are compared with the existing handling-qualities specifications, MIL-F-83300 and AGARD 577, and in general show a need for higher minimums when performing such NOE maneuvers as a dolphin and bob-up task.

  12. An Array Library for Microsoft SQL Server with Astrophysical Applications

    NASA Astrophysics Data System (ADS)

    Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.

    2012-09-01

    Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out of the box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. Also, the library is designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on-the-fly, from SQL code, inside the database server process.
We are currently testing the prototype with two different scientific data sets: The Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
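
    The Array Library itself extends SQL Server with native code; as a language-neutral sketch of the underlying idea, fixed-size numeric arrays stored as binary column values, with element-level subsetting done on the server side, consider the following. sqlite3 stands in for SQL Server only to keep the example runnable, and all names are invented.

```python
import sqlite3
import struct

def to_blob(values):
    """Serialize a fixed-size float64 array into a BLOB column value."""
    return struct.pack(f"<{len(values)}d", *values)

def element(blob, i):
    """Read one element straight out of the stored bytes -- the 'subsetting
    inside the server' idea: the client never materializes the whole array."""
    return struct.unpack_from("<d", blob, 8 * i)[0]

# Store and retrieve through an actual database, as a stand-in for the
# SQL Server array UDT described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sim (id INTEGER PRIMARY KEY, density BLOB)")
conn.execute("INSERT INTO sim VALUES (1, ?)", (to_blob([0.5, 1.5, 2.5]),))
blob = conn.execute("SELECT density FROM sim WHERE id = 1").fetchone()[0]
```

    A real in-server implementation would additionally expose such accessors to SQL itself and hand the raw buffers to BLAS/LAPACK/FFTW routines without any client round-trip.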

  13. Oxidative metabolism and Ca2+ handling in isolated brain mitochondria and striatal neurons from R6/2 mice, a model of Huntington's disease.

    PubMed

    Hamilton, James; Pellman, Jessica J; Brustovetsky, Tatiana; Harris, Robert A; Brustovetsky, Nickolay

    2016-07-01

    Alterations in oxidative metabolism and defects in mitochondrial Ca2+ handling have been implicated in the pathology of Huntington's disease (HD), but existing data are contradictory. We investigated the effect of human mHtt fragments on oxidative metabolism and Ca2+ handling in isolated brain mitochondria and cultured striatal neurons from the R6/2 mouse model of HD. Non-synaptic and synaptic mitochondria isolated from the brains of R6/2 mice had similar respiratory rates and Ca2+ uptake capacity compared with mitochondria from wild-type (WT) mice. Respiratory activity of cultured striatal neurons measured with a Seahorse XF24 flux analyzer revealed unaltered cellular respiration in neurons derived from R6/2 mice compared with neurons from WT animals. Consistent with the lack of respiratory dysfunction, ATP content of cultured striatal neurons from R6/2 and WT mice was similar. Mitochondrial Ca2+ accumulation was also evaluated in cultured striatal neurons from R6/2 and WT animals. Our data obtained with striatal neurons derived from R6/2 and WT mice show that both glutamate-induced increases in cytosolic Ca2+ and subsequent carbonyl cyanide p-trifluoromethoxyphenylhydrazone-induced increases in cytosolic Ca2+ were similar between WT and R6/2, suggesting that mitochondria in neurons derived from both types of animals accumulated comparable amounts of Ca2+. Overall, our data argue against respiratory deficiency and impaired Ca2+ handling induced by human mHtt fragments in both isolated brain mitochondria and cultured striatal neurons from transgenic R6/2 mice. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Understanding Skill in EVA Mass Handling. Volume 4; An Integrated Methodology for Evaluating Space Suit Mobility and Stability

    NASA Technical Reports Server (NTRS)

    McDonald, P. Vernon; Newman, Dava

    1999-01-01

    The empirical investigation of extravehicular activity (EVA) mass handling conducted on NASA's Precision Air-Bearing Floor led to a Phase I SBIR from JSC. The purpose of the SBIR was to design an innovative system for evaluating space suit mobility and stability in conditions that simulate EVA on the surface of the Moon or Mars. The approach we used to satisfy the Phase I objectives was based on a structured methodology for the development of human-systems technology. Accordingly the project was broken down into a number of tasks and subtasks. In sequence, the major tasks were: 1) Identify missions and tasks that will involve EVA and resulting mobility requirements in the near and long term; 2) Assess possible methods for evaluating mobility of space suits during field-based EVA tests; 3) Identify requirements for behavioral evaluation by interacting with NASA stakeholders; 4) Identify necessary and sufficient technology for implementation of a mobility evaluation system; and 5) Prioritize and select technology solutions. The work conducted in these tasks is described in this final volume of the series on EVA mass handling. While prior volumes in the series focus on novel data-analytic techniques, this volume addresses technology that is necessary for minimally intrusive data collection and near-real-time data analysis and display.

  15. Reducing variation in decomposition odour profiling using comprehensive two-dimensional gas chromatography.

    PubMed

    Perrault, Katelynn A; Stefanuto, Pierre-Hugues; Stuart, Barbara H; Rai, Tapan; Focant, Jean-François; Forbes, Shari L

    2015-01-01

    Challenges in decomposition odour profiling have led to variation in the documented odour profile by different research groups worldwide. Background subtraction and use of controls are important considerations given the variation introduced by decomposition studies conducted in different geographical environments. The collection of volatile organic compounds (VOCs) from soil beneath decomposing remains is challenging due to the high levels of inherent soil VOCs, further confounded by the use of highly sensitive instrumentation. This study presents a method that provides suitable chromatographic resolution for profiling decomposition odour in soil by comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry using appropriate controls and field blanks. Logarithmic transformation and t-testing of compounds permitted the generation of a compound list of decomposition VOCs in soil. Principal component analysis demonstrated the improved discrimination between experimental and control soil, verifying the value of the data handling method. Data handling procedures have not been well documented in this field and standardisation would thereby reduce misidentification of VOCs present in the surrounding environment as decomposition byproducts. Uniformity of data handling and instrumental procedures will reduce analytical variation, increasing confidence in the future when investigating the effect of taphonomic variables on the decomposition VOC profile. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
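
    The data handling steps named above, log transformation followed by per-compound t-tests against control soil, can be sketched as below. The threshold and data layout are illustrative assumptions, not the paper's exact criteria.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

def decomposition_candidates(peak_areas, t_threshold=3.0):
    """peak_areas: {compound: (experimental_areas, control_areas)}.
    Log-transform the raw areas, then keep compounds clearly elevated over
    control soil. The threshold is a placeholder, not a significance test
    with proper degrees of freedom."""
    keep = []
    for compound, (exp_areas, ctl_areas) in peak_areas.items():
        log_exp = [math.log10(x) for x in exp_areas]
        log_ctl = [math.log10(x) for x in ctl_areas]
        if welch_t(log_exp, log_ctl) > t_threshold:
            keep.append(compound)
    return keep
```

    Filtering against field blanks and control soil in this way is what keeps environmental background VOCs from being misreported as decomposition byproducts.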

  16. Manual for adjustment and handling of platforms for data collection via satellite (CDCP)

    NASA Technical Reports Server (NTRS)

    Handal, O. B.

    1977-01-01

    The LANDSAT and GOES data collection systems are described as well as the components of their convertible data collection platforms (CDCP). Methods are given for coding platform data, adjusting input, and operating both the LANDSAT and the GOES system. A glossary of terms is included with a bibliography.

  17. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  18. Survey data and metadata modelling using document-oriented NoSQL

    NASA Astrophysics Data System (ADS)

    Rahmatuti Maghfiroh, Lutfi; Gusti Bagus Baskara Nugraha, I.

    2018-03-01

    Survey data collected from year to year undergo metadata changes, yet they need to be stored in an integrated way so that statistical data can be retrieved faster and more easily. A data warehouse (DW) can be used for this purpose; however, the change of variables in every period cannot be accommodated by a traditional DW via Slowly Changing Dimensions (SCD). Previous research handled variable changes in a DW by managing metadata with a multiversion DW (MVDW), designed using the relational model. Other research has found that a non-relational model in a NoSQL database offers faster read times than the relational model. We therefore propose managing metadata changes using NoSQL. This study proposes a DW model to manage change and algorithms to retrieve data whose metadata has changed. Evaluation of the proposed models and algorithms shows that a database with the proposed design can retrieve data with metadata changes properly. This paper contributes to comprehensive analysis of data with metadata changes (especially survey data) in integrated storage.
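
    The document-oriented idea is that each record carries its own metadata version, so survey waves with different variable sets can coexist in one collection. The sketch below uses plain Python dicts as stand-in documents; all field names are invented for illustration.

```python
# Each document is self-describing: the metadata version travels with the
# data, so a schema change in a later wave does not break earlier records.
collection = [
    {"wave": 2014, "meta_version": 1,
     "variables": {"income": 52_000, "household_size": 3}},
    {"wave": 2016, "meta_version": 2,   # 'income' split into two variables
     "variables": {"income_wages": 40_000, "income_other": 9_000,
                   "household_size": 2}},
]

def total_income(doc):
    """Resolve one derived statistic across metadata versions -- the kind of
    version-aware retrieval the proposed algorithms perform."""
    v = doc["variables"]
    if doc["meta_version"] == 1:
        return v["income"]
    return v["income_wages"] + v["income_other"]

incomes = [total_income(d) for d in collection]
```

    A relational MVDW expresses the same mapping with versioned dimension tables; the document model trades that rigid schema for per-record flexibility and, per the cited findings, faster reads.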

  19. Managing security and privacy concerns over data storage in healthcare research.

    PubMed

    Mackenzie, Isla S; Mantay, Brian J; McDonnell, Patrick G; Wei, Li; MacDonald, Thomas M

    2011-08-01

    Issues surrounding data security and privacy are of great importance when handling sensitive health-related data for research. The emphasis in the past has been on balancing the risks to individuals with the benefit to society of the use of databases for research. However, a new way of looking at such issues is that by optimising procedures and policies regarding security and privacy of data to the extent that there is no appreciable risk to the privacy of individuals, we can create a 'win-win' situation in which everyone benefits, and pharmacoepidemiological research can flourish with public support. We discuss holistic measures, involving both information technology and people, taken to improve the security and privacy of data storage. After an internal review, we commissioned an external audit by an independent consultant with a view to optimising our data storage and handling procedures. Improvements to our policies and procedures were implemented as a result of the audit. By optimising our storage of data, we hope to inspire public confidence and hence cooperation in the use of health care data in research. Copyright © 2011 John Wiley & Sons, Ltd.

  20. A systematic review of randomised controlled trials in rheumatoid arthritis: the reporting and handling of missing data in composite outcomes.

    PubMed

    Ibrahim, Fowzia; Tom, Brian D M; Scott, David L; Prevost, Andrew Toby

    2016-06-02

    Most reported outcome measures in rheumatoid arthritis (RA) trials are composite: their components are single measures that are combined into one outcome. The aims of this review were to assess the range of missing data rates in primary composite outcomes and to document current practice for handling and reporting missing data in published RA trials against the Consolidated Standards of Reporting Trials (CONSORT) recommendations. A systematic search for randomised controlled trials was conducted for RA trials published between 2008 and 2013 in four rheumatology and four high-impact general medical journals. A total of 51 trials with a composite primary outcome were identified, of which 38 (75 %) used the binary American College of Rheumatology responder index and 13 (25 %) used the Disease Activity Score for 28 joints (DAS28). Forty-four trials (86 %) reported on an intention-to-treat analysis population, while 7 trials (14 %) analysed a modified intention-to-treat population. Missing data rates for the primary composite outcome ranged from 2 % to 53 % and were above 30 % in 9 trials, 20-30 % in 11 trials, 10-20 % in 18 trials and below 10 % in 13 trials. Thirty-eight trials (75 %) used non-responder imputation and 10 (20 %) used last observation carried forward to impute missing composite outcome data at the primary time point. The rate of dropout was on average 61 % higher in the placebo group than in the treatment group in the 34 placebo-controlled trials (relative rate 1.61, 95 % CI: 1.29, 2.02). Thirty-seven trials (73 %) did not report the use of sensitivity analyses to assess the handling of missing data in the primary analysis, as recommended by CONSORT guidelines. This review highlights an improvement in rheumatology trial practice since the revision of the CONSORT guidelines, in terms of power calculations and participant flow diagrams. However, there is a need to improve the handling and reporting of missing composite outcome data and their components in RA trials. In particular, sensitivity analyses need to be used more widely in RA trials, because single imputation methods are widespread and missing data rates are commonly differentially higher in the placebo group.

  1. Study of data entry requirements at Marshall Space Flight Computation Center

    NASA Technical Reports Server (NTRS)

    Sherman, G. R.

    1975-01-01

    An economic and systems analysis of a data center was conducted. Current facilities for data storage of documentation are shown to be inadequate and outmoded for efficient data handling. Redesign of documents, condensation of the keypunching operation, upgrading of hardware, and retraining of personnel are the solutions proposed to improve the present data system.

  2. Orion Entry Handling Qualities Assessments

    NASA Technical Reports Server (NTRS)

    Bihari, B.; Tiggers, M.; Strahan, A.; Gonzalez, R.; Sullivan, K.; Stephens, J. P.; Hart, J.; Law, H., III; Bilimoria, K.; Bailey, R.

    2011-01-01

    The Orion Command Module (CM) is a capsule designed to bring crew back from the International Space Station (ISS), the moon and beyond. The atmospheric entry portion of the flight is designed to be flown in autopilot mode for nominal situations. However, there exists the possibility for the crew to take over manual control in off-nominal situations. In these instances, the spacecraft must meet specific handling qualities criteria. To address these criteria two separate assessments of the Orion CM's entry Handling Qualities (HQ) were conducted at NASA's Johnson Space Center (JSC) using the Cooper-Harper scale (Cooper & Harper, 1969). These assessments were conducted in the summers of 2008 and 2010 using the Advanced NASA Technology Architecture for Exploration Studies (ANTARES) six degree of freedom, high fidelity Guidance, Navigation, and Control (GN&C) simulation. This paper will address the specifics of the handling qualities criteria, the vehicle configuration, the scenarios flown, the simulation background and setup, crew interfaces and displays, piloting techniques, ratings and crew comments, pre- and post-flight briefings, lessons learned and changes made to improve the overall system performance. The data collection tools, methods, data reduction and output reports will also be discussed. The objective of the 2008 entry HQ assessment was to evaluate the handling qualities of the CM during a lunar skip return. A lunar skip entry case was selected because it was considered the most demanding of all bank control scenarios. Even though skip entry is not planned to be flown manually, it was hypothesized that if a pilot could fly the harder skip entry case, then they could also fly a simpler loads managed or ballistic (constant bank rate command) entry scenario. 
In addition, with the evaluation set-up of multiple tasks within the entry case, handling qualities ratings collected in the evaluation could be used to assess other scenarios such as the constant bank angle maintenance case. The 2008 entry assessment was divided into two sections (see Figure 1). Entry I was the first, high speed portion of a lunar return and Entry II was the second, lower speed portion of a lunar return, which is similar (but not identical) to a typical ISS return.

  3. A Piloted Simulator Evaluation of Transport Aircraft Rudder Pedal Force/Feel Characteristics

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2008-01-01

    A piloted simulation study has been conducted in a fixed-base research simulator to assess the directional handling qualities for various rudder pedal feel characteristics for commercial transport airplanes. That is, the effects of static pedal force at maximum pedal travel, breakout force, and maximum pedal travel on handling qualities were studied. An artificial maneuver with a severe lateral wind shear and requiring runway tracking at an altitude of 50 feet in a crosswind was used to fully exercise the rudder pedals. Twelve active airline pilots voluntarily participated in the study and flew approximately 500 maneuvers. The pilots rated the maneuver performance with various rudder pedal feel characteristics using the Cooper-Harper rating scale. The test matrix had 15 unique combinations of the 3 static pedal feel characteristics. A 10-term, second-order equation for the Cooper-Harper pilot rating as a function of the 3 independent pedal feel parameters was fit to the data. The test matrix utilized a Central Composite Design that is very efficient for fitting an equation of this form. The equation was used to produce contour plots of constant pilot ratings as a function of two of the parameters with the third parameter held constant. These contour plots showed regions of good handling qualities as well as regions of degraded handling qualities. In addition, a numerical equation solver was used to predict the optimum parameter values (those with the lowest pilot rating). Quantitative pilot performance data were also analyzed. This analysis found that the peak values of the cross power spectra of the pedal force and heading angle could be used to quantify the tendency toward directional pilot induced oscillations (PIO). Larger peak values of the cross power spectra were correlated with larger (degraded) Cooper-Harper pilot ratings. 
Thus, the subjective data (Cooper-Harper pilot ratings) were consistent with the objective data (peak values of the cross power spectra).

  4. Understanding workers' exposure: Systematic review and data-analysis of emission potential for NOAA.

    PubMed

    Kuijpers, E; Bekker, C; Brouwer, D; le Feber, M; Fransman, W

    2017-05-01

    Exposure assessment for nano-objects, and their aggregates and agglomerates (NOAA), has evolved from explorative research toward more comprehensive exposure assessment, providing data to further develop currently used conservative control banding (CB) tools for risk assessment. This study aims to provide an overview of current knowledge on the emission potential of NOAA across the occupational life cycle stages by a systematic review, and subsequently to use the results in a data analysis. Relevant parameters that influence emission were collected from peer-reviewed literature with a focus on the four source domains (SD) in the source-receptor conceptual framework for NOAA. To make the reviewed exposure data comparable, we applied an approach to normalize for workplace circumstances and measurement location, resulting in comparable "surrogate" emission levels. Finally, descriptive statistics were performed. During the synthesis of nanoparticles (SD1), mechanical reduction and gas phase synthesis resulted in higher emissions than wet chemistry and chemical vapor condensation. For the handling and transfer of bulk manufactured nanomaterial powders (SD2) the emission could be differentiated for five activity classes: (1) harvesting; (2) dumping; (3) mixing; (4) cleaning of a reactor; and (5) transferring. Additionally, SD2 was subdivided by the handled amount, with cleaning further subdivided by energy level. Harvesting and dumping resulted in the highest emissions. Regarding processes with liquids (SD3b), it was possible to distinguish emissions for spraying (propellant gas, (high) pressure and pump), sonication and brushing/rolling. The highest emissions observed in SD3b were for propellant gas spraying and pressure spraying. For the handling of nano-articles (SD4), the highest emissions of nano-sized particles (including NOAA) were found for grinding. 
This study provides a valuable overview of emission assessments performed in the workplace during the occupational handling of NOAA. Analyses were made per source domain to derive emission levels which can be used for models to quantitatively predict the exposure.

  5. Copenhagen Airport Cohort: air pollution, manual baggage handling and health.

    PubMed

    Møller, Karina Lauenborg; Brauer, Charlotte; Mikkelsen, Sigurd; Loft, Steffen; Simonsen, Erik B; Koblauch, Henrik; Bern, Stine Hvid; Alkjær, Tine; Hertel, Ole; Becker, Thomas; Larsen, Karin Helweg; Bonde, Jens Peter; Thygesen, Lau Caspar

    2017-05-06

    Copenhagen Airport Cohort 1990-2012 presents a unique data source for studies of health effects of occupational exposure to air pollution (ultrafine particles) and manual baggage handling among airport employees. We describe the extent of information in the cohort and in the follow-up based on data linkage to the comprehensive Danish nationwide health registers. In the cohort, all information is linked to the personal identification number that is also used in Statistics Denmark's demographic and socioeconomic databases and in the nationwide health registers. The cohort covers 69 175 men in unskilled positions. The exposed cohort includes men in unskilled jobs employed at Copenhagen Airport in the period 1990-2012 either as baggage handlers or in other outdoor work. The reference cohort includes men in unskilled jobs working in the greater Copenhagen area. The cohort includes environmental Global Positioning System (GPS) measurements in Copenhagen Airport, information on job function/task for each calendar year of employment between 1990 and 2012, exposure to air pollution at residence, average weight of baggage lifted per day and lifestyle. By linkage to registers, we retrieved socioeconomic and demographic data and data on healthcare contacts, drug prescriptions, incident cancer and mortality. The size of the cohort and the completeness of the register-based follow-up allow a more accurate assessment of the possible health risks of occupational exposure to ultrafine particles and manual baggage handling at airports than in previous studies. We plan to follow the cohort for the incidence of ischaemic heart diseases, cerebrovascular disease, lung and bladder cancer, asthma and chronic obstructive pulmonary disease, and further for associations between heavy manual baggage handling and musculoskeletal disorders. number 2012-41-0199. Published by the BMJ Publishing Group Limited. 

  6. Differences among nursing homes in outcomes of a safe resident handling program

    PubMed Central

    Kurowski, Alicia; Gore, Rebecca; Buchholz, Bryan; Punnett, Laura

    2018-01-01

    A large nursing home corporation implemented a safe resident handling program (SRHP) in 2004–2007. We evaluated its efficacy over a 2-year period by examining differences among 5 centers in program outcomes and potential predictors of those differences. We observed nursing assistants (NAs), recording activities and body postures at 60-second intervals on personal digital assistants at baseline and at 3-month, 12-month, and 24-month follow-ups. The two outcomes computed were change in equipment use during resident handling and change in a physical workload index that estimated spinal loading due to body postures and handled loads. Potential explanatory factors were extracted from post-observation interviews, investigator surveys of the workforce, administrative data, and employee satisfaction surveys. The facility with the most positive outcome measures was associated with many positive changes in explanatory factors, and the facility with the fewest positive outcome measures experienced negative changes in the same factors. These findings suggest greater SRHP benefits where there was lower NA turnover and agency staffing; less time pressure; and better teamwork, staff communication, and supervisory support. PMID:22833329

  7. A large Great Britain-wide outbreak of STEC O157 phage type 8 linked to handling of raw leeks and potatoes.

    PubMed

    Launders, N; Locking, M E; Hanson, M; Willshaw, G; Charlett, A; Salmon, R; Cowden, J; Harker, K S; Adak, G K

    2016-01-01

    Between December 2010 and July 2011, 252 cases of STEC O157 PT8 stx1 + 2 infection were reported in England, Scotland and Wales. This was the largest outbreak of STEC reported in England and the second largest in the UK to date. Eighty cases were hospitalized, with two cases of haemolytic uraemic syndrome and one death reported. Routine investigative data were used to generate a hypothesis but the subsequent case-control study was inconclusive. A second, more detailed, hypothesis generation exercise identified consumption or handling of vegetables as a potential mode of transmission. A second case-control study demonstrated that cases were more likely than controls to live in households whose members handled or prepared leeks bought unwrapped [odds ratio (OR) 40, 95% confidence interval (CI) 2·08-769·4], and potatoes bought in sacks (OR 13·13, 95% CI 1·19-145·3). This appears to be the first outbreak of STEC O157 infection linked to the handling of leeks.

  8. A Data Envelopment Analysis Model for Selecting Material Handling System Designs

    NASA Astrophysics Data System (ADS)

    Liu, Fuh-Hwa Franklin; Kuo, Wan-Ting

    The material handling system under design is an unmanned job shop with an automated guided vehicle that transports loads between the processing machines. The engineering task is to select among design alternatives that are combinations of four design factors: the ratio of production time to transportation time, the mean job arrival rate to the system, the input/output buffer capacities at each processing machine, and the vehicle control strategies. Each design alternative is simulated to collect the upper and lower bounds of five performance indices. We develop a Data Envelopment Analysis (DEA) model to assess the 180 designs with imprecise data on the five indices. A three-way factorial experimental analysis of the assessment results indicates that buffer capacity, and the interaction of job arrival rate and buffer capacity, significantly affect performance.

  9. A Data Handling System for Modern and Future Fermilab Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Illingworth, R. A.

    2014-01-01

    Current and future Fermilab experiments such as Minerva, NOνA, and MicroBoone are now using an improved version of the Fermilab SAM data handling system. SAM was originally used by the CDF and D0 experiments for Run II of the Fermilab Tevatron to provide file metadata and location cataloguing, uploading of new files to tape storage, dataset management, file transfers between global processing sites, and processing history tracking. However, SAM was heavily tailored to the Run II environment and required complex, hard-to-deploy client software, which made it difficult to adapt to new experiments. The Fermilab Computing Sector has progressively updated SAM to use modern, standardized technologies in order to more easily deploy it for current and upcoming Fermilab experiments, and to support the data preservation efforts of the Run II experiments.

  10. Handling ethical, legal and social issues in birth cohort studies involving genetic research: responses from studies in six countries

    PubMed Central

    2010-01-01

    Background Research involving minors has been the subject of much ethical debate. The growing number of longitudinal, pediatric studies that involve genetic research present even more complex challenges to ensure appropriate protection of children and families as research participants. Long-term studies with a genetic component involve collection, retention and use of biological samples and personal information over many years. Cohort studies may be established to study specific conditions (e.g. autism, asthma) or may have a broad aim to research a range of factors that influence the health and development of children. Studies are increasingly intended to serve as research platforms by providing access to data and biological samples to researchers over many years. This study examines how six birth cohort studies in North America and Europe that involve genetic research handle key ethical, legal and social (ELS) issues: recruitment, especially parental authority to include a child in research; initial parental consent and subsequent assent and/or consent from the maturing child; withdrawal; confidentiality and sample/data protection; handling sensitive information; and disclosure of results. Methods Semi-structured telephone interviews were carried out in 2008/09 with investigators involved in six birth cohort studies in Canada, Denmark, England, France, the Netherlands and the United States. Interviewees self-identified as being knowledgeable about ELS aspects of the study. Interviews were conducted in English. Results The studies vary in breadth of initial consent, but none adopt a blanket consent for future use of samples/data. Ethics review of new studies is a common requirement. Studies that follow children past early childhood recognise a need to seek assent/consent as the child matures. All studies limit access to identifiable data and advise participants of the right to withdraw. 
The clearest differences among studies concern handling of sensitive information and return of results. In all studies, signs of child abuse require reports to authorities, but this disclosure duty is not always stated in consent materials. Studies vary in whether they will return to participants results of routine tests/measures, but none inform participants about findings with unknown clinical significance. Conclusions Analysis of how cohort studies in various jurisdictions handle key ELS issues provides informative data for comparison and contrast. Consideration of these and other examples and further scholarly exploration of ELS issues provides insight on how best to address these aspects in ways that respect the well-being of participants, especially children who become research subjects at the start of their lives. PMID:20331891

  11. Handling imbalance data in churn prediction using combined SMOTE and RUS with bagging method

    NASA Astrophysics Data System (ADS)

    Pura Hartati, Eka; Adiwijaya; Arif Bijaksana, Moch

    2018-03-01

    Customer churn has become a significant problem and a challenge for telecommunication companies such as PT. Telkom Indonesia. The extent of customer churn must be evaluated so that management can devise appropriate strategies to minimize churn and retain customers. Customer data categorized as churn Atas Permintaan Sendiri (APS, voluntary churn) in this company are imbalanced, and class imbalance is one of the challenging tasks in machine learning. This study investigates handling class imbalance in churn prediction by combining the Synthetic Minority Over-sampling Technique (SMOTE) and Random Under-Sampling (RUS) with the Bagging method for better churn prediction performance. The dataset used is Broadband Internet data collected from Telkom Regional 6 Kalimantan. The research first preprocesses the data, balancing the imbalanced dataset and selecting features with the SMOTE and RUS sampling techniques, and then builds a churn prediction model using the Bagging method and C4.5.
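
    The abstract names the sampling techniques but not an implementation; the sketch below, assuming only NumPy, shows the core of each: SMOTE synthesises minority samples by interpolating toward minority-class nearest neighbours, and RUS discards a random subset of the majority class. In the paper's pipeline the rebalanced data would then feed a Bagging ensemble of C4.5-style decision trees.

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """SMOTE: create n_new synthetic minority samples, each interpolated
    between a random minority sample and one of its k nearest
    minority-class neighbours."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(X_min)
    dists = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)           # exclude each sample itself
    nn = np.argsort(dists, axis=1)[:, :k]     # k nearest neighbours per sample
    base = rng.integers(0, n, size=n_new)     # base sample for each new point
    neigh = nn[base, rng.integers(0, min(k, n - 1), size=n_new)]
    gap = rng.random((n_new, 1))              # interpolation factor in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

def random_undersample(X_maj, n_keep, rng=None):
    """RUS: keep a random subset of the majority class."""
    rng = rng if rng is not None else np.random.default_rng(0)
    idx = rng.choice(len(X_maj), size=n_keep, replace=False)
    return X_maj[idx]
```

    Because each synthetic point is a convex combination of two minority samples, it always lies within the minority class's bounding region, which is what makes SMOTE less prone to noise than naive duplication.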

  12. Novel transformation-based response prediction of shear building using interval neural network

    NASA Astrophysics Data System (ADS)

    Chakraverty, S.; Sahoo, Deepti Moyi

    2017-04-01

    The present paper uses the powerful technique of the interval neural network (INN) to simulate and estimate the structural response of multi-storey shear buildings subject to earthquake motion. The INN is first trained on real earthquake data, viz., the ground acceleration as input and the numerically generated responses of different floors of multi-storey buildings as output. To date, no model exists to handle positive and negative data in the INN. Here, therefore, bipolar data in [-1, 1] are first converted to unipolar form, i.e., to [0, 1], by means of a novel transformation, so that the above training patterns can be handled in normalized form. Once training is done, the unipolar data are converted back to bipolar form using the inverse transformation. The trained INN architecture is then used to simulate and test the structural response of different floors for earthquake data of various intensities, and it is found that the responses predicted by the INN model are good for practical purposes.
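
    The abstract specifies only the source and target ranges, not the transformation itself; the simplest map meeting that description is the affine one below. This is an assumption for illustration, not necessarily the paper's novel transformation.

```python
import numpy as np

def to_unipolar(x):
    """Map bipolar data in [-1, 1] to unipolar form in [0, 1]."""
    return (np.asarray(x, dtype=float) + 1.0) / 2.0

def to_bipolar(u):
    """Inverse transformation: map unipolar [0, 1] back to [-1, 1]."""
    return 2.0 * np.asarray(u, dtype=float) - 1.0
```

    Training patterns would be passed through `to_unipolar` before training, and the network outputs mapped back with `to_bipolar` afterwards.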

  13. DASS-GUI: a user interface for identification and analysis of significant patterns in non-sequential data.

    PubMed

    Hollunder, Jens; Friedel, Maik; Kuiper, Martin; Wilhelm, Thomas

    2010-04-01

    Many large 'omics' datasets have been published and many more are expected in the near future. New analysis methods are needed for best exploitation. We have developed a graphical user interface (GUI) for easy data analysis. Our discovery of all significant substructures (DASS) approach elucidates the underlying modularity, a typical feature of complex biological data. It is related to biclustering and other data mining approaches. Importantly, DASS-GUI also allows handling of multi-sets and calculation of statistical significances. DASS-GUI contains tools for further analysis of the identified patterns: analysis of the pattern hierarchy, enrichment analysis, module validation, analysis of additional numerical data, easy handling of synonymous names, clustering, filtering and merging. Different export options allow easy usage of additional tools such as Cytoscape. Source code, pre-compiled binaries for different systems, a comprehensive tutorial, case studies and many additional datasets are freely available at http://www.ifr.ac.uk/dass/gui/. DASS-GUI is implemented in Qt.

  14. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Techniques and software documentation for the digital enhancement of radiographs are presented. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both frequency domain and spatial domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing the image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
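
    As a generic illustration of why recursive (IIR) filtering can outrun FFT-based convolution, consider a first-order recursive low-pass: each output sample costs one multiply-add regardless of the effective kernel length, whereas nonrecursive smoothing costs grow with kernel size or carry forward/inverse FFT overhead. This sketch is not the report's matched filters, only the recursive principle it relies on.

```python
import numpy as np

def recursive_lowpass(row, alpha=0.5):
    """First-order recursive low-pass: y[n] = alpha*x[n] + (1-alpha)*y[n-1].
    Equivalent to convolving with an infinite exponential kernel, but
    computed in O(1) per sample."""
    x = np.asarray(row, dtype=float)
    y = np.empty_like(x)
    y[0] = x[0]
    for n in range(1, len(x)):
        y[n] = alpha * x[n] + (1.0 - alpha) * y[n - 1]
    return y
```

    Applied to each row (and then each column) of an image, such a filter smooths in the spatial domain with no transform step at all.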

  15. m-BIRCH: an online clustering approach for computer vision applications

    NASA Astrophysics Data System (ADS)

    Madan, Siddharth K.; Dana, Kristin J.

    2015-03-01

    We adapt a classic online clustering algorithm called Balanced Iterative Reducing and Clustering using Hierarchies (BIRCH) to incrementally cluster large datasets of features commonly used in multimedia and computer vision. We call the adapted version modified-BIRCH (m-BIRCH). The algorithm uses only a fraction of the dataset memory to perform clustering, and updates its clustering decisions as new data come in. The modifications made in m-BIRCH enable data-driven parameter selection and effectively handle varying-density regions in the feature space. Data-driven parameter selection automatically controls the level of coarseness of the data summarization. Effective handling of varying-density regions is necessary to represent the different density regions of the data well in the summarization. We use m-BIRCH to cluster 840K color SIFT descriptors and 60K outlier-corrupted grayscale patches, as well as datasets consisting of challenging non-convex clustering patterns. Our implementation of the algorithm provides a useful clustering tool and is made publicly available.
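
    BIRCH's incremental, low-memory behaviour rests on its clustering feature (CF): the triple (N, LS, SS) summarising a subcluster, updatable in constant time as points stream in, so the points themselves need not be kept. A minimal sketch of the standard CF (standard BIRCH, not the m-BIRCH modifications):

```python
import numpy as np

class ClusteringFeature:
    """BIRCH clustering feature: (N, LS, SS), i.e. point count, linear sum
    of points, and sum of squared norms; supports O(1) updates and
    centroid/radius queries without storing the points."""
    def __init__(self, dim):
        self.n = 0
        self.ls = np.zeros(dim)   # linear sum of the points
        self.ss = 0.0             # sum of squared norms of the points

    def add(self, x):
        x = np.asarray(x, dtype=float)
        self.n += 1
        self.ls += x
        self.ss += float(x @ x)

    def centroid(self):
        return self.ls / self.n

    def radius(self):
        # root-mean-square distance of the points from the centroid
        c = self.centroid()
        return np.sqrt(max(self.ss / self.n - float(c @ c), 0.0))
```

    New points are routed down the CF-tree to the closest leaf CF; if absorbing a point would push the leaf's radius past a threshold, a new CF is created instead, which is how the summary's coarseness is controlled.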

  16. Wildlife tracking data management: a new vision.

    PubMed

    Urbano, Ferdinando; Cagnacci, Francesca; Calenge, Clément; Dettki, Holger; Cameron, Alison; Neteler, Markus

    2010-07-27

    To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the following spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.

  17. Wildlife tracking data management: a new vision

    PubMed Central

    Urbano, Ferdinando; Cagnacci, Francesca; Calenge, Clément; Dettki, Holger; Cameron, Alison; Neteler, Markus

    2010-01-01

    To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the following spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling. PMID:20566495

  18. A Review of Methods for Missing Data.

    ERIC Educational Resources Information Center

    Pigott, Therese D.

    2001-01-01

    Reviews methods for handling missing data in a research study. Model-based methods, such as maximum likelihood using the EM algorithm and multiple imputation, hold more promise than ad hoc methods. Although model-based methods require more specialized computer programs and assumptions about the nature of missing data, these methods are appropriate…

  19. Alternative Fuels Data Center: Biodiesel Fueling Infrastructure Development

    Science.gov Websites

    Biodiesel Fueling Station Locations by State More Biodiesel Data | All Maps & Data Case Studies Recycled Fuels Help Ensure America's National Parks Stay Green for Another Century More Biodiesel Case Studies | All Case Studies Publications 2016 Vehicle Technologies Market Report Biodiesel Handling and Use Guide

  20. Detecting Satisficing in Online Surveys

    ERIC Educational Resources Information Center

    Salifu, Shani

    2012-01-01

    The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…

  1. Fill in the Blanks: A Tale of Data Gone Missing.

    PubMed

    Jupiter, Daniel C

    2016-01-01

    In studies, we often encounter patients for whom data is missing. More than a nuisance, such missing data can seriously impact our analyses. I discuss here some methods to handle these situations. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  2. Ocean bottom seismometer: design and test of a measurement system for marine seismology.

    PubMed

    Mànuel, Antoni; Roset, Xavier; Del Rio, Joaquin; Toma, Daniel Mihai; Carreras, Normandino; Panahi, Shahram Shariat; Garcia-Benadí, A; Owen, Tim; Cadena, Javier

    2012-01-01

    The Ocean Bottom Seismometer (OBS) is a key instrument for the geophysical study of sea sub-bottom layers. At present, more reliable autonomous instruments are needed that are capable of recording underwater for long periods of time and therefore of handling large volumes of stored data. This paper presents a new Ocean Bottom Seismometer designed for long-duration seismic surveys. Power consumption and the noise level of the acquisition system are the key points in optimizing autonomy and data quality. To achieve our goals, a new low-power data logger with high resolution and Signal-to-Noise Ratio (SNR), based on a Compact Flash memory card, is designed to enable continuous data acquisition. The equipment represents the achievement of joint work from different scientific and technological disciplines such as electronics, mechanics, acoustics, communications, information technology and marine geophysics. This sophisticated yet easy-to-handle equipment allows the recording of useful controlled-source and passive seismic data, as well as other time-varying data, with multiple applications in marine environment research. We have been working on a series of prototypes for ten years to improve many of the aspects that make the equipment easy to handle and useful for work in deep-water areas. Ocean Bottom Seismometers (OBS) have received growing attention from the geoscience community during the last forty years. OBS sensors recording the motion of the ocean floor hold key information for studying offshore seismicity and exploring the Earth's crust. In a seismic survey, a series of OBSs are placed on the seabed of the area under study, where they record either natural seismic activity or acoustic signals generated by compressed air-guns at the ocean surface. The resulting data sets are subsequently used to model both earthquake locations and the crustal structure.

  3. Multiple imputation for handling missing outcome data when estimating the relative risk.

    PubMed

    Sullivan, Thomas R; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-09-06

    Multiple imputation is a popular approach to handling missing data in medical research, yet little is known about its applicability for estimating the relative risk. Standard methods for imputing incomplete binary outcomes involve logistic regression or an assumption of multivariate normality, whereas relative risks are typically estimated using log binomial models. It is unclear whether misspecification of the imputation model in this setting could lead to biased parameter estimates. Using simulated data, we evaluated the performance of multiple imputation for handling missing data prior to estimating adjusted relative risks from a correctly specified multivariable log binomial model. We considered an arbitrary pattern of missing data in both outcome and exposure variables, with missing data induced under missing at random mechanisms. Focusing on standard model-based methods of multiple imputation, missing data were imputed using multivariate normal imputation or fully conditional specification with a logistic imputation model for the outcome. Multivariate normal imputation performed poorly in the simulation study, consistently producing estimates of the relative risk that were biased towards the null. Despite outperforming multivariate normal imputation, fully conditional specification also produced somewhat biased estimates, with greater bias observed for higher outcome prevalences and larger relative risks. Deleting imputed outcomes from analysis datasets did not improve the performance of fully conditional specification. Both multivariate normal imputation and fully conditional specification produced biased estimates of the relative risk, presumably since both use a misspecified imputation model. Based on simulation results, we recommend researchers use fully conditional specification rather than multivariate normal imputation and retain imputed outcomes in the analysis when estimating relative risks. 
However, fully conditional specification is not without its shortcomings, and further research is needed to identify optimal approaches for relative risk estimation within the multiple imputation framework.
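
    The pooling step that both imputation approaches above feed into is given by Rubin's rules: combine the per-imputation estimates and variances (on the log-relative-risk scale) into one estimate with a total variance that reflects between-imputation uncertainty. The sketch below uses made-up placeholder numbers, not values from the study:

```python
import math

def pool_rubin(estimates, variances):
    """Pool m per-imputation estimates of log(RR) via Rubin's rules."""
    m = len(estimates)
    qbar = sum(estimates) / m                               # pooled point estimate
    w = sum(variances) / m                                  # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    t = w + (1 + 1 / m) * b                                 # total variance
    return qbar, t

# Hypothetical log-RR estimates and variances from m = 5 imputed datasets
log_rrs = [0.41, 0.38, 0.44, 0.40, 0.37]
variances = [0.010, 0.011, 0.009, 0.010, 0.012]
q, t = pool_rubin(log_rrs, variances)
print(f"pooled RR = {math.exp(q):.3f}, SE(log RR) = {math.sqrt(t):.3f}")
```

    Pooling on the log scale and exponentiating at the end keeps the confidence interval for the relative risk asymmetric in the usual way.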

  4. Ensuring quality in studies linking cancer registries and biobanks.

    PubMed

    Langseth, Hilde; Luostarinen, Tapio; Bray, Freddie; Dillner, Joakim

    2010-04-01

    The Nordic countries have a long tradition of providing comparable and high quality cancer data through the national population-based cancer registries and the capability to link the diverse large-scale biobanks currently in operation. The joining of these two infrastructural resources can provide a study base for large-scale studies of etiology, treatment and early detection of cancer. Research projects based on combined data from cancer registries and biobanks provide great opportunities, but also present major challenges. Biorepositories have become an important resource in molecular epidemiology, and the increased interest in performing etiological, clinical and gene-environment-interaction studies, involving information from biological samples linked to population-based cancer registries, warrants a joint evaluation of the quality aspects of the two resources, as well as an assessment of whether the resources can be successfully combined into a high quality study. While the quality of biospecimen handling and analysis is commonly considered in different studies, the logistics of data handling, including the linkage of the biobank with the cancer registry, is an overlooked aspect of a biobank-based study. It is thus the aim of this paper to describe recommendations on data handling, in particular the linkage of biobank material to cancer registry data and the quality aspects thereof, based on the experience of Nordic collaborative projects combining data from cancer registries and biobanks. We propose standard documentation with respect to the following topics: the quality control aspects of cancer registration, the identification of cases and controls, the identification and use of data on confounders, the stability of serum components, historical storage conditions, aliquoting history, the number of freeze/thaw cycles and available volumes.

  5. Ocean Bottom Seismometer: Design and Test of a Measurement System for Marine Seismology

    PubMed Central

    Mànuel, Antoni; Roset, Xavier; Del Rio, Joaquin; Toma, Daniel Mihai; Carreras, Normandino; Panahi, Shahram Shariat; Garcia-Benadí, A.; Owen, Tim; Cadena, Javier

    2012-01-01

    The Ocean Bottom Seismometer (OBS) is a key instrument for the geophysical study of sea sub-bottom layers. At present, more reliable autonomous instruments capable of recording underwater for long periods of time, and therefore handling large data volumes, are needed. This paper presents a new Ocean Bottom Seismometer designed for long-duration seismic surveys. Power consumption and the noise level of the acquisition system are the key points in optimizing autonomy and data quality. To achieve these goals, a new low-power data logger with high resolution and Signal-to-Noise Ratio (SNR), based on a Compact Flash memory card, was designed to enable continuous data acquisition. The equipment represents the achievement of joint work from different scientific and technological disciplines such as electronics, mechanics, acoustics, communications, information technology, and marine geophysics. This easy-to-handle yet sophisticated equipment allows the recording of useful controlled-source and passive seismic data, as well as other time-varying data, with multiple applications in marine environment research. We have been working on a series of prototypes for ten years to improve many of the aspects that make the equipment easy to handle and useful for work in deep-water areas. Ocean Bottom Seismometers (OBS) have received growing attention from the geoscience community during the last forty years. OBS sensors recording the motion of the ocean floor hold key information for studying offshore seismicity and exploring the Earth's crust. In a seismic survey, a series of OBSs are placed on the seabed of the area under study, where they record either natural seismic activity or acoustic signals generated by compressed air-guns at the ocean surface. The resulting data sets are subsequently used to model both the earthquake locations and the crustal structure. PMID:22737032

  6. Biomedical health assessments of the Florida manatee in Crystal River - providing opportunities for training during the capture, handling, and processing of this endangered aquatic mammal

    USGS Publications Warehouse

    Bonde, Robert K.; Garrett, Andrew; Belanger, Michael; Askin, Nesime; Tan, Luke; Wittnich, Carin

    2012-01-01

    Federal and state researchers have been involved in manatee (Trichechus manatus) biomedical health assessment programs for a couple of decades. These benchmark studies have provided a foundation for the development of consistent capture, handling, and processing techniques and protocols. Biologists have implemented training and encouraged multi-agency participation whenever possible to ensure reliable data acquisition, recording, sample collection, publication integrity, and meeting rigorous archival standards. Under a U.S. Fish and Wildlife Service wildlife research permit granted to the U.S. Geological Survey (USGS) Sirenia Project, federal biologists and collaborators are allowed to conduct research studies on wild and captive manatees detailing various aspects of their biology. Therefore, researchers with the project have been collaborating on numerous studies over the last several years. One extensive study, initiated in 2006, has focused on the health and fitness of the winter manatee population located in Crystal River, Florida. During those health assessments, capture, handling, and work-up training has been afforded to many of the participants. That study has successfully captured and handled 123 manatees. The data gathered have provided baseline information on manatee health, reproductive status, and nutritional condition. This research initiative addresses concerns and priorities outlined in the Florida Manatee Recovery Plan. The assessment teams strive to continue this collaborative effort to help advance our understanding of health-related issues confronting manatees throughout their range and to interlace these findings with surrogate species concepts.

  7. PubMed Central

    RAJI, A.O.Q.

    2014-01-01

    Introduction. Food-borne disease outbreaks remain a major global health challenge, and cross-contamination from raw meat due to poor handling is a major cause in developing countries. Adequate knowledge of meat handlers is important in limiting these outbreaks. This study evaluated and compared the safe meat-handling knowledge, attitudes and practices (KAP) of private (PMPP) and government meat processing plants' (GMPP) workers in south-western Nigeria. Methods. This cross-sectional study comprised 190 meat handlers (PMPP = 55; GMPP = 135). Data concerning their safe meat-handling knowledge, attitudes and practices, as well as their socio-demographic characteristics such as age, gender and work experience, were collected. Results. A significant association was observed between the type of meat processing plant and workers' knowledge (p = 0.000), attitudes (p = 0.000) and practices (p = 0.000) of safe meat-handling. Meat handlers in the GMPP were, respectively, about 17 times (OR = 0.060, 95% CI: 0.018-0.203), 57 times (OR = 0.019, 95% CI: 0.007-0.054) and 111 times (OR = 0.009, 95% CI: 0.001-0.067) less likely to attain good knowledge, attitude and practice levels of safe meat-handling than those from PMPP. Further, KAP levels were significantly associated with age group, education and work experience (p < 0.05). Discussion. Study findings suggest the need for future policy in the food industry in developing countries to accommodate increased involvement of the private sector for improved food safety and quality delivery. Public health education on safe food handling and hygiene should be on the front burner among food handlers in general. PMID:25916026

  8. 76 FR 63575 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  9. 76 FR 63554 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  10. 77 FR 11394 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... written in FORTRAN and used simple text files for data input and output, MOVES is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables.\\13\\ \\13\\ Some...

  11. Development of a statewide online system for traffic data quality control and sharing.

    DOT National Transportation Integrated Search

    2009-12-01

    The Washington State Department of Transportation (WSDOT) operates thousands of Inductive Loop Detectors (ILDs) on the freeways and highways of Washington State. The collection and disbursement of this data is handled at the regional level, which...

  12. A general method for handling missing binary outcome data in randomized controlled trials

    PubMed Central

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting We apply our general method to data from two smoking cessation trials. Participants A total of 489 and 1758 participants from two smoking cessation trials. Measurements The abstinence outcomes were obtained using telephone interviews. Findings The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441
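
    The idea of embedding 'missing = smoking' in a wider class of assumptions can be illustrated with a simple informative-missingness model, where a sensitivity parameter delta shifts the log-odds of abstinence among participants with missing outcomes relative to those observed. This is a generic sketch with made-up counts, not the authors' exact parameterization:

```python
import math

def abstinence_rate(n_obs, n_abstinent, n_missing, delta):
    """Estimated abstinence proportion when the log-odds of abstinence
    among missing participants equal the observed log-odds plus delta.
    delta -> -infinity recovers the 'missing = smoking' convention;
    delta = 0 assumes missing participants resemble observed ones."""
    p_obs = n_abstinent / n_obs
    logit_miss = math.log(p_obs / (1 - p_obs)) + delta
    p_miss = 1 / (1 + math.exp(-logit_miss))
    return (n_abstinent + n_missing * p_miss) / (n_obs + n_missing)

# Hypothetical trial arm: 300 observed (90 abstinent), 60 missing.
# Sweep the sensitivity parameter from very pessimistic to neutral.
for delta in (-10.0, -1.0, 0.0):
    print(f"delta = {delta:+.1f}: rate = {abstinence_rate(300, 90, 60, delta):.3f}")
```

    Reporting the estimated effect across a grid of delta values is what lets a reader see whether conclusions hold under all but extreme assumptions, as the trial reanalyses above found.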

  13. Adaptive data rate SSMA system for personal and mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Ikegami, Tetsushi; Takahashi, Takashi; Arakaki, Yoshiya; Wakana, Hiromitsu

    1995-01-01

    An adaptive data rate SSMA (spread spectrum multiple access) system is proposed for mobile and personal multimedia satellite communications without the aid of system control earth stations. This system has a constant occupied bandwidth and variable data rates and processing gains, to mitigate communication link impairments such as fading, rain attenuation and interference as well as to handle variable data rates on demand. Proof-of-concept hardware for a 6 MHz bandwidth transponder was developed that uses offset-QPSK (quadrature phase shift keying) and MSK (minimum shift keying) for direct sequence spread spectrum modulation and handles data rates of 4 to 64 kbps. An RS422 data interface and low-rate voice and H.261 video codecs are installed. The receiver is designed with a coherent matched filter technique to achieve fast code acquisition, AFC (automatic frequency control) and coherent detection with minimum hardware losses in a single matched filter circuit. This receiver structure facilitates variable data rates on demand during a call. This paper presents an outline of the proposed system and the performance of the prototype equipment.
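
    The trade this system exploits, constant occupied bandwidth with variable processing gain, follows directly from the ratio of chip rate to data rate in a direct-sequence system. A sketch, assuming a hypothetical chip rate (the paper's actual value is not given here):

```python
import math

CHIP_RATE = 2_048_000  # chips/s; hypothetical figure for a ~6 MHz channel

def processing_gain_db(data_rate):
    """Processing gain Gp = chip rate / data rate, in dB. With the chip
    rate (and hence bandwidth) fixed, lowering the data rate raises Gp,
    buying margin against fading, rain attenuation and interference."""
    return 10 * math.log10(CHIP_RATE / data_rate)

for rate in (4_000, 16_000, 64_000):
    print(f"{rate // 1000:>2} kbps -> Gp = {processing_gain_db(rate):.1f} dB")
```

    Dropping from 64 kbps to 4 kbps gains 10·log10(16) ≈ 12 dB of processing gain without changing the occupied spectrum, which is the mechanism behind the rate adaptation described above.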

  14. Data fusion and classification using a hybrid intrinsic cellular inference network

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Walenz, Brett; Seiffertt, John; Robinette, Paul; Wunsch, Donald

    2010-04-01

    The Hybrid Intrinsic Cellular Inference Network (HICIN) is designed for battlespace decision support applications. We developed an automatic method of generating hypotheses for an entity-attribute classifier. The capability and effectiveness of a domain-specific ontology was used to generate automatic categories for data classification. Heterogeneous data is clustered using an Adaptive Resonance Theory (ART) inference engine on a sample (unclassified) data set. The data set is the Lahman baseball database. The actual data is immaterial to the architecture; however, parallels in the data can be easily drawn (i.e., "Team" maps to organization, "Runs scored/allowed" to measure of organization performance (positive/negative), "Payroll" to organization resources, etc.). Results show that HICIN classifiers create known inferences from the heterogeneous data. These inferences are not explicitly stated in the ontological description of the domain and are strictly data driven. HICIN uses data uncertainty handling to reduce errors in the classification. The uncertainty handling is based on subjective logic. The belief mass allows evidence from multiple sources to be mathematically combined to increase or discount an assertion. In military operations the ability to reduce uncertainty will be vital in the data fusion operation.
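
    The belief-mass combination described can be sketched with the standard cumulative fusion operator of subjective logic, which combines two opinions (belief, disbelief, uncertainty) so that agreeing evidence strengthens belief and shrinks uncertainty. HICIN's exact formulation may differ; this is a generic illustration of the operator:

```python
def fuse(o1, o2):
    """Cumulative fusion of two subjective-logic opinions (b, d, u),
    each satisfying b + d + u = 1. Assumes u1 and u2 are not both 0."""
    b1, d1, u1 = o1
    b2, d2, u2 = o2
    k = u1 + u2 - u1 * u2  # normaliser
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)

# Two sources weakly support the same classification assertion
b, d, u = fuse((0.6, 0.1, 0.3), (0.5, 0.2, 0.3))
print(f"b = {b:.3f}, d = {d:.3f}, u = {u:.3f}")
```

    Fusing the two opinions drives uncertainty below either source's individual value, which is the sense in which combining evidence "increases or discounts an assertion."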

  15. Prediction of pilot opinion ratings using an optimal pilot model. [of aircraft handling qualities in multiaxis tasks

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1977-01-01

    A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.

  16. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation is of work being conducted by the Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with associated environment requirements, into an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
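
    The band-ratio indices named above are simple per-pixel arithmetic on reflectance bands, which is what makes them good candidates for bulk cloud processing. A minimal sketch with hypothetical reflectance values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir)

# Hypothetical reflectances for a well-vegetated pixel
print(f"NDVI = {ndvi(0.45, 0.08):.3f}")  # high NIR vs red -> NDVI near 0.7
print(f"NDMI = {ndmi(0.45, 0.20):.3f}")
```

    In a real pipeline the same arithmetic is applied array-wise over whole scenes, so the per-pixel functions above map directly onto the bulk operations the project benchmarked.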

  17. 7 CFR 735.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., optical, or similar means, including, but not limited to, electronic data interchange, advanced... lawfully engaged in the business of storing or handling agricultural products. Warehousing activities and...

  18. 7 CFR 735.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., optical, or similar means, including, but not limited to, electronic data interchange, advanced... lawfully engaged in the business of storing or handling agricultural products. Warehousing activities and...

  19. TSTA Piping and Flame Arrestor Operating Experience Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, Lee C.; Willms, R. Scott

    The Tritium Systems Test Assembly (TSTA) was a facility dedicated to tritium handling technology and experiment research at the Los Alamos National Laboratory. The facility operated from 1984 to 2001, running a prototype fusion fuel processing loop with ~100 grams of tritium as well as small experiments. Several operating experience reports have been written on this facility's operation and maintenance experience. This paper describes the analysis of two additional components from TSTA: small-diameter gas piping that handled small amounts of tritium in a nitrogen carrier gas, and the flame arrestor used in this piping system. The operating experiences and the component failure rates for these components are discussed, and comparison data from other applications are also presented.

  20. Inference in randomized trials with death and missingness.

    PubMed

    Wang, Chenguang; Scharfstein, Daniel O; Colantuoni, Elizabeth; Girard, Timothy D; Yan, Ying

    2017-06-01

    In randomized studies involving severely ill patients, functional outcomes are often unobserved due to missed clinic visits, premature withdrawal, or death. It is well known that if these unobserved functional outcomes are not handled properly, biased treatment comparisons can be produced. In this article, we propose a procedure for comparing treatments that is based on a composite endpoint that combines information on both the functional outcome and survival. We further propose a missing data imputation scheme and sensitivity analysis strategy to handle the unobserved functional outcomes not due to death. Illustrations of the proposed method are given by analyzing data from a recent non-small cell lung cancer clinical trial and a recent trial of sedation interruption among mechanically ventilated patients. © 2016, The International Biometric Society.

  1. Predictive Techniques for Spacecraft Cabin Air Quality Control

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Cromes, Scott D. (Technical Monitor)

    2001-01-01

    As assembly of the International Space Station (ISS) proceeds, predictive techniques are used to determine the best approach for handling a variety of cabin air quality challenges. These techniques use equipment offgassing data collected from each ISS module before flight to characterize the trace chemical contaminant load. Combined with crew metabolic loads, these data serve as input to a predictive model for assessing the capability of the onboard atmosphere revitalization systems to handle the overall trace contaminant load as station assembly progresses. The techniques for predicting in-flight air quality are summarized along with results from early ISS mission analyses. Results from ground-based analyses of in-flight air quality samples are compared to the predictions to demonstrate the technique's relative conservatism.
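
    Predictions of this kind rest on a mass balance for each contaminant: generation from offgassing and metabolism versus removal by the revitalization system. A single-compartment, well-mixed sketch is shown below; the generation rate, airflow, scrubber efficiency and volume are illustrative assumptions, not ISS values or NASA's actual model:

```python
import math

def concentration(t_hr, gen_mg_hr, flow_m3_hr, efficiency, volume_m3, c0=0.0):
    """Trace-contaminant concentration (mg/m^3) in one well-mixed cabin
    with constant generation and removal through a scrubber of given
    single-pass efficiency: dC/dt = g/V - (eta*Q/V)*C."""
    k = efficiency * flow_m3_hr / volume_m3        # removal rate constant, 1/hr
    c_ss = gen_mg_hr / (efficiency * flow_m3_hr)   # steady-state concentration
    return c_ss + (c0 - c_ss) * math.exp(-k * t_hr)

# Hypothetical module: 100 m^3, 15 m^3/hr through a 90%-efficient bed,
# 2 mg/hr total offgassing load for this contaminant
for t in (0, 24, 240):
    print(f"t = {t:>3} hr: C = {concentration(t, 2.0, 15.0, 0.9, 100.0):.4f} mg/m^3")
```

    Comparing the predicted steady-state level against the contaminant's spacecraft maximum allowable concentration is the kind of check such an assessment performs for each compound in the load model.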

  2. GRASS GIS: The first Open Source Temporal GIS

    NASA Astrophysics Data System (ADS)

    Gebbert, Sören; Leppelt, Thomas

    2015-04-01

    GRASS GIS is a full featured, general purpose Open Source geographic information system (GIS) with raster, 3D raster and vector processing support[1]. Recently, time was introduced as a new dimension that transformed GRASS GIS into the first Open Source temporal GIS with comprehensive spatio-temporal analysis, processing and visualization capabilities[2]. New spatio-temporal data types were introduced in GRASS GIS version 7, to manage raster, 3D raster and vector time series. These new data types are called space time datasets. They are designed to efficiently handle hundreds of thousands of time stamped raster, 3D raster and vector map layers of any size. Time stamps can be defined as time intervals or time instances in Gregorian calendar time or relative time. Space time datasets are simplifying the processing and analysis of large time series in GRASS GIS, since these new data types are used as input and output parameter in temporal modules. The handling of space time datasets is therefore equal to the handling of raster, 3D raster and vector map layers in GRASS GIS. A new dedicated Python library, the GRASS GIS Temporal Framework, was designed to implement the spatio-temporal data types and their management. The framework provides the functionality to efficiently handle hundreds of thousands of time stamped map layers and their spatio-temporal topological relations. The framework supports reasoning based on the temporal granularity of space time datasets as well as their temporal topology. It was designed in conjunction with the PyGRASS [3] library to support parallel processing of large datasets, that has a long tradition in GRASS GIS [4,5]. We will present a subset of more than 40 temporal modules that were implemented based on the GRASS GIS Temporal Framework, PyGRASS and the GRASS GIS Python scripting library. These modules provide a comprehensive temporal GIS tool set. 
The functionality ranges from space time dataset and time-stamped map layer management, through temporal aggregation, temporal accumulation, spatio-temporal statistics, spatio-temporal sampling, temporal algebra, temporal topology analysis, time series animation and temporal topology visualization, to time series import and export capabilities with support for the NetCDF and VTK data formats. We will present several temporal modules that support parallel processing of raster and 3D raster time series. [1] M. Neteler, D. Beaudette, P. Cavallini, L. Lami, J. Cepicky: GRASS GIS. In: Open Source Approaches in Spatial Data Handling (G. Brent Hall, Michael G. Leahy, eds.), Vol. 2 (2008), pp. 171-199, doi:10.1007/978-3-540-74831-19. [2] Gebbert, S., Pebesma, E., 2014. A temporal GIS for field based environmental modeling. Environ. Model. Softw. 53, 1-12. [3] Zambelli, P., Gebbert, S., Ciolli, M., 2013. Pygrass: An Object Oriented Python Application Programming Interface (API) for Geographic Resources Analysis Support System (GRASS) Geographic Information System (GIS). ISPRS Intl Journal of Geo-Information 2, 201-219. [4] Löwe, P., Klump, J., Thaler, J. (2012): The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster (Geophysical Research Abstracts Vol. 14, EGU2012-4491, 2012), General Assembly European Geosciences Union (Vienna, Austria 2012). [5] Akhter, S., Aida, K., Chemin, Y., 2010. "GRASS GIS on High Performance Computing with MPI, OpenMP and Ninf-G Programming Framework". ISPRS Conference, Kyoto, 9-12 August 2010.
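
    The temporal-topology reasoning described above, deciding how time-stamped map layers relate to one another, can be illustrated with a minimal interval-relation function over [start, end) intervals. This is a conceptual sketch in Python, not the GRASS GIS Temporal Framework API:

```python
from datetime import date

def relation(a, b):
    """A simplified set of temporal topology relations between two
    half-open [start, end) intervals, in the spirit of the framework's
    temporal reasoning (relation names here are illustrative)."""
    (s1, e1), (s2, e2) = a, b
    if e1 <= s2:
        return "before"
    if e2 <= s1:
        return "after"
    if s1 == s2 and e1 == e2:
        return "equal"
    if s1 >= s2 and e1 <= e2:
        return "during"
    if s2 >= s1 and e2 <= e1:
        return "contains"
    return "overlaps"

january = (date(2024, 1, 1), date(2024, 2, 1))   # a monthly map layer
quarter = (date(2024, 1, 1), date(2024, 4, 1))   # a quarterly aggregate
print(relation(january, quarter))
```

    Temporal aggregation modules rely on exactly this kind of test to decide which time-stamped layers fall inside each aggregation granule.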

  3. A compilation and analysis of helicopter handling qualities data. Volume 1: Data compilation

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Jewell, W. F.; Lehman, J. M.; Vanwinkle, R. A.

    1979-01-01

    A collection of basic descriptive data, stability derivatives, and transfer functions for a six-degree-of-freedom, quasi-static model is introduced. The data are arranged in a common, compact format for each of the five helicopters represented. The vehicles studied include the BO-105, AH-1h, and the CH53D.

  4. Data bases and data base systems related to NASA's Aerospace Program: A bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This bibliography lists 641 reports, articles, and other documents introduced into the NASA scientific and technical information system during the period January 1, 1981 through June 30, 1982. The directory was compiled to assist in the location of numerical and factual data bases and data base handling and management systems.

  5. Handling Missing Data: Analysis of a Challenging Data Set Using Multiple Imputation

    ERIC Educational Resources Information Center

    Pampaka, Maria; Hutcheson, Graeme; Williams, Julian

    2016-01-01

    Missing data is endemic in much educational research. However, practices such as step-wise regression, common in the educational research literature, have been shown to be dangerous when significant data are missing, and multiple imputation (MI) is generally recommended by statisticians. In this paper, we provide a review of these advances and their…

  6. A Data-Processing System for Quantitative Analysis in Speech Production. CLCS Occasional Paper No. 17.

    ERIC Educational Resources Information Center

    Chasaide, Ailbhe Ni; Davis, Eugene

    The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…

  7. 77 FR 15026 - Privacy Act of 1974; Farm Records File (Automated) System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-14

    ... Mining Project, all program data collected and handled by either RMA or FSA will be treated with the full... data warehouse and data mining operation. RMA will use the information to search or "mine" existing... fraud, waste, and abuse. The data mining operation is authorized by the Agricultural Risk Protection Act...

  8. Reporting the Use of Multiple Imputation for Missing Data in Higher Education Research

    ERIC Educational Resources Information Center

    Manly, Catherine A.; Wells, Ryan S.

    2015-01-01

    Higher education researchers using survey data often face decisions about handling missing data. Multiple imputation (MI) is considered by many statisticians to be the most appropriate technique for addressing missing data in many circumstances. In particular, it has been shown to be preferable to listwise deletion, which has historically been a…

  9. An alternative data filling approach for prediction of missing data in soft sets (ADFIS).

    PubMed

    Sadiq Khan, Muhammad; Al-Garadi, Mohammed Ali; Wahab, Ainuddin Wahid Abdul; Herawan, Tutut

    2016-01-01

    Soft set theory is a mathematical approach that provides a solution for dealing with uncertain data. A standard soft set can be represented as a Boolean-valued information system, and hence it has been used in hundreds of useful applications. Meanwhile, these applications become worthless if the Boolean information system contains missing data due to error, security or mishandling. Few studies have focused on handling partially incomplete soft sets, and none of them achieves a high accuracy rate in predicting missing data. It has been shown that the data filling approach for incomplete soft sets (DFIS) has the best performance among all previous approaches; however, accuracy remains its main problem. In this paper, we propose an alternative data filling approach for prediction of missing data in soft sets, namely ADFIS. The novelty of ADFIS is that, unlike the previous approach that used probability, we focus more on the reliability of association among parameters in the soft set. Experimental results on small datasets, four UCI benchmark datasets, and the causality workbench lung cancer (LUCAP2) data show that ADFIS achieves better accuracy than DFIS.
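
    An association-based filling step of the general kind described can be sketched as follows: for a missing Boolean entry, find the other parameter column that agrees (or disagrees) with the target column most consistently over the complete rows, and predict from it. This simplified version is an illustration of the idea, not the published ADFIS algorithm:

```python
def fill_missing(table):
    """Fill missing entries (None) in a Boolean-valued information
    table using the most reliably associated other parameter column.
    Perfect agreement or perfect disagreement both count as reliable."""
    rows, cols = len(table), len(table[0])
    for i in range(rows):
        for j in range(cols):
            if table[i][j] is not None:
                continue
            best = None  # (consistency, predicted value)
            for k in range(cols):
                if k == j or table[i][k] is None:
                    continue
                pairs = [(r[j], r[k]) for r in table
                         if r[j] is not None and r[k] is not None]
                if not pairs:
                    continue
                agree = sum(a == b for a, b in pairs) / len(pairs)
                consistency = max(agree, 1 - agree)
                guess = table[i][k] if agree >= 0.5 else 1 - table[i][k]
                if best is None or consistency > best[0]:
                    best = (consistency, guess)
            table[i][j] = best[1] if best else 0
    return table

t = [[1, 1, 0],
     [0, 0, 1],
     [1, None, 0],
     [0, 0, 1]]
print(fill_missing(t)[2])  # -> [1, 1, 0]: column 0 predicts the gap
```

    Here columns 0 and 1 agree on every complete row, so the missing value is copied from column 0; a perfectly disagreeing column would be used with its value inverted.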

  10. Field Model: An Object-Oriented Data Model for Fields

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.

    2001-01-01

    We present an extensible, object-oriented data model designed for field data, entitled Field Model (FM). FM objects can represent a wide variety of fields, including fields of arbitrary dimension and node type. FM can also handle time-series data. FM achieves generality through carefully selected topological primitives and through an implementation that leverages the potential of templated C++. FM supports fields where node values are paired with any cell type. Thus FM can represent data where the field nodes are paired with the vertices ("vertex-centered" data), fields where the nodes are paired with the D-dimensional cells in R^D (often called "cell-centered" data), as well as fields where nodes are paired with edges or other cell types. FM is designed to effectively handle very large data sets; in particular, FM employs a demand-driven evaluation strategy that works especially well with large field data. Finally, the interfaces developed for FM have the potential to effectively abstract field data based on adaptive meshes. We present initial results with a triangular adaptive grid in R^2 and discuss how the same design abstractions would work equally well with other adaptive-grid variations, including meshes in R^3.

  11. Improvements in multimedia data buffering using master/slave architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheikh, S.; Ganesan, R.

    1996-12-31

    Advances in networking and multimedia technology have necessitated robust and reliable multimedia servers. Existing solutions have direct limitations such as I/O bottlenecks and unreliable data retrieval. The system can store the stream of incoming data if enough buffer space is available or if mass storage is clearing the buffer faster than the queue fills. A single buffer queue is not sufficient to handle large frames. Queues are normally several megabytes in length and will in turn introduce a state of overflow. The system must also keep track of rewind, fast-forward, and pause requests; otherwise queue management becomes intricate. In this paper, we present a master/slave approach (a server designated to monitor the workflow of the complete system; it maintains a dynamic table of information about every slave and controls the workload on each system by redistributing requests to other slaves or handling a request itself) which overcomes the limitations of today's storage and also satisfies tomorrow's storage needs. This approach maintains system reliability and yields faster response by using more storage units in parallel. A network of masters and slaves can handle many requests and synchronize them at all times. Using a dedicated CPU and a common pool of queues, we explain how queues can be controlled and buffer overflow avoided. We propose a layered approach to the buffering problem and provide a read-ahead solution to ensure continuous storage and retrieval of multimedia data.
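The dispatch role the abstract gives the master, keeping a dynamic table of slaves and redistributing requests, amounts to least-loaded scheduling. The sketch below is a generic load-balancing illustration under that reading, not the paper's implementation; slave names and request sizes are invented.

```python
import heapq

class Master:
    """Route each request to the currently least-loaded slave."""

    def __init__(self, slaves):
        # Min-heap of (current_load, slave_name): the master's "dynamic table".
        self.table = [(0, s) for s in slaves]
        heapq.heapify(self.table)

    def dispatch(self, request_size):
        # Pop the least-loaded slave, charge it the request, push it back.
        load, slave = heapq.heappop(self.table)
        heapq.heappush(self.table, (load + request_size, slave))
        return slave

master = Master(["slave1", "slave2", "slave3"])
assignments = [master.dispatch(size) for size in (10, 10, 10, 5)]
print(assignments)  # ['slave1', 'slave2', 'slave3', 'slave1']
```

The first three equal-sized requests spread across all three slaves; the fourth returns to the slave that (by name order on the load tie) comes out of the heap first.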

  12. Evaluating beneficial drug effects in a non‐interventional setting: a review of effectiveness studies based on Swedish Prescribed Drug Register data

    PubMed Central

    Hoffmann, Mikael

    2017-01-01

    Aims To describe and assess current effectiveness studies published up to 2014 using Swedish Prescribed Drug Register (SPDR) data. Methods Study characteristics were extracted. Each study was assessed concerning the clinical relevance of the research question, the risk of bias according to a structured checklist, and as to whether its findings contributed to new knowledge. The biases encountered and ways of handling these were retrieved. Results A total of 24 effectiveness studies were included in the review, the majority on cardiovascular or psychiatric disease (n = 17; 71%). The articles linked data from four (interquartile range: three to four) registers, and were published in 21 different journals with an impact factor ranging from 1.58 to 51.66. All articles had a clinically relevant research question. According to the systematic quality assessments, the overall risk of bias was low in one (4%), moderate in eight (33%) and high in 15 (62%) studies. Overall, two (8%) studies were assessed as contributing to new knowledge. Frequently occurring problems were selection bias making the comparison groups incomparable, treatment bias with suboptimal handling of drug exposure and an intention‐to‐treat approach, and assessment bias including immortal time bias. Good examples of how to handle bias problems included propensity score matching and sensitivity analyses. Conclusion Although this review illustrates that effectiveness studies based on dispensed drug register data can contribute to new evidence of intended effects of drug treatment in clinical practice, the expectations of such data to provide valuable information need to be tempered due to methodological issues. PMID:27928842

  13. Evaluation of High-Speed Civil Transport Handling Qualities Criteria with Supersonic Flight Data

    NASA Technical Reports Server (NTRS)

    Cox, Timothy H.; Jackson, Dante W.

    1997-01-01

    Most flying qualities criteria have been developed from data in the subsonic flight regime. Unique characteristics of supersonic flight raise questions about whether these criteria successfully extend into the supersonic flight regime. Approximately 25 years ago NASA Dryden Flight Research Center addressed this issue with handling qualities evaluations of the XB-70 and YF-12. Good correlations between some of the classical handling qualities parameters, such as the control anticipation parameter as a function of damping, were discovered. More criteria have been developed since these studies. Some of these more recent criteria are being used in designing the High-Speed Civil Transport (HSCT). A second research study recently addressed this issue through flying qualities evaluations of the SR-71 at Mach 3. The research goal was to extend the high-speed flying qualities experience of large airplanes and to evaluate more recent MIL-STD-1797 criteria against pilot comments and ratings. Emphasis was placed on evaluating the criteria used for designing the HSCT. XB-70 and YF-12 data from the previous research supplemented the SR-71 data. The results indicate that the criteria used in the HSCT design are conservative and should provide good flying qualities for typical high-speed maneuvering. Additional results show correlation between the ratings and comments and criteria for gradual maneuvering with precision control. Correlation is shown between ratings and comments and an extension of the Neal/Smith criterion using normal acceleration instead of pitch rate.

  14. ARES - A New Airborne Reflective Emissive Spectrometer

    DTIC Science & Technology

    2005-10-01

    Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a

  15. Surface electrical properties experiment, part 1. [flown on Apollo 17

    NASA Technical Reports Server (NTRS)

    Strangway, D. W.; Annan, A. P.; Redman, J. D.; Rossiter, J. R.; Rylaarsdam, J. A.; Watts, R. D.

    1974-01-01

    The work is reported which was performed on the Surface Electrical Properties Experiment Data Acquisition System. Areas discussed include: data handling and processing, installation and external signal application, operation of the equipment, and digital output. Detailed circuit descriptions are included.

  16. Sandia National Laboratories Small-Scale Sensitivity Testing (SSST) Report: Calcium Nitrate Mixtures with Various Fuels.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, Jason Joe

    Based upon the presented sensitivity data for the examined calcium nitrate mixtures using sugar and sawdust, contact handling/mixing of these materials does not present hazards greater than those occurring during handling of dry PETN powder. The aluminized calcium nitrate mixtures present a known ESD fire hazard due to the fine aluminum powder fuel. These mixtures may yet present an ESD explosion hazard, though this has not been investigated at this time. The detonability of these mixtures will be investigated during Phase III testing.

  17. Conditioning laboratory cats to handling and transport.

    PubMed

    Gruen, Margaret E; Thomson, Andrea E; Clary, Gillian P; Hamilton, Alexandra K; Hudson, Lola C; Meeker, Rick B; Sherman, Barbara L

    2013-10-01

    As research subjects, cats have contributed substantially to our understanding of biological systems, from the development of mammalian visual pathways to the pathophysiology of feline immunodeficiency virus as a model for human immunodeficiency virus. Few studies have evaluated humane methods for managing cats in laboratory animal facilities, however, in order to reduce fear responses and improve their welfare. The authors describe a behavioral protocol used in their laboratory to condition cats to handling and transport. Such behavioral conditioning benefits the welfare of the cats, the safety of animal technicians and the quality of feline research data.

  18. Application of frequency domain handling qualities criteria to the longitudinal landing task

    NASA Technical Reports Server (NTRS)

    Sarrafian, S. K.; Powers, B. G.

    1985-01-01

    Three frequency-domain handling qualities criteria have been applied to the observed data to correlate the actual pilot ratings assigned to generic transport configurations with stability augmentation during the longitudinal landing task. The criteria are based on closed-loop techniques using pitch attitude, altitude rate at the pilot station, and altitude at the pilot station as dominating control parameters during this task. It is found that most promising results are obtained with altitude control performed by closing an inner loop on pitch attitude and closing an outer loop on altitude.

  19. Laboratory Identification of Arthropod Ectoparasites

    PubMed Central

    Pritt, Bobbi S.

    2014-01-01

    SUMMARY The collection, handling, identification, and reporting of ectoparasitic arthropods in clinical and reference diagnostic laboratories are discussed in this review. Included are data on ticks, mites, lice, fleas, myiasis-causing flies, and bed bugs. The public health importance of these organisms is briefly discussed. The focus is on the morphological identification and proper handling and reporting of cases involving arthropod ectoparasites, particularly those encountered in the United States. Other arthropods and other organisms not of public health concern, but routinely submitted to laboratories for identification, are also briefly discussed. PMID:24396136

  20. EC97-44354-2

    NASA Image and Video Library

    1997-12-16

    An image of the F-16XL #1 during its functional flight check of the Digital Flight Control System (DFCS) on December 16, 1997. The mission was flown by NASA research pilot Dana Purifoy and lasted 1 hour and 25 minutes. The tests included pilot familiarization, functional check, and handling qualities evaluation maneuvers to a speed of Mach 0.6 and 300 knots. Purifoy completed all the briefed data points with no problems, and reported that the DFCS handled as well as, if not better than, the analog computer system it replaced.

  1. Gravity flow rate of solids through orifices and pipes

    NASA Technical Reports Server (NTRS)

    Gardner, J. F.; Smith, J. E.; Hobday, J. M.

    1977-01-01

    Lock-hopper systems are the most common means for feeding solids to and from coal conversion reactor vessels. The rate at which crushed solids flow by gravity through the vertical pipes and valves in lock-hopper systems affects the size of pipes and valves needed to meet the solids-handling requirements of the coal conversion process. Methods used to predict flow rates are described and compared with experimental data. Preliminary indications are that solids-handling systems for coal conversion processes are over-designed by a factor of 2 or 3.

  2. CRUNCH_PARALLEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaker, Dana E.; Steefel, Carl I.

    The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general-purpose reactive transport code developed by Carl Steefel and Yabusake (Steefel and Yabusake, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database containing thermodynamic and kinetic data is used. The code includes advective, dispersive, and diffusive transport.

  3. User's guide to FBASE: Relational database software for managing R1/R4 (Northern/Intermountain Regions) fish habitat inventory data

    Treesearch

    Sherry P. Wollrab

    1999-01-01

    FBASE is a microcomputer relational database package that handles data collected using the R1/R4 Fish and Fish Habitat Standard Inventory Procedures (Overton and others 1997). FBASE contains standard data entry screens, data validations for quality control, data maintenance features, and summary report options. This program also prepares data for importation into an...

  4. Emissions of NOx, particle mass and particle numbers from aircraft main engines, APU's and handling equipment at Copenhagen Airport

    NASA Astrophysics Data System (ADS)

    Winther, Morten; Kousgaard, Uffe; Ellermann, Thomas; Massling, Andreas; Nøjgaard, Jacob Klenø; Ketzel, Matthias

    2015-01-01

    This paper presents a detailed emission inventory for NOx, particle mass (PM) and particle numbers (PN) for aircraft main engines, APU's and handling equipment at Copenhagen Airport (CPH), based on time-specific activity data and emission factors representative of the airport. The inventory has a high spatial resolution of 5 m × 5 m in order to be suited for further air quality dispersion calculations. Results are shown for the entire airport and for a section of the airport apron area ("inner apron") in focus. The methodology presented in this paper can be used to quantify the emissions from aircraft main engines, APU's and handling equipment in other airports. For the entire airport, aircraft main engines are the largest source of fuel consumption (93%), NOx (87%), PM (61%) and PN (95%). The calculated fuel consumption [NOx, PM, PN] shares for APU's and handling equipment are 5% [4%, 8%, 5%] and 2% [9%, 31%, 0%], respectively. At the inner apron area, the handling equipment shares of fuel consumption [NOx, PM, PN] are 24% [63%, 75%, 2%], whereas the APU and main engine shares are 43% [25%, 19%, 54%] and 33% [11%, 6%, 43%], respectively. The inner apron NOx and PM emission levels are high for handling equipment, due to the high emission factors of the diesel-fuelled handling equipment, and small for aircraft main engines, due to small idle-power emission factors. Handling equipment is, however, a small PN source due to its low number-based emission factors. Jet fuel sulphur-PM sensitivity calculations made in this study with the ICAO FOA3.0 method suggest that more than half of the PM emissions from aircraft main engines at CPH originate from the sulphur content of the fuel used at the airport. Aircraft main engine PN emissions are very sensitive to the underlying assumptions: replacing this study's literature-based average emission factors with "high" and "low" emission factors from the literature changed the estimated main engine PN emissions by a factor of 14.

  5. Onboard Experiment Data Support Facility

    NASA Technical Reports Server (NTRS)

    1976-01-01

    An onboard array structure has been devised for end to end processing of data from multiple spaceborne sensors. The array constitutes sets of programmable pipeline processors whose elements perform each assigned function in 0.25 microseconds. This space shuttle computer system can handle data rates from a few bits to over 100 megabits per second.

  6. 41 CFR 101-42.202 - Identification of hazardous materials.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...'s Federal Supply Service (4FQ) maintains an automated data base, accessible via modem and computer... on the terminal screen, the system allows for the addition of the MSDS to the user's local data base... personnel who handle, store, ship, use or dispose of hazardous materials. Each record in the data base is...

  7. 41 CFR 101-42.202 - Identification of hazardous materials.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...'s Federal Supply Service (4FQ) maintains an automated data base, accessible via modem and computer... on the terminal screen, the system allows for the addition of the MSDS to the user's local data base... personnel who handle, store, ship, use or dispose of hazardous materials. Each record in the data base is...

  8. 41 CFR 101-42.202 - Identification of hazardous materials.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...'s Federal Supply Service (4FQ) maintains an automated data base, accessible via modem and computer... on the terminal screen, the system allows for the addition of the MSDS to the user's local data base... personnel who handle, store, ship, use or dispose of hazardous materials. Each record in the data base is...

  9. 41 CFR 101-42.202 - Identification of hazardous materials.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...'s Federal Supply Service (4FQ) maintains an automated data base, accessible via modem and computer... on the terminal screen, the system allows for the addition of the MSDS to the user's local data base... personnel who handle, store, ship, use or dispose of hazardous materials. Each record in the data base is...

  10. 41 CFR 101-42.202 - Identification of hazardous materials.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...'s Federal Supply Service (4FQ) maintains an automated data base, accessible via modem and computer... on the terminal screen, the system allows for the addition of the MSDS to the user's local data base... personnel who handle, store, ship, use or dispose of hazardous materials. Each record in the data base is...

  11. A Comparison of Techniques for Handling and Assessing the Influence of Mobility on Student Achievement

    ERIC Educational Resources Information Center

    Smith, Lindsey J. Wolff; Beretvas, S. Natasha

    2017-01-01

    Conventional multilevel modeling works well with purely hierarchical data; however, pure hierarchies rarely exist in real datasets. Applied researchers employ ad hoc procedures to create purely hierarchical data. For example, applied educational researchers either delete mobile participants' data from the analysis or identify the student only with…

  12. Strategies for Handling Missing Data with Maximum Likelihood Estimation in Career and Technical Education Research

    ERIC Educational Resources Information Center

    Lee, In Heok

    2012-01-01

    Researchers in career and technical education often ignore more effective ways of reporting and treating missing data and instead implement traditional, but ineffective, missing data methods (Gemici, Rojewski, & Lee, 2012). The recent methodological, and even the non-methodological, literature has increasingly emphasized the importance of…

  13. Comparison of Missing Data Treatments in Producing Factor Scores.

    ERIC Educational Resources Information Center

    Witta, E. Lea

    Because ignoring the missing data in an evaluation may lead to results that are questionable, this study investigated the effects of use of four missing data handling techniques on a survey instrument. A questionnaire containing 35 5-point Likert-style questions was completed by 384 respondents. Of these, 166 (43%) questionnaires contained 1 or…

  14. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  15. Earth Orbiter 1: Wideband Advanced Recorder and Processor (WARP)

    NASA Technical Reports Server (NTRS)

    Smith, Terry; Kessler, John

    1999-01-01

    An advanced on-board spacecraft data system component is presented. The component is computer-based and provides science data acquisition, processing, storage, and base-band transmission functions. Specifically, the component is a very high rate solid state recorder, serving as a pathfinder for achieving the data handling requirements of next-generation hyperspectral imaging missions.

  16. Real-Time Reconnaissance-A Systems Look At Advanced Technology

    NASA Astrophysics Data System (ADS)

    Lapp, Henry

    1981-12-01

    An important role for reconnaissance is the location and identification of targets in real time. Current technology has been compartmented into sensors, automatic target recognizers, data links, ground exploitation, and finally dissemination. In the days of bring-home film recce, this segmentation of functions was appropriate. With the current emphasis on real-time decision making from the outputs of high-resolution sensors, this thinking has to be re-analyzed. A total systems approach to data management must be employed, using the constraints imposed by technology as well as the atmosphere, survivable flight profiles, and the human workload. This paper analyzes the tasks from target acquisition through exploitation and discusses the currently applicable advanced development technologies. A philosophy of processing data to get information as early as possible in the data handling chain is examined in the context of ground exploitation and dissemination needs. Examples of how the various real-time sensors (screeners and processors), jam-resistant data links, and near-real-time ground data handling systems fit into this scenario are discussed. Specific DoD programs are used to illustrate the credibility of this integrated approach.

  17. Variable Selection in the Presence of Missing Data: Imputation-based Methods.

    PubMed

    Zhao, Yize; Long, Qi

    2017-01-01

    Variable selection plays an essential role in regression analysis, as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
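The first strategy above, selecting variables within each imputed dataset and then combining the results across imputations, can be sketched with a majority vote. The per-dataset "selector" here is a deliberately simple correlation threshold standing in for a real method such as the lasso; the data shapes, the threshold, and the voting rule are illustrative assumptions.

```python
import random
import statistics

def correlation(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def select_variables(X, y, threshold=0.3):
    """Indices of predictors whose |correlation| with y passes the threshold."""
    return {j for j in range(len(X[0]))
            if abs(correlation([row[j] for row in X], y)) >= threshold}

def majority_vote(imputed_datasets, y):
    """Keep a variable only if it is selected in a majority of imputations."""
    votes = {}
    for X in imputed_datasets:
        for j in select_variables(X, y):
            votes[j] = votes.get(j, 0) + 1
    m = len(imputed_datasets)
    return sorted(j for j, v in votes.items() if v > m / 2)

random.seed(0)
n = 200
x0 = [random.gauss(0, 1) for _ in range(n)]   # truly predictive
x1 = [random.gauss(0, 1) for _ in range(n)]   # pure noise
y = [a + random.gauss(0, 0.5) for a in x0]

# Stand-in for 5 multiply imputed predictor matrices: small perturbations
# mimic between-imputation variability (no real imputation model is fit here).
imputed = [[[a + random.gauss(0, 0.1), b + random.gauss(0, 0.1)]
            for a, b in zip(x0, x1)] for _ in range(5)]

selected = majority_vote(imputed, y)
print(selected)  # only the predictive variable (index 0) survives the vote
```

The same skeleton accommodates the other two strategies by swapping the combination step: stack the imputed datasets before one selection pass, or wrap the vote in a bootstrap loop.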

  18. LBMD : a layer-based mesh data structure tailored for generic API infrastructures.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebeida, Mohamed S.; Knupp, Patrick Michael

    2010-11-01

    A new mesh data structure is introduced for the purpose of mesh processing in Application Programming Interface (API) infrastructures. This data structure utilizes a reduced mesh representation to increase its ability to handle significantly larger meshes compared to a full mesh representation. In spite of the reduced representation, each mesh entity (vertex, edge, face, and region) is represented using a unique handle, with no extra storage cost, which is a crucial requirement in most API libraries. The concept of mesh layers makes the data structure more flexible for mesh generation and mesh modification operations. This flexibility can have a favorable impact in solver-based queries of finite volume and multigrid methods. The capabilities of LBMD make it even more attractive for parallel implementations using the Message Passing Interface (MPI) or Graphics Processing Units (GPUs). The data structure is associated with a new classification method to relate mesh entities to their corresponding geometrical entities. The classification technique stores the related information at the node level without introducing any ambiguities. Several examples are presented to illustrate the strength of this new data structure.

  19. Development and Operations of the Astrophysics Data System

    NASA Technical Reports Server (NTRS)

    Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)

    2003-01-01

    SAO TASKS ACCOMPLISHED: Abstract Service: (1) Continued regular updates of abstracts in the databases, both at SAO and at all mirror sites; (2) Established a new naming convention of QB books in preparation for adding physics books from Hollis or Library of Congress; (3) Modified handling of object tag so as not to interfere with XHTML definition; (4) Worked on moving 'what's new' announcements to a majordomo email list so as not to interfere with divisional mail handling; (5) Implemented and tested new first author feature following suggestions from users at the AAS meeting; (6) Added SSRv entries back to volume 1 in preparation for scanning of the journal; (7) Assisted in the re-configuration of the ADS mirror site at the CDS and sent a new set of tapes containing article data to allow re-creation of the ADS article data lost during the move; (8) Created scripts to automatically download Astrobiology.

  20. Incorporating CCSDS telemetry standards and philosophy on Cassini

    NASA Technical Reports Server (NTRS)

    Day, John C.; Elson, Anne B.

    1995-01-01

    The Cassini project at the Jet Propulsion Laboratory (JPL) is implementing a spacecraft telemetry system based on the Consultative Committee for Space Data Systems (CCSDS) packet telemetry standards. Resolving the CCSDS concepts with a Ground Data System designed to handle time-division-multiplexed telemetry and also handling constraints unique to a deep-space planetary spacecraft (such as fixed downlink opportunities, small downlink rates and requirements for on-board data storage) have resulted in spacecraft and ground system design challenges. Solving these design challenges involved adapting and extending the CCSDS telemetry standards as well as changes to the spacecraft and ground system designs. The resulting spacecraft/ground system design is an example of how new ideas and philosophies can be incorporated into existing systems and design approaches without requiring significant rework. In addition, it shows that the CCSDS telemetry standards can be successfully applied to deep-space planetary spacecraft.

  1. The handling, hazards, and maintenance of heavy liquids in the geologic laboratory

    USGS Publications Warehouse

    Hauff, Phoebe L.; Airey, Joseph

    1980-01-01

    In geologic laboratories the organic heavy liquids bromoform, methylene iodide, tetrabromoethane, and Clerici compounds have been used for years in mineral separation processes. Because the volume of use of these compounds is low, insufficient data are available on their toxic properties. This report is an attempt to summarize the known data from published and industry sources. The physical properties, hazards of handling, proper storage facilities, and adequate protective clothing are discussed for each compound as well as for their common and less-common solvents. Toxicity data for these materials are listed along with exposure symptoms and suggested first aid treatments. Safety for the worker is emphasized. Three reclamation methods which recover the solvent used as a dilutant and purify the heavy liquid are discussed and illustrated. These include the water cascade, refluxing-distillation-condensation, and flash evaporation methods. Various techniques for restoration and stabilization of these heavy liquids are also included.

  2. Fiber tracking of brain white matter based on graph theory.

    PubMed

    Lu, Meng

    2015-01-01

    Brain white matter tractography is reconstructed from diffusion-weighted magnetic resonance images. Due to the complex structure of brain white matter fiber bundles, fiber crossing and fiber branching are abundant in the human brain, and regular methods based on diffusion tensor imaging (DTI) cannot accurately handle this problem, one of the biggest in brain tractography. Therefore, this paper presents a novel brain white matter tractography method based on graph theory, in which the fiber tracking between two voxels is transformed into locating the shortest path in a graph. Moreover, the presented method uses Q-ball imaging (QBI) as the source data instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in one voxel using the orientation distribution function (ODF). Experiments showed that the presented method can accurately handle brain white matter fiber crossing and branching, and reconstruct brain tractography in both phantom data and real brain data.
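The record maps fiber tracking onto a shortest-path search in a graph whose nodes are voxels. A standard way to solve that search is Dijkstra's algorithm, sketched here on a toy voxel graph. In a real tractography pipeline the edge weights would be derived from the QBI orientation distribution function; the graph and weights below are arbitrary illustrative values.

```python
import heapq

def dijkstra(graph, start, goal):
    """Return (cost, path) of the cheapest path from start to goal."""
    pq = [(0.0, start, [start])]  # priority queue of (cost so far, node, path)
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []

# Toy voxel adjacency: a low weight stands for strong diffusion coherence
# between neighboring voxels, so the cheapest path follows the likeliest fiber.
graph = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("C", 1.5), ("D", 5.0)],
    "C": [("D", 1.0)],
}
cost, path = dijkstra(graph, "A", "D")
print(cost, path)  # 3.5 ['A', 'B', 'C', 'D']
```

The detour through C wins (cost 3.5) over the direct B-to-D edge (cost 6.0), mirroring how a tract follows locally coherent steps rather than a single weak jump.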

  3. Handling properties of diverse automobiles and correlation with full scale response data. [driver/vehicle response to aerodynamic disturbances

    NASA Technical Reports Server (NTRS)

    Hoh, R. H.; Weir, D. H.

    1973-01-01

    Driver/vehicle response and performance of a variety of vehicles in the presence of aerodynamic disturbances are discussed. Steering control is emphasized. The vehicles include full size station wagon, sedan, compact sedan, van, pickup truck/camper, and wagon towing trailer. Driver/vehicle analyses are used to estimate response and performance. These estimates are correlated with full scale data with test drivers and the results are used to refine the driver/vehicle models, control structure, and loop closure criteria. The analyses and data indicate that the driver adjusts his steering control properties (when he can) to achieve roughly the same level of performance despite vehicle variations. For the more disturbance susceptible vehicles, such as the van, the driver tightens up his control. Other vehicles have handling dynamics which cause him to loosen his control response, even though performance degrades.

  4. Brain white matter fiber estimation and tractography using Q-ball imaging and Bayesian MODEL.

    PubMed

    Lu, Meng

    2015-01-01

    Diffusion tensor imaging (DTI) allows for the non-invasive in vivo mapping of brain tractography. However, fiber bundles have complex structures such as fiber crossings, fiber branchings, and fibers with large curvatures that DTI cannot accurately handle. This study presents a novel brain white matter tractography method using Q-ball imaging (QBI) as the data source instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in a single voxel using an orientation distribution function (ODF). The presented method also uses graph theory to construct a Bayesian model-based graph, so that the fiber tracking between two voxels can be represented as the shortest path in the graph. Our experiments showed that the new method can accurately handle brain white matter fiber crossings and branchings, and reconstruct brain tractography in both phantom data and real brain data.

  5. Microbial biogeography of a university campus.

    PubMed

    Ross, Ashley A; Neufeld, Josh D

    2015-12-01

    Microorganisms are distributed on surfaces within homes, workplaces, and schools, with the potential to impact human health and disease. University campuses represent a unique opportunity to explore the distribution of microorganisms within built environments because of high human population densities, throughput, and variable building usage. For example, the main campus of the University of Waterloo spans four square kilometres, hosts over 40,000 individuals daily, and is comprised of a variety of buildings, including lecture halls, gyms, restaurants, residences, and a daycare. Representative left and right entrance door handles from each of the 65 buildings at the University of Waterloo were swabbed at three time points during an academic term in order to determine if microbial community assemblages coincided with building usage and whether these communities are stable temporally. Across all door handles, the dominant phyla were Proteobacteria, Firmicutes, Actinobacteria, and Bacteroidetes, which comprised 89.0 % of all reads. A total of 713 genera were observed, 16 of which constituted a minimum of 1 % of the 2,458,094 classified and rarefied reads. Archaea were found in low abundance (~0.03 %) but were present on 42.8 % of the door handles on 96 % of buildings across all time points, indicating that they are ubiquitous at very low levels on door handle surfaces. Although inter-handle variability was high, several individual building entrances harbored distinct microbial communities that were consistent over time. The presence of visible environmental debris on a subset of handles was associated with distinct microbial communities (beta diversity), increased richness (alpha diversity), and higher biomass (adenosine 5'-triphosphate; ATP). This study demonstrates highly variable microbial communities associated with frequently contacted door handles on a university campus. Nonetheless, the data also revealed several building-specific and temporally stable bacterial and archaeal community patterns, with a potential impact of accumulated debris, a possible result of low human throughput, on detected microbial communities.

  6. Unleashing the Power of Distributed CPU/GPU Architectures: Massive Astronomical Data Analysis and Visualization Case Study

    NASA Astrophysics Data System (ADS)

    Hassan, A. H.; Fluke, C. J.; Barnes, D. G.

    2012-09-01

    Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single-machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with the goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
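    To illustrate why even a histogram or a min/max becomes nontrivial at these scales, the usual workaround is a single streaming pass that never holds the full dataset in memory. The sketch below is a generic single-machine illustration of that pattern, not the authors' GPU framework; the chunks and bin edges are made up, and a distributed system would run this per node before a final reduction step.

```python
def streaming_stats(chunks, bin_edges):
    """One pass over an out-of-core dataset delivered chunk by chunk.

    Tracks the global min/max and a fixed-bin histogram without ever
    materialising the whole dataset in memory.
    """
    lo, hi = float("inf"), float("-inf")
    counts = [0] * (len(bin_edges) - 1)
    for chunk in chunks:
        for x in chunk:
            lo, hi = min(lo, x), max(hi, x)
            # Linear bin search is fine for a sketch; real code would
            # use a binary search or a vectorised histogram routine.
            for i in range(len(bin_edges) - 1):
                if bin_edges[i] <= x < bin_edges[i + 1]:
                    counts[i] += 1
                    break
    return lo, hi, counts

# Simulate three chunks arriving from disk or the network.
chunks = [[0.1, 0.4], [0.5, 0.9], [0.2]]
lo, hi, counts = streaming_stats(chunks, [0.0, 0.5, 1.0])
```

    Because each chunk is discarded after it is processed, peak memory is bounded by the chunk size rather than the dataset size, which is the property that matters once data exceed single-machine memory.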

  7. Farmers' use of personal protective equipment during handling of plant protection products: Determinants of implementation.

    PubMed

    Damalas, Christos A; Abdollahzadeh, Gholamhossein

    2016-11-15

    Understanding factors affecting the use of personal protective equipment (PPE) during handling of plant protection products (PPPs) is of major importance for the design of tailored interventions to minimize exposure among farmers. However, data regarding this issue are highly limited. Factors related to the use of PPE during handling of PPPs were explored in a survey of cotton farmers in northern Greece. Data were collected through face-to-face interviews with the farmers based on a questionnaire with structured items on the frequency of use of various personal protective devices during handling of PPPs. New evidence on patterns of PPE use and potential exposure of farmers to PPPs is provided. Most farmers (49.3%) showed potentially unsafe behaviour with respect to PPE use. Hat and boots were the most commonly used protective items during PPP use, but most of the farmers surveyed reported low frequency of use for gloves, goggles, face mask, coveralls, and respirator. The respirator in particular was reported to be the least used PPE item amongst farmers. Farmers who perceived PPPs as harmful substances or those who had an episode of intoxication in the past reported more frequent use of several PPE items. Stepwise multiple regression analysis revealed that the variable episode of intoxication in the past exerted the strongest positive influence on PPE use, followed by the perception of PPPs being hazardous substances, upper secondary education, previous training on PPPs (i.e., spraying equipment, application parameters, risks to human health and environment, safety issues), and farm size under cultivation. Old age exerted a significant negative influence on PPE use; that is, elderly farmers tended not to use PPE. Strategies to maximize the protection of applicators of PPPs from hazardous exposures still require innovation to achieve increased effectiveness. Emphasis on lifelong training and education of farmers about hazards and risks of PPPs is crucial for changing wrong behaviours in handling of PPPs. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. The HCMM system: Development and performance

    NASA Technical Reports Server (NTRS)

    Stuart, L. M., Jr.

    1982-01-01

    The structure and history of the heat capacity mapping mission program is reviewed and the spacecraft is described including engineering specifications, instrument design, data handling, and image characteristics.

  9. The natural angle between the hand and handle and the effect of handle orientation on wrist radial/ulnar deviation during maximal push exertions.

    PubMed

    Young, Justin G; Lin, Jia-Hua; Chang, Chien-Chi; McGorry, Raymond W

    2013-01-01

    The purpose of this experiment was to quantify the natural angle between the hand and a handle, and to investigate the effect of three design factors: handle rotation, handle tilt, and between-handle width on the natural angle as well as the resultant wrist radial/ulnar deviation ('RUD') for pushing tasks. Photographs taken of the right upper limb of 31 participants (14 women and 17 men) performing maximal seated push exertions on different handles were analysed. Natural hand/handle angle and RUD were assessed. All three design factors significantly affected the natural handle angle and wrist RUD, but participant gender did not. The natural angle between the hand and the cylindrical handle was 65 ± 7°. Wrist deviation was reduced for handles that were rotated 0° (horizontal) and at the narrow width (31 cm). Handles that were tilted forward 15° reduced radial deviation consistently (12-13°) across handle conditions. Manual materials handling (MMH) tasks involving pushing have been related to increased risk of musculoskeletal injury. This study shows that handle orientation influences hand and wrist posture during pushing, and suggests that the design of push handles on carts and other MMH aids can be improved by adjusting their orientation to fit the natural interface between the hand and handle.

  10. Expanding the use of Scientific Data through Maps and Apps

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Zimble, D. A.; Herring, D.; Halpert, M.

    2014-12-01

    The importance of making scientific data more available can't be overstated. There is a wealth of useful scientific data available, and demand for these data is only increasing; however, applying scientific data towards practical uses poses several technical challenges. These challenges can arise from difficulty in handling the data, due largely to 1) the complexity, variety, and volume of scientific data and 2) applying and operating the techniques and tools needed to visualize and analyze the data. As a result, the combined knowledge required to take advantage of these data demands highly specialized skill sets that, in total, prevent scientific data from being used in more practical day-to-day decision-making activities. While these challenges are daunting, information technologies do exist that can help mitigate some of these issues. Many organizations have for years enjoyed the benefits of modern service-oriented architectures (SOAs) for everyday enterprise tasks. We can use this approach to modernize how we share and access our scientific data, where much of the specialized tooling needed to handle and present scientific data can be automated and executed by servers in an appropriate way. We will discuss and show an approach for preparing file-based scientific data (e.g. GRIB, netCDF) for use in standards-based scientific web services. These scientific web services are able to encapsulate the logic needed to handle and describe scientific data through a variety of service types, including image, map, feature, and geoprocessing services and their respective service methods. By combining these types of services and leveraging well-documented, modern web development APIs, we can focus our attention on the design and development of user-friendly maps and apps. Our scenario will include developing online maps through these services by integrating various forecast data from the Climate Forecast System (CFSv2). This presentation showcases a collaboration between the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov portal, the Climate Prediction Center, and Esri, Inc. on the implementation of the ArcGIS platform, which is aimed at helping modernize scientific data access through a service-oriented architecture.

  11. Examples of EOS Variables as compared to the UMM-Var Data Model

    NASA Technical Reports Server (NTRS)

    Cantrell, Simon; Lynnes, Chris

    2016-01-01

    In effort to provide EOSDIS clients a way to discover and use variable data from different providers, a Unified Metadata Model for Variables is being created. This presentation gives an overview of the model and use cases we are handling.

  12. America's freight transportation gateways : connecting our nation to places and markets abroad : [2009

    DOT National Transportation Integrated Search

    2009-11-01

    This report ranks freight gateways by the value of merchandise trade they handle. Value data were compiled from multiple sources, allowing comparison of all the freight modes. See box 2 for a detailed description of the freight data sources. Th...

  13. Combining partially ranked data in plant breeding and biology: II. Analysis with Rasch model.

    USDA-ARS?s Scientific Manuscript database

    Many years of breeding experiments, germplasm screening, and molecular biologic experimentation have generated volumes of sequence, genotype, and phenotype information that have been stored in public data repositories. These resources afford genetic and genomic researchers the opportunity to handle ...

  14. Lidar In-space Technology Experiment (LITE) Electronics Overview

    NASA Technical Reports Server (NTRS)

    Blythe, Michael P.; Couch, Richard H.; Rowland, Carroll W.; Kitchen, Wayne L.; Regan, Curtis P.; Koch, Michael R.; Antill, Charles W.; Stevens, William T.; Rollins, Courtney H.; Kist, Edward H.

    1992-01-01

    The LITE electronics system consists of the following seven subsystems: Laser Transmitter Module (LTM), Boresight Assembly (BA), Aft-Optics Electronics (AOE), Digital Data Handling Unit (DDHU), Engineering Data System (EDS), Instrument Controller (IC), and the Ground Support Equipment (GSE). Each of these subsystems is discussed.

  15. Prototype smart phone application to report water quality conditions.

    EPA Science Inventory

    The EPA Pathfinder Innovation Project has identified that environmental managers are typically limited in their time and ability to use and handle satellite remote sensing data due to the file size and complexity in the data structures. Therefore this project developed the Mobil...

  16. 21 CFR 58.81 - Standard operating procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... quality and integrity of the data generated in the course of a study. All deviations in a study from... data. Significant changes in established standard operating procedures shall be properly authorized in... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...

  17. Determination of heat capacity of ionic liquid based nanofluids using group method of data handling technique

    NASA Astrophysics Data System (ADS)

    Sadi, Maryam

    2018-01-01

    In this study a group method of data handling (GMDH) model has been successfully developed to predict the heat capacity of ionic liquid based nanofluids, using the reduced temperature, acentric factor, and molecular weight of the ionic liquids, together with the nanoparticle concentration, as input parameters. To accomplish the modeling, 528 experimental data points extracted from the literature were divided into training and testing subsets. The training set was used to estimate the model coefficients, and the testing set was applied for model validation. The ability and accuracy of the developed model were evaluated by comparing the model predictions with experimental values using different statistical parameters, such as the coefficient of determination, mean square error, and mean absolute percentage error. The mean absolute percentage errors of the developed model for the training and testing sets are 1.38% and 1.66%, respectively, which indicates excellent agreement between the model predictions and the experimental data. The results estimated by the developed GMDH model also exhibit higher accuracy than the available theoretical correlations.
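    The mean absolute percentage error quoted above is a standard fit statistic. A minimal sketch of its computation, using made-up heat-capacity values rather than the paper's data, is:

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent.

    actual, predicted: equal-length sequences of nonzero measurements.
    """
    assert len(actual) == len(predicted)
    total = sum(abs((a - p) / a) for a, p in zip(actual, predicted))
    return 100.0 * total / len(actual)

# Hypothetical heat-capacity measurements vs. model predictions,
# each prediction off by 2% of the measured value.
errors = mape([1.50, 2.00, 2.50], [1.47, 2.04, 2.45])
```

    Each relative error here is 2%, so the MAPE is 2.0%; the paper's 1.38% and 1.66% values are the same statistic averaged over its training and testing subsets.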

  18. WMAP C&DH Software

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan; Leath, Tim; Ferrer, Art; Miller, Todd; Walters, Mark; Savadkin, Bruce; Wu, Ji-Wei; Slegel, Steve; Stagmer, Emory

    2007-01-01

    The command-and-data-handling (C&DH) software of the Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft functions as the sole interface between (1) the spacecraft and its instrument subsystem and (2) ground operations equipment. This software includes a command-decoding and -distribution system, a telemetry/data-handling system, and a data-storage-and-playback system. This software performs onboard processing of attitude sensor data and generates commands for attitude-control actuators in a closed-loop fashion. It also processes stored commands and monitors health and safety functions for the spacecraft and its instrument subsystems. The basic functionality of this software is the same as that of the older C&DH software of the Rossi X-Ray Timing Explorer (RXTE) spacecraft, the main difference being the addition of the attitude-control functionality. Previously, the C&DH and attitude-control computations were performed by different processors because a single RXTE processor did not have enough processing power. The WMAP spacecraft includes a more powerful processor capable of performing both computations.

  19. Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messer, Bronson; Sewell, Christopher; Heitmann, Katrin

    2015-01-01

    Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.

  20. 7 CFR 925.29 - Duties.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE GRAPES GROWN IN A DESIGNATED AREA OF... investigate and assemble data on the growing, handling, and marketing conditions with respect to grapes; (i...
