Sample records for instrument database system

  1. A Database Management Assessment Instrument

    ERIC Educational Resources Information Center

    Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr.

    2013-01-01

    This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…

  2. Advanced instrumentation: Technology database enhancement, volume 4, appendix G

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The purpose of this task was to add to the McDonnell Douglas Space Systems Company's Sensors Database by providing additional information on the instruments and sensors applicable to physical/chemical Environmental Control and Life Support Systems (P/C ECLSS) or Closed Ecological Life Support Systems (CELSS) that were not previously included. The Sensors Database was reviewed in order to determine the types of data required, define the data categories, and develop an understanding of the data record structure. An assessment of the MDSSC Sensors Database identified limitations and problems in the database. Guidelines and solutions were developed to address these limitations and problems so that the requirements of the task could be fulfilled.

  3. 48 CFR 5.601 - Governmentwide database of contracts.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Governmentwide database of... database of contracts. (a) A Governmentwide database of contracts and other procurement instruments.../contractdirectory/. This searchable database is a tool that may be used to identify existing contracts and other...

  4. 48 CFR 5.601 - Governmentwide database of contracts.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Governmentwide database of... database of contracts. (a) A Governmentwide database of contracts and other procurement instruments.../contractdirectory/. This searchable database is a tool that may be used to identify existing contracts and other...

  5. 48 CFR 5.601 - Governmentwide database of contracts.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Governmentwide database of... database of contracts. (a) A Governmentwide database of contracts and other procurement instruments.../contractdirectory/. This searchable database is a tool that may be used to identify existing contracts and other...

  6. 48 CFR 5.601 - Governmentwide database of contracts.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Governmentwide database of... database of contracts. (a) A Governmentwide database of contracts and other procurement instruments.../contractdirectory/. This searchable database is a tool that may be used to identify existing contracts and other...

  7. Integration of Web-based and PC-based clinical research databases.

    PubMed

    Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M

    2004-01-01

    We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.
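
    A minimal Python sketch of the conflict-resolution idea described above: table- and attribute-level name conflicts are resolved through an explicit mapping into a common semantic model. All table and column names here are hypothetical, since the abstract does not give GRIL's actual schema.

      # Common semantic model: canonical table and columns (hypothetical).
      COMMON_MODEL = {"instrument": ["instrument_id", "name", "keywords", "applicability"]}

      # Per-source mappings resolve table-level and attribute-level name conflicts.
      SOURCE_MAPPINGS = {
          "web_repository": {
              "table": "measure_instruments",           # table-level name conflict
              "columns": {"inst_id": "instrument_id",   # attribute-level renames
                          "title": "name",
                          "tags": "keywords",
                          "domain": "applicability"},
          },
          "pc_database": {
              "table": "instruments",
              "columns": {"id": "instrument_id",
                          "instr_name": "name",
                          "keyword_list": "keywords",
                          "area": "applicability"},
          },
      }

      def to_common(source, record):
          """Translate one source record into the common semantic model."""
          mapping = SOURCE_MAPPINGS[source]["columns"]
          return {mapping[k]: v for k, v in record.items() if k in mapping}

      print(to_common("pc_database", {"id": 7, "instr_name": "MMSE", "area": "cognition"}))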

  8. [Design and implementation of medical instrument standard information retrieval system based on ASP.NET].

    PubMed

    Yu, Kaijun

    2010-07-01

    This paper analyzes the design goals of a medical instrumentation standard information retrieval system. Based on the B/S structure, we established a medical instrumentation standard retrieval system in the .NET environment, with the ASP.NET C# programming language, the IIS Web server, and a SQL Server 2000 database. The paper also introduces the system structure, the retrieval system modules, the system development environment, and the detailed design of the system.

  9. Team X Spacecraft Instrument Database Consolidation

    NASA Technical Reports Server (NTRS)

    Wallenstein, Kelly A.

    2005-01-01

    In the past decade, many changes have been made to Team X's process of designing each spacecraft, with the purpose of making the overall procedure more efficient over time. One such improvement is the use of information databases from previous missions, designs, and research. By referring to these databases, members of the design team can locate relevant instrument data and significantly reduce the total time they spend on each design. The files in these databases were stored in several different formats with various levels of accuracy. During the past 2 months, efforts have been made in an attempt to combine and organize these files. The main focus was in the Instruments department, where spacecraft subsystems are designed based on mission measurement requirements. A common database was developed for all instrument parameters using Microsoft Excel to minimize the time and confusion experienced when searching through files stored in several different formats and locations. By making this collection of information more organized, the files within them have become more easily searchable. Additionally, the new Excel database offers the option of importing its contents into a more efficient database management system in the future. This potential for expansion enables the database to grow and acquire more search features as needed.
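
    The closing remark about importing the Excel contents into a database management system can be illustrated concretely. Below is a minimal Python sketch using pandas and sqlite3; the workbook name and column names are hypothetical.

      import sqlite3
      import pandas as pd

      # Read the consolidated instrument workbook (requires openpyxl for .xlsx).
      df = pd.read_excel("instruments.xlsx")

      # Load it into a relational database so searches become SQL queries.
      conn = sqlite3.connect("instruments.db")
      df.to_sql("instruments", conn, if_exists="replace", index=False)

      # Example search that would be manual in a spreadsheet:
      rows = conn.execute(
          "SELECT name, mass_kg, power_w FROM instruments WHERE mass_kg < ?",
          (10,)).fetchall()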

  10. Ground Support Software for Spaceborne Instrumentation

    NASA Technical Reports Server (NTRS)

    Anicich, Vincent; Thorpe, Rob; Fletcher, Greg; Waite, Hunter; Xu, Hykua; Walter, Erin; Frick, Kristie; Farris, Greg; Gell, Dave; Furman, Judy

    2004-01-01

    ION is a system of ground support software for the ion and neutral mass spectrometer (INMS) instrument aboard the Cassini spacecraft. By incorporating commercial off-the-shelf database, Web server, and Java application components, ION offers considerably more ground-support-service capability than was available previously. A member of the team that operates the INMS or a scientist who uses the data collected by the INMS can gain access to most of the services provided by ION via a standard point-and-click hyperlink interface generated by almost any Web-browser program running in almost any operating system on almost any computer. Data are stored in one central location in a relational database in a non-proprietary format, are accessible in many combinations and formats, and can be combined with data from other instruments and spacecraft. The use of the Java programming language as a system-interface language offers numerous capabilities for object-oriented programming and for making the database accessible to participants using a variety of computer hardware and software.

  11. ISSARS Aerosol Database : an Incorporation of Atmospheric Particles into a Universal Tool to Simulate Remote Sensing Instruments

    NASA Technical Reports Server (NTRS)

    Goetz, Michael B.

    2011-01-01

    The Instrument Simulator Suite for Atmospheric Remote Sensing (ISSARS) entered its third and final year of development with an overall goal of providing a unified tool to simulate active and passive spaceborne atmospheric remote sensing instruments. These simulations cover the atmosphere at wavelengths ranging from the UV to microwaves. ISSARS handles all assumptions and uses various models of scattering and microphysics to fill the gaps left unspecified by the atmospheric models to create each instrument's measurements. This will benefit mission design and reduce mission cost, enable efficient implementation of multi-instrument/platform Observing System Simulation Experiments (OSSE), and improve existing models as well as new advanced models in development. In this effort, various aerosol particles are incorporated into the system, and a simulation of input wavelength and spectral refractive indices related to each spherical test particle generates its scattering properties and phase functions. The atmospheric particles being integrated into the system comprise those observed by the Multi-angle Imaging SpectroRadiometer (MISR) and by the Multiangle SpectroPolarimetric Imager (MSPI). In addition, a complex scattering database generated by Prof. Ping Yang (Texas A&M) is also incorporated into this aerosol database. Future development with a radiative transfer code will generate a series of results that can be validated against results obtained by the MISR and MSPI instruments; in the meantime, test cases are simulated to determine the validity of various plugin libraries used to determine or gather the scattering properties of particles studied by MISR and MSPI, or within the single-scattering properties database of tri-axial ellipsoidal mineral dust particles created by Prof. Ping Yang.

  12. Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT

    NASA Astrophysics Data System (ADS)

    Livschitz, Yakov; Munro, Rosemary; Lang, Rüdiger; Fiedler, Lars; Dyer, Richard; Eisinger, Michael

    2010-05-01

    The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms like EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system which is flexible and robust enough to identify and react to anomalies which may be previously unknown to the system, and to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for monitoring of GOME-2 and IASI level 1 data permits near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is used effectively for day-to-day monitoring, long-term reporting, and instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms, or numerical weather prediction systems, within the same database allows effective cross-comparison and searches for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, for data losses, and for other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.
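
    The automatic-alarm mechanism lends itself to a compact illustration. The Python sketch below flags a monitored parameter whose latest value deviates strongly from its recent history; the threshold and values are illustrative, not EUMETSAT's actual configuration.

      import statistics

      def check_deviation(history, latest, n_sigma=3.0):
          """Return an alarm string when the latest value deviates from recent history."""
          mean = statistics.fmean(history)
          sigma = statistics.stdev(history)
          if sigma and abs(latest - mean) > n_sigma * sigma:
              return f"ALARM: deviation of {(latest - mean) / sigma:+.1f} sigma"
          return None

      history = [101.2, 100.9, 101.0, 101.3, 101.1]   # recent parameter values
      print(check_deviation(history, 108.4))          # triggers an alarm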

  13. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general-purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in an experiment or monitoring station of any size, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted by our continuously growing parsing application. Multiple databases can be used to separate different data-taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data, and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running Python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy, ready-to-use packages that can be installed on any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
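
    A minimal sketch of such a locally running parsing agent, assuming the third-party requests package; the server endpoint, log file name, and payload format are hypothetical stand-ins for the real agent protocol.

      import time
      import requests

      SERVER = "http://main-server.local/api/readings"   # hypothetical endpoint
      LOGFILE = "instrument_42.log"                      # hypothetical output file

      def follow(path, interval_s=0.01):
          """Yield new lines appended to an instrument's output file."""
          with open(path) as fh:
              fh.seek(0, 2)                              # start at end of file
              while True:
                  line = fh.readline()
                  if line:
                      yield line.rstrip("\n")
                  else:
                      time.sleep(interval_s)             # millisecond-scale polling

      for raw in follow(LOGFILE):
          requests.post(SERVER, json={"instrument": 42, "raw": raw}, timeout=5)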

  14. Instrument Failures for the da Vinci Surgical System: a Food and Drug Administration MAUDE Database Study.

    PubMed

    Friedman, Diana C W; Lendvay, Thomas S; Hannaford, Blake

    2013-05-01

    Our goal was to analyze reported instances of the da Vinci robotic surgical system instrument failures using the FDA's MAUDE (Manufacturer and User Facility Device Experience) database. From these data we identified some root causes of failures as well as trends that may assist surgeons and users of the robotic technology. We conducted a survey of the MAUDE database and tallied robotic instrument failures that occurred between January 2009 and December 2010. We categorized failures into five main groups (cautery, shaft, wrist or tool tip, cable, and control housing) based on technical differences in instrument design and function. A total of 565 instrument failures were documented through 528 reports. The majority of failures (285) were of the instrument's wrist or tool tip. Cautery problems comprised 174 failures, 76 were shaft failures, 29 were cable failures, and 7 were control housing failures. Of the reports, 10 had no discernible failure mode and 49 exhibited multiple failures. The data show that a number of robotic instrument failures occurred in a short period of time. In reality, many instrument failures may go unreported, thus a true failure rate cannot be determined from these data. However, education of hospital administrators, operating room staff, surgeons, and patients should be incorporated into discussions regarding the introduction and utilization of robotic technology. We recommend institutions incorporate standard failure reporting policies so that the community of robotic surgery companies and surgeons can improve on existing technologies for optimal patient safety and outcomes.
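
    The tallying step can be reproduced in a few lines of Python; the report lists below are illustrative stand-ins, not the study's actual MAUDE records.

      from collections import Counter

      # Each report reduced to its failure-category labels (illustrative data).
      reports = [
          ["wrist_or_tip"], ["cautery"], ["shaft"],
          ["wrist_or_tip", "cable"],        # a report with multiple failures
          [],                               # no discernible failure mode
      ]

      failures = Counter(label for report in reports for label in report)
      multiple = sum(1 for r in reports if len(r) > 1)
      no_mode = sum(1 for r in reports if not r)
      print(failures.most_common(), f"multiple={multiple}", f"no_mode={no_mode}")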

  15. Denver International Airport sensor processing and database

    DOT National Transportation Integrated Search

    2000-03-01

    Data processing and database design is described for an instrumentation system installed on runway 34R at Denver International Airport (DIA). Static (low-speed) and dynamic (high-speed) sensors are installed in the pavement. The static sensors includ...

  16. Assessing barriers to health insurance and threats to equity in comparative perspective: The Health Insurance Access Database

    PubMed Central

    2012-01-01

    Background Typologies traditionally used for international comparisons of health systems often conflate many system characteristics. To capture policy changes over time and by service in health systems regulation of public and private insurance, we propose a database containing explicit, standardized indicators of policy instruments. Methods The Health Insurance Access Database (HIAD) will collect policy information for ten OECD countries, over a range of eight health services, from 1990–2010. Policy indicators were selected through a comprehensive literature review which identified policy instruments most likely to constitute barriers to health insurance, thus potentially posing a threat to equity. As data collection is still underway, we present here the theoretical bases and methodology adopted, with a focus on the rationale underpinning the study instruments. Results These harmonized data will allow the capture of policy changes in health systems regulation of public and private insurance over time and by service. The standardization process will permit international comparisons of systems’ performance with regards to health insurance access and equity. Conclusion This research will inform and feed the current debate on the future of health care in developed countries and on the role of the private sector in these changes. PMID:22551599

  17. Unified Planetary Coordinates System: A Searchable Database of Geodetic Information

    NASA Technical Reports Server (NTRS)

    Becker, K. J.; Gaddis, L. R.; Soderblom, L. A.; Kirk, R. L.; Archinal, B. A.; Johnson, J. R.; Anderson, J. A.; Bowman-Cisneros, E.; LaVoie, S.; McAuley, M.

    2005-01-01

    Over the past 40 years, an enormous quantity of orbital remote sensing data has been collected for Mars from many missions and instruments. Unfortunately these datasets currently exist in a wide range of disparate coordinate systems, making it extremely difficult for the scientific community to easily correlate, combine, and compare data from different Mars missions and instruments. As part of our work for the PDS Imaging Node and on behalf of the USGS Astrogeology Team, we are working to solve this problem and to provide the NASA scientific research community with easy access to Mars orbital data in a unified, consistent coordinate system along with a wide variety of other key geometric variables. The Unified Planetary Coordinates (UPC) system comprises two main elements: (1) a database containing Mars orbital remote sensing data computed using a uniform coordinate system, and (2) a process by which continual maintenance and updates to the contents of the database are performed.

  18. Digital Education Governance: Data Visualization, Predictive Analytics, and "Real-Time" Policy Instruments

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    Educational institutions and governing practices are increasingly augmented with digital database technologies that function as new kinds of policy instruments. This article surveys and maps the landscape of digital policy instrumentation in education and provides two detailed case studies of new digital data systems. The Learning Curve is a…

  19. EOSCUBE: A Constraint Database System for High-Level Specification and Efficient Generation of EOSDIS Products. Phase 1; Proof-of-Concept

    NASA Technical Reports Server (NTRS)

    Brodsky, Alexander; Segal, Victor E.

    1999-01-01

    The EOSCUBE constraint database system is designed to be a software productivity tool for high-level specification and efficient generation of EOSDIS and other scientific products. These products are typically derived from large volumes of multidimensional data which are collected via a range of scientific instruments.

  20. ECLSS Integration Analysis: Advanced ECLSS Subsystem and Instrumentation Technology Study for the Space Exploration Initiative

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In his July 1989 space policy speech, President Bush proposed a long range continuing commitment to space exploration and development. Included in his goals were the establishment of permanent lunar and Mars habitats and the development of extended duration space transportation. In both cases, a major issue is the availability of qualified sensor technologies for use in real-time monitoring and control of integrated physical/chemical/biological (p/c/b) Environmental Control and Life Support Systems (ECLSS). The purpose of this study is to determine the most promising instrumentation technologies for future ECLSS applications. The study approach is as follows: 1. Precursor ECLSS Subsystem Technology Trade Study - A database of existing and advanced Atmosphere Revitalization (AR) and Water Recovery and Management (WRM) ECLSS subsystem technologies was created. A trade study was performed to recommend AR and WRM subsystem technologies for future lunar and Mars mission scenarios. The purpose of this trade study was to begin defining future ECLSS instrumentation requirements as a precursor to determining the instrumentation technologies that will be applicable to future ECLS systems. 2. Instrumentation Survey - An instrumentation database of Chemical, Microbial, Conductivity, Humidity, Flowrate, Pressure, and Temperature sensors was created. Each page of the sensor database report contains information for one type of sensor, including a description of the operating principles, specifications, and the reference(s) from which the information was obtained. This section includes a cursory look at the history of instrumentation on U.S. spacecraft. 3. Results and Recommendations - Instrumentation technologies were recommended for further research and optimization based on a consideration of both of the above sections. A sensor or monitor technology was recommended based on its applicability to future ECLS systems, as defined by the ECLSS Trade Study (1), and on whether its characteristics were considered favorable relative to similar instrumentation technologies (competitors), as determined from the Instrumentation Survey (2). The instrumentation technologies recommended by this study show considerable potential for development and promise significant returns if research efforts are invested.

  1. The Lifeways Cross-Generation Study: design, recruitment and data management considerations.

    PubMed

    O'Mahony, D; Fallon, U B; Hannon, F; Kloeckner, K; Avalos, G; Murphy, A W; Kelleher, C C

    2007-09-01

    The Lifeways Cross-Generation Cohort Study was first established in 2001 and is a unique longitudinal database in Ireland, with currently over three and a half thousand family participants derived from 1124 mothers recruited initially during pregnancy, mainly during 2002. The database comprises a) baseline self-reported health data for all mothers, a third of fathers, and at least one grandparent, b) clinical hospital data at recruitment, c) three-year follow-up data from the families' General Practitioners, and d) linkage to hospital and vaccination databases. Data collection for the five-year follow-up with parents is underway, continuing through 2007. Because there is at present no single national/regional health information system in Ireland, original data instruments were designed to capture data directly from family members and through their hospitals and healthcare providers. A system of relational databases was designed to coordinate data capture for a complex array of study instruments and to facilitate tracking of family members at different time points.

  2. School Performance Feedback Systems in the USA and in the Netherlands: A Comparison

    ERIC Educational Resources Information Center

    Schildkamp, Kim; Teddlie, Charles

    2008-01-01

    Schools around the world are using instruments for performance feedback, but there is no scientific evidence that they have positive effects on education. This paper compares a School Performance Feedback System (SPFS) used in the USA as an accountability instrument to an SPFS used in The Netherlands. The study employs a unique database: one in…

  3. Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT

    NASA Astrophysics Data System (ADS)

    Livschitz, Y.; Munro, R.; Lang, R.; Fiedler, L.; Dyer, R.; Eisinger, M.

    2009-12-01

    The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms like EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system which is flexible and robust enough to identify and react to anomalies which may be previously unknown to the system, and to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for monitoring of GOME-2 and IASI level 1 data permits near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is used effectively for day-to-day monitoring, long-term reporting, and instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms, or numerical weather prediction systems, within the same database allows effective cross-comparison and searches for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, for data losses, and for other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.

  4. Decision Support for Emergency Operations Centers

    NASA Technical Reports Server (NTRS)

    Harvey, Craig; Lawhead, Joel; Watts, Zack

    2005-01-01

    The Flood Disaster Mitigation Decision Support System (DSS) is a computerized information system that allows regional emergency-operations government officials to make decisions regarding the dispatch of resources in response to flooding. The DSS implements a real-time model of inundation utilizing recently acquired lidar elevation data as well as real-time data from flood gauges and other instruments within and upstream of an area that is or could become flooded. The DSS information is updated as new data become available. The model generates real-time maps of flooded areas and predicts flood crests at specified locations. The inundation maps are overlaid with information on population densities, property values, hazardous materials, evacuation routes, official contact information, and other information needed for emergency response. The program maintains a database and a Web portal through which real-time data from instrumentation are gathered into the database. Also included in the database is a geographic information system, from which the program obtains the overlay data for areas of interest as needed. The portal makes some portions of the database accessible to the public. Access to other portions of the database is restricted to government officials according to various levels of authorization. The Flood Disaster Mitigation DSS has been integrated into a larger DSS named REACT (Real-time Emergency Action Coordination Tool), which also provides emergency operations managers with data for any type of impact area such as floods, fires, bomb

  5. New DMSP database of precipitating auroral electrons and ions

    NASA Astrophysics Data System (ADS)

    Redmon, Robert J.; Denig, William F.; Kilcommons, Liam M.; Knipp, Delores J.

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so have the measurement capabilities such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) to F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information and the Coordinated Data Analysis Web. We describe how the new database is being applied to high-latitude studies of the colocation of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.

  6. The HST/WFC3 Quicklook Project: A User Interface to Hubble Space Telescope Wide Field Camera 3 Data

    NASA Astrophysics Data System (ADS)

    Bourque, Matthew; Bajaj, Varun; Bowers, Ariel; Dulude, Michael; Durbin, Meredith; Gosmeyer, Catherine; Gunning, Heather; Khandrika, Harish; Martlin, Catherine; Sunnquist, Ben; Viana, Alex

    2017-06-01

    The Hubble Space Telescope's Wide Field Camera 3 (WFC3) instrument, comprised of two detectors, UVIS (Ultraviolet-Visible) and IR (Infrared), has been acquiring ~ 50-100 images daily since its installation in 2009. The WFC3 Quicklook project provides a means for instrument analysts to store, calibrate, monitor, and interact with these data through the various Quicklook systems: (1) a ~ 175 TB filesystem, which stores the entire WFC3 archive on disk, (2) a MySQL database, which stores image header data, (3) a Python-based automation platform, which currently executes 22 unique calibration/monitoring scripts, (4) a Python-based code library, which provides system functionality such as logging, downloading tools, database connection objects, and filesystem management, and (5) a Python/Flask-based web interface to the Quicklook system. The Quicklook project has enabled large-scale WFC3 analyses and calibrations, such as the monitoring of the health and stability of the WFC3 instrument, the measurement of ~ 20 million WFC3/UVIS Point Spread Functions (PSFs), the creation of WFC3/IR persistence calibration products, and many others.
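
    A minimal sketch of the header-ingest step (2), using astropy and sqlite3 in place of the project's MySQL setup; the file name is hypothetical and the keyword selection illustrative.

      import sqlite3
      from astropy.io import fits

      conn = sqlite3.connect("quicklook.db")
      conn.execute("""CREATE TABLE IF NOT EXISTS headers
                      (filename TEXT, detector TEXT, filter TEXT, exptime REAL)""")

      filename = "icqv02aeq_flt.fits"                    # hypothetical WFC3 file
      with fits.open(filename) as hdulist:
          hdr = hdulist[0].header
          conn.execute("INSERT INTO headers VALUES (?, ?, ?, ?)",
                       (filename, hdr.get("DETECTOR"), hdr.get("FILTER"),
                        hdr.get("EXPTIME")))
      conn.commit()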

  7. Footprint Representation of Planetary Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Walter, S. H. G.; Gasselt, S. V.; Michael, G.; Neukum, G.

    The geometric outline of remote sensing image data, the so-called footprint, can be represented as a number of coordinate tuples. These polygons are associated with corresponding attribute information such as orbit name, ground and image resolution, solar longitude, and illumination conditions to form a powerful base for classification of planetary experiment data. Speed, handling, and extended capabilities are the reasons for using geodatabases to store and access these data types. Techniques for such a spatial database of footprint data are demonstrated using the Relational Database Management System (RDBMS) PostgreSQL, spatially enabled by the PostGIS extension. As an example, footprints of the HRSC and OMEGA instruments, both onboard ESA's Mars Express Orbiter, are generated and connected to attribute information. The aim is to provide high-resolution footprints of the OMEGA instrument to the science community for the first time and make them available for web-based mapping applications like the "Planetary Interactive GIS-on-the-Web Analyzable Database" (PIGWAD), produced by the USGS. Map overlays with HRSC or other instruments like MOC and THEMIS (footprint maps are already available for these instruments and can be integrated into the database) allow on-the-fly intersection and comparison as well as extended statistics of the data. Footprint polygons are generated one by one using standard software provided by the instrument teams. Attribute data is calculated and stored together with the geometric information. In the case of HRSC, the coordinates of the footprints are already available in the VICAR label of each image file. Using the VICAR RTL and PostgreSQL's libpq C library, they are loaded into the database using the Well-Known Text (WKT) notation of the Open Geospatial Consortium, Inc. (OGC). For the OMEGA instrument, image data is read using IDL routines developed and distributed by the OMEGA team. Image outlines are exported together with relevant attribute data to the industry-standard Shapefile format. These files are translated to a Structured Query Language (SQL) command sequence suitable for insertion into the PostGIS/PostgreSQL database using the shp2pgsql data loader provided by the PostGIS software. PostgreSQL's advanced features such as geometry types, rules, operators, and functions allow complex spatial queries and on-the-fly processing of data at the DBMS level, e.g., generalisation of the outlines. Processing done by the DBMS, visualisation via GIS systems, and utilisation for web-based applications like map servers will be demonstrated.
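
    A minimal Python sketch of the load-and-intersect workflow described above, using psycopg2 against a PostGIS-enabled PostgreSQL database; the connection string, table names, and columns are hypothetical.

      import psycopg2

      conn = psycopg2.connect("dbname=footprints user=gis")
      cur = conn.cursor()

      # Insert one footprint polygon from its Well-Known Text representation.
      wkt = "POLYGON((120.1 -14.2, 120.9 -14.2, 120.9 -13.5, 120.1 -13.5, 120.1 -14.2))"
      cur.execute("INSERT INTO hrsc_footprints (orbit, outline) "
                  "VALUES (%s, ST_GeomFromText(%s))", ("h0123", wkt))

      # On-the-fly intersection: which OMEGA footprints overlap this HRSC orbit?
      cur.execute("""SELECT o.cube_id FROM omega_footprints o
                     JOIN hrsc_footprints h ON ST_Intersects(o.outline, h.outline)
                     WHERE h.orbit = %s""", ("h0123",))
      print(cur.fetchall())
      conn.commit()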

  8. Advanced Land Imager Assessment System

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim

    2008-01-01

    The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments, and band-to-band alignment, and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery and the lamp data to the database for trending. This allows for multi-scene statistical analyses.

  9. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques obtained over multiple spatial and spectral scales including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.

  10. Databases as policy instruments. About extending networks as evidence-based policy.

    PubMed

    de Bont, Antoinette; Stoevelaar, Herman; Bal, Roland

    2007-12-07

    This article seeks to identify the role of databases in health policy. Access to information and communication technologies has changed traditional relationships between the state and professionals, creating new systems of surveillance and control. As a result, databases may have a profound effect on controlling clinical practice. We conducted three case studies to reconstruct the development and use of databases as policy instruments. Each database was intended to be employed to control the use of one particular pharmaceutical in the Netherlands (growth hormone, antiretroviral drugs for HIV and Taxol, respectively). We studied the archives of the Dutch Health Insurance Board, conducted in-depth interviews with key informants and organized two focus groups, all focused on the use of databases both in policy circles and in clinical practice. Our results demonstrate that policy makers hardly used the databases, neither for cost control nor for quality assurance. Further analysis revealed that these databases facilitated self-regulation and quality assurance by (national) bodies of professionals, resulting in restrictive prescription behavior amongst physicians. The databases fulfill control functions that were formerly located within the policy realm. The databases facilitate collaboration between policy makers and physicians, since they enable quality assurance by professionals. Delegating regulatory authority downwards into a network of physicians who control the use of pharmaceuticals seems to be a good alternative for centralized control on the basis of monitoring data.

  11. Landsat-4 and Landsat-5 thematic mapper band 6 historical performance and calibration

    USGS Publications Warehouse

    Barsi, J.A.; Chander, G.; Markham, B.L.; Higgs, N.; ,

    2005-01-01

    Launched in 1982 and 1984 respectively, the Landsat-4 and -5 Thematic Mappers (TM) are the backbone of an extensive archive of moderate-resolution Earth imagery. However, these sensors and their data products were not subjected to the type of intensive monitoring that has been part of the Landsat-7 system since its launch in 1999. With Landsat-4's 11-year and Landsat-5's 20+ year data records, there is a need to understand the historical behavior of the instruments in order to verify the scientific integrity of the archive and processed products. Performance indicators of the Landsat-4 and -5 thermal bands have recently been extracted from a processing system database, allowing for a more complete study of thermal band characteristics and calibration than was previously possible. The database records responses to the internal calibration system, instrument temperatures, and applied gains and offsets for each band for every scene processed through the National Landsat Archive Production System (NLAPS). Analysis of this database has allowed for greater understanding of the calibration and improvement in the processing system. This paper will cover the trends in the Landsat-4 and -5 thermal bands, the effect of the changes seen in the trends, and how these trends affect the use of the thermal data.

  12. New DMSP Database of Precipitating Auroral Electrons and Ions.

    PubMed

    Redmon, Robert J; Denig, William F; Kilcommons, Liam M; Knipp, Delores J

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so too have the measurement capabilities, such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) through F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information (NCEI) and the Coordinated Data Analysis Web (CDAWeb). We describe how the new database is being applied to high-latitude studies of the co-location of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.

  13. Documentation of Assessment Instrumentation--The NORC/CRESST 12th Grade Science Assessment, Item Databases, and Test Booklets. Project 2.6: Analytic Models To Monitor Status & Progress of Learning & Performance & Their Antecedents: The School Science Assessment Project.

    ERIC Educational Resources Information Center

    Bock, H. Darrell

    The hardware and software system used to create the National Opinion Research Center/Center for Research on Evaluation, Standards, and Student Testing (NORC/CRESST) item databases and test booklets for the 12th-grade science assessment are described. A general description of the capabilities of the system is given, with some specific information…

  14. Ambient Optomechanical Alignment and Pupil Metrology for the Flight Instruments Aboard the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Beaton, Alexander; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hayden, Joseph E.; Hummel, Susann; Hylan, Jason E.; Lee, David; Madison, Timothy J.; Maszkiewicz, Michael

    2014-01-01

    The James Webb Space Telescope science instruments are in the final stages of being integrated into the Integrated Science Instrument Module (ISIM) element. Each instrument is tied into a common coordinate system through mechanical references that are used for optical alignment and metrology within ISIM after element-level assembly. In addition, a set of ground support equipment (GSE) consisting of large, precisely calibrated, ambient, and cryogenic structures is used as alignment references and gauges during various phases of integration and test (I&T). This GSE, the flight instruments, and the ISIM structure feature different types of complementary metrology targeting. These GSE targets are used to establish and track six-degree-of-freedom instrument alignment during I&T in the vehicle coordinate system (VCS). This paper describes the optomechanical metrology conducted during science instrument integration and alignment in the Spacecraft Systems Development and Integration Facility (SSDIF) cleanroom at NASA Goddard Space Flight Center (GSFC). The measurement of each instrument's ambient entrance pupil location in the telescope coordinate system is discussed. The construction of the database of target locations and the development of metrology uncertainties are also discussed.

  15. Instrument response measurements of ion mobility spectrometers in situ: maintaining optimal system performance of fielded systems

    NASA Astrophysics Data System (ADS)

    Wallis, Eric; Griffin, Todd M.; Popkie, Norm, Jr.; Eagan, Michael A.; McAtee, Robert F.; Vrazel, Danet; McKinly, Jim

    2005-05-01

    Ion mobility spectrometry (IMS) is the most widespread detection technique in use by the military for the detection of chemical warfare agents, explosives, and other threat agents. Moreover, its role in homeland security and force protection has expanded due, in part, to its good sensitivity, low power, light weight, and reasonable cost. With the increased use of IMS systems as continuous monitors, it becomes necessary to develop tools and methodologies to ensure optimal performance over a wide range of conditions and extended periods of time. Namely, instrument calibration is needed to ensure proper sensitivity and to correct for matrix or environmental effects. We have developed methodologies to deal with the semi-quantitative nature of IMS that allow us to generate response curves providing a gauge of instrument performance and maintenance requirements. This instrumentation communicates with the IMS systems via a software interface that was developed in-house. The software measures system response, logs information to a database, and generates the response curves. This paper will discuss the instrumentation, software, data collected, and initial results from fielded systems.
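
    By way of illustration, a response curve and drift check built from logged challenge data might look like the Python sketch below; the concentrations, responses, nominal slope, and maintenance limit are all hypothetical.

      import numpy as np

      conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])     # challenge concentrations
      resp = np.array([0.1, 1.6, 3.1, 6.0, 12.1])    # logged IMS responses

      slope, intercept = np.polyfit(conc, resp, 1)   # linear response model
      drift = abs(slope - 4.0) / 4.0                 # vs. a nominal factory slope
      if drift > 0.15:                               # hypothetical maintenance limit
          print(f"Sensitivity drift of {drift:.0%}: schedule maintenance")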

  16. CUNY+ Web: Usability Study of the Web-Based GUI Version of the Bibliographic Database of the City University of New York (CUNY).

    ERIC Educational Resources Information Center

    Oulanov, Alexei; Pajarillo, Edmund J. Y.

    2002-01-01

    Describes the usability evaluation of the CUNY (City University of New York) information system in Web and Graphical User Interface (GUI) versions. Compares results to an earlier usability study of the basic information database available on CUNY's wide-area network and describes the applicability of the previous usability instrument to this…

  17. Advanced instrumentation for next-generation aerospace propulsion control systems

    NASA Technical Reports Server (NTRS)

    Barkhoudarian, S.; Cross, G. S.; Lorenzo, Carl F.

    1993-01-01

    New control concepts for the next generation of advanced air-breathing and rocket engines and hypersonic combined-cycle propulsion systems are analyzed. The analysis provides a database on the instrumentation technologies for advanced control systems and cross-matches the available technologies for each type of engine to the control needs and applications of the other two types of engines. Measurement technologies that are considered to be ready for implementation include optical surface temperature sensors, an isotope wear detector, a brushless torquemeter, a fiberoptic deflectometer, an optical absorption leak detector, the nonintrusive speed sensor, and an ultrasonic triducer. It is concluded that all 30 advanced instrumentation technologies considered can be recommended for further development to meet the needs of the next generation of jet-, rocket-, and hypersonic-engine control systems.

  18. Instruments of scientific visual representation in atomic databases

    NASA Astrophysics Data System (ADS)

    Kazakov, V. V.; Kazakov, V. G.; Meshkov, O. I.

    2017-10-01

    Graphic tools for spectral data representation provided by operating atomic spectroscopy information systems (ASD NIST, VAMDC, SPECTR-W3, and Electronic Structure of Atoms) in support of scientific research and human-resource development are presented. Tools for the visual representation of scientific data, such as spectrogram and Grotrian diagram plotting, are considered. The possibility of comparative analysis of experimentally obtained spectra against reference spectra of atomic systems formed from a resource's database is described. Access techniques for the mentioned graphic tools are presented.

  19. Autonomous mission planning and scheduling: Innovative, integrated, responsive

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Liu, Simon; Hull, Larry; Davis, Randy

    1994-01-01

    Autonomous mission scheduling, a new concept for NASA ground data systems, is a decentralized and distributed approach to scientific spacecraft planning, scheduling, and command management. Systems and services are provided that enable investigators to operate their own instruments. In autonomous mission scheduling, separate nodes exist for each instrument and one or more operations nodes exist for the spacecraft. Each node is responsible for its own operations which include planning, scheduling, and commanding; and for resolving conflicts with other nodes. One or more database servers accessible to all nodes enable each to share mission and science planning, scheduling, and commanding information. The architecture for autonomous mission scheduling is based upon a realistic mix of state-of-the-art and emerging technology and services, e.g., high performance individual workstations, high speed communications, client-server computing, and relational databases. The concept is particularly suited to the smaller, less complex missions of the future.

  20. The Design of Integrated Information System for High Voltage Metering Lab

    NASA Astrophysics Data System (ADS)

    Ma, Yan; Yang, Yi; Xu, Guangke; Gu, Chao; Zou, Lida; Yang, Feng

    2018-01-01

    With the development of the smart grid, intelligent and informatized management of the high-voltage metering lab becomes increasingly urgent. In this paper we design an integrated information system, which automates the whole workflow from accepting instruments, making experiments, generating reports, and signing reports through to instrument claims. By creating a database for all the calibrated instruments, using two-dimensional codes, integrating report templates in advance, establishing bookmarks, and transmitting electronic signatures online, our manual procedures are largely reduced. These techniques simplify the complex process of account management and report transmission. After more than a year of operation, our work efficiency has improved by about forty percent on average, and its accuracy rate and data reliability are much higher as well.
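
    For instance, tagging a calibrated instrument with a two-dimensional code might look like the sketch below, assuming the third-party qrcode package; the record identifier and file name are hypothetical.

      import qrcode

      record_id = "HV-LAB/2018/00042"        # hypothetical instrument record ID
      img = qrcode.make(record_id)           # returns a PIL image of the 2-D code
      img.save("instrument_00042.png")       # printed and affixed to the device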

  1. The evolution of spinal instrumentation for the management of occipital cervical and cervicothoracic junctional injuries.

    PubMed

    Smucker, Joseph D; Sasso, Rick C

    2006-05-15

    Independent computer-based literature review of articles pertaining to instrumentation and fusion of junctional injuries of the cervical spine. To review and discuss the evolution of instrumentation techniques and systems used in the treatment of cervical spine junctional injuries. Instrumentation of junctional injuries of the cervical spine has been limited historically by failure to achieve rigid internal fixation in multiple planes. The evolution of these techniques has required increased insight into the morphology and unique biomechanics of the structures to be instrumented. Computer-based literature search of Ovid and PubMed databases. Extensive literature search yielded insights into the evolution of systems initially based on onlay bone graft combined with wiring techniques. Such techniques have come to include systems incorporating rigid, longitudinal struts that accommodate multiplanar screws placed in the lateral masses, pedicles, transarticular regions, and occipital bone. Despite a rapid evolution of techniques and instrumentation technologies, it remains incumbent on the physician to provide the patient with a surgical procedure that balances the likelihood of a favorable outcome with the risk inherent in the implementation of the procedure.

  2. OOMM--Object-Oriented Matrix Modelling: an instrument for the integration of the Brasilia Regional Health Information System.

    PubMed

    Cammarota, M; Huppes, V; Gaia, S; Degoulet, P

    1998-01-01

    The development of Health Information Systems is widely determined by the establishment of the underlying information models. An Object-Oriented Matrix Model (OOMM) is described whose target is to facilitate the integration of the overall health system. The model is based on information modules named micro-databases that are structured in a three-dimensional network: planning, health structures, and information systems. The modelling tool has been developed as a layer on top of a relational database system. A visual browser facilitates the development and maintenance of the information model. The modelling approach has been applied to the Brasilia University Hospital since 1991. The extension of the modelling approach to the Brasilia regional health system is considered.

  3. Reliability and Validity of Survey Instruments to Measure Work-Related Fatigue in the Emergency Medical Services Setting: A Systematic Review.

    PubMed

    Patterson, P Daniel; Weaver, Matthew D; Fabio, Anthony; Teasley, Ellen M; Renn, Megan L; Curtis, Brett R; Matthews, Margaret E; Kroemer, Andrew J; Xun, Xiaoshuang; Bizhanova, Zhadyra; Weiss, Patricia M; Sequeira, Denisse J; Coppler, Patrick J; Lang, Eddy S; Higgins, J Stephen

    2018-02-15

    This study sought to systematically search the literature to identify reliable and valid survey instruments for fatigue measurement in the Emergency Medical Services (EMS) occupational setting. A systematic review study design was used, searching six databases, including one website. The research question guiding the search was developed a priori and registered with the PROSPERO database of systematic reviews: "Are there reliable and valid instruments for measuring fatigue among EMS personnel?" (2016:CRD42016040097). The primary outcome of interest was criterion-related validity. Important outcomes of interest included reliability (e.g., internal consistency) and indicators of sensitivity and specificity. Members of the research team independently screened records from the databases. Full-text articles were evaluated by adapting the Bolster and Rourke system for categorizing findings of systematic reviews, and the data abstracted from the body of literature were rated as favorable, unfavorable, mixed/inconclusive, or no impact. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology was used to evaluate the quality of evidence. The search strategy yielded 1,257 unique records. Thirty-four unique experimental and non-experimental studies were determined relevant following full-text review. Nineteen studies reported on the reliability and/or validity of ten different fatigue survey instruments. Eighteen different studies evaluated the reliability and/or validity of four different sleepiness survey instruments. None of the retained studies reported sensitivity or specificity. Evidence quality was rated as very low across all outcomes. In this systematic review, limited evidence of the reliability and validity of 14 different survey instruments to assess the fatigue and/or sleepiness status of EMS personnel and related shift-worker groups was identified.

  4. VO Access to BASECOL Database

    NASA Astrophysics Data System (ADS)

    Moreau, N.; Dubernet, M. L.

    2006-07-01

Basecol is a combination of a website (using PHP and HTML) and a MySQL database concerning molecular ro-vibrational transitions induced by collisions with atoms or molecules. The database was created in support of the scientific preparation of the Heterodyne Instrument for the Far-Infrared on board the Herschel Space Observatory (HSO). Basecol offers access to numerical and bibliographic data through various output formats such as ASCII, HTML, or VOTable (a first step towards a VO-compliant system). A web service using Apache Axis has been developed to provide external applications with direct access to the data.
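
    The VOTable output mentioned above lends itself to programmatic access. The following minimal Python sketch shows the general pattern for fetching and reading such a table; the query URL and its parameters are invented for illustration (the actual service endpoints are documented by the BASECOL project), while requests and astropy.io.votable are real libraries.

        import io
        import requests
        from astropy.io.votable import parse_single_table

        url = "https://basecol.example/query"      # hypothetical endpoint
        params = {"molecule": "CO", "collider": "H2", "format": "votable"}

        response = requests.get(url, params=params, timeout=30)
        response.raise_for_status()

        # astropy parses the VOTable from a file-like object into a table
        table = parse_single_table(io.BytesIO(response.content)).to_table()
        print(table.colnames)
        print(table[:5])                           # first few transitions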

  5. An optical scan/statistical package for clinical data management in C-L psychiatry.

    PubMed

    Hammer, J S; Strain, J J; Lyerly, M

    1993-03-01

This paper explores aspects of the need for clinical database management systems that permit ongoing service management, measurement of the quality and appropriateness of care, data-based administration of consultation-liaison (C-L) services, teaching/educational observations, and research. It describes an optical-scan data management system that permits flexible form generation, desktop publishing, and linking of observations in multiple files. This enhanced MICRO-CARES software system--Medical Application Platform (MAP)--permits direct transfer of the data to ASCII and SAS format for mainframe manipulation of the clinical information. The director of a C-L service may now develop his or her own forms, incorporate structured instruments, or develop "branch chains" of essential data to add to the core data set without the effort and expense of reprinting forms or consulting commercial vendors.

  6. ECLSS evolution: Advanced instrumentation interface requirements. Volume 3: Appendix C

    NASA Technical Reports Server (NTRS)

    1991-01-01

An Advanced ECLSS (Environmental Control and Life Support System) Technology Interfaces Database was developed primarily to provide ECLSS analysts with a centralized and portable source of ECLSS technology interface requirements data. The database contains 20 technologies which were previously identified in the MDSSC ECLSS Technologies database. The primary interfaces of interest in this database are fluid, electrical, and data/control interfaces, and resupply requirements. Each record contains fields describing the function and operation of the technology. Fields include an interface diagram, a description of applicable design points and operating ranges, and an explanation of the data, as required. A complete set of data was entered for six of the twenty components, including Solid Amine Water Desorption (SAWD), Thermoelectric Integrated Membrane Evaporation System (TIMES), Electrochemical Carbon Dioxide Concentrator (EDC), Solid Polymer Electrolysis (SPE), Static Feed Electrolysis (SFE), and BOSCH. Additional data were collected for Reverse Osmosis Water Reclamation-Potable (ROWRP), Reverse Osmosis Water Reclamation-Hygiene (ROWRH), Static Feed Solid Polymer Electrolyte (SFSPE), Trace Contaminant Control System (TCCS), and Multifiltration Water Reclamation-Hygiene (MFWRH). A summary of the database contents is presented in this report.

  7. Normative Databases for Imaging Instrumentation.

    PubMed

    Realini, Tony; Zangwill, Linda M; Flanagan, John G; Garway-Heath, David; Patella, Vincent M; Johnson, Chris A; Artes, Paul H; Gaddie, Ian B; Fingeret, Murray

    2015-08-01

    To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer's database differs in size, eligibility criteria, and ethnic make-up, among other key features. The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments.

  8. Normative Databases for Imaging Instrumentation

    PubMed Central

    Realini, Tony; Zangwill, Linda; Flanagan, John; Garway-Heath, David; Patella, Vincent Michael; Johnson, Chris; Artes, Paul; Ben Gaddie, I.; Fingeret, Murray

    2015-01-01

    Purpose To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. Methods A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Results Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer’s database differs in size, eligibility criteria, and ethnic make-up, among other key features. Conclusions The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments. PMID:25265003

  9. Operational problems experienced by single pilots in instrument meteorological conditions

    NASA Technical Reports Server (NTRS)

    Weislogel, S.

    1981-01-01

The development and implementation of a search strategy to extract pertinent reports from the Aviation Safety Reporting System-2 (ASRS-2) database are described. For an occurrence to be pertinent to the study, it had to satisfy the following conditions: the aircraft was of a type usually flown by a single pilot; the flight was operating on an IFR flight plan in instrument meteorological conditions; and the pilot experienced an operational problem. The occurrences consist of reports by the pilot about his own performance, by the pilot about the system performance, or by an air traffic controller about a pilot's performance.

  10. Innovations: clinical computing: an audio computer-assisted self-interviewing system for research and screening in public mental health settings.

    PubMed

    Bertollo, David N; Alexander, Mary Jane; Shinn, Marybeth; Aybar, Jalila B

    2007-06-01

This column describes Talker, nonproprietary software used to adapt screening instruments to audio computer-assisted self-interviewing (ACASI) systems for low-literacy and other populations. Talker supports ease of programming, multiple languages, on-site scoring, and the ability to update a central research database. Key features include highly readable text display, audio presentation of questions and audio prompting of answers, and optional touch-screen input. The scripting language for adapting instruments is briefly described, as are two studies in which respondents provided positive feedback on its use.

  11. nStudy: A System for Researching Information Problem Solving

    ERIC Educational Resources Information Center

    Winne, Philip H.; Nesbit, John C.; Popowich, Fred

    2017-01-01

    A bottleneck in gathering big data about learning is instrumentation designed to record data about processes students use to learn and information on which those processes operate. The software system nStudy fills this gap. nStudy is an extension to the Chrome web browser plus a server side database for logged trace data plus peripheral modules…

  12. The IAGOS Information System: From the aircraft measurements to the users.

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Thouret, Valérie; Cammas, Jean-Pierre; Petzold, Andreas; Volz-Thomas, Andreas; Gerbig, Christoph; Brenninkmeijer, Carl A. M.

    2013-04-01

IAGOS (In-service Aircraft for a Global Observing System, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in-situ observations of atmospheric chemical composition throughout the troposphere and in the UTLS. It builds on almost 20 years of scientific and technological expertise gained in the research projects MOZAIC (Measurement of Ozone and Water Vapour on Airbus In-service Aircraft) and CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container). The European consortium includes research centres, universities, national weather services, airline operators and aviation industry. IAGOS consists of two complementary building blocks providing a unique global observation system: IAGOS-CORE deploys newly developed instrumentation for regular in-situ measurements of atmospheric chemical species, both reactive and greenhouse gases (O3, CO, NOx, NOy, H2O, CO2, CH4), aerosols and cloud particles. In IAGOS-CARIBIC a cargo container is deployed monthly as a flying laboratory aboard one aircraft. Involved airlines ensure global operation of the network. Today, 5 aircraft are flying with the MOZAIC (3) or IAGOS-CORE (2) instrumentation, namely 3 aircraft from Lufthansa, 1 from Air Namibia, and 1 from China Airlines Taiwan. A main improvement and new aspect of the IAGOS-CORE instrumentation compared to MOZAIC is the delivery of raw data in near real time (i.e., data are transmitted as soon as the aircraft lands). After a first and quick validation of the O3 and CO measurements, preliminary data are made available in the central database for both the MACC project (Monitoring Atmospheric Composition and Climate) and scientific research groups. In addition to recorded measurements, the database also contains added-value products such as meteorological information (tropopause height, air mass backtrajectories) and Lagrangian model outputs (FLEXPART). Data access is handled under an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The MOZAIC-IAGOS database today contains more than 35,000 flights, covering mostly the northern hemisphere mid-latitudes but with reduced representation of the Pacific region. The recently equipped China Airlines Taiwan aircraft started filling this gap in July 2012. Aircraft from Air France, Cathay Pacific and Iberia, scheduled to be equipped in 2013, will cover the Asia-Oceania sector and Europe-South America transects. The database, as well as the research infrastructure itself, is under continuous development and improvement. In the framework of the newly starting IGAS project (IAGOS for GMES Atmospheric Service), major achievements will be reached, such as metadata and format standardisation for interoperation with international portals and other databases, QA/QC procedures and traceability, CARIBIC data integration within the central database, and real-time data transmission.

  13. The International Experimental Thermal Hydraulic Systems database – TIETHYS: A new NEA validation tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, Upendra S.

Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data is scattered in different locations and in different formats. Some of the data is in danger of being lost. A relational database is being developed to organize the international thermal hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, the data is organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert-developed PIRTs. The database will provide a summary of appropriate data, review of facility information, test description, instrumentation, references for the experimental data and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data, and is to be expanded to include references for molten salt reactors. There are placeholders for high temperature gas cooled reactors, CANDU and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database; it currently resides at the Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward, the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/

  14. Apical extrusion of debris in four different endodontic instrumentation systems: A meta-analysis.

    PubMed

    Western, J Sylvia; Dicksit, Daniel Devaprakash

    2017-01-01

All endodontic instrumentation systems tested so far promote apical extrusion of debris, which is one of the main causes of postoperative pain, flare-ups, and delayed healing. The aim of this meta-analysis was to collect and analyze in vitro studies quantifying apically extruded debris while using Hand ProTaper (manual), ProTaper Universal (rotary), Wave One (reciprocating), and self-adjusting file (SAF; vibratory) endodontic instrumentation systems, and to determine which methods produced less apical extrusion of debris. An extensive electronic database search was done in PubMed, Scopus, Cochrane, LILACS, and Google Scholar from inception until February 2016 using the key terms "Apical Debris Extrusion, extruded material, and manual/rotary/reciprocating/SAF systems." A systematic search strategy was followed to extract 12 potential articles from a total of 1352 articles. The overall effect size was calculated from the raw mean difference of weight of apically extruded debris. Statistically significant differences were seen in the following comparisons: SAF < Wave One, SAF < Rotary ProTaper. Apical extrusion of debris was invariably present in all the instrumentation systems analyzed. The SAF system appeared to be periapical-tissue friendly, as it caused less apical extrusion than Rotary ProTaper and Wave One.
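
    For readers unfamiliar with the effect-size computation mentioned above, the following Python sketch pools raw mean differences with fixed-effect inverse-variance weights, the standard approach for this kind of meta-analysis. The study numbers are invented for illustration and are not taken from the review.

        import math

        # (mean_a, sd_a, n_a, mean_b, sd_b, n_b) per study, in mg of debris;
        # group A = SAF, group B = a comparator system (illustrative values)
        studies = [
            (0.20, 0.08, 15, 0.45, 0.12, 15),
            (0.18, 0.05, 20, 0.38, 0.10, 20),
            (0.25, 0.09, 12, 0.41, 0.11, 12),
        ]

        weights, diffs = [], []
        for ma, sa, na, mb, sb, nb in studies:
            md = ma - mb                           # raw mean difference
            var = sa**2 / na + sb**2 / nb          # variance of the difference
            weights.append(1.0 / var)              # inverse-variance weight
            diffs.append(md)

        pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        print(f"pooled MD = {pooled:.3f} mg, 95% CI = "
              f"[{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")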

  15. STATUS OF VARIOUS SNS DIAGNOSTIC SYSTEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blokland, Willem; Purcell, J David; Patton, Jeff

    2007-01-01

The Spallation Neutron Source (SNS) accelerator systems are ramping up to deliver a 1.0 GeV, 1.4 MW proton beam to a liquid mercury target for neutron scattering research. Enhancements or additions have been made to several instrument systems to support the ramp up in intensity, improve reliability, and/or add functionality. The Beam Current Monitors now support increased rep rates, the Harp system now includes charge density calculations for the target, and a new system has been created to collect data for the beam accounting and present the data over the web and to the operator consoles. The majority of the SNS beam instruments are PC-based and their configuration files are now managed through the Oracle relational database. A new version of the wire scanner software was developed to add features to correlate the scan with beam loss, parking in the beam, and measuring the longitudinal beam current. This software is currently being tested. This paper also includes data from selected instruments.

  16. DNA Profiling of Convicted Offender Samples for the Combined DNA Index System

    ERIC Educational Resources Information Center

    Millard, Julie T

    2011-01-01

    The cornerstone of forensic chemistry is that a perpetrator inevitably leaves trace evidence at a crime scene. One important type of evidence is DNA, which has been instrumental in both the implication and exoneration of thousands of suspects in a wide range of crimes. The Combined DNA Index System (CODIS), a network of DNA databases, provides…

  17. Data analysis of the COMPTEL instrument on the NASA gamma ray observatory

    NASA Technical Reports Server (NTRS)

    Diehl, R.; Bennett, K.; Collmar, W.; Connors, A.; Denherder, J. W.; Hermsen, W.; Lichti, G. G.; Lockwood, J. A.; Macri, J.; Mcconnell, M.

    1992-01-01

The Compton imaging telescope (COMPTEL) on the Gamma Ray Observatory (GRO) is a wide-field-of-view instrument. The coincidence measurement technique in two scintillation detector layers requires specific analysis methods. Straightforward event projection into the sky is impossible. Therefore, detector events are analyzed in a multi-dimensional dataspace using a gamma ray sky hypothesis convolved with the point spread function of the instrument in this dataspace. Background suppression and analysis techniques have important implications for the gamma ray source results of this background-limited telescope. The COMPTEL collaboration applies a software system of analysis utilities organized around a database management system. This system is also intended to assist guest investigators at the various collaboration and external sites, allowing different levels of cooperation with the COMPTEL institutes depending on the type of data to be studied.

  18. Live load monitoring for the I-10 twin span bridge : research project capsule.

    DOT National Transportation Integrated Search

    2014-10-01

    To establish a site-specific database for bridge evaluation and future bridge design, : DOTD established a long-term health monitoring system at the I-10 Twin Span Bridge. : The bridge is instrumented from deck to piles to capture bridge response (bo...

  19. T-LECS: The Control Software System for MOIRCS

    NASA Astrophysics Data System (ADS)

    Yoshikawa, T.; Omata, K.; Konishi, M.; Ichikawa, T.; Suzuki, R.; Tokoku, C.; Katsuno, Y.; Nishimura, T.

    2006-07-01

MOIRCS (Multi-Object Infrared Camera and Spectrograph) is a new instrument for the Subaru Telescope. We present the system design of the control software for MOIRCS, named T-LECS (Tohoku University - Layered Electronic Control System). T-LECS is a PC-Linux-based, network-distributed system. Two PCs equipped with the focal plane array system operate the two HAWAII2 detectors, and another PC hosts the user interfaces and a database server. These PCs also control various observation devices distributed on a TCP/IP network. T-LECS has three interfaces: an interface to the devices and two user interfaces. One user interface connects to the integrated observation control system (Subaru Observation Software System) for observers; the other provides system developers with direct access to the MOIRCS devices. To facilitate communication between these interfaces, we employ an SQL database system.

  20. Development of a database of instruments for resource-use measurement: purpose, feasibility, and design.

    PubMed

    Ridyard, Colin H; Hughes, Dyfrig A

    2012-01-01

    Health economists frequently rely on methods based on patient recall to estimate resource utilization. Access to questionnaires and diaries, however, is often limited. This study examined the feasibility of establishing an open-access Database of Instruments for Resource-Use Measurement, identified relevant fields for data extraction, and outlined its design. An electronic survey was sent to authors of full UK economic evaluations listed in the National Health Service Economic Evaluation Database (2008-2010), authors of monographs of Health Technology Assessments (1998-2010), and subscribers to the JISCMail health economics e-mailing list. The survey included questions on piloting, validation, recall period, and data capture method. Responses were analyzed and data extracted to generate relevant fields for the database. A total of 143 responses to the survey provided data on 54 resource-use instruments for inclusion in the database. All were reliant on patient or carer recall, and a majority (47) were questionnaires. Thirty-seven were designed for self-completion by the patient, carer, or guardian, and the remainder were designed for completion by researchers or health care professionals while interviewing patients. Methods of development were diverse, particularly in areas such as the planning of resource itemization (evident in 25 instruments), piloting (25), and validation (29). On the basis of the present analysis, we developed a Web-enabled Database of Instruments for Resource-Use Measurement, accessible via www.DIRUM.org. This database may serve as a practical resource for health economists, as well as a means to facilitate further research in the area of resource-use data collection. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  1. Use of patient safety culture instruments in operating rooms: A systematic literature review.

    PubMed

    Zhao, Pujng; Li, Yaqin; Li, Zhi; Jia, Pengli; Zhang, Longhao; Zhang, Mingming

    2017-05-01

To identify and qualitatively describe, in a literature review, how instruments have been used to evaluate patient safety culture in the operating rooms of published studies. Systematic searches of the literature were conducted using major databases including MEDLINE, EMbase, The Cochrane Library, and four Chinese databases including the Chinese Biomedical Literature Database (CBM), Wanfang Data, the Chinese Scientific Journal Database (VIP), and the Chinese Journals Full-text Database (CNKI) for studies published up to March 2016. We summarized and analyzed the country scope, the instrument utilized in the study, the year when the instrument was used, and the fields of operating rooms. Study populations, study settings, and the time span between baseline and follow-up phase were evaluated according to the study design. We identified 1025 references, of which 99 were obtained for full-text assessment; 47 of these studies were deemed relevant and included in the literature review. Most of the studies were from the USA. The most commonly used patient safety culture instrument was the Safety Attitude Questionnaire. All identified instruments were used after 2002 and across many fields. Most included studies on patient safety culture were conducted in teaching hospitals or university hospitals. The study populations in the cross-sectional studies were much larger than those in the before-after studies. The time span between baseline and follow-up phase of the before-after studies was almost always over three months. Although patient safety culture is considered important in health care and patient safety, the number of studies in which patient safety culture has been estimated using instruments in operating rooms is fairly small. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  2. The Footprint Database and Web Services of the Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Dobos, László; Varga-Verebélyi, Erika; Verdugo, Eva; Teyssier, David; Exter, Katrina; Valtchanov, Ivan; Budavári, Tamás; Kiss, Csaba

    2016-10-01

    Data from the Herschel Space Observatory is freely available to the public but no uniformly processed catalogue of the observations has been published so far. To date, the Herschel Science Archive does not contain the exact sky coverage (footprint) of individual observations and supports search for measurements based on bounding circles only. Drawing on previous experience in implementing footprint databases, we built the Herschel Footprint Database and Web Services for the Herschel Space Observatory to provide efficient search capabilities for typical astronomical queries. The database was designed with the following main goals in mind: (a) provide a unified data model for meta-data of all instruments and observational modes, (b) quickly find observations covering a selected object and its neighbourhood, (c) quickly find every observation in a larger area of the sky, (d) allow for finding solar system objects crossing observation fields. As a first step, we developed a unified data model of observations of all three Herschel instruments for all pointing and instrument modes. Then, using telescope pointing information and observational meta-data, we compiled a database of footprints. As opposed to methods using pixellation of the sphere, we represent sky coverage in an exact geometric form allowing for precise area calculations. For easier handling of Herschel observation footprints with rather complex shapes, two algorithms were implemented to reduce the outline. Furthermore, a new visualisation tool to plot footprints with various spherical projections was developed. Indexing of the footprints using Hierarchical Triangular Mesh makes it possible to quickly find observations based on sky coverage, time and meta-data. The database is accessible via a web site http://herschel.vo.elte.hu and also as a set of REST web service functions, which makes it readily usable from programming environments such as Python or IDL. The web service allows downloading footprint data in various formats including Virtual Observatory standards.
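
    Since the abstract advertises REST web service functions usable from programming environments such as Python, a minimal call might look like the sketch below. The base URL is taken from the abstract, but the endpoint path, parameter names, and response fields are assumptions for illustration; the real function signatures are documented on the project site.

        import requests

        BASE = "http://herschel.vo.elte.hu"        # from the abstract
        params = {
            "ra": 83.82, "dec": -5.39,             # Orion Nebula, degrees
            "radius": 0.5,                         # search radius, degrees
            "format": "json",                      # hypothetical output option
        }
        resp = requests.get(f"{BASE}/api/search", params=params, timeout=30)
        resp.raise_for_status()

        # print the identifier, instrument, and mode of each matching footprint
        for obs in resp.json():
            print(obs.get("obsid"), obs.get("instrument"), obs.get("mode"))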

  3. Mass measurement errors of Fourier-transform mass spectrometry (FTMS): distribution, recalibration, and application.

    PubMed

    Zhang, Jiyang; Ma, Jie; Dou, Lei; Wu, Songfeng; Qian, Xiaohong; Xie, Hongwei; Zhu, Yunping; He, Fuchu

    2009-02-01

    The hybrid linear trap quadrupole Fourier-transform (LTQ-FT) ion cyclotron resonance mass spectrometer, an instrument with high accuracy and resolution, is widely used in the identification and quantification of peptides and proteins. However, time-dependent errors in the system may lead to deterioration of the accuracy of these instruments, negatively influencing the determination of the mass error tolerance (MET) in database searches. Here, a comprehensive discussion of LTQ/FT precursor ion mass error is provided. On the basis of an investigation of the mass error distribution, we propose an improved recalibration formula and introduce a new tool, FTDR (Fourier-transform data recalibration), that employs a graphic user interface (GUI) for automatic calibration. It was found that the calibration could adjust the mass error distribution to more closely approximate a normal distribution and reduce the standard deviation (SD). Consequently, we present a new strategy, LDSF (Large MET database search and small MET filtration), for database search MET specification and validation of database search results. As the name implies, a large-MET database search is conducted and the search results are then filtered using the statistical MET estimated from high-confidence results. By applying this strategy to a standard protein data set and a complex data set, we demonstrate the LDSF can significantly improve the sensitivity of the result validation procedure.
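
    The LDSF strategy described above is easy to sketch in code: search with a deliberately large mass error tolerance, estimate the true error spread from high-confidence identifications, then filter everything else against that statistical window. The field names and confidence criterion below are illustrative assumptions, not the FTDR implementation.

        import statistics

        def ldsf_filter(psms, high_conf_score=0.99, k=3.0):
            """psms: list of dicts with 'ppm_error' and 'score' keys."""
            # 1. estimate the MET from high-confidence matches only
            errors = [p["ppm_error"] for p in psms
                      if p["score"] >= high_conf_score]
            mu = statistics.mean(errors)
            sd = statistics.stdev(errors)
            lo, hi = mu - k * sd, mu + k * sd      # statistical MET window
            # 2. keep only matches whose mass error falls inside the window
            return [p for p in psms if lo <= p["ppm_error"] <= hi]

        hits = [
            {"peptide": "PEPTIDER", "ppm_error": 1.2,  "score": 0.999},
            {"peptide": "SAMPLEK",  "ppm_error": -0.8, "score": 0.995},
            {"peptide": "RANDOMK",  "ppm_error": 18.5, "score": 0.70},
        ]
        print([p["peptide"] for p in ldsf_filter(hits)])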

  4. Information resources at the National Center for Biotechnology Information.

    PubMed Central

    Woodsmall, R M; Benson, D A

    1993-01-01

The National Center for Biotechnology Information (NCBI), part of the National Library of Medicine, was established in 1988 to perform basic research in the field of computational molecular biology as well as build and distribute molecular biology databases. The basic research has led to new algorithms and analysis tools for interpreting genomic data and has been instrumental in the discovery of human disease genes for neurofibromatosis and Kallmann syndrome. The principal database responsibility is the National Institutes of Health (NIH) genetic sequence database, GenBank. NCBI, in collaboration with international partners, builds, distributes, and provides online and CD-ROM access to over 112,000 DNA sequences. Another major program is the integration of multiple sequence databases and related bibliographic information and the development of network-based retrieval systems for Internet access. PMID:8374583

  5. Optoelectronic instrumentation enhancement using data mining feedback for a 3D measurement system

    NASA Astrophysics Data System (ADS)

    Flores-Fuentes, Wendy; Sergiyenko, Oleg; Gonzalez-Navarro, Félix F.; Rivas-López, Moisés; Hernandez-Balbuena, Daniel; Rodríguez-Quiñonez, Julio C.; Tyrsa, Vera; Lindner, Lars

    2016-12-01

3D measurement by a cyber-physical system based on optoelectronic scanning instrumentation has been enhanced by outlier-removal and regression data mining feedback. The prototype has applications in (1) industrial manufacturing systems that include robotic machinery, embedded vision, and motion control, (2) health care systems for measurement scanning, and (3) infrastructure, by providing structural health monitoring. This paper presents new research on the data processing of a 3D measurement vision sensing database. Outliers in the multivariate data have been detected and removed to improve the results of artificial intelligence regression algorithms. Regression on physical measurement errors has been used to correct the 3D measurements. We conclude that joining physical phenomena, measurement, and computation is an effective approach to feedback loops for the control of industrial, medical, and civil tasks.
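
    The two feedback steps described above can be illustrated with a short sketch: multivariate outlier removal (here by Mahalanobis distance) followed by a regression fitted to known reference errors and used to correct new measurements. The data shapes, the toy error model, and the choice of a linear model are assumptions; the paper's actual algorithms may differ.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        scans = rng.normal(size=(500, 3))          # x, y, z measurements
        scans[::50] += 8.0                         # inject a few gross outliers

        # 1. remove multivariate outliers by Mahalanobis distance
        mu = scans.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(scans, rowvar=False))
        centered = scans - mu
        d2 = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
        clean = scans[d2 < 9.0]                    # keep points within ~3 sigma

        # 2. learn a correction from reference points with known coordinates
        A = np.array([[1.002, 0, 0], [0, 0.998, 0], [0, 0, 1.001]])
        truth = clean @ A + 0.05                   # toy systematic error model
        model = LinearRegression().fit(clean, truth - clean)
        corrected = clean + model.predict(clean)
        print("residual RMS:", np.sqrt(((corrected - truth) ** 2).mean()))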

  6. The New Zealand Tsunami Database: historical and modern records

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.

    2016-12-01

A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event has been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.

  7. Instruments for measuring mental health recovery: a systematic review.

    PubMed

    Sklar, Marisa; Groessl, Erik J; O'Connell, Maria; Davidson, Larry; Aarons, Gregory A

    2013-12-01

Persons in recovery, providers, and policymakers alike are advocating for recovery-oriented mental health care, with the promotion of recovery becoming a prominent feature of mental health policy in the United States and internationally. One step toward creating a recovery-oriented system of care is to use recovery-oriented outcome measures. Numerous instruments have been developed to assess progress towards mental health recovery. This review identifies instruments of mental health recovery and evaluates the appropriateness of their use, including their psychometric properties, ease of administration, and service-user involvement in their development. A literature search using the Medline and PsycINFO databases was conducted, identifying 21 instruments for potential inclusion in this review, of which thirteen met inclusion criteria. Results suggest only three instruments (25%) have had their psychometric properties assessed in three or more unique samples of participants. Ease of administration varied between instruments, and for the majority of instruments, development included service-user involvement. This review updates and expands previous reviews of instruments to assess mental health recovery. As mental health care continues to transform to a recovery-oriented model of service delivery, this review may facilitate selection of appropriate assessments of mental health recovery for systems to use in evaluating and improving the care they provide. © 2013.

  8. Phenotip - a web-based instrument to help diagnosing fetal syndromes antenatally.

    PubMed

    Porat, Shay; de Rham, Maud; Giamboni, Davide; Van Mieghem, Tim; Baud, David

    2014-12-10

    Prenatal ultrasound can often reliably distinguish fetal anatomic anomalies, particularly in the hands of an experienced ultrasonographer. Given the large number of existing syndromes and the significant overlap in prenatal findings, antenatal differentiation for syndrome diagnosis is difficult. We constructed a hierarchic tree of 1140 sonographic markers and submarkers, organized per organ system. Subsequently, a database of prenatally diagnosable syndromes was built. An internet-based search engine was then designed to search the syndrome database based on a single or multiple sonographic markers. Future developments will include a database with magnetic resonance imaging findings as well as further refinements in the search engine to allow prioritization based on incidence of syndromes and markers.
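
    The core of the search engine described above is ranking candidate syndromes by the overlap between observed sonographic markers and each syndrome's recorded markers. A toy Python sketch of that idea follows; the syndrome and marker names are invented, and the actual Phenotip scoring may differ (e.g., by weighting marker or syndrome incidence, as the planned refinements suggest).

        # each syndrome maps to its set of recorded sonographic markers
        syndromes = {
            "Syndrome A": {"ventriculomegaly", "polydactyly", "cleft lip"},
            "Syndrome B": {"ventriculomegaly", "micrognathia"},
            "Syndrome C": {"polydactyly", "echogenic bowel", "cleft lip"},
        }

        def rank(observed):
            scored = []
            for name, markers in syndromes.items():
                overlap = len(observed & markers)
                if overlap:
                    # Jaccard-style score: shared markers over the union
                    scored.append((overlap / len(observed | markers), name))
            return sorted(scored, reverse=True)

        print(rank({"ventriculomegaly", "cleft lip"}))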

  9. Volcano-Monitoring Instrumentation in the United States, 2008

    USGS Publications Warehouse

    Guffanti, Marianne; Diefenbach, Angela K.; Ewert, John W.; Ramsey, David W.; Cervelli, Peter F.; Schilling, Steven P.

    2010-01-01

    The United States is one of the most volcanically active countries in the world. According to the global volcanism database of the Smithsonian Institution, the United States (including its Commonwealth of the Northern Mariana Islands) is home to about 170 volcanoes that are in an eruptive phase, have erupted in historical time, or have not erupted recently but are young enough (eruptions within the past 10,000 years) to be capable of reawakening. From 1980 through 2008, 30 of these volcanoes erupted, several repeatedly. Volcano monitoring in the United States is carried out by the U.S. Geological Survey (USGS) Volcano Hazards Program, which operates a system of five volcano observatories-Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Hawaiian Volcano Observatory (HVO), Long Valley Observatory (LVO), and Yellowstone Volcano Observatory (YVO). The observatories issue public alerts about conditions and hazards at U.S. volcanoes in support of the USGS mandate under P.L. 93-288 (Stafford Act) to provide timely warnings of potential volcanic disasters to the affected populace and civil authorities. To make efficient use of the Nation's scientific resources, the volcano observatories operate in partnership with universities and other governmental agencies through various formal agreements. The Consortium of U.S. Volcano Observatories (CUSVO) was established in 2001 to promote scientific cooperation among the Federal, academic, and State agencies involved in observatory operations. Other groups also contribute to volcano monitoring by sponsoring long-term installation of geophysical instruments at some volcanoes for specific research projects. This report describes a database of information about permanently installed ground-based instruments used by the U.S. volcano observatories to monitor volcanic activity (unrest and eruptions). The purposes of this Volcano-Monitoring Instrumentation Database (VMID) are to (1) document the Nation's existing, ground-based, volcano-monitoring capabilities, (2) answer queries within a geospatial framework about the nature of the instrumentation, and (3) provide a benchmark for planning future monitoring improvements. The VMID is not an archive of the data collected by monitoring instruments, nor is it intended to keep track of whether a station is temporarily unavailable due to telemetry or equipment problems. Instead, it is a compilation of basic information about each instrument such as location, type, and sponsoring agency. Typically, instruments installed expressly for volcano monitoring are emplaced within about 20 kilometers (km) of a volcanic center; however, some more distant instruments (as far away as 100 km) can be used under certain circumstances and therefore are included in the database. Not included is information about satellite-based and airborne sensors and temporarily deployed instrument arrays, which also are used for volcano monitoring but do not lend themselves to inclusion in a geospatially organized compilation of sensor networks. This Open-File Report is provided in two parts: (1) an Excel spreadsheet (http://pubs.usgs.gov/of/2009/1165/) containing the version of the Volcano-Monitoring Instrumentation Database current through 31 December 2008 and (2) this text (in Adobe PDF format), which serves as metadata for the VMID. The disclaimer for the VMID is in appendix 1 of the text. Updated versions of the VMID will be posted on the Web sites of the Consortium of U.S. 
Volcano Observatories (http://www.cusvo.org/) and the USGS Volcano Hazards Program http://volcanoes.usgs.gov/activity/data/index.php.

  10. Apical extrusion of debris in four different endodontic instrumentation systems: A meta-analysis

    PubMed Central

    Western, J. Sylvia; Dicksit, Daniel Devaprakash

    2017-01-01

Background: All endodontic instrumentation systems tested so far promote apical extrusion of debris, which is one of the main causes of postoperative pain, flare-ups, and delayed healing. Objectives: The aim of this meta-analysis was to collect and analyze in vitro studies quantifying apically extruded debris while using Hand ProTaper (manual), ProTaper Universal (rotary), Wave One (reciprocating), and self-adjusting file (SAF; vibratory) endodontic instrumentation systems, and to determine which methods produced less apical extrusion of debris. Methodology: An extensive electronic database search was done in PubMed, Scopus, Cochrane, LILACS, and Google Scholar from inception until February 2016 using the key terms “Apical Debris Extrusion, extruded material, and manual/rotary/reciprocating/SAF systems.” A systematic search strategy was followed to extract 12 potential articles from a total of 1352 articles. The overall effect size was calculated from the raw mean difference of weight of apically extruded debris. Results: Statistically significant differences were seen in the following comparisons: SAF < Wave One, SAF < Rotary ProTaper. Conclusions: Apical extrusion of debris was invariably present in all the instrumentation systems analyzed. The SAF system appeared to be periapical-tissue friendly, as it caused less apical extrusion than Rotary ProTaper and Wave One. PMID:28761250

  11. A new fully automated FTIR system for total column measurements of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratios of CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that can be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components, such as a sturdy and reliable solar tracker dome, are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany, show that the instrument works well and can capture parts of the diurnal as well as the seasonal cycle of CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  12. Chemiluminescence: Measuring methods. (Latest citations from the NTIS bibliographic database). Published Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The bibliography contains citations concerning chemiluminescence assays. The citations include sample system design, sample collection, measurement techniques, and sensitivity of the instrumentation. Applications in high altitude air pollution studies are emphasized. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  13. Data-Based Locally Directed Evaluation of Vocational Education Programs. Component 3. Assessing Student Career Interests.

    ERIC Educational Resources Information Center

    Florida State Univ., Tallahassee. Program of Vocational Education.

    Part of a system by which local education agency (LEA) personnel may evaluate secondary and postsecondary vocational education programs, this third of eight components focuses on assessment of student career interests. Procedures covered include determination of the student population to be surveyed, selection of the instrument, and discrepancy…

  14. Optimization of Extended Relational Database Systems

    DTIC Science & Technology

    1986-07-23

control functions are integrated into a single system in a homogeneous way. As a first example, consider previous work in supporting various semantic ... sizes are reduced and, consequently, the number of materializations that will be needed is also lower. For example, in the above query: retrieve (EMP.name) where EMP.hobbies.instrument = "violin". When the various entries in the hobbies field are materialized, only those queries that ...

  15. NASA Astrophysics Data System's New Data

    NASA Astrophysics Data System (ADS)

    Eichhorn, G.; Accomazzi, A.; Demleitner, M.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    2000-05-01

    The NASA Astrophysics Data System has greatly increased its data holdings. The Physics database now contains almost 900,000 references and the Astronomy database almost 550,000 references. The Instrumentation database has almost 600,000 references. The scanned articles in the ADS Article Service are increasing in number continuously. Almost 1 million pages have been scanned so far. Recently the abstracts books from the Lunar and Planetary Science Conference have been scanned and put on-line. The Monthly Notices of the Royal Astronomical Society are currently being scanned back to Volume 1. This is the last major journal to be completely scanned and on-line. In cooperation with a conservation project of the Harvard libraries, microfilms of historical observatory literature are currently being scanned. This will provide access to an important part of the historical literature. The ADS can be accessed at: http://adswww.harvard.edu This project is funded by NASA under grant NCC5-189.

  16. Database of proposed payloads and instruments for SEI missions

    NASA Technical Reports Server (NTRS)

    Barlow, N. G.

    1992-01-01

A database of all payloads and instruments proposed for lunar and Mars missions was compiled by the author for the Exploration Programs Office at NASA's Johnson Space Center. The database is an outgrowth of the document produced by C. J. Budney et al. at the Jet Propulsion Laboratory in 1991. The present database consists not only of payloads proposed for human exploratory missions to the Moon and Mars, but also of experiments selected or proposed for robotic precursor missions such as Lunar Scout, Mars Observer, and MESUR. The database consists of two parts: a written payload description and a matrix that provides a breakdown of payload components. Each payload description consists of the following information: (1) the rationale for why the instrument or payload package is being proposed for operation on the Moon or Mars; (2) a description of how the instrument works; (3) a breakdown of the payload, providing detailed information about the mass, volume, power requirements, and data rates of the constituent pieces of the experiment; (4) estimates of the power consumption and data rate; (5) how the data will be returned to Earth and distributed to the scientific community; (6) any constraints on the location or conditions under which the instrument can or cannot operate; (7) what type of crew interaction (if any) is needed; (8) how the payload is to be delivered to the lunar or martian surface (along with alternative delivery options); (9) how long the instrument or payload package will take to set up; (10) what type of maintenance needs are anticipated for the experiment; (11) the stage of development of the instrument and the environmental conditions under which it has been tested; (12) any interface required between the instrument and the lander, a rover, an outpost, etc.; (13) information about how often the experiment will need to be resupplied with parts or consumables, if it is to be resupplied; (14) the name and affiliation of a contact person for the experiment; and (15) references where further information about the experiment can be found.

  17. GeoMEx: Geographic Information System (GIS) Prototype for Mars Express Data

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Frigeri, A.; Ivanov, A. B.

    2013-09-01

As of today, almost a decade of observational data have been returned by the multidisciplinary instruments on board the ESA's Mars Express spacecraft. All data are archived in the ESA's Planetary Science Archive (PSA), which is the central repository for all ESA Solar System missions [1]. Data users can perform advanced queries and retrieve data from the PSA using graphical and map-based search interfaces, or via direct FTP download [2]. However, the PSA still offers limited geometrical search and visualisation capabilities that are essential for scientists to identify their data of interest. A former study [3] has shown that this limitation is mostly due to the fact that (1) only a subset of the instruments' observation geometry information has been modeled and ingested into the PSA, and (2) access to that information from GIS software is impossible without going through a cumbersome and undocumented process. With the increasing number of Mars GIS data sets available to the community [4], GIS software has become an invaluable tool for researchers to capture, manage, visualise, and analyse data from various sources. Although Mars Express surface imaging data are natural candidates for use in a GIS environment, the integration of data from non-imaging instruments (subsurface, atmosphere, plasma) is also being investigated [5]. The objective of this work is to develop a GIS prototype that will integrate all the Mars Express instruments' observation geometry information into a spatial database that can be accessed from external GIS software using the standard WMS and WFS protocols. We will first focus on the integration of surface and subsurface instrument data (HRSC, OMEGA, MARSIS). In addition to the geometry information, base and context maps of Mars derived from surface mapping instrument data will also be ingested into the system. The system back-end architecture will be implemented using open-source GIS frameworks: PostgreSQL/PostGIS for the database, and MapServer for the web publishing module. Interfaces with existing GIS front-end software (such as QGIS, GRASS, ArcView, or OpenLayers) will be investigated and tested in a second phase. This prototype is primarily intended to be used by the Mars Express instrument teams in support of their scientific investigations. It will also be used by the mission Archive Scientist in support of the data validation and PSA interface requirements definition tasks. Depending on its success, this prototype might be used in the future to demonstrate the benefit of integrating a GIS component into ESA's planetary science operations planning systems.
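
    With footprints stored in PostGIS, the typical spatial query is "which observations intersect this region of interest". The Python sketch below shows that pattern through psycopg2, a standard PostgreSQL driver; the table and column names are assumptions for illustration, and the SRID 4326 stands in for whatever Mars-specific spatial reference the project actually defines.

        import psycopg2

        conn = psycopg2.connect(dbname="geomex", host="localhost")
        # region of interest as well-known text, lon/lat degrees
        roi_wkt = "POLYGON((120 -10, 130 -10, 130 0, 120 0, 120 -10))"

        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT obs_id, instrument, start_time
                FROM footprints
                WHERE ST_Intersects(geom, ST_GeomFromText(%s, 4326))
                ORDER BY start_time
                """,
                (roi_wkt,),
            )
            for obs_id, instrument, start_time in cur.fetchall():
                print(obs_id, instrument, start_time)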

  18. Palmprint and face score level fusion: hardware implementation of a contactless small sample biometric system

    NASA Astrophysics Data System (ADS)

    Poinsot, Audrey; Yang, Fan; Brost, Vincent

    2011-02-01

Including multiple sources of information in personal identity recognition and verification gives the opportunity to greatly improve performance. We propose a contactless biometric system that combines two modalities: palmprint and face. Hardware implementations are proposed on Texas Instruments Digital Signal Processor and Xilinx Field-Programmable Gate Array (FPGA) platforms. The algorithmic chain consists of preprocessing (which includes palm extraction from hand images), Gabor feature extraction, comparison by Hamming distance, and score fusion. Fusion possibilities are discussed and tested, first using a bimodal database of 130 subjects that we designed (the uB database), and then two common public biometric databases (AR for face and PolyU for palmprint). High performance has been obtained for recognition and verification purposes: a recognition rate of 97.49% with the AR-PolyU database and an equal error rate of 1.10% on the uB database, using only two training samples per subject. Hardware results demonstrate that preprocessing can easily be performed during the acquisition phase, and multimodal biometric recognition can be performed almost instantly (0.4 ms on FPGA). We show the feasibility of a robust and efficient multimodal hardware biometric system that offers several advantages, such as user-friendliness and flexibility.
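
    The matching-and-fusion chain described above reduces to a few lines: Hamming-distance similarity between binarized Gabor codes for each modality, then a weighted-sum score fusion. The sketch below illustrates the idea; the code lengths, fusion weights, noise levels, and acceptance threshold are illustrative assumptions, not the paper's tuned values.

        import numpy as np

        def hamming_score(code_a, code_b):
            """Similarity in [0, 1] from two binary feature codes."""
            return 1.0 - np.count_nonzero(code_a != code_b) / code_a.size

        def fuse(palm_score, face_score, w_palm=0.6, w_face=0.4):
            # weighted-sum score-level fusion of the two modalities
            return w_palm * palm_score + w_face * face_score

        rng = np.random.default_rng(1)
        enrolled_palm = rng.integers(0, 2, 2048)
        enrolled_face = rng.integers(0, 2, 2048)
        probe_palm = enrolled_palm.copy()
        probe_palm[:100] ^= 1                      # ~5% bit noise
        probe_face = enrolled_face.copy()
        probe_face[:300] ^= 1                      # ~15% bit noise

        score = fuse(hamming_score(probe_palm, enrolled_palm),
                     hamming_score(probe_face, enrolled_face))
        print("fused score:", round(score, 3),
              "accept" if score > 0.8 else "reject")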

  19. Development and expansion of high-quality control region databases to improve forensic mtDNA evidence interpretation.

    PubMed

    Irwin, Jodi A; Saunier, Jessica L; Strouss, Katharine M; Sturk, Kimberly A; Diegoli, Toni M; Just, Rebecca S; Coble, Michael D; Parson, Walther; Parsons, Thomas J

    2007-06-01

    In an effort to increase the quantity, breadth and availability of mtDNA databases suitable for forensic comparisons, we have developed a high-throughput process to generate approximately 5000 control region sequences per year from regional US populations, global populations from which the current US population is derived and global populations currently under-represented in available forensic databases. The system utilizes robotic instrumentation for all laboratory steps from pre-extraction through sequence detection, and a rigorous eight-step, multi-laboratory data review process with entirely electronic data transfer. Over the past 3 years, nearly 10,000 control region sequences have been generated using this approach. These data are being made publicly available and should further address the need for consistent, high-quality mtDNA databases for forensic testing.

  20. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploads raw data to an FTP server and saves raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS Simulator, Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for the radiometer external calibration. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
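
    The ingest step described above, parsing periodic text files from the EGSEs into a relational database, follows a common pattern. The Python sketch below illustrates it; the file format, table schema, directory layout, and the use of sqlite3 (standing in for the MySQL database named in the abstract) are all assumptions for illustration.

        import sqlite3
        from pathlib import Path

        db = sqlite3.connect("isds.sqlite")
        db.execute("""CREATE TABLE IF NOT EXISTS housekeeping
                      (source TEXT, timestamp TEXT, channel TEXT, value REAL)""")

        def ingest(path: Path, source: str):
            # assumed line format: "2008-01-15T12:00:00, supply_voltage, 28.02"
            rows = []
            with path.open() as fh:
                for line in fh:
                    ts, channel, value = (s.strip() for s in line.split(","))
                    rows.append((source, ts, channel, float(value)))
            db.executemany("INSERT INTO housekeeping VALUES (?, ?, ?, ?)", rows)
            db.commit()

        # sweep one EGSE's archive directory (hypothetical path)
        for f in Path("archive/power_gse").glob("*.txt"):
            ingest(f, "power_supply_gse")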

  1. Event Recording Data Acquisition System and Experiment Data Management System for Neutron Experiments at MLF, J-PARC

    NASA Astrophysics Data System (ADS)

    Nakatani, T.; Inamura, Y.; Moriyama, K.; Ito, T.; Muto, S.; Otomo, T.

Neutron scattering can be a powerful probe in the investigation of many phenomena in the materials and life sciences. The Materials and Life Science Experimental Facility (MLF) at the Japan Proton Accelerator Research Complex (J-PARC) is a leading center of experimental neutron science and boasts one of the most intense pulsed neutron sources in the world. The MLF currently has 18 experimental instruments in operation that support a wide variety of users from across a range of research fields. The instruments include optical elements, sample environment apparatus and detector systems that are controlled and monitored electronically throughout an experiment. Signals from these components and those from the neutron source are converted into a digital format by the data acquisition (DAQ) electronics and recorded as time-tagged event data in the DAQ computers using "DAQ-Middleware". Operating in event mode, the DAQ system produces extremely large data files (~GB) under various measurement conditions. Simultaneously, the measurement meta-data indicating each measurement condition is recorded in XML format by the MLF control software framework "IROHA". These measurement event data and meta-data are collected in the MLF common storage and cataloged by the MLF Experimental Database (MLF EXP-DB) based on a commercial XML database. The system provides a web interface for users to manage and remotely analyze experimental data.
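
    The advantage of recording time-tagged events rather than pre-binned histograms is that the binning can be chosen (and re-chosen) at analysis time. The short Python sketch below illustrates this with invented event times; the real MLF event files are binary and are read through the facility's own software, not like this.

        import numpy as np

        rng = np.random.default_rng(42)
        # time-of-flight of each detected neutron, microseconds after pulse
        event_tof = rng.normal(loc=5000.0, scale=300.0, size=100_000)

        # choose the binning at analysis time, not at acquisition time
        for n_bins in (50, 500):
            counts, edges = np.histogram(event_tof, bins=n_bins,
                                         range=(0, 10_000))
            peak = edges[np.argmax(counts)]
            print(f"{n_bins:4d} bins -> peak near {peak:.0f} us")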

  2. Application of a Database System for Korean Military Personnel Management.

    DTIC Science & Technology

    1987-03-01

(Only fragments of this record's report documentation page and table of contents survive; the recoverable contents entries cover database concepts, tree or hierarchical relationships, the relationship between relational and data-processing concepts, and an example of a tree relationship.)

  3. Custom ultrasonic instrumentation for flow measurement and real-time binary gas analysis in the CERN ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Alhroob, M.; Battistin, M.; Berry, S.; Bitadze, A.; Bonneau, P.; Boyd, G.; Crespo-Lopez, O.; Degeorge, C.; Deterre, C.; Di Girolamo, B.; Doubek, M.; Favre, G.; Hallewell, G.; Katunin, S.; Lombard, D.; Madsen, A.; McMahon, S.; Nagai, K.; O'Rourke, A.; Pearson, B.; Robinson, D.; Rossi, C.; Rozanov, A.; Stanecka, E.; Strauss, M.; Vacek, V.; Vaglio, R.; Young, J.; Zwalinski, L.

    2017-01-01

    The development of custom ultrasonic instrumentation was motivated by the need for continuous real-time monitoring of possible leaks and mass flow measurement in the evaporative cooling systems of the ATLAS silicon trackers. The instruments use pairs of ultrasonic transducers transmitting sound bursts and measuring transit times in opposite directions. The gas flow rate is calculated from the difference in transit times, while the sound velocity is deduced from their average. The gas composition is then evaluated by comparison with a molar composition vs. sound velocity database, based on the direct dependence between sound velocity and component molar concentration in a gas mixture at a known temperature and pressure. The instrumentation has been developed in several geometries, with five instruments now integrated and in continuous operation within the ATLAS Detector Control System (DCS) and its finite state machine. One instrument monitors C3F8 coolant leaks into the Pixel detector N2 envelope with a molar resolution better than 2×10⁻⁵, and has indicated a level of 0.14% with all the cooling loops of the recently re-installed Pixel detector operational. Another instrument monitors air ingress into the C3F8 condenser of the new C3F8 thermosiphon coolant recirculator, with sub-percent precision. The effect of the recent introduction of a small quantity of N2 into the 9.5 m³ total volume of the thermosiphon system was clearly seen with this instrument. Custom microcontroller-based readout has been developed for the instruments, allowing readout into the ATLAS DCS via Modbus TCP/IP over Ethernet. The instrumentation has many potential applications where continuous binary gas composition measurement is required, including hydrocarbon and anaesthetic gas mixtures.
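
    The transit-time arithmetic lends itself to a compact illustration. A minimal sketch under simplifying assumptions (a single transducer pair separated by a known path length, with the flow parallel to the sound path; the real instruments apply geometry and calibration corrections):

        def sound_and_flow_velocity(path_m, t_down_s, t_up_s):
            """Derive sound speed and flow speed from ultrasonic transit
            times measured in opposite directions: the average of the
            transit times gives the sound speed, the difference gives
            the gas flow."""
            c = (path_m / 2.0) * (1.0 / t_down_s + 1.0 / t_up_s)
            v = (path_m / 2.0) * (1.0 / t_down_s - 1.0 / t_up_s)
            return c, v

        # Example: 0.30 m path in a slow C3F8-like gas (sound speed
        # near 120 m/s), transit times differing by ~1 microsecond.
        c, v = sound_and_flow_velocity(0.30, 2.4995e-3, 2.5005e-3)
        print(f"sound speed ~ {c:.1f} m/s, flow ~ {v:.3f} m/s")

    The measured sound speed would then be looked up against the molar composition vs. sound velocity database at the known temperature and pressure to infer the mixture composition.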

  4. NOVAC - Network for Observation of Volcanic and Atmospheric Change: Data archiving and management

    NASA Astrophysics Data System (ADS)

    Lehmann, T.; Kern, C.; Vogel, L.; Platt, U.; Johansson, M.; Galle, B.

    2009-12-01

    The potential for volcanic risk assessment using real-time gas emissions data and the recognized power of sharing data from multiple eruptive centers were the motivation for a European Union FP6 Research Program project entitled NOVAC: Network for Observation of Volcanic and Atmospheric Change. Starting in 2005, a network of permanent scanning Differential Optical Absorption Spectroscopy (DOAS) instruments was installed at 26 volcanoes around the world. These ground-based remote sensing instruments record the characteristic absorption of volcanic gas emissions (e.g. SO2, BrO) in the ultraviolet wavelength region. A real-time DOAS retrieval was implemented to evaluate the measured spectra, thus providing the respective observatories with gas emission data that can be used for volcanic risk assessment and hazard prediction. Observatory personnel at each partner institution were trained on technical and scientific aspects of the DOAS technique, and a central database was created to allow the exchange of data and ideas between all partners. The result is a bilateral benefit for volcano observatories as well as scientific institutions (e.g. universities and research centers). Volcano observatories were provided with leading-edge technology for measuring volcanic SO2 emission fluxes, and now use this technology for monitoring and risk assessment, while the involved universities and research centers are working on global studies and characterizing the atmospheric impact of the observed gas emissions. The NOVAC database takes into account that project members use the database in a variety of different ways. The data is therefore structured in layers, the top of which contains basic information about each instrument. The second layer contains evaluated emission data such as SO2 column densities, SO2 emission fluxes, and BrO/SO2 ratios. The lowest layer contains all spectra measured by the individual instruments. Online since the middle of 2006, the NOVAC database currently contains 26 volcanoes, 56 instruments and more than 50 million spectra. It is scalable to 200 or more volcanoes, as the NOVAC project is open to outside participation. The data is archived in a MySQL database system; storing and querying are done with PHP functions. The web interface is dynamically created based on the existing dataset and offers approximately 150 different search, display, and sorting options. Each user has a separate account and can save a personal search configuration from session to session. Search results are displayed in table form and can also be downloaded. Both evaluated data files and measured spectra can be downloaded as single files or in packages. The spectra, as well as several measurement values and evaluated parameters over selectable timescales, can be plotted directly from the database. Because of the large extent of the dataset, major emphasis was placed on performance optimization.
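
    The layered organization maps naturally onto relational queries. A minimal sketch, assuming a hypothetical two-table layout (instruments for the top layer, flux_data for the evaluated-data layer); the actual NOVAC schema is not published in this abstract, and the volcano name is a placeholder:

        import mysql.connector  # pip install mysql-connector-python

        conn = mysql.connector.connect(
            host="localhost", user="novac", password="...", database="novac")
        cur = conn.cursor()

        # Second-layer query: SO2 emission fluxes for one volcano over a
        # week, joined against the top-layer instrument descriptions.
        cur.execute(
            "SELECT f.measured_at, f.so2_flux, i.serial "
            "FROM flux_data f JOIN instruments i ON i.id = f.instrument_id "
            "WHERE i.volcano = %s AND f.measured_at BETWEEN %s AND %s "
            "ORDER BY f.measured_at",
            ("ExampleVolcano", "2009-06-01", "2009-06-08"),
        )
        for measured_at, flux, serial in cur:
            print(measured_at, serial, flux)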

  5. Bio-optical data integration based on a 4 D database system approach

    NASA Astrophysics Data System (ADS)

    Imai, N. N.; Shimabukuro, M. H.; Carmo, A. F. C.; Alcantara, E. H.; Rodrigues, T. W. P.; Watanabe, F. S. Y.

    2015-04-01

    Bio-optical characterization of water bodies requires spatio-temporal data about Inherent Optical Properties and Apparent Optical Properties, which together describe the underwater light field and support the development of models for monitoring water quality. Measurements are taken to represent optical properties along a column of water, and the spectral data must then be related to depth. However, the spatial positions of measurements may differ because collecting instruments vary, the records may not refer to the same wavelengths, and distinct instruments store data in different formats. A data integration approach is needed to make these large, multi-source data sets suitable for analysis; it then becomes possible to evaluate semi-empirical models, even automatically, preceded by preliminary quality-control tasks. This work presents a solution for this scenario based on a spatial (geographic) database approach, adopting an object-relational Database Management System (DBMS) to represent all data collected in the field together with data obtained by laboratory analysis and remote sensing images taken at the time of field data collection. This data integration approach leads to a 4D representation, since its coordinate system includes 3D spatial coordinates (planimetric and depth) and the time when each datum was taken. The PostgreSQL DBMS, extended by the PostGIS module, was adopted to manage spatial/geospatial data, and a prototype was developed that provides the main tools an analyst needs to prepare the data sets for analysis.
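
    A minimal sketch of how such a 4D record might be laid out; the table below is illustrative, not the paper's schema, and it requires the PostGIS extension. The planimetric position is stored as a geometry, while depth and acquisition time are ordinary columns:

        import psycopg2  # pip install psycopg2-binary

        conn = psycopg2.connect("dbname=biooptics user=analyst")
        cur = conn.cursor()

        # Hypothetical 4D layout: PostGIS point for the planimetric
        # position, plus depth (3rd dimension) and time (4th).
        cur.execute("""
            CREATE TABLE IF NOT EXISTS radiometry (
                id            SERIAL PRIMARY KEY,
                position      GEOMETRY(POINT, 4326),
                depth_m       DOUBLE PRECISION,
                measured_at   TIMESTAMPTZ,
                wavelength_nm DOUBLE PRECISION,
                value         DOUBLE PRECISION
            )""")
        cur.execute("""
            INSERT INTO radiometry
                (position, depth_m, measured_at, wavelength_nm, value)
            VALUES (ST_SetSRID(ST_MakePoint(%s, %s), 4326), %s, %s, %s, %s)""",
            (-48.43, -22.49, 1.5, "2014-05-10 13:20:00-03", 555.0, 0.042))
        conn.commit()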

  6. A development and integration of database code-system with a compilation of comparator, k0 and absolute methods for INAA using microsoft access

    NASA Astrophysics Data System (ADS)

    Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong

    2013-05-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine the elemental concentrations of samples at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 to help INAA users choose among the comparator, k0 and absolute methods for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations obtained from the experiments with those from the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal-to-epithermal neutron flux ratio (f). The calculation of concentration involved the net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal-to-epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and the detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample; other CRMs and SRMs were also used in this database code-system. Finally, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing sample concentrations between the code-system and the experiment. The ECC-UKM database code-system reproduced the experimental concentration values with good accuracy.

  7. The HARPS-N archive through a Cassandra, NoSQL database suite?

    NASA Astrophysics Data System (ADS)

    Molinari, Emilio; Guerra, Jose; Harutyunyan, Avet; Lodi, Marcello; Martin, Adrian

    2016-07-01

    The TNG-INAF is developing the science archive for the WEAVE instrument. The underlying architecture of the archive is based on a non-relational database, more precisely on an Apache Cassandra cluster, which uses NoSQL technology. In order to test and validate this architecture, we created a local archive populated with all the HARPS-N spectra collected at the TNG since the instrument's start of operations in mid-2012, and developed tools for the analysis of this data set. The HARPS-N data set is two orders of magnitude smaller than WEAVE's, but we want to demonstrate the ability to walk through a complete data set and produce scientific output as valuable as that produced by an ordinary pipeline, without directly accessing the FITS files. The analytics is done with Apache Solr and Spark, and on a relational PostgreSQL database. As an example, we produce observables such as metallicity indexes for the targets in the archive and compare the results with those coming from the HARPS-N regular data reduction software. The aim of this experiment is to explore the viability of a high-availability cluster and distributed NoSQL database as a platform for complex scientific analytics on a large data set, which will then be ported to the WEAVE Archive System (WAS) that we are developing for the WEAVE multi-object fiber spectrograph.
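
    For flavor, a minimal sketch of reading such an archive with the DataStax Python driver; the keyspace, table, and column names here are hypothetical, since the abstract does not describe the actual schema:

        from cassandra.cluster import Cluster  # pip install cassandra-driver

        cluster = Cluster(["archive-node1", "archive-node2"])
        session = cluster.connect("harpsn")  # hypothetical keyspace

        # CQL query against a hypothetical spectra table, partitioned by
        # target so all observations of one star live together.
        rows = session.execute(
            "SELECT obs_date, snr, rv_m_s FROM spectra WHERE target = %s",
            ("HD189733",))
        for row in rows:
            print(row.obs_date, row.snr, row.rv_m_s)

        cluster.shutdown()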

  8. ClassLess: A Comprehensive Database of Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Hillenbrand, Lynne; Baliber, Nairn

    2015-01-01

    We have designed and constructed a database housing published measurements of Young Stellar Objects (YSOs) within ~1 kpc of the Sun. ClassLess, so called because it includes YSOs in all stages of evolution, is a relational database in which user interaction is conducted via HTML web browsers, queries are performed in scientific language, and all data are linked to the sources of publication. Each star is associated with a cluster (or clusters), and both spatially resolved and unresolved measurements are stored, allowing proper use of data from multiple star systems. With this fully searchable tool, myriad ground- and space-based instruments and surveys across wavelength regimes can be exploited. In addition to primary measurements, the database self-consistently calculates and serves higher-level data products such as extinction, luminosity, and mass. As a result, searches for young stars with specific physical characteristics can be completed with just a few mouse clicks.

  9. Linking Publications to Instruments, Field Campaigns, Sites and Working Groups: The ARM Experience

    NASA Astrophysics Data System (ADS)

    Lehnert, K.; Parsons, M. A.; Ramachandran, R.; Fils, D.; Narock, T.; Fox, P. A.; Troyan, D.; Cialella, A. T.; Gregory, L.; Lazar, K.; Liang, M.; Ma, L.; Tilp, A.; Wagener, R.

    2017-12-01

    For the past 25 years, the ARM Climate Research Facility - a US Department of Energy scientific user facility - has been collecting atmospheric data in different climatic regimes using both in situ and remote instrumentation. Configuration of the facility's components has been designed to improve the understanding and representation, in climate and earth system models, of clouds and aerosols. Placing a premium on long-term continuous data collection has resulted in terabytes of data being collected, stored, and made accessible to any interested person. All data is accessible via the ARM.gov website and the ARM Data Discovery Tool. A team of metadata professionals assigns appropriate tags to help facilitate searching the databases for desired data. The knowledge organization tools and concepts used to create connections between data, instruments, field campaigns, sites, and measurements are familiar to informatics professionals: ontology, taxonomy, classification, and thesauri are among the customized concepts put into practice for ARM's purposes. In addition to the multitude of data available, approximately 3,000 journal articles that utilize ARM data have been published and linked to specific ARM web pages. Searches of the complete ARM publication database can be done using a separate interface. This presentation describes how ARM data is linked to instruments, sites, field campaigns, and publications through the application of standard knowledge organization tools and concepts.

  10. An approach to regional wetland digital elevation model development using a differential global positioning system and a custom-built helicopter-based surveying system

    USGS Publications Warehouse

    Jones, J.W.; Desmond, G.B.; Henkle, C.; Glover, R.

    2012-01-01

    Accurate topographic data are critical to restoration science and planning for the Everglades region of South Florida, USA. They are needed to monitor and simulate water level, water depth and hydroperiod and are used in scientific research on hydrologic and biologic processes. Because large wetland environments challenge conventional ground-based and remotely sensed data collection methods, the United States Geological Survey (USGS) adapted a classical data collection instrument to global positioning system (GPS) and geographic information system (GIS) technologies. Data acquired with this instrument were processed using geostatistics to yield sub-water level elevation values with centimetre accuracy (±15 cm). The developed database framework, modelling philosophy and metadata protocol allow for continued, collaborative model revision and expansion, given additional elevation or other ancillary data. © 2012 Taylor & Francis.

  11. Operational Support for Instrument Stability through ODI-PPA Metadata Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.; Kotulla, R.; Harbeck, D.; Liu, W.

    2015-09-01

    Over long time scales, quality assurance metrics taken from calibration and calibrated data products can aid observatory operations in quantifying the performance and stability of the instrument, and identify potential areas of concern or guide troubleshooting and engineering efforts. Such methods traditionally require manual SQL entries, assuming the requisite metadata has even been ingested into a database. With the ODI-PPA system, QA metadata has been harvested and indexed for all data products produced over the life of the instrument. In this paper we will describe how, utilizing the industry standard Highcharts Javascript charting package with a customized AngularJS-driven user interface, we have made the process of visualizing the long-term behavior of these QA metadata simple and easily replicated. Operators can easily craft a custom query using the powerful and flexible ODI-PPA search interface and visualize the associated metadata in a variety of ways. These customized visualizations can be bookmarked, shared, or embedded externally, and will be dynamically updated as new data products enter the system, enabling operators to monitor the long-term health of their instrument with ease.

  12. Ultra-Structure database design methodology for managing systems biology data and analyses

    PubMed Central

    Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C

    2009-01-01

    Background: Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results: We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion: We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
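
    The core idea, rules stored as data rather than code, can be caricatured in a few lines. A minimal sketch under loose assumptions (a hypothetical two-column rule table; real Ultra-Structure "ruleforms" are considerably richer), using an in-memory SQLite database:

        import fnmatch
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE rules (pattern TEXT, action TEXT)")
        # Domain knowledge lives in table rows, not in code: users can
        # change behavior by editing rows, with no schema or code changes.
        conn.executemany("INSERT INTO rules VALUES (?, ?)", [
            ("*.mzML", "queue_spectrum_search"),
            ("*.gff",  "load_genome_annotation"),
        ])

        def dispatch(filename):
            """Apply the first stored rule whose pattern matches."""
            for pattern, action in conn.execute("SELECT pattern, action FROM rules"):
                if fnmatch.fnmatch(filename, pattern):
                    return action
            return "ignore"

        print(dispatch("run42.mzML"))  # -> queue_spectrum_search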

  13. LOINC, a universal standard for identifying laboratory observations: a 5-year update.

    PubMed

    McDonald, Clement J; Huff, Stanley M; Suico, Jeffrey G; Hill, Gilbert; Leavelle, Dennis; Aller, Raymond; Forrey, Arden; Mercer, Kathy; DeMoor, Georges; Hook, John; Williams, Warren; Case, James; Maloney, Pat

    2003-04-01

    The Logical Observation Identifier Names and Codes (LOINC) database provides a universal code system for reporting laboratory and other clinical observations. Its purpose is to identify observations in electronic messages such as Health Level Seven (HL7) observation messages, so that when hospitals, health maintenance organizations, pharmaceutical manufacturers, researchers, and public health departments receive such messages from multiple sources, they can automatically file the results in the right slots of their medical records, research, and/or public health systems. For each observation, the database includes a code (25,000 of the codes are laboratory test observations), a long formal name, a "short" 30-character name, and synonyms. The database comes with a mapping program called the Regenstrief LOINC Mapping Assistant (RELMA™) to assist the mapping of local test codes to LOINC codes and to facilitate browsing of the LOINC results. Both LOINC and RELMA are available at no cost from http://www.regenstrief.org/loinc/. The LOINC medical database carries records for >30,000 different observations. LOINC codes are being used by large reference laboratories and federal agencies, e.g., the CDC and the Department of Veterans Affairs, and are part of the Health Insurance Portability and Accountability Act (HIPAA) attachment proposal. Internationally, they have been adopted in Switzerland, Hong Kong, Australia, and Canada, and by the German national standards organization, the Deutsches Institut für Normung. Laboratories should include LOINC codes in their outbound HL7 messages so that clinical and research clients can easily integrate these results into their clinical and research repositories. Laboratories should also encourage instrument vendors to deliver LOINC codes in their instrument outputs and demand LOINC codes in HL7 messages they get from reference laboratories, to avoid the need to lump so many referral tests under the "send out lab" code.
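
    Since LOINC codes travel inside HL7 v2 observation (OBX) segments, extracting them is simple field splitting: HL7 v2 uses "|" between fields and "^" between components. A minimal sketch; the segment below is a constructed example (LOINC 2345-7 is serum/plasma glucose), not taken from the paper:

        OBX = "OBX|1|NM|2345-7^Glucose^LN||95|mg/dL|70-99|N|||F"

        fields = OBX.split("|")
        code, name, code_system = fields[3].split("^")  # observation id
        value, units = fields[5], fields[6]

        assert code_system == "LN"  # "LN" marks a LOINC-coded identifier
        print(f"LOINC {code} ({name}): {value} {units}")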

  14. Enhancements to the NASA Astrophysics Science Information and Abstract Service

    NASA Astrophysics Data System (ADS)

    Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Murray, S. S.

    1995-05-01

    The NASA Astrophysics Data System Astrophysics Science Information and Abstract Service (ASIAS), the extension of the ADS Abstract Service, continues to expand rapidly in both use and capabilities. Each month the service is used by about 4,000 different people and returns about 1,000,000 pieces of bibliographic information. Among the recent additions to the system are: 1. Whole Text Access. In addition to the ApJ Letters, we now have the whole text of the ApJ on-line; soon we will have AJ and Rev. Mexicana. Discussions with other publishers are in progress. 2. Space Instrumentation Database. We now provide a second abstract service covering papers related to space instruments. This is larger than the astronomy and astrophysics database in terms of total abstracts. 3. Reference Books and Historical Journals. We have begun putting the SAO Annals and the HCO Annals on-line. We have put the Handbook of Space Astronomy and Astrophysics by M.V. Zombeck (Cambridge U.P.) on-line. 4. Author Abstracts. We can now include original abstracts in addition to those we get from the NASA STI Abstracts Database. We have included abstracts for A&A in collaboration with the CDS in Strasbourg, and are collaborating with the AAS and the ASP on others. We invite publishers and editors of journals and conference proceedings to include their original abstracts in our service; send inquiries via e-mail to ads@cfa.harvard.edu. 5. Author Notes. We now accept notes and comments from authors of articles in our database. These are arbitrary HTML files and may contain pointers to other WWW documents; they are listed along with the abstracts, whole text, and data available in the index listing for every reference. The ASIAS is available at: http://adswww.harvard.edu/

  15. The search for relevant outcome measures for cost-utility analysis of systemic family interventions in adolescents with substance use disorder and delinquent behavior: a systematic literature review.

    PubMed

    Schawo, S; Bouwmans, C; van der Schee, E; Hendriks, V; Brouwer, W; Hakkaart, L

    2017-09-19

    Systemic family interventions have been shown to be effective in adolescents with substance use disorder and delinquent behavior. The interventions target interactions between the adolescent and the involved systems (i.e. youth, family, peers, neighbors, school, work, and society). Alongside effectiveness considerations, economic aspects have gained attention. However, the conventional generic quality-of-life measures used in health economic evaluations may not be able to capture the broad effects of systemic interventions. This study aims to identify existing outcome measures that capture the broad effects of systemic family interventions and allow use in a health economic framework. We based our systematic review on clinical studies in the field: our goal was to identify effectiveness studies of psychosocial interventions for adolescents with substance use disorder and delinquent behavior and to distill the instruments used in these studies to measure effects. The databases searched were PubMed, Education Resource Information Center (ERIC), Cochrane and PsycNET (PsycBOOKS, PsycCRITIQUES, print). Identified instruments were ranked according to the number of systems covered (comprehensiveness). In addition, their suitability for health economic analyses was evaluated according to characteristics such as brevity, accessibility, and psychometric properties. One thousand three hundred seventy-eight articles were found and screened for eligibility. Eighty articles were selected, and 8 instruments were identified covering 5 or more systems. The systematic review identified instruments from the clinical field suitable for evaluating systemic family interventions in a health economic framework. None of them had preference weights available. Hence, a next step could be to attach preference weights to one of the identified instruments to allow health economic evaluations of systemic family interventions.

  16. The Computerized Laboratory Notebook concept for genetic toxicology experimentation and testing.

    PubMed

    Strauss, G H; Stanford, W L; Berkowitz, S J

    1989-03-01

    We describe a microcomputer system utilizing the Computerized Laboratory Notebook (CLN) concept developed in our laboratory for the purpose of automating the Battery of Leukocyte Tests (BLT). The BLT was designed to evaluate blood specimens for toxic, immunotoxic, and genotoxic effects after in vivo exposure to putative mutagens. A system was developed with the advantages of low cost, limited spatial requirements, ease of use for personnel inexperienced with computers, and applicability to specific testing yet flexibility for experimentation. This system eliminates cumbersome record keeping and repetitive analysis inherent in genetic toxicology bioassays. Statistical analysis of the vast quantity of data produced by the BLT would not be feasible without a central database. Our central database is maintained by an integrated package which we have adapted to develop the CLN. The clonal assay of lymphocyte mutagenesis (CALM) section of the CLN is demonstrated. PC-Slaves expand the microcomputer to multiple workstations so that our computerized notebook can be used next to a hood while other work is done in an office and instrument room simultaneously. Communication with peripheral instruments is an indispensable part of many laboratory operations, and we present a representative program, written to acquire and analyze CALM data, for communicating with both a liquid scintillation counter and an ELISA plate reader. In conclusion we discuss how our computer system could easily be adapted to the needs of other laboratories.

  17. Evaluation Instruments for Quality of Life Related to Melasma: An Integrative Review.

    PubMed

    Pollo, Camila Fernandes; Meneguin, Silmara; Miot, Helio Amante

    2018-05-21

    The aim of this study was to analyze scientific production concerning the validation and cultural adaptation of quality of life evaluation instruments for patients with melasma and to offer a critical reflection on these methods. A literature review was performed based on a search of the Web of Science, Bireme, PubMed, Elsevier Scopus, and Google Scholar databases. All published articles from indexed periodicals in these electronic databases up to December 2015 were included. Eight articles were identified, of which only one (12.5%) referred to the development and validation of a specific instrument for evaluation of the quality of life of melasma patients. An additional six articles (75%) referred to transcultural adjustment and validation of the same instrument in other languages, and another (12.5%) article reported the development of a generic instrument for evaluation of quality of life in patients with pigment disorders. This review revealed only one specific instrument developed and validated in different cultures. Despite being widely used, this instrument did not follow the classic construction steps for psychometric instruments, which paves the way for future studies to develop novel instruments.

  18. Evaluation Instruments for Quality of Life Related to Melasma: An Integrative Review

    PubMed Central

    Pollo, Camila Fernandes; Meneguin, Silmara; Miot, Helio Amante

    2018-01-01

    The aim of this study was to analyze scientific production concerning the validation and cultural adaptation of quality of life evaluation instruments for patients with melasma and to offer a critical reflection on these methods. A literature review was performed based on a search of the Web of Science, Bireme, PubMed, Elsevier Scopus, and Google Scholar databases. All published articles from indexed periodicals in these electronic databases up to December 2015 were included. Eight articles were identified, of which only one (12.5%) referred to the development and validation of a specific instrument for evaluation of the quality of life of melasma patients. An additional six articles (75%) referred to transcultural adjustment and validation of the same instrument in other languages, and another (12.5%) article reported the development of a generic instrument for evaluation of quality of life in patients with pigment disorders. This review revealed only one specific instrument developed and validated in different cultures. Despite being widely used, this instrument did not follow the classic construction steps for psychometric instruments, which paves the way for future studies to develop novel instruments. PMID:29791603

  19. Representing nursing assessments in clinical information systems using the logical observation identifiers, names, and codes database.

    PubMed

    Matney, Susan; Bakken, Suzanne; Huff, Stanley M

    2003-01-01

    In recent years, the Logical Observation Identifiers, Names, and Codes (LOINC) Database has been expanded to include assessment items of relevance to nursing and in 2002 met the criteria for "recognition" by the American Nurses Association. Assessment measures in LOINC include those related to vital signs, obstetric measurements, clinical assessment scales, assessments from standardized nursing terminologies, and research instruments. In order for LOINC to be of greater use in implementing information systems that support nursing practice, additional content is needed. Moreover, those implementing systems for nursing practice must be aware of the manner in which LOINC codes for assessments can be appropriately linked with other aspects of the nursing process such as diagnoses and interventions. Such linkages are necessary to document nursing contributions to healthcare outcomes within the context of a multidisciplinary care environment and to facilitate building of nursing knowledge from clinical practice. The purposes of this paper are to provide an overview of the LOINC database, to describe examples of assessments of relevance to nursing contained in LOINC, and to illustrate linkages of LOINC assessments with other nursing concepts.

  20. Scales for evaluating self-perceived anxiety levels in patients admitted to intensive care units: a review.

    PubMed

    Perpiñá-Galvañ, Juana; Richart-Martínez, Miguel

    2009-11-01

    To review studies of anxiety in critically ill patients admitted to an intensive care unit, to describe the level of anxiety, and to synthesize the psychometric properties of the instruments used to measure anxiety. The CUIDEN, IME, ISOC, CINAHL, MEDLINE, and PSYCINFO databases for 1995 to 2005 were searched. The search focused on 3 concepts: anxiety, intensive care, and mechanical ventilation for the English-language databases and ansiedad, cuidados intensivos, and ventilación mecánica for the Spanish-language databases. Information was extracted from 18 selected articles on the level of anxiety experienced by patients and the psychometric properties of the instruments used to measure anxiety. Moderate levels of anxiety were reported. Levels were higher in women than in men, and higher in patients undergoing positive pressure ventilation regardless of sex. Most multi-item instruments had high coefficients of internal consistency. The reliability of instruments with only a single item was not demonstrated, even though the instruments had moderate-to-high correlations with other measurements. Mid-length scales, such as the anxiety subscale of the Brief Symptom Inventory or the shortened state version of the State-Trait Anxiety Inventory, are best for measuring anxiety in critical care patients.

  1. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    PubMed

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Geographic Information Systems and Martian Data: Compatibility and Analysis

    NASA Technical Reports Server (NTRS)

    Jones, Jennifer L.

    2005-01-01

    Planning future landed Mars missions depends on accurate, informed data. This research has created and used spatially referenced instrument data from NASA missions such as the Thermal Emission Imaging System (THEMIS) on the Mars Odyssey Orbiter and the Mars Orbital Camera (MOC) on the Mars Global Surveyor (MGS) Orbiter. Creating spatially referenced data enables its use in Geographic Information Systems (GIS) such as ArcGIS. It has then been possible to integrate this spatially referenced data with global base maps and build and populate location based databases that are easy to access.

  3. Intelligent signal analysis and recognition

    NASA Technical Reports Server (NTRS)

    Levinson, Robert; Helman, Daniel; Oswalt, Edward

    1987-01-01

    Progress in the research and development of a self-organizing database system that can support the identification and characterization of signals in an RF environment is described. As the radio frequency spectrum becomes more crowded, a number of situations require characterization of the RF environment. This database system is designed to be practical in applications where communications and other instruments encounter a time-varying and complex RF environment. The primary application of this system is the guidance and control of NASA's SETI Microwave Observing Project. The selection of telemetry bands for communication with spacecraft and the scheduling of antennas for radio astronomy are two other examples where characterization of the RF environment is required. In these applications, the RF environment is constantly changing, and even experienced operators cannot quickly identify the multitude of signals that can be encountered. Some of these signals are repetitive; others appear to occur sporadically.

  4. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are access control, browsing, searching, reports, and record comparison. The search capabilities search within any searchable files, so data can still be retrieved even if the desired meta-data has not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  5. The NIFFTE Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Qu, Hai; Niffte Collaboration

    2011-10-01

    The Neutron Induced Fission Fragment Tracking Experiment (NIFFTE) will employ a novel, high granularity, pressurized Time Projection Chamber to measure fission cross-sections of the major actinides to high precision over a wide incident neutron energy range. These results will improve nuclear data accuracy and benefit the fuel cycle in the future. The NIFFTE data acquisition system (DAQ) has been designed and implemented on the prototype TPC. Lessons learned from engineering runs have been incorporated into some design changes that are being implemented before the next run cycle. A fully instrumented sextant of EtherDAQ cards (16 sectors, 496 channels) will be used for the next run cycle. The Maximum Integrated Data Acquisition System (MIDAS) has been chosen and customized to configure and run the experiment. It also meets the requirement for remote control and monitoring of the system. The integration of the MIDAS online database with the persistent PostgreSQL database has been implemented for experiment usage. The detailed design and current status of the DAQ system will be presented.

  6. Inter-University Upper Atmosphere Global Observation Network (IUGONET) Metadata Database and Its Interoperability

    NASA Astrophysics Data System (ADS)

    Yatagai, A. I.; Iyemori, T.; Ritschel, B.; Koyama, Y.; Hori, T.; Abe, S.; Tanaka, Y.; Shinbori, A.; Umemura, N.; Sato, Y.; Yagi, M.; Ueno, S.; Hashiguchi, N. O.; Kaneda, N.; Belehaki, A.; Hapgood, M. A.

    2013-12-01

    The IUGONET is a Japanese program to build a metadata database for ground-based observations of the upper atmosphere [1]. The project began in 2009 with five Japanese institutions which archive data observed by radars, magnetometers, photometers, radio telescopes and helioscopes, and so on, at various altitudes from the Earth's surface to the Sun. Systems have been developed to allow searching of the above-described metadata, and we have been updating the system and adding new and updated metadata. The IUGONET development team adopted the SPASE metadata model [2] to describe the upper atmosphere data. This model is used as the common metadata format by the virtual observatories for solar-terrestrial physics. It includes metadata referring to each data file (called a 'Granule'), which enables a search for data files as well as data sets. Further details are described in [2] and [3]. Currently, three additional Japanese institutions are being incorporated in IUGONET. Furthermore, metadata of observations of the troposphere, taken at the observatories of the middle and upper atmosphere radar at Shigaraki and the meteor radar in Indonesia, have been incorporated. These additions will contribute to efficient interdisciplinary scientific research. In the beginning of 2013, the registration of the 'Observatory' and 'Instrument' metadata was completed, which makes it easy to get an overview of the metadata database. The number of registered metadata records as of the end of July totalled 8.8 million, including 793 observatories and 878 instruments. It is important to promote interoperability and/or metadata exchange between the database development groups. A memorandum of agreement has been signed with the European Near-Earth Space Data Infrastructure for e-Science (ESPAS) project, which has objectives similar to IUGONET's, to provide a framework for formal collaboration. Furthermore, observations by satellites and the International Space Station are being incorporated with a view to creating and linking metadata databases. The development of effective data systems will contribute to the progress of scientific research on solar-terrestrial physics, climate and the geophysical environment. Any kind of cooperation, metadata input and feedback, especially for linkage of the databases, is welcome. References 1. Hayashi, H. et al., Inter-university Upper Atmosphere Global Observation Network (IUGONET), Data Sci. J., 12, WDS179-184, 2013. 2. King, T. et al., SPASE 2.0: A standard data model for space physics. Earth Sci. Inform. 3, 67-73, 2010, doi:10.1007/s12145-010-0053-4. 3. Hori, T., et al., Development of IUGONET metadata format and metadata management system. J. Space Sci. Info. Jpn., 105-111, 2012. (in Japanese)

  7. Head Up Displays. (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The bibliography contains citations concerning the design, fabrication, and applications of head up displays (HUDs). Applications include military aircraft, helicopters, space shuttle, and commercial aircraft. Functions of the display include instrument approach, target tracking, and navigation. The head up display provides for an integrated avionics system with the pilot in the loop. (Contains 50-250 citations and includes a subject term index and title list.)

  8. Head Up Displays. (Latest citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning the design, fabrication, and applications of head up displays (HUDs). Applications include military aircraft, helicopters, space shuttle, and commercial aircraft. Functions of the display include instrument approach, target tracking, and navigation. The head up display provides for an integrated avionics system with the pilot in the loop. (Contains 50-250 citations and includes a subject term index and title list.)

  9. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the DRMs are used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks has been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to modify the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details of the ATLAS onboard Receiver Algorithms and databases development, respectively.
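
    The histogram-based signal finder is easy to caricature. A minimal sketch, not the flight algorithm (the bin width, the significance threshold, and the DEM/DRM windowing are all simplified assumptions):

        import numpy as np

        def find_signal_bins(event_times_s, bin_width_s=1e-8, nsigma=5.0):
            """Histogram photon event times and flag bins whose counts
            stand significantly above the random background."""
            nbins = int((event_times_s.max() - event_times_s.min()) / bin_width_s)
            counts, edges = np.histogram(event_times_s, bins=nbins)
            background = np.median(counts)  # robust noise estimate
            threshold = background + nsigma * np.sqrt(max(background, 1.0))
            return edges, counts, counts > threshold

        rng = np.random.default_rng(0)
        noise = rng.uniform(0.0, 1e-4, 20_000)   # background photons
        echo = rng.normal(4.2e-5, 2e-9, 300)     # surface return, clustered
        edges, counts, signal = find_signal_bins(np.concatenate([noise, echo]))
        print("signal bins:", np.flatnonzero(signal))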

  10. Airborne Shaped Sonic Boom Demonstration Pressure Measurements with Computational Fluid Dynamics Comparisons

    NASA Technical Reports Server (NTRS)

    Haering, Edward A., Jr.; Murray, James E.; Purifoy, Dana D.; Graham, David H.; Meredith, Keith B.; Ashburn, Christopher E.; Stucky, Mark

    2005-01-01

    The Shaped Sonic Boom Demonstration project showed for the first time that by careful design of aircraft contour the resultant sonic boom can maintain a tailored shape, propagating through a real atmosphere down to ground level. In order to assess the propagation characteristics of the shaped sonic boom and to validate computational fluid dynamics codes, airborne measurements were taken of the pressure signatures in the near field by probing with an instrumented F-15B aircraft, and in the far field by overflying an instrumented L-23 sailplane. This paper describes each aircraft and their instrumentation systems, the airdata calibration, analysis of the near- and far-field airborne data, and shows the good to excellent agreement between computational fluid dynamics solutions and flight data. The flights of the Shaped Sonic Boom Demonstration aircraft occurred in two phases. Instrumentation problems were encountered during the first phase, and corrections and improvements were made to the instrumentation system for the second phase, which are documented in the paper. Piloting technique and observations are also given. These airborne measurements of the Shaped Sonic Boom Demonstration aircraft are a unique and important database that will be used to validate design tools for a new generation of quiet supersonic aircraft.

  11. Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel

    NASA Astrophysics Data System (ADS)

    Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.

    2008-12-01

    The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and home-made user applications for workflow- specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and visualization of context data collected with the shipboard core loggers and other instruments.

  12. Identification of phlebotomine sand flies using one MALDI-TOF MS reference database and two mass spectrometer systems.

    PubMed

    Mathis, Alexander; Depaquit, Jérôme; Dvořák, Vit; Tuten, Holly; Bañuls, Anne-Laure; Halada, Petr; Zapata, Sonia; Lehrter, Véronique; Hlavačková, Kristýna; Prudhomme, Jorian; Volf, Petr; Sereno, Denis; Kaufmann, Christian; Pflüger, Valentin; Schaffner, Francis

    2015-05-10

    Rapid, accurate and high-throughput identification of vector arthropods is of paramount importance in surveillance programmes that are becoming more common due to the changing geographic occurrence and extent of many arthropod-borne diseases. Protein profiling by MALDI-TOF mass spectrometry fulfils these requirements for identification, and reference databases have recently been established for several vector taxa, mostly with specimens from laboratory colonies. We established and validated a reference database containing 20 phlebotomine sand fly (Diptera: Psychodidae, Phlebotominae) species by using specimens from colonies or field collections that had been stored for various periods of time. Identical biomarker mass patterns ('superspectra') were obtained with colony- or field-derived specimens of the same species. In the validation study, high-quality spectra (i.e. more than 30 evaluable masses) were obtained with all fresh insects from colonies, and with 55/59 insects deep-frozen (liquid nitrogen/-80 °C) for up to 25 years. In contrast, only 36/52 specimens stored in ethanol could be identified. This resulted in an overall sensitivity of 87% (140/161); specificity was 100%. Duration of storage impaired data counts in the high mass range, and thus cluster analyses of closely related specimens might reflect their storage conditions rather than phenotypic distinctness. A major drawback of MALDI-TOF MS is the restricted availability of in-house databases and the fact that mass spectrometers from two companies (Bruker, Shimadzu) are widely used. We analysed fingerprints of phlebotomine sand flies obtained by an automatic routine procedure on a Bruker instrument using our database and the software established on a Shimadzu system. The sensitivity with 312 specimens from 8 sand fly species from laboratory colonies, when evaluating only high-quality spectra, was 98.3%; the specificity was 100%. The corresponding diagnostic values with 55 field-collected specimens from 4 species were 94.7% and 97.4%, respectively. A centralized high-quality database (created by expert taxonomists and experienced users of mass spectrometers) that is easily amenable to customer-oriented identification services is a highly desirable resource. As shown in the present work, spectra obtained from different specimens with different instruments can be analysed using a centralized database, which should be available in the near future via an online platform in a cost-efficient manner.

  13. Portable open-path optical remote sensing (ORS) Fourier transform infrared (FTIR) instrumentation miniaturization and software for point and click real-time analysis

    NASA Astrophysics Data System (ADS)

    Zemek, Peter G.; Plowman, Steven V.

    2010-04-01

    Advances in hardware have miniaturized the emissions spectrometer and associated optics, rendering them easily deployed in the field. Such systems are also suitable for vehicle mounting and can provide high-quality data and concentration information in minutes. Advances in software have accompanied this hardware evolution, enabling the development of portable point-and-click OP-FTIR systems that weigh less than 16 lbs. These systems are ideal for first responders, military, law enforcement, forensics, and screening applications using optical remote sensing (ORS) methodologies. With canned methods and interchangeable detectors, the new generation of OP-FTIR technology is coupled to the latest forward reference-type model software to provide point-and-click operation. These software models have been established for some time; however, refined user-friendly models that use active, passive, and solar occultation methodologies now allow the user to quickly field-screen and quantify plumes, fence-lines, and combustion incident scenarios at high temporal resolution. Synthetic background generation is now redundant, as the models use highly accurate instrument line shape (ILS) convolutions and several other parameters, in conjunction with radiative transfer model databases, to fit a single calibration spectrum to collected sample spectra. Data retrievals are performed directly on single-beam spectra using non-linear classical least squares (NLCLS). Typically, the HITRAN line database is used to generate the initial calibration spectrum contained within the software.
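
    In its linear form, the retrieval reduces to a least-squares problem: the measured spectrum is modeled as a weighted sum of reference absorption spectra. A minimal linear-CLS sketch with synthetic placeholder references (the deployed software performs non-linear CLS on single-beam spectra with ILS convolution):

        import numpy as np

        wavenumber = np.linspace(900.0, 1100.0, 2000)

        def gaussian_band(center, width):
            return np.exp(-0.5 * ((wavenumber - center) / width) ** 2)

        # Synthetic reference absorbances for two hypothetical gases.
        refs = np.column_stack([gaussian_band(960.0, 5.0),
                                gaussian_band(1040.0, 8.0)])

        # Simulated measurement: 0.7 x gas A + 0.3 x gas B + noise.
        rng = np.random.default_rng(1)
        measured = refs @ np.array([0.7, 0.3])
        measured += rng.normal(0.0, 0.005, wavenumber.size)

        # Classical least squares: amounts minimizing the residual.
        conc, *_ = np.linalg.lstsq(refs, measured, rcond=None)
        print("retrieved amounts:", conc.round(3))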

  14. Montreal Archive of Sleep Studies: an open-access resource for instrument benchmarking and exploratory research.

    PubMed

    O'Reilly, Christian; Gosselin, Nadia; Carrier, Julie; Nielsen, Tore

    2014-12-01

    Manual processing of sleep recordings is extremely time-consuming. Efforts to automate this process have shown promising results, but automatic systems are generally evaluated on private databases, not allowing accurate cross-validation with other systems. In lacking a common benchmark, the relative performances of different systems are not compared easily and advances are compromised. To address this fundamental methodological impediment to sleep study, we propose an open-access database of polysomnographic biosignals. To build this database, whole-night recordings from 200 participants [97 males (aged 42.9 ± 19.8 years) and 103 females (aged 38.3 ± 18.9 years); age range: 18-76 years] were pooled from eight different research protocols performed in three different hospital-based sleep laboratories. All recordings feature a sampling frequency of 256 Hz and an electroencephalography (EEG) montage of 4-20 channels plus standard electro-oculography (EOG), electromyography (EMG), electrocardiography (ECG) and respiratory signals. Access to the database can be obtained through the Montreal Archive of Sleep Studies (MASS) website (http://www.ceams-carsm.ca/en/MASS), and requires only affiliation with a research institution and prior approval by the applicant's local ethical review board. Providing the research community with access to this free and open sleep database is expected to facilitate the development and cross-validation of sleep analysis automation systems. It is also expected that such a shared resource will be a catalyst for cross-centre collaborations on difficult topics such as improving inter-rater agreement on sleep stage scoring. © 2014 European Sleep Research Society.

  15. Citation searches are more sensitive than keyword searches to identify studies using specific measurement instruments.

    PubMed

    Linder, Suzanne K; Kamath, Geetanjali R; Pratt, Gregory F; Saraykar, Smita S; Volk, Robert J

    2015-04-01

    To compare the effectiveness of two search methods in identifying studies that used the Control Preferences Scale (CPS), a health care decision-making instrument commonly used in clinical settings. We searched the literature using two methods: (1) keyword searching using variations of "Control Preferences Scale" and (2) cited reference searching using two seminal CPS publications. We searched three bibliographic databases [PubMed, Scopus, and Web of Science (WOS)] and one full-text database (Google Scholar). We report precision and sensitivity as measures of effectiveness. Keyword searches in bibliographic databases yielded high average precision (90%) but low average sensitivity (16%). PubMed was the most precise, followed closely by Scopus and WOS. The Google Scholar keyword search had low precision (54%) but provided the highest sensitivity (70%). Cited reference searches in all databases yielded moderate sensitivity (45-54%), but precision ranged from 35% to 75%, with Scopus being the most precise. Cited reference searches were more sensitive than keyword searches, making them a more comprehensive strategy for identifying all studies that use a particular instrument. Keyword searches provide a quick way of finding some, but not all, relevant articles. Goals, time, and resources should dictate which combination of methods and databases is used. Copyright © 2015 Elsevier Inc. All rights reserved.
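
    Concretely, the two effectiveness measures reduce to simple set arithmetic over retrieved versus truly relevant articles. A toy Python example with hypothetical article IDs:

    ```python
    # Hypothetical sets of article IDs for one search method.
    retrieved = {"a1", "a2", "a3", "a4"}
    relevant = {"a2", "a3", "a5", "a6", "a7"}

    hits = retrieved & relevant
    precision = len(hits) / len(retrieved)    # fraction of retrieved that are relevant
    sensitivity = len(hits) / len(relevant)   # fraction of relevant that are retrieved
    print(f"precision={precision:.0%}, sensitivity={sensitivity:.0%}")
    ```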

  16. ClassLess: A Comprehensive Database of Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Hillenbrand, Lynne A.; Baliber, Nairn

    2015-08-01

    We have designed and constructed a database intended to house catalog and literature-published measurements of Young Stellar Objects (YSOs) within ~1 kpc of the Sun. ClassLess, so called because it includes YSOs in all stages of evolution, is a relational database in which user interaction is conducted via HTML web browsers, queries are performed in scientific language, and all data are linked to the sources of publication. Each star is associated with a cluster (or clusters), and both spatially resolved and unresolved measurements are stored, allowing proper use of data from multiple star systems. With this fully searchable tool, myriad ground- and space-based instruments and surveys across wavelength regimes can be exploited. In addition to primary measurements, the database self-consistently calculates and serves higher-level data products such as extinction, luminosity, and mass. As a result, searches for young stars with specific physical characteristics can be completed with just a few mouse clicks. We are now in the database population phase, and are eager to engage with interested experts worldwide on local galactic star formation and young stellar populations.

  17. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    NASA Astrophysics Data System (ADS)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study, the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server, which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g., location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database, where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
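
    The logical associations described (location, date, instrument, processing level) can be pictured with a small relational layout. The tables and columns below are hypothetical stand-ins for the authors' actual geodatabase schema, sketched with SQLite so the example is self-contained:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE instrument (id INTEGER PRIMARY KEY, name TEXT, type TEXT);
    CREATE TABLE flight    (id INTEGER PRIMARY KEY, date TEXT, area TEXT,
                            instrument_id INTEGER REFERENCES instrument(id));
    CREATE TABLE product   (id INTEGER PRIMARY KEY, level TEXT, path TEXT,
                            flight_id INTEGER REFERENCES flight(id));
    """)
    con.execute("INSERT INTO instrument VALUES (1, 'hyperspectral imager', 'airborne')")
    con.execute("INSERT INTO flight VALUES (1, '2016-06-21', 'pipeline corridor', 1)")
    con.execute("INSERT INTO product VALUES (1, 'geocorrected reflectance', '/data/f1_l2.tif', 1)")

    # Retrieve every product with its acquisition date and instrument.
    for row in con.execute("""SELECT f.date, i.name, p.level FROM product p
                              JOIN flight f ON p.flight_id = f.id
                              JOIN instrument i ON f.instrument_id = i.id"""):
        print(row)
    ```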

  18. The Northern California Earthquake Management System: A Unified System From Realtime Monitoring to Data Distribution

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.

    2006-12-01

    The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing from triggering and locating through magnitude and moment tensor calculation and Shakemap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves realtime processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.

  19. A BRDF-BPDF database for the analysis of Earth target reflectances

    NASA Astrophysics Data System (ADS)

    Breon, Francois-Marie; Maignan, Fabienne

    2017-01-01

    Land surface reflectance is not isotropic: it varies with the observation geometry, which is defined by the sun and view zenith angles and the relative azimuth. In addition, the reflectance is linearly polarized. The reflectance anisotropy is quantified by the bidirectional reflectance distribution function (BRDF), while its polarization properties are defined by the bidirectional polarization distribution function (BPDF). The POLDER radiometer that flew onboard the PARASOL microsatellite remains the only space instrument that measured numerous samples of the BRDF and BPDF of Earth targets. Here, we describe a database of representative BRDFs and BPDFs derived from the POLDER measurements. From the huge number of data acquired by the spaceborne instrument over a period of 7 years, we selected a set of targets with high-quality observations. The selection aimed for a large number of observations, free of significant cloud or aerosol contamination, acquired in diverse observation geometries with a focus on the backscatter direction that shows the specific hot spot signature. The targets are sorted according to the 16-class International Geosphere-Biosphere Programme (IGBP) land cover classification system, and the target selection aims at spatial representativeness within each class. The database thus provides a set of high-quality BRDF and BPDF samples that can be used to assess the typical variability of natural surface reflectances or to evaluate models. It is available freely from the PANGAEA website (doi:10.1594/PANGAEA.864090). In addition to the database, we provide a visualization and analysis tool based on the Interactive Data Language (IDL), which allows interactive analysis of the measurements and comparison against various BRDF and BPDF analytical models. The present paper describes the input data, the selection principles, the database format, and the analysis tool.

  1. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minute analysis without any prior sample pre-concentration or pre-treatment steps. With the actual needs of water-sector managers in mind, related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.

  2. The database on transgenic luminescent microorganisms as an instrument of studying a microbial component of closed ecosystems

    NASA Astrophysics Data System (ADS)

    Boyandin, A. N.; Lankin, Y. P.; Kargatova, T. V.; Popova, L. Y.; Pechurkin, N. S.

    Luminescent transgenic microorganisms are widely used to study the functioning of microbial communities, including closed ones. Bioluminescence is highly sensitive to the effects of different environmental factors. Integration of lux-genes into different metabolic pathways allows many aspects of microbial life to be studied, permitting measurements to be carried out in situ. Much information exists on the applications of bioluminescent bacteria in different studies, but using these data effectively requires summarizing and accumulating them in a common source. Therefore, an information system on the characteristics of transgenic microorganisms with cloned lux-genes was created, and the database and related client software were developed. The database structure includes information on the common characteristics of cloned lux-genes, their sources and properties, on the regulation of gene expression in bacterial cells, and on the dependence of bioluminescence on biotic, abiotic and anthropogenic environmental factors. The database can also store descriptions of changes in bacterial populations in response to environmental changes. It additionally stores bibliographic information as well as links to the websites of world collections of microorganisms. Internet publishing software has been developed to provide open access to the database through the Internet.

  3. High-Performance Secure Database Access Technologies for HEP Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research, where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities, a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are ongoing in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc., for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure authorization is pushed into the database engine will eliminate inefficient data transfer bottlenecks. Furthermore, traditionally separated database and security layers provide an extra vulnerability, leaving a weak clear-text password authorization as the only protection on the database core systems. Due to the legacy limitations of the systems' security models, the allowed passwords often cannot even comply with the DOE password guideline requirements. We see an opportunity for the tight integration of the secure authorization layer with the database server engine, resulting in both improved performance and improved security. Phase I has focused on the development of a proof-of-concept prototype using Argonne National Laboratory's (ANL) Argonne Tandem-Linac Accelerator System (ATLAS) project as a test scenario. By developing a grid-security enabled version of the ATLAS project's current relational database solution, MySQL, PIOCON Technologies aims to offer a more efficient solution to secure database access.

  4. Aviation Safety Issues Database

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both nationally and internationally. Several aviation safety stakeholders such as the Commercial Aviation Safety Team (CAST) have already used the database. This broader interest was the impetus for making the database publicly accessible and writing this report.

  5. Systematic review of reusable versus disposable laparoscopic instruments: costs and safety.

    PubMed

    Siu, Joey; Hill, Andrew G; MacCormick, Andrew D

    2017-01-01

    The quality of instruments and surgical expertise in minimally invasive surgery has developed markedly in the last two decades. Attention is now being turned to ways to allow surgeons to adopt more cost-effective and environmentally friendly approaches. This review explores current evidence on the cost and environmental impact of reusable versus single-use instruments. In addition, we aim to compare their quality, functionality and associated clinical outcomes. The Medline and EMBASE databases were searched for relevant literature from January 2000 to May 2015. Subject headings were Equipment Reuse/, Disposable Equipment/, Cholecystectomy/, Laparoscopic/, Laparoscopy/, Surgical Instruments/, Medical Waste Disposal/, Waste Management/, Medical Waste/, Environmental Sustainability/ and Sterilization/. There are few objective comparative analyses of single-use versus reusable instruments. Current evidence suggests that limiting use of disposable instruments to cases of necessity may hold both economic and environmental advantages. Theoretical advantages of single-use instruments in quality, safety, sterility, ease of use and, importantly, patient outcomes have rarely been examined. Cost-saving methods, environmentally friendly methods, global operative costs, hidden costs, sterilization methods and quality assurance systems vary greatly between studies, making it difficult to gain an overview of the comparison between single-use and reusable instruments. Further examination of cost comparisons between disposable and reusable instruments is necessary, while externalized environmental costs, instrument function and safety are also important to consider in future studies. © 2016 Royal Australasian College of Surgeons.

  6. Measuring Patients' Experience of Rehabilitation Services Across the Care Continuum. Part I: A Systematic Review of the Literature.

    PubMed

    McMurray, Josephine; McNeil, Heather; Lafortune, Claire; Black, Samantha; Prorok, Jeanette; Stolee, Paul

    2016-01-01

    To identify empirically tested survey instruments designed to measure patient experience across a rehabilitative care system. A comprehensive search was conducted of the MEDLINE (PubMed), CINAHL (EBSCO), and PsycINFO (APA PsycNET) databases from 2004 to 2014. Further searches were conducted in relevant journals and the reference lists of the final accepted articles. Of 2472 articles identified, 33 were selected for inclusion and analysis. Articles were excluded if they were unrelated to rehabilitative care, were anecdotal or descriptive reports, or had a veterinary, mental health, palliative care, dental, or pediatric focus. Four reviewers performed the screening process. Interrater reliability was confirmed through 2 rounds of title review (30 articles each) and 1 round of abstract review (10 articles), with an average κ score of 0.69. Data were extracted related to the instrument, study setting, and patient characteristics, including treated disease, type of rehabilitation (eg, occupational or physical therapy), methodology, sample size, and level of evidence. There were 25 discrete measurement instruments identified in the 33 articles evaluated. Seven of the instruments originated outside of the rehabilitative care sector, and only 1 measured service experience across the care continuum. As providers move to integrate rehabilitative care across the continuum from hospital to home, patients experience a system of care. Research is required to develop psychometrically tested instruments that measure patients' experience across a rehabilitative system. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  7. A System for Web-based Access to the HSOS Database

    NASA Astrophysics Data System (ADS)

    Lin, G.

    Huairou Solar Observing Station's (HSOS) magnetogram and dopplergram instruments are world-class, and access to their data has been opened to the world. Web-based access to the data will provide a powerful, convenient tool for data searching and solar physics research, so it is necessary that our data be provided to users via the Web. In this presentation, the author describes the general design and programming construction of the system, which will be built with PHP and MySQL. The author also introduces basic features of PHP and MySQL.

  8. Oceans 2.0: Interactive tools for the Visualization of Multi-dimensional Ocean Sensor Data

    NASA Astrophysics Data System (ADS)

    Biffard, B.; Valenzuela, M.; Conley, P.; MacArthur, M.; Tredger, S.; Guillemot, E.; Pirenne, B.

    2016-12-01

    Ocean Networks Canada (ONC) operates ocean observatories on all three of Canada's coasts. The instruments produce 280 gigabytes of data per day, with 1/2 petabyte archived so far. In 2015, 13 terabytes were downloaded by over 500 users from across the world. ONC's data management system is referred to as "Oceans 2.0" owing to its interactive, participative features. A key element of Oceans 2.0 is real time data acquisition and processing: custom device drivers implement the input-output protocol of each instrument. Automatic parsing and calibration takes place on the fly, followed by event detection and quality control. All raw data are stored in a file archive, while the processed data are copied to fast databases. Interactive access to processed data is provided through data download and visualization/quick-look features that are adapted to diverse data types (scalar, acoustic, video, multi-dimensional, etc.). Data may be post-processed or re-processed to add features or analysis, correct errors, update calibrations, etc. A robust storage structure has been developed consisting of an extensive file system and a no-SQL database (Cassandra). Cassandra is a node-based open source distributed database management system. It is scalable and offers improved performance for big data. A key feature is data summarization. The system has also been integrated with web services and an ERDDAP OPeNDAP server, capable of serving scalar and multidimensional data from Cassandra for fixed or mobile devices. A complex data viewer has been developed making use of the big data capability to interactively display live or historic echo sounder and acoustic Doppler current profiler data, where users can scroll, apply processing filters and zoom through gigabytes of data with simple interactions. This new technology brings scientists one step closer to a comprehensive, web-based data analysis environment in which visual assessment, filtering, event detection and annotation can be integrated.
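
    As a flavor of the back end, reading a scalar time series from a Cassandra store with the Python cassandra-driver might look like the sketch below; the contact point, keyspace, table, and column names are invented for illustration and are not Oceans 2.0's actual schema:

    ```python
    from datetime import datetime
    from cassandra.cluster import Cluster

    cluster = Cluster(["cassandra-host"])       # hypothetical contact point
    session = cluster.connect("ocean_data")     # hypothetical keyspace
    rows = session.execute(
        "SELECT sample_time, value FROM scalar_samples "
        "WHERE device_id = %s AND sample_time >= %s AND sample_time < %s",
        (1234, datetime(2016, 1, 1), datetime(2016, 1, 2)),
    )
    for row in rows:
        print(row.sample_time, row.value)
    cluster.shutdown()
    ```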

  9. A publication database for optical long baseline interferometry

    NASA Astrophysics Data System (ADS)

    Malbet, Fabien; Mella, Guillaume; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain

    2010-07-01

    Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site using MySQL and a collection of XML or PHP scripts in order to store and classify these publications. Each entry is defined by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted in categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed, and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.

  10. Development of an autofluorescence spectral database for the identification and classification of microbial extremophiles

    NASA Astrophysics Data System (ADS)

    Davis, Justin; Howard, Hillari; Hoover, Richard B.; Sabanayagam, Chandran R.

    2010-09-01

    Extremophiles are microorganisms that have adapted to severe conditions that were once considered devoid of life. The extreme settings in which these organisms flourish on Earth resemble many extraterrestrial environments. Identification and classification of extremophiles in situ (without the requirement for excessive handling and processing) can provide a basis for designing remotely operated instruments for extraterrestrial life exploration. An important consideration when designing such experiments is to prevent contamination of the environments. We are developing a reference spectral database of autofluorescence from microbial extremophiles using long-UV excitation (408 nm). Aromatic compounds are essential components of living systems, and biological molecules such as aromatic amino acids, nucleotides, porphyrins and vitamins can also exhibit fluorescence under long-UV excitation conditions. Autofluorescence spectra were obtained from a light microscope that additionally allowed observations of microbial geometry and motility. It was observed that all extremophiles studied displayed an autofluorescence peak at around 470 nm, followed by a long decay that was species specific. The autofluorescence database can potentially be used as a reference to identify and classify past or present microbial life in our solar system.

  11. Development of an Autofluorescence Spectral Database for the Identification and Classification of Microbial Extremophiles

    NASA Technical Reports Server (NTRS)

    Sabanayagam, Chandran; Howard, Hillari; Hoover, Richard B.

    2010-01-01

    Extremophiles are microorganisms that have adapted to severe conditions that were once considered devoid of life. The extreme settings in which these organisms flourish on earth resemble many extraterrestrial environments. Identification and classification of extremophiles in situ (without the requirement for excessive handling and processing) can provide a basis for designing remotely operated instruments for extraterrestrial life exploration. An important consideration when designing such experiments is to prevent contamination of the environments. We are developing a reference spectral database of autofluorescence from microbial extremophiles using long-UV excitation (405 nm). Aromatic compounds are essential components of living systems, and biological molecules such as aromatic amino acids, nucleotides, porphyrins and vitamins can also exhibit fluorescence under long-UV excitation conditions. Autofluorescence spectra were obtained from a confocal microscope that additionally allowed observations of microbial geometry and motility. It was observed that all extremophiles studied displayed an autofluorescence peak at around 470 nm, followed by a long decay that was species specific. The autofluorescence database can potentially be used as a reference to identify and classify past or present microbial life in our solar system.
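
    Because the species-specific information lies in the decay beyond the common ~470 nm peak, one plausible classification feature is a fitted decay constant. A sketch with synthetic data (the single-exponential form is an assumption, not the authors' stated model):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    wl = np.linspace(470.0, 650.0, 200)          # wavelength (nm)
    spectrum = np.exp(-(wl - 470.0) / 60.0) + rng.normal(0.0, 0.01, wl.size)

    def decay(w, amp, scale):
        return amp * np.exp(-(w - 470.0) / scale)

    (amp, scale), _ = curve_fit(decay, wl, spectrum, p0=[1.0, 50.0])
    print(f"decay constant ~ {scale:.1f} nm")    # candidate species signature
    ```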

  12. [Basic research on sharing Fourier transform near-infrared spectral information resources].

    PubMed

    Zhang, Lu-Da; Li, Jun-Hui; Zhao, Long-Lian; Zhao, Li-Li; Qin, Fang-Li; Yan, Yan-Lu

    2004-08-01

    This paper explores a method to share the information resources in a database of Fourier transform near-infrared (FTNIR) spectra of agricultural products and to make full use of the spectral information. Mapping spectral information from one instrument to another is studied in order to express the spectral information accurately across instruments. The mapped spectral information is then used to establish a mathematical model for quantitative analysis without including standard samples. For the protein content of twenty-two wheat samples, the correlation coefficient r between the model estimates and the Kjeldahl values is 0.941 with a relative error of 3.28%, while for the other model, established using standard samples, r is 0.963 with a relative error of 2.4%. It is shown that spectral information can be shared by using the mapped spectral information. It can thus be concluded that the spectral information in one FTNIR spectral database can be transformed into another instrument's mapped spectral information, which makes full use of the information resources in the FTNIR spectral database and realizes resource sharing between different instruments.
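
    The mapping idea resembles direct-standardization-style calibration transfer: learn a linear map between paired spectra from the two instruments, then transform new spectra before applying the target instrument's model. The paper's exact mapping algorithm is not detailed in this abstract, so the following is only an illustrative sketch with synthetic data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_channels = 200, 50
    spectra_a = rng.normal(size=(n_samples, n_channels))   # instrument A spectra
    true_map = np.eye(n_channels) + 0.01 * rng.normal(size=(n_channels, n_channels))
    spectra_b = spectra_a @ true_map                       # paired instrument B spectra

    # Least-squares estimate of a transfer matrix F with spectra_a @ F ~ spectra_b.
    F, *_ = np.linalg.lstsq(spectra_a, spectra_b, rcond=None)

    new_a = rng.normal(size=(1, n_channels))
    mapped = new_a @ F        # mapped spectrum, usable with instrument B's model
    print(np.abs(mapped - new_a @ true_map).max())         # ~0 for this noise-free toy
    ```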

  13. A case study in adaptable and reusable infrastructure at the Keck Observatory Archive: VO interfaces, moving targets, and more

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce; Cohen, Richard W.; Colson, Andrew; Gelino, Christopher R.; Good, John C.; Kong, Mihseh; Laity, Anastasia C.; Mader, Jeffrey A.; Swain, Melanie A.; Tran, Hien D.; Wang, Shin-Ywan

    2016-08-01

    The Keck Observatory Archive (KOA) (https://koa.ipac.caltech.edu) curates all observations acquired at the W. M. Keck Observatory (WMKO) since it began operations in 1994, including data from eight active instruments and two decommissioned instruments. The archive is a collaboration between WMKO and the NASA Exoplanet Science Institute (NExScI). Since its inception in 2004, the science information system used at KOA has adopted an architectural approach that emphasizes software re-use and adaptability. This paper describes how KOA is currently leveraging and extending open source software components to develop new services and to support delivery of a complete set of instrument metadata, which will enable more sophisticated and extensive queries than currently possible. In August 2015, KOA deployed a program interface to discover public data from all instruments equipped with an imaging mode. The interface complies with version 2 of the Simple Imaging Access Protocol (SIAP), under development by the International Virtual Observatory Alliance (IVOA), which defines a standard mechanism for discovering images through spatial queries. The heart of the KOA service is an R-tree-based database-indexing mechanism prototyped by the Virtual Astronomical Observatory (VAO) and further developed by the Montage Image Mosaic project, designed to provide fast access to large imaging data sets as a first step in creating wide-area image mosaics (such as mosaics of subsets of the 4.7 million images of the SDSS DR9 release). The KOA service uses the results of the spatial R-tree search to create an SQLite database for further relational filtering. The service uses a JSON configuration file to describe the association between instrument parameters and the service query parameters, and to make it applicable beyond the Keck instruments. The images generated at the Keck telescope usually do not encode the image footprints as WCS fields in the FITS file headers. Because SIAP searches are spatial, much of the effort in developing the program interface involved processing the instrument and telescope parameters to understand how accurately we can derive the WCS information for each instrument. This knowledge is now being fed back into the KOA databases as part of a program to include complete metadata information for all imaging observations. The R-tree program was itself extended to support temporal (in addition to spatial) indexing, in response to requests from the planetary science community for a search engine to discover observations of Solar System objects. With this 3D-indexing scheme, the service performs very fast temporal and spatial matches between the target ephemerides, obtained from the JPL SPICE service, and the archived image footprints. Our experiments indicate these matches can be more than 100 times faster than when separating temporal and spatial searches. Images of the tracks of the moving targets, overlaid with the image footprints, are computed with a new command-line visualization tool, mViewer, released with the Montage distribution. The service is currently in testing and will be released in late summer 2016.
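
    An illustrative analogue of this spatial-plus-temporal matching, using the Python rtree package (the KOA service uses its own R-tree engine, so this only mimics the idea): image footprints are indexed as 3D boxes in (RA, Dec, time) and intersected with padded ephemeris points.

    ```python
    from rtree import index

    p = index.Property()
    p.dimension = 3
    idx = index.Index(properties=p, interleaved=True)

    # Insert an image footprint: (ra_min, dec_min, t_min, ra_max, dec_max, t_max).
    idx.insert(42, (150.10, 2.10, 55000.00, 150.30, 2.30, 55000.01))

    # Match a moving-target ephemeris point, padded to a small search box.
    ra, dec, t, eps = 150.20, 2.20, 55000.005, 0.01
    hits = list(idx.intersection((ra - eps, dec - eps, t - eps,
                                  ra + eps, dec + eps, t + eps)))
    print(hits)    # -> [42]
    ```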

  14. The IAGOS information system

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie

    2015-04-01

    IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data since January 2015; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats, graphical tools (maps, scatter plots, etc.), standardized metadata (ISO 19115) and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will allow model outputs to be combined with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented through numerous web services, will improve the functionalities of the web interfaces of each data centre.

  15. Methane and water spectroscopic database for TROPOMI/Sentinel-5 Precursor in the 2.3 μm region

    NASA Astrophysics Data System (ADS)

    Birk, Manfred; Wagner, Georg; Loos, Joep; Wilzewski, Jonas; Mondelain, Didier; Campargue, Alain; Hase, Frank; Orphal, Johannes; Perrin, Agnes; Tran, Ha; Daumont, Ludovic; Rotger-Languereau, Maud; Bigazzi, Alberto; Zehner, Claus

    2017-04-01

    The ESA project "SEOM-Improved Atmospheric Spectroscopy Databases (IAS)" will improve the spectroscopic database for retrieval of the data products CO, CH4, O3 and SO2 column amounts measured by the TROPOMI instrument (TROPOspheric Monitoring Instrument) aboard the Sentinel-5 Precursor. The project was launched in February 2014 with a duration of 3 years, recently extended to 4 years. The spectroscopy of CO, CH4 and H2O in the 2.3 μm region is covered first, while UV measurements of SO2 and UV/FIR/IR measurements of ozone will be carried out later. Measurements were mainly taken with a high-resolution Fourier transform spectrometer combined with a coolable multi-reflection cell; cavity ring-down measurements served for validation. The analysis has been completed. A clear improvement can be seen when using the new data for CH4, H2O and CO retrieval from ground-based high-resolution solar occultation measurements obtained with instrumentation in the TCCON and NDACC networks.

  16. Physical Activity Measurement Instruments for Children with Cerebral Palsy: A Systematic Review

    ERIC Educational Resources Information Center

    Capio, Catherine M.; Sit, Cindy H. P.; Abernethy, Bruce; Rotor, Esmerita R.

    2010-01-01

    Aim: This paper is a systematic review of physical activity measurement instruments for field-based studies involving children with cerebral palsy (CP). Method: Database searches using PubMed Central, MEDLINE, CINAHL Plus, PsycINFO, EMBASE, Cochrane Library, and PEDro located 12 research papers, identifying seven instruments that met the inclusion…

  17. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, the data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross-validation), the estimating equations themselves and a demonstration of the NICM tool suite.
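
    Of the data-mining methods listed, bootstrap cross-validation is the easiest to sketch: fit the cost-estimating relationship on a bootstrap resample, then score it on the out-of-bag instruments. The features and costs below are synthetic stand-ins, not NICM data:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(140, 3))     # e.g. mass, power, design life (synthetic)
    y = X @ np.array([2.0, 1.0, 0.5]) + rng.normal(0.0, 0.1, 140)

    scores = []
    for _ in range(200):
        boot = rng.integers(0, len(X), len(X))        # bootstrap resample
        oob = np.setdiff1d(np.arange(len(X)), boot)   # out-of-bag instruments
        model = LinearRegression().fit(X[boot], y[boot])
        scores.append(model.score(X[oob], y[oob]))
    print("mean out-of-bag R^2:", np.mean(scores))
    ```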

  18. Lessons Learned From Developing Three Generations of Remote Sensing Science Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt; Fleig, Albert J.

    2005-01-01

    The Biospheric Information Systems Branch at NASA's Goddard Space Flight Center has developed three generations of Science Investigator-led Processing Systems for use with various remote sensing instruments. The first system is used for data from the MODIS instruments flown on NASA's Earth Observing System (EOS) Terra and Aqua spacecraft, launched in 1999 and 2002 respectively. The second generation is for the Ozone Monitoring Instrument flying on the EOS Aura spacecraft, launched in 2004. We are now developing a third generation of the system for evaluation science data processing for the Ozone Mapping and Profiler Suite (OMPS) to be flown by the NPOESS Preparatory Project (NPP) in 2006. The initial system was based on large-scale proprietary hardware, operating and database systems. The current OMI system and the OMPS system being developed are based on commodity hardware, the Linux operating system, and PostgreSQL, an open source RDBMS. The new system distributes its data archive across multiple server hosts and processes jobs on multiple processor boxes. We have created several instances of this system, including one for operational processing, one for testing and reprocessing, and one for applications development and scientific analysis. Prior to receiving the first data from OMI, we applied the system to reprocessing information from the Solar Backscatter Ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) instruments flown from 1978 until now. The system was able to process 25 years (108,000 orbits) of data and produce 800,000 files (400 GiB) of level 2 and level 3 products in less than a week. We will describe the lessons we have learned and the tradeoffs between system design, hardware, operating systems, operational staffing, user support and operational procedures. During each generational phase, the system has become more generic and reusable. While the system is not currently shrink-wrapped, we believe it is to the point where it could be readily adopted, with substantial cost savings, for other similar tasks.

  19. A Review of Calibration Transfer Practices and Instrument Differences in Spectroscopy.

    PubMed

    Workman, Jerome J

    2018-03-01

    Calibration transfer for use with spectroscopic instruments, particularly for near-infrared, infrared, and Raman analysis, has been the subject of multiple articles, research papers, book chapters, and technical reviews. A myriad of approaches has been published and claims made for resolving the problems associated with transferring calibrations; however, the capability of attaining identical results over time from two or more instruments using an identical calibration still eludes technologists. Calibration transfer, in a precise definition, refers to a series of analytical approaches or chemometric techniques used to apply a single spectral database, and the calibration model developed using that database, to two or more instruments, with statistically retained accuracy and precision. Ideally, one would develop a single calibration for any particular application, move it indiscriminately across instruments, and achieve identical analysis or prediction results. There are many technical aspects involved in such precision calibration transfer, related to the measuring instrument's reproducibility and repeatability, the reference chemical values used for the calibration, the multivariate mathematics used for calibration, and sample presentation repeatability and reproducibility. Ideally, a multivariate model developed on a single instrument would provide a statistically identical analysis when used on other instruments following transfer. This paper reviews common calibration transfer techniques, mostly related to instrument differences, and the mathematics of the uncertainty between instruments when making spectroscopic measurements of identical samples. It does not specifically address calibration maintenance or reference laboratory differences.

  20. Monitoring of services with non-relational databases and map-reduce framework

    NASA Astrophysics Data System (ADS)

    Babik, M.; Souto, F.

    2012-12-01

    Service Availability Monitoring (SAM) is a well-established monitoring framework that performs regular measurements of the core site services and reports the corresponding availability and reliability of the Worldwide LHC Computing Grid (WLCG) infrastructure. One of the existing extensions of SAM is Site Wide Area Testing (SWAT), which gathers monitoring information from the worker nodes via instrumented jobs. This generates quite a lot of monitoring data to process, as there are several data points for every job and several million jobs are executed every day. The recent uptake of non-relational databases opens a new paradigm in the large-scale storage and distributed processing of systems with heavy read-write workloads. For SAM this brings new possibilities to improve its model, from performing aggregation of measurements to storing raw data and subsequent re-processing. Both SAM and SWAT are currently tuned to run at top performance, reaching some of the limits in storage and processing power of their existing Oracle relational database. We investigated the usability and performance of non-relational storage together with its distributed data processing capabilities. For this, several popular systems have been compared. In this contribution we describe our investigation of the existing non-relational databases suited for monitoring systems covering Cassandra, HBase and MongoDB. Further, we present our experiences in data modeling and prototyping map-reduce algorithms focusing on the extension of the already existing availability and reliability computations. Finally, possible future directions in this area are discussed, analyzing the current deficiencies of the existing Grid monitoring systems and proposing solutions to leverage the benefits of the non-relational databases to get more scalable and flexible frameworks.
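
    As a flavor of the approach, an availability aggregation over raw test results can be pushed into the database itself. A sketch using a MongoDB aggregation pipeline via pymongo, with invented collection and field names rather than SAM's actual schema:

    ```python
    from pymongo import MongoClient

    coll = MongoClient("mongodb://localhost:27017")["sam"]["test_results"]
    pipeline = [
        {"$match": {"service": "CE", "timestamp": {"$gte": "2012-06-01"}}},
        {"$group": {
            "_id": "$site",
            "total": {"$sum": 1},
            "ok": {"$sum": {"$cond": [{"$eq": ["$status", "OK"]}, 1, 0]}},
        }},
        {"$project": {"availability": {"$divide": ["$ok", "$total"]}}},
    ]
    for doc in coll.aggregate(pipeline):
        print(doc["_id"], doc["availability"])
    ```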

  1. Columnar water vapor retrievals from multifilter rotating shadowband radiometer data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Mikhail; Schmid, Beat; Turner, David D.

    2009-01-26

    The Multi-Filter Rotating Shadowband Radiometer (MFRSR) measures direct and diffuse irradiances in the visible and near-IR spectral range. In addition to characteristics of atmospheric aerosols, MFRSR data also allow retrieval of precipitable water vapor (PWV) column amounts, which are determined from the direct normal irradiances in the 940 nm spectral channel. The HITRAN 2004 spectral database was used in our retrievals to model the water vapor absorption. We present a detailed error analysis describing the influence of uncertainties in instrument calibration and spectral response, as well as those in available spectral databases, on the retrieval results. The results of our PWV retrievals from the Southern Great Plains (SGP) site operated by the DOE Atmospheric Radiation Measurement (ARM) Program were compared with correlative standard measurements by Microwave Radiometers (MWRs) and a Global Positioning System (GPS) water vapor sensor, as well as with retrievals from other solar radiometers (AERONET's CIMEL, AATS-6). Some of these data are routinely available at the SGP's Central Facility; however, we also used measurements from a wider array of instrumentation deployed at this site during the Water Vapor Intensive Observation Period (WVIOP2000) in September-October 2000. The WVIOP data show better agreement between different solar radiometers or between different microwave radiometers (both groups showing relative biases within 4%) than between these two groups of instruments, with MWR values being consistently higher (up to 14%) than those from solar instruments. We also demonstrate the feasibility of using MFRSR network data for creation of 2D datasets comparable with the MODIS satellite water vapor product.
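
    A 940 nm retrieval of this kind is commonly cast as inverting a band-model transmittance T_w = exp(-a(mu)^b) for the column amount u, after removing non-water attenuation from the direct-beam signal. A sketch with placeholder coefficients (in practice a and b derive from a spectral database such as HITRAN and the channel's spectral response, and V0 from Langley calibration):

    ```python
    import numpy as np
    from scipy.optimize import brentq

    a, b = 0.60, 0.57          # placeholder band-model coefficients
    m = 1.5                    # air mass
    V0, V = 1.00, 0.62         # top-of-atmosphere calibration and measured signal
    tau_non_water = 0.08       # placeholder Rayleigh + aerosol optical depth

    T_water = (V / V0) * np.exp(tau_non_water * m)
    u = brentq(lambda uu: np.exp(-a * (m * uu) ** b) - T_water, 1e-6, 20.0)
    print(f"PWV ~ {u:.2f} cm")
    ```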

  2. Database of episode-integrated solar energetic proton fluences

    NASA Astrophysics Data System (ADS)

    Robinson, Zachary D.; Adams, James H.; Xapsos, Michael A.; Stauffer, Craig A.

    2018-04-01

    A new database of proton episode-integrated fluences is described. This database contains data from two different instruments on multiple satellites. The data are from instruments on the Interplanetary Monitoring Platform-8 (IMP8) and the Geostationary Operational Environmental Satellites (GOES) series. A method to normalize one dataset to the other is presented, creating a seamless database spanning 1973 to 2016. A discussion of some of the characteristics that episodes exhibit is presented, including episode duration and number of peaks. As an example of what can be understood about episodes, the July 4, 2012 episode is examined in detail. The coronal mass ejections and solar flares that caused many of the fluctuations of the proton flux seen at Earth are associated with peaks in the proton flux during this episode. The reasoning for each choice is laid out to provide a reference for how CME and solar flare associations are made.

  3. A simulated observation database to assess the impact of the IASI-NG hyperspectral infrared sounder

    NASA Astrophysics Data System (ADS)

    Andrey-Andrés, Javier; Fourrié, Nadia; Guidard, Vincent; Armante, Raymond; Brunel, Pascal; Crevoisier, Cyril; Tournier, Bernard

    2018-02-01

    The highly accurate measurements of the hyperspectral Infrared Atmospheric Sounding Interferometer (IASI) are used in numerical weather prediction (NWP), atmospheric chemistry and climate monitoring. As the second generation of the European Polar System (EPS-SG) is being developed, a new generation of IASI instruments has been designed to fly on board the MetOp-SG constellation: IASI New Generation (IASI-NG). In order to prepare for the arrival of this new instrument, and to evaluate its impact on NWP and atmospheric chemistry applications, a set of IASI and IASI-NG simulated data was built and made available to the public to set a common framework for future impact studies. This paper describes the information available in this database and the procedure followed to run the IASI and IASI-NG simulations. These simulated data were evaluated by comparing IASI-NG to IASI observations; the results are also presented here. Additionally, preliminary impact studies of the benefit of IASI-NG compared to IASI for the retrieval of temperature and humidity in an NWP framework are also shown. With a channel dataset located at the same wavenumbers for both instruments, we show an improvement in temperature retrievals throughout the atmosphere with IASI-NG, with a maximum in the troposphere, and a smaller benefit for tropospheric humidity.

  4. Digital database of channel cross-section surveys, Mount St. Helens, Washington

    USGS Publications Warehouse

    Mosbrucker, Adam R.; Spicer, Kurt R.; Major, Jon J.; Saunders, Dennis R.; Christianson, Tami S.; Kingsbury, Cole G.

    2015-08-06

    Stream-channel cross-section survey data are a fundamental component to studies of fluvial geomorphology. Such data provide important parameters required by many open-channel flow models, sediment-transport equations, sediment-budget computations, and flood-hazard assessments. At Mount St. Helens, Washington, the long-term response of channels to the May 18, 1980, eruption, which dramatically altered the hydrogeomorphic regime of several drainages, is documented by an exceptional time series of repeat stream-channel cross-section surveys. More than 300 cross sections, most established shortly following the eruption, represent more than 100 kilometers of surveyed topography. Although selected cross sections have been published previously in print form, we present a comprehensive digital database that includes geospatial and tabular data. Furthermore, survey data are referenced to a common geographic projection and to common datums. Database design, maintenance, and data dissemination are accomplished through a geographic information system (GIS) platform, which integrates survey data acquired with theodolite, total station, and global navigation satellite system (GNSS) instrumentation. Users can interactively perform advanced queries and geospatial time-series analysis. An accuracy assessment provides users the ability to quantify uncertainty within these data. At the time of publication, this project is ongoing. Regular database updates are expected; users are advised to confirm they are using the latest version.

  5. National Databases with Information on College Students with Disabilities. NCCSD Research Brief. Volume 1, Issue 1

    ERIC Educational Resources Information Center

    Avellone, Lauren; Scott, Sally

    2017-01-01

    The purpose of this research brief was to identify and provide an overview of national databases containing information about college students with disabilities. Eleven instruments from federal and university-based sources were described. Databases reflect a variety of survey methods, respondents, definitions of disability, and research questions.…

  6. [A Terahertz Spectral Database Based on Browser/Server Technique].

    PubMed

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has received more and more attention. Owing to its unique advantages, terahertz technology shows broad promise for fast, non-destructive detection, as well as for many other fields. Terahertz technology combined with other complementary methods can be used to cope with many difficult practical problems that could not be solved before. One of the critical points for further development of practical terahertz detection methods is a good and reliable terahertz spectral database. We recently developed a BS (browser/server)-based terahertz spectral database. We designed the main structure and main functions to fulfill practical requirements. The terahertz spectral database now includes more than 240 items, and the spectral information was collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) collection from published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters: some optical parameters, including the absorption coefficient and refractive index, can be calculated from the input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an online searching function for registered users; a registered user can compare an input THz spectrum with the spectra in the database and, according to the obtained correlation coefficient, perform the searching task quickly and conveniently. The terahertz spectral database can be accessed at http://www.teralibrary.com. The database is based on spectral information so far and will be improved in the future. We hope this terahertz spectral database can provide users with powerful, convenient, and highly efficient functions, and can promote broader applications of terahertz technology.
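
    The optical-parameter calculation typically follows the standard thick-sample THz-TDS approximation: the refractive index comes from the sample-reference phase difference and the absorption coefficient from the amplitude ratio with a Fresnel-loss correction. A sketch with placeholder inputs (the database's exact formulas are not given in the abstract):

    ```python
    import numpy as np

    c = 2.998e8                              # speed of light (m/s)
    d = 1.0e-3                               # sample thickness (m)
    freq = np.linspace(0.2e12, 2.0e12, 50)   # frequency grid (Hz)
    omega = 2.0 * np.pi * freq
    amp_ratio = np.full_like(freq, 0.7)      # |E_sample| / |E_reference| (placeholder)
    delta_phi = omega * d * 0.5 / c          # placeholder phase difference

    n = 1.0 + c * delta_phi / (omega * d)    # refractive index
    alpha = -(2.0 / d) * np.log(amp_ratio * (n + 1.0) ** 2 / (4.0 * n))  # 1/m
    print(n[0], alpha[0])
    ```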

  7. DAS: A Data Management System for Instrument Tests and Operations

    NASA Astrophysics Data System (ADS)

    Frailis, M.; Sartor, S.; Zacchei, A.; Lodi, M.; Cirami, R.; Pasian, F.; Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Franceschi, E.; Nicastro, L.; Conforti, V.; Zoli, A.; Smart, R.; Morbidelli, R.; Dadina, M.

    2014-05-01

    The Data Access System (DAS) is a metadata and data management software system, providing a reusable solution for the storage of data acquired both from telescopes and from auxiliary data sources during the instrument development phases and operations. It is part of the Customizable Instrument WorkStation system (CIWS-FW), a framework for the storage, processing and quick-look analysis of data acquired from scientific instruments. The DAS provides a data access layer mainly targeted to software applications: quick-look displays, pre-processing pipelines and scientific workflows. It is logically organized in three main components: an intuitive and compact Data Definition Language (DAS DDL) in XML format, aimed at user-defined data types; an Application Programming Interface (DAS API), automatically adding classes and methods supporting the DDL data types, and providing an object-oriented query language; and a data management component, which maps the metadata of the DDL data types to a relational Database Management System (DBMS) and stores the data in a shared (network) file system. With the DAS DDL, developers define the data model for a particular project, specifying for each data type the metadata attributes, the data format and layout (if applicable), and named references to related or aggregated data types. Together with the DDL user-defined data types, the DAS API acts as the only interface to store, query and retrieve the metadata and data in the DAS system, providing both an abstract interface and a data-model-specific one in C, C++ and Python. The mapping of metadata in the back-end database is automatic and supports several relational DBMSs, including MySQL, Oracle and PostgreSQL.

  8. Core data elements tracking elder sexual abuse.

    PubMed

    Hanrahan, Nancy P; Burgess, Ann W; Gerolamo, Angela M

    2005-05-01

    Sexual abuse in the older adult population is an understudied vector of violent crimes with significant physical and psychological consequences for victims and families. Research requires a theoretical framework that delineates core elements using a standardized instrument. To develop a conceptual framework and identify core data elements specific to the older adult population, clinical, administrative, and criminal experts were consulted using a nominal group method to revise an existing sexual assault instrument. The revised instrument could be used to establish a national database of elder sexual abuse. The database could become a standard reference to guide the detection, assessment, and prosecution of elder sexual abuse crimes as well as build a base from which policy makers could plan and evaluate interventions that targeted risk factors.

  9. a 3d GIS Method Applied to Cataloging and Restoring: the Case of Aurelian Walls at Rome

    NASA Astrophysics Data System (ADS)

    Canciani, M.; Ceniccola, V.; Messi, M.; Saccone, M.; Zampilli, M.

    2013-07-01

    The project involves architecture, archaeology, restoration, graphic documentation and computer imaging. The objective is the development of a method for documenting an architectural feature, based on a three-dimensional model obtained through laser scanning technologies and linked to a database developed in a GIS environment. The case study concerns a short section of Rome's Aurelian walls, including the Porta Latina. The city walls are Rome's largest single architectural monument, subject to continuous deterioration, modification and maintenance since their original construction beginning in 271 AD. The documentation system provides a flexible, precise and easily applied instrument for recording the full appearance, materials, stratification palimpsest and conservation status, in order to identify restoration criteria and intervention priorities, and to monitor and control the use and conservation of the walls over time. The project began with an analysis and documentation campaign integrating direct, traditional recording methods with indirect ones: topographic instruments and 3D laser scanning. These recording systems permitted development of a geographic information system based on three-dimensional modelling of separate, individual elements, linked to a database and related to the various stratigraphic horizons, the construction techniques, the component materials and their state of degradation. The investigations of the extant wall fabric were further compared with historic documentation, from both graphic and descriptive sources. The resulting model constitutes the core of the GIS system for this specific monument. The methodology is notable for its low cost, precision, practicality and thoroughness, and can be applied to the entire Aurelian wall and to other monuments.

  10. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-01-01

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  11. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-12-31

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  12. Mechanistic design data from ODOT instrumented pavement sites : phase II report.

    DOT National Transportation Integrated Search

    2017-03-01

    This investigation examined data obtained from three previously-instrumented pavement test sites in Oregon. Data processing algorithms and templates were developed for each test site that facilitated full processing of all the data to build databases...

  13. Mechanistic design data from ODOT instrumented pavement sites : phase 1 report.

    DOT National Transportation Integrated Search

    2017-03-01

    This investigation examined data obtained from three previously-instrumented pavement test sites in Oregon. Data processing algorithms and templates were developed for each test site that facilitated full processing of all the data to build databases...

  14. Influence of mechanical instruments on the biocompatibility of titanium dental implants surfaces: a systematic review.

    PubMed

    Louropoulou, Anna; Slot, Dagmar E; Van der Weijden, Fridus

    2015-07-01

    The objective of this systematic review was to evaluate the effect of mechanical instruments on the biocompatibility of titanium dental implant surfaces. MEDLINE, Cochrane-CENTRAL and EMBASE databases were searched up to December 2013 to identify controlled studies on the ability of cells to adhere to and colonize non-contaminated and contaminated, smooth and rough, titanium surfaces after instrumentation with different mechanical instruments. A comprehensive search identified 1893 unique potential papers. Eleven studies met the inclusion criteria and were selected for this review; all were in vitro studies. Most studies used titanium discs, strips and cylinders. The air-powder abrasive was the most frequently evaluated treatment. The available studies showed high heterogeneity, which precluded statistical analysis of the data; the conclusions are therefore not based on quantitative data. Instrumentation seems to have a selective influence on the attachment of different cells. In the presence of contamination, plastic curettes, metal curettes, rotating titanium brushes and an ultrasonic scaling system with a carbon tip and polishing fluid seem to fail to restore the biocompatibility of rough titanium surfaces. The air-powder abrasive system with sodium bicarbonate powder does not seem to affect the fibroblast-titanium surface interaction after treatment of smooth or rough surfaces, even in the presence of contamination. The available data suggest that treatment with an air-powder abrasive system with sodium bicarbonate powder does not seem to adversely affect the biocompatibility of titanium dental implant surfaces. However, the clinical impact of these findings requires further clarification. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. The new IAGOS Database Portal

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Fontaine, Alain

    2016-04-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft, and the IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Database Portal (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). The new IAGOS Database Portal was released in December 2015. The main improvement is the implementation of interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up, composed of three data centers: the IAGOS database in Toulouse; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the CAMS data center in Jülich (http://join.iek.fz-juelich.de). The CAMS (Copernicus Atmospheric Monitoring Service) project is a prominent user of the IGAS data network. The new portal provides improved and new services such as downloads in NetCDF or NASA Ames formats, plotting tools (maps, time series, vertical profiles, etc.) and user management. Added-value products are available on the portal: back trajectories, origin of air masses, co-location with satellite data, etc. The link with the CAMS data center, through JOIN (Jülich OWS Interface), makes it possible to combine model outputs with IAGOS data for inter-comparison. Finally, IAGOS metadata have been standardized (ISO 19115) and now provide complete information about data traceability and quality.
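
    Since the portal offers downloads in NetCDF format, a typical first step for a user is to load a file with the netCDF4 Python package. The sketch below assumes a downloaded file and an ozone variable named "O3"; both names are illustrative, as the actual naming follows the portal's conventions.

        from netCDF4 import Dataset  # pip install netCDF4

        # File and variable names are hypothetical; consult the portal's
        # documentation for the naming used in IAGOS NetCDF products.
        with Dataset("iagos_flight_example.nc") as nc:
            print(nc.variables.keys())       # list the available variables
            ozone = nc.variables["O3"][:]    # assumed ozone variable name
            print(float(ozone.mean()))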

  16. The Master Lens Database and The Orphan Lenses Project

    NASA Astrophysics Data System (ADS)

    Moustakas, Leonidas

    2012-10-01

    Strong gravitational lenses are uniquely suited for the study of dark matter structure and substructure within massive halos of many scales, act as gravitational telescopes for distant faint objects, and can give powerful and competitive cosmological constraints. While hundreds of strong lenses are known to date, spanning five orders of magnitude in mass scale, thousands will be identified this decade. To fully exploit the power of these objects presently, and in the near future, we are creating the Master Lens Database. This is a clearinghouse of all known strong lens systems, with a sophisticated and modern database of uniformly measured and derived observational and lens-model derived quantities, using archival Hubble data across several instruments. This Database enables new science that can be done with a comprehensive sample of strong lenses. The operational goal of this proposal is to develop the process and the code to semi-automatically stage Hubble data of each system, create appropriate masks of the lensing objects and lensing features, and derive gravitational lens models, to provide a uniform and fairly comprehensive information set that is ingested into the Database. The scientific goal for this team is to use the properties of the ensemble of lenses to make a new study of the internal structure of lensing galaxies, and to identify new objects that show evidence of strong substructure lensing, for follow-up study. All data, scripts, masks, model setup files, and derived parameters, will be public, and free. The Database will be accessible online and through a sophisticated smartphone application, which will also be free.

  17. Human Factors Considerations for Area Navigation Departure and Arrival Procedures

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Adams, Catherine A.

    2006-01-01

    Area navigation (RNAV) procedures are being implemented in the United States and around the world as part of a transition to a performance-based navigation system. These procedures are providing significant benefits and have also caused some human factors issues to emerge. Under sponsorship from the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA) has undertaken a project to document RNAV-related human factors issues and propose areas for further consideration. The component focusing on RNAV Departure and Arrival Procedures involved discussions with expert users, a literature review, and a focused review of the NASA Aviation Safety Reporting System (ASRS) database. Issues were found to include aspects of air traffic control and airline procedures, aircraft systems, and procedure design. Major findings suggest the need for specific instrument procedure design guidelines that consider the effects of human performance. Ongoing industry and government activities to address air-ground communication terminology, design improvements, and chart-database commonality are strongly encouraged. A review of factors contributing to RNAV in-service errors would likely lead to improved system design and operational performance.

  18. The Young Visual Binary Database

    NASA Astrophysics Data System (ADS)

    Prato, Lisa A.; Avilez, Ian; Allen, Thomas; Zoonematkermani, Saeid; Biddle, Lauren; Muzzio, Ryan; Wittal, Matthew; Schaefer, Gail; Simon, Michal

    2017-01-01

    We have obtained adaptive optics imaging and high-resolution H-band and in some cases K-band spectra of each component in close to 100 young multiple systems in the nearby star forming regions of Taurus, Ophiuchus, TW Hya, and Orion. The binary separations for the pairs in our sample range from 30 mas to 3 arcseconds. The imaging and most of our spectra were obtained with instruments behind adaptive optics systems in order to resolve even the closest companions. We are in the process of determining fundamental stellar and circumstellar properties, such as effective temperature, Vsin(i), veiling, and radial velocity, for each component in the entire sample. The beta version of our database includes systems in the Taurus region and provides plots, downloadable ascii spectra, and values of the stellar and circumstellar properties for both stars in each system. This resource is openly available to the community at http://jumar.lowell.edu/BinaryStars/. In this poster we describe initial results from our analysis of the survey data. Support for this research was provided in part by NSF award AST-1313399 and by NASA Keck KPDA funding.

  19. The utilization of neural nets in populating an object-oriented database

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific databases are usually developed, managed and populated in a tedious, error-prone and self-limiting way in terms of what can be described in a relational Database Management System (DBMS). The next generation of Earth remote sensing platforms (e.g., the Earth Observing System (EOS)) will be capable of generating data at a rate of over 300 megabits per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context, and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data are then dynamically allocated to an object-oriented database where they can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  20. The quantitative measurement of organizational culture in health care: a review of the available instruments.

    PubMed

    Scott, Tim; Mannion, Russell; Davies, Huw; Marshall, Martin

    2003-06-01

    To review the quantitative instruments available to health service researchers who want to measure culture and cultural change. A literature search was conducted using Medline, Cinahl, Helmis, Psychlit, Dhdata, and the database of the King's Fund in London for articles published up to June 2001, using the phrase "organizational culture." In addition, all citations and the gray literature were reviewed and advice was sought from experts in the field to identify instruments not found on the electronic databases. The search focused on instruments used to quantify culture with a track record, or potential for use, in health care settings. For each instrument we examined the cultural dimensions addressed, the number of items for each questionnaire, the measurement scale adopted, examples of studies that had used the tool, the scientific properties of the instrument, and its strengths and limitations. Thirteen instruments were found that satisfied our inclusion criteria, of which nine have a track record in studies involving health care organizations. The instruments varied considerably in terms of their grounding in theory, format, length, scope, and scientific properties. A range of instruments with differing characteristics are available to researchers interested in organizational culture, all of which have limitations in terms of their scope, ease of use, or scientific properties. The choice of instrument should be determined by how organizational culture is conceptualized by the research team, the purpose of the investigation, intended use of the results, and availability of resources.

  1. The COMPTEL Processing and Analysis Software system (COMPASS)

    NASA Astrophysics Data System (ADS)

    de Vries, C. P.; COMPTEL Collaboration

    The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data on the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to be able to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and to exchange data swiftly using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.
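
    The heritage-tracing role of the RDBMS can be illustrated with a minimal Python sketch using the standard-library sqlite3 module; the table layout, product names, and status values are hypothetical, chosen only to show the idea of recording each product's parent and pipeline state.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE products (
                            id INTEGER PRIMARY KEY,
                            name TEXT,
                            parent_id INTEGER,   -- heritage link to the parent
                            status TEXT)""")     # pipeline state of the product
        conn.execute("INSERT INTO products VALUES (1, 'orbit_0042_raw', NULL, 'RAW')")
        conn.execute("INSERT INTO products VALUES (2, 'orbit_0042_cal', 1, 'CALIBRATED')")

        # Walk the heritage chain of a derived product back to its raw parent.
        pid = 2
        while pid is not None:
            name, pid = conn.execute(
                "SELECT name, parent_id FROM products WHERE id = ?",
                (pid,)).fetchone()
            print(name)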

  2. Two centuries of French patents as documentation of musical instrument construction

    NASA Astrophysics Data System (ADS)

    Jean, Haury

    2005-09-01

    The French Patent Office I.N.P.I. has preserved the originals of ca. 12000 French patents filed between 1791 and the present day that are concerned with music-related inventions. As an I.N.P.I. pilot project, these were identified, collected, and classified by the present author, and the resulting database, named ``Musique & Brevets'', is to be expanded with English, American, and German material, currently bringing the knowledge base up to 1900. It is expected to be made available on an I.N.P.I. website. This is an unequaled initiative that covers all branches of musical instrument manufacture, mechanical musical instruments, and early recording and reproduction of music, but also educational material and methods for printing music. A number of websites already present inventions on musical instruments, but each is restricted to one particular instrument and its related patents. ``Musique & Brevets'' intends to be exhaustive and to make links between patents filed in different countries at the same time. The paper will present the content of the database, the access to texts and drawings of the patents via specific links, and their importance for the study of the history and construction of musical instruments.

  3. Portable and programmable clinical EOG diagnostic system.

    PubMed

    Chen, S C; Tsai, T T; Luo, C H

    2000-01-01

    Monitoring eye movements is clinically important in the diagnosis of diseases of the central nervous system. Electrooculography (EOG) is one method of obtaining such records; it uses skin electrodes and exploits the anterior-posterior polarization of the eye. A new EOG diagnostic system has been developed that uses two off-the-shelf portable notebook computers, a projector and simple electronic hardware. It can be operated under Windows 95, 98 or NT, and has significant advantages over similar equipment, including programmability, portability, improved safety and low cost. Portability is especially important for acutely ill or handicapped patients. The purpose of this paper is to introduce the techniques of computer animation, data acquisition, real-time analysis of measured data, and database management used to implement a portable, programmable and inexpensive contact EOG instrument, a convenient replacement for the expensive, inflexible and bulky EOG instruments currently available commercially. Stimulation patterns for clinical application can easily be created in different shapes, time sequences and colours by programming in the Delphi language. With the help of Winstar (a software package used to control I/O and interrupt functions of the computer under Windows 95, 98 and NT), the I/O communication between the two notebook computers and the A/D interface module can be programmed effectively. In addition, the new EOG diagnostic system is battery operated, giving low noise and electrical isolation. Two kinds of EOG tests, pursuit and saccade, were performed on 20 normal subjects with the new instrument. Based on the test results, its performance is superior to that of other commercially available instruments. We hope this new system will make clinical EOG diagnosis and basic medical research more convenient for doctors and researchers.

  4. A Free Database of Auto-detected Full-sun Coronal Hole Maps

    NASA Astrophysics Data System (ADS)

    Caplan, R. M.; Downs, C.; Linker, J.

    2016-12-01

    We present a 4-year database (06/10/2010 to 08/18/2014, at 6-hr cadence) of full-sun synchronic EUV and coronal hole (CH) maps, made available on a dedicated web site (http://www.predsci.com/chd). The maps are generated from STEREO/EUVI A&B 195 Å and SDO/AIA 193 Å images through an automated pipeline (Caplan et al. (2016), ApJ 823, 53). Specifically, the original data are preprocessed with PSF deconvolution, a nonlinear limb-brightening correction, and a nonlinear inter-instrument intensity normalization. Coronal holes are then detected in the preprocessed images using a GPU-accelerated region-growing segmentation algorithm. The final results from all three instruments are merged and projected to form full-sun sine-latitude maps. All the software used in processing the maps is provided and can easily be adapted for use with other instruments and channels. We describe the data pipeline and show examples from the database. We also detail recent CH-detection validation experiments using synthetic EUV emission images produced from global thermodynamic MHD simulations.
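
    The detection step is a region-growing segmentation: pixels that are clearly dark in EUV seed regions that then expand into neighbouring pixels below a looser threshold. A minimal single-threaded Python/numpy sketch of the idea follows; the published pipeline is GPU-accelerated and uses calibrated thresholds, whereas the thresholds here are free parameters.

        import numpy as np
        from collections import deque

        def grow_regions(img, seed_thresh, grow_thresh):
            """Grow coronal-hole candidate regions from dark seed pixels.

            img: 2-D array of EUV intensities.
            seed_thresh: pixels below this value start a region.
            grow_thresh: pixels below this value may join a region.
            Returns a boolean mask of detected regions.
            """
            mask = img < seed_thresh                 # seed pixels
            queue = deque(zip(*np.where(mask)))
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                            and not mask[ny, nx] and img[ny, nx] < grow_thresh):
                        mask[ny, nx] = True
                        queue.append((ny, nx))
            return mask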

  5. Interactive Multi-Instrument Database of Solar Flares (IMIDSF)

    NASA Astrophysics Data System (ADS)

    Sadykov, Viacheslav M.; Nita, Gelu M.; Oria, Vincent; Kosovichev, Alexander G.

    2017-08-01

    Solar flares represent a complicated physical phenomenon observed in a broad range of the electromagnetic spectrum, from radio waves to gamma-rays. For a complete understanding of flares it is necessary to perform a combined multi-wavelength analysis using observations from many satellites and ground-based observatories. For efficient data search, integration of different flare lists, and representation of observational data, we have developed the Interactive Multi-Instrument Database of Solar Flares (https://solarflare.njit.edu/). The web database is fully functional and allows the user to search for uniquely identified flare events based on their physical descriptors and the availability of observations from a particular set of instruments. Currently, data from three primary flare lists (GOES, RHESSI and HEK) and a variety of other event catalogs (Hinode, Fermi GBM, Konus-Wind, OVSA flare catalogs, CACTus CME catalog, Filament eruption catalog) and observing logs (IRIS and Nobeyama coverage) are integrated. An additional set of physical descriptors (temperature and emission measure), along with observing summaries, data links and multi-wavelength light curves, is provided for each flare event since January 2002. Results of an initial statistical analysis will be presented.
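
    Searching by physical descriptors, which the portal performs server-side, can be mimicked locally once a flare list has been exported. The Python sketch below filters a small list of flare records by GOES class and peak-time window; the record fields are hypothetical stand-ins for the database's actual descriptors.

        from datetime import datetime

        # Hypothetical exported records; the real database exposes many more fields.
        flares = [
            {"id": "F001", "goes_class": "M2.2", "peak": "2014-03-29T17:48:00"},
            {"id": "F002", "goes_class": "C5.1", "peak": "2014-03-30T02:10:00"},
        ]

        def select(flares, min_class, start, end):
            order = "ABCMX"  # GOES class letters, weakest to strongest
            def rank(c):     # e.g. "M2.2" -> (3, 2.2)
                return (order.index(c[0]), float(c[1:]))
            t0, t1 = datetime.fromisoformat(start), datetime.fromisoformat(end)
            return [f for f in flares
                    if rank(f["goes_class"]) >= rank(min_class)
                    and t0 <= datetime.fromisoformat(f["peak"]) <= t1]

        print(select(flares, "M1.0", "2014-03-29T00:00:00", "2014-03-31T00:00:00"))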

  6. The Development of Two Science Investigator-led Processing Systems (SIPS) for NASA's Earth Observation System (EOS)

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2004-01-01

    In 2001, NASA Goddard Space Flight Center's Laboratory for Terrestrial Physics (LTP) started the construction of a Science Investigator-led Processing System (SIPS) for processing data from the Ozone Monitoring Instrument (OMI), which will launch on the Aura platform in mid-2004. OMI is a contribution of the Netherlands Agency for Aerospace Programs (NIVR), in collaboration with the Finnish Meteorological Institute (FMI), to the Earth Observing System (EOS) Aura mission. It will continue the Total Ozone Mapping Spectrometer (TOMS) record for total ozone and other atmospheric parameters related to ozone chemistry and climate. OMI measurements will be highly synergistic with the other instruments on the EOS Aura platform. The LTP previously developed the Moderate Resolution Imaging Spectrometer (MODIS) Data Processing System (MODAPS), which has been in full operation since the launches of the Terra and Aqua spacecraft in December 1999 and May 2002, respectively. During that time, it has continually evolved to better support the needs of the MODIS team. We now run multiple instances of the system, managing faster-than-real-time reprocessings of the data as well as continuing forward processing. The new OMI Data Processing System (OMIDAPS) was adapted from the MODAPS. It will ingest raw data from the satellite ground station and process it to produce calibrated, geolocated higher-level data products. These data products will be transmitted to the Goddard Distributed Active Archive Center (GDAAC) instance of the EOS Data and Information System (EOSDIS) for long-term archive and distribution to the public. The OMIDAPS will also provide data distribution to the OMI Science Team for quality assessment, algorithm improvement, calibration, etc. We have taken advantage of lessons learned from the MODIS experience and of software already developed for MODIS. We made some changes in the hardware system organization, database and software to adapt the system for OMI. We replaced the fundamental database system, Sybase, with an open-source RDBMS called PostgreSQL, and based the entire OMIDAPS on a cluster of Linux-based commodity computers rather than the large SGI servers that MODAPS uses. Rather than relying on a central I/O server host, the new system distributes its data archive among multiple server hosts in the cluster. OMI is also customizing the graphical user interfaces and reporting structure to more closely meet the needs of the OMI Science Team. Prior to 2003, simulated OMI data and the science algorithms were not ready for production testing. We initially constructed a prototype system and tested it using a 25-year dataset of Total Ozone Mapping Spectrometer (TOMS) and Solar Backscatter Ultraviolet Instrument (SBUV) data. This prototype system provided a platform to support the adaptation of the algorithms for OMI, and provided reprocessing of the historical data, aiding in its analysis. In a recent reanalysis of the TOMS data, the OMIDAPS processed 108,000 full orbits of data through 4 processing steps per orbit, producing about 800,000 files (400 GiB) of level-2 and higher data products. More recently we have installed two instances of the OMIDAPS for integration and testing of OMI science processes as they are delivered from the Science Team. A test instance of the OMIDAPS has also supported a series of "Interface Confidence Tests" (ICTs) and end-to-end ground system tests to ensure the launch readiness of the system.
This paper will discuss the high-level hardware, software, and database organization of the OMIDAPS and how it builds on the MODAPS heritage system. It will also provide an overview of the testing and implementation of the production OMIDAPS.
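
    One design change noted above is the distribution of the data archive among multiple server hosts rather than a central I/O server. A common way to make such placement deterministic is to hash the file name to select a host; the short Python sketch below illustrates that generic technique and is not a description of OMIDAPS internals (the host and file names are invented).

        import hashlib

        HOSTS = ["archive01", "archive02", "archive03"]  # invented host names

        def host_for(filename):
            """Deterministically map a data file to one archive host."""
            digest = hashlib.md5(filename.encode()).hexdigest()
            return HOSTS[int(digest, 16) % len(HOSTS)]

        print(host_for("omi_l2_example_granule.he5"))  # invented file name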

  7. Health-Related Resource-Use Measurement Instruments for Intersectoral Costs and Benefits in the Education and Criminal Justice Sectors.

    PubMed

    Mayer, Susanne; Paulus, Aggie T G; Łaszewska, Agata; Simon, Judit; Drost, Ruben M W A; Ruwaard, Dirk; Evers, Silvia M A A

    2017-09-01

    Intersectoral costs and benefits (ICBs), i.e. the costs and benefits of healthcare interventions outside the healthcare sector, can be a crucial component in economic evaluations from the societal perspective. Pivotal to their estimation is the existence of sound resource-use measurement (RUM) instruments; however, RUM instruments for ICBs in the education or criminal justice sectors have not yet been systematically collated, nor their psychometric quality assessed. This review aims to fill this gap. To identify relevant instruments, the Database of Instruments for Resource Use Measurement (DIRUM) was searched. Additionally, a systematic literature review was conducted in seven electronic databases to detect instruments containing ICB items used in economic evaluations. Finally, studies evaluating the psychometric quality of these instruments were sought. Twenty-six unique instruments were included. Most frequently, ICB items measured school absenteeism, tutoring, classroom assistance or contacts with legal representatives, police custody/prison detainment and court appearances, with the highest number of items listed in the Client Service Receipt Inventory/Client Sociodemographic and Service Receipt Inventory/Client Service Receipt Inventory-Children's Version (CSRI/CSSRI/CSRI-C), Studying the Scope of Parental Expenditures (SCOPE) and Self-Harm Intervention, Family Therapy (SHIFT) instruments. ICBs in the education sector were especially relevant for age-related developmental disorders and chronic diseases, while criminal justice resource use seems more important in mental health, including alcohol-related disorders or substance abuse. Evidence on the validity or reliability of ICB items was published for only two instruments. With a heterogeneous variety of ICBs found to be relevant across several disease areas, but many ICB instruments applied in only one study (21/26 instruments), setting up an international task force, for example to develop an internationally adaptable instrument, is recommended.

  8. A review of instruments to measure interprofessional team-based primary care.

    PubMed

    Shoemaker, Sarah J; Parchman, Michael L; Fuda, Kathleen Kerwin; Schaefer, Judith; Levin, Jessica; Hunt, Meaghan; Ricciardi, Richard

    2016-07-01

    Interprofessional team-based care is increasingly regarded as an important feature of delivery systems redesigned to provide more efficient and higher quality care, including primary care. Measurement of the functioning of such teams might enable improvement of team effectiveness and could facilitate research on team-based primary care. Our aims were to develop a conceptual framework of high-functioning primary care teams to identify and review instruments that measure the constructs identified in the framework, and to create a searchable, web-based atlas of such instruments (available at: http://primarycaremeasures.ahrq.gov/team-based-care/ ). Our conceptual framework was developed from existing frameworks, the teamwork literature, and expert input. The framework is based on an Input-Mediator-Output model and includes 12 constructs to which we mapped both instruments as a whole, and individual instrument items. Instruments were also reviewed for relevance to measuring team-based care, and characterized. Instruments were identified from peer-reviewed and grey literature, measure databases, and expert input. From nearly 200 instruments initially identified, we found 48 to be relevant to measuring team-based primary care. The majority of instruments were surveys (n = 44), and the remainder (n = 4) were observational checklists. Most instruments had been developed/tested in healthcare settings (n = 30) and addressed multiple constructs, most commonly communication (n = 42), heedful interrelating (n = 42), respectful interactions (n = 40), and shared explicit goals (n = 37). The majority of instruments had some reliability testing (n = 39) and over half included validity testing (n = 29). Currently available instruments offer promise to researchers and practitioners to assess teams' performance, but additional work is needed to adapt these instruments for primary care settings.

  9. The importance and complexity of regret in the measurement of ‘good’ decisions: a systematic review and a content analysis of existing assessment instruments

    PubMed Central

    Joseph‐Williams, Natalie; Edwards, Adrian; Elwyn, Glyn

    2011-01-01

    Background or context: Regret is a common consequence of decisions, including those decisions related to individuals’ health. Several assessment instruments have been developed that attempt to measure decision regret. However, recent research has highlighted the complexity of regret. Given its relevance to shared decision making, it is important to understand its conceptualization and the instruments used to measure it. Objectives: To review current conceptions of regret, to systematically identify instruments used to measure decision regret, and to assess whether they capture recent conceptualizations of regret. Search strategy: Five electronic databases were searched in 2008. Search strategies used a combination of MeSH terms (or database equivalent) and free-text searching under the following key headings: ‘Decision’ and ‘regret’ and ‘measurement’. Follow-up manual searches were also performed. Inclusion criteria: Articles were included if they reported the development and psychometric testing of an instrument designed to measure decision regret, or the use of a previously developed and tested instrument. Main results: Thirty-two articles were included: 10 report the development and validation of an instrument that measures decision regret and 22 report the use of a previously developed and tested instrument. Content analysis found that existing instruments for the measurement of regret do not capture current conceptualizations of regret and do not enable the construct of regret to be measured comprehensively. Conclusions: Existing instrumentation requires further development. There is also a need to clarify the purpose for using regret assessment instruments, as this will, and should, focus their future application. PMID:20860776

  10. Analytical characterization of a new mobile X-ray fluorescence and X-ray diffraction instrument combined with a pigment identification case study

    NASA Astrophysics Data System (ADS)

    Van de Voorde, Lien; Vekemans, Bart; Verhaeven, Eddy; Tack, Pieter; De Wolf, Robin; Garrevoet, Jan; Vandenabeele, Peter; Vincze, Laszlo

    2015-08-01

    A new, commercially available, mobile system combining X-ray diffraction and X-ray fluorescence has been evaluated which enables elemental analysis and phase identification simultaneously. The instrument makes use of a copper- or molybdenum-based miniature X-ray tube and a silicon-PIN diode energy-dispersive detector to count the photons originating from the samples. The X-ray tube and detector are both mounted on an X-ray diffraction protractor in a Bragg-Brentano θ:θ geometry. The mobile instrument is one of the lightest and most compact instruments of its kind (3.5 kg), making it very useful for in situ purposes such as the direct, non-destructive analysis of cultural heritage objects which need to be analyzed on site without any displacement. The supplied software allows both the operation of the instrument for data collection and in-depth data analysis using the International Centre for Diffraction Data database. This paper focuses on the characterization of the instrument, combined with a case study on pigment identification and an illustrative example of the analysis of lead-alloyed printing letters. The results show that this commercially available, light-weight instrument is able to identify non-destructively the main crystalline phases present in a variety of samples, with a high degree of flexibility regarding sample size and position.

  11. [The Brazilian Hospital Information System and the acute myocardial infarction hospital care].

    PubMed

    Escosteguy, Claudia Caminha; Portela, Margareth Crisóstomo; Medronho, Roberto de Andrade; de Vasconcellos, Maurício Teixeira Leite

    2002-08-01

    To analyze the applicability of the Brazilian Unified Health System's national hospital database to evaluating the quality of acute myocardial infarction (AMI) hospital care. A total of 1,936 hospital admission forms with AMI as the primary diagnosis in the municipal district of Rio de Janeiro, Brazil, in 1997 were evaluated. Data were collected from the national hospital database. A stratified random sample of 391 medical records was also evaluated. AMI diagnosis agreement followed literature criteria. Variable accuracy analysis was performed using the kappa agreement index. The quality of the AMI diagnosis registered in hospital admission forms was satisfactory according to the gold standard of the literature. In general, the accuracy of the demographic (sex, age group), process (medical procedures and interventions), and outcome (hospital death) variables was satisfactory; the accuracy of the demographic and outcome variables was higher than that of the process variables. Underregistration of secondary diagnoses in the forms was high and was the main limiting factor. Given the study findings and the widespread availability of the national hospital database, its use as an instrument for evaluating the quality of AMI medical care is pertinent.

  12. Interactive Multi-Instrument Database of Solar Flares

    NASA Technical Reports Server (NTRS)

    Ranjan, Shubha S.; Spaulding, Ryan; Deardorff, Donald G.

    2018-01-01

    The fundamental motivation of the project is that the scientific output of solar research can be greatly enhanced by better exploitation of existing solar/heliospheric space-data products jointly with ground-based observations. Our primary focus is on developing a specific innovative methodology, based on recent advances in "big data" intelligent databases, applied to the growing amount of high-spatial-resolution, multi-wavelength, high-cadence data from NASA's missions and supporting ground-based observatories. Our flare database is not simply a manually searchable time-based catalog of events or a list of web links pointing to data. It is a preprocessed metadata repository enabling fast search and automatic identification of all recorded flares sharing a specifiable set of characteristics, features, and parameters. The result is a new and unique database of solar flares and data search and classification tools for the Heliophysics community, enabling multi-instrument/multi-wavelength investigations of flare physics and supporting further development of flare-prediction methodologies.

  13. The quest for the perfect gravity anomaly: Part 2 - Mass effects and anomaly inversion

    USGS Publications Warehouse

    Keller, Gordon R.; Hildenbrand, T.G.; Hinze, W. J.; Li, X.; Ravat, D.; Webring, M.

    2006-01-01

    Gravity anomalies have become an important tool for geologic studies since the widespread use of high-precision gravimeters after the Second World War. More recently, the development of instrumentation for airborne gravity observations, procedures for acquiring data from satellite platforms, the readily available Global Positioning System for precise vertical and horizontal control, improved global databases, and enhancements in computational hardware and software have accelerated the use of the gravity method. As a result, efforts are being made to improve the gravity databases that are made available to the geoscience community by broadening their observational holdings and increasing the accuracy and precision of the included data. Currently the North American Gravity Database, as well as the individual databases of Canada, Mexico, and the United States of America, are being revised using new formats and standards. The objective of this paper is to describe the use of the revised standards for gravity data processing and modeling and their impact on geological interpretations. © 2005 Society of Exploration Geophysicists.

  14. Development of a GIService based on spatial data mining for location choice of convenience stores in Taipei City

    NASA Astrophysics Data System (ADS)

    Jung, Chinte; Sun, Chih-Hong

    2006-10-01

    Motivated by the increasing accessibility of technology, more and more spatial data are being made digitally available. How to extract the valuable knowledge from these large (spatial) databases is becoming increasingly important to businesses, as well. It is essential to be able to analyze and utilize these large datasets, convert them into useful knowledge, and transmit them through GIS-enabled instruments and the Internet, conveying the key information to business decision-makers effectively and benefiting business entities. In this research, we combine the techniques of GIS, spatial decision support system (SDSS), spatial data mining (SDM), and ArcGIS Server to achieve the following goals: (1) integrate databases from spatial and non-spatial datasets about the locations of businesses in Taipei, Taiwan; (2) use the association rules, one of the SDM methods, to extract the knowledge from the integrated databases; and (3) develop a Web-based SDSS GIService as a location-selection tool for business by the product of ArcGIS Server.
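
    Association rules, the SDM method used here, score co-occurrence patterns by support and confidence. The self-contained Python sketch below computes both measures for a toy set of candidate-site "transactions"; the attribute names are invented for illustration.

        # Each "transaction" lists attributes observed around a candidate site.
        sites = [
            {"near_school", "high_foot_traffic", "store_succeeds"},
            {"near_school", "store_succeeds"},
            {"high_foot_traffic"},
            {"near_school", "high_foot_traffic", "store_succeeds"},
        ]

        def support(itemset):
            return sum(itemset <= s for s in sites) / len(sites)

        def confidence(antecedent, consequent):
            return support(antecedent | consequent) / support(antecedent)

        rule = ({"near_school"}, {"store_succeeds"})
        print("support:", support(rule[0] | rule[1]))   # 0.75
        print("confidence:", confidence(*rule))         # 1.0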

  15. Pivotal role of computers and software in mass spectrometry - SEQUEST and 20 years of tandem MS database searching.

    PubMed

    Yates, John R

    2015-11-01

    Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.
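
    Database searching of tandem MS spectra, in the spirit of SEQUEST, scores each candidate peptide by how well its predicted fragment masses match the observed peaks. The Python sketch below implements a deliberately simplified shared-peak count within a mass tolerance; real search engines use cross-correlation scoring and many refinements, and the masses shown are invented.

        def shared_peak_score(observed, theoretical, tol=0.5):
            """Count theoretical fragment masses matched by an observed peak.

            observed: list of observed peak m/z values.
            theoretical: predicted fragment m/z values for a candidate peptide.
            tol: match tolerance in m/z units.
            """
            return sum(
                any(abs(o - t) <= tol for o in observed) for t in theoretical)

        observed = [175.1, 262.1, 333.2, 404.2]          # invented peaks
        candidates = {
            "PEPTIDER": [175.1, 262.2, 333.1, 500.0],    # invented fragments
            "OTHERSEQ": [150.0, 290.0, 410.0],
        }
        best = max(candidates,
                   key=lambda p: shared_peak_score(observed, candidates[p]))
        print(best)  # -> PEPTIDER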

  16. Integration of a synthetic vision system with airborne laser range scanner-based terrain referenced navigation for precision approach guidance

    NASA Astrophysics Data System (ADS)

    Uijt de Haag, Maarten; Campbell, Jacob; van Graas, Frank

    2005-05-01

    Synthetic Vision Systems (SVS) provide pilots with a virtual visual depiction of the external environment. When SVS are used for aircraft precision approach guidance, accurate positioning relative to the runway with a high level of integrity is required. Precision approach guidance systems in use today require ground-based electronic navigation components, with at least one installation at each airport and in many cases multiple installations to service approaches to all qualifying runways. A terrain-referenced approach guidance system is envisioned to provide precision guidance to an aircraft without the use of ground-based electronic navigation components installed at the airport. This autonomy makes it a good candidate for integration with an SVS. At the Ohio University Avionics Engineering Center (AEC), work has been underway on the development of such a terrain-referenced navigation system. When used in conjunction with an Inertial Measurement Unit (IMU) and a high-accuracy, high-resolution terrain database, this system can provide navigation and guidance information to the pilot on an SVS or on conventional instruments. The terrain-referenced navigation system under development at AEC operates on principles similar to other terrain navigation systems: a ground-sensing sensor (in this case an airborne laser scanner) gathers range measurements to the terrain; these data are then matched against an onboard terrain database to find the most likely position solution, which is used to update an inertial sensor-based navigator. AEC's design differs from today's common terrain navigators in its use of a high-resolution terrain database (~1 meter post spacing) in conjunction with an airborne laser scanner capable of providing tens of thousands of independent terrain elevation measurements per second with centimeter-level accuracies. When combined with data from an inertial navigator, the high-resolution terrain database and laser scanner system is capable of providing near meter-level horizontal and vertical position estimates. Furthermore, the system capitalizes on (1) the position and integrity benefits provided by the Wide Area Augmentation System (WAAS) to reduce the initial search space size, and (2) the availability of high-accuracy, high-resolution databases. This paper presents results from flight tests in which the terrain-referenced navigator is used to provide guidance cues for a precision approach.
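
    The matching step, finding the horizontal position at which the scanner-measured elevations best agree with the terrain database, can be illustrated by a small grid search that minimizes the RMS elevation residual. The numpy sketch below is a toy version of the concept under simplifying assumptions (known attitude, samples well inside the DEM, uniform post spacing), not AEC's actual estimator.

        import numpy as np

        def best_offset(dem, samples, search=5):
            """Grid-search the horizontal offset best matching the DEM.

            dem: 2-D array of terrain elevations.
            samples: (row, col, measured_elevation) triples from the scanner,
                     expressed in the navigator's (possibly biased) frame and
                     assumed to lie well inside the DEM after offsetting.
            search: half-width of the search window in grid posts.
            Returns the (drow, dcol) minimizing the RMS elevation residual.
            """
            best, best_rms = (0, 0), np.inf
            for dr in range(-search, search + 1):
                for dc in range(-search, search + 1):
                    res = [z - dem[r + dr, c + dc] for r, c, z in samples]
                    rms = float(np.sqrt(np.mean(np.square(res))))
                    if rms < best_rms:
                        best, best_rms = (dr, dc), rms
            return best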

  17. The Quantitative Measurement of Organizational Culture in Health Care: A Review of the Available Instruments

    PubMed Central

    Scott, Tim; Mannion, Russell; Davies, Huw; Marshall, Martin

    2003-01-01

    Objective: To review the quantitative instruments available to health service researchers who want to measure culture and cultural change. Data Sources: A literature search was conducted using Medline, Cinahl, Helmis, Psychlit, Dhdata, and the database of the King's Fund in London for articles published up to June 2001, using the phrase “organizational culture.” In addition, all citations and the gray literature were reviewed and advice was sought from experts in the field to identify instruments not found on the electronic databases. The search focused on instruments used to quantify culture with a track record, or potential for use, in health care settings. Data Extraction: For each instrument we examined the cultural dimensions addressed, the number of items for each questionnaire, the measurement scale adopted, examples of studies that had used the tool, the scientific properties of the instrument, and its strengths and limitations. Principal Findings: Thirteen instruments were found that satisfied our inclusion criteria, of which nine have a track record in studies involving health care organizations. The instruments varied considerably in terms of their grounding in theory, format, length, scope, and scientific properties. Conclusions: A range of instruments with differing characteristics are available to researchers interested in organizational culture, all of which have limitations in terms of their scope, ease of use, or scientific properties. The choice of instrument should be determined by how organizational culture is conceptualized by the research team, the purpose of the investigation, intended use of the results, and availability of resources. PMID:12822919

  18. Physics through the 1990s: Scientific interfaces and technological applications

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics-biophysics, the brain, and theoretical biology; the physics-chemistry interface-instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials; materials science; geophysics-tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing; computational physics-complex systems and applications in basic research; mathematics-field theory and chaos; microelectronics-integrated circuits, miniaturization, future trends; optical information technologies-fiber optics and photonics; instrumentation; physics applications to energy needs and the environment; national security-devices, weapons, and arms control; medical physics-radiology, ultrasonics, NMR, and photonics. An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large-facility programs, government agency programs, and computer database needs.

  19. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  20. Pentaerythritol Tetranitrate (PETN) Surveillance by HPLC-MS: Instrumental Parameters Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, C A; Meissner, R

    Surveillance of PETN homologs in the stockpile here at LLNL is currently carried out by high-performance liquid chromatography (HPLC) with ultraviolet (UV) detection. Identification of unknown chromatographic peaks with this detection scheme is severely limited. The design agency is aware of the limitations of this methodology and ordered this study to develop instrumental parameters for the use of a currently owned mass spectrometer (MS) as the detection system. The resulting procedure would be a "drop-in" replacement for the current surveillance method (ERD04-524). The addition of quadrupole mass spectrometry provides qualitative identification of PETN and its homologs (Petrin, DiPEHN, TriPEON, and TetraPEDN) using an LLNL-generated database, while providing mass clues to the identity of unknown chromatographic peaks.

  1. Nuclear Energy Infrastructure Database Description and User’s Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidrich, Brenden

    In 2014, the Deputy Assistant Secretary for Science and Technology Innovation initiated the Nuclear Energy (NE)–Infrastructure Management Project by tasking the Nuclear Science User Facilities, formerly the Advanced Test Reactor National Scientific User Facility, to create a searchable and interactive database of all pertinent NE-supported and -related infrastructure. This database, known as the Nuclear Energy Infrastructure Database (NEID), is used for analyses to establish needs, redundancies, efficiencies, distributions, etc., to best understand the utility of NE’s infrastructure and inform the content of infrastructure calls. The Nuclear Science User Facilities developed the database by utilizing data and policy direction from a variety of reports from the U.S. Department of Energy, the National Research Council, the International Atomic Energy Agency, and various other federal and civilian resources. The NEID currently contains data on 802 research and development instruments housed in 377 facilities at 84 institutions in the United States and abroad. The effort to maintain and expand the database is ongoing. Detailed information on many facilities must be gathered from associated institutions and added to complete the database. The data must be validated and kept current to capture facility and instrumentation status, as well as to cover new acquisitions and retirements. This document provides a short tutorial on the navigation of the NEID web portal at NSUF-Infrastructure.INL.gov.

  2. Database for earthquake strong motion studies in Italy

    USGS Publications Warehouse

    Scasserra, G.; Stewart, J.P.; Kayen, R.E.; Lanzo, G.

    2009-01-01

    We describe an Italian database of strong ground motion recordings and databanks delineating conditions at the instrument sites and the characteristics of the seismic sources. The strong motion database consists of 247 corrected recordings from 89 earthquakes and 101 recording stations. Uncorrected recordings were drawn from public web sites and processed on a record-by-record basis, using a procedure utilized in the Next-Generation Attenuation (NGA) project, to remove instrument resonances, minimize noise effects through low- and high-pass filtering, and apply baseline correction. The number of available uncorrected recordings was reduced by 52% (mostly because of s-triggers) to arrive at the 247 recordings in the database. The site databank includes, for every recording site, the surface geology, a measurement or estimate of the average shear wave velocity in the upper 30 m (Vs30), and information on instrument housing. Of the 89 sites, 39 have on-site velocity measurements (17 of which were performed as part of this study using SASW techniques). For the remaining sites, we estimate Vs30 based on measurements on similar geologic conditions where available; where no local velocity measurements are available, correlations with surface geology are used. Source parameters are drawn from databanks maintained (and recently updated) by the Istituto Nazionale di Geofisica e Vulcanologia and include hypocenter location and magnitude for small events (M ≲ 5.5) and finite source parameters for larger events. © 2009 A.S. Elnashai & N.N. Ambraseys.
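
    The record-by-record processing described, band-limiting each record and removing baseline drift, can be sketched with scipy. In the sketch below the corner frequencies are placeholders; in an NGA-style procedure they are chosen per record from the signal-to-noise characteristics.

        import numpy as np
        from scipy.signal import butter, filtfilt, detrend

        def process_record(accel, fs, f_lo=0.1, f_hi=25.0):
            """Baseline-correct and band-pass filter a strong-motion record.

            accel: raw acceleration time series.
            fs: sampling rate in Hz.
            f_lo, f_hi: placeholder corner frequencies in Hz; per-record
            values are chosen from noise characteristics in practice.
            """
            accel = detrend(accel)                    # remove linear baseline
            b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
            return filtfilt(b, a, accel)              # zero-phase filtering

        fs = 200.0
        t = np.arange(0, 10, 1 / fs)
        raw = np.sin(2 * np.pi * 1.5 * t) + 0.01 * t  # synthetic record with drift
        clean = process_record(raw, fs)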

  3. WFIRST: Update on the Coronagraph Science Requirements

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan S.; Cahoy, Kerri; Carlton, Ashley; Macintosh, Bruce; Turnbull, Margaret; Kasdin, Jeremy; WFIRST Coronagraph Science Investigation Teams

    2018-01-01

    The WFIRST Coronagraph instrument (CGI) will enable direct imaging and low resolution spectroscopy of exoplanets in reflected light and imaging polarimetry of circumstellar disks. The CGI science investigation teams were tasked with developing a set of science requirements which advance our knowledge of exoplanet occurrence and atmospheric composition, as well as the composition and morphology of exozodiacal debris disks, cold Kuiper Belt analogs, and protoplanetary systems. We present the initial content, rationales, validation, and verification plans for the WFIRST CGI, informed by detailed and still-evolving instrument and observatory performance models. We also discuss our approach to the requirements development and management process, including the collection and organization of science inputs, open source approach to managing the requirements database, and the range of models used for requirements validation. These tools can be applied to requirements development processes for other astrophysical space missions, and may ease their management and maintenance. These WFIRST CGI science requirements allow the community to learn about and provide insights and feedback on the expected instrument performance and science return.

  4. Monitoring industrial wastewater by online GC-MS with direct aqueous injection.

    PubMed

    Wortberg, M; Ziemer, W; Kugel, M; Müller, H; Neu, H-J

    2006-03-01

    An online GC-MS-system for automated monitoring of crude wastewater at a complex chemical production site is presented. The modular system is, in principle, based on commercial equipment, but utilizes a special, two-stage injector, which consists of a splitless vaporization chamber on top of a PTV injector filled with Tenax. This set-up enables direct injection of wastewater. Almost 140 volatile and semi-volatile compounds are calibrated down to 1 mg L(-1), which is sufficient for analysis of the influent of the wastewater-treatment plant. Two instruments analyze alternately, every 20 min, and the instrument cycle time is 40 min. The quantitative results are transferred to a database which is connected to a process-control system. Depending on the nature and concentration of a compound, an alarm can be generated and the wastewater stream can be diverted into an "off spec tank" if necessary. The GC-MS-system operates quasi-continuously with a system availability >98%. Data quality is automatically controlled in each run and by daily analysis of a quality-control sample. The development of a novel stacked PTV-PTV injector design to expand the range of analytes to selected basic compounds is described.
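
    As an illustration of the alarm step described above, the sketch below compares quantitative results for one run against per-compound concentration limits; the compound names and limits are hypothetical placeholders, not the plant's actual configuration.

        # Raise an alarm for any compound whose measured concentration (mg/L)
        # exceeds its configured limit; alarms could trigger diversion of the
        # wastewater stream to an "off spec tank".
        LIMITS_MG_PER_L = {"toluene": 5.0, "chlorobenzene": 2.0, "aniline": 1.0}

        def check_run(results_mg_per_l):
            """results_mg_per_l: dict mapping compound -> concentration (mg/L)."""
            return [(c, v, LIMITS_MG_PER_L[c])
                    for c, v in results_mg_per_l.items()
                    if c in LIMITS_MG_PER_L and v > LIMITS_MG_PER_L[c]]

        for compound, conc, limit in check_run({"toluene": 7.3, "aniline": 0.4}):
            print(f"ALARM: {compound} at {conc} mg/L exceeds {limit} mg/L")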

  5. Final Report: MaRSPlus Sensor System Electrical Cable Management and Distributed Motor Control Computer Interface

    NASA Technical Reports Server (NTRS)

    Reil, Robin

    2011-01-01

    The success of JPL's Next Generation Imaging Spectrometer (NGIS) in Earth remote sensing has inspired a follow-on instrument project, the MaRSPlus Sensor System (MSS). One of JPL's responsibilities in the MSS project involves updating the documentation from the previous JPL airborne imagers to provide all the information necessary for an outside customer to operate the instrument independently. As part of this documentation update, I created detailed electrical cabling diagrams to provide JPL technicians with clear and concise build instructions, and a database to track the status of cables from order to build to delivery. Simultaneously, a distributed motor control system is being developed for potential use on the proposed 2018 Mars rover mission. This system would significantly reduce the mass necessary for rover motor control, making more mass available to other important spacecraft systems. The current stage of the project consists of a desktop computer communicating with a single "cold box" unit containing the electronics to drive a motor. In order to test the electronics, I developed a graphical user interface (GUI) using MATLAB to allow a user to send simple commands to the cold box and display the responses received in a user-friendly format.

  6. Producing a Climate-Quality Database of Global Upper Ocean Profile Temperatures - The IQuOD (International Quality-controlled Ocean Database) Project.

    NASA Astrophysics Data System (ADS)

    Sprintall, J.; Cowley, R.; Palmer, M. D.; Domingues, C. M.; Suzuki, T.; Ishii, M.; Boyer, T.; Goni, G. J.; Gouretski, V. V.; Macdonald, A. M.; Thresher, A.; Good, S. A.; Diggs, S. C.

    2016-02-01

    Historical ocean temperature profile observations provide a critical element for a host of ocean and climate research activities. These include providing initial conditions for seasonal-to-decadal prediction systems, evaluating past variations in sea level and Earth's energy imbalance, ocean state estimation for studying variability and change, and climate model evaluation and development. The International Quality controlled Ocean Database (IQuOD) initiative represents a community effort to create the most globally complete temperature profile dataset, with (intelligent) metadata and assigned uncertainties. With an internationally coordinated effort organized by oceanographers, with data and ocean instrumentation expertise, and in close consultation with end users (e.g., climate modelers), the IQuOD initiative will assess and maximize the potential of an irreplaceable collection of ocean temperature observations (tens of millions of profiles collected at a cost of tens of billions of dollars, since 1772) to fulfil the demand for a climate-quality global database that can be used with greater confidence in a vast range of climate change related research and services of societal benefit. Progress towards version 1 of the IQuOD database, as well as ongoing and future work, will be presented. More information on IQuOD is available at www.iquod.org.

  7. Developing and Refining the Taiwan Birth Cohort Study (TBCS): Five Years of Experience

    ERIC Educational Resources Information Center

    Lung, For-Wey; Chiang, Tung-Liang; Lin, Shio-Jean; Shu, Bih-Ching; Lee, Meng-Chih

    2011-01-01

    The Taiwan Birth Cohort Study (TBCS) is the first nationwide birth cohort database in Asia designed to establish national norms of children's development. Several challenges during database development and data analysis were identified. Challenges include sampling methods, instrument development and statistical approach to missing data. The…

  8. The IAGOS Information System

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Thouret, Valérie; Brissebrat, Guillaume

    2017-04-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Data Portal (http://www.iagos.org, contact: damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles, etc.). New added-value products are, or will soon be, available through the portal: back trajectories, origin of air masses, co-location with satellite data, etc. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, allows users to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the coming year, IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network), and the ENVRI+ European project. Metadata about measurement traceability and quality will be available, DOIs will be implemented, and interoperability with other European infrastructures will be set up through standardized web services.
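
    As a small illustration of working with the NetCDF downloads mentioned above, the following Python sketch reads a file with xarray; the file name and the variable name "ozone" are hypothetical placeholders rather than the portal's actual naming scheme.

        import xarray as xr

        # Open a (hypothetical) NetCDF file obtained from the data portal.
        ds = xr.open_dataset("iagos_flight.nc")
        print(ds)                    # inventory of variables and metadata
        ozone = ds["ozone"]          # hypothetical variable name
        print(float(ozone.mean()), float(ozone.max()))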

  9. Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.

    PubMed

    Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung

    2018-03-23

    Recent developments in intelligent surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination changes and background brightness make detection a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.
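
    For readers unfamiliar with the approach, the sketch below shows a minimal CNN patch classifier in PyTorch that labels image patches as shadow or non-shadow. It is an illustrative stand-in under assumed input sizes, not the architecture from the paper.

        import torch
        import torch.nn as nn

        class ShadowNet(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(32 * 8 * 8, 2)  # shadow vs non-shadow

            def forward(self, x):        # x: (N, 3, 32, 32) image patches
                return self.classifier(self.features(x).flatten(1))

        logits = ShadowNet()(torch.randn(4, 3, 32, 32))
        print(logits.argmax(dim=1))      # predicted class per patch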

  10. Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor

    PubMed Central

    Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung

    2018-01-01

    Recent developments in intelligent surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination changes and background brightness make detection a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works. PMID:29570690

  11. Measuring organizational readiness for knowledge translation in chronic care.

    PubMed

    Gagnon, Marie-Pierre; Labarthe, Jenni; Légaré, France; Ouimet, Mathieu; Estabrooks, Carole A; Roch, Geneviève; Ghandour, El Kebir; Grimshaw, Jeremy

    2011-07-13

    Knowledge translation (KT) is an imperative in order to implement research-based and contextualized practices that can answer the numerous challenges of complex health problems. The Chronic Care Model (CCM) provides a conceptual framework to guide the implementation process in chronic care. Yet, organizations aiming to improve chronic care require an adequate level of organizational readiness (OR) for KT. Available instruments on organizational readiness for change (ORC) have shown limited validity, and are not tailored or adapted to specific phases of the knowledge-to-action (KTA) process. We aim to develop an evidence-based, comprehensive, and valid instrument to measure OR for KT in healthcare. The OR for KT instrument will be based on core concepts retrieved from existing literature and validated by a Delphi study. We will specifically test the instrument in chronic care, which is of increasing importance for the health system. Phase one: We will conduct a systematic review of the theories and instruments assessing ORC in healthcare. The retained theoretical information will be synthesized in a conceptual map. A bibliography and database of ORC instruments will be prepared after appraisal of their psychometric properties according to the standards for educational and psychological testing. An online Delphi study will be carried out among decision makers and knowledge users across Canada to assess the importance of these concepts and measures at different steps in the KTA process in chronic care. Phase two: A final OR for KT instrument will be developed and validated both in French and in English and tested in chronic disease management to measure OR for KT regarding the adoption of comprehensive, patient-centered, and system-based CCMs. This study provides a comprehensive synthesis of current knowledge on explanatory models and instruments assessing OR for KT. Moreover, this project aims to create more consensus on the theoretical underpinnings and the instrumentation of OR for KT in chronic care. The final product--a comprehensive and valid OR for KT instrument--will provide chronic care settings with an instrument to assess their readiness to implement evidence-based chronic care.

  12. Measuring organizational readiness for knowledge translation in chronic care

    PubMed Central

    2011-01-01

    Background Knowledge translation (KT) is an imperative in order to implement research-based and contextualized practices that can answer the numerous challenges of complex health problems. The Chronic Care Model (CCM) provides a conceptual framework to guide the implementation process in chronic care. Yet, organizations aiming to improve chronic care require an adequate level of organizational readiness (OR) for KT. Available instruments on organizational readiness for change (ORC) have shown limited validity, and are not tailored or adapted to specific phases of the knowledge-to-action (KTA) process. We aim to develop an evidence-based, comprehensive, and valid instrument to measure OR for KT in healthcare. The OR for KT instrument will be based on core concepts retrieved from existing literature and validated by a Delphi study. We will specifically test the instrument in chronic care, which is of increasing importance for the health system. Methods Phase one: We will conduct a systematic review of the theories and instruments assessing ORC in healthcare. The retained theoretical information will be synthesized in a conceptual map. A bibliography and database of ORC instruments will be prepared after appraisal of their psychometric properties according to the standards for educational and psychological testing. An online Delphi study will be carried out among decision makers and knowledge users across Canada to assess the importance of these concepts and measures at different steps in the KTA process in chronic care. Phase two: A final OR for KT instrument will be developed and validated both in French and in English and tested in chronic disease management to measure OR for KT regarding the adoption of comprehensive, patient-centered, and system-based CCMs. Discussion This study provides a comprehensive synthesis of current knowledge on explanatory models and instruments assessing OR for KT. Moreover, this project aims to create more consensus on the theoretical underpinnings and the instrumentation of OR for KT in chronic care. The final product--a comprehensive and valid OR for KT instrument--will provide chronic care settings with an instrument to assess their readiness to implement evidence-based chronic care. PMID:21752264

  13. Patient-reported Outcomes for Assessment of Quality of Life in Refractive Error: A Systematic Review.

    PubMed

    Kandel, Himal; Khadka, Jyoti; Goggin, Michael; Pesudovs, Konrad

    2017-12-01

    This review has identified the best existing patient-reported outcome (PRO) instruments in refractive error. The article highlights the limitations of the existing instruments and discusses the way forward. A systematic review was conducted to identify the types of PROs used in refractive error, to determine the quality of the existing PRO instruments in terms of their psychometric properties, and to determine the limitations in the content of the existing PRO instruments. Articles describing a PRO instrument measuring 1 or more domains of quality of life in people with refractive error were identified by electronic searches of the MEDLINE, PubMed, Scopus, Web of Science, and Cochrane databases. The information on content development, psychometric properties, validity, reliability, and responsiveness of those PRO instruments was extracted from the selected articles. The analysis was done based on a comprehensive set of assessment criteria. One hundred forty-eight articles describing 47 PRO instruments in refractive error were included in the review. Most of the articles (99 [66.9%]) used refractive error-specific PRO instruments. The PRO instruments comprised 19 refractive, 12 vision-related but nonrefractive, and 16 generic PRO instruments. Only 17 PRO instruments were validated in refractive error populations; six of them were developed using Rasch analysis. None of the PRO instruments has items across all domains of quality of life. The Quality of Life Impact of Refractive Correction, the Quality of Vision, and the Contact Lens Impact on Quality of Life have comparatively better quality, with some limitations, compared with the other PRO instruments. This review describes the PRO instruments and informs the choice of an appropriate measure in refractive error. We identified the need for a comprehensive and scientifically robust refractive error-specific PRO instrument. Item banking and a computer-adaptive testing system may be the way to provide such an instrument.

  14. Mashup of Geo and Space Science Data Provided via Relational Databases in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, J. S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2014-12-01

    The use of RDBMS for the storage and management of geo and space science data and/or metadata is very common. Although the information stored in tables is based on a data model and is therefore well organized and structured, a direct mashup with RDF-based data stored in triple stores is not possible. One solution to the problem consists in transforming the whole content into RDF structures and storing it in triple stores. Another interesting way is the use of a specific system/service, such as D2RQ, for access to relational database content as virtual, read-only RDF graphs. The Semantic Web-based proof-of-concept GFZ ISDC uses the triple store Virtuoso for the storage of general context information/metadata for geo and space science satellite and ground station data. Information about projects, platforms, instruments, persons, product types, etc. is available, but no detailed metadata about the data granules themselves. Such important information, e.g. the start or end time or the detailed spatial coverage of a single measurement, is stored only in RDBMS tables of the ISDC catalog system. In order to provide seamless access to all available information about the granules/data products, a mashup of the different data resources (triple store and RDBMS) is necessary. This paper describes the use of D2RQ for a Semantic Web/SPARQL-based mashup of the relational databases used for the ISDC data server, and also for access to IUGONET and/or ESPAS and further geo and space science data resources. Abbreviations: RDBMS, Relational Database Management System; RDF, Resource Description Framework; SPARQL, SPARQL Protocol and RDF Query Language; D2RQ, a platform for accessing relational databases as virtual RDF graphs; GFZ ISDC, German Research Centre for Geosciences Information System and Data Center; IUGONET, Inter-university Upper Atmosphere Global Observation Network (Japanese project); ESPAS, near-Earth space data infrastructure for e-science (European Union funded project).
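
    To make the mashup idea concrete, the sketch below queries a D2RQ-style SPARQL endpoint from Python with the SPARQLWrapper package; the endpoint URL and the Dublin Core predicates are hypothetical placeholders, not the actual ISDC vocabulary.

        from SPARQLWrapper import SPARQLWrapper, JSON

        # D2RQ exposes relational catalog tables as a virtual, read-only RDF graph
        # that can be queried with SPARQL like any native triple store.
        sparql = SPARQLWrapper("http://localhost:2020/sparql")  # hypothetical endpoint
        sparql.setQuery("""
            PREFIX dc: <http://purl.org/dc/elements/1.1/>
            SELECT ?product ?title ?date WHERE {
                ?product dc:title ?title ;
                         dc:date  ?date .
            } LIMIT 10
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["title"]["value"], row["date"]["value"])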

  15. Development of a Classification Scheme for Examining Adverse Events Associated with Medical Devices, Specifically the DaVinci Surgical System as Reported in the FDA MAUDE Database.

    PubMed

    Gupta, Priyanka; Schomburg, John; Krishna, Suprita; Adejoro, Oluwakayode; Wang, Qi; Marsh, Benjamin; Nguyen, Andrew; Genere, Juan Reyes; Self, Patrick; Lund, Erik; Konety, Badrinath R

    2017-01-01

    To examine the Manufacturer and User Facility Device Experience (MAUDE) database to capture adverse events experienced with the da Vinci Surgical System, and to design a standardized classification system to categorize the complications and machine failures associated with the device. Overall, 1,057,000 da Vinci procedures were performed in the United States between 2009 and 2012. Currently, no system exists for classifying and comparing device-related errors and complications with which to evaluate adverse events associated with the da Vinci Surgical System. The MAUDE database was queried for event reports related to the da Vinci Surgical System between the years 2009 and 2012. A classification system was developed and tested among 14 robotic surgeons to associate a level of severity with each event and its relationship to the da Vinci Surgical System. Events were then classified according to this system and examined by using chi-square analysis. Two thousand eight hundred thirty-seven events were identified, of which 34% were obstetrics and gynecology (Ob/Gyn); 19%, urology; 11%, other; and 36%, not specified. Our classification system had moderate agreement, with a Kappa score of 0.52. Using our classification system, we identified 75% of the events as mild, 18% as moderate, 4% as severe, and 3% as life threatening or resulting in death. Seventy-seven percent were classified as definitely related to the device, 15% as possibly related, and 8% as not related. Urology procedures compared with Ob/Gyn were associated with more severe events (38% vs 26%, p < 0.0001). Energy instruments were associated with less severe events compared with the surgical system (8% vs 87%, p < 0.0001). Events that were definitely associated with the device tended to be less severe (81% vs 19%, p < 0.0001). Our classification system is a valid tool with moderate inter-rater agreement that can be used to better understand device-related adverse events. The majority of robot-related events were mild but associated with the device.
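
    A minimal sketch of the chi-square comparison reported above, using scipy.stats.chi2_contingency on a severity-by-specialty contingency table; the counts are hypothetical placeholders, not the study's data.

        from scipy.stats import chi2_contingency

        #        severe  non-severe
        table = [[205, 334],   # urology (hypothetical counts)
                 [251, 714]]   # Ob/Gyn  (hypothetical counts)
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p:.4g}, dof = {dof}")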

  16. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.
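
    The statistics panel described above reduces to a handful of NumPy calls; a minimal sketch with hypothetical sample values follows.

        import numpy as np

        values = np.array([27.1, 28.4, 26.9, 29.2, 30.0, 28.7])  # hypothetical parameter samples
        print({
            "mean": values.mean(),
            "std": values.std(ddof=1),
            "median": np.median(values),
            "min": values.min(),
            "max": values.max(),
        })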

  17. Measuring spirituality and religiosity in clinical research: a systematic review of instruments available in the Portuguese language.

    PubMed

    Lucchetti, Giancarlo; Lucchetti, Alessandra Lamas Granero; Vallada, Homero

    2013-01-01

    Despite numerous spirituality and/or religiosity (S/R) measurement tools for use in research worldwide, there is little information on S/R instruments in the Portuguese language. The aim of the present study was to map out the S/R scales available for research in the Portuguese language. Systematic review of studies found in databases. A systematic review was conducted in three phases. Phases 1 and 2: articles in Portuguese, Spanish and English, published up to November 2011, dealing with the Portuguese translation and/or validation of S/R measurement tools for clinical research, were selected from six databases. Phase 3: the instruments were grouped according to authorship, cross-cultural adaptation, internal consistency, concurrent and discriminative validity and test-retest procedures. Twenty instruments were found. Forty-five percent of these evaluated religiosity, 40% spirituality, 10% religious/spiritual coping and 5% S/R. Among these, 90% had been produced in (n = 3) or translated to (n = 15) Brazilian Portuguese and two (10%) solely to European Portuguese. Nevertheless, the majority of the instruments had not undergone in-depth psychometric analysis. Only 40% of the instruments presented concurrent validity, 45% discriminative validity and 15% a test-retest procedure. The characteristics of each instrument were analyzed separately, yielding advantages, disadvantages and psychometric properties. Currently, 20 instruments for measuring S/R are available in the Portuguese language. Most have been translated (n = 15) or developed (n = 3) in Brazil and present good internal consistency. Nevertheless, few instruments have been assessed regarding all their psychometric qualities.

  18. Systematic review of systemic sclerosis-specific instruments for the EULAR Outcome Measures Library: An evolutional database model of validated patient-reported outcomes.

    PubMed

    Ingegnoli, Francesca; Carmona, Loreto; Castrejon, Isabel

    2017-04-01

    The EULAR Outcome Measures Library (OML) is a freely available database of validated patient-reported outcomes (PROs). The aim of this study was to provide a comprehensive review of validated PROs specifically developed for systemic sclerosis (SSc) to feed the EULAR OML. A sensitive search was developed in Medline and Embase to identify all validation studies, cohort studies, reviews, or meta-analyses in which the objective was the development or validation of specific PROs evaluating organ involvement, disease activity or damage in SSc. A reviewer screened titles and abstracts, selected the studies, and collected data concerning validation using ad hoc forms based on the COSMIN checklist. From 13,140 articles captured, 74 met the predefined criteria. After excluding two instruments as they were unavailable in English, the selected 23 studies provided information on seven SSc-specific PROs covering different SSc domains: burden of illness (symptom burden index), functional status (Scleroderma Assessment Questionnaire), functional ability (scleroderma Functional Score), Raynaud's phenomenon (Raynaud's condition score), mouth involvement (Mouth Handicap in SSc), gastro-intestinal involvement (University of California Los Angeles-Scleroderma Clinical Trial Consortium Gastro-Intestinal tract 2.0), and skin involvement (skin self-assessment). Each of them is partially validated and has different psychometric requirements. Seven SSc-specific PROs have a minimum validation and were included in the EULAR OML. Further development in the area of disease-specific PROs in SSc is warranted. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. NEIS (NASA Environmental Information System)

    NASA Technical Reports Server (NTRS)

    Cook, Beth

    1995-01-01

    The NASA Environmental Information System (NEIS) is a tool to support the functions of the NASA Operational Environment Team (NOET). The NEIS is designed to provide a central environmental technology resource drawing on all NASA centers' capabilities, and to support program managers who must ultimately deliver hardware compliant with performance specifications and environmental requirements. The NEIS also tracks environmental regulations, usages of materials and processes, and new technology developments. It has proven to be a useful instrument for channeling information throughout the aerospace community, NASA, other federal agencies, educational institutions, and contractors. The associated paper will discuss the dynamic databases within the NEIS, and the usefulness it provides for environmental compliance efforts.

  20. Impact of sound production by wind instruments on the temporomandibular system of male instrumentalists.

    PubMed

    Pampel, Michael; Jakstat, Holger A; Ahlers, Oliver M

    2014-01-01

    Playing a wind instrument can be either a cause of overuse or a protective factor against certain diseases. Some individuals have many findings but low morbidity, while others have few findings but high morbidity. This contradictory phenomenon should be researched. The temporomandibular system (TMS) is a functional unit which comprises the mandible, the associated muscles, and the bilateral joints with the temporal bone. The TMS is responsible for the generation of sound when wind instruments are played. Over the long term and with intensive usage, this causes changes in the musculature and in the temporomandibular joint (TMJ) of wind musicians, often resulting in temporomandibular disorders (TMD). The aim of this study is to examine evidence that TMD constitute an occupational disease in wind musicians. TMD patients and wind musicians were examined by dental clinical functional analysis. 102 male subjects were divided into three groups: "healthy" individuals, wind musicians, and patients with TMD. The dental examination was carried out based on focused inclusion of the research diagnostic criteria - TMD [1,7]. Findings were evaluated for statistical significance by first transferring the data into a digital database [2,15], then applying the t-test, the Wilcoxon test where non-Gaussian distributions appeared, and the Mann-Whitney rank sum test, using SigmaPlot Version 1.1 software (Systat Software Inc., Washington, USA). The evaluation revealed that wind instrument musicians show a high incidence of developing TMD: the researchers found almost 100% morbidity for parafunctional habits and preauricular muscle pain among adult, highly active musicians. The result is highly significant (p < 0.001) for the protrusion distance of the mandible. A higher prevalence of functional disorders of the musculoskeletal system has previously been demonstrated in wind musicians. New research results and the typical functions of various wind instruments provide evidence that playing a wind instrument generates occupational risks to the TMS.
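
    For illustration, a Mann-Whitney comparison of the kind used in the study can be reproduced in Python with scipy.stats.mannwhitneyu; the measurements below are hypothetical placeholders, not the study's data.

        from scipy.stats import mannwhitneyu

        musicians = [9.1, 8.7, 10.2, 9.8, 11.0]  # e.g. protrusion distance (mm), hypothetical
        controls = [7.0, 6.5, 7.8, 7.2, 6.9]
        stat, p = mannwhitneyu(musicians, controls, alternative="two-sided")
        print(f"U = {stat}, p = {p:.4g}")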

  1. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
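
    A minimal sketch of "learning the range indicating adequate system performance" with robust statistics: take the median of a metric's history and a scaled median absolute deviation, and flag values outside median +/- k * MAD. The factor k = 3 and the sample history are assumptions, not SIMPATIQCO's actual rule.

        import numpy as np

        def robust_range(history, k=3.0):
            """history: past values of one QC metric (e.g. median peak width)."""
            h = np.asarray(history, dtype=float)
            med = np.median(h)
            mad = 1.4826 * np.median(np.abs(h - med))  # scaled to ~1 sigma
            return med - k * mad, med + k * mad

        lo, hi = robust_range([12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 18.5])  # one outlier
        print(f"acceptable range: {lo:.2f} .. {hi:.2f}")  # the 18.5 run falls outside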

  2. SIMPATIQCO: A Server-Based Software Suite Which Facilitates Monitoring the Time Course of LC–MS Performance Metrics on Orbitrap Instruments

    PubMed Central

    2012-01-01

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC–MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge. PMID:23088386

  3. SeaWiFS technical report series. Volume 20: The SeaWiFS bio-optical archive and storage system (SeaBASS), part 1

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Mcclain, Charles R.; Firestone, James K.; Westphal, Todd L.; Yeh, Eueng-Nan; Ge, Yuntao; Firestone, Elaine R.

    1994-01-01

    This document provides an overview of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Bio-Optical Archive and Storage System (SeaBASS), which will serve as a repository for numerous data sets of interest to the SeaWiFS Science Team and other approved investigators in the oceanographic community. The data collected will be those data sets suitable for the development and evaluation of bio-optical algorithms which include results from SeaWiFS Intercalibration Round-Robin Experiments (SIRREXs), prelaunch characterization of the SeaWiFS instrument by its manufacturer -- Hughes/Santa Barbara Research Center (SBRC), Marine Optical Characterization Experiment (MOCE) cruises, Marine Optical Buoy (MOBY) deployments and refurbishments, and field studies of other scientists outside of NASA. The primary goal of the data system is to provide a simple mechanism for querying the available archive and requesting specific items, while assuring that the data is made available only to authorized users. The design, construction, and maintenance of SeaBASS is the responsibility of the SeaWiFS Calibration and Validation Team (CVT). This report is concerned with documenting the execution of this task by the CVT and consists of a series of chapters detailing the various data sets involved. The topics presented are as follows: 1) overview of the SeaBASS file architecture, 2) the bio-optical data system, 3) the historical pigment database, 4) the SIRREX database, and 5) the SBRC database.

  4. New Life for Astronomical Instruments of the Past at the Astronomical Observatory of Taras Shevchenko

    NASA Astrophysics Data System (ADS)

    Kazantseva, Liliya

    2012-09-01

    Astronomical instruments of the past are certainly valuable artifacts of the history of science and education. Like other collections of scientific equipment, they also demonstrate i) the development of scientific and technical ideas, ii) the technological features of the historical period, iii) the professional features of the artisans or companies that manufactured them, and iv) the national and local specificity of production. However, astronomical instruments are also devices made for observations of rare phenomena -- solar eclipses, transits of planets across the solar disk, etc. The instruments used to study these rare events were very different for each event, since the science changed quickly between events. The Astronomical Observatory of Kyiv National Taras Shevchenko University has a collection of instruments made by leading European and local workshops from the early nineteenth century onwards. These include tools for optical observation of the first artificial Earth satellites, photography, chronometry, and meteorology. In addition, it has assembled a library of descriptions of astronomical instruments and makers' price-lists. Of particular interest are the large stationary instruments that are still active in their pavilions. Almost every instrument has a long and interesting history. Museification of astronomical instruments gives them a second life, expanding educational programs and tracing the development of astronomy in general and of a scientific institution and region in particular. It would be advisable to first create a regional database of these rare astronomical instruments (which is already being done in Ukraine), then a common global database. By combining all the historical information about astronomical instruments with the advantages of the Internet, one can show the full evolution of an astronomical instrument with all its features. Time is relentless, and much has been destroyed, badly kept, or thrown away. We need to protect and document these instruments, and to tell their story, in time.

  5. Ridge 2000 Data Management System

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Carbotte, S. M.; Arko, R. A.; Haxby, W. F.; Ryan, W. B.; Chayes, D. N.; Lehnert, K. A.; Shank, T. M.

    2005-12-01

    Hosted at Lamont by the marine geoscience Data Management group, mgDMS, the NSF-funded Ridge 2000 electronic database, http://www.marine-geo.org/ridge2000/, is a key component of the Ridge 2000 multi-disciplinary program. The database covers each of the three Ridge 2000 Integrated Study Sites: Endeavour Segment, Lau Basin, and 8-11N Segment. It promotes the sharing of information to the broader community, facilitates integration of the suite of information collected at each study site, and enables comparisons between sites. The Ridge 2000 data system provides easy web access to a relational database that is built around a catalogue of cruise metadata. Any web browser can be used to perform a versatile text-based search which returns basic cruise and submersible dive information, sample and data inventories, navigation, and other relevant metadata such as shipboard personnel and links to NSF program awards. In addition, non-proprietary data files, images, and derived products which are hosted locally or in national repositories, as well as science and technical reports, can be freely downloaded. On the Ridge 2000 database page, our Data Link allows users to search the database using a broad range of parameters including data type, cruise ID, chief scientist, and geographical location. The first Ridge 2000 field programs sailed in 2004 and, in addition to numerous data sets collected prior to the Ridge 2000 program, the database currently contains information on fifteen Ridge 2000-funded cruises and almost sixty Alvin dives. Track lines can be viewed using a recently implemented Web Map Service button labelled Map View. The Ridge 2000 database is fully integrated with databases hosted by the mgDMS group for MARGINS and the Antarctic multibeam and seismic reflection data initiatives. Links are provided to partner databases including PetDB, SIOExplorer, and the ODP Janus system. Interoperability with existing and new partner repositories continues to be strengthened. One major effort involves the gradual unification of the metadata across these partner databases. Standardised electronic metadata forms that can be filled in at sea are available from our web site. Interactive map-based exploration and visualisation of the Ridge 2000 database is provided by GeoMapApp, a freely-available Java(tm) application being developed within the mgDMS group. GeoMapApp includes high-resolution bathymetric grids for the 8-11N EPR segment and allows customised maps and grids for any of the Ridge 2000 ISS to be created. Vent and instrument locations can be plotted and saved as images, and Alvin dive photos are also available.

  6. Using SIR (Scientific Information Retrieval System) for data management during a field program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, J.L.

    As part of the US Department of Energy's program, PRocessing of Emissions by Clouds and Precipitation (PRECP), a team of scientists from four laboratories conducted a study in north central New York State, to characterize the chemical and physical processes occurring in winter storms. Sampling took place from three aircraft, two instrumented motor homes and a network of 26 surface precipitation sampling sites. Data management personnel were part of the field program, using a portable IBM PC-AT computer to enter information as it became available during the field study. Having the same database software on the field computer and on the cluster of VAX 11/785 computers in use aided database development and the transfer of data between machines. 2 refs., 3 figs., 5 tabs.

  7. DFACS - DATABASE, FORMS AND APPLICATIONS FOR CABLING AND SYSTEMS, VERSION 3.30

    NASA Technical Reports Server (NTRS)

    Billitti, J. W.

    1994-01-01

    DFACS is an interactive multi-user computer-aided engineering tool for system level electrical integration and cabling engineering. The purpose of the program is to provide the engineering community with a centralized database for entering and accessing system functional definitions, subsystem and instrument-end circuit pinout details, and harnessing data. The primary objective is to provide an instantaneous single point of information interchange, thus avoiding error-prone, time-consuming, and costly multiple-path data shuttling. The DFACS program, which is centered around a single database, has built-in menus that provide easy data input and access for all involved system, subsystem, and cabling personnel. The DFACS program allows parallel design of circuit data sheets and harness drawings. It also recombines raw information to automatically generate various project documents and drawings including the Circuit Data Sheet Index, the Electrical Interface Circuits List, Assembly and Equipment Lists, Electrical Ground Tree, Connector List, Cable Tree, Cabling Electrical Interface and Harness Drawings, Circuit Data Sheets, and ECR List of Affected Interfaces/Assemblies. Real time automatic production of harness drawings and circuit data sheets from the same data reservoir ensures instant system and cabling engineering design harmony. DFACS also contains automatic wire routing procedures and extensive error checking routines designed to minimize the possibility of engineering error. DFACS is designed to run on DEC VAX series computers under VMS using Version 6.3/01 of INGRES QUEL/OSL, a relational database system which is available through Relational Technology, Inc. The program is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard media) or a TK50 tape cartridge. DFACS was developed in 1987 and last updated in 1990. DFACS is a copyrighted work with all copyright vested in NASA. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. INGRES QUEL/OSL is a trademark of Relational Technology, Inc.

  8. Bridging international law and rights-based litigation: mapping health-related rights through the development of the Global Health and Human Rights Database.

    PubMed

    Meier, Benjamin Mason; Cabrera, Oscar A; Ayala, Ana; Gostin, Lawrence O

    2012-06-15

    The O'Neill Institute for National and Global Health Law at Georgetown University, the World Health Organization, and the Lawyers Collective have come together to develop a searchable Global Health and Human Rights Database that maps the intersection of health and human rights in judgments, international and regional instruments, and national constitutions. Where states long remained unaccountable for violations of health-related human rights, litigation has arisen as a central mechanism in an expanding movement to create rights-based accountability. Facilitated by the incorporation of international human rights standards in national law, this judicial enforcement has supported the implementation of rights-based claims, giving meaning to states' longstanding obligations to realize the highest attainable standard of health. Yet despite these advancements, there has been insufficient awareness of the international and domestic legal instruments enshrining health-related rights and little understanding of the scope and content of litigation upholding these rights. As this accountability movement evolves, the Global Health and Human Rights Database seeks to chart this burgeoning landscape of international instruments, national constitutions, and judgments for health-related rights. Employing international legal research to document and catalogue these three interconnected aspects of human rights for the public's health, the Database's categorization by human rights, health topics, and regional scope provides a comprehensive means of understanding health and human rights law. Through these categorizations, the Global Health and Human Rights Database serves as a basis for analogous legal reasoning across states to serve as precedents for future cases, for comparative legal analysis of similar health claims in different country contexts, and for empirical research to clarify the impact of human rights judgments on public health outcomes. Copyright © 2012 Meier, Nygren-Krug, Cabrera, Ayala, and Gostin.

  9. Building a QC Database of Meteorological Data from NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, J. C.; Barbre, R. E.; Decker, R. K.; Orcutt, J. M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) provides atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures currently exist across all databases, resulting in QC databases with inconsistencies in variables, development methodologies, and periods of record. The goal of this activity is to use the previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.

  10. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) Launch Complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC. The QC process utilized in this study has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor. This check includes removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements. The selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, the QC results, and the attributes of the LPS towers meteorological database.
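
    Three of the checks described above are easy to sketch in Python: a physical-range check, a temporal-consistency (step) check, and a frozen-sensor check. The thresholds are hypothetical placeholders, not the EV44 values.

        import numpy as np

        def qc_flags(temps_c, lo=-20.0, hi=50.0, max_step=5.0, frozen_run=30):
            """Return a boolean mask marking suspect one-minute temperature samples."""
            t = np.asarray(temps_c, dtype=float)
            bad = (t < lo) | (t > hi)                            # unrealistic values
            bad |= np.abs(np.diff(t, prepend=t[0])) > max_step   # temporal consistency
            run = 1                                              # frozen sensor: constant output
            for i in range(1, len(t)):
                run = run + 1 if t[i] == t[i - 1] else 1
                if run >= frozen_run:
                    bad[i - frozen_run + 1 : i + 1] = True
            return bad

        print(qc_flags([21.0, 21.1, 35.0, 21.2, 21.2]))  # flags the 35.0 spike (and the step back)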

  11. Infections and exposures: reported incidents associated with unsuccessful decontamination of reusable surgical instruments.

    PubMed

    Southworth, P M

    2014-11-01

    Reusable surgical instruments provide a potential route for the transmission of pathogenic agents between patients in healthcare facilities. As such, the decontamination process between uses is a vital component in the prevention of healthcare-associated infections. This article reviews reported outbreaks and incidents associated with inappropriate, inadequate, or unsuccessful decontamination of surgical instruments, indicating potential pitfalls of decontamination practices worldwide. To the author's knowledge, this is the first review of surgical instrument decontamination failures. Databases of medical literature, Medline and Embase, were searched systematically. Articles detailing incidents associated with unsuccessful decontamination of surgical instruments were identified. Twenty-one articles were identified reporting incidents associated with failures in decontamination. A large proportion of incidents involved the attempted disinfection, rather than sterilization, of surgical instruments (43% of articles), counter to a number of national guidelines. Instruments used in eye surgery were most frequently reported to be associated with decontamination failures (29% of articles). Of the few articles detailing potential or confirmed pathogenic transmission, Pseudomonas aeruginosa and Mycobacterium spp. were most represented. One incident of possible variant Creutzfeldt-Jakob disease transmission was also identified. Limitations of analysing only published incidents mean that the likelihood of under-reporting (including reluctance to publish failure) must be considered. Despite these limitations, the small number of articles identified suggests a relatively low risk of cross-infection through reusable surgical instruments when cleaning/sterilization procedures are adhered to. The diverse nature of reported incidents also suggests that failures are not systemic. Copyright © 2014 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  12. Challenges in converting an interviewer-administered food probe database to self-administration in the National Cancer Institute Automated Self-administered 24-Hour Recall (ASA24).

    PubMed

    Zimmerman, Thea Palmer; Hull, Stephen G; McNutt, Suzanne; Mittl, Beth; Islam, Noemi; Guenther, Patricia M; Thompson, Frances E; Potischman, Nancy A; Subar, Amy F

    2009-12-01

    The National Cancer Institute (NCI) is developing an automated, self-administered 24-hour dietary recall (ASA24) application to collect and code dietary intake data. The goal of the ASA24 development is to create a web-based dietary interview based on the US Department of Agriculture (USDA) Automated Multiple Pass Method (AMPM) instrument currently used in the National Health and Nutrition Examination Survey (NHANES). The ASA24 food list, detail probes, and portion probes were drawn from the AMPM instrument; portion-size pictures from Baylor College of Medicine's Food Intake Recording Software System (FIRSSt) were added; and the food code/portion code assignments were linked to the USDA Food and Nutrient Database for Dietary Studies (FNDDS). The requirements that the interview be self-administered and fully auto-coded presented several challenges as the AMPM probes and responses were linked with the FNDDS food codes and portion pictures. This linking was accomplished through a "food pathway," or the sequence of steps that leads from a respondent's initial food selection, through the AMPM probes and portion pictures, to the point at which a food code and gram weight portion size are assigned. The ASA24 interview database that accomplishes this contains more than 1,100 food probes and more than 2 million food pathways and will include about 10,000 pictures of individual foods depicting up to 8 portion sizes per food. The ASA24 will make the administration of multiple days of recalls in large-scale studies economical and feasible.
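
    A toy illustration of the "food pathway" idea described above: the initial food selection plus the respondent's probe and portion answers resolve to a single food code and gram weight. The codes, probe strings, and weights below are hypothetical placeholders, not FNDDS content.

        PATHWAYS = {
            ("milk", "fat=2%", "1 cup"): ("11112110", 244.0),     # hypothetical
            ("milk", "fat=whole", "1 cup"): ("11111000", 244.0),  # hypothetical
        }

        def resolve(food, detail_answer, portion_answer):
            """Map one completed pathway to (food code, gram weight)."""
            return PATHWAYS[(food, detail_answer, portion_answer)]

        print(resolve("milk", "fat=2%", "1 cup"))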

  13. “Retention Projection” Enables Reliable Use of Shared Gas Chromatographic Retention Data Across Labs, Instruments, and Methods

    PubMed Central

    Barnes, Brian B.; Wilson, Michael B.; Carr, Peter W.; Vitha, Mark F.; Broeckling, Corey D.; Heuberger, Adam L.; Prenni, Jessica; Janis, Gregory C.; Corcoran, Henry; Snow, Nicholas H.; Chopra, Shilpi; Dhandapani, Ramkumar; Tawfall, Amanda; Sumner, Lloyd W.; Boswell, Paul G.

    2014-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a primary tool used to identify compounds in complex samples. Both mass spectra and GC retention times are matched to those of standards, but it is often impractical to have standards on hand for every compound of interest, so we must rely on shared databases of MS data and GC retention information. Unfortunately, retention databases (e.g. linear retention index libraries) are experimentally restrictive, notoriously unreliable, and strongly instrument dependent, relegating GC retention information to a minor, often negligible role in compound identification despite its potential power. A new methodology called “retention projection” has great potential to overcome the limitations of shared chromatographic databases. In this work, we tested the reliability of the methodology in five independent laboratories. We found that even when each lab ran nominally the same method, the methodology was 3-fold more accurate than retention indexing because it properly accounted for unintentional differences between the GC-MS systems. When the labs used different methods of their own choosing, retention projections were 4- to 165-fold more accurate. More importantly, the distribution of error in the retention projections was predictable across different methods and labs, thus enabling automatic calculation of retention time tolerance windows. Tolerance windows at 99% confidence were generally narrower than those widely used even when physical standards are on hand to measure their retention. With its high accuracy and reliability, the new retention projection methodology makes GC retention a reliable, precise tool for compound identification, even when standards are not available to the user. PMID:24205931
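
    The automatic tolerance windows can be illustrated with a percentile computation: given an empirical distribution of projection errors, a central 99% interval of those errors yields the window around a projected retention time. The error distribution and projected time below are simulated placeholders.

        import numpy as np

        errors_min = np.random.default_rng(0).normal(0.0, 0.02, 500)  # simulated projection errors
        lo, hi = np.percentile(errors_min, [0.5, 99.5])               # central 99% interval

        t_projected = 12.34  # hypothetical projected retention time (min)
        print(f"accept peaks between {t_projected + lo:.3f} and {t_projected + hi:.3f} min")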

  14. Nuclear Energy Infrastructure Database Fitness and Suitability Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidrich, Brenden

    In 2014, the Deputy Assistant Secretary for Science and Technology Innovation (NE-4) initiated the Nuclear Energy-Infrastructure Management Project by tasking the Nuclear Science User Facilities (NSUF) to create a searchable and interactive database of all pertinent NE-supported or related infrastructure. This database will be used for analyses to establish needs, redundancies, efficiencies, distributions, etc., in order to best understand the utility of NE's infrastructure and inform the content of the infrastructure calls. The NSUF developed the database using data and policy direction from a wide variety of reports from the Department of Energy, the National Research Council, the International Atomic Energy Agency, and various other federal and civilian resources. The NEID contains data on 802 R&D instruments housed in 377 facilities at 84 institutions in the US and abroad. A Database Review Panel (DRP) was formed to review and provide advice on the development, implementation, and utilization of the NEID. The panel comprises five members with expertise in nuclear energy-associated research, representing the major constituencies of that research community: academia, industry, research reactors, national laboratories, and Department of Energy program management. The panel concludes that the NSUF has succeeded in creating a capability and infrastructure database that identifies and documents the major nuclear energy research and development capabilities across the DOE complex. The effort to maintain and expand the database will be ongoing. Detailed information on many facilities must still be gathered from the associated institutions and added to complete the database. The data must be validated and kept current to capture facility and instrumentation status as well as to cover new acquisitions and retirements.

  15. WinTICS-24 --- A Telescope Control Interface for MS Windows

    NASA Astrophysics Data System (ADS)

    Hawkins, R. Lee

    1995-12-01

    WinTICS-24 is a telescope control system interface and observing assistant written in Visual Basic for MS Windows. It provides the ability to control a telescope and up to 3 other instruments via the serial ports on an IBM-PC compatible computer, all from one consistent user interface. In addition to telescope control, WinTICS contains an observing logbook, trouble log (which can automatically email its entries to a responsible person), lunar phase display, object database (which allows the observer to type in the name of an object and automatically slew to it), a time of minimum calculator for eclipsing binary stars, and an interface to the Guide CD-ROM for bringing up finder charts of the current telescope coordinates. Currently WinTICS supports control of DFM telescopes, but is easily adaptable to other telescopes and instrumentation.

  16. The development of a web-based assessment system to identify students’ misconception automatically on linear kinematics with a four-tier instrument test

    NASA Astrophysics Data System (ADS)

    Pujayanto, Pujayanto; Budiharti, Rini; Adhitama, Egy; Nuraini, Niken Rizky Amalia; Vernanda Putri, Hanung

    2018-07-01

    This research proposes the development of a web-based assessment system to identify students' misconceptions. The system, named WAS (web-based assessment system), can identify a student's misconception profile on linear kinematics automatically after the student has finished the test. The test instrument was developed and validated; items were constructed and arranged from the results of a focus group discussion (FGD) related to previous research. Fifty-eight students (37 female, 21 male) were used as samples. They came from different classes, with 18 students from the gifted class and another 40 students from the normal class. WAS was designed specifically to support the teacher as an efficient replacement for a paper-based test system. In addition, WAS offers flexible timing functionality, a stand-alone subject module, robustness, and scalability. The entire WAS program and interface was developed with open-source technologies such as the XAMPP server, MySQL database, JavaScript, and PHP. It provides results immediately and supports diagrammatic questions as well as scientific symbols. It is feasible to apply the system to many students at once, so it could be integrated into many schools as part of physics courses.
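
    A four-tier diagnostic item pairs an answer with a confidence rating and a reasoning choice with its own confidence rating, so the automatic classification WAS performs can be expressed as a simple decision rule. The sketch below uses one common scheme from the four-tier test literature as an assumption; it is not quoted from the paper itself.

    ```python
    # One common four-tier scoring rule (assumed, not taken from the paper):
    # a student holds a misconception when both the answer and the reasoning
    # are wrong but both are held with high confidence.
    def classify_four_tier(answer_ok: bool, conf_answer: bool,
                           reason_ok: bool, conf_reason: bool) -> str:
        confident = conf_answer and conf_reason
        if answer_ok and reason_ok:
            return "scientific knowledge" if confident else "lucky guess / low confidence"
        if not answer_ok and not reason_ok and confident:
            return "misconception"
        return "lack of knowledge"

    # Example: wrong answer, wrong reasoning, both confident -> misconception.
    print(classify_four_tier(False, True, False, True))
    ```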

  17. Foodborne Norovirus State of Affairs in the EU Rapid Alert System for Food and Feed

    PubMed Central

    Papapanagiotou, Elias P.

    2017-01-01

    The European Union Rapid Alert System for Food and Feed (EU RASFF) database is an invaluable instrument for analyzing notifications involving norovirus in food. The aim of this work was to carry out a thorough review of the alert and border rejection notifications submitted to the RASFF database from its inception until 31 August 2017. Some conclusions of interest were: (i) Denmark, France, Italy, the Netherlands, and Norway contributed the majority of alert notifications as notifying countries, (ii) France and Serbia were cited most often in alert notifications as countries of origin, (iii) Italy and Spain submitted the majority of border rejection notifications, (iv) the Third Countries most frequently implicated in border rejection notifications were Vietnam and Tunisia for norovirus in bivalve molluscs, and China and Serbia for norovirus in fruits and vegetables, (v) “risk dispersion” from norovirus-contaminated food was narrow since, in just over half of all alert notifications and in all of the border rejection notifications, no more than three countries were involved, and (vi) both raw (oysters and berries) and cooked (mussels) food products can present a health risk to consumers. The information retrieved from the RASFF database on norovirus-contaminated food could prove helpful in planning future norovirus risk analysis endeavors. PMID:29186840

  18. Spiders and Camels and Sybase! Oh, My!

    NASA Astrophysics Data System (ADS)

    Barg, Irene; Ferro, Anthony J.; Stobie, Elizabeth

    The Hubble Space Telescope NICMOS Guaranteed Time Observers (GTOs) requested a means of sharing point spread function (PSF) observations. Because of the specifics of the instrument, these PSFs are very useful in the analysis of observations and can vary with the conditions on the telescope. The GTOs are geographically diverse, so a centralized processing solution would not work. The individual PSF observations were reduced by different people, at different institutions, using different reduction software. These varied observations had to be combined into a single database and linked to other information as well. The NICMOS software group at the University of Arizona developed a solution based on a World Wide Web (WWW) interface, using Perl/CGI forms to query the submitter about the PSF data to be entered. After some semi-automated sanity checks, using the FTOOLS package, the metadata are then entered into a Sybase relational database system. A user of the system can then query the database, again through a WWW interface, to locate and retrieve PSFs which may match their observations, as well as determine other information regarding the telescope conditions at the time of the observations (e.g., the breathing parameter). This presentation discusses some of the driving forces in the design, problems encountered, and the choices made. The tools used, including Sybase, Perl, FTOOLS, and WWW elements are also discussed.

  19. Application of SQL database to the control system of MOIRCS

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Tomohiro; Omata, Koji; Konishi, Masahiro; Ichikawa, Takashi; Suzuki, Ryuji; Tokoku, Chihiro; Uchimoto, Yuka Katsuno; Nishimura, Tetsuo

    2006-06-01

    MOIRCS (Multi-Object Infrared Camera and Spectrograph) is a new instrument for the Subaru telescope. In order to perform near-infrared imaging and spectroscopy with cold slit masks, MOIRCS contains many device components, which are distributed on an Ethernet LAN. Two PCs wired to the focal plane array electronics operate the two HAWAII2 detectors, and two further PCs are used for integrated control and quick data reduction, respectively. Although most of the devices (e.g., the filter and grism turrets and the slit exchange mechanism for spectroscopy) are controlled via RS232C interfaces, they are accessible over TCP/IP using TCP/IP-to-RS232C converters, and the remaining devices are connected to the Ethernet LAN directly. This network-distributed structure provides flexibility in the hardware configuration. We have constructed an integrated control system for this network-distributed hardware, named T-LECS (Tohoku University - Layered Electronic Control System). T-LECS also has a network-distributed software design, applying TCP/IP socket communication to interprocess communication. To mediate the communication between the device interfaces and the user interfaces, we defined three layers in T-LECS: an external layer for user interface applications, an internal layer for device interface applications, and a communication layer that connects the two layers above. In the communication layer, we store the system's data in an SQL database server: status data, FITS header data, and also metadata such as device configuration data and FITS configuration data. We present our software system design and the database schema used to manage observations of MOIRCS with Subaru.

  20. Electrosurgical injuries during robot assisted surgery: insights from the FDA MAUDE database

    NASA Astrophysics Data System (ADS)

    Fuller, Andrew; Vilos, George A.; Pautler, Stephen E.

    2012-02-01

    Introduction: The da Vinci surgical system requires the use of electrosurgical instruments. The re-use of such instruments creates the potential for stray electrical currents from capacitive coupling and/or insulation failure, with subsequent injury. The morbidity of such injuries may negate many of the benefits of minimally invasive surgery. We sought to evaluate the rate and nature of electrosurgical injury (ESI) associated with this device. Methods: The Manufacturer and User Facility Device Experience (MAUDE) database is administered by the US Food and Drug Administration (FDA) and reports adverse events related to medical devices in the United States. We analyzed all incidents reported in the context of robotic surgery between January 2001 and June 2011 to identify those related to the use of electrosurgery. Results: In the past decade, a total of 605 reports were submitted to the FDA regarding adverse events related to the da Vinci robotic surgical platform. Of these, 24 (3.9%) were related to potential or actual ESI. Nine of the 24 cases (37.5%) resulted in additional surgical intervention for repair. There were 6 bowel injuries, of which only one was recognized and managed intra-operatively; the remainder required laparotomy between 5 and 8 days after the initial robotic procedure. Additionally, there were 3 skin burns. The remaining cases required conservative management or resulted in no harm. Conclusion: ESI in the context of robotic surgery is uncommon but remains under-recognized and under-reported. Surgeons performing robot-assisted surgery should be aware that ESI can occur with robotic instruments, and vigilance for intra- and post-operative complications is paramount.

  1. Building strategies for tsunami scenarios databases to be used in a tsunami early warning decision support system: an application to western Iberia

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.

    2012-04-01

    One of the most challenging goals that the geo-scientific community has faced since the catastrophic tsunami of December 2004 in the Indian Ocean is to develop the so-called "next generation" Tsunami Early Warning Systems (TEWS). The meaning of "next generation" does not refer to the aim of a TEWS, which obviously remains to detect whether a tsunami has been generated by a given source and, if so, to send proper warnings and/or alerts in a suitable time to all the countries and communities that may be affected. Instead, "next generation" refers to the development of a Decision Support System (DSS) that, in general terms, relies on 1) an integrated set of seismic, geodetic and marine sensors whose objective is to detect and characterize the possible tsunamigenic sources and to monitor instrumentally the time and space evolution of the generated tsunami, 2) databases of pre-computed numerical tsunami scenarios, to be suitably combined based on the information coming from the sensor environment and used to forecast the degree of exposure of different coastal places in both the near- and the far-field, and 3) a proper overall (software) system architecture. The EU-FP7 TRIDEC project aims at developing such a DSS and has selected two test areas in the Euro-Mediterranean region, namely the western Iberian margin and the eastern Mediterranean (Turkish coasts). In this study, we discuss the strategies being adopted in TRIDEC to build the databases of pre-computed tsunami scenarios and we show some applications to the western Iberian margin. In particular, two different databases are being populated, called the "Virtual Scenario Database" (VSDB) and the "Matching Scenario Database" (MSDB). The VSDB contains detailed simulations of a few selected earthquake-generated tsunamis. The cases provided in the VSDB are computed "real events"; in other words, they represent the unknowns that the TRIDEC platform must be able to recognize and match during the early crisis management phase. The MSDB contains a very large number (on the order of thousands) of tsunami simulations computed from many different simple earthquake sources of different magnitudes located in the "vicinity" of the virtual scenario earthquake. Examples from both databases are presented.
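
    The matching step amounts to selecting, from the Matching Scenario Database, the pre-computed scenarios whose source parameters lie closest to the earthquake parameters estimated by the sensor network. A hedged toy version of that nearest-scenario selection follows; the field names and distance weighting are illustrative assumptions, not the TRIDEC algorithm.

    ```python
    # Toy nearest-scenario matching against a pre-computed scenario database.
    import math

    msdb = [
        {"id": "SC-001", "lat": 36.5, "lon": -9.8, "mag": 8.1},
        {"id": "SC-002", "lat": 36.9, "lon": -10.4, "mag": 8.3},
        {"id": "SC-003", "lat": 35.8, "lon": -9.2, "mag": 7.6},
    ]

    def match_scenarios(lat, lon, mag, k=2, mag_weight=100.0):
        """Rank stored scenarios by a weighted distance in (location, magnitude)."""
        def dist(sc):
            return (math.hypot(sc["lat"] - lat, sc["lon"] - lon) ** 2
                    + mag_weight * (sc["mag"] - mag) ** 2)
        return sorted(msdb, key=dist)[:k]

    # Estimated source parameters -> best-matching pre-computed scenarios.
    print(match_scenarios(lat=36.6, lon=-10.0, mag=8.2))
    ```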

  2. Immersive training and mentoring for laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Nistor, Vasile; Allen, Brian; Dutson, E.; Faloutsos, P.; Carman, G. P.

    2007-04-01

    We describe in this paper a training system for minimally invasive surgery (MIS) that creates an immersive training simulation by recording the pathways of the instruments of an expert surgeon while performing an actual training task. Instrument spatial pathway data are stored and later accessed at the training station in order to visualize the ergonomic experience of the expert surgeon and the trainees. Our system is based on tracking the spatial position and orientation of the instruments on the console for both the expert surgeon and the trainee. The technology is the result of recent developments in miniaturized position sensors that can be integrated seamlessly into MIS instruments without compromising functionality. In order to continuously monitor the positions of the laparoscopic tool tips, DC magnetic tracking sensors are used. A hardware-software interface transforms the coordinate data points into instrument pathways, while an intuitive graphical user interface displays the instruments' spatial position and orientation for the mentor/trainee, together with endoscopic video information. These data are recorded and saved in a database for subsequent immersive training and training performance analysis. We use two 6-DOF DC magnetic trackers with a sensor diameter of just 1.3 mm - small enough for insertion into 4 French catheters - embedded in the shafts of an endoscopic grasper and a needle driver. One sensor is located at the distal end of the shaft while the second is located at the proximal end. The placement of these sensors does not impede the functionality of the instrument, and since the sensors are located inside the shaft there are no sealing issues between the valve of the trocar and the instrument. We devised a peg transfer training task in accordance with validated training procedures, and tested our system on its ability to differentiate between the expert surgeon and the novices, based on a set of performance metrics. These performance metrics - motion smoothness, total path length, and time to completion - are derived from the kinematics of the instrument. An affine combination of the above-mentioned metrics provides an overall score for training performance. Clear differentiation between the expert surgeons and the novice trainees is visible in the test results. Strictly kinematics-based performance metrics can be used to evaluate the training progress of MIS trainees in the context of UCLA - LTS.
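
    The three metrics named above derive directly from the tracked tip positions, and the overall score is an affine combination of them. The sketch below uses mean squared jerk as the smoothness proxy and invented weights; the paper's exact definitions and coefficients are not reproduced here.

    ```python
    import numpy as np

    def path_length(xyz: np.ndarray) -> float:
        """Total distance travelled by the instrument tip; xyz is (N, 3)."""
        return float(np.linalg.norm(np.diff(xyz, axis=0), axis=1).sum())

    def smoothness(xyz: np.ndarray, dt: float) -> float:
        """Mean squared jerk magnitude, a common motion-smoothness proxy
        (lower is smoother); the paper's exact definition may differ."""
        jerk = np.diff(xyz, n=3, axis=0) / dt**3
        return float((np.linalg.norm(jerk, axis=1) ** 2).mean())

    def score(xyz: np.ndarray, dt: float, w=(0.4, 0.3, 0.3), bias=0.0) -> float:
        """Affine combination of the three kinematic metrics; weights assumed."""
        t_total = dt * (len(xyz) - 1)
        metrics = np.array([path_length(xyz), smoothness(xyz, dt), t_total])
        return float(np.dot(w, metrics) + bias)

    # 100 samples of a synthetic tip trajectory at 50 Hz.
    rng = np.random.default_rng(0)
    traj = np.cumsum(rng.normal(scale=0.001, size=(100, 3)), axis=0)
    print(score(traj, dt=0.02))
    ```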

  3. Advanced Power and Propulsion: 2000-2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This custom bibliography from the NASA Scientific and Technical Information Program lists a sampling of records found in the NASA Aeronautics and Space Database. The scope of this topic includes primarily nuclear thermal and nuclear electric technologies, to enable spacecraft and instrument operation and communications, particularly in the outer solar system, where sunlight can no longer be exploited by solar panels. This area of focus is one of the enabling technologies as defined by NASA's Report of the President's Commission on Implementation of United States Space Exploration Policy, published in June 2004.

  4. Advanced life support study

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summary reports on each of the eight tasks undertaken by this contract are given. Discussed here is an evaluation of a Closed Ecological Life Support System (CELSS), including modeling and analysis of Physical/Chemical Closed Loop Life Support (P/C CLLS); the Environmental Control and Life Support Systems (ECLSS) evolution - Intermodule Ventilation study; advanced technologies interface requirements relative to ECLSS; an ECLSS resupply analysis; the ECLSS module addition relocation systems engineering analysis; an ECLSS cost/benefit analysis to identify rack-level interface requirements of the alternate technologies evaluated in the ventilation study, with a comparison of these with the rack level interface requirements for the baseline technologies; advanced instrumentation - technology database enhancement; and a clean room survey and assessment of various ECLSS evaluation options for different growth scenarios.

  5. Performance model for grid-connected photovoltaic inverters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyson, William Earl; Galbraith, Gary M.; King, David L.

    2007-09-01

    This document provides an empirically based performance model for grid-connected photovoltaic inverters, used for system performance (energy) modeling and for continuous monitoring of inverter performance during system operation. The versatility and accuracy of the model were validated for a variety of both residential- and commercial-size inverters. Default parameters for the model can be obtained from manufacturers' specification sheets, and the accuracy of the model can be further refined using either well-instrumented field measurements from operational systems or detailed measurements from a recognized testing laboratory. An initial database of inverter performance parameters was developed based on measurements conducted at Sandia National Laboratories and at laboratories supporting the solar programs of the California Energy Commission.
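
    As a generic illustration of the empirical approach (not the actual Sandia model equations, which have their own functional form), AC output can be fit as a low-order polynomial function of DC input power, with coefficients taken from laboratory or field measurements:

    ```python
    # Generic empirical inverter model sketch: fit AC power as a quadratic
    # function of DC power from measured operating points. Data are invented.
    import numpy as np

    pdc = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])       # measured DC input (kW)
    pac = np.array([0.45, 0.93, 1.88, 2.82, 3.74, 4.64])  # measured AC output (kW)

    coeffs = np.polyfit(pdc, pac, deg=2)  # empirical model coefficients
    model = np.poly1d(coeffs)

    print(f"predicted AC output at 2.5 kW DC: {model(2.5):.2f} kW")
    print(f"conversion efficiency at 2.5 kW DC: {model(2.5) / 2.5:.1%}")
    ```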

  6. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009), I. Data collection from early instrumental seismological bulletins

    NASA Astrophysics Data System (ADS)

    Di Giacomo, Domenico; Harris, James; Villaseñor, Antonio; Storchak, Dmitry A.; Engdahl, E. Robert; Lee, William H. K.

    2015-02-01

    In order to produce a new global reference earthquake catalogue based on instrumental data covering the last 100+ years of global earthquakes, we collected, digitized and processed an unprecedented amount of printed early instrumental seismological bulletins with fundamental parametric data for relocating and reassessing the magnitude of earthquakes that occurred in the period between 1904 and 1970. This effort was necessary in order to produce an earthquake catalogue with locations and magnitudes as homogeneous as possible. The parametric data obtained and processed during this work fills a large gap in electronic bulletin data availability. This new dataset complements the data publicly available in the International Seismological Centre (ISC) Bulletin starting in 1964. With respect to the amplitude-period data necessary to re-compute magnitude, we searched through the global collection of printed bulletins stored at the ISC and entered relevant station parametric data into the database. As a result, over 110,000 surface and body-wave amplitude-period pairs for re-computing standard magnitudes MS and mb were added to the ISC database. To facilitate earthquake relocation, different sources have been used to retrieve body-wave arrival times. These were entered into the database using optical character recognition methods (International Seismological Summary, 1918-1959) or manually (e.g., British Association for the Advancement of Science, 1913-1917). In total, ∼1,000,000 phase arrival times were added to the ISC database for large earthquakes that occurred in the time interval 1904-1970. The selection of earthquakes for which data was added depends on time period and magnitude: for the early years of last century (until 1917) only very large earthquakes were selected for processing (M ⩾ 7.5), whereas in the periods 1918-1959 and 1960-2009 the magnitude thresholds are 6.25 and 5.5, respectively. Such a selection was mainly dictated by limitations in time and funding. Although the newly available parametric data is only a subset of the station data available in the printed bulletins, its electronic availability will be important for any future study of earthquakes that occurred during the early instrumental period.
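
    The surface-wave magnitudes re-computed from such amplitude-period pairs follow the classical Prague formula, MS = log10(A/T) + 1.66 log10(Δ) + 3.3, with A the ground amplitude in micrometers, T the period in seconds, and Δ the epicentral distance in degrees. A direct transcription:

    ```python
    import math

    def surface_wave_magnitude(amplitude_um: float, period_s: float,
                               distance_deg: float) -> float:
        """Classical MS (Prague) formula: log10(A/T) + 1.66*log10(delta) + 3.3,
        valid roughly for 20 <= delta <= 160 degrees and periods near 20 s."""
        return (math.log10(amplitude_um / period_s)
                + 1.66 * math.log10(distance_deg) + 3.3)

    # Example: 12 micrometers at a 20 s period, recorded 60 degrees away.
    print(f"MS = {surface_wave_magnitude(12.0, 20.0, 60.0):.2f}")
    ```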

  7. TAPAS, a VO archive at the IRAM 30-m telescope

    NASA Astrophysics Data System (ADS)

    Leon, Stephane; Espigares, Victor; Ruíz, José Enrique; Verdes-Montenegro, Lourdes; Mauersberger, Rainer; Brunswig, Walter; Kramer, Carsten; Santander-Vela, Juan de Dios; Wiesemeyer, Helmut

    2012-07-01

    Astronomical observatories today generate increasingly large volumes of data. For their efficient use, databases have been built following the standards proposed by the International Virtual Observatory Alliance (IVOA), providing a common protocol to query them and make them interoperable. The IRAM 30-m radio telescope, located in Sierra Nevada (Granada, Spain), is a millimeter-wavelength telescope with a constantly renewed, extensive choice of instruments, capable of covering the frequency range between 80 and 370 GHz. It continuously produces a large amount of data thanks to the more than 200 scientific projects observed each year. The TAPAS archive at the IRAM 30-m telescope aims to provide public access to the headers describing the observations performed with the telescope, according to a defined data policy, while also making the technical data available to IRAM staff members. Special emphasis has been placed on making it Virtual Observatory (VO) compliant, and on offering a VO-compliant web interface that makes the information available to the scientific community. TAPAS is built using the Django Python framework on top of a relational MySQL database, and is fully integrated with the telescope control system. The TAPAS data model (DM) is based on the Radio Astronomical DAta Model for Single dish radio telescopes (RADAMS), to allow for easy integration into the VO infrastructure. A metadata modeling layer is used by the data-filler to allow an implementation free from assumptions about the control system and the underlying database. TAPAS and its public web interface ( http://tapas.iram.es ) provide a scalable system that can evolve with new instruments and observing modes. A meta-description of the DM has been introduced in TAPAS both to avoid undesired coupling between the code and the DM and to provide better management of the archive. A subset of the header data stored in TAPAS will be made available at the CDS.
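
    Since TAPAS is built with Django on top of MySQL, each observation header maps naturally onto a Django model. The model below is a hypothetical simplification for illustration only; the real RADAMS-based data model is far richer, and these field names are assumptions. It would live in a Django app's models.py.

    ```python
    # Hypothetical, simplified Django model for an observation-header record.
    from django.db import models

    class ObservationHeader(models.Model):
        project_id = models.CharField(max_length=16)   # observing project code
        source_name = models.CharField(max_length=64)  # astronomical target
        ra_deg = models.FloatField()                   # right ascension (deg)
        dec_deg = models.FloatField()                  # declination (deg)
        frequency_ghz = models.FloatField()            # observing frequency
        instrument = models.CharField(max_length=32)   # receiver/backend name
        obs_date = models.DateTimeField()              # start of observation
        public = models.BooleanField(default=False)    # released per data policy

        class Meta:
            indexes = [models.Index(fields=["source_name", "obs_date"])]
    ```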

  8. Recent advances in the Lesser Antilles observatories Part 2 : WebObs - an integrated web-based system for monitoring and networks management

    NASA Astrophysics Data System (ADS)

    Beauducel, François; Bosson, Alexis; Randriamora, Frédéric; Anténor-Habazac, Christian; Lemarchand, Arnaud; Saurel, Jean-Marie; Nercessian, Alexandre; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

    2010-05-01

    Seismological and volcanological observatories have common needs, and often common practical problems, in multidisciplinary data monitoring applications. Access to integrated data in real time and estimation of measurement uncertainties are key to efficient interpretation, but the variety of instruments and the heterogeneity of data sampling and acquisition systems lead to difficulties that may hinder crisis management. At the Guadeloupe observatory, we have developed over recent years an operational system that attempts to answer these questions in the context of a multi-instrument observatory. Based on a single computer server, open-source scripts (Matlab, Perl, Bash, Nagios) and a Web interface, the system offers: an extended database for managing networks, stations and sensors (maps, station files with log history, technical characteristics, metadata, photos and associated documents); web-form interfaces for manual data input/editing and export (such as geochemical analyses and some of the deformation measurements); routine data processing with dedicated automatic scripts for each technique, production of validated data outputs, static graphs on preset moving time intervals, and possible e-mail alarms; and automatic status checks of computers, acquisition processes, stations and individual sensors using simple criteria (file updates and signal quality), displayed as synthetic pages for technical control. In the special case of seismology, WebObs includes a digital multichannel continuous stripchart seismogram associated with the EarthWorm acquisition chain (see companion paper, Part 1), an event classification database, location scripts, automatic shakemaps, and a regional catalog with associated hypocenter maps accessed through a user request form. This system provides real-time Internet access for integrated monitoring, has become a strong support for exchanges between scientists and technicians, and is widely open to interdisciplinary real-time modeling. It has been set up at the Martinique observatory, and installation is planned this year at the Montserrat Volcano Observatory. It is also in production at the geomagnetic observatory of Addis Ababa in Ethiopia.

  9. System on chip (SOC) wi-fi microcontroller for multistation measurement of water surface level using ultrasonic sensor

    NASA Astrophysics Data System (ADS)

    Suryono, Suryono; Purnomo Putro, Sapto; Widowati; Adhy, Satriyo

    2018-05-01

    Experimental results of the acquisition and transmission of water surface level data from the field using a System on Chip (SOC) Wi-Fi microcontroller are described here. The SOC Wi-Fi microcontroller is useful in dealing with the limitations of in situ measurement by people, and it is expected to address problems of field instrumentation such as complexity of the electronic circuitry, power supply, efficiency, and automation of digital data acquisition. The system developed here employs five nodes, each consisting of an ultrasonic water surface level sensor driven by an SOC Wi-Fi microcontroller. The five nodes are connected to a Wi-Fi router that acts as the gateway for sending multi-station data to a host computer. The host manages the multi-station communication through a database service program that records every incoming measurement in the database according to the identity of the sending node. The system has a measurement error of 0.65 cm, while the communication range between a data node and the gateway varies from 25 m to 45 m. Communication has also been conducted successfully from one Wi-Fi gateway to another, so further extension of the multi-station range is a definite possibility.
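
    On the host side, the multi-station scheme reduces to a small service that records each reading under the identity of the sending node. A minimal sketch using SQLite as a stand-in for the database service; the schema and node names are assumptions, as the paper does not specify them.

    ```python
    # Minimal host-side sketch: store water-level readings keyed by node identity.
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect("water_levels.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS readings (
        node_id TEXT NOT NULL,     -- identity of the sending SOC Wi-Fi node
        level_cm REAL NOT NULL,    -- measured water surface level
        recorded_at TEXT NOT NULL  -- UTC timestamp
    )""")

    def record_reading(node_id: str, level_cm: float) -> None:
        conn.execute("INSERT INTO readings VALUES (?, ?, ?)",
                     (node_id, level_cm, datetime.now(timezone.utc).isoformat()))
        conn.commit()

    # Nodes reporting through the Wi-Fi gateway, as in the experiment.
    for node, level in [("node-1", 41.2), ("node-2", 39.8), ("node-3", 40.5)]:
        record_reading(node, level)
    print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0], "readings stored")
    ```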

  10. Automating X-ray Fluorescence Analysis for Rapid Astrobiology Surveys.

    PubMed

    Thompson, David R; Flannery, David T; Lanka, Ravi; Allwood, Abigail C; Bue, Brian D; Clark, Benton C; Elam, W Timothy; Estlin, Tara A; Hodyss, Robert P; Hurowitz, Joel A; Liu, Yang; Wade, Lawrence A

    2015-11-01

    A new generation of planetary rover instruments, such as PIXL (Planetary Instrument for X-ray Lithochemistry) and SHERLOC (Scanning Habitable Environments with Raman Luminescence for Organics and Chemicals) selected for the Mars 2020 mission rover payload, aim to map mineralogical and elemental composition in situ at microscopic scales. These instruments will produce large spectral cubes with thousands of channels acquired over thousands of spatial locations, a large potential science yield limited mainly by the time required to acquire a measurement after placement. A secondary bottleneck also faces mission planners after downlink; analysts must interpret the complex data products quickly to inform tactical planning for the next command cycle. This study demonstrates operational approaches to overcome these bottlenecks by specialized early-stage science data processing. Onboard, simple real-time systems can perform a basic compositional assessment, recognizing specific features of interest and optimizing sensor integration time to characterize anomalies. On the ground, statistically motivated visualization can make raw uncalibrated data products more interpretable for tactical decision making. Techniques such as manifold dimensionality reduction can help operators comprehend large databases at a glance, identifying trends and anomalies in data. These onboard and ground-side analyses can complement a quantitative interpretation. We evaluate system performance for the case study of PIXL, an X-ray fluorescence spectrometer. Experiments on three representative samples demonstrate improved methods for onboard and ground-side automation and illustrate new astrobiological science capabilities unavailable in previous planetary instruments. Key words: dimensionality reduction, planetary science, visualization.
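
    Dimensionality reduction of the kind mentioned above compresses each spatial location's thousands of spectral channels into a few coordinates an operator can view at a glance. A generic PCA sketch on synthetic data; PCA stands in here for whatever manifold method a mission would actually use.

    ```python
    # Reduce a (locations x channels) spectral cube to 2-D coordinates for
    # at-a-glance visualization. PCA is used as a generic stand-in method.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_locations, n_channels = 500, 2048
    spectra = rng.random((n_locations, n_channels))  # synthetic XRF-like spectra
    spectra[:50] += np.linspace(0, 3, n_channels)    # inject a compositional anomaly

    coords = PCA(n_components=2).fit_transform(spectra)  # 2048 -> 2 dimensions

    # Anomalous locations separate from the bulk along the first component.
    print("bulk PC1 mean:   ", coords[50:, 0].mean().round(2))
    print("anomaly PC1 mean:", coords[:50, 0].mean().round(2))
    ```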

  11. NESDIS OSPO Data Access Policy and CRM

    NASA Astrophysics Data System (ADS)

    Seybold, M. G.; Donoho, N. A.; McNamara, D.; Paquette, J.; Renkevens, T.

    2012-12-01

    The Office of Satellite and Product Operations (OSPO) is the NESDIS office responsible for satellite operations, product generation, and product distribution. Access to and distribution of OSPO data was formally established in a Data Access Policy dated February 2011. An extension of the data access policy is the OSPO Customer Relationship Management (CRM) Database, which has been in development since 2008 and is reaching a critical level of maturity. This presentation will provide a summary of the data access policy and the standard operating procedure (SOP) for handling data access requests. The companion CRM database will be highlighted, including the incident tracking system, reporting and notification capabilities, and the first comprehensive portfolio of NESDIS satellites, instruments, servers, applications, products, user organizations, and user contacts. Selected examples of CRM data exploitation will show how OSPO is utilizing the CRM database to more closely satisfy the user community's satellite data needs with new product promotions, as well as new data and imagery distribution methods in OSPO's Environmental Satellite Processing Center (ESPC). In addition, user services and outreach initiatives from the Satellite Products and Services Division will be highlighted.

  12. The Dynamic Aviation Data System (DADS).

    PubMed

    Soman, S; Strome, T; Francescutti, L H

    1997-08-01

    This paper proposes The Dynamic Aviation Data System (DADS), which integrates a variety of existing information sources regarding flight to serve as a tool to pilots in dealing with the challenges of flight. The system is composed of three main parts: a pilot's history on disk; a system that can read proposed flight plans and make suggestions based upon Geographical Information Systems, weather, aircraft, and case report databases that exist throughout North America; and a small hand-held computer that interfaces with the aircraft's instruments and that can be brought into the cockpit to aid the pilot before and during flight. The system is based upon technology that currently exists and information that is already regularly collected. While many issues regarding implementation and cost efficiency of the system need to be addressed, the system shows promise in its ability to make useful flight safety information available to all pilots in order to save lives.

  13. EMERALD: Coping with the Explosion of Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2009-12-01

    The geosciences are currently generating an unparalleled quantity of new public broadband seismic data with the establishment of large-scale seismic arrays such as the EarthScope USArray, which are enabling new and transformative scientific discoveries of the structure and dynamics of the Earth’s interior. Much of this explosion of data is a direct result of the formation of the IRIS consortium, which has enabled an unparalleled level of open exchange of seismic instrumentation, data, and methods. The production of these massive volumes of data has generated new and serious data management challenges for the seismological community. A significant challenge is the maintenance and updating of seismic metadata, which includes information such as station location, sensor orientation, instrument response, and clock timing data. This key information changes at unknown intervals, and the changes are not generally communicated to data users who have already downloaded and processed data. Another basic challenge is the ability to handle massive seismic datasets when waveform file volumes exceed the fundamental limitations of a computer’s operating system. A third, long-standing challenge is the difficulty of exchanging seismic processing codes between researchers; each scientist typically develops his or her own unique directory structure and file naming convention, requiring that codes developed by another researcher be rewritten before they can be used. To address these challenges, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The overarching goal of the EMERALD project is to enable more efficient and effective use of seismic datasets ranging from just a few hundred to millions of waveforms with a complete database-driven system, leading to higher quality seismic datasets for scientific analysis and enabling faster, more efficient scientific research. We will present a preliminary (beta) version of EMERALD, an integrated, extensible, standalone database server system based on the open-source PostgreSQL database engine. The system is designed for fast and easy processing of seismic datasets, and provides the necessary tools to manage very large datasets and all associated metadata. EMERALD provides methods for efficient preprocessing of seismic records; large record sets can be easily and quickly searched, reviewed, revised, reprocessed, and exported. EMERALD can retrieve and store station metadata and alert the user to metadata changes. The system provides many methods for visualizing data, analyzing dataset statistics, and tracking the processing history of individual datasets. EMERALD allows development and sharing of visualization and processing methods using any of 12 programming languages. EMERALD is designed to integrate existing software tools; the system provides wrapper functionality for existing widely-used programs such as GMT, SOD, and TauP. Users can interact with EMERALD via a web browser interface, or they can directly access their data from a variety of database-enabled external tools. Data can be imported and exported from the system in a variety of file formats, or can be directly requested and downloaded from the IRIS DMC from within EMERALD.

  14. An Interactive Multi-instrument Database of Solar Flares

    NASA Astrophysics Data System (ADS)

    Sadykov, Viacheslav M.; Kosovichev, Alexander G.; Oria, Vincent; Nita, Gelu M.

    2017-07-01

    Solar flares are complicated physical phenomena that are observable in a broad range of the electromagnetic spectrum, from radio waves to γ-rays. For a more comprehensive understanding of flares, it is necessary to perform a combined multi-wavelength analysis using observations from many satellites and ground-based observatories. For an efficient data search, integration of different flare lists, and representation of observational data, we have developed the Interactive Multi-Instrument Database of Solar Flares (IMIDSF, https://solarflare.njit.edu/). The web-accessible database is fully functional and allows the user to search for uniquely identified flare events based on their physical descriptors and the availability of observations by a particular set of instruments. Currently, the data from three primary flare lists (Geostationary Operational Environmental Satellites, RHESSI, and HEK) and a variety of other event catalogs (Hinode, Fermi GBM, Konus-WIND, the OVSA flare catalogs, the CACTus CME catalog, the Filament eruption catalog) and observing logs (IRIS and Nobeyama coverage) are integrated, and an additional set of physical descriptors (temperature and emission measure) is provided along with an observing summary, data links, and multi-wavelength light curves for each flare event since 2002 January. We envision that this new tool will allow researchers to significantly speed up the search of events of interest for statistical and case studies.

  15. The 2003 edition of geisa: a spectroscopic database system for the second generation vertical sounders radiance simulation

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, N.; Lmd Team

    The GEISA (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Atmospheric Spectroscopic Information) computer-accessible database system, in its former 1997 and 2001 versions, was updated in 2003 (GEISA-03). It has been developed by the ARA (Atmospheric Radiation Analysis) group at LMD (Laboratoire de Météorologie Dynamique, France) since 1974. This early effort implemented the so-called "line-by-line and layer-by-layer" approach to forward radiative transfer modelling. The GEISA 2003 system comprises three databases, each with its associated management software: a database of the spectroscopic parameters required to adequately describe the individual spectral lines belonging to 42 molecules (96 isotopic species), located in a spectral range from the microwave to the limit of the visible, featuring molecules of interest in studies of the terrestrial as well as the other planetary atmospheres, especially those of the Giant Planets; a database of absorption cross-sections of molecules, such as chlorofluorocarbons, which exhibit unresolvable spectra; and a database of refractive indices of basic atmospheric aerosol components. Illustrations will be given of the GEISA-03 data archiving method, contents, management software and Web access facilities at http://ara.lmd.polytechnique.fr. The performance of instruments like AIRS (Atmospheric Infrared Sounder; http://www-airs.jpl.nasa.gov) in the USA and IASI (Infrared Atmospheric Sounding Interferometer; http://smsc.cnes.fr/IASI/index.htm) in Europe, which have better vertical resolution and accuracy than the presently existing satellite infrared vertical sounders, is directly related to the quality of the spectroscopic parameters of the optically active gases, since these are essential inputs to the forward models used to simulate recorded radiance spectra. For these upcoming atmospheric sounders, the so-called GEISA/IASI sub-database system has been elaborated from GEISA; its content will be described as well. This work is ongoing, with the purpose of assessing the IASI measurement capabilities and the quality of the spectroscopic information, within the ISSWG (IASI Sounding Science Working Group), in the frame of the CNES (Centre National d'Etudes Spatiales, France)/EUMETSAT (EUropean organization for the exploitation of METeorological SATellites) Polar System (EPS) project, by simulating high-resolution radiances and/or using experimental data. EUMETSAT will implement GEISA/IASI into the EPS ground segment. The IASI sounding spectroscopic data archive requirements will be discussed in the context of comparisons between recorded and calculated experimental spectra, using the ARA/4A forward line-by-line radiative transfer modelling code in its latest version.

  16. TAMDAR Sensor Validation in 2003 AIRS II

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.; Murray, John J.; Anderson, Mark V.; Mulally, Daniel J.; Jensen, Kristopher R.; Grainger, Cedric A.; Delene, David J.

    2005-01-01

    This study entails an assessment of TAMDAR in situ temperature, relative humidity and winds sensor data from seven flights of the UND Citation II. These data are undergoing rigorous assessment to determine their viability to significantly augment domestic Meteorological Data Communications Reporting System (MDCRS) and the international Aircraft Meteorological Data Reporting (AMDAR) system observational databases to improve the performance of regional and global numerical weather prediction models. NASA Langley Research Center participated in the Second Alliance Icing Research Study from November 17 to December 17, 2003. TAMDAR data taken during this period is compared with validation data from the UND Citation. The data indicate acceptable performance of the TAMDAR sensor when compared to measurements from the UND Citation research instruments.

  17. Monitoring service for the Gran Telescopio Canarias control system

    NASA Astrophysics Data System (ADS)

    Huertas, Manuel; Molgo, Jordi; Macías, Rosa; Ramos, Francisco

    2016-07-01

    The Monitoring Service collects, persists and propagates the telescope and instrument telemetry for the Gran Telescopio CANARIAS (GTC), an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). A new version of the Monitoring Service has been developed in order to improve performance, provide high availability, and guarantee fault tolerance and scalability to cope with high volumes of data. The architecture is based on a distributed in-memory data store with a Producer/Consumer design pattern. The producer generates the data samples. The consumers either persist the samples to a database for further analysis or propagate them to the consoles in the control room to monitor the state of the whole system.
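
    The architecture described (a producer generating telemetry samples, with consumers persisting them and feeding the control-room displays) is the classic producer/consumer pattern. A minimal threaded sketch, with a queue standing in for the distributed in-memory data store; one consumer here both persists and "displays" for brevity.

    ```python
    # Minimal producer/consumer telemetry sketch. A queue stands in for the
    # distributed in-memory data store used by the real system.
    import queue
    import threading
    import time

    samples: "queue.Queue[tuple[str, float] | None]" = queue.Queue()
    persisted = []

    def producer():
        for i in range(5):
            samples.put(("dome.azimuth", 120.0 + i))  # (monitor name, value)
            time.sleep(0.01)
        samples.put(None)  # sentinel: no more samples

    def consumer():
        while (sample := samples.get()) is not None:
            persisted.append(sample)    # persist for later analysis
            print("display:", *sample)  # propagate to the control room

    t1, t2 = threading.Thread(target=producer), threading.Thread(target=consumer)
    t1.start(); t2.start(); t1.join(); t2.join()
    print(f"{len(persisted)} samples persisted")
    ```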

  18. Building a Quality Controlled Database of Meteorological Data from NASA Kennedy Space Center and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analyses in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures exist for all the databases, resulting in QC databases with inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use EV44's previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users in the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the launch rate increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
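
    Standardized QC procedures of this kind typically begin with simple automated screens, such as physical-range and rate-of-change (spike) checks applied uniformly across all towers. The thresholds below are illustrative assumptions, not EV44's actual criteria.

    ```python
    # Illustrative QC screens for a meteorological time series: a physical-range
    # check and a rate-of-change (spike) check. All thresholds are assumptions.
    import numpy as np

    def qc_flags(values: np.ndarray, lo: float, hi: float, max_step: float):
        """Return a boolean mask marking samples that FAIL either check."""
        out_of_range = (values < lo) | (values > hi)
        spike = np.zeros_like(values, dtype=bool)
        spike[1:] = np.abs(np.diff(values)) > max_step
        return out_of_range | spike

    temps_c = np.array([24.1, 24.3, 24.2, 58.0, 24.4, 24.5, 24.3])
    flags = qc_flags(temps_c, lo=-20.0, hi=45.0, max_step=5.0)
    # The 58.0 spike and the recovery step just after it are both flagged.
    print("flagged samples:", temps_c[flags])
    ```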

  19. Monitoring and Hardware Management for Critical Fusion Plasma Instrumentation

    NASA Astrophysics Data System (ADS)

    Carvalho, Paulo F.; Santos, Bruno; Correia, Miguel; Combo, Álvaro M.; Rodrigues, AntÓnio P.; Pereira, Rita C.; Fernandes, Ana; Cruz, Nuno; Sousa, Jorge; Carvalho, Bernardo B.; Batista, AntÓnio J. N.; Correia, Carlos M. B. A.; Gonçalves, Bruno

    2018-01-01

    Controlled nuclear fusion aims to obtain energy from collisions of particles confined inside a nuclear reactor (tokamak). These ionized particles, heavier isotopes of hydrogen, are the main constituents of a plasma that is kept at high temperatures (millions of degrees Celsius). Due to the high temperatures and the magnetic confinement, the plasma is exposed to several sources of instability, which require a set of procedures from the control and data acquisition systems throughout fusion experiments. Control and data acquisition systems often used in nuclear fusion experiments are based on the Advanced Telecommunications Computing Architecture (AdvancedTCA®) standard, introduced by the PCI Industrial Computer Manufacturers Group (PICMG®) to meet the demands of telecommunications, which require large amounts of data (TB) to be transported at high transfer rates (Gb/s) and high availability to be ensured, including features such as reliability, serviceability and redundancy. For efficient plasma control, systems are required to collect large amounts of data, process them, store them for later analysis, make critical decisions in real time, and provide status reports on either the experiment itself or the electronic instrumentation involved. Moreover, systems should also ensure the correct handling of detected anomalies and identified faults, and notify the system operator of events that have occurred, decisions taken, and changes implemented. Therefore, for everything to work in compliance with specifications, the instrumentation must include hardware management and monitoring mechanisms for both hardware and software. These mechanisms should check the system status by reading sensors, manage events, update inventory databases with the hardware components in use and under maintenance, store collected information, update firmware and installed software modules, and configure and handle alarms to detect possible system failures and prevent emergency scenarios. The goal is to ensure high availability of the system and to provide safe operation, experiment security, and data validation for the fusion experiment. This work aims to contribute to the joint effort of the IPFN control and data acquisition group to develop a hardware management and monitoring application for control and data acquisition instrumentation, especially designed for large-scale tokamaks like ITER.

  20. Text Mining to Support Gene Ontology Curation and Vice Versa.

    PubMed

    Ruch, Patrick

    2017-01-01

    In this chapter, we explain how text mining can support the curation of molecular biology databases dealing with protein functions. We also show how curated data can play a disruptive role in the development of text mining methods. We review a decade of efforts to improve the automatic assignment of Gene Ontology (GO) descriptors, the reference ontology for the characterization of genes and gene products. To illustrate the high potential of this approach, we report the performance of an automatic text categorizer, showing a large improvement of +225% in both precision and recall on benchmarked data. We argue that automatic text categorization functions can ultimately be embedded into a Question-Answering (QA) system to answer questions related to protein functions. Because GO descriptors can be relatively long and specific, traditional QA systems cannot answer such questions. A new type of QA system, the so-called Deep QA system, which uses machine learning methods trained with curated contents, is thus emerging. Finally, future advances in text mining instruments are directly dependent on the availability of high-quality annotated contents at every curation step. Database workflows must start recording explicitly all the data they curate, and ideally also some of the data they do not curate.

  1. Spacecraft instrument technology and cosmochemistry

    PubMed Central

    McSween, Harry Y.; McNutt, Ralph L.; Prettyman, Thomas H.

    2011-01-01

    Measurements by instruments on spacecraft have significantly advanced cosmochemistry. Spacecraft missions impose serious limitations on instrument volume, mass, and power, so adaptation of laboratory instruments drives technology. We describe three examples of flight instruments that collected cosmochemical data. Element analyses by Alpha Particle X-ray Spectrometers on the Mars Exploration Rovers have revealed the nature of volcanic rocks and sedimentary deposits on Mars. The Gamma Ray Spectrometer on the Lunar Prospector orbiter provided a global database of element abundances that resulted in a new understanding of the Moon’s crust. The Ion and Neutral Mass Spectrometer on Cassini has analyzed the chemical compositions of the atmosphere of Titan and active plumes on Enceladus. PMID:21402932

  2. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  3. Five years of NO2 Mobile-DOAS measurements in Europe: an overview

    NASA Astrophysics Data System (ADS)

    Merlaud, Alexis; Fayt, Caroline; Pinardi, Gaia; Tack, Frederik; Hendrick, François; Le Roux, Anabel-Lise; Constantin, Daniel-Eduard; Voiculescu, Mirela; Shaiganfar, Reza; Wagner, Thomas; Van Roozendael, Michel

    2014-05-01

    Since the CINDI campaign held in the Netherlands in July 2009, BIRA-IASB has been operating a car-based mobile-DOAS system, primarily dedicated to tropospheric NO2 measurements. The instrument is based on two similar compact spectrometers and records scattered light spectra simultaneously in the zenith direction and 30° above the horizon, following the MAX-DOAS approach. After CINDI, Mobile-DOAS measurements were performed on a routine basis between March 2010 and August 2011, mostly across Belgium, but also in Luxembourg, France, and Germany. From 2011, another BIRA-IASB mobile-DOAS instrument, using a single zenith channel, was operated in Romania through a collaboration with the University of Galati. In June 2013, these two mobile-DOAS instruments took part in the MADCAT campaign in Mainz, Germany, together with the MPIC mobile-DOAS system, based on a mini MAX-DOAS. We describe the BIRA-IASB instruments, our strategy to retrieve the NO2 tropospheric column, and the large database thus constituted. The latter is particularly interesting for its size: it covers some 500 hours of measurements and 20 000 km, including rural, periurban and urban areas with different air quality conditions. A 2011 cloud-free subset of the measurements is compared with OMI data. We also present preliminary results of an intercomparison between the three mobile-DOAS instruments operated during MADCAT. The high spatial frequency of the measurements (around 100 m) makes them valuable for studying the NO2 horizontal gradients in polluted areas. This has implications in the context of air quality satellite validation studies, in which the variability of NO2 inside a satellite pixel must be taken into account.

  4. Using All-Sky Imaging to Improve Telescope Scheduling (Abstract)

    NASA Astrophysics Data System (ADS)

    Cole, G. M.

    2017-12-01

    (Abstract only) Automated scheduling makes it possible for a small telescope to observe a large number of targets in a single night. But when used in areas which have less-than-perfect sky conditions such automation can lead to large numbers of observations of clouds and haze. This paper describes the development of a "sky-aware" telescope automation system that integrates the data flow from an SBIG AllSky340c camera with an enhanced dispatch scheduler to make optimum use of the available observing conditions for two highly instrumented backyard telescopes. Using the minute-by-minute time series image stream and a self-maintained reference database, the software maintains a file of sky brightness, transparency, stability, and forecasted visibility at several hundred grid positions. The scheduling software uses this information in real time to exclude targets obscured by clouds and select the best observing task, taking into account the requirements and limits of each instrument.

  5. Raman LIDAR for UHECR experiments: an overview of the L'Aquila (Italy) lidar station experience for the retrieval of quality-assured data

    NASA Astrophysics Data System (ADS)

    Iarlori, Marco; Rizi, Vincenzo; D'Amico, Giuseppe; Freudenthaler, Volker; Wandinger, Ulla; Grillo, Aurelio

    The L'Aquila (Italy) lidar station has been part of EARLINET (European Aerosol Research Lidar Network) since the network's beginning in 2000. In the EARLINET community, great efforts are devoted to the quality assurance of the aerosol optical properties inserted in the database. To this end, each lidar station performed intercomparisons with reference instruments, a series of internal hardware checks to assess the quality of its instruments, and exercises to test the algorithms used to retrieve the aerosol optical parameters. In this paper we give an overview of our experience within the EARLINET quality-assurance (QA) program, which was adopted for the Raman lidar (RL) operated at the AUGER Observatory. This program could be systematically adopted for the lidar systems needed for current and upcoming UHECR experiments, such as CTA (Cherenkov Telescope Array).

  6. Multimodal optical imaging database from tumour brain human tissue: endogenous fluorescence from glioma, metastasis and control tissues

    NASA Astrophysics Data System (ADS)

    Poulon, Fanny; Ibrahim, Ali; Zanello, Marc; Pallud, Johan; Varlet, Pascale; Malouki, Fatima; Abi Lahoud, Georges; Devaux, Bertrand; Abi Haidar, Darine

    2017-02-01

    Eliminating the time-consuming process of conventional biopsy would be a practical improvement, as would increasing the accuracy of tissue diagnosis and patient comfort. We addressed these needs by developing a multimodal nonlinear endomicroscope that allows real-time optical biopsies during surgical procedures. It will provide immediate information for diagnostic use without removal of tissue and will assist in the choice of the optimal surgical strategy. This instrument will combine several means of contrast: nonlinear fluorescence, second-harmonic generation, reflectance, fluorescence lifetime and spectral analysis. Multimodality is crucial for reliable and comprehensive analysis of tissue. In parallel with the instrumental development, we are currently improving our understanding of the endogenous fluorescence signal with the different modalities that will be implemented in the stated instrument. This endeavor will allow us to create a database of the optical signatures of diseased and control brain tissues. This paper presents the preliminary results of this database for three types of tissue: cortex, metastasis and glioblastoma.

  7. Carbohydrate Structure Database: tools for statistical analysis of bacterial, plant and fungal glycomes

    PubMed Central

    Egorova, K.S.; Kondakova, A.N.; Toukach, Ph.V.

    2015-01-01

    Carbohydrates are biological building blocks participating in diverse and crucial processes at both the cellular and organism levels. They protect individual cells, establish intracellular interactions, take part in the immune reaction and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information that is not supported by experimental evidence for the functions of the proteins encoded by these genes. In this article, we present novel instruments for the statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparing their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB website at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru PMID:26337239

  8. Psychometric Properties of Patient-Facing eHealth Evaluation Measures: Systematic Review and Analysis

    PubMed Central

    Turvey, Carolyn L; Nazi, Kim M; Holman, John E; Hogan, Timothy P; Shimada, Stephanie L; Kennedy, Diana R

    2017-01-01

    Background Significant resources are being invested into eHealth technology to improve health care, yet few resources have focused on evaluating the impact of use on patient outcomes. A standardized set of metrics used across health systems and research would enable aggregation of data to inform improved implementation, clinical practice, and ultimately the health outcomes associated with use of patient-facing eHealth technologies. Objective The objective of this project was to conduct a systematic review to (1) identify existing instruments for eHealth research and implementation evaluation from the patient’s point of view, (2) characterize their measurement components, and (3) assess their psychometrics. Methods Concepts from existing models and published studies of technology use and adoption were identified and used to inform a search strategy. Search terms were broadly categorized as platforms (eg, email), measurement (eg, survey), function/information use (eg, self-management), health care occupations (eg, nurse), and eHealth/telemedicine (eg, mHealth). A computerized database search was conducted through June 2014. Included articles (1) described the development of an instrument, or (2) used an instrument that could be traced back to its original publication, or (3) modified an instrument; included articles also (4) had full text available in English and (5) focused on the patient perspective on technology, including patient preferences and satisfaction, engagement with technology, usability, competency and fluency with technology, computer literacy, and trust in and acceptance of technology. The review was limited to instruments that reported at least one psychometric property. Excluded were investigator-developed measures, disease-specific assessments delivered via technology or telephone (eg, a cancer-coping measure delivered via computer survey), and measures focused primarily on clinician use (eg, of the electronic health record). Results The search strategy yielded 47,320 articles. Following elimination of duplicates and non-English language publications (n=14,550) and books (n=27), another 31,647 articles were excluded through review of titles. Following review of the abstracts of the remaining 1096 articles, 68 were retained for full-text review. Of these, 16 described an instrument and six used an instrument; one instrument was drawn from the GEM database, resulting in 23 articles for inclusion. None included a complete psychometric evaluation. The most frequently assessed property was internal consistency (21/23, 91%). Testing for aspects of validity ranged from 48% (11/23) to 78% (18/23). Approximately half (13/23, 57%) reported how to score the instrument. Only six (26%) assessed the readability of the instrument for end users, although all the measures rely on self-report. Conclusions Although most measures identified in this review were published after the year 2000, rapidly changing technology makes instrument development challenging. Platform-agnostic measures need to be developed that focus on concepts important to the use of any type of eHealth innovation. At present, there are important gaps in the availability of psychometrically sound measures with which to evaluate eHealth technologies. PMID:29021128

  9. Validation of the ASTER instrument level 1A scene geometry

    USGS Publications Warehouse

    Kieffer, H.H.; Mullins, K.F.; MacKinnon, D.J.

    2008-01-01

    An independent assessment of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument geometry was undertaken by the U.S. ASTER Team to confirm the geometric correction parameters developed for and applied to Level 1A (radiometrically and geometrically raw, with correction parameters appended) ASTER data. The goal was to evaluate the geometric quality of the ASTER system and the stability of the Terra spacecraft. ASTER is a 15-band system containing optical instruments with resolutions from 15 to 90 meters; all geometrically registered products are ultimately tied to the 15-meter Visible and Near Infrared (VNIR) subsystem. Our evaluation process first involved establishing a large database of Ground Control Points (GCPs) in the mid-western United States, an area with features of an appropriate size for spacecraft instrument resolutions. We used standard U.S. Geological Survey (USGS) Digital Orthophoto Quads (DOQs) of areas in the mid-west to locate accurate GCPs by systematically identifying road intersections and recording their coordinates. Elevations for these points were derived from USGS Digital Elevation Models (DEMs). Road intersections in a swath of nine contiguous ASTER scenes were then matched to the GCPs, including terrain correction. We found no significant distortion in the images; after a simple image offset to absolute position, the RMS residual of about 200 points per scene was less than one-half of a VNIR pixel. Absolute locations were within 80 meters, with a slow drift of about 10 meters over the entire 530-kilometer swath. Using strictly simultaneous observations of scenes 370 kilometers apart, we determined a stereo angle correction of 0.00134 degrees with an accuracy of one microradian. The mid-west GCP field and the techniques used here should be widely applicable in assessing other spacecraft instruments having resolutions from 5 to 50 meters. © 2008 American Society for Photogrammetry and Remote Sensing.

  10. Telemetry and Science Data Software System

    NASA Technical Reports Server (NTRS)

    Bates, Lakesha; Hong, Liang

    2011-01-01

    The Telemetry and Science Data Software System (TSDSS) was designed to validate the operational health of a spacecraft, ease test verification, assist in debugging system anomalies, and provide trending data and advanced science analysis. In doing so, the system parses, processes, and organizes raw data from the Aquarius instrument both on the ground and in space. In addition, it provides a user-friendly telemetry viewer and an instant push-button test report generator. Existing ground data systems can parse data and provide simple data processing, but are limited in advanced science analysis and instant report generation. The TSDSS functions as an offline data analysis system during the I&T (integration and test) and mission operations phases. After raw data are downloaded from an instrument, TSDSS ingests the data files, parses them, converts telemetry to engineering units, and applies advanced algorithms to produce science level 0, 1, and 2 data products. Meanwhile, it automatically schedules upload of the raw data to a remote server and archives all intermediate and final values in a MySQL database in time order. All data saved in the system can be straightforwardly retrieved, exported, and migrated. Using TSDSS's interactive data visualization tool, a user can conveniently choose any combination and mathematical computation of telemetry points of interest from a large range of time periods (the life cycle of mission ground data and mission operations testing) and display a graphical and statistical view of the data. With this graphical user interface (GUI), graphs of the queried data can be exported and saved in multiple formats. This GUI is especially useful in trending data analysis, debugging anomalies, and advanced data analysis. At the request of the user, mission-specific instrument performance assessment reports can be generated with a simple click of a button on the GUI. From the instrument level to the observatory level, the TSDSS has been operated to support functional and performance tests and to refine system calibration algorithms and coefficients, in sync with the Aquarius/SAC-D spacecraft. At the time of this reporting, it was prepared and set up to perform anomaly investigation for mission operations preceding the Aquarius/SAC-D launch on June 10, 2011.
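
    The core ingest path (raw counts to engineering units to a time-ordered archive) can be sketched in a few lines. The channel names and calibration coefficients below are invented, and sqlite3 stands in for the MySQL archive so the example is self-contained; this illustrates the pattern, not the TSDSS code itself.

    import sqlite3

    # Hypothetical per-channel polynomial calibration: eng = c0 + c1*x + c2*x^2.
    CAL = {"temp_a": (-10.0, 0.05, 0.0), "bus_volts": (0.0, 0.002, 0.0)}

    def to_engineering(channel, raw_counts):
        c0, c1, c2 = CAL[channel]
        return c0 + c1 * raw_counts + c2 * raw_counts ** 2

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE telemetry (
        t_utc REAL, channel TEXT, raw INTEGER, eng REAL,
        PRIMARY KEY (t_utc, channel))""")

    def ingest(t_utc, channel, raw):
        # Archive both the raw counts and the converted engineering value.
        db.execute("INSERT INTO telemetry VALUES (?, ?, ?, ?)",
                   (t_utc, channel, raw, to_engineering(channel, raw)))

    ingest(100.0, "temp_a", 800)      # -> 30.0 under the assumed coefficients
    ingest(100.0, "bus_volts", 14000)
    # Time-ordered retrieval, as the archive guarantees:
    rows = db.execute("SELECT * FROM telemetry ORDER BY t_utc").fetchall()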

  11. The on-site quality-assurance system for Hyper Suprime-Cam: OSQAH

    NASA Astrophysics Data System (ADS)

    Furusawa, Hisanori; Koike, Michitaro; Takata, Tadafumi; Okura, Yuki; Miyatake, Hironao; Lupton, Robert H.; Bickerton, Steven; Price, Paul A.; Bosch, James; Yasuda, Naoki; Mineo, Sogo; Yamada, Yoshihiko; Miyazaki, Satoshi; Nakata, Fumiaki; Koshida, Shintaro; Komiyama, Yutaka; Utsumi, Yousuke; Kawanomoto, Satoshi; Jeschke, Eric; Noumaru, Junichi; Schubert, Kiaina; Iwata, Ikuru; Finet, Francois; Fujiyoshi, Takuya; Tajitsu, Akito; Terai, Tsuyoshi; Lee, Chien-Hsiu

    2018-01-01

    We have developed an automated quick data analysis system for data quality assurance (QA) for Hyper Suprime-Cam (HSC). The system was commissioned in 2012-2014, and has been offered for general observations, including the HSC Subaru Strategic Program, since 2014 March. The system provides observers with data quality information, such as seeing, sky background level, and sky transparency, based on quick analysis as data are acquired. Quick-look images and validation of image focus are also provided through an interactive web application. The system is responsible for the automatic extraction of QA information from acquired raw data into a database, to assist with observation planning, assess progress of all observing programs, and monitor long-term efficiency variations of the instrument and telescope. Enhancements of the system are being planned to facilitate final data analysis, to improve the HSC archive, and to provide legacy products for astronomical communities.

  12. Viking lander camera geometry calibration report. Volume 1: Test methods and data reduction techniques

    NASA Technical Reports Server (NTRS)

    Wolf, M. B.

    1981-01-01

    The determination and removal of instrument signature from Viking Lander camera geometric data are described. All tests conducted as well as a listing of the final database (calibration constants) used to remove instrument signature from Viking Lander flight images are included. The theory of the geometric aberrations inherent in the Viking Lander camera is explored.

  13. Movement kinematics and cyclic fatigue of NiTi rotary instruments: a systematic review.

    PubMed

    Ferreira, F; Adeodato, C; Barbosa, I; Aboud, L; Scelza, P; Zaccaro Scelza, M

    2017-02-01

    The aim of this review was to provide a detailed analysis of the literature concerning the correlation between different movement kinematics and the cyclic fatigue resistance of NiTi rotary endodontic instruments. From June 2014 to August 2015, four independent reviewers comprehensively and systematically searched the Medline (PubMed), EMBASE, Web of Science, Scopus and Google Scholar databases for works published since January 2005, using the following search terms: endodontics; nickel-titanium rotary files; continuous rotation; reciprocating motion; cyclic fatigue. In addition to the electronic searches, manual searches were performed to include articles listed in the reference sections of high-impact published articles that were not indexed in the databases. Laboratory studies in the English language were considered for this review. The electronic and manual searches resulted in the identification of 75 articles. Based on the inclusion criteria, 32 articles were selected for analysis of full-text copies. Specific analysis was then made of 20 articles that described the effects of reciprocating and continuous movements on the cyclic fatigue of the instruments. A wide range of testing conditions and methodologies have been used to compare the cyclic fatigue resistance of rotary endodontic instruments. Most studies report that reciprocating motion improves the fatigue resistance of endodontic instruments compared with continuous rotation, independently of other variables such as the speed of rotation, the angle or radius of curvature of simulated canals, geometry and taper, or the surface characteristics of the NiTi instruments. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  14. Netherlands Hydrological Modeling Instrument

    NASA Astrophysics Data System (ADS)

    Hoogewoud, J. C.; de Lange, W. J.; Veldhuizen, A.; Prinsen, G.

    2012-04-01

    The Netherlands Hydrological Modeling Instrument (NHI) is a decision support system for water basin management and the centre point of a framework of models that coherently model the hydrological system and the multitude of functions it supports. The Dutch hydrological institutes Deltares, Alterra, the Netherlands Environmental Assessment Agency, RWS Waterdienst, STOWA and Vewin are cooperating in enhancing the NHI for adequate decision support. The instrument is used by three different ministries involved in national water policy matters, for instance the WFD, drought management, manure policy and climate change issues. The basis of the modeling instrument is a state-of-the-art on-line coupling of the groundwater system (MODFLOW), the unsaturated zone (metaSWAP) and the surface water system (MOZART-DM). It brings together hydro(geo)logical processes from the column to the basin scale, ranging from 250x250 m plots to the river Rhine, and includes salt water flow. The NHI has been validated with an eight-year run (1998-2006) covering dry and wet periods. For this run, different parts of the hydrology were compared with measurements: for instance, water demands in dry periods (e.g. for irrigation), discharges at outlets, groundwater levels and evaporation. Validation alone is not enough to get support from stakeholders; involvement of stakeholders in the modeling process is needed. Therefore, to gain sufficient support and trust in the instrument at different (policy) levels, several actions have been taken: 1. a transparent evaluation of modeling results has been set up; 2. an extensive program is running to cooperate with regional water boards and suppliers of drinking water in improving the NHI; 3. (hydrological) data are shared via a newly set up Modeling Database for local and national models; 4. the NHI is being enhanced with "local" information. The NHI is and has been used for many decision support studies and evaluations. The main focus of the instrument is operational drought management and the evaluation of adaptive measures for different climate scenarios. It has also been used as a basis to evaluate the water quality of WFD water bodies and measures, nutrient leaching, and the description of WFD groundwater bodies. A toolkit translates the hydrological NHI results into values for different water users: for instance, from the NHI results, agricultural yields can be calculated, as well as effects on groundwater-dependent ecosystems, subsidence, shipping and drinking water supply. This makes the NHI a valuable decision support system in Dutch water management.

  15. Particle Number Concentrations for HI-SCALE Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hering, Susanne V

    In support of the Holistic Interactions of Shallow Clouds, Aerosols, and Ecosystems (HI-SCALE) project to study new particle formation in the atmosphere, a pair of custom water condensation particle counters was provided to the second intensive field campaign, from mid-August through mid-September 2017, at the U.S. Department of Energy Southern Great Plains Atmospheric Radiation Measurement (ARM) Climate Research Facility observatory. These custom instruments were developed by Aerosol Dynamics, Inc. (Hering et al. 2017) to detect particles into the nanometer size range. Referred to as "versatile water condensation particle counters (vWCPCs)", they are water-based, laminar-flow condensational growth instruments whose lower particle size threshold can be set through user-selected operating temperatures. For HI-SCALE, the vWCPCs were configured to measure airborne particle number concentrations in the size range from approximately 2 nm to 2 μm. Both were installed in the particle sizing system operated by Chongai Kuang of Brookhaven National Laboratory (BNL). One was operated in parallel with a TSI Model 3776, upstream of the mobility particle sizing system, to measure total ambient particle concentrations. The airborne particle concentration data from this "total particle number vWCPC" (Ntot-vWCPC) system have been reported to the ARM database with one-second resolution. The second vWCPC was operated in parallel with the BNL diethylene glycol instrument to count particles downstream of a separate differential mobility size analyzer. Data from this "DMA-vWCPC" system were logged by BNL and will eventually be provided by that laboratory.

  16. The Open AUC Project.

    PubMed

    Cölfen, Helmut; Laue, Thomas M; Wohlleben, Wendel; Schilling, Kristian; Karabudak, Engin; Langhorst, Bradley W; Brookes, Emre; Dubbs, Bruce; Zollars, Dan; Rocco, Mattia; Demeler, Borries

    2010-02-01

    Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines ongoing work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will use the GNU General Public License to assure that intellectual property is available in open source format. The Open AUC strategy facilitates collaborations, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple and interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, as well as collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease the integration of data analysis and interpretation software.

  17. Managing Data, Provenance and Chaos through Standardization and Automation at the Georgia Coastal Ecosystems LTER Site

    NASA Astrophysics Data System (ADS)

    Sheldon, W.

    2013-12-01

    Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network), and addresses both specific goals defined in an NSF proposal and broader goals of the network; therefore, every LTER data set can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive and distribute data, metadata and other research products, as well as to manage project logistics, administration and governance (figure 1). This system allows us to store all project information in one place and provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and the packaging of tabular and GIS data products for distribution. Data processing history is automatically tracked throughout the data lifecycle, from initial import through quality control, revision and integration, by our data processing system (GCE Data Toolbox for MATLAB), and is included in the metadata for versioned data products. This high level of automation and system integration has proven very effective in managing the chaos and scalability of our information management program.
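
    As a rough illustration of the lookup-table approach described above, the sketch below assembles per-column metadata from canonical term dictionaries so that every data package carries the same controlled vocabulary. All names and values are hypothetical; the actual GCE-IMS schema is richer and lives in a relational database.

    # Canonical lookup tables (hypothetical entries).
    STANDARD_UNITS = {"water_temp": "degrees Celsius", "salinity": "PSU"}
    INSTRUMENTS = {"water_temp": "CTD sensor", "salinity": "CTD sensor"}

    def build_attribute_metadata(columns):
        # Assemble per-column metadata from the lookup tables, so the same
        # controlled vocabulary appears in every generated data package.
        meta = []
        for name in columns:
            meta.append({
                "attribute": name,
                "units": STANDARD_UNITS.get(name, "unknown"),
                "instrument": INSTRUMENTS.get(name, "unspecified"),
            })
        return meta

    print(build_attribute_metadata(["water_temp", "salinity"]))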

  18. Analysis of 2015 Winter In-Flight Icing Case Studies with Ground-Based Remote Sensing Systems Compared to In-Situ SLW Sondes

    NASA Technical Reports Server (NTRS)

    Serke, David J.; King, Michael Christopher; Hansen, Reid; Reehorst, Andrew L.

    2016-01-01

    The National Aeronautics and Space Administration (NASA) and the National Center for Atmospheric Research (NCAR) have developed an icing remote sensing technology that has demonstrated skill at detecting and classifying icing hazards in a vertical column above an instrumented ground station. This technology has recently been extended to provide volumetric coverage surrounding an airport. Building on the existing vertical pointing system, the new method for providing volumetric coverage utilizes a vertical pointing cloud radar, a multi-frequency microwave radiometer with azimuth and elevation pointing, and a NEXRAD radar. The new terminal area icing remote sensing system processes the data streams from these instruments to derive temperature, liquid water content, and cloud droplet size for each examined point in space. These data are then combined to ultimately provide icing hazard classification along defined approach paths into an airport. To date, statistical comparisons of the vertical profiling technology have been made to Pilot Reports and Icing Forecast Products. With the extension into relatively large area coverage and the output of microphysical properties in addition to icing severity, the use of these comparators is not appropriate and a more rigorous assessment is required. NASA conducted a field campaign during the early months of 2015 to develop a database to enable the assessment of the new terminal area icing remote sensing system and further refinement of terminal area icing weather information technologies in general. In addition to the ground-based remote sensors listed earlier, in-situ icing environment measurements by weather balloons were performed to produce a comprehensive comparison database. Balloon data gathered consisted of temperature, humidity, pressure, super-cooled liquid water content, and 3-D position with time. Comparison data plots of weather balloon and remote measurements, weather balloon flight paths, bulk comparisons of integrated liquid water content and icing cloud extent agreement, and terminal-area hazard displays are presented. Discussions of agreement quality and paths for future development are also included.

  19. MaGa, a web-based collaborative database for gas emissions: a tool to improve the knowledge on Earth degassing

    NASA Astrophysics Data System (ADS)

    Frigeri, A.; Cardellini, C.; Chiodini, G.; Frondini, F.; Bagnato, E.; Aiuppa, A.; Fischer, T. P.; Lehnert, K. A.

    2014-12-01

    The study of the main pathways of carbon flux from the deep Earth requires the analysis of a large quantity and variety of data on volcanic and non-volcanic gas emissions. Hence, there is a need for common frameworks to aggregate available data and insert new observations. Since 2010 we have been developing the Mapping Gas emissions (MaGa) web-based database to collect data on carbon degassing from volcanic and non-volcanic environments. MaGa uses an object-relational model, translating the experience of field surveyors into the database schema. The current web interface of MaGa allows users to browse the data in tabular format or through an interactive web map. Enabled users can insert information such as measurement methods and instrument details, as well as the actual values collected in the field. Measurements found in the literature can be inserted, as well as direct field observations made by human-operated instruments. Currently the database includes fluxes and gas compositions from active crater degassing, diffuse soil degassing and fumaroles, both from dormant and open-vent volcanoes, drawn from literature surveys and from data on non-volcanic emissions of the Italian territory. Currently, MaGa holds more than 1000 volcanic plume degassing fluxes, data from 30 sites of diffuse soil degassing from Italian volcanoes, and about 60 measurements from fumarolic and non-volcanic emission sites. For each gas emission site, MaGa holds data, pictures and descriptions of gas sampling, analysis and measurement methods, together with bibliographic references and contacts to researchers having experience of each site. From 2012, MaGa developments started to be focused on the framework of the Deep Earth Carbon Degassing (DECADE) research initiative of the Deep Carbon Observatory. Within the DECADE initiative there are other data systems, such as EarthChem and the Smithsonian Institution's Global Volcanism Program, and an interoperable interaction between the DECADE data systems is being planned. MaGa is showing good potential to improve the knowledge of Earth degassing, firstly by making data more accessible and encouraging participation among researchers, and secondly by allowing researchers to observe and explore, for the first time, a gas emission dataset with spatial and temporal extents never analyzed before.

  20. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE PAGES

    Van Berkel, Gary J.; Kertesz, Vilmos

    2016-11-15

    An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software, combined with the simple, small and relatively low-cost mass spectrometry system introduced here, fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database, and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform, near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full-scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full-scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% for the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and for near real-time sample classification via database matching.
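
    The spectral contrast angle named above is the angle between two spectra treated as intensity vectors on a common m/z grid: identical spectra give 0°, dissimilar ones approach 90°. The sketch below is a plain illustration of that measure and of classification by smallest angle; it is not the authors' in-house software, and the tiny spectral library is invented.

    import math

    def spectral_contrast_angle(spec_a, spec_b):
        # spec_a, spec_b: dicts mapping m/z bin -> intensity.
        # Returns the angle in degrees between the two intensity vectors.
        bins = set(spec_a) | set(spec_b)
        dot = sum(spec_a.get(b, 0.0) * spec_b.get(b, 0.0) for b in bins)
        na = math.sqrt(sum(v * v for v in spec_a.values()))
        nb = math.sqrt(sum(v * v for v in spec_b.values()))
        cos_theta = dot / (na * nb)
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

    def classify(unknown, library):
        # Assign the library entry with the smallest contrast angle.
        return min(library, key=lambda name: spectral_contrast_angle(unknown, library[name]))

    library = {"ink_A": {150: 1.0, 300: 0.4}, "ink_B": {150: 0.2, 450: 1.0}}
    print(classify({150: 0.9, 300: 0.5}, library))  # -> "ink_A"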

  2. EARLINET database: new design and new products for a wider use of aerosol lidar data

    NASA Astrophysics Data System (ADS)

    Mona, Lucia; D'Amico, Giuseppe; Amato, Francesco; Linné, Holger; Baars, Holger; Wandinger, Ulla; Pappalardo, Gelsomina

    2018-04-01

    The EARLINET database is undergoing a complete reshaping to meet the broad demand for more intuitive products and the even broader demands related to new initiatives such as Copernicus, the European Earth observation programme. The new design has been carried out in continuity with the past, to take advantage of the long-term database. In particular, the new structure will provide information suitable for synergy with other instruments, near-real-time (NRT) applications, validation and process studies, and climate applications.

  3. Relationship between physical activity level and psychosocial and socioeconomic factors and issues in children and adolescents with asthma: a scoping review.

    PubMed

    Westergren, Thomas; Berntsen, Sveinung; Ludvigsen, Mette Spliid; Aagaard, Hanne; Hall, Elisabeth O C; Ommundsen, Yngvar; Uhrenfeldt, Lisbeth; Fegran, Liv

    2017-08-01

    Asthma is a heterogeneous chronic airway disease which may reduce the capability for physical activity. In healthy peers, physical activity is influenced by psychosocial and socioeconomic factors. Knowledge about the role of these factors has not been mapped in children and adolescents with asthma. The main objective of this scoping review was to identify psychosocial and socioeconomic factors associated with physical activity level in children and adolescents with asthma in the literature. The specific objectives were to map the instruments used to measure these factors, report on the construction and validation of these instruments, map psychosocial and socioeconomic issues related to physical activity level reported in qualitative studies, and identify gaps in knowledge about the relationship between psychosocial and socioeconomic factors and physical activity level in children and adolescents with asthma. The review considered children and adolescents with asthma aged six to 18 years; psychosocial and socioeconomic factors related to physical activity level and participation; all physical activity contexts; and quantitative and qualitative primary studies in English, with no date limit. The databases searched included nine major databases for health and sports science, and five databases for unpublished studies. After screening and identification of studies, the reference lists of all identified reports were searched, and forward citation searches were conducted using four databases. The following data were extracted: (a) relevant study characteristics and assessment of physical activity level, (b) instruments used to assess psychosocial and socioeconomic factors, (c) associations between physical activity level and these factors, (d) construction and validation of instruments, and (e) psychosocial and socioeconomic issues related to physical activity participation. Twenty-one quantitative and 13 qualitative studies were included. In cross-sectional studies, enjoyment, physical self-concept, self-efficacy, attitudes and beliefs about physical activity and health, psychological distress, health-related quality of life, and social support were most often reported as being correlated with physical activity level. In three studies, construct validity was assessed by factor analysis and construct reliability tests for the study population. Qualitative studies reported 10 issues related to physical activity participation, of which capability and being like peers were the most commonly reported. There was no direct evidence that qualitative research informed the development or adjustment of instruments in quantitative studies. Seven psychosocial factors correlated with physical activity level; capability and being like peers were the most commonly reported issues. Reports of the construction and validation of instruments were sparse.

  4. Uveka: a UV exposure monitoring system using autonomous instruments network for Reunion Island citizens

    NASA Astrophysics Data System (ADS)

    Sébastien, Nicolas; Cros, Sylvain; Lallemand, Caroline; Kurzrock, Frederik; Schmutz, Nicolas

    2016-04-01

    Reunion Island is a French overseas territory located in the Indian Ocean. This tropical island has about 840,000 inhabitants and is visited every year by more than 400,000 tourists. On average, the island enjoys 340 sunny days per year. Alongside these advantageous conditions, the exposure of the population to ultraviolet radiation constitutes a public health issue. The number of hospitalisations for skin cancer increased by 50% between 2005 and 2010, and health insurance reimbursements due to ophthalmic anomalies caused by the sun amount to about two million Euros. Among the prevention measures recommended by public health policies, access to information on UV radiation is one of the basic needs. Reuniwatt, supported by the Regional Council of La Reunion, is currently developing the Uveka project. Uveka provides, in real time and as short-term forecasts (several hours ahead), UV radiation maps of Reunion Island. Accessible via a web interface and a smartphone application, Uveka informs citizens about the UV exposure rate and its risk according to their individual characteristics (skin phototype, past exposure to sun, etc.). The present work describes this initiative through a presentation of the UV radiation monitoring system and the data processing chain toward the end users. The UV radiation monitoring system of Uveka is a network of low-cost UV sensors. Each instrument is equipped with a solar panel and a battery, and communicates over the 3G telecommunication network. The instrument can therefore be installed without AC power or access to a wired communication network, which eliminates a site selection constraint. Indeed, with more than 200 microclimates and strong spatial variability in cloud cover, building a representative measurement network on this island with a limited number of instruments is a real challenge. In addition to these UV radiation measurements, mapping of the surface solar radiation using data from the meteorological satellite Meteosat-7 fills the gaps. Kriging the point measurements, using satellite data as spatial weights, yields a continuous map of spatially constant quality over the whole of Reunion Island. A significant challenge for this monitoring system is to ensure the temporal continuity of the real-time mapping. The autonomous sensors are programmed with our proprietary protocol, allowing smart management of the battery load and of telecommunication costs. Measurements are sent to a server with a protocol that minimizes the data volume in order to keep telecommunication costs low. The server receives the measurement data and integrates them into a NoSQL database. The server is able to handle long time series, and quality control is routinely performed to ensure data consistency and to monitor the state of the instrument fleet. The database can be queried by our geographical information system server through an application programming interface. This configuration permits easy development of web-based or smartphone applications using any external information provided by the user (personal phototype and exposure history) or their device (e.g. refining computations according to its location).
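
    A much-simplified picture of the satellite-aided mapping step: interpolate the ratio of ground measurement to satellite estimate across the map, then rescale the satellite field by that ratio surface. The sketch below uses inverse-distance weighting instead of the kriging used operationally, and all coordinates and values are invented for illustration.

    import numpy as np

    def idw(xy_obs, values, xy_grid, power=2.0, eps=1e-6):
        # Inverse-distance-weighted interpolation of point values onto a grid.
        d = np.sqrt(((xy_grid[:, None, :] - xy_obs[None, :, :]) ** 2).sum(-1)) + eps
        w = 1.0 / d ** power
        return (w * values).sum(1) / w.sum(1)

    # Sensor sites, their UV readings, and the satellite UV field at the sites.
    sites = np.array([[55.45, -20.88], [55.29, -21.28]])   # lon, lat
    ground_uv = np.array([9.2, 7.8])
    sat_at_sites = np.array([8.5, 8.1])

    grid = np.array([[55.40, -21.00], [55.60, -21.10]])    # map pixels (lon, lat)
    sat_on_grid = np.array([8.9, 7.5])                     # satellite UV on the grid

    # Spread the ground/satellite ratio, then rescale the satellite field:
    ratio_on_grid = idw(sites, ground_uv / sat_at_sites, grid)
    uv_map = sat_on_grid * ratio_on_grid  # continuous map anchored to the sensors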

  5. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    PubMed

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management, nor data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management of the implant life cycle, we developed a tool to assist end users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows the addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the gap that currently exists between the CAS system and implant manufacturers, hospitals, and surgeons.
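
    To make the idea of a standardized XML implant record concrete, the sketch below builds a minimal entry with Python's ElementTree. The element names, attributes and values are hypothetical illustrations, not the schema defined in the paper.

    import xml.etree.ElementTree as ET

    # A hypothetical implant record: geometry, calibration, and life-cycle state.
    implant = ET.Element("implant", id="TP-4520", manufacturer="ExampleMed")
    geom = ET.SubElement(implant, "geometry")
    ET.SubElement(geom, "length", unit="mm").text = "180"
    ET.SubElement(geom, "diameter", unit="mm").text = "4.5"
    cal = ET.SubElement(implant, "calibration")
    ET.SubElement(cal, "referenceFrame").text = "tip"
    ET.SubElement(cal, "cadFile").text = "TP-4520.stl"
    # Life-cycle tracking supports the add/modify/obsolete workflow described above.
    ET.SubElement(implant, "lifecycle", status="active")

    print(ET.tostring(implant, encoding="unicode"))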

  6. Data Collection Instruments and Database for Assessment of Multiple Launch Rocket System (MLRS) Training Strategy: Appendixes

    DTIC Science & Technology

    1992-04-01

    problem. No calibration -- no fire. (90) 6.A Are you able to take advantage of the training time with the Battery? Only about 25% of the time is quality training, which is like six hours out of 24. The communications and

  7. Laboratory light scattering measurements on "natural" particles with the PROGRA2 experiment: an overview

    NASA Astrophysics Data System (ADS)

    Hadamcik, E.; Renard, J.; Levasseur-Regourd, A. C.; Worms, J. C.

    Polarimetric phase curves were obtained with the PROGRA2 instrument for different particle types: glass beads, polyhedral solids, rough particles, dense aggregates, and aggregates with porosity higher than 90%. The main purpose of these measurements is to build a large database that allows the interpretation of remote-sensing observations of solar system bodies. For some samples, numerical or experimental models (e.g. DDA, stochastically built particles, microwave analogues) and laboratory experiments are compared to better disentangle the physical properties involved. This paper gives some of the main results of the experiment and their applications to the Earth's atmosphere, comets and asteroids.

  8. Building a QC Database of Meteorological Data From NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analyses in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located with the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before their use in launch vehicle design and certification analyses.
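
    Standardized QC flagging of this kind usually combines a few simple per-sample tests. The sketch below shows two common ones, a physical range check and a step (rate-of-change) check, with invented thresholds and flag values; the actual EV44 checks and flag conventions may differ.

    GOOD, SUSPECT, FAIL = 0, 1, 2

    def range_check(value, lo, hi):
        # Flag values outside the physically plausible range.
        return GOOD if lo <= value <= hi else FAIL

    def rate_check(value, previous, max_step):
        # Flag unphysical jumps between consecutive samples.
        if previous is None:
            return GOOD
        return GOOD if abs(value - previous) <= max_step else SUSPECT

    def qc_wind_speed(series, lo=0.0, hi=75.0, max_step=15.0):
        flags, prev = [], None
        for v in series:
            f = max(range_check(v, lo, hi), rate_check(v, prev, max_step))
            flags.append(f)  # flags are stored alongside the data, then
            prev = v         # confirmed manually in a GUI before removal
        return flags

    print(qc_wind_speed([3.1, 3.4, 40.0, 3.6, -2.0]))  # -> [0, 0, 1, 1, 2]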

  9. Fear of cancer recurrence: a systematic literature review of self-report measures.

    PubMed

    Thewes, Belinda; Butow, Phillis; Zachariae, Robert; Christensen, Soren; Simard, Sébastien; Gotay, Carolyn

    2012-06-01

    Prior research has shown that many cancer survivors experience ongoing fears of cancer recurrence (FCR) and that this chronic uncertainty of health status during and after cancer treatment can be a significant psychological burden. The field of research on FCR is an emerging area of investigation in the cancer survivorship literature, and several standardised instruments for its assessment have been developed. This review aims to identify all available FCR-specific questionnaires and subscales and critically appraise their properties. A systematic review was undertaken to identify instruments measuring FCR. Relevant studies were identified via Medline (1950-2010), CINAHL (1982-2010), PsycINFO (1967-2010) and AMED (1985-2010) databases, reference lists of articles and reviews, grey literature databases and consultation with experts in the field. The Medical Outcomes Trust criteria were used to examine the psychometric properties of the questionnaires. A total of 20 relevant multi-item measures were identified. The majority of instruments have demonstrated reliability and preliminary evidence of validity. Relatively few brief measures (2-10 items) were found to have comprehensive validation and reliability data available. Several valid and reliable longer measures (>10 items) are available. Three have developed short forms that may prove useful as screening tools. This analysis indicated that further refinement and validation of existing instruments is required. Valid and reliable instruments are needed for both research and clinical care. Copyright © 2011 John Wiley & Sons, Ltd.

  10. Measures of Wellness in Young Adult College Students: An Integrative Review.

    PubMed

    Nair, Julie McCulloh

    2018-04-01

    Wellness behaviors typically form during the college years, making wellness evaluation crucial during this time frame. Instruments often assess health rather than wellness. Thus, the purpose of this integrative review is to identify and evaluate instruments measuring wellness among young adult college students. Google Scholar, CINAHL, PubMed, MEDLINE, PsycINFO, ERIC, and other databases were searched, initially yielding 350 studies. Seven studies met the inclusion criteria and were retained for this review. Reliability and validity are reported for each study, along with ongoing analysis. Homogeneous samples were reported in each study, and administering concurrent instruments created feasibility issues. A summary of instruments measuring wellness in young adult college students is provided. However, few wellness instruments exist for this population; thus, further development is needed.

  11. Migration of the CERN IT Data Centre Support System to ServiceNow

    NASA Astrophysics Data System (ADS)

    Alvarez Alonso, R.; Arneodo, G.; Barring, O.; Bonfillou, E.; Coelho dos Santos, M.; Dore, V.; Lefebure, V.; Fedorko, I.; Grossir, A.; Hefferman, J.; Mendez Lorenzo, P.; Moller, M.; Pera Mira, O.; Salter, W.; Trevisani, F.; Toteva, Z.

    2014-06-01

    The large potential and flexibility of the ServiceNow infrastructure, based on "best practices" methods, is allowing the migration of some of the ticketing systems traditionally used for the monitoring of the servers and services available at the CERN IT Computer Centre. This migration enables the standardization and globalization of the ticketing and control systems, implementing a generic system extensible to other departments and users. One of the activities of the Service Management project, together with the Computing Facilities group, has been the migration of the ITCM structure based on Remedy to ServiceNow, within the context of the ITIL process called Event Management. The experience gained during the first months of operation has been instrumental in the migration to ServiceNow of other service monitoring systems and databases. The usage of this structure has also been extended to service tracking at the Wigner Centre in Budapest.

  12. Remote monitoring system for the cryogenic system of superconducting magnets in the SuperKEKB interaction region

    NASA Astrophysics Data System (ADS)

    Aoki, K.; Ohuchi, N.; Zong, Z.; Arimoto, Y.; Wang, X.; Yamaoka, H.; Kawai, M.; Kondou, Y.; Makida, Y.; Hirose, M.; Endou, T.; Iwasaki, M.; Nakamura, T.

    2017-12-01

    A remote monitoring system was developed, based on the software infrastructure of the Experimental Physics and Industrial Control System (EPICS), for the cryogenic system of the superconducting magnets in the interaction region of the SuperKEKB accelerator. SuperKEKB has been constructed to conduct high-energy physics experiments at KEK. The superconducting magnets comprise three apparatuses: the Belle II detector solenoid and the QCSL and QCSR accelerator magnets. They are contained in three cryostats cooled by dedicated helium cryogenic systems. The monitoring system was developed to read data from the EX-8000, an integrated instrumentation system that controls all cryogenic components. The monitoring system uses the I/O control tools of the EPICS software for TCP/IP, archiving techniques based on a relational database, and an easy human-computer interface. Using this monitoring system, it is possible to remotely monitor all real-time data of the superconducting magnets and cryogenic systems. It is also convenient for sharing data among multiple groups.
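
    An EPICS-based monitor of this kind reduces to a channel-access read loop feeding a database archive. The sketch below uses the pyepics client; the process variable names are invented, and sqlite3 stands in for the production relational database.

    import sqlite3, time
    from epics import caget  # pyepics channel-access client

    # Hypothetical process variables for the three cryostats.
    PVS = ["QCSL:HE:TEMP", "QCSR:HE:TEMP", "BELLE2:SOL:CURRENT"]

    db = sqlite3.connect("cryo_archive.db")
    db.execute("CREATE TABLE IF NOT EXISTS archive (t REAL, pv TEXT, value REAL)")

    while True:
        now = time.time()
        for pv in PVS:
            value = caget(pv)          # one channel-access read
            if value is not None:      # skip disconnected channels
                db.execute("INSERT INTO archive VALUES (?, ?, ?)", (now, pv, value))
        db.commit()
        time.sleep(10)                 # archive period, seconds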

  13. Latest developments for the IAGOS database: Interoperability and metadata

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft, and the IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following websites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is under continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format, including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions, and IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the DLR aircraft research database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure this interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS data. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow graphical comparison with model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
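
    A minimal example of the NetCDF-CF packaging mentioned above is sketched below with the netCDF4 library: a time coordinate plus one observed species, each carrying CF attributes. The variable names, attribute values and data are illustrative only, not the IGAS format specification.

    import numpy as np
    from netCDF4 import Dataset

    ds = Dataset("flight_sample.nc", "w")
    ds.Conventions = "CF-1.6"
    ds.title = "Illustrative aircraft in-situ ozone record"

    ds.createDimension("time", 3)
    t = ds.createVariable("time", "f8", ("time",))
    t.units = "seconds since 2014-01-01 00:00:00"
    t.standard_name = "time"
    t[:] = [0.0, 4.0, 8.0]

    o3 = ds.createVariable("ozone", "f4", ("time",), fill_value=-999.0)
    o3.standard_name = "mole_fraction_of_ozone_in_air"  # CF standard name
    o3.units = "1e-9"                                   # nmol/mol (ppb)
    o3[:] = np.array([48.2, 51.7, 50.3])
    ds.close()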

  14. Current applications of robotics in spine surgery: a systematic review of the literature.

    PubMed

    Joseph, Jacob R; Smith, Brandon W; Liu, Xilin; Park, Paul

    2017-05-01

    OBJECTIVE Surgical robotics has demonstrated utility across the spectrum of surgery. Robotics in spine surgery, however, remains in its infancy. Here, the authors systematically review the evidence behind robotic applications in spinal instrumentation. METHODS This systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines. Relevant studies (through October 2016) that reported the use of robotics in spinal instrumentation were identified from a search of the PubMed database. Data regarding the accuracy of screw placement, surgeon learning curve, radiation exposure, and reasons for robotic failure were extracted. RESULTS Twenty-five studies describing 2 unique robots met inclusion criteria. Of these, 22 studies evaluated accuracy of spinal instrumentation. Although grading of pedicle screw accuracy was variable, the most commonly used method was the Gertzbein and Robbins system of classification. In the studies using the Gertzbein and Robbins system, accuracy (Grades A and B) ranged from 85% to 100%. Ten studies evaluated radiation exposure during the procedure. In studies that detailed fluoroscopy usage, overall fluoroscopy times ranged from 1.3 to 34 seconds per screw. Nine studies examined the learning curve for the surgeon, and 12 studies described causes of robotic failure, which included registration failure, soft-tissue hindrance, and lateral skiving of the drill guide. CONCLUSIONS Robotics in spine surgery is an emerging technology that holds promise for future applications. Surgical accuracy in instrumentation implanted using robotics appears to be high. However, the impact of robotics on radiation exposure is not clear and seems to be dependent on technique and robot type.

  15. Increasing Usability in Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chase, A. C.; Gomes, K.; O'Reilly, T.

    2005-12-01

    As observatory systems move to more advanced techniques for instrument configuration and data management, standardized frameworks are being developed to benefit from economies of scale. ACE (A Configuror and Editor) is a tool that was developed for SIAM (Software Infrastructure and Application for MOOS), a framework for the seamless integration of self-describing plug-and-work instruments into the Monterey Ocean Observing System. As a comprehensive solution, the SIAM infrastructure requires a number of processes to be run to configure an instrument for use within its framework. As solutions move from the lab to the field, the steps needed to implement them must be made bulletproof so that they may be used in the field with confidence. Loosely defined command-line interfaces don't always provide enough user feedback, and business logic can be difficult to maintain over a series of scripts. ACE is a tool for guiding the user through a number of complicated steps, removing the reliance on command-line utilities and reducing the difficulty of completing the necessary steps, while also preventing operator error and enforcing system constraints. Utilizing the cross-platform nature of the Java programming language, ACE provides a complete solution for deploying an instrument within the SIAM infrastructure without depending on special software being installed on the user's computer. Requirements such as the installation of a Unix emulator for users running Windows machines, and the installation of, and ability to use, a CVS client, have all been removed by providing the equivalent functionality from within ACE. In order to achieve a "one stop shop" for configuring instruments, ACE had to be written to handle a wide variety of functionality, including: compiling Java code, interacting with a CVS server and maintaining client-side CVS information, editing XML, interacting with a server-side database, and negotiating serial port communications through Java. This paper addresses the relative tradeoffs of including all the aforementioned functionality in a single tool and its effects on user adoption of the framework (SIAM) it provides access to, as well as further discussing some of the functionality generally pertinent to data management (XML editing, source code management and compilation, etc.).

  16. An Interactive Multi-instrument Database of Solar Flares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadykov, Viacheslav M; Kosovichev, Alexander G; Oria, Vincent

    Solar flares are complicated physical phenomena that are observable in a broad range of the electromagnetic spectrum, from radio waves to γ-rays. For a more comprehensive understanding of flares, it is necessary to perform a combined multi-wavelength analysis using observations from many satellites and ground-based observatories. For an efficient data search, integration of different flare lists, and representation of observational data, we have developed the Interactive Multi-Instrument Database of Solar Flares (IMIDSF, https://solarflare.njit.edu/). The web-accessible database is fully functional and allows the user to search for uniquely identified flare events based on their physical descriptors and the availability of observations by a particular set of instruments. Currently, the data from three primary flare lists (Geostationary Operational Environmental Satellites, RHESSI, and HEK) and a variety of other event catalogs (Hinode, Fermi GBM, Konus-WIND, the OVSA flare catalogs, the CACTus CME catalog, the Filament eruption catalog) and observing logs (IRIS and Nobeyama coverage) are integrated, and an additional set of physical descriptors (temperature and emission measure) is provided along with an observing summary, data links, and multi-wavelength light curves for each flare event since 2002 January. We envision that this new tool will allow researchers to significantly speed up the search for events of interest for statistical and case studies.

  17. Toward Phase IV, Populating the WOVOdat Database

    NASA Astrophysics Data System (ADS)

    Ratdomopurbo, A.; Newhall, C. G.; Schwandner, F. M.; Selva, J.; Ueda, H.

    2009-12-01

    One of the challenges for volcanologists is the fact that more and more people are likely to live on volcanic slopes. Information about volcanic activity during unrest should be accurate and rapidly distributed. As unrest may lead to eruption, evacuation may be necessary to minimize damage and casualties. The decision to evacuate people is usually based on the interpretation of monitoring data. Over the past several decades, volcano monitoring has used increasingly sophisticated instruments, and a huge volume of data is collected in order to understand the state of activity and behaviour of a volcano. WOVOdat, the World Organization of Volcano Observatories (WOVO) Database of Volcanic Unrest, will provide context within which scientists can interpret the state of their own volcano, during and between crises. After a decision during the 2000 IAVCEI General Assembly to create WOVOdat, development has passed through several phases: Concept Development (Phase-I, 2000-2002), Database Design (Phase-II, 2003-2006), and Pilot Testing (Phase-III, 2007-2008). For WOVOdat to be operational, two steps remain: Database Population (Phase-IV) and Enhancement and Maintenance (Phase-V). Since January 2009, the WOVOdat project has been hosted by the Earth Observatory of Singapore for at least a 5-year period. According to the original planning in 2002, this 5-year period will be used to complete Phase-IV. As the WOVOdat design has not yet been tested for all types of data, 2009 is still reserved for building the back-end relational database management system (RDBMS) of WOVOdat and testing it with more complex data. Fine-tuning of the WOVOdat RDBMS design is being done with each new upload of observatory data. The next and main phase of WOVOdat development will be data population: managing data transfer from multiple observatory formats to the WOVOdat format. Data population will depend on two important factors: the availability of SQL databases at volcano observatories and their data-sharing policies. Hence, strong collaboration with every WOVO observatory is important. For volcanoes where the data are not in an SQL system, the WOVOdat project will help the scientists working on the volcano to start building an SQL database.
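    As a rough illustration of the kind of SQL storage the data-population phase assumes on the observatory side, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names are hypothetical illustrations, not the actual WOVOdat schema.

      # Minimal sketch of a relational store for volcano-unrest monitoring
      # data. Table and column names are hypothetical, not WOVOdat's schema.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""
          CREATE TABLE seismic_event (
              volcano_id TEXT NOT NULL,  -- observatory's volcano identifier
              event_time TEXT NOT NULL,  -- ISO 8601 timestamp
              event_type TEXT,           -- e.g. 'VT', 'LP', 'tremor'
              magnitude  REAL,
              depth_km   REAL
          )
      """)
      con.execute(
          "INSERT INTO seismic_event VALUES (?, ?, ?, ?, ?)",
          ("MERAPI", "2009-10-01T12:34:56Z", "VT", 1.8, 2.5),
      )
      # Example query: count events per type during an unrest episode.
      for row in con.execute(
          "SELECT event_type, COUNT(*) FROM seismic_event GROUP BY event_type"
      ):
          print(row)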

  18. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis.

    NASA Astrophysics Data System (ADS)

    Gaudio, P.; Malizia, A.; Gelfusa, M.; Martinelli, E.; Di Natale, C.; Poggi, L. A.; Bellecci, C.

    2017-01-01

    Nowadays, Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and widespread sources of contamination in urban and industrial areas. Academia, industry, and the military are working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present, the most common commercial sensors are based on point-detection technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is to develop stand-off systems that continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser-system development and has built two demonstrators based on DIAL (Differential Absorption of Light) technology able to identify chemical agents in the atmosphere. In this work the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the resulting absorption database for subsequent atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. Preliminary absorption coefficients of some chemical compounds are shown, together with the preliminary PCA analysis.
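    The PCA-based classification step can be sketched in Python with scikit-learn as follows; the spectra, labels, and the nearest-neighbour classifier applied in the reduced space are synthetic illustrations, not the authors' actual pipeline.

      # Sketch of PCA-based classification of absorption spectra, assuming
      # a matrix of spectra (rows) by wavelength channels (columns).
      # Data and class labels are synthetic placeholders.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(42)
      templates = rng.normal(size=(3, 200))   # one template per compound class
      labels = np.repeat([0, 1, 2], 20)       # 20 spectra per class
      spectra = templates[labels] + 0.3 * rng.normal(size=(60, 200))

      # Project onto the first few principal components...
      pca = PCA(n_components=5)
      scores = pca.fit_transform(spectra)

      # ...then classify in the reduced space (nearest neighbours as a
      # simple stand-in for the classification model).
      clf = KNeighborsClassifier(n_neighbors=3).fit(scores, labels)
      print("training accuracy:", clf.score(scores, labels))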

  19. Information technologies for astrophysics circa 2001

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is also easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large datasets. Three limiting paradigms are: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear mode of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

  20. Astrophysics

    Science.gov Websites

    astro-ph.HE - High Energy Astrophysical Phenomena: microquasars, neutron stars, pulsars, black holes. astro-ph.IM - Instrumentation and Methods for Astrophysics: methods for data analysis, statistical methods; software, database design. astro-ph.SR - Solar and Stellar Astrophysics.

  1. Evaluation of a Myopic Normative Database for Analysis of Retinal Nerve Fiber Layer Thickness.

    PubMed

    Biswas, Sayantan; Lin, Chen; Leung, Christopher K S

    2016-09-01

    Analysis of retinal nerve fiber layer (RNFL) abnormalities with optical coherence tomography in eyes with high myopia has been complicated by high rates of false-positive errors. An understanding of whether the application of a myopic normative database can improve the specificity for detection of RNFL abnormalities in eyes with high myopia is relevant. To evaluate the diagnostic performance of a myopic normative database for detection of RNFL abnormalities in eyes with high myopia (spherical equivalent, -6.0 diopters [D] or less). In this cross-sectional study, 180 eyes with high myopia (mean [SD] spherical equivalent, -8.0 [1.8] D) from 180 healthy individuals were included in the myopic normative database. Another 46 eyes with high myopia from healthy individuals (mean [SD] spherical equivalent, -8.1 [1.8] D) and 74 eyes from patients with high myopia and glaucoma (mean [SD] spherical equivalent, -8.3 [1.9] D) were included for evaluation of specificity and sensitivity. The 95th and 99th percentiles of the mean and clock-hour circumpapillary RNFL thicknesses and the individual superpixel thicknesses of the RNFL thickness map measured by spectral-domain optical coherence tomography were calculated from the 180 eyes with high myopia. Participants were recruited from January 2, 2013, to December 30, 2015. The following 6 criteria of RNFL abnormalities were examined: (1) mean circumpapillary RNFL thickness below the lower 95th or (2) the lower 99th percentile; (3) one clock-hour or more for RNFL thickness below the lower 95th or (4) the lower 99th percentile; and (5) twenty contiguous superpixels or more of RNFL thickness in the RNFL thickness map below the lower 95th or (6) the lower 99th percentile. Specificities and sensitivities for detection of RNFL abnormalities. Of the 46 healthy eyes and 74 eyes with glaucoma studied (from 39 men and 38 women), the myopic normative database showed a higher specificity (63.0%-100%) than did the built-in normative database of the optical coherence tomography instrument (8.7%-87.0%) for detection of RNFL abnormalities across all the criteria examined (differences in specificities between 13.0% [95% CI, 1.1%-24.9%; P = .01] and 54.3% [95% CI, 37.8%-70.9%; P < .001]) except for the criterion of mean RNFL thickness below the lower 99th percentile, in which both normative databases had the same specificities (100%) but the myopic normative database exhibited a higher sensitivity (71.6% vs 86.5%; difference in sensitivities, 14.9% [95% CI, 4.6%-25.1%; P = .002]). The application of a myopic normative database improved the specificity without compromising the sensitivity compared with the optical coherence tomography instrument's built-in normative database for detection of RNFL abnormalities in eyes with high myopia. Inclusion of myopic normative databases should be considered in optical coherence tomography instruments.
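    The percentile criteria above amount to flagging a measurement as abnormal when it falls below a chosen lower percentile of the normative sample, as in this minimal sketch (synthetic values, not the study's data):

      # Sketch of percentile-based abnormality flagging against a normative
      # database, as in the RNFL criteria above. Values are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)
      normative_rnfl = rng.normal(100.0, 10.0, size=180)  # 180 normative eyes (um)

      # The 5th and 1st percentiles play the role of the "lower 95th/99th
      # percentile" cutoffs used in the study.
      p5, p1 = np.percentile(normative_rnfl, [5.0, 1.0])

      def classify(mean_rnfl_um: float) -> str:
          if mean_rnfl_um < p1:
              return "abnormal at the 99% level"
          if mean_rnfl_um < p5:
              return "abnormal at the 95% level"
          return "within normal limits"

      print(classify(78.0))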

  2. The MIPAS2D: 2-D analysis of MIPAS observations of ESA target molecules and minor species

    NASA Astrophysics Data System (ADS)

    Arnone, E.; Brizzi, G.; Carlotti, M.; Dinelli, B. M.; Magnani, L.; Papandrea, E.; Ridolfi, M.

    2008-12-01

    Measurements from the MIPAS instrument onboard the ENVISAT satellite were analyzed with the Geofit Multi-Target Retrieval (GMTR) system to obtain 2-dimensional fields of pressure, temperature, and volume mixing ratios of H2O, O3, HNO3, CH4, N2O, and NO2. Secondary target species relevant to stratospheric chemistry were also analysed, and robust mixing ratios of N2O5, ClONO2, F11, F12, F14, and F22 were obtained. Other minor species with high uncertainties were not included in the database and will be the object of further studies. The analysis covers the original nominal observation mode from July 2002 to March 2004 and is currently being extended to the ongoing reduced-resolution mission. The GMTR algorithm was operated on a fixed 5-degree latitudinal grid in order to ease comparison with model calculations and climatological datasets. The generated database of atmospheric fields can be used directly for analyses based on averaging processes, with no need for further interpolation. Samples of the obtained products are presented and discussed. The database of the retrieved quantities is made available to the scientific community.

  3. Near-infrared incoherent broadband cavity enhanced absorption spectroscopy (NIR-IBBCEAS) for detection and quantification of natural gas components.

    PubMed

    Prakash, Neeraj; Ramachandran, Arun; Varma, Ravi; Chen, Jun; Mazzoleni, Claudio; Du, Ke

    2018-06-28

    The principle of near-infrared incoherent broadband cavity enhanced absorption spectroscopy was employed to develop a novel instrument for detecting natural gas leaks as well as for testing the quality of natural gas mixtures. The instrument utilizes the absorption features of methane, butane, ethane, and propane in the wavelength region of 1100 nm to 1250 nm. The absorption cross-section spectrum in this region for methane was adopted from the HITRAN database, and those for the other three gases were measured in the laboratory. A singular-value decomposition (SVD) based analysis scheme was employed for quantifying methane, butane, ethane, and propane by performing a linear least-squares fit. The developed instrument achieved detection limits of 460 ppm, 141 ppm, 175 ppm, and 173 ppm for methane, butane, ethane, and propane, respectively, with a measurement time of 1 second and a cavity length of 0.59 m. These detection limits are less than 1% of the Lower Explosive Limit (LEL) for each gas. The sensitivity can be further enhanced by changing the experimental parameters (such as cavity length and lamp power) and using longer averaging intervals. The detection system is a low-cost and portable instrument suitable for field monitoring. The results obtained on the gas mixture emphasize the instrument's potential for deployment at industrial facilities dealing with natural gas, where potential leaks pose a threat to public safety.
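    The quantification step can be illustrated with a short sketch: given absorption cross-section spectra of the four gases, concentrations are retrieved from a measured spectrum by a linear least-squares fit (NumPy's lstsq solves this via SVD internally). All spectra and numbers below are synthetic placeholders, not the instrument's calibration data.

      # Sketch of retrieving gas concentrations from a measured absorption
      # spectrum by linear least squares, as in the SVD-based scheme above.
      import numpy as np

      rng = np.random.default_rng(7)
      n_wavelengths = 500
      # Columns: cross-section spectra of methane, butane, ethane, propane
      cross_sections = np.abs(rng.normal(size=(n_wavelengths, 4)))

      true_conc = np.array([460e-6, 141e-6, 175e-6, 173e-6])  # mole fractions
      measured = cross_sections @ true_conc + rng.normal(0, 1e-7, n_wavelengths)

      # np.linalg.lstsq solves the overdetermined system via SVD internally.
      retrieved, *_ = np.linalg.lstsq(cross_sections, measured, rcond=None)
      print("retrieved mole fractions:", retrieved)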

  4. Field trial of a dual-wavelength fluorescent emission (L.I.F.E.) instrument and the Magma White rover during the MARS2013 Mars analog mission.

    PubMed

    Groemer, Gernot; Sattler, Birgit; Weisleitner, Klemens; Hunger, Lars; Kohstall, Christoph; Frisch, Albert; Józefowicz, Mateusz; Meszyński, Sebastian; Storrie-Lombardi, Michael; Bothe, Claudia; Boyd, Andrea; Dinkelaker, Aline; Dissertori, Markus; Fasching, David; Fischer, Monika; Föger, Daniel; Foresta, Luca; Frischauf, Norbert; Fritsch, Lukas; Fuchs, Harald; Gautsch, Christoph; Gerard, Stephan; Goetzloff, Linda; Gołebiowska, Izabella; Gorur, Paavan; Groemer, Gerhard; Groll, Petra; Haider, Christian; Haider, Olivia; Hauth, Eva; Hauth, Stefan; Hettrich, Sebastian; Jais, Wolfgang; Jones, Natalie; Taj-Eddine, Kamal; Karl, Alexander; Kauerhoff, Tilo; Khan, Muhammad Shadab; Kjeldsen, Andreas; Klauck, Jan; Losiak, Anna; Luger, Markus; Luger, Thomas; Luger, Ulrich; McArthur, Jane; Moser, Linda; Neuner, Julia; Orgel, Csilla; Ori, Gian Gabriele; Paternesi, Roberta; Peschier, Jarno; Pfeil, Isabella; Prock, Silvia; Radinger, Josef; Ragonig, Christoph; Ramirez, Barbara; Ramo, Wissam; Rampey, Mike; Sams, Arnold; Sams, Elisabeth; Sams, Sebastian; Sandu, Oana; Sans, Alejandra; Sansone, Petra; Scheer, Daniela; Schildhammer, Daniel; Scornet, Quentin; Sejkora, Nina; Soucek, Alexander; Stadler, Andrea; Stummer, Florian; Stumptner, Willibald; Taraba, Michael; Tlustos, Reinhard; Toferer, Ernst; Turetschek, Thomas; Winter, Egon; Zanella-Kux, Katja

    2014-05-01

    We have developed a portable dual-wavelength laser fluorescence spectrometer as part of a multi-instrument optical probe to characterize mineral, organic, and microbial species in extreme environments. Operating at 405 and 532 nm, the instrument was originally designed for use by human explorers to produce a laser-induced fluorescence emission (L.I.F.E.) spectral database of the mineral and organic molecules found in the microbial communities of Earth's cryosphere. Recently, our team had the opportunity to explore the strengths and limitations of the instrument when it was deployed on a remote-controlled Mars analog rover. In February 2013, the instrument was deployed on board the Magma White rover platform during the MARS2013 Mars analog field mission in the Kess Kess formation near Erfoud, Morocco. During these tests, we followed tele-science work flows pertinent to Mars surface missions in a simulated spaceflight environment. We report on the L.I.F.E. instrument setup, data processing, and performance during field trials. A pilot postmission laboratory analysis determined that rock samples acquired during the field mission exhibited a fluorescence signal from the Sun-exposed side characteristic of chlorophyll a following excitation at 405 nm. A weak fluorescence response to excitation at 532 nm may have originated from another microbial photosynthetic pigment, phycoerythrin, but final assignment awaits development of a comprehensive database of mineral and organic fluorescence spectra. No chlorophyll fluorescence signal was detected from the shaded underside of the samples.

  5. Online molecular image repository and analysis system: A multicenter collaborative open-source infrastructure for molecular imaging research and application.

    PubMed

    Rahman, Mahabubur; Watabe, Hiroshi

    2018-05-01

    Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are used either individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement of reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to addressing these challenges, we developed a flexible, transparent, and secure infrastructure, named MIRA (Molecular Imaging Repository and Analysis), primarily using the Python programming language and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image-archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. The capability of dealing with metadata, image file format normalization, and storing and viewing different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an Electronic Laboratory Notebook for effective knowledge management. In addition, the centralized approach facilitates on-the-fly access to all of MIRA's features remotely through any web browser. Furthermore, the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform to support rapid advances in cancer diagnosis and therapeutics.

  6. Documentation of the U.S. Geological Survey Oceanographic Time-Series Measurement Database

    USGS Publications Warehouse

    Montgomery, Ellyn T.; Martini, Marinna A.; Lightsom, Frances L.; Butman, Bradford

    2008-01-02

    This report describes the instrumentation and platforms used to make the measurements; the methods used to process, apply quality-control criteria to, and archive the data; the data storage format; and how the data are released and distributed. The report also includes instructions on how to access the data from the online database at http://stellwagen.er.usgs.gov/. As of 2016, the database contains about 5,000 files, which may include observations of current velocity, wave statistics, ocean temperature, conductivity, pressure, and light transmission at one or more depths over some duration of time.

  7. European Geophysical Society (23rd) General Assembly, Annales Geophysicae, Part 4, Nonlinear Geophysics & Natural Hazards, Supplement 4 to Volume 16, Held in Nice, France on 20-24 April 1998

    DTIC Science & Technology

    1998-01-01

    ...the power spectra of instrumental temperature data from the Global Summary of the Day database on time scales of 1 day to 100 years. Maritime stations... The transition from a continental-type spectrum to a maritime-type spectrum is investigated by averaging spectra from all stations in the database in 2°x2° grid squares... We present global and regional maps of the seismic intensity factor based on data from the NEIC Global Hypocenter Database for 1963-1994.

  8. Pharmacovigilance of drug allergy and hypersensitivity using the ENDA-DAHD database and the GALEN platform. The Galenda project.

    PubMed

    Bousquet, P-J; Demoly, P; Romano, A; Aberer, W; Bircher, A; Blanca, M; Brockow, K; Pichler, W; Torres, M J; Terreehorst, I; Arnoux, B; Atanaskovic-Markovic, M; Barbaud, A; Bijl, A; Bonadonna, P; Burney, P G; Caimmi, S; Canonica, G W; Cernadas, J; Dahlen, B; Daures, J-P; Fernandez, J; Gomes, E; Gueant, J-L; Kowalski, M L; Kvedariene, V; Mertes, P-M; Martins, P; Nizankowska-Mogilnicka, E; Papadopoulos, N; Ponvert, C; Pirmohamed, M; Ring, J; Salapatas, M; Sanz, M L; Szczeklik, A; Van Ganse, E; De Weck, A L; Zuberbier, T; Merk, H F; Sachs, B; Sidoroff, A

    2009-02-01

    Nonallergic hypersensitivity and allergic reactions are among the many different types of adverse drug reactions (ADRs). Databases exist for the collection of ADRs. Spontaneous reporting makes up the core data-generating system of pharmacovigilance, but there is a large underestimation of allergy/hypersensitivity drug reactions. A specific database is therefore required for drug allergy and hypersensitivity, using standard operating procedures (SOPs), as the diagnosis of drug allergy/hypersensitivity is difficult and current pharmacovigilance algorithms are insufficient. Although difficult, the diagnosis of drug allergy/hypersensitivity has been standardized by the European Network for Drug Allergy (ENDA) under the aegis of the European Academy of Allergology and Clinical Immunology, and SOPs have been published. Based on ENDA and Global Allergy and Asthma European Network (GA²LEN, EU Framework Programme 6) SOPs, a Drug Allergy and Hypersensitivity Database (DAHD®) has been established under FileMaker® Pro 9. It is already available online in many different languages and can be accessed using a personal login. GA²LEN is a European network of 27 partners (16 countries) and 59 collaborating centres (26 countries), which can coordinate and implement the DAHD across Europe. The GA²LEN-ENDA-DAHD platform interacting with a pharmacovigilance network appears to be of great interest for the reporting of allergy/hypersensitivity ADRs in conjunction with other pharmacovigilance instruments.

  9. SelTarbase, a database of human mononucleotide-microsatellite mutations and their potential impact to tumorigenesis and immunology

    PubMed Central

    Woerner, Stefan M.; Yuan, Yan P.; Benner, Axel; Korff, Sebastian; von Knebel Doeberitz, Magnus; Bork, Peer

    2010-01-01

    About 15% of human colorectal cancers and, to varying degrees, other tumor entities, as well as nearly all tumors related to Lynch syndrome, are hallmarked by microsatellite instability (MSI) as a result of a defective mismatch repair system. The functional impact of the resulting mutations depends on their genomic localization. Alterations within coding mononucleotide repeat tracts (MNRs) can lead to protein truncation and formation of neopeptides, whereas alterations within untranslated MNRs can alter transcription levels or transcript stability. These mutations may provide a selective advantage or disadvantage to affected cells. They may further affect the biology of microsatellite-unstable cells, e.g. by generating immunogenic peptides induced by frameshift mutations. The Selective Targets database (http://www.seltarbase.org) is a curated database of a growing body of public MNR mutation data in microsatellite-unstable human tumors. Regression calculations for various MSI-H tumor entities, indicating statistically deviant mutation frequencies, predict TGFBR2, BAX, ACVR2A and others that are shown or highly suspected to be involved in MSI tumorigenesis. Many useful tools for further analyzing genomic DNA, derived wild-type and mutated cDNAs, and peptides are integrated. A comprehensive database of all human coding, untranslated, non-coding RNA and intronic MNRs (MNR_ensembl) is also included. SelTarbase thus serves as a comprehensive instrument for MSI-carcinogenesis-related research, diagnostics, and therapy. PMID:19820113

  10. The photo-colorimetric space as a medium for the representation of spatial data

    NASA Technical Reports Server (NTRS)

    Kraiss, K. Friedrich; Widdel, Heino

    1989-01-01

    Spatial displays and instruments are usually used in the context of vehicle guidance, but it is hard to find applicable spatial formats in information retrieval and interaction systems. Human interaction with spatial data structures and the applicability of the CIE color space to improve dialogue transparency are discussed. A proposal is made to use the color space to code spatially represented data. The semantic distances of the categories of dialogue structures or, more generally, of database structures are determined empirically. Subsequently, the distances are transformed and mapped into the color space. The concept is demonstrated for a car diagnosis system, where the category cooling system could, e.g., be coded in blue and the category ignition system in red. Thereby a correspondence between color and semantic distances is achieved. Subcategories can be coded as luminance differences within the color space.

  11. [Assessment of municipal management of oral health in primary care: data collection instrument accuracy].

    PubMed

    Pires, Diego Anselmi; Colussi, Claudia Flemming; Calvo, Maria Cristina Marino

    2014-11-01

    This validation study seeks to check the accuracy of an evaluation model. In an evaluation, it is necessary to validate the precision and reliability of the data collection instrument. In this study, the Management Assessment of Oral Health in Primary Care in Santa Catarina was used as a benchmark to calculate the indicators. Its model analyzes primary data, collected via an electronic form, and secondary data, available in the Unified Health System (SUS) database. For this study, the form was applied in the cities of Santa Catarina's Coal Region at two different times to check its reproducibility, followed by a discussion of the answers with the researcher. The results obtained were analyzed and debated in a consensus workshop with specialists in the field, detecting inaccuracies relating to the concept, the source used, and the profile of the respondents themselves. The gross agreement rate between the two data collections was 87%, and inaccuracies amounted to 36% of the answers. Suggestions of preferential sources, modifications to questions, and guidelines for correctly filling out the form were among the proposed changes, improving the original matrix and the data collection instrument.

  12. Health economics research into supporting carers of people with dementia: A systematic review of outcome measures

    PubMed Central

    2012-01-01

    Advisory bodies, such as the National Institute for Health and Clinical Excellence (NICE) in the UK, advocate using preference based instruments to measure the quality of life (QoL) component of the quality-adjusted life year (QALY). Cost per QALY is used to determine cost-effectiveness, and hence funding, of interventions. QALYs allow policy makers to compare the effects of different interventions across different patient groups. Generic measures may not be sensitive enough to fully capture the QoL effects for certain populations, such as carers, so there is a need to consider additional outcome measures, which are preference based where possible to enable cost-effectiveness analysis to be undertaken. This paper reviews outcome measures commonly used in health services research and health economics research involving carers of people with dementia. An electronic database search was conducted in PubMed, Medline, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, the National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment database. Studies were eligible for inclusion if they included an outcome measure for carers of people with dementia. 2262 articles were identified. 455 articles describing 361 studies remained after exclusion criteria were applied. 228 outcome measures were extracted from the studies. Measures were categorised into 44 burden measures, 43 mastery measures, 61 mood measures, 32 QoL measures, 27 social support and relationships measures and 21 staff competency and morale measures. The choice of instrument has implications on funding decisions; therefore, researchers need to choose appropriate instruments for the population being measured and the type of intervention undertaken. If an instrument is not sensitive enough to detect changes in certain populations, the effect of an intervention may be underestimated, and hence interventions which may appear to be beneficial to participants are not deemed cost-effective and are not funded. If this is the case, it is essential that additional outcome measures which detect changes in broader QoL are included, whilst still retaining preference based utility measures such as EQ-5D to allow QALY calculation for comparability with other interventions. PMID:23181515

  13. Health economics research into supporting carers of people with dementia: a systematic review of outcome measures.

    PubMed

    Jones, Carys; Edwards, Rhiannon Tudor; Hounsome, Barry

    2012-11-26

    Advisory bodies, such as the National Institute for Health and Clinical Excellence (NICE) in the UK, advocate using preference based instruments to measure the quality of life (QoL) component of the quality-adjusted life year (QALY). Cost per QALY is used to determine cost-effectiveness, and hence funding, of interventions. QALYs allow policy makers to compare the effects of different interventions across different patient groups. Generic measures may not be sensitive enough to fully capture the QoL effects for certain populations, such as carers, so there is a need to consider additional outcome measures, which are preference based where possible to enable cost-effectiveness analysis to be undertaken. This paper reviews outcome measures commonly used in health services research and health economics research involving carers of people with dementia. An electronic database search was conducted in PubMed, Medline, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, the National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment database. Studies were eligible for inclusion if they included an outcome measure for carers of people with dementia. 2262 articles were identified. 455 articles describing 361 studies remained after exclusion criteria were applied. 228 outcome measures were extracted from the studies. Measures were categorised into 44 burden measures, 43 mastery measures, 61 mood measures, 32 QoL measures, 27 social support and relationships measures and 21 staff competency and morale measures. The choice of instrument has implications on funding decisions; therefore, researchers need to choose appropriate instruments for the population being measured and the type of intervention undertaken. If an instrument is not sensitive enough to detect changes in certain populations, the effect of an intervention may be underestimated, and hence interventions which may appear to be beneficial to participants are not deemed cost-effective and are not funded. If this is the case, it is essential that additional outcome measures which detect changes in broader QoL are included, whilst still retaining preference based utility measures such as EQ-5D to allow QALY calculation for comparability with other interventions.

  14. Psychometric Properties of Patient-Facing eHealth Evaluation Measures: Systematic Review and Analysis.

    PubMed

    Wakefield, Bonnie J; Turvey, Carolyn L; Nazi, Kim M; Holman, John E; Hogan, Timothy P; Shimada, Stephanie L; Kennedy, Diana R

    2017-10-11

    Significant resources are being invested into eHealth technology to improve health care. Few resources have focused on evaluating the impact of use on patient outcomes. A standardized set of metrics used across health systems and research will enable aggregation of data to inform improved implementation, clinical practice, and ultimately health outcomes associated with use of patient-facing eHealth technologies. The objective of this project was to conduct a systematic review to (1) identify existing instruments for eHealth research and implementation evaluation from the patient's point of view, (2) characterize measurement components, and (3) assess psychometrics. Concepts from existing models and published studies of technology use and adoption were identified and used to inform a search strategy. Search terms were broadly categorized as platforms (eg, email), measurement (eg, survey), function/information use (eg, self-management), health care occupations (eg, nurse), and eHealth/telemedicine (eg, mHealth). A computerized database search was conducted through June 2014. Included articles (1) described development of an instrument, or (2) used an instrument that could be traced back to its original publication, or (3) modified an instrument; had (4) full text in the English language; and (5) focused on the patient perspective on technology, including patient preferences and satisfaction, engagement with technology, usability, competency and fluency with technology, computer literacy, and trust in and acceptance of technology. The review was limited to instruments that reported at least one psychometric property. Excluded were investigator-developed measures, disease-specific assessments delivered via technology or telephone (eg, a cancer-coping measure delivered via computer survey), and measures focused primarily on clinician use (eg, the electronic health record). The search strategy yielded 47,320 articles. Following elimination of duplicates and non-English language publications (n=14,550) and books (n=27), another 31,647 articles were excluded through review of titles. Following a review of the abstracts of the remaining 1096 articles, 68 were retained for full-text review. Of these, 16 described an instrument and six used an instrument; one instrument was drawn from the GEM database, resulting in 23 articles for inclusion. None included a complete psychometric evaluation. The most frequently assessed property was internal consistency (21/23, 91%). Testing for aspects of validity ranged from 48% (11/23) to 78% (18/23). Approximately half (13/23, 57%) reported how to score the instrument. Only six (26%) assessed the readability of the instrument for end users, although all the measures rely on self-report. Although most measures identified in this review were published after the year 2000, rapidly changing technology makes instrument development challenging. Platform-agnostic measures need to be developed that focus on concepts important for use of any type of eHealth innovation. At present, there are important gaps in the availability of psychometrically sound measures to evaluate eHealth technologies.

  15. The volatile compound BinBase mass spectral database.

    PubMed

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement.
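    The multi-tiered filtering idea can be sketched as follows: a deconvoluted peak is annotated with a database compound only if it passes retention-index and spectral-similarity thresholds, and a novel entry would be created otherwise. The entries, thresholds, and similarity measure below are illustrative assumptions, not the actual BinBase implementation.

      # Sketch of a BinBase-style annotation filter. Database entries and
      # thresholds are illustrative placeholders.
      import numpy as np

      def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      DATABASE = [
          # (name, retention index, reference spectrum as intensity vector)
          ("alpha-pinene", 932.0, np.array([0.1, 0.8, 0.3, 0.0, 0.5])),
          ("limonene", 1029.0, np.array([0.7, 0.2, 0.1, 0.6, 0.2])),
      ]

      RI_WINDOW = 5.0        # max retention-index deviation
      MIN_SIMILARITY = 0.8   # min spectral similarity for a match

      def annotate(peak_ri: float, peak_spectrum: np.ndarray):
          for name, ri, ref in DATABASE:
              if abs(peak_ri - ri) <= RI_WINDOW and \
                 cosine_similarity(peak_spectrum, ref) >= MIN_SIMILARITY:
                  return name
          return None  # a novel "bin" would be created here

      print(annotate(930.5, np.array([0.12, 0.79, 0.28, 0.02, 0.49])))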

  16. The volatile compound BinBase mass spectral database

    PubMed Central

    2011-01-01

    Background Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. Description The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). Conclusions The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement. PMID:21816034

  17. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos

    2017-02-15

    An "Open Access"-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in case of the inks and oils, respectively, using leave-one-out cross-validation. This work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  18. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials, and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement the following QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency-domain and spectral calibration procedures, using tissue-simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing, and visualization (optimize instrument software-EZDOS; centralize data processing). 3. Monitor, catalog, and maintain instrument performance (document performance; modularize maintenance; integrate new technology). 4. Standardize and coordinate trial data entry (from individual sites) into a centralized database. 5. Monitor, audit, and communicate all research procedures (database, teleconferences, training sessions) among participants, ensuring "calibration". This manuscript describes our ongoing efforts, successes, and challenges in implementing these strategies.

  19. Development of TGS2611 methane sensor and SHT11 humidity and temperature sensor for measuring greenhouse gas on peatlands in south kalimantan, indonesia

    NASA Astrophysics Data System (ADS)

    Sugriwan, I.; Soesanto, O.

    2017-05-01

    The research focused on the development of a data acquisition system to monitor the methane content, relative humidity, and temperature on peatlands in South Kalimantan, Indonesia. Methane is one of the greenhouse gases emitted from peatlands, while humidity and temperature are important microclimate parameters there. These three parameters are monitored digitally, in real time, continuously and automatically, and recorded by a data acquisition system interfaced to a personal computer. The hardware of the data acquisition system consists of a power supply unit, a TGS2611 methane gas sensor, an SHT11 humidity and temperature sensor, a voltage follower, an ATMega8535 microcontroller, a 16 × 2 character LCD, and a personal computer. The ATMega8535 module manages all parts of the measuring instrument. The software that reads the sensor data, evaluates the characteristic equation, and sends data to the 16 × 2 character LCD was written in Basic Compiler; the interface between the measuring instrument and the personal computer was implemented in Delphi 7. The acquired data are shown on the 16 × 2 character LCD and the PC monitor and stored in a database built with XAMPP. The methane, humidity, and temperature released from the peatland are trapped by a closed-chamber measurement system with dimensions of 60 × 50 × 40 cm. The TGS2611 methane gas sensor and the SHT11 humidity and temperature sensor were calibrated to determine the transfer functions used for data communication between the sensors and the microcontroller, and were integrated with the ATMega8535 microcontroller. The values of RS and RL of the TGS2611 sensor, calculated according to the data sheet, were 1360 ohm and 905 ohm, respectively. The characteristic equation of the TGS2611 is VRL = 0.561 ln n - 2.2641, where n is the methane concentration and VRL is the load voltage in volts. The microcontroller processes this voltage signal and sends the result to the LCD and the personal computer (laptop) to display the measurement. The acquired data are saved in Excel format and in the database.
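    Given the characteristic equation above, the methane concentration can be recovered from the measured load voltage by inverting it, as in this short sketch (concentration units follow the sensor calibration, which is not specified further here):

      # Sketch of inverting the TGS2611 characteristic equation quoted
      # above, VRL = 0.561*ln(n) - 2.2641 (volts), to recover the methane
      # concentration n from the measured load-resistor voltage.
      import math

      def methane_concentration(v_rl: float) -> float:
          """Invert VRL = 0.561*ln(n) - 2.2641 for n."""
          return math.exp((v_rl + 2.2641) / 0.561)

      # Example: a reading of 1.5 V maps to the concentration below, in
      # whatever units the calibration of n used.
      print(f"n = {methane_concentration(1.5):.0f}")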

  20. Disentangling the risk assessment and intimate partner violence relation: Estimating mediating and moderating effects.

    PubMed

    Williams, Kirk R; Stansfield, Richard

    2017-08-01

    To manage intimate partner violence (IPV), the criminal justice system has turned to risk assessment instruments to predict whether a perpetrator will reoffend. Empirically determining whether offenders assessed as high risk are those who recidivate is critical for establishing the predictive validity of IPV risk assessment instruments and for guiding the supervision of perpetrators. But by focusing solely on the relation between calculated risk scores and subsequent IPV recidivism, previous studies of the predictive validity of risk assessment instruments omitted mediating factors intended to mitigate the risk of behavioral recidivism. The purpose of this study was to examine the mediating effects of such factors and the moderating effects of risk assessment on the relation between assessed risk (using the Domestic Violence Screening Instrument-Revised [DVSI-R]) and recidivistic IPV. Using a sample of 2,520 perpetrators of IPV, results revealed that time sentenced to jail and time sentenced to probation each significantly mediated the relation between DVSI-R risk level and frequency of reoffending. The results also revealed that assessed risk moderated the relation between these mediating factors and IPV recidivism, with reduced recidivism (negative estimated effects) for high-risk perpetrators but increased recidivism (positive estimated effects) for low-risk perpetrators. The implication is to match interventions to the level of risk so that no harm is done.

  1. A Unified Satellite-Observation Polar Stratospheric Cloud (PSC) Database for Long-Term Climate-Change Studies

    NASA Technical Reports Server (NTRS)

    Fromm, Michael; Pitts, Michael; Alfred, Jerome

    2000-01-01

    This report summarizes the project team's activity and accomplishments during the period 12 February 1999 - 12 February 2000. The primary objective of this project was to create and test a generic algorithm for detecting polar stratospheric clouds (PSC), an algorithm that would permit creation of a unified, long-term PSC database from a variety of solar occultation instruments that measure aerosol extinction near 1000 nm. The second objective was to make a database of PSC observations and certain relevant related datasets. In this report we describe the algorithm, the data we are making available, and user access options. The remainder of this document provides the details of the algorithm and the database offering.

  2. High Resolution Spectroscopy to Support Atmospheric Measurements

    NASA Technical Reports Server (NTRS)

    Benner, D. Chris; Venkataraman, Malathy Devi

    2000-01-01

    The major research activities performed during the cooperative agreement enhanced our spectroscopic knowledge of molecules of atmospheric interest such as carbon dioxide, water vapor, ozone, methane, and carbon monoxide, to name a few. Measurements were made using the NASA Langley Tunable Diode Laser Spectrometer System (TDL) and several Fourier Transform Spectrometer Systems (FTS) around the globe. The results from these studies made remarkable improvements in the line positions and intensities for several molecules, particularly ozone and carbon dioxide in the 2 to 17-micrometer spectral region. Measurements of pressure broadening and pressure induced line shift coefficients and the temperature dependence of pressure broadening and pressure induced line shift coefficients for infrared transitions of ozone, methane, and water vapor were also performed. Results from these studies have been used for retrievals of stratospheric gas concentration profiles from data collected by several Upper Atmospheric Research satellite (UARS) infrared instruments as well as in the analysis of high resolution atmospheric spectra such as those acquired by space-based, ground-based, and various balloon- and aircraft-borne experiments. Our results made significant contributions in several updates of the HITRAN (HIgh resolution TRANsmission) spectral line parameters database. This database enjoys worldwide recognition in research involving diversified scientific fields.

  3. High Resolution Spectroscopy to Support Atmospheric Measurements

    NASA Technical Reports Server (NTRS)

    Benner, D. Chris; Venkataraman, Malathy Devi

    2000-01-01

    The major research activities performed during the cooperative agreement enhanced our spectroscopic knowledge of molecules of atmospheric interest such as carbon dioxide, water vapor, ozone, methane, and carbon monoxide, to name a few. Measurements were made using the NASA Langley Tunable Diode Laser Spectrometer System (TDL) and several Fourier Transform Spectrometer Systems (FTS) around the globe. The results from these studies made remarkable improvements in the line positions and intensities for several molecules, particularly ozone and carbon dioxide in the 2 to 17-micrometer spectral region. Measurements of pressure broadening and pressure induced line shift coefficients and the temperature dependence of pressure broadening and pressure induced line shift coefficients for infrared transitions of ozone, methane, and water vapor were also performed. Results from these studies have been used for retrievals of stratospheric gas concentration profiles from data collected by several Upper Atmospheric Research satellite (UARS) infrared instruments as well as in the analysis of high resolution atmospheric spectra such as those acquired by space-based, ground-based, and various balloon- and aircraft-borne experiments. Our results made significant contributions in several updates of the HITRAN (HIgh resolution TRANsmission) spectral line parameters database. This database enjoys worldwide recognition in research involving diversified scientific fields.

  4. Digitized Database of Old Seismograms Recorded in Romania

    NASA Astrophysics Data System (ADS)

    Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea

    2016-08-01

    The aim of this paper is to describe a management system for a unique Romanian database of historical seismograms and complementary documentation (metadata), together with its dissemination and analysis procedure. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to reconsider the historical instrumental data, the analog seismograms are converted to digital images and digital waveforms (digitization/vectorization). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.). In the next step, the high-resolution scanned seismograms will be processed to obtain the digital waveforms. We used a Colortrac SmartLF Cx40 scanner, which provides images in TIFF or JPG format. For digitization, the Teseo2 algorithm, developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS project, will be used.

  5. Classification of ion mobility spectra by functional groups using neural networks

    NASA Technical Reports Server (NTRS)

    Bell, S.; Nazarov, E.; Wang, Y. F.; Eiceman, G. A.

    1999-01-01

    Neural networks were trained using whole ion mobility spectra from a standardized database of 3137 spectra for 204 chemicals at various concentrations. Performance of the network was measured by the success of classification into ten chemical classes. Eleven stages for evaluation of spectra and of spectral pre-processing were employed, and minimum values were established for response thresholds and spectral purity. After optimization of the database, network, and pre-processing routines, the fraction of successful classifications by functional group was 0.91 throughout a range of concentrations. Network classification relied on a combination of features, including drift times, number of peaks, relative intensities, and other factors apparently including peak shape. The network was opportunistic, exploiting different features within different chemical classes. Application of neural networks in a two-tier design, where chemicals were first identified by class and then individually, eliminated all but one false positive out of 161 test spectra. These findings establish that ion mobility spectra, even with low-resolution instrumentation, contain sufficient detail to permit the development of automated identification systems.
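
    Whole-spectrum classification of this kind can be sketched with a small feed-forward network; a minimal scikit-learn stand-in (the authors' actual architecture and pre-processing stages are not reproduced here, and the random arrays exist only to make the sketch self-contained):

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      # spectra: (n_samples, n_drift_bins) intensities; classes: functional-group
      # labels. Random stand-in data so the sketch runs on its own.
      rng = np.random.default_rng(0)
      spectra = rng.random((500, 200))
      classes = rng.integers(0, 10, size=500)

      X_tr, X_te, y_tr, y_te = train_test_split(spectra, classes, random_state=0)
      net = MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0)
      net.fit(X_tr, y_tr)
      print("fraction correctly classified:", net.score(X_te, y_te))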

  6. Flight-determined engine exhaust characteristics of an F404 engine in an F-18 airplane

    NASA Technical Reports Server (NTRS)

    Ennix, Kimberly A.; Burcham, Frank W., Jr.; Webb, Lannie D.

    1993-01-01

    Personnel at the NASA Langley Research Center (NASA-Langley) and the NASA Dryden Flight Research Facility (NASA-Dryden) recently completed a joint acoustic flight test program. Several types of aircraft with high nozzle pressure ratio engines were flown to satisfy a twofold objective. First, assessments were made of subsonic climb-to-cruise noise from flights conducted at varying altitudes in a Mach 0.30 to 0.90 range. Second, using data from flights conducted at constant altitude in a Mach 0.30 to 0.95 range, engineers obtained a high-quality noise database. This database was desired to validate the Aircraft Noise Prediction Program and other system noise prediction codes. NASA-Dryden personnel analyzed the engine data from several aircraft that were flown in the test program to determine the exhaust characteristics. The analysis of the exhaust characteristics of the F-18 aircraft is reported. An overview of the flight test planning, instrumentation, test procedures, data analysis, engine modeling codes, and results is presented.

  7. Comprehensive analysis of a multidimensional liquid chromatography mass spectrometry dataset acquired on a quadrupole selecting, quadrupole collision cell, time-of-flight mass spectrometer: I. How much of the data is theoretically interpretable by search engines?

    PubMed

    Chalkley, Robert J; Baker, Peter R; Hansen, Kirk C; Medzihradszky, Katalin F; Allen, Nadia P; Rexach, Michael; Burlingame, Alma L

    2005-08-01

    An in-depth analysis of a multidimensional chromatography-mass spectrometry dataset acquired on a quadrupole selecting, quadrupole collision cell, time-of-flight (QqTOF) geometry instrument was carried out. A total of 3269 CID spectra were acquired. Through manual verification of database search results and de novo interpretation of spectra, 2368 spectra could be confidently determined to be from predicted tryptic peptides. A detailed analysis of the non-matching spectra was also carried out, highlighting what the non-matching spectra in a database search are typically composed of. The results of this comprehensive dataset study demonstrate that QqTOF instruments produce information-rich data, a high percentage of which is readily interpretable.

  8. Instrumental variables analysis using multiple databases: an example of antidepressant use and risk of hip fracture.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; de Boer, Anthonius; Gardarsdottir, Helga; Martin, Elisa; Candore, Gianmario; Belitser, Svetlana V; Hoes, Arno W; Roes, Kit C B; Klungel, Olaf H

    2016-03-01

    Instrumental variable (IV) analysis can control for unmeasured confounding, yet it has not been widely used in pharmacoepidemiology. We aimed to assess the performance of IV analysis using different IVs in multiple databases in a study of antidepressant use and hip fracture. Information on adults with at least one prescription of a selective serotonin reuptake inhibitor (SSRI) or tricyclic antidepressant (TCA) during 2001-2009 was extracted from the THIN (UK), BIFAP (Spain), and Mondriaan (Netherlands) databases. IVs were created using the proportion of SSRI prescriptions per practice or using the one, five, or ten previous prescriptions by a physician. Data were analysed using conventional Cox regression and two-stage IV models. In the conventional analysis, SSRI (vs. TCA) use was associated with an increased risk of hip fracture, which was consistently found across databases: the adjusted hazard ratio (HR) was approximately 1.35 for time-fixed and 1.50 to 2.49 for time-varying SSRI use. The IV analysis based on the IVs that appeared to satisfy the IV assumptions showed conflicting results, e.g. the adjusted HRs ranged from 0.55 to 2.75 for time-fixed exposure. IVs for time-varying exposure violated at least one IV assumption and were therefore invalid. This multiple-database study shows that the performance of IV analysis varies across databases for time-fixed and time-varying exposures and depends strongly on the definition of the IVs. It remains challenging to obtain valid IVs in pharmacoepidemiological studies, particularly for time-varying exposure, and IV analyses should therefore be interpreted cautiously. Copyright © 2016 John Wiley & Sons, Ltd.
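
    The two-stage logic behind IV analysis is easiest to see in the linear case; a minimal two-stage least squares sketch with synthetic data (the paper's actual analysis pairs the IVs with Cox models, which this does not reproduce):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 5000
      z = rng.integers(0, 2, n).astype(float)        # instrument (e.g. prescribing preference)
      u = rng.normal(size=n)                         # unmeasured confounder
      x = 0.5 * z + 0.8 * u + rng.normal(size=n)     # exposure
      y = 1.0 * x + 1.5 * u + rng.normal(size=n)     # outcome; true effect = 1.0

      # Stage 1: regress exposure on instrument; Stage 2: regress outcome
      # on the fitted exposure, which is purged of the confounder u.
      x_hat = sm.OLS(x, sm.add_constant(z)).fit().fittedvalues
      stage2 = sm.OLS(y, sm.add_constant(x_hat)).fit()
      print(stage2.params)   # slope near 1.0 despite confounding by u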

  9. Flight Data Reduction of Wake Velocity Measurements Using an Instrumented OV-10 Airplane

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.; Stuever, Robert A.; Stewart, Eric C.; Rivers, Robert A.

    1999-01-01

    A series of flight tests to measure the wake of a Lockheed C-130 airplane and the accompanying atmospheric state has been conducted. A specially instrumented North American Rockwell OV-10 airplane was used to measure the wake and atmospheric conditions. An integrated database has been compiled for wake characterization and validation of wake vortex computational models. This paper describes the wake-measurement flight-data reduction process.

  10. CFIT Prevention Using Synthetic Vision

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.; Parrish, Russell V.

    2003-01-01

    In commercial aviation, over 30 percent of all fatal accidents worldwide are categorized as Controlled Flight Into Terrain (CFIT) accidents, in which a fully functioning airplane is inadvertently flown into the ground, water, or an obstacle. An experiment was conducted at NASA Langley Research Center investigating the presentation of a synthetic terrain database scene to the pilot on a Primary Flight Display (PFD). The major hypothesis for the experiment is that a synthetic vision system (SVS) will improve the pilot's ability to detect and avoid a potential CFIT compared to conventional flight instrumentation. All display conditions, including the baseline, contained a Terrain Awareness and Warning System (TAWS) and Vertical Situation Display (VSD) enhanced Navigation Display (ND). Sixteen pilots each flew 22 approach-departure maneuvers in Instrument Meteorological Conditions (IMC) to the terrain-challenged Eagle County Regional Airport (EGE) in Colorado. For the final run, the flight guidance cues were altered such that the departure path went into the terrain. All pilots with an SVS-enhanced PFD (12 of 16 pilots) noticed and avoided the potential CFIT situation. All of the pilots who flew the anomaly with the baseline display configuration (which included a TAWS and VSD enhanced ND) had a CFIT event.

  11. Clinical microbiology informatics.

    PubMed

    Rhoads, Daniel D; Sintchenko, Vitali; Rauch, Carol A; Pantanowitz, Liron

    2014-10-01

    The clinical microbiology laboratory has responsibilities ranging from characterizing the causative agent in a patient's infection to helping detect global disease outbreaks. All of these processes are increasingly becoming partnered more intimately with informatics. Effective application of informatics tools can increase the accuracy, timeliness, and completeness of microbiology testing while decreasing the laboratory workload, which can lead to optimized laboratory workflow and decreased costs. Informatics is poised to be increasingly relevant in clinical microbiology, with the advent of total laboratory automation, complex instrument interfaces, electronic health records, clinical decision support tools, and the clinical implementation of microbial genome sequencing. This review discusses the diverse informatics aspects that are relevant to the clinical microbiology laboratory, including the following: the microbiology laboratory information system, decision support tools, expert systems, instrument interfaces, total laboratory automation, telemicrobiology, automated image analysis, nucleic acid sequence databases, electronic reporting of infectious agents to public health agencies, and disease outbreak surveillance. The breadth and utility of informatics tools used in clinical microbiology have made them indispensable to contemporary clinical and laboratory practice. Continued advances in technology and development of these informatics tools will further improve patient and public health care in the future. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  12. Clinical Microbiology Informatics

    PubMed Central

    Sintchenko, Vitali; Rauch, Carol A.; Pantanowitz, Liron

    2014-01-01

    SUMMARY The clinical microbiology laboratory has responsibilities ranging from characterizing the causative agent in a patient's infection to helping detect global disease outbreaks. All of these processes are increasingly becoming partnered more intimately with informatics. Effective application of informatics tools can increase the accuracy, timeliness, and completeness of microbiology testing while decreasing the laboratory workload, which can lead to optimized laboratory workflow and decreased costs. Informatics is poised to be increasingly relevant in clinical microbiology, with the advent of total laboratory automation, complex instrument interfaces, electronic health records, clinical decision support tools, and the clinical implementation of microbial genome sequencing. This review discusses the diverse informatics aspects that are relevant to the clinical microbiology laboratory, including the following: the microbiology laboratory information system, decision support tools, expert systems, instrument interfaces, total laboratory automation, telemicrobiology, automated image analysis, nucleic acid sequence databases, electronic reporting of infectious agents to public health agencies, and disease outbreak surveillance. The breadth and utility of informatics tools used in clinical microbiology have made them indispensable to contemporary clinical and laboratory practice. Continued advances in technology and development of these informatics tools will further improve patient and public health care in the future. PMID:25278581

  13. ESCOLEX: a grade-level lexical database from European Portuguese elementary to middle school textbooks.

    PubMed

    Soares, Ana Paula; Medeiros, José Carlos; Simões, Alberto; Machado, João; Costa, Ana; Iriarte, Álvaro; de Almeida, José João; Pinheiro, Ana P; Comesaña, Montserrat

    2014-03-01

    In this article, we introduce ESCOLEX, the first European Portuguese children's lexical database with grade-level-adjusted word frequency statistics. Computed from a 3.2-million-word corpus, ESCOLEX provides 48,381 word forms extracted from 171 elementary and middle school textbooks for 6- to 11-year-old children attending the first six grades in the Portuguese educational system. Like other children's grade-level databases (e.g., Carroll, Davies, & Richman, 1971; Corral, Ferrero, & Goikoetxea, Behavior Research Methods, 41, 1009-1017, 2009; Lété, Sprenger-Charolles, & Colé, Behavior Research Methods, Instruments, & Computers, 36, 156-166, 2004; Zeno, Ivens, Millard, & Duvvuri, 1995), ESCOLEX provides four frequency indices for each grade: overall word frequency (F), index of dispersion across the selected textbooks (D), estimated frequency per million words (U), and standard frequency index (SFI). It also provides a new measure, contextual diversity (CD). In addition, the number of letters in the word and its part(s) of speech, number of syllables, syllable structure, and adult frequencies taken from P-PAL (a European Portuguese corpus-based lexical database; Soares, Comesaña, Iriarte, Almeida, Simões, Costa, …, Machado, 2010; Soares, Iriarte, Almeida, Simões, Costa, França, …, Comesaña, in press) are provided. ESCOLEX will be a useful tool both for researchers interested in language processing and development and for professionals in need of verbal materials adjusted to children's developmental stages. ESCOLEX can be downloaded along with this article or from http://p-pal.di.uminho.pt/about/databases.
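
    Two of the indices above have simple closed forms; a minimal sketch under stated assumptions (the SFI transform follows the grade-level frequency literature; treating D as Juilland's dispersion index is an assumption here, not something the abstract specifies):

      import numpy as np

      def dispersion_D(freqs_per_grade):
          """Juilland-style dispersion D from a word's frequency in each
          subcorpus (grade): D = 1 - CV / sqrt(n - 1)."""
          f = np.asarray(freqs_per_grade, dtype=float)
          mu = f.mean()
          if mu == 0:
              return 0.0
          cv = f.std(ddof=0) / mu
          return 1.0 - cv / np.sqrt(len(f) - 1)

      def sfi(u_per_million):
          """Standard frequency index: SFI = 10 * log10(U) + 40."""
          return 10.0 * np.log10(u_per_million) + 40.0

      print(dispersion_D([12, 15, 9, 14, 11, 13]))
      print(sfi(1000.0))   # 1000 occurrences per million -> SFI = 70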

  14. Global Precipitation Estimates from Cross-Track Passive Microwave Observations Using a Physically-Based Retrieval Scheme

    NASA Technical Reports Server (NTRS)

    Kidd, Chris; Matsui, Toshi; Chern, Jiundar; Mohr, Karen; Kummerow, Christian; Randel, Dave

    2015-01-01

    The estimation of precipitation across the globe from satellite sensors provides a key resource in the observation and understanding of our climate system. Estimates from all pertinent satellite observations are critical in providing the necessary temporal sampling. However, consistency in these estimates from instruments with different frequencies and resolutions is critical. This paper details the physically based retrieval scheme to estimate precipitation from cross-track (XT) passive microwave (PM) sensors on board the constellation satellites of the Global Precipitation Measurement (GPM) mission. Here the Goddard profiling algorithm (GPROF), a physically based Bayesian scheme developed for conically scanning (CS) sensors, is adapted for use with XT PM sensors. The present XT GPROF scheme utilizes a model-generated database to overcome issues encountered with an observational database as used by the CS scheme. The model database ensures greater consistency across meteorological regimes and surface types by providing a more comprehensive set of precipitation profiles. The database is corrected for bias against the CS database to ensure consistency in the final product. Statistical comparisons over western Europe and the United States show that the XT GPROF estimates are comparable with those from the CS scheme. Indeed, the XT estimates have higher correlations against surface radar data, while maintaining similar root-mean-square errors. Latitudinal profiles of precipitation show the XT estimates are generally comparable with the CS estimates, although in the southern midlatitudes the peak precipitation is shifted equatorward while over the Arctic large differences are seen between the XT and the CS retrievals.
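
    The Bayesian database retrieval at the heart of GPROF-type schemes weights every database profile by how well its simulated brightness temperatures match the observation, then averages the associated rain rates; a minimal sketch (diagonal error covariance is a simplification, and all arrays here are synthetic):

      import numpy as np

      def bayesian_retrieval(tb_obs, tb_db, precip_db, sigma):
          """Expected precipitation given observed brightness temperatures.
          tb_obs: (n_chan,); tb_db: (n_profiles, n_chan) simulated Tbs;
          precip_db: (n_profiles,) rain rates; sigma: per-channel errors."""
          d2 = (((tb_db - tb_obs) / sigma) ** 2).sum(axis=1)
          w = np.exp(-0.5 * d2)                 # Bayesian weight per profile
          return (w * precip_db).sum() / w.sum()

      rng = np.random.default_rng(2)
      tb_db = rng.normal(250.0, 20.0, size=(10000, 5))   # synthetic database
      precip_db = rng.gamma(2.0, 1.0, size=10000)
      print(bayesian_retrieval(tb_db[0], tb_db, precip_db, sigma=3.0))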

  15. Impact of cancer anorexia-cachexia syndrome on health-related quality of life and resource utilisation: A systematic review.

    PubMed

    Tarricone, Rosanna; Ricca, Giada; Nyanzi-Wakholi, Barbara; Medina-Lara, Antonieta

    2016-03-01

    Cancer anorexia-cachexia syndrome (CACS) negatively impacts patients' quality of life (QoL) and increases the burden on healthcare resources. The objective of this review was to assess published CACS data regarding health-related QoL (HRQoL) and its economic impact on the healthcare system. Searches were conducted in the MEDLINE, EMBASE, DARE, and NHS EED databases. A total of 458 HRQoL and 189 healthcare resource utilisation abstracts were screened, and 42 and 2 full-text articles were included, respectively. The EORTC QLQ-C30 and FAACT instruments were most favoured for assessing HRQoL, but none of the current tools cover all domains affected by CACS. Economic estimates for managing CACS are scarce, with studies lacking a breakdown of healthcare resource utilisation items. HRQoL instruments that can better assess and incorporate all the domains affected by CACS are required. Rigorous assessment of the costs and benefits of treatment is needed to understand the magnitude of the impact of CACS. Copyright © 2016. Published by Elsevier Ireland Ltd.

  16. Polar Magnetic Field Experiment

    NASA Technical Reports Server (NTRS)

    Russell, C. T.

    1999-01-01

    This grant covers the initial data reduction and analysis of the magnetic field measurements of the Polar spacecraft. At this writing, data for the first three years of the mission have been processed and deposited in the key parameter database. These data are also available in a variety of time resolutions and coordinate systems via a webserver at UCLA that provides both plots and digital data. The flight software has twice been reprogrammed: once to remove a glitch in the data where there were rare collisions between commands in the central processing unit, and once to provide burst mode data at 100 samples per second on a regular basis. The instrument continues to function as described in the instrument paper (1.1 in the bibliography attached below). The early observations were compared with observations on the same field lines at lower altitude. The Polar magnetic measurements also proved to be most useful for testing the accuracy of MHD models. We also made important contributions to the study of waves and turbulence.

  17. A new MAP for Mars

    NASA Technical Reports Server (NTRS)

    Zubrin, Robert; Price, Steve; Clark, Ben; Cantrell, Jim; Bourke, Roger

    1993-01-01

    A Mars Aerial Platform (MAP) mission capable of generating thousands of very-high-resolution (20 cm/pixel) pictures of the Martian surface is considered. The MAP entry vehicle will map the global circulation of the planet's atmosphere and examine the surface and subsurface. Data acquisition will use instruments carried aboard balloons flying at a nominal altitude of about 7 km above the Martian surface. The MAP balloons will take high- and medium-resolution photographs of Mars, sound its surface with radar, and provide tracking data to chart its winds. Mars vehicle design is based on the fourth-generation NTP, NEP, and SEP vehicle set that provides a solid database for determining transportation system costs. Interference analysis and 3D image generation are performed using manual system sizing and sketching in conjunction with precise CAD modeling.

  18. Information technologies for astrophysics circa 2001

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is less easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large data sets. Three limiting paradigms are as follows: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear model of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

  19. A cross-cultural comparison of children's imitative flexibility.

    PubMed

    Clegg, Jennifer M; Legare, Cristine H

    2016-09-01

    Recent research with Western populations has demonstrated that children use imitation flexibly to engage in both instrumental and conventional learning. Evidence for children's imitative flexibility in non-Western populations is limited, however, and has only assessed imitation of instrumental tasks. This study (N = 142, 6- to 8-year-olds) demonstrates both cultural continuity and cultural variation in imitative flexibility. Children engage in higher imitative fidelity for conventional tasks than for instrumental tasks in both an industrialized, Western culture (United States), and a subsistence-based, non-Western culture (Vanuatu). Children in Vanuatu engage in higher imitative fidelity of instrumental tasks than in the United States, a potential consequence of cultural variation in child socialization for conformity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Considerations in comparing the U.S. Geological Survey one‐year induced‐seismicity hazard models with “Did You Feel It?” and instrumental data

    USGS Publications Warehouse

    White, Isabel; Liu, Taojun; Luco, Nicolas; Liel, Abbie

    2017-01-01

    The recent steep increase in seismicity rates in Oklahoma, southern Kansas, and other parts of the central United States led the U.S. Geological Survey (USGS) to develop, for the first time, a probabilistic seismic hazard forecast for one year (2016) that incorporates induced seismicity. In this study, we explore a process to ground‐truth the hazard model by comparing it with two databases of observations: modified Mercalli intensity (MMI) data from the “Did You Feel It?” (DYFI) system and peak ground acceleration (PGA) values from instrumental data. Because the 2016 hazard model was heavily based on earthquake catalogs from 2014 to 2015, this initial comparison utilized observations from these years. Annualized exceedance rates were calculated with the DYFI and instrumental data for direct comparison with the model. These comparisons required assessment of the options for converting hazard model results and instrumental data from PGA to MMI for comparison with the DYFI data. In addition, to account for known differences that affect the comparisons, the instrumental PGA and DYFI data were declustered, and the hazard model was adjusted for local site conditions. With these adjustments, examples at sites with the most data show reasonable agreement in the exceedance rates. However, the comparisons were complicated by the spatial and temporal completeness of the instrumental and DYFI observations. Furthermore, most of the DYFI responses are in the MMI II–IV range, whereas the hazard model is oriented toward forecasts at higher ground‐motion intensities, usually above about MMI IV. Nevertheless, the study demonstrates some of the issues that arise in making these comparisons, thereby informing future efforts to ground‐truth and improve hazard modeling for induced‐seismicity applications.
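
    The annualized exceedance rates compared in studies like this are, at bottom, threshold counts normalized by observation time; a minimal sketch with hypothetical values (not data from the paper):

      import numpy as np

      def annualized_exceedance_rate(pga_obs, threshold, years):
          """Rate (per year) at which observed PGA values exceed a threshold."""
          return np.count_nonzero(np.asarray(pga_obs) > threshold) / years

      pga_obs = [0.02, 0.11, 0.05, 0.31, 0.08, 0.19]   # g, hypothetical observations
      print(annualized_exceedance_rate(pga_obs, threshold=0.1, years=2.0))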

  1. The VO-Dance web application at the IA2 data center

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo

    2012-09-01

    The Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible about adding new instruments from other telescopes. VO resource publishing, along with the data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, the need arises for tools to easily accomplish data ingestion and data publishing. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamical way from already existing database tables or views. The tool consists of a Java web application, potentially DBMS and platform independent, that stores the services' metadata and information internally, exposes RESTful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries coherent with the published table or view. In response to each call, VO-Dance translates the database answer back into a VO-compliant form.
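
    The endpoint-to-SQL translation can be pictured as parameter mapping over stored service metadata; a purely hypothetical sketch (table and column names invented, not VO-Dance's actual code, and a real cone search would use spherical rather than this naive flat-sky geometry):

      # Hypothetical sketch: translate a cone-search-style request into SQL
      # against a published table, using stored column metadata.
      service_meta = {"table": "obs_catalog", "ra_col": "ra", "dec_col": "dec"}

      def cone_search_sql(meta, ra, dec, radius):
          """Build a naive, flat-sky cone-search query for the published table."""
          return (
              f"SELECT * FROM {meta['table']} "
              f"WHERE POWER({meta['ra_col']} - {ra}, 2) + "
              f"POWER({meta['dec_col']} - {dec}, 2) <= POWER({radius}, 2)"
          )

      print(cone_search_sql(service_meta, ra=150.1, dec=2.2, radius=0.05))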

  2. The HYDICE instrument design and its application to planetary instruments

    NASA Technical Reports Server (NTRS)

    Basedow, R.; Silverglate, P.; Rappoport, W.; Rockwell, R.; Rosenberg, D.; Shu, K.; Whittlesey, R.; Zalewski, E.

    1993-01-01

    The Hyperspectral Digital Imagery Collection Experiment (HYDICE) instrument represents a significant advance in the state of the art in hyperspectral sensors. It combines a higher signal-to-noise ratio (SNR) with significantly better spatial and spectral resolution and radiometric accuracy than systems flying on aircraft today. The need for 'clean' data, i.e., data free of sampling artifacts and excessive spatial or spectral noise, is a key driver behind the difficult combination of performance requirements laid out for HYDICE. Most of these involve the sensor optics and detector. This paper presents an optimized approach to those requirements, one that comprises pushbroom scanning, a single, mechanically cooled focal plane, a double-pass prism spectrometer, and an easily fabricated yet wide-field telescope. Central to the approach is a detector array that covers the entire spectrum from 0.4 to 2.5 microns. Among the major benefits conferred by such a design are optical and mechanical simplicity, low polarization sensitivity, and coverage of the entire spectrum without the spectral gaps caused by beam splitters. The overall system minimizes interfaces to the C-141 aircraft on which it will be flown, can be calibrated on the ground and in flight to accuracies better than those required, and is designed for simple, push-button operation. Only unprocessed data are recorded during flight. A ground data processing station provides quick-look, calibration correction, and archiving capabilities, with a throughput better than the requirements. Overall performance of the system is expected to provide the solid database required to evaluate the potential of hyperspectral imagery in a wide variety of applications. HYDICE can be regarded as a test bed for future planetary instruments. The ability to spectrally image a wide field of view over multiple spectral octaves offers obvious advantages and is expected to maximize science return for the required cost and weight.

  3. The GEISA Spectroscopic Database System in its latest Edition

    NASA Astrophysics Data System (ADS)

    Jacquinet-Husson, N.; Crépeau, L.; Capelle, V.; Scott, N. A.; Armante, R.; Chédin, A.

    2009-04-01

    GEISA (Gestion et Etude des Informations Spectroscopiques Atmosphériques: Management and Study of Spectroscopic Information)[1] is a computer-accessible spectroscopic database system, designed to facilitate accurate forward planetary radiative transfer calculations using a line-by-line and layer-by-layer approach. It was initiated in 1976. Currently, GEISA is involved in activities related to the assessment of the capabilities of IASI (Infrared Atmospheric Sounding Interferometer, on board the METOP European satellite - http://earth-sciences.cnes.fr/IASI/) through the GEISA/IASI database[2] derived from GEISA. Since the Metop (http://www.eumetsat.int) launch (October 19th 2006), GEISA/IASI has been the reference spectroscopic database for the validation of the level-1 IASI data, using the 4A radiative transfer model[3] (4A/LMD http://ara.lmd.polytechnique.fr; 4A/OP co-developed by LMD and Noveltis with the support of CNES). GEISA is also involved in planetary research, e.g., modelling of Titan's atmosphere and comparison with observations performed by Voyager (http://voyager.jpl.nasa.gov/), by ground-based telescopes, and by the instruments on board the Cassini-Huygens mission (http://www.esa.int/SPECIALS/Cassini-Huygens/index.html). The updated 2008 edition of GEISA (GEISA-08), a system comprising three independent sub-databases devoted, respectively, to line transition parameters, infrared and ultraviolet/visible absorption cross-sections, and microphysical and optical properties of atmospheric aerosols, will be described. Spectroscopic parameter quality requirements will be discussed in the context of comparisons between observed or simulated spectra of the Earth's and other planetary atmospheres. GEISA is implemented on the CNES/CNRS Ether Products and Services Centre web site (http://ether.ipsl.jussieu.fr), where all archived spectroscopic data can be handled through general and user-friendly associated management software facilities. More than 350 researchers are registered for on-line use of GEISA. Refs: 1. Jacquinet-Husson N., N.A. Scott, A. Chédin, L. Crépeau, R. Armante, V. Capelle, J. Orphal, A. Coustenis, C. Boonne, N. Poulet-Crovisier, et al. The GEISA spectroscopic database: Current and future archive for Earth and planetary atmosphere studies. JQSRT, 109, 1043-1059, 2008. 2. Jacquinet-Husson N., N.A. Scott, A. Chédin, K. Garceran, R. Armante, et al. The 2003 edition of the GEISA/IASI spectroscopic database. JQSRT, 95, 429-467, 2005. 3. Scott, N.A. and A. Chedin, 1981: A fast line-by-line method for atmospheric absorption computations: The Automatized Atmospheric Absorption Atlas. J. Appl. Meteor., 20, 556-564.
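
    A line-by-line calculation of the kind GEISA supports sums the contributions of individual transitions at each wavenumber; a minimal sketch with Lorentz profiles (illustrative line parameters, not GEISA data, and real calculations also use Doppler/Voigt shapes):

      import numpy as np

      def absorption_coefficient(nu, line_centers, intensities, gammas):
          """k(nu) = sum_i S_i * L(nu; nu_i, gamma_i), with Lorentz profile
          L = (gamma / pi) / ((nu - nu_i)**2 + gamma**2)."""
          nu = np.asarray(nu)[:, None]          # (n_grid, 1) against (n_lines,)
          lor = (gammas / np.pi) / ((nu - line_centers) ** 2 + gammas ** 2)
          return (intensities * lor).sum(axis=1)

      grid = np.linspace(1000.0, 1010.0, 2001)   # wavenumber grid (cm^-1)
      k = absorption_coefficient(grid,
                                 line_centers=np.array([1003.0, 1007.5]),
                                 intensities=np.array([1.0, 0.6]),
                                 gammas=np.array([0.08, 0.05]))
      print(k.max())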

  4. An Update of the Bodeker Scientific Vertically Resolved, Global, Gap-Free Ozone Database

    NASA Astrophysics Data System (ADS)

    Kremser, S.; Bodeker, G. E.; Lewis, J.; Hassler, B.

    2016-12-01

    High vertical resolution ozone measurements from multiple satellite-based instruments have been merged with measurements from the global ozonesonde network to calculate monthly mean ozone values in 5° latitude zones. Ozone number densities and ozone mixing ratios are provided on 70 altitude levels (1 to 70 km) and on 70 pressure levels spaced approximately 1 km apart (878.4 hPa to 0.046 hPa). These data are sparse and do not cover the entire globe or altitude range. To provide a gap-free database, a least squares regression model is fitted to these data and then evaluated globally. By applying a single fit at each level, and using the approach of allowing the regression fits to change only slightly from one level to the next, the regression is less sensitive to measurement anomalies at individual stations or to individual satellite-based instruments. Particular attention is paid to ensuring that the low ozone abundances in the polar regions are captured. This presentation reports on updates to an earlier version of the vertically resolved ozone database, including the incorporation of new ozone measurements and new techniques for combining the data. Compared to previous versions of the database, particular attention is paid to avoiding spatial and temporal sampling biases and tracing uncertainties through to the final product. This updated database, developed within the New Zealand Deep South National Science Challenge, is suitable for assessing ozone fields from chemistry-climate model simulations or for providing the ozone boundary conditions for global climate model simulations that do not treat stratospheric chemistry interactively.
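
    The level-to-level constraint described above can be implemented as a penalty pulling each level's coefficients toward the level below; a minimal ridge-style sketch (this formulation, and all variable names and synthetic data, are assumptions for illustration, not the published scheme):

      import numpy as np

      def fit_level(X, y, beta_prev, lam):
          """Least squares for one altitude level with a penalty
          lam * ||beta - beta_prev||^2 tying it to the previous level:
          beta = (X'X + lam I)^-1 (X'y + lam beta_prev)."""
          k = X.shape[1]
          A = X.T @ X + lam * np.eye(k)
          b = X.T @ y + lam * beta_prev
          return np.linalg.solve(A, b)

      rng = np.random.default_rng(3)
      X = rng.normal(size=(240, 4))               # 20 years of monthly basis terms
      y_levels = [X @ rng.normal(size=4) + rng.normal(scale=0.1, size=240)
                  for _ in range(70)]             # synthetic ozone at 70 levels

      beta, betas = np.zeros(4), []
      for y in y_levels:                          # sweep upward level by level
          beta = fit_level(X, y, beta_prev=beta, lam=10.0)
          betas.append(beta)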

  5. Long-term data archiving

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, David Steven

    2009-01-01

    Long-term data archiving has much value for chemists, not only to retain access to research and product development records, but also to enable new developments and new discoveries. There are some recent regulatory requirements (e.g., FDA 21 CFR Part 11), but good science and good business both benefit regardless. A particular example of the benefits of and need for long-term data archiving is the management of data from spectroscopic laboratory instruments. The sheer amount of spectroscopic data is increasing at a scary rate, and the pressures to archive come from the expense to create the data (or recreate it if it is lost) as well as its high information content. The goal of long-term data archiving is to save and organize instrument data files as well as any needed metadata (such as sample ID, LIMS information, operator, date, time, instrument conditions, sample type, excitation details, environmental parameters, etc.). This editorial explores the issues involved in long-term data archiving using the example of Raman spectral databases. There are at present several such databases, including common data format libraries and proprietary libraries. However, such databases and libraries should ultimately satisfy stringent criteria for long-term data archiving, including readability for long times into the future, robustness to changes in computer hardware and operating systems, and use of public domain data formats. The latter criterion implies the data format should be platform independent and the tools to create the data format should be easily and publicly obtainable or developable. Several examples of attempts at spectral libraries exist, such as the ASTM ANDI format and the JCAMP-DX format. On the other hand, proprietary library spectra can be exchanged and manipulated using proprietary tools. As the above examples have deficiencies according to the three long-term data archiving criteria, Extensible Markup Language (XML; a product of the World Wide Web Consortium, an independent standards body) is being investigated and implemented as a new data interchange tool. In order to facilitate data archiving, Raman data needs calibration as well as some other kinds of data treatment. Figure 1 illustrates schematically the present situation for Raman data calibration in the world-wide Raman spectroscopy community, and presents some of the terminology used.
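
    The XML-based archiving idea can be sketched in a few lines; a minimal illustration (element names and values are invented for the sketch, not a standardized schema such as JCAMP-DX or ANDI):

      import xml.etree.ElementTree as ET

      # Archive one Raman spectrum with its metadata in a self-describing,
      # platform-independent XML file (hypothetical element names).
      spectrum = ET.Element("RamanSpectrum")
      meta = ET.SubElement(spectrum, "Metadata")
      ET.SubElement(meta, "SampleID").text = "SAMPLE-0001"
      ET.SubElement(meta, "ExcitationWavelengthNm").text = "785"
      ET.SubElement(meta, "Operator").text = "jdoe"
      data = ET.SubElement(spectrum, "Data", units="cm-1,counts")
      data.text = "100.0,523 101.0,531 102.0,540"

      ET.ElementTree(spectrum).write("spectrum0001.xml",
                                     encoding="utf-8", xml_declaration=True)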

  6. Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems

    NASA Astrophysics Data System (ADS)

    Dogan, Firat; Atilgan, Yasemin

    The virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities at the information technology level. The modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization, and all the other strong aspects of information technology bring the necessary instruments to almost every desk. The last decade's special software and sophisticated supercomputer environments are now serving individual needs inside “tiny smart boxes” at reasonable prices. However, resistance to learning new computerized environments, insufficient training, and all the other old habits prevent effective utilization of IT resources by the specialists of the health sector. In this paper, all the aspects of former and current developments in surgery planning and simulation-related tools are presented, and future directions and expectations for better electronic health care systems are investigated.

  7. SIRTF Tools for DIRT

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.

    2004-07-01

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS {http://dustem.astro.umd.edu}) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF. The models are based on the dust radiation transfer code of Wolfire & Cassinelli (1986) which accounts for multiple grain sizes and compositions. The model outputs are averaged over the instrument bands using the same weighting (νFν = constant) as the SIRTF data pipeline which allows the SIRTF data products to be compared directly with the model database. This work was supported in part by a NASA AISRP grant NAG 5-10751 and the SIRTF Legacy Science Program provided by NASA through an award issued by JPL under NASA contract 1407.

  8. Valve Health Monitoring System Utilizing Smart Instrumentation

    NASA Technical Reports Server (NTRS)

    Jensen, Scott L.; Drouant, George J.

    2006-01-01

    The valve monitoring system is a stand-alone unit with network capabilities for integration into a higher-level health management system. The system is designed to aid in failure predictions of high-geared ball valves and linearly actuated valves. It performs data tracking and archiving for identifying degraded performance. The data collection types are cryogenic cycles, total cycles, inlet temperature, body temperature, torsional strain, linear bonnet strain, preload position, total travel, and total directional changes. Events are recorded and time stamped in accordance with the IRIG B True Time. The monitoring system is designed for use in a Class 1 Division II explosive environment. The basic configuration consists of several instrumentation sensor units and a base station. The sensor units are self-contained, microprocessor controlled, and remotely mountable, measuring three by three by two inches. Each unit is potted in a fire-retardant substance without any cavities and limited to low operating power for maintaining safe operation in a hydrogen environment. The units are temperature monitored to safeguard against operation outside temperature limitations. Each contains 902-928 MHz band digital transmitters which meet Federal Communications Commission requirements and are limited to a 35-foot transmission radius for preserving data security. The base-station controller correlates data from the sensor units and generates data event logs on a compact flash memory module for database uploading. The entries are also broadcast over an Ethernet network. Nitrogen-purged National Electrical Manufacturers Association (NEMA) Class 4 enclosures are used to house the base station.

  9. Valve health monitoring system utilizing smart instrumentation

    NASA Astrophysics Data System (ADS)

    Jensen, Scott L.; Drouant, George J.

    2006-05-01

    The valve monitoring system is a stand-alone unit with network capabilities for integration into a higher-level health management system. The system is designed to aid in failure predictions of high-geared ball valves and linearly actuated valves. It performs data tracking and archiving for identifying degraded performance. The data collection types are: cryogenic cycles, total cycles, inlet temperature, outlet temperature, body temperature, torsional strain, linear bonnet strain, preload position, total travel, and total directional changes. Events are recorded and time stamped in accordance with the IRIG B True Time. The monitoring system is designed for use in a Class 1 Division II explosive environment. The basic configuration consists of several instrumentation sensor units and a base station. The sensor units are self-contained, microprocessor controlled, and remotely mountable, measuring three by three by two inches. Each unit is potted in a fire-retardant substance without any cavities and limited to low operating power for maintaining safe operation in a hydrogen environment. The units are temperature monitored to safeguard against operation outside temperature limitations. Each contains 902-928 MHz band digital transmitters which meet Federal Communications Commission requirements and are limited to a 35-foot transmission radius for preserving data security. The base-station controller correlates related data from the sensor units and generates data event logs on a compact flash memory module for database uploading. The entries are also broadcast over an Ethernet network. Nitrogen-purged National Electrical Manufacturers Association (NEMA) Class 4 enclosures are used to house the base station.

  10. Glacier Land Ice Measurements from Space (GLIMS) and the GLIMS Information Management System at NSIDC

    NASA Astrophysics Data System (ADS)

    Machado, A. E.; Scharfen, G. R.; Barry, R. G.; Khalsa, S. S.; Raup, B.; Swick, R.; Troisi, V. J.; Wang, I.

    2001-12-01

    GLIMS (Global Land Ice Measurements from Space) is an international project to survey a majority of the world's glaciers with the accuracy and precision needed to assess recent changes and determine trends in glacial environments. This will be accomplished by: comprehensive periodic satellite measurements, coordinated distribution of screened image data, analysis of images at worldwide Regional Centers, validation of analyses, and a publicly accessible database. The primary data sources will be the ASTER (Advanced Spaceborne Thermal Emission and reflection Radiometer) instrument aboard the EOS Terra spacecraft and Landsat ETM+ (Enhanced Thematic Mapper Plus), currently in operation. Approximately 700 ASTER images have been acquired with GLIMS gain settings as of mid-2001. GLIMS is a collaborative effort of the United States Geological Survey (USGS), the National Aeronautics and Space Administration (NASA), other U.S. Federal Agencies, and a group of internationally distributed glaciologists at Regional Centers of expertise. The National Snow and Ice Data Center (NSIDC) is developing the information management system for GLIMS. We will ingest and maintain GLIMS-analyzed glacier data from Regional Centers and provide access to the data via the World Wide Web. The GLIMS database will include measurements (over time) of glacier length, area, boundaries, topography, surface velocity vectors, and snowline elevation, derived primarily from remote sensing data. The GLIMS information management system at NSIDC will provide an easy-to-use and widely accessible service for the glaciological community and other users needing information about the world's glaciers. The structure of the international GLIMS consortium, the status of database development, sample imagery and derived analyses, and the user search and order interfaces will be demonstrated. More information on GLIMS is available at: http://www.glims.org/.

  11. Implementation of the CUAHSI information system for regional hydrological research and workflow

    NASA Astrophysics Data System (ADS)

    Bugaets, Andrey; Gartsman, Boris; Bugaets, Nadezhda; Krasnopeyev, Sergey; Krasnopeyeva, Tatyana; Sokolov, Oleg; Gonchukov, Leonid

    2013-04-01

    Environmental research and education have become increasingly data-intensive as a result of the proliferation of digital technologies, instrumentation, and pervasive networks through which data are collected, generated, shared, and analyzed. Over the next decade, it is likely that science and engineering research will produce more scientific data than has been created over the whole of human history (Cox et al., 2006). Successfully using these data to achieve new scientific breakthroughs depends on the ability to access, organize, integrate, and analyze these large datasets. The new project of PGI FEB RAS (http://tig.dvo.ru), FERHRI (www.ferhri.org) and Primgidromet (www.primgidromet.ru) is focused on the creation of an open, unified hydrological information system, following international standards, to support hydrological investigations, water management and forecast systems. Within the hydrologic science community, the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (http://his.cuahsi.org) has been developing a distributed network of data sources and functions that are integrated using web services and that provide access to data, tools, and models that enable synthesis, visualization, and evaluation of hydrologic system behavior. On top of the CUAHSI technologies, the first two template databases were developed for primary datasets of special observations on experimental basins in the Far East Region of Russia. The first database contains data from special observations performed at the former (1957-1994) Primorskaya Water-Balance Station (1500 km2). Measurements were carried out at 20 hydrological and 40 rain gauging stations and were published as special series, but only as hardcopy books. The database provides raw data from loggers with hourly and daily time support. The second database, called «FarEastHydro», provides published standard daily measurements from the Roshydromet observation network (200 hydrological and meteorological stations) for the period from 1930 through 1990. Both data resources are maintained in a test mode at the project site http://gis.dvo.ru:81/, which is permanently updated. After this first success, the decision was made to use the CUAHSI technology as a basis for the development of a hydrological information system to support data publishing and the workflow of Primgidromet, the regional office of the Federal State Hydrometeorological Agency. At the moment, the Primgidromet observation network is equipped with 34 automatic SEBA hydrological pressure sensor pneumatic gauges PS-Light-2 and 36 automatic SEBA weather stations. Large datasets generated by the sensor networks are organized and stored within a central ODM database, which allows the data to be unambiguously interpreted with sufficient metadata and provides a traceable heritage from raw measurements to usable information. Organization of the data within a central CUAHSI ODM database was the most critical step, with several important implications. This technology is widespread and well documented, and it ensures that all datasets are publicly available and readily used by other investigators and developers to support additional analyses and hydrological modeling. Implementation of ODM within a Relational Database Management System eliminates potential data manipulation errors and intermediate data processing steps. Wrapping the CUAHSI WaterOneFlow web service into an OpenMI 2.0 linkable component (www.openmi.org) allows seamless integration with well-known hydrological modeling systems.

  12. Addition of a breeding database in the Genome Database for Rosaceae

    PubMed Central

    Evans, Kate; Jung, Sook; Lee, Taein; Brutcher, Lisa; Cho, Ilhyung; Peace, Cameron; Main, Dorrie

    2013-01-01

    Breeding programs produce large datasets that require efficient management systems to keep track of performance, pedigree, geographical and image-based data. With the development of DNA-based screening technologies, more breeding programs perform genotyping in addition to phenotyping for performance evaluation. The integration of breeding data with other genomic and genetic data is instrumental for the refinement of marker-assisted breeding tools, enhances genetic understanding of important crop traits and maximizes access and utility by crop breeders and allied scientists. Development of new infrastructure in the Genome Database for Rosaceae (GDR) was designed and implemented to enable secure and efficient storage, management and analysis of large datasets from the Washington State University apple breeding program and subsequently expanded to fit datasets from other Rosaceae breeders. The infrastructure was built using the software Chado and Drupal, making use of the Natural Diversity module to accommodate large-scale phenotypic and genotypic data. Breeders can search accessions within the GDR to identify individuals with specific trait combinations. Results from Search by Parentage list individuals with parents in common, and results from Individual Variety pages link to all data available on each chosen individual, including pedigree, phenotypic and genotypic information. Genotypic data are searchable by markers and alleles; results are linked to other pages in the GDR to enable the user to access tools such as GBrowse and CMap. This breeding database provides users with the opportunity to search datasets in a fully targeted manner and retrieve and compare performance data from multiple selections, years and sites, and to output the data needed for variety release publications and patent applications. The breeding database facilitates efficient program management. Storing publicly available breeding data in a database together with genomic and genetic data will further accelerate the cross-utilization of diverse data types by researchers from various disciplines. Database URL: http://www.rosaceae.org/breeders_toolbox PMID:24247530

  13. Addition of a breeding database in the Genome Database for Rosaceae.

    PubMed

    Evans, Kate; Jung, Sook; Lee, Taein; Brutcher, Lisa; Cho, Ilhyung; Peace, Cameron; Main, Dorrie

    2013-01-01

    Breeding programs produce large datasets that require efficient management systems to keep track of performance, pedigree, geographical and image-based data. With the development of DNA-based screening technologies, more breeding programs perform genotyping in addition to phenotyping for performance evaluation. The integration of breeding data with other genomic and genetic data is instrumental for the refinement of marker-assisted breeding tools, enhances genetic understanding of important crop traits and maximizes access and utility by crop breeders and allied scientists. Development of new infrastructure in the Genome Database for Rosaceae (GDR) was designed and implemented to enable secure and efficient storage, management and analysis of large datasets from the Washington State University apple breeding program and subsequently expanded to fit datasets from other Rosaceae breeders. The infrastructure was built using the software Chado and Drupal, making use of the Natural Diversity module to accommodate large-scale phenotypic and genotypic data. Breeders can search accessions within the GDR to identify individuals with specific trait combinations. Results from Search by Parentage list individuals with parents in common, and results from Individual Variety pages link to all data available on each chosen individual, including pedigree, phenotypic and genotypic information. Genotypic data are searchable by markers and alleles; results are linked to other pages in the GDR to enable the user to access tools such as GBrowse and CMap. This breeding database provides users with the opportunity to search datasets in a fully targeted manner and retrieve and compare performance data from multiple selections, years and sites, and to output the data needed for variety release publications and patent applications. The breeding database facilitates efficient program management. Storing publicly available breeding data in a database together with genomic and genetic data will further accelerate the cross-utilization of diverse data types by researchers from various disciplines. Database URL: http://www.rosaceae.org/breeders_toolbox.

  14. Design of real-time communication system for image recognition based colony picking instrument

    NASA Astrophysics Data System (ADS)

    Wang, Qun; Zhang, Rongfu; Yan, Hua; Wu, Huamin

    2017-11-01

    In order to achieve automated observation and picking of monoclonal colonies, an overall design and realization of a real-time communication system based on a high-throughput monoclonal auto-picking instrument is proposed. The real-time communication system is composed of a PC-PLC communication system and a Central Control Computer (CCC)-PLC communication system. A set of dedicated short-range communication protocols between the PC and PLC was developed based on the RS232 synchronous serial communication method. Furthermore, the system uses a SQL SERVER database to realize data interaction between the PC and the CCC. Moreover, the communication between the CCC and the PC adopts Socket Ethernet communication based on the TCP/IP protocol. A TCP full-duplex data channel ensures real-time data exchange and improves system reliability and security. We tested the communication system using specially developed test software; the test results show that the system can realize communication between the PLC, PC and CCC in an efficient, safe and stable way, maintaining real-time control of the PLC and colony information collection.
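
    The TCP full-duplex exchange between the PC and CCC can be pictured with standard sockets; a minimal, runnable loopback sketch (the command string and the echo server are invented stand-ins, since the instrument's actual command set and protocol are not given here):

      import socket
      import threading

      def echo_server(srv):
          """Accept one connection and acknowledge whatever arrives."""
          conn, _ = srv.accept()
          with conn:
              data = conn.recv(4096)
              conn.sendall(b"ACK:" + data)       # reply on the same channel

      srv = socket.socket()
      srv.bind(("127.0.0.1", 0))                 # loopback stand-in for the CCC
      srv.listen(1)
      threading.Thread(target=echo_server, args=(srv,), daemon=True).start()

      with socket.create_connection(srv.getsockname()) as pc:
          pc.sendall(b"GET_COLONY_STATUS\n")      # hypothetical command
          print(pc.recv(4096))                    # b'ACK:GET_COLONY_STATUS\n'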

  15. Post endodontic pain following single-visit root canal preparation with rotary vs reciprocating instruments: a meta-analysis of randomized clinical trials.

    PubMed

    Hou, Xiao-Mei; Su, Zheng; Hou, Ben-Xiang

    2017-05-25

    In endodontic therapy, continuous rotary instrumentation reduces debris compared to reciprocal instrumentation, which might affect the incidence of post-endodontic pain (PP). The aim of our study was to assess whether PP incidence and levels were influenced by the choice of rotary or reciprocating instruments. In this meta-analysis, the PubMed and EM databases were searched for prospective clinical randomized trials published before April 20, 2016, using combinations of the keywords: root canal preparation/instrumentation/treatment/therapy; post-operative/endodontic pain; reciprocal and rotary instruments. Three studies were included, involving a total of 1,317 patients: 659 treated with reciprocating instruments and 658 treated with rotary instruments. PP was reported in 139 patients in the reciprocating group and 172 in the rotary group. The PP incidence odds ratio was 1.27 with a 95% confidence interval (CI) of (0.25, 6.52), favoring rotary instruments. The odds ratios for mild, moderate and severe PP levels were 0.31 (0.11, 0.84), 2.24 (0.66, 7.59) and 11.71 (0.63, 218.15), respectively. No evidence of publication bias was found. Rotary instrument choice in endodontic therapy is associated with a lower incidence of PP than reciprocating instruments, while reciprocating instruments are associated with a lower incidence of mild PP.
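
    Pooling odds ratios across trials of this kind is commonly done by inverse-variance weighting of log odds ratios; a minimal fixed-effect sketch (the 2x2 counts are hypothetical, and the paper's own pooling model may differ):

      import numpy as np

      def pooled_or(tables):
          """Fixed-effect pooled odds ratio from (a, b, c, d) 2x2 counts:
          a/b = events/non-events in group 1, c/d in group 2."""
          log_ors, weights = [], []
          for a, b, c, d in tables:
              log_or = np.log((a * d) / (b * c))
              var = 1/a + 1/b + 1/c + 1/d          # variance of the log OR
              log_ors.append(log_or)
              weights.append(1.0 / var)
          return np.exp(np.average(log_ors, weights=weights))

      # Hypothetical per-study counts, not data from the included trials:
      print(pooled_or([(40, 180, 50, 170), (60, 160, 55, 165), (39, 181, 67, 153)]))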

  16. IRIS++ database: Merging of IRIS + Mark-1 + LOWL

    NASA Astrophysics Data System (ADS)

    Salabert, D.; Fossat, E.; Gelly, B.; Tomczyk, S.; Pallé, P.; Jiménez-Reyes, S. J.; Cacciani, A.; Corbard, T.; Ehgamberdiev, S.; Grec, G.; Hoeksema, J. T.; Kholikov, S.; Lazrek, M.; Schmider, F. X.

    2002-08-01

    The IRIS network has been operated continuously since July 1st, 1989. To date, it has acquired more than a complete solar cycle of full-disk helioseismic data, which have been used to constrain the structure and rotation of the deep solar interior. However, the duty cycle of the network data has never reached initial expectations. To improve this situation, several collaborations have been developed with teams collecting observations with similar instruments. This paper demonstrates that we are able to merge data from these different instruments in a consistent manner, resulting in a very significant improvement in network duty cycle over more than one solar cycle and initiating what we call the IRIS++ network. The integrated radial velocities from the IRIS++ database (1989 to 1999) are available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/390/717

  17. ICESat-2 / ATLAS Flight Science Receiver Algorithms

    NASA Astrophysics Data System (ADS)

    Mcgarry, J.; Carabajal, C. C.; Degnan, J. J.; Mallama, A.; Palm, S. P.; Ricklefs, R.; Saba, J. L.

    2013-12-01

    NASA's Advanced Topographic Laser Altimeter System (ATLAS) will be the single instrument on the ICESat-2 spacecraft, which is expected to launch in 2016 with a 3-year mission lifetime. The ICESat-2 orbital altitude will be 500 km with a 92 degree inclination and 91-day repeat tracks. ATLAS is a single photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. Without some method of eliminating solar background noise in near real-time, the volume of ATLAS telemetry would far exceed the normal X-band downlink capability. To reduce the data volume to an acceptable level, a set of onboard Receiver Algorithms has been developed. These Algorithms limit the daily data volume by distinguishing surface echoes from the background noise and allow the instrument to telemeter only a small vertical region about the signal. This is accomplished through the use of an onboard Digital Elevation Model (DEM), signal processing techniques, and an onboard relief map. Similar to what was flown on the ATLAS predecessor GLAS (Geoscience Laser Altimeter System), the DEM provides minimum and maximum heights for each 1 degree x 1 degree tile on the Earth. This information allows the onboard algorithm to limit its signal search to the region between minimum and maximum heights (plus some margin for errors). The understanding that the surface echoes will tend to clump while noise will be randomly distributed led us to histogram the received event times. The selection of the signal locations is based on those histogram bins with statistically significant counts. Once the signal location has been established, the onboard Digital Relief Map (DRM) is used to determine the vertical width of the telemetry band about the signal. The ATLAS Receiver Algorithms are nearing completion of the development phase and are currently being tested using a Monte Carlo Software Simulator that models the instrument, the orbit, and the environment. This Simulator makes it possible to check all logic paths that could be encountered by the Algorithms on orbit. In addition, the NASA airborne instrument MABEL is collecting data with characteristics similar to what ATLAS will see. MABEL data are being used to test the ATLAS Receiver Algorithms. Further verification will be performed during Integration and Testing of the ATLAS instrument and during Environmental Testing on the full ATLAS instrument. Results from testing to date show the Receiver Algorithms have the ability to handle a wide range of signal and noise levels with very good sensitivity at relatively low signal-to-noise ratios. In addition, preliminary tests have demonstrated, using the ICESat-2 Science Team's selected land ice and sea ice test cases, the capability of the Algorithms to successfully find and telemeter the surface echoes. In this presentation we will describe the ATLAS Flight Science Receiver Algorithms and the Software Simulator, and will present results of the testing to date. The onboard databases (DEM, DRM and the Surface Reference Mask) are being developed at the University of Texas at Austin as part of the ATLAS Flight Science Receiver Algorithms. Verification of the onboard databases is being performed by ATLAS Receiver Algorithms team members Claudia Carabajal and Jack Saba.
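
    The histogram-based signal finding described above can be sketched simply: bin the photon events and keep bins that stand out against the background; a minimal illustration (the median-plus-k-sigma threshold rule and all numbers are assumptions for the sketch, not the flight algorithm):

      import numpy as np

      def find_signal_bins(heights, bin_width, k=5.0):
          """Histogram photon event heights (m) and flag bins whose counts
          exceed the background level by k standard deviations, using a
          Poisson estimate of the background scatter."""
          edges = np.arange(heights.min(), heights.max() + bin_width, bin_width)
          counts, _ = np.histogram(heights, bins=edges)
          bg = np.median(counts)                  # robust background estimate
          signal = counts > bg + k * np.sqrt(max(bg, 1.0))
          return edges[:-1][signal], counts[signal]

      rng = np.random.default_rng(4)
      noise = rng.uniform(0.0, 500.0, size=20000)       # solar background events
      surface = rng.normal(120.0, 0.5, size=400)        # clumped surface echoes
      print(find_signal_bins(np.concatenate([noise, surface]), bin_width=2.0))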

  18. Measuring quality of life of people with predementia and dementia and their caregivers: a systematic review protocol

    PubMed Central

    Landeiro, Filipa; Walsh, Katie; Ghinai, Isaac; Mughal, Seher; Nye, Elsbeth; Wace, Helena; Roberts, Nia; Lecomte, Pascal; Wittenberg, Raphael; Wolstenholme, Jane; Handels, Ron; Roncancio-Diaz, Emilse; Potashman, Michele H; Tockhorn-Heidenreich, Antje

    2018-01-01

    Introduction Dementia is the fastest growing major cause of disability globally and may have a profound impact on the health-related quality of life (HRQoL) of both the patient with dementia and those who care for them. This review aims to systematically identify and synthesise the measurements of HRQoL for people with dementia, and for their caregivers, across the full spectrum of the disease, from its preceding stage of predementia to end of life. Methods and analysis A systematic literature review was conducted in Medical Literature Analysis and Retrieval System Online, Excerpta Medica dataBASE, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, Database of Abstracts of Reviews of Effect, National Health Service Economic Evaluation Database and PsycINFO between January 1990 and the end of April 2017. Two reviewers will independently assess each study for inclusion and disagreements will be resolved by a third reviewer. Data will be extracted using a predefined data extraction form following best practice. Study quality will be assessed with the Effective Public Health Practice Project quality assessment tool. HRQoL measurements will be presented separately for people with dementia and caregivers by instrument used and, when possible, HRQoL will be reported by disease type and stage of the disease. Descriptive statistics of the results will be provided. A narrative synthesis of studies will also be provided discussing differences in HRQoL measurements by instrument used to estimate it, type of dementia and disease severity. Ethics and dissemination This systematic literature review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a related peer-reviewed journal and presented at conferences. They will also contribute to the work developed in the Real World Outcomes across the Alzheimer’s disease spectrum for better care: multimodal data access platform (ROADMAP). Trial registration number CRD42017071416. PMID:29602838

  19. Global Tsunami Database: Adding Geologic Deposits, Proxies, and Tools

    NASA Astrophysics Data System (ADS)

    Brocko, V. R.; Varner, J.

    2007-12-01

    A result of collaboration between NOAA's National Geophysical Data Center (NGDC) and the Cooperative Institute for Research in Environmental Sciences (CIRES), the Global Tsunami Database includes instrumental records, human observations, and now, information inferred from the geologic record. Deep Ocean Assessment and Reporting of Tsunamis (DART) data, historical reports, and information gleaned from published tsunami deposit research build a multi-faceted view of tsunami hazards and their history around the world. Tsunami history provides clues to what might happen in the future, including frequency of occurrence and maximum wave heights. However, instrumental and written records commonly span too little time to reveal the full range of a region's tsunami hazard. The sedimentary deposits of tsunamis, identified with the aid of modern analogs, increasingly complement instrumental and human observations. By adding the component of tsunamis inferred from the geologic record, the Global Tsunami Database extends the record of tsunamis backward in time. Deposit locations, their estimated ages, and descriptions of the deposits themselves fill in the tsunami record. Tsunamis inferred from proxies, such as evidence for coseismic subsidence, are included to estimate recurrence intervals, but are flagged to highlight the absence of a physical deposit. Authors may submit their own descriptions and upload digital versions of publications. Users may sort by any populated field, including event, location, region, age of deposit, author, publication type (for example, to extract information from peer-reviewed publications only), grain size, composition, and presence/absence of plant material. Users may find tsunami deposit references for a given location, event or author; search for particular properties of tsunami deposits; and even identify potential collaborators. Users may also download public-domain documents. Data and information may be viewed using tools designed to extract and display data from the Oracle database (selection forms, Web Map Services, and Web Feature Services). In addition, the historic tsunami archive (along with related earthquakes and volcanic eruptions) is available in KML (Keyhole Markup Language) format for use with Google Earth and similar geo-viewers.
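
    As a rough illustration of the field-based filtering the record describes, the sketch below uses SQLite and an invented schema; the production system is an Oracle database whose actual table and column names are not given here.

    ```python
    import sqlite3

    conn = sqlite3.connect("tsunami_deposits.db")
    # Hypothetical layout: one row per deposit or proxy record.
    conn.execute("""CREATE TABLE IF NOT EXISTS deposits (
        event TEXT, location TEXT, region TEXT, deposit_age TEXT,
        author TEXT, publication_type TEXT, grain_size TEXT,
        composition TEXT, has_plant_material INTEGER,
        is_proxy INTEGER  -- 1 = inferred from a proxy, no physical deposit
    )""")

    # Peer-reviewed deposit descriptions for one region, with
    # proxy-only entries excluded, sorted by estimated age.
    rows = conn.execute(
        """SELECT event, location, author, grain_size
           FROM deposits
           WHERE region = ? AND publication_type = 'peer-reviewed'
                 AND is_proxy = 0
           ORDER BY deposit_age""",
        ("Cascadia",),
    ).fetchall()
    ```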

  20. Ocean Instruments Web Site for Undergraduate, Secondary and Informal Education

    NASA Astrophysics Data System (ADS)

    Farrington, J. W.; Nevala, A.; Dolby, L. A.

    2004-12-01

    An Ocean Instruments web site has been developed that makes available information about ocean sampling and measurement instruments and platforms. The site features text, pictures, diagrams and background information written or edited by experts in ocean science and engineering and contains links to glossaries and multimedia technologies including video streaming, audio packages, and searchable databases. The site was developed after advisory meetings with selected professors teaching undergraduate classes who responded to the question: What could Woods Hole Oceanographic Institution supply to enhance undergraduate education in ocean sciences, life sciences, and geosciences? Prototypes were developed and tested with students, potential users, and potential contributors. The site is hosted by WHOI. The initial five instruments featured were provided by four WHOI scientists and engineers and by one Sea Education Association faculty member. The site is now open to contributions from scientists and engineers worldwide. The site will not advertise or promote the use of individual ocean instruments.

  1. Progress Toward an IRIS++ Database Open to the Helioseismological Community

    NASA Astrophysics Data System (ADS)

    Gelly, B.; Khalikov, S.; Pallé, P. L.; IRIS Team

    The IRIS network is now fourteen years old, and has continuously been taking data since 1989. The data analysis, which produced some noticeable scientific results, such as the measurement of the ell = 1 rotational splitting and the measurement of the solar acoustic cut-off frequency, was mainly performed with the summer campaign data of 1989 to 1992. P-mode frequency and width tables were recently published using the same subset of the IRIS data. We are now finishing the calibration and the timing of the whole set of IRIS data from 1989 to 1997, which will increase the amount of available data by a factor of 4. The duty cycle of the IRIS network ranges from about 65% over 3 months of the summer campaigns to some 23% over one year in the worst case. To improve our duty cycle we developed several collaborations with other teams running similar instruments: (1) the Mark I instrument, run at the IAC for many years, a potassium resonance single-pixel device, also part of the BiSON network (Elsworth et al., 1988); (2) Alexandro Cacciani's MOF, run at JPL in Pasadena; although this is a sodium resonance imaging instrument, it has been used in 'one pixel' format for several summer seasons since 1989 (Cacciani et al., 1984); (3) the LOWL instrument, a Doppler imager also based on a Magneto-Optical Filter (MOF), operated at the Mauna Loa solar observatory since 1994 (Tomczyk et al., 1995). The merging of those 'alien' data has been carefully addressed at the calibration and timing stages, and we can now present the advantages of such a posteriori collaborations. We endeavour to set up the corresponding database of 'one-pixel seismological data from ground-based instruments' in Nice and to open it to the scientific community of this meeting by the end of 1998. This database will soon have the potential to trace the spectral features of the solar signal over one 11-year cycle.
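
    The duty-cycle gain from merging several stations' coverage can be illustrated with a short sketch. The numbers are illustrative only: real station windows are correlated by daylight and weather, unlike the independent random masks assumed below.

    ```python
    import numpy as np

    def duty_cycle(masks):
        """Fraction of time samples covered by at least one instrument.

        masks : boolean arrays on a common time grid, one per instrument
        (True where a calibrated radial-velocity sample exists).
        """
        return np.logical_or.reduce(masks).mean()

    # Three instruments with ~30% individual coverage on a 1-min grid.
    rng = np.random.default_rng(0)
    masks = [rng.random(60 * 24 * 365) < 0.30 for _ in range(3)]
    print(f"network duty cycle: {duty_cycle(masks):.1%}")   # roughly 66%
    ```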

  2. Closeout of CRADA JSA 2012S004: Chapter 5, Integrated Control System, of the document of the ESS Conceptual Design Report, publicly available at https://europeanspallationsource.se/accelerator-documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satogata, Todd

    2013-04-22

    The integrated control system (ICS) is responsible for the whole ESS machine and facility: accelerator, target, neutron scattering instruments and conventional facilities. This unified approach keeps the costs of development, maintenance and support relatively low. ESS has selected a standardised, field-proven controls framework, the Experimental Physics and Industrial Control System (EPICS), which was originally developed jointly by Argonne and Los Alamos National Laboratories. Complementing this selection are best practices and experience from similar facilities regarding platform standardisation, control system development and device integration and commissioning. The components of ICS include the control system core, the control boxes, the BLED database management system, and the human-machine interface. The control system core is a set of systems and tools that make it possible for the control system to provide required data, information and services to engineers, operators, physicists and the facility itself. The core components are the timing system that makes possible clock synchronisation across the facility, the machine protection system (MPS) and the personnel protection system (PPS) that prevent damage to the machine and personnel, and a set of control system services. Control boxes are servers that control a collection of equipment (for example a radio frequency cavity). The integrated control system will include many control boxes that can be assigned to one supplier, such as an internal team, a collaborating institute or a commercial vendor. This approach facilitates a clear division of responsibilities and makes integration much easier. A control box is composed of a standardised hardware platform, components, development tools and services. On the top level, it interfaces with the core control system components (timing, MPS, PPS) and with the human-machine interface. At the bottom, it interfaces with the equipment and parts of the facility through a set of analog and digital signals, real-time control loops and other communication buses. The ICS central data management system is named BLED (beam line element databases). BLED is a set of databases, tools and services that is used to store, manage and access data. It holds vital control system configuration and physics-related (lattice) information about the accelerator, target and instruments. It facilitates control system configuration by bringing together direct input-output controller (IOC) configuration and real-time data from proton and neutron beam line models. BLED also simplifies development and speeds up the code-test-debug cycle. The set of tools that access BLED will be tailored to the needs of different categories of users, such as ESS staff physicists, engineers, and operators; external partner laboratories; and visiting experimental instrument users. The human-machine interface is vital to providing a high-quality experience to ICS users. It encompasses a wide array of devices and software tools, from control room screens to engineer terminal windows, and from beam physics data tools to post-mortem data analysis tools. It serves users with a wide range of skills from widely varied backgrounds. The Controls Group is developing a set of user profiles to accommodate this diverse range of use cases and users.
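
    EPICS exposes each device setting or reading as a named process variable accessed over Channel Access. A minimal sketch using pyepics, a common Python binding, is shown below; the PV names are invented for illustration and do not follow any actual ESS naming convention.

    ```python
    from epics import PV, caget, caput   # pyepics

    # Read a process variable and write a setpoint (hypothetical names).
    amplitude = caget("RFQ:CAV1:FIELD_AMP")
    caput("RFQ:CAV1:FIELD_SETPOINT", 1.05)

    # Subscribe to changes instead of polling.
    def on_change(pvname=None, value=None, **kwargs):
        print(f"{pvname} changed to {value}")

    pv = PV("RFQ:CAV1:FIELD_AMP", callback=on_change)
    ```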

  3. 2-Micron Coherent Doppler Lidar Instrument Advancements for Tropospheric Wind Measurement

    NASA Technical Reports Server (NTRS)

    Petros, Mulugeta; Singh, U. N.; Yu, J.; Kavaya, M. J.; Koch, G.

    2014-01-01

    Knowledge derived from global tropospheric wind measurement is an important constituent of our overall understanding of climate behavior [1]. Accurate weather prediction saves lives and protects property from destruction. The high-energy 2-micron laser is the transmitter of choice for coherent Doppler wind detection: in addition to being eye-safe, its wavelength suitably matches the aerosol sizes found in the lower troposphere. Although the technology of the 2-micron laser has been maturing steadily, lidar-derived wind data are still missing from the global weather database. In the last decade, researchers at NASA Langley Research Center (LaRC) have been engaged in this endeavor, contributing to the scientific database of 2-micron lidar transmitters. As part of this effort, an in-depth analysis of the physics involved in the workings of Ho:Tm laser systems has been published. In the last few years, we have demonstrated a lidar transmitter with over 1 J of output energy. In addition, a large body of work has been done in characterizing new laser materials and unique crystal configurations to enhance the efficiency and output energy of 2-micron laser systems. At present, 2-micron lidar systems are measuring wind from both ground and airborne platforms. This paper will provide an overview of the advancements made in recent years and the technology maturity levels attained.

  4. A literature review of transmission effectiveness and electromagnetic compatibility in home telemedicine environments to evaluate safety and security.

    PubMed

    Carranza, Noemí; Ramos, Victoria; Lizana, Francisca G; García, Jorge; del Pozo, Alejando; Monteagudo, José Luis

    2010-09-01

    The objective of this study was to identify previously reported cases of transmission/reception failures and interference in order to evaluate the safety and security of new mobile home telemedicine systems. The literature published in the last 10 years (1998-2009) was reviewed by searching several databases. Searches on transmission effectiveness and electromagnetic compatibility were also made manually through journals, conference proceedings, and the healthcare technology assessment agencies' Web pages. Search strategies developed through electronic databases and manual search identified a total of 886 references, with 44 finally being included in the results. They have been divided by technology in the case of the transmission/reception effectiveness studies, and according to the type of medical device in the case of the electromagnetic interference studies. The study reveals that there are numerous publications on telemedicine and home-monitoring systems using wireless networks. However, literature on effectiveness in terms of connectivity and transmission problems and on electromagnetic interference is limited. From the collected studies, it can be concluded that there are transmission failures, low-coverage areas, errors in the transmission of packets, and so on. Moreover, cases of serious interference with medical instruments have also been reported. These facts highlight the lack of studies and of specific recommendations to be followed in the implementation of biomonitoring systems in domestic environments using wireless networks.

  5. INNOVATIVE METHODS FOR EMISSION INVENTORY DEVELOPMENT AND EVALUATION: WORKSHOP SYNTHESIS

    EPA Science Inventory

    Emission inventories are key databases for evaluating, managing, and regulating air pollutants. Refinements and innovations in instruments that measure air pollutants, models that calculate emissions, and techniques for data management and uncertainty assessment are critical to ...

  6. Workshop on Innovative Instrumentation for the In Situ Study of Atmosphere-Surface Interactions on Mars

    NASA Technical Reports Server (NTRS)

    Fegley, Bruce, Jr. (Editor); Waenke, Heinrich (Editor)

    1992-01-01

    The speakers in the first session of the workshop addressed some of the continuing enigmas regarding the atmospheric composition, surface composition, and atmosphere-surface interactions on Mars; provided a description of a database of proposed payloads and instruments for SEI missions that is scheduled to be accessible in 1993; discussed potential uses of atmospheric imaging from landed stations on Mars; and advocated the collection and employment of high-spectral-resolution reflectance and emission data.

  7. Accuracy of Prediction Instruments for Diagnosing Large Vessel Occlusion in Individuals With Suspected Stroke: A Systematic Review for the 2018 Guidelines for the Early Management of Patients With Acute Ischemic Stroke.

    PubMed

    Smith, Eric E; Kent, David M; Bulsara, Ketan R; Leung, Lester Y; Lichtman, Judith H; Reeves, Mathew J; Towfighi, Amytis; Whiteley, William N; Zahuranec, Darin B

    2018-03-01

    Endovascular thrombectomy is a highly efficacious treatment for large vessel occlusion (LVO). LVO prediction instruments, based on stroke signs and symptoms, have been proposed to identify stroke patients with LVO for rapid transport to endovascular thrombectomy-capable hospitals. This evidence review committee was commissioned by the American Heart Association/American Stroke Association to systematically review evidence for the accuracy of LVO prediction instruments. Medline, Embase, and Cochrane databases were searched on October 27, 2016. Study quality was assessed with the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. Thirty-six relevant studies were identified. Most studies (21 of 36) recruited patients with ischemic stroke, with few studies in the prehospital setting (4 of 36) and in populations that included hemorrhagic stroke or stroke mimics (12 of 36). The most frequently studied prediction instrument was the National Institutes of Health Stroke Scale. Most studies had either some risk of bias or unclear risk of bias. Reported discrimination of LVO mostly ranged from 0.70 to 0.85, as measured by the C statistic. In meta-analysis, sensitivity was as high as 87% and specificity was as high as 90%, but no threshold on any instrument predicted LVO with both high sensitivity and specificity. With a positive LVO prediction test, the probability of LVO could be 50% to 60% (depending on the LVO prevalence in the population), but the probability of LVO with a negative test could still be ≥10%. No scale predicted LVO with both high sensitivity and high specificity. Systems that use LVO prediction instruments for triage will miss some patients with LVO and milder stroke. More prospective studies are needed to assess the accuracy of LVO prediction instruments in the prehospital setting in all patients with suspected stroke, including patients with hemorrhagic stroke and stroke mimics. © 2018 American Heart Association, Inc.
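
    The post-test probabilities quoted in this abstract follow from Bayes' theorem given a test's sensitivity, specificity and the LVO prevalence. A worked sketch (the prevalence value is an assumption chosen for illustration):

    ```python
    def post_test_probability(sens, spec, prevalence):
        """P(LVO | positive test) and P(LVO | negative test)."""
        p_pos = sens * prevalence / (
            sens * prevalence + (1 - spec) * (1 - prevalence))
        p_neg = (1 - sens) * prevalence / (
            (1 - sens) * prevalence + spec * (1 - prevalence))
        return p_pos, p_neg

    # Upper-end figures from the review (sens 0.87, spec 0.90) with an
    # assumed 15% prevalence land in the quoted 50-60% range:
    p_pos, p_neg = post_test_probability(0.87, 0.90, 0.15)
    print(f"P(LVO | +) = {p_pos:.0%}, P(LVO | -) = {p_neg:.0%}")  # 61%, 2%
    ```

    At thresholds with lower sensitivity, which make up much of the meta-analytic range, the negative-test probability rises toward the >=10% figure the review reports.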

  8. Adverse Events in Robotic Surgery: A Retrospective Study of 14 Years of FDA Data

    PubMed Central

    Alemzadeh, Homa; Raman, Jaishankar; Leveson, Nancy; Kalbarczyk, Zbigniew; Iyer, Ravishankar K.

    2016-01-01

    Background Use of robotic systems for minimally invasive surgery has rapidly increased during the last decade. Understanding the causes of adverse events and their impact on patients in robot-assisted surgery will help improve systems and operational practices to avoid incidents in the future. Methods By developing an automated natural language processing tool, we performed a comprehensive analysis of the adverse events reported to the publicly available MAUDE database (maintained by the U.S. Food and Drug Administration) from 2000 to 2013. We determined the number of events reported per procedure and per surgical specialty, the most common types of device malfunctions and their impact on patients, and the potential causes for catastrophic events such as patient injuries and deaths. Results During the study period, 144 deaths (1.4% of the 10,624 reports), 1,391 patient injuries (13.1%), and 8,061 device malfunctions (75.9%) were reported. The numbers of injury and death events per procedure have stayed relatively constant (mean = 83.4, 95% confidence interval (CI), 74.2–92.7 per 100,000 procedures) over the years. Surgical specialties for which robots are extensively used, such as gynecology and urology, had lower numbers of injuries, deaths, and conversions per procedure than more complex surgeries, such as cardiothoracic and head and neck (106.3 vs. 232.9 per 100,000 procedures, Risk Ratio = 2.2, 95% CI, 1.9–2.6). Device and instrument malfunctions, such as falling of burnt/broken pieces of instruments into the patient (14.7%), electrical arcing of instruments (10.5%), unintended operation of instruments (8.6%), system errors (5%), and video/imaging problems (2.6%), constituted a major part of the reports. Device malfunctions impacted patients in terms of injuries or procedure interruptions. In 1,104 (10.4%) of all the events, the procedure was interrupted to restart the system (3.1%), to convert the procedure to non-robotic techniques (7.3%), or to reschedule it (2.5%). Conclusions Despite widespread adoption of robotic systems for minimally invasive surgery in the U.S., a non-negligible number of technical difficulties and complications are still being experienced during procedures. Adoption of advanced techniques in design and operation of robotic surgical systems and enhanced mechanisms for adverse event reporting may reduce these preventable incidents in the future. PMID:27097160
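
    The rate comparison in this record is a standard risk ratio with a log-scale confidence interval. A hedged sketch with illustrative counts follows; the study's exact numerators and denominators are not reproduced here.

    ```python
    import math

    def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
        """Risk ratio of group A vs. group B with a 95% CI on the log scale."""
        rr = (events_a / n_a) / (events_b / n_b)
        se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
        return rr, (rr * math.exp(-z * se), rr * math.exp(z * se))

    # Illustrative counts scaled to the reported rates of 232.9 vs.
    # 106.3 events per 100,000 procedures (reported RR = 2.2).
    print(risk_ratio(233, 100_000, 106, 100_000))
    ```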

  9. FastBit: Interactively Searching Massive Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Ahern, Sean; Bethel, E. Wes

    2009-06-23

    As scientific instruments and computer simulations produce more and more data, the task of locating the essential information to gain insight becomes increasingly difficult. FastBit is an efficient software tool to address this challenge. In this article, we present a summary of the key underlying technologies, namely bitmap compression, encoding, and binning. Together these techniques enable FastBit to answer structured (SQL) queries orders of magnitude faster than popular database systems. To illustrate how FastBit is used in applications, we present three examples involving a high-energy physics experiment, a combustion simulation, and an accelerator simulation. In each case, FastBit significantly reduces the response time and enables interactive exploration on terabytes of data.
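
    The binning-plus-bitmap idea can be shown in toy form. The sketch below is not FastBit's API, and it omits the compression and encoding layers that give FastBit its speed; it only illustrates how per-bin bitmaps answer a range query.

    ```python
    import numpy as np

    class BinnedBitmapIndex:
        """Toy binned bitmap index (illustration only, not FastBit)."""

        def __init__(self, values, bin_edges):
            self.bin_edges = np.asarray(bin_edges)
            bin_ids = np.digitize(values, self.bin_edges)
            # One bitmap per bin: bit i is set iff record i falls in it.
            self.bitmaps = [bin_ids == b
                            for b in range(len(self.bin_edges) + 1)]

        def query_at_least(self, threshold):
            """Rows with value >= threshold, assuming the threshold lies
            on a bin edge; a threshold inside a bin would additionally
            need a 'candidate check' against the raw values."""
            first_bin = np.searchsorted(self.bin_edges, threshold) + 1
            mask = np.zeros_like(self.bitmaps[0])
            for bitmap in self.bitmaps[first_bin:]:
                mask |= bitmap
            return np.flatnonzero(mask)

    values = np.array([0.2, 3.1, 7.5, 2.2, 9.9, 5.0])
    index = BinnedBitmapIndex(values, bin_edges=[2.5, 5.0, 7.5])
    print(index.query_at_least(5.0))   # -> rows 2, 4, 5
    ```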

  10. C++, object-oriented programming, and astronomical data models

    NASA Technical Reports Server (NTRS)

    Farris, A.

    1992-01-01

    Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of object-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.
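
    The concepts named in the record (classes, information hiding, inheritance for generalization/specialization) are language-independent. A minimal sketch is given below in Python; the same structure maps directly onto C++ classes with private members and public accessors. The class and attribute names are invented for illustration.

    ```python
    class Observation:
        """Base class: information hiding via a private attribute
        exposed through a read-only property."""
        def __init__(self, target, exposure_s):
            self._exposure_s = exposure_s    # hidden implementation detail
            self.target = target

        @property
        def exposure_s(self):
            return self._exposure_s

    class SpectralObservation(Observation):
        """Specialization: inherits the general interface, adds detail."""
        def __init__(self, target, exposure_s, resolving_power):
            super().__init__(target, exposure_s)
            self.resolving_power = resolving_power

    obs = SpectralObservation("M31", exposure_s=300, resolving_power=20000)
    print(obs.target, obs.exposure_s, obs.resolving_power)
    ```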

  11. Seismic Search Engine: A distributed database for mining large scale seismic data

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Vaidya, S.; Kuzma, H. A.

    2009-12-01

    The International Monitoring System (IMS) of the CTBTO collects terabytes' worth of seismic measurements from many receiver stations situated around the Earth, with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common, events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by the NIH, SSE is intended to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer review and for making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data are collected, identify defective instruments, and guide future sensor placements.
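
    A query like the station-ranking example could be written as a user-defined function over the archive. The sketch below assumes a hypothetical relational layout ('stations', 'events', 'detections'); the actual SSE schema and toolset are not specified in the abstract.

    ```python
    import sqlite3

    def best_stations(conn, station_region, event_region, top_n=5):
        """Rank stations in one region by the fraction of catalogued
        events in another region that they detected."""
        total = conn.execute(
            "SELECT COUNT(*) FROM events WHERE region = ?",
            (event_region,)).fetchone()[0]
        rows = conn.execute(
            """SELECT s.station_id, COUNT(d.event_id) AS detected
               FROM stations s
               JOIN detections d ON d.station_id = s.station_id
               JOIN events e ON e.event_id = d.event_id
               WHERE s.region = ? AND e.region = ?
               GROUP BY s.station_id
               ORDER BY detected DESC
               LIMIT ?""",
            (station_region, event_region, top_n)).fetchall()
        return [(sid, detected / max(total, 1)) for sid, detected in rows]
    ```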

  12. Bias in benefit-risk appraisal in older products: the case of buflomedil for intermittent claudication.

    PubMed

    De Backer, Tine L M; Vander Stichele, Robert H; Van Bortel, Luc M

    2009-01-01

    Benefit-risk assessment should be ongoing during the life cycle of a pharmaceutical agent. New products are subjected to rigorous registration laws and rules, which attempt to assure the availability and validity of evidence. For older products, bias in benefit-risk assessment is more likely, as a number of safeguards were not in place at the time these products were registered. This issue of bias in benefit-risk assessment of older products is illustrated here with an example: buflomedil in intermittent claudication. Data on efficacy were retrieved from a Cochrane systematic review. Data on safety were obtained by comparing the number of reports of serious adverse events and fatalities published in the literature with those reported in postmarketing surveillance databases. In the case of efficacy, the slim basis of evidence for the benefit of buflomedil is undermined by documented publication bias. In the case of safety, bias in reporting to international safety databases is illustrated by the discrepancy between the number of drug-related deaths published in the literature (20), the potentially drug-related deaths in the WHO database (20) and deaths attributed to buflomedil in the database of the international marketing authorization holder (11). In older products, efficacy cannot be evaluated without a thorough search for publication bias. For safety, case reporting of drug-related serious events and deaths in the literature remains a necessary instrument for risk appraisal of older medicines, despite the existence of postmarketing safety databases. The enforcement of efficient communication between healthcare workers, drug companies, national centres of pharmacovigilance, national poison centres and the WHO is necessary to ensure the validity of postmarketing surveillance reporting systems. Drugs considered obsolete because of unfavourable benefit-risk assessment should not be allowed to stay on the market.

  13. The Scientific Uplink and User Support System for SIRTF

    NASA Astrophysics Data System (ADS)

    Heinrichsen, I.; Chavez, J.; Hartley, B.; Mei, Y.; Potts, S.; Roby, T.; Turek, G.; Valjavec, E.; Wu, X.

    The Space Infrared Telescope Facility (SIRTF) is one of NASA's Great Observatory missions, scheduled for launch in 2001. As such, its ground segment design is driven by the requirement to provide strong support for the entire astronomical community, starting with the call for Legacy Proposals in early 2000. In this contribution, we present the astronomical user interface and the design of the server software that comprises the Scientific Uplink System for SIRTF. The software architecture is split into three major parts. The first is a front-end Java application deployed to the astronomical community providing the capabilities to visualize and edit proposals and the associated lists of observations; this observer toolkit provides templates to define all parameters necessary to carry out the required observations, and a specialized version of this software, based on the same overall architecture, is used internally at the SIRTF Science Center to prepare calibration and engineering observations. The second is a Weblogic (TM) based middleware component that brokers the transactions with the servers, astronomical image and catalog sources, as well as the SIRTF operational databases. The third is a set of server systems that perform the necessary computations to obtain resource estimates and target visibilities and to access the instrument models for signal-to-noise calculations. The same server software is used internally at a later stage to derive the detailed command sequences needed by the SIRTF instruments and spacecraft to execute a given observation.

  14. SolarSoft Web Services

    NASA Astrophysics Data System (ADS)

    Freeland, S.; Hurlburt, N.

    2005-12-01

    The SolarSoft system (SSW) is a set of integrated software libraries, databases, and system utilities which provide a common programming and data analysis environment for solar physics. The system includes contributions from a large community base, representing the efforts of many NASA PI MO&DA teams, spanning many years and multiple NASA and international orbital and ground-based missions. The SSW general-use libraries include many hundreds of utilities which are instrument- and mission-independent. A large subset is also solar-independent, including time conversions, digital detector cleanup, time series analysis, mathematics, image display, WWW server communications and the like. PI teams may draw on these general-purpose libraries for analysis and application development while concentrating efforts on instrument-specific calibration issues rather than reinvention of general-use software. By the same token, PI teams are encouraged to contribute new applications or enhancements to existing utilities which may have more general interest. Recent areas of intense evolution include space weather applications, automated distributed data access and analysis, interfaces with the ongoing Virtual Solar Observatory efforts, and externalization of SolarSoft capabilities through Web Services. We will discuss the current status of SSW web services and demonstrate how this facilitates accessing the underlying power of SolarSoft in more abstract terms. In this context, we will describe the use of SSW services within the Collaborative Sun Earth Connector environment.

  15. Evidence-based management of deep wound infection after spinal instrumentation.

    PubMed

    Lall, Rishi R; Wong, Albert P; Lall, Rohan R; Lawton, Cort D; Smith, Zachary A; Dahdaleh, Nader S

    2015-02-01

    In this study, evidence-based medicine is used to assess optimal surgical and medical management of patients with post-operative deep wound infection following spinal instrumentation. A computerized literature search of the PubMed database was performed. Twenty pertinent studies were identified. Studies were separated into publications addressing instrumentation retention versus removal and publications addressing antibiotic therapy regimen. The findings were classified based on level of evidence (I-III) and summarized into evidentiary tables. No level I or II evidence was identified. With regard to surgical management, five studies support instrumentation retention in the setting of early deep infection. In contrast, for delayed infection, the evidence favors removal of instrumentation at the time of initial debridement. Surgeons should be aware that for deformity patients, even if solid fusion is observed, removal of instrumentation may be associated with significant loss of correction. A course of intravenous antibiotics followed by long-term oral suppressive therapy should be pursued if instrumentation is retained. A shorter treatment course may be appropriate if hardware is removed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Safety first: Instrumentality for reaching safety determines attention allocation under threat.

    PubMed

    Vogt, Julia; Koster, Ernst H W; De Houwer, Jan

    2017-04-01

    Theories of attention to emotional information suggest that attentional processes prioritize threatening information. In this article, we suggest that attention will prioritize the events that are most instrumental to a goal in any given context, which in threatening situations is typically reaching safety. To test our hypotheses, we used an attentional cueing paradigm that contained cues signaling imminent threat (i.e., aversive noises) as well as cues that allowed participants to avoid threat (instrumental safety signals). Correct reactions to instrumental safety signals seemingly allowed participants to lower the presentation rate of the threat. Experiment 1 demonstrates that attention prioritizes instrumental safety signals over threat signals. Experiment 2 replicates this finding and additionally compares instrumental safety signals to other action-relevant signals, controlling for action relevance as the cause of the effects. Experiment 3 demonstrates that when actions toward threat signals make it possible to avoid threat, attention prioritizes threat signals. Taken together, these results support the view that instrumentality for reaching safety determines the allocation of attention under threat. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. The validation of an educational database for children with profound intellectual disabilities

    PubMed Central

    Corten, Lieselotte; van Rensburg, Winnie; Kilian, Elizma; McKenzie, Judith; Vorster, Hein; Jelsma, Jennifer

    2016-01-01

    Background The Western Cape Forum for Intellectual Disability took the South African Government to court in 2010 on its failure to implement the right to education for Children with Severe and Profound Intellectual Disability. Subsequently, multidisciplinary teams were appointed by the Western Cape Education Department to deliver services to the Special Care Centres (SCCs). Initially, minimal information was available on this population. Objectives The purpose is to document the process of developing and validating a database for the collection of routine data. Method A descriptive analytical study design was used. A convenience sample was drawn from individuals under the age of 18 years enrolled in SCCs in the Western Cape. The team who entered and analysed the data reached consensus regarding the utility and feasibility of each item. Results Data were collected on 134 children. The omission of certain items from the database was identified. Some information was not reliable or readily available. Of the instruments identified to assess function, the classification systems were found to be reliable and useful, as were the performance scales. The WeeFIM, on the other hand, was lengthy and expensive, and was therefore discarded. Discussion and conclusions A list of items to be included was identified. Apart from providing an individual profile, the database can be useful for service planning and monitoring if incorporated into the central information system used to monitor the performance of all children. Without such inclusion, this most vulnerable population, despite the court ruling, will not have their right to education adequately addressed. PMID:28730055

  18. Advanced Stirling Duplex Materials Assessment for Potential Venus Mission Heater Head Application

    NASA Technical Reports Server (NTRS)

    Ritzert, Frank; Nathal, Michael V.; Salem, Jonathan; Jacobson, Nathan; Nesbitt, James

    2011-01-01

    This report will address materials selection for components in a proposed Venus lander system. The lander would use active refrigeration to allow space science instrumentation to survive the extreme environment that exists on the surface of Venus. The refrigeration system would be powered by a Stirling engine-based system and is termed the Advanced Stirling Duplex (ASD) concept. Stirling engine power conversion, in its simplest definition, converts heat from radioactive decay into electricity. Detailed design decisions will require iterations between component geometries, materials selection, system output, and tolerable risk. This study reviews potential component requirements against known materials performance. A lower-risk, evolutionary advance in heater head materials could be offered by nickel-base superalloy single crystals, with an expected capability of approximately 1100 °C. However, the high temperature requirements of the Venus mission may force the selection of ceramics or refractory metals, which are more developmental in nature and may not have a well-developed database or a mature supporting technology base such as fabrication and joining methods.

  19. Implementation of real-time digital endoscopic image processing system

    NASA Astrophysics Data System (ADS)

    Song, Chul Gyu; Lee, Young Mook; Lee, Sang Min; Kim, Won Ky; Lee, Jae Ho; Lee, Myoung Ho

    1997-10-01

    Endoscopy has become a crucial diagnostic and therapeutic procedure in clinical areas. Over the past four years, we have developed a computerized system to record and store clinical data pertaining to endoscopic surgery for laparoscopic cholecystectomy, pelviscopic endometriosis, and surgical arthroscopy. In this study, we developed a computer system composed of a frame grabber, a sound board, a VCR control board, a LAN card and an endoscopic data management system (EDMS). The computer system also controls peripheral instruments such as a color video printer, a video cassette recorder, and endoscopic input/output signals. The digital endoscopic data management system is based on an open architecture and a set of widely available industry standards, namely Microsoft Windows as the operating system, TCP/IP as the network protocol, and a time-sequential database that handles both images and speech. For data storage, we used MOD and CD-R. The digital endoscopic system was designed to be able to store, recreate, change, and compress signals and medical images. Computerized endoscopy enables us to generate and manipulate the original visual document, making it accessible to a virtually unlimited number of physicians.

  20. 48 CFR 10.002 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... officer may use market research conducted within 18 months before the award of any task or delivery order... or business publications. (iv) Querying the Governmentwide database of contracts and other procurement instruments intended for use by multiple agencies available at http://www.contractdirectory.gov...

  1. Measuring quality of life in opioid-dependent people: a systematic review of assessment instruments.

    PubMed

    Strada, Lisa; Vanderplasschen, Wouter; Buchholz, Angela; Schulte, Bernd; Muller, Ashley E; Verthein, Uwe; Reimer, Jens

    2017-12-01

    Opioid dependence is a chronic relapsing disorder. Despite increasing research on quality of life (QOL) in people with opioid dependence, little attention has been paid to the instruments used. This systematic review examines the suitability of QOL instruments for use in opioid-dependent populations and the instruments' quality. A systematic search was performed in the databases Medline, PsycInfo, The Cochrane Library, and CINAHL. Articles were eligible if they assessed QOL of opioid-dependent populations using a validated QOL instrument. Item content relevance to opioid-dependent people was evaluated by means of content analysis, and instrument properties were assessed using minimum standards for patient-reported outcome measures. Eighty-nine articles were retrieved, yielding sixteen QOL instruments, of which ten were assessed in this review. Of the ten instruments, six were disease-specific, but none for opioid dependence. Two instruments had good item content relevance. The conceptual and measurement model was described for seven instruments. Four instruments were developed with input from the respective target population. Eight instruments had low respondent and administrator burden. Psychometric properties were either not assessed in opioid-dependent populations or were inconclusive or moderate. No instrument scored perfectly on both content and properties. The limited suitability of instruments for opioid-dependent people hinders accurate and sensitive measurement of QOL in this population. Future research needs an opioid dependence-specific QOL instrument to measure the true impact of the disease on people's lives and to evaluate treatment-related services.

  2. The Integrated Sounding System: Description and Preliminary Observations from TOGA COARE.

    NASA Astrophysics Data System (ADS)

    Parsons, David; Dabberdt, Walter; Cole, Harold; Hock, Terrence; Martin, Charles; Barrett, Anne-Leslie; Miller, Erik; Spowart, Michael; Howard, Michael; Ecklund, Warner; Carter, David; Gage, Kenneth; Wilson, John

    1994-04-01

    An Integrated Sounding System (ISS) that combines state-of-the-art remote and in situ sensors into a single transportable facility has been developed jointly by the National Center for Atmospheric Research (NCAR) and the Aeronomy Laboratory of the National Oceanic and Atmospheric Administration (NOAA/AL). The instrumentation for each ISS includes a 915-MHz wind profiler, a Radio Acoustic Sounding System (RASS), an Omega-based NAVAID sounding system, and an enhanced surface meteorological station. The general philosophy behind the ISS is that the integration of various measurement systems overcomes each system's respective limitations while taking advantage of its positive attributes. The individual observing systems within the ISS provide high-level data products to a central workstation that manages and integrates these measurements. The ISS software package performs a wide range of functions: real-time data acquisition, database support, and graphical displays; data archival and communications; and operational and post-time analysis. The first deployment of the ISS consists of six sites in the western tropical Pacific: four land-based deployments and two ship-based deployments. The sites serve the Coupled Ocean-Atmosphere Response Experiment (COARE) of the Tropical Ocean and Global Atmosphere (TOGA) program and TOGA's enhanced atmospheric monitoring effort. Examples of ISS data taken during this deployment are shown to demonstrate the capabilities of this new sounding system and the performance of these in situ and remote sensing instruments in a moist tropical environment. In particular, a strong convective outflow with a pronounced impact on the atmospheric boundary layer and on heat fluxes from the ocean surface was examined with a shipboard ISS. If such strong outflows occur commonly, they may prove to be an important component of the surface energy budget of the western tropical Pacific.

  3. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument.

    PubMed

    Mokkink, Lidwine B; Prinsen, Cecilia A C; Bouter, Lex M; Vet, Henrica C W de; Terwee, Caroline B

    2016-01-19

    COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments both in research and in clinical practice by developing tools for selecting the most appropriate available instrument. In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. New standards for studies using Item Response Theory methods will also be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient-reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. In summary, we plead for more standardization in the use of outcome measurement instruments, for conducting high-quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments.

  4. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument

    PubMed Central

    Mokkink, Lidwine B.; Prinsen, Cecilia A. C.; Bouter, Lex M.; de Vet, Henrica C. W.; Terwee, Caroline B.

    2016-01-01

    Background: COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments both in research and in clinical practice by developing tools for selecting the most appropriate available instrument. Method: In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. New standards for studies using Item Response Theory methods will also be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient-reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. Conclusions: In summary, we plead for more standardization in the use of outcome measurement instruments, for conducting high-quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments. PMID:26786084

  5. A mobile and self-sufficient lab for high frequency measurements of stable water isotopes and chemistry of multiple water sources

    NASA Astrophysics Data System (ADS)

    Windhorst, David; Kraft, Philipp; Holly, Hartmut; Sahraei, Amir; Breuer, Lutz

    2017-04-01

    Technical advances over the last years have made instruments for stable water isotope and water chemistry measurements smaller, more durable and more energy efficient. It is nowadays feasible to deploy such instruments in situ during field campaigns. Coupled to an automated sample delivery system, high-temporal-resolution online measurements of various sources are within the bounds of economic and technical possibility. However, the day-to-day operation of such equipment still requires either a lot of manpower and infrastructure or the implementation of a quasi-self-sufficient system. The challenge remains how to facilitate and remotely operate such a system. We present the design and implementation of the Water Analysis Trailer for Environmental Research (WATER), an autonomous platform consisting of instruments for stable water isotope and water chemistry analysis. The system takes and measures samples of up to 12 sources at high temporal resolution (<15 min). To ensure unmanned operation of up to one week, several issues need to be addressed, the essential ones being (i) a self-sufficient power supply, (ii) automated sample delivery and preparation, and (iii) autonomous measurement and management interfacing all instruments. In addition to these basic requirements we implemented (i) communication of all system states, alarm messages and measurement results to an internal as well as an external database via cellular telemetry, (ii) automated storage of up to 300 frozen reference samples (100 mL, stored at -18°C), (iii) climate control for temperature-sensitive equipment (±1°C), (iv) a local and remote (up to 20 km using radio telemetry) sensor network (i.e. to record states of the hydrological system and climate and soil conditions), also suitable to trigger specific measurements, and (v) an automatic fire suppression and security system. The initial instrumentation includes a UV spectrometer (ProPs, TriOS GmbH, Germany) to measure NO3-, COD, TOC and total suspended sediments, a multiparameter water quality probe (YSI600R, YSI, USA) to measure electrical conductivity and pH, and a stable water isotope analyzer (L2130-i, Picarro, USA) coupled to a continuous water sampler (A0217, Picarro, USA). Forty soil moisture, temperature and electrical conductivity sensors (5TE, Decagon, USA) are connected to the remote sensor network (A850, Adcon, Austria), and rain gauges and a climate station (WXT520, Vaisala, Finland) are connected to the local sensor network via SDI-12. In a first field trial starting in March 2017, the mobile laboratory will be used to study the hydrological processes in the developed landscape of the Schwingbach catchment (Germany). We are confident that the unprecedented degree of detail the measurements promise will further accelerate our understanding of hydrological processes and of the interactions of the various discharge-generating sources.
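
    The dual-logging idea (a local store mirrored to an external database over cellular telemetry) can be sketched as follows; the table layout and transport stub are assumptions, not the WATER design.

    ```python
    import sqlite3, time

    local = sqlite3.connect("water_local.db")
    local.execute("""CREATE TABLE IF NOT EXISTS samples (
        ts REAL, source TEXT, parameter TEXT, value REAL)""")

    def push_to_remote(row):
        """Placeholder for the cellular-telemetry upload to the
        external database; transport details are site-specific."""
        pass

    def log_sample(source, parameter, value):
        row = (time.time(), source, parameter, value)
        local.execute("INSERT INTO samples VALUES (?, ?, ?, ?)", row)
        local.commit()                 # local store is authoritative
        try:
            push_to_remote(row)        # remote mirror is best-effort
        except OSError:
            pass                       # retry later when the link is up

    log_sample("stream_inlet", "NO3", 4.2)
    ```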

  6. Evaluation of spectroscopic databases through radiative transfer simulations compared to observations. Application to the validation of GEISA 2015 with IASI and TCCON

    NASA Astrophysics Data System (ADS)

    Armante, Raymond; Scott, Noelle; Crevoisier, Cyril; Capelle, Virginie; Crepeau, Laurent; Jacquinet, Nicole; Chédin, Alain

    2016-09-01

    The quality of the spectroscopic parameters that serve as input to forward radiative transfer models is essential to fully exploit remote sensing of the Earth's atmosphere. However, the process of updating spectroscopic databases in order to provide the users with a database that ensures an optimal characterization of the spectral properties of molecular absorption for radiative transfer modeling is challenging. The evaluation of the databases' content and the underlying choices made by the managing team is thus a crucial step. Here, we introduce an original and powerful approach for evaluating spectroscopic parameters: the Spectroscopic Parameters And Radiative Transfer Evaluation (SPARTE) chain. The SPARTE chain relies on the comparison between forward radiative transfer simulations made by the 4A radiative transfer model and observed spectra from various instruments, collocated over several thousands of well-characterized atmospheric situations. Averaging the resulting 'calculated minus observed' spectral residuals minimizes the random errors coming from both the radiometric noise of the instruments and the imperfect description of the atmospheric state. The SPARTE chain can be used to evaluate any spectroscopic database, from the visible to the microwave, using any type of remote sensing observations (ground-based, airborne or space-borne). We show that the comparison of the shapes of the residuals enables (i) identifying incorrect line parameters (line position, intensity, width, pressure shift, etc.), even for molecules for which interferences between the lines have to be taken into account; (ii) proposing revised values, in cooperation with contributing teams; and (iii) validating the final updated parameters. In particular, we show that the simultaneous availability of two databases such as GEISA and HITRAN helps identify remaining issues in each database. The SPARTE chain has been applied here to the validation of the GEISA-2015 update in two spectral regions of particular interest for several currently exploited or planned Earth space missions: the thermal infrared domain and the short-wave infrared domain, for which observations from the space-borne IASI instrument and from the ground-based FTS instruments at the Park Falls TCCON site are used, respectively. Main results include (i) the validation of the positions and intensities of line parameters, with overall significantly lower residuals for GEISA-2015 than for GEISA-2011, and (ii) the validation of the choices made for parameters (such as pressure shift and air-broadened width) that were not given by the provider but were completed by ourselves. For example, comparisons between residuals obtained with GEISA-2015 and HITRAN-2012 have highlighted a specific issue with some HWHM values in the latter that can be clearly identified in the 'calculated minus observed' residuals.
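
    The averaging step at the heart of SPARTE is easy to sketch: stacking many collocated 'calculated minus observed' spectra beats down the random part and leaves the systematic, spectroscopy-driven signature. The array shapes and the injected line error below are purely illustrative.

    ```python
    import numpy as np

    def mean_residual(calculated, observed):
        """calculated, observed : (n_situations, n_channels) arrays of
        radiances on a common spectral grid. Returns the mean residual
        per channel and its standard error."""
        residuals = calculated - observed
        return (residuals.mean(axis=0),
                residuals.std(axis=0) / np.sqrt(len(residuals)))

    n_sit, n_chan = 5000, 8461          # IASI-like channel count
    rng = np.random.default_rng(1)
    observed = rng.normal(0.0, 0.2, (n_sit, n_chan))   # pure random noise
    calculated = np.zeros((n_sit, n_chan))
    calculated[:, 1200] = 0.05          # one systematic line error
    bias, sem = mean_residual(calculated, observed)
    # bias stands out at channel 1200, far above sem ~ 0.2/sqrt(5000)
    ```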

  7. The Clinimetric Properties of Instruments Measuring Home Hazards for Older People at Risk of Falling: A Systematic Review.

    PubMed

    Romli, Muhammad Hibatullah; Mackenzie, Lynette; Lovarini, Meryl; Tan, Maw Pin; Clemson, Lindy

    2018-03-01

    Home hazards are associated with falls among older people living in the community. However, evaluating home hazards is a complex process as environmental factors vary according to geography, culture, and architectural design. As a result, many health practitioners commonly use nonstandardized assessment methods that may lead to inaccurate findings. Thus, the aim of this systematic review was to identify standardized instruments for evaluating home hazards related to falls and to evaluate the clinimetric properties of these instruments for use by health practitioners. A systematic search was conducted in the Medline, CINAHL, AgeLine, and Web of Science databases, and the University of Sydney Library CrossSearch Engine. Study screening, assessment, and quality ratings were conducted independently. Thirty-six studies were identified, describing 19 instruments and three assessment techniques. The clinimetric properties varied between instruments. The Home Falls and Accidents Screening Tool, Home Safety Self-Assessment Tool, In-Home Occupational Performance Evaluation, and Westmead Home Safety Assessment were the instruments with high potential for evaluating home hazards associated with falls. Health practitioners can choose the most appropriate instruments for their practice, as a range of standardized instruments with established clinimetric properties are available.

  8. Measuring Self-Care in Persons With Type 2 Diabetes: A Systematic Review

    PubMed Central

    Lu, Yan; Xu, Jiayun; Zhao, Weigang; Han, Hae-Ra

    2015-01-01

    This systematic review examines the characteristics and psychometric properties of the instruments used to assess self-care behaviors among persons with type 2 diabetes. Electronic databases were searched for relevant studies published in English within the past 20 years. Thirty different instruments were identified in 75 articles: 18 original instruments on type 2 diabetes mellitus self-care, 8 translated or revised versions, and 4 not specific but relevant to diabetes. Twenty-one instruments were multidimensional and addressed multiple dimensions of self-care behavior. Nine were unidimensional: three focusing exclusively on medication taking, three on diet, one on physical activity, one on self-monitoring of blood glucose, and one on oral care. Most instruments (22 of 30) were developed during the last decade. Only 10 were used in more than one study. Nineteen of the 30 instruments reported both reliability and validity information but with varying degrees of rigor. In conclusion, most instruments used to measure self-care were relatively new and had been applied to only a limited number of studies with incomplete psychometric profiles. Rigorous psychometric testing, operational definition of self-care, and sufficient explanation of scoring need to be considered for further instrument development. PMID:26130465

  9. Hardware and software facilities for the J-PAS and J-PLUS surveys archiving, processing and data publication

    NASA Astrophysics Data System (ADS)

    Cristóbal-Hornillos, D.; Varela, J.; Ederoclite, A.; Vázquez Ramió, H.; López-Sainz, A.; Hernández-Fuertes, J.; Civera, T.; Muniesa, D.; Moles, M.; Cenarro, A. J.; Marín-Franch, A.; Yanes-Díaz, A.

    2015-05-01

    The Observatorio Astrofísico de Javalambre consists of two main telescopes: JST/T250, a 2.5 m telescope with a 3 deg FoV, and JAST/T80, an 83 cm telescope with a 2 deg FoV. JST/T250 will be devoted to completing the Javalambre-PAU Astronomical Survey (J-PAS), a photometric survey with a system of 54 narrow-band plus 3 broad-band filters covering an area of 8500 deg². JAST/T80 will perform the J-PLUS survey, covering the same area with a system of 12 filters. This contribution presents the software and hardware architecture designed to store and process the data. The processing pipeline runs daily and is devoted to correcting the instrumental signature of the science images, performing the astrometric and photometric calibration, and computing individual image catalogs. In a second stage, the pipeline combines the tile mosaics and computes the final catalogs. The catalogs are ingested into a scientific database to be served to the community. The processing software is connected to a management database that stores persistent information about the pipeline operations performed on each frame. The processing pipeline is executed on a computing cluster under a batch queuing system. The storage system combines disk and tape technologies: the disk storage has the capacity to hold the data accessed by the pipeline, while the tape library stores and archives the raw data and earlier data releases with lower access frequency.
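
    As a sketch of the management-database idea, each operation applied to a frame can be logged persistently; the table and column names below are invented for illustration and are not the actual J-PAS schema:

        import sqlite3
        from datetime import datetime, timezone

        con = sqlite3.connect("management.db")   # stand-in for the real management DBMS
        con.execute("""CREATE TABLE IF NOT EXISTS operations (
                           frame_id TEXT, step TEXT, status TEXT, run_at TEXT)""")

        def log_step(frame_id, step, status):
            # persistent record of what the pipeline did to each frame
            con.execute("INSERT INTO operations VALUES (?, ?, ?, ?)",
                        (frame_id, step, status,
                         datetime.now(timezone.utc).isoformat()))
            con.commit()

        log_step("T80_20150501_0042", "instrumental_signature_correction", "ok")
        log_step("T80_20150501_0042", "astrometric_calibration", "ok")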

  10. The New Politics of US Health Care Prices: Institutional Reconfiguration and the Emergence of All-Payer Claims Databases.

    PubMed

    Rocco, Philip; Kelly, Andrew S; Béland, Daniel; Kinane, Michael

    2017-02-01

    Prices are a significant driver of health care cost in the United States. Existing research on the politics of health system reform has emphasized the limited nature of policy entrepreneurs' efforts at solving the problem of rising prices through direct regulation at the state level. Yet this literature fails to account for how change agents in the states gradually reconfigured the politics of prices, forging new, transparency-based policy instruments called all-payer claims databases (APCDs), which are designed to empower consumers, purchasers, and states to make informed market and policy choices. Drawing on pragmatist institutional theory, this article shows how APCDs emerged as the dominant model for reforming health care prices. While APCD advocates faced significant institutional barriers to policy change, we show how they reconfigured existing ideas, tactical repertoires, and legal-technical infrastructures to develop a politically and technologically robust reform. Our analysis has important implications for theories of how change agents overcome structural barriers to health reform. Copyright © 2017 by Duke University Press.

  11. DABAM: an open-source database of X-ray mirrors metrology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez del Rio, Manuel; Bianchi, Davide; Cocco, Daniele

    2016-04-20

    An open-source database containing metrology data for X-ray mirrors is presented. It makes available metrology data (mirror height and slope profiles) that can be used with simulation tools for calculating the effects of optical surface errors on the performance of an optical instrument, such as a synchrotron beamline. A typical case is the degradation of the intensity profile at the focal position in a beamline due to mirror surface errors. This database for metrology (DABAM) aims to provide users of simulation tools with data from real mirrors. The data included in the database are described in this paper, with details of how the mirror parameters are stored. Accompanying software is provided to allow simple access to and processing of these data, to calculate the most common statistical parameters, and to create input files for the most widely used simulation codes. Some optics simulations are presented and discussed to illustrate the real use of the profiles from the database.
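
    For illustration, the usual profile statistics can be computed from a two-column height profile; the file name and format here are assumptions for the sketch, not the DABAM access software itself:

        import numpy as np

        # Hypothetical two-column profile: abscissa along the mirror (m), height (m).
        x, h = np.loadtxt("mirror_profile.dat", unpack=True)
        slopes = np.gradient(h, x)                       # slope profile (rad)

        height_err_rms = np.std(h - h.mean())            # height error RMS (m)
        slope_err_rms = np.std(slopes - slopes.mean())   # slope error RMS (rad)
        print(f"height RMS = {height_err_rms * 1e9:.2f} nm, "
              f"slope RMS = {slope_err_rms * 1e6:.2f} urad")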

  12. DABAM: an open-source database of X-ray mirrors metrology

    PubMed Central

    Sanchez del Rio, Manuel; Bianchi, Davide; Cocco, Daniele; Glass, Mark; Idir, Mourad; Metz, Jim; Raimondi, Lorenzo; Rebuffi, Luca; Reininger, Ruben; Shi, Xianbo; Siewert, Frank; Spielmann-Jaeggi, Sibylle; Takacs, Peter; Tomasset, Muriel; Tonnessen, Tom; Vivo, Amparo; Yashchuk, Valeriy

    2016-01-01

    An open-source database containing metrology data for X-ray mirrors is presented. It makes available metrology data (mirror height and slope profiles) that can be used with simulation tools for calculating the effects of optical surface errors on the performance of an optical instrument, such as a synchrotron beamline. A typical case is the degradation of the intensity profile at the focal position in a beamline due to mirror surface errors. This database for metrology (DABAM) aims to provide users of simulation tools with data from real mirrors. The data included in the database are described in this paper, with details of how the mirror parameters are stored. Accompanying software is provided to allow simple access to and processing of these data, to calculate the most common statistical parameters, and to create input files for the most widely used simulation codes. Some optics simulations are presented and discussed to illustrate the real use of the profiles from the database. PMID:27140145

  13. DABAM: An open-source database of X-ray mirrors metrology

    DOE PAGES

    Sanchez del Rio, Manuel; Bianchi, Davide; Cocco, Daniele; ...

    2016-05-01

    An open-source database containing metrology data for X-ray mirrors is presented. It makes available metrology data (mirror height and slope profiles) that can be used with simulation tools for calculating the effects of optical surface errors on the performance of an optical instrument, such as a synchrotron beamline. A typical case is the degradation of the intensity profile at the focal position in a beamline due to mirror surface errors. This database for metrology (DABAM) aims to provide users of simulation tools with data from real mirrors. The data included in the database are described in this paper, with details of how the mirror parameters are stored. Accompanying software is provided to allow simple access to and processing of these data, to calculate the most common statistical parameters, and to create input files for the most widely used simulation codes. In conclusion, some optics simulations are presented and discussed to illustrate the real use of the profiles from the database.

  14. Footprint Database and web services for the Herschel space observatory

    NASA Astrophysics Data System (ADS)

    Verebélyi, Erika; Dobos, László; Kiss, Csaba

    2015-08-01

    Using all telemetry and observational metadata, we created a searchable database of Herschel observation footprints. Data from the Herschel space observatory are freely available to everyone, but no uniformly processed catalog of all observations has been published yet. As a first step, we unified the data model for all three Herschel instruments in all observation modes and compiled a database of sky coverage information. As opposed to methods using a pixellation of the sphere, in our database sky coverage is stored in exact geometric form, allowing for precise area calculations. Indexing of the footprints allows for very fast searches among observations based on pointing, time, sky coverage overlap, and metadata. This enables us, for example, to find moving objects easily in Herschel fields. The database is accessible via a web site and also as a set of REST web service functions, which makes it usable from programmatic clients such as Python or IDL scripts. Data are available in various formats, including Virtual Observatory standards.
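
    A hedged sketch of querying such a REST service from a script (the endpoint URL and parameter names are hypothetical, standing in for the documented interface):

        import requests

        # Hypothetical endpoint: footprints overlapping a search cone.
        resp = requests.get(
            "http://herschel-footprints.example.org/api/search",
            params={"ra": 83.8, "dec": -5.4, "radius": 0.5,
                    "instrument": "PACS", "format": "votable"},
            timeout=30)
        resp.raise_for_status()

        with open("matching_footprints.xml", "wb") as f:
            f.write(resp.content)   # save the returned footprint list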

  15. Strategic Purchasing in Practice: Comparing Ten European Countries.

    PubMed

    Klasa, Katarzyna; Greer, Scott L; van Ginneken, Ewout

    2018-02-05

    Strategic purchasing of health care services is widely recommended as a policy instrument. We conducted a review of the literature, drawing on the European Observatory on Health Systems and Policies Health Systems in Transition series, other European Observatory databases, and selected country-specific literature, to augment the comparative analysis with the most recent health care trends in ten selected countries. There is little evidence of purchasing being strategic according to any of the established definitions. There is little or no literature suggesting that existing purchasing mechanisms in Europe deliver improved population health, citizen empowerment, stronger governance and stewardship, or develop purchaser organization and capacity. Strategic purchasing has not generally been implemented. Policymakers considering adopting strategic purchasing policies should be aware of this systemic implementation problem. Policymakers in systems with strategic purchasing built into policy should not assume that a purchasing system is strategic or that it is delivering its expected objectives. However, there are individual components of strategic purchasing that are worth pursuing and can provide benefits to health systems. Copyright © 2018. Published by Elsevier B.V.

  16. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    business planning and execution information management system (Project Management Division Database (PMDD) and Product Management Database System (PMDS...MANAGEMENT • Project Management Division Database (PMDD), Product Management Database System (PMDS), and Special Users Database System: The existing...System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was

  17. The Role of the NASA Global Hawk Link Module as an Information Nexus For Atmospheric Mapping Missions

    NASA Technical Reports Server (NTRS)

    Sullivan, D. V.

    2015-01-01

    The Link Module described in this paper was developed for the NASA Uninhabited Aerial System (UAS) Global Hawk Pacific (GloPAC) Airborne Science Campaign: four flights of 30-hour duration supporting the Aura Validation Experiment (AVE). It was used again during the Genesis and Rapid Intensification Processes (GRIP) experiment, a NASA Earth Science field experiment to better understand how tropical storms form and develop into major hurricanes. In these missions, the Link Module negotiated all communication over the high-bandwidth Ku satellite link, archived all the science data from onboard experiments in a spatially enabled database, routed command and control of the instruments from the Global Hawk Operations Center, and re-transmitted selected data sets directly to the experimenters' control and analysis systems. The availability of aggregated information from collections of sensors, and of remote control capabilities, in real time, is revolutionizing the way airborne science is conducted. The Link Module NG, now flown in support of the NASA Earth Venture missions Hurricane and Severe Storm Sentinel (HS3) and the Airborne Tropical Tropopause Experiment (ATTREX), has advanced data fusion technologies that further improve the scientific productivity, flexibility, and robustness of these systems. On-the-fly traffic shaping has been developed to allow the high-definition video, used for critical flight control segments, to dynamically allocate variable bandwidth on demand. Historically, the Link Module evolved from the instrument and communication interface controller used by NASA's Pathfinder and Pathfinder Plus solar-powered UASs in the late 1990s. It was later expanded for use in the AIRDAS four-channel scanner flown on the NASA Altus UAS, and then again as a module in the AMS twelve-channel multispectral scanner flying on the NASA (Predator-B) Ikhana UAS. The current system is the answer to the challenges imposed by extremely long duration UASs with onboard multi-instrument (>= 12) sensor webs.

  18. An Analysis of Far-Infrared Radiances Obtained By the First Instrument at Table Mountain through the Use of Radiative Transfer Calculations

    NASA Astrophysics Data System (ADS)

    Kratz, D. P.; Mlynczak, M. G.; Cageao, R.; Johnson, D. G.; Mast, J. C.

    2014-12-01

    The Far-Infrared Spectroscopy of the Troposphere (FIRST) instrument is a Fourier transform spectrometer with moderately high spectral resolution (0.643 cm-1 unapodized). Designed to measure the most energetically important range (100 to 2000 cm-1) of Earth's emitted infrared spectrum, the FIRST instrument was specifically engineered to include the often overlooked far-infrared (100 to 650 cm-1). To date, the FIRST instrument has been deployed on several field missions, both balloon-borne and ground-based, with the most recent deployment occurring at NASA's Jet Propulsion Laboratory Table Mountain Facility in California during September and October 2012. This deployment, located 2,285 meters above the Mojave Desert, provided an opportunity to observe down-welling radiances under low water vapor conditions, with some cases having total column water vapor amounts of approximately 2 to 3 millimeters. Such low water vapor conditions allow for stringent testing of both the FIRST instrument and the radiative transfer methods used to analyze its measurements. This study focuses on the analysis of the FIRST far-infrared measurements obtained under clear-sky conditions and requires an accounting of the uncertainties in both the measured and the calculated radiances. The former involves the manner in which calibration and full-sky conditions affect the radiances measured by the FIRST instrument. The latter involves not only differences in the model formulations and the uncertainties in the water vapor and temperature data provided by the radiosonde measurements, but also the critical differences and uncertainties contained within the input line parameter databases. This study specifically explores the significant differences that can arise in the calculated radiances depending on the choice of line parameter database, and how this choice affects the analysis of the FIRST measurements.

  1. VizieR Online Data Catalog: SNLS and SDSS SN surveys photometric calibration (Betoule+, 2013)

    NASA Astrophysics Data System (ADS)

    Betoule, M.; Marriner, J.; Regnault, N.; Cuillandre, J.-C.; Astier, P.; Guy, J.; Balland, C.; El Hage, P.; Hardin, D.; Kessler, R.; Le Guillou, L.; Mosher, J.; Pain, R.; Rocci, P.-F.; Sako, M.; Schahmaneche, K.

    2012-11-01

    We present a joint photometric calibration for the SNLS and the SDSS supernova surveys. Our main deliverables are catalogs of natural AB magnitudes for a large set of selected tertiary standard stars covering the science fields of both surveys. Those catalogs are calibrated to the AB flux scale through observations of 5 primary spectrophotometric standard stars for which HST-STIS spectra are available in the CALSPEC database. The uncertainties associated with this calibration are delivered as a single covariance matrix. We also provide a model of the transmission efficiency of the SNLS photometric instrument MegaCam. These transmission functions are required for the interpretation of MegaCam natural magnitudes in terms of physical fluxes. Similar curves for the SDSS photometric instrument have been published in Doi et al. (2010AJ....139.1628D). Finally, we release the measured magnitudes of the five CALSPEC standard stars in the magnitude system of the tertiary catalogs. This makes it possible to update the calibration of the tertiary catalogs if the CALSPEC spectra of the primary standards are revised. (11 data files).

  2. Processing of Digital Plates of the 1.2 m Baldone Observatory Schmidt Telescope

    NASA Astrophysics Data System (ADS)

    Eglitis, Ilgmars; Andruk, Vitaly

    2017-04-01

    The aim of this research is to evaluate the accuracy of the plate processing method and to perform a detailed study of the Epson Expression 10000XL scanner, which was used to digitize plates from the collection of the 1.2 m Schmidt telescope installed at the Baldone Observatory. Special software developed in the LINUX/MIDAS/ROMAFOT environment was used for processing the scans. Results for digitized files with 8- and 16-bit grey gradations were compared, and the accuracy of the developed method for rectangular coordinate determination and photometry was estimated. Errors in the instrumental system are ±0.026 pixels and ±0.024m for coordinates and stellar magnitudes, respectively. To evaluate the repeatability of the scanner's astrometric and photometric errors, six consecutive scans of one plate were processed at a resolution of 1200 dpi. The following error estimates are obtained for stars brighter than U < 13.5m: σxy = ±0.021 to 0.027 pixels and σm = ±0.014m to 0.016m for rectangular coordinates and instrumental stellar magnitudes, respectively.
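
    The repeatability figures amount to per-star standard deviations over the six repeated scans; a minimal sketch of that computation (the input file names and array layout are assumptions):

        import numpy as np

        # Hypothetical inputs: the same stars measured in six consecutive scans,
        # shape (n_scans, n_stars).
        x = np.load("x_positions_pix.npy")
        m = np.load("instr_magnitudes.npy")

        sigma_x = x.std(axis=0, ddof=1).mean()   # mean per-star scatter (pixels)
        sigma_m = m.std(axis=0, ddof=1).mean()   # mean per-star scatter (mag)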

  3. Mars 2020 Entry, Descent and Landing Instrumentation 2 (MEDLI2)

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; White, Todd R.; Wright, Henry S.; Schoenenberger, Mark; Kuhl, Christopher A.; Trombetta, Dominic; Santos, Jose A.; Oishi, Tomomi; Karlgaard, Christopher D.

    2016-01-01

    The Mars Entry Descent and Landing Instrumentation 2 (MEDLI2) sensor suite will measure aerodynamic, aerothermodynamic, and thermal protection system (TPS) performance during the atmospheric entry, descent, and landing phases of the Mars 2020 mission. The key objectives are to reduce design margins and prediction uncertainties for the aerothermal environments and the aerodynamic database. For MEDLI2, the sensors are installed on both the heatshield and the backshell, and include 7 pressure transducers, 17 thermal plugs, and 3 heat flux sensors (including a radiometer). These sensors will expand the set of measurements collected by the highly successful MEDLI suite, collecting supersonic pressure measurements on the forebody, a pressure measurement on the aftbody, direct heat flux measurements on the aftbody, a radiative heating measurement on the aftbody, and multiple near-surface thermal measurements in the TPS materials on both the forebody and the aftbody. To meet the science objectives, supersonic pressure transducers and heat flux sensors are currently being developed, and their qualification and calibration plans are presented. Finally, the reconstruction targets for data accuracy are presented, along with the planned methodologies for achieving them.

  4. Blade Surface Pressure Distributions in a Rocket Engine Turbine: Experimental Work With On-Blade Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Hudson, Susan T.; Zoladz, Thomas F.; Griffin, Lisa W.; Turner, James E. (Technical Monitor)

    2000-01-01

    Understanding the unsteady aspects of turbine rotor flowfields is critical to successful future turbine designs. A technology program was conducted at NASA's Marshall Space Flight Center to increase the understanding of unsteady environments for rocket engine turbines. The experimental program involved instrumenting turbine rotor blades with surface-mounted high-frequency-response pressure transducers. The turbine model was then tested to measure the unsteady pressures on the rotor blades. The data obtained from the experimental program are unique in three respects. First, much more unsteady data were obtained (several minutes per set point) than has been possible in the past. Also, two independent unsteady data acquisition systems and fundamental signal processing approaches were used. Finally, an extensive steady performance database existed for the turbine model, which allowed an evaluation of the effect of the on-blade instrumentation on the turbine's performance. This unique data set, the lessons learned in acquiring this type of data, and the improvements made to the data analysis and prediction tools will contribute to future turbine programs such as those for reusable launch vehicles.

  5. A true real-time, on-line security system for waterborne pathogen surveillance

    NASA Astrophysics Data System (ADS)

    Adams, John A.; McCarty, David L.

    2008-04-01

    Over the past several years many advances have been made in monitoring potable water systems for toxic threats. However, the need for real-time, on-line systems to detect the malicious introduction of deadly pathogens still exists. Municipal water distribution systems, government facilities and buildings, and high-profile public events remain vulnerable to terrorist-related biological contamination. After years of research and development, an instrument using multi-angle light scattering (MALS) technology has been introduced to achieve on-line, real-time detection and classification of a waterborne pathogen event. The MALS system utilizes a continuous slipstream of water passing through a flow cell in the instrument. A laser beam, focused perpendicular to the water flow, strikes particles as they pass through the beam, generating unique light scattering patterns that are captured by photodetectors. Microorganisms produce patterns termed 'bio-optical signatures', which are comparable to fingerprints. By comparing these bio-optical signatures to an on-board database of microorganism patterns, detection and classification occur within minutes. If a pattern is not recognized, it is classified as an 'unknown' and the unidentified contaminant is registered as a potential threat. In either case, if the contaminant exceeds a customer's threshold, the system immediately alerts personnel to the contamination event while extracting a sample for confirmation. The system, BioSentry, developed by JMAR Technologies, is now field-tested and commercially available. BioSentry is cost effective, uses no reagents, operates remotely, and can be used for continuous microbial surveillance in many water treatment environments. Examples of HLS installations are presented along with data from the US EPA NHSRC Testing and Evaluation Facility.
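
    The classification logic described, matching a measured pattern against an on-board signature library with an 'unknown' fallback, can be sketched as follows (the distance metric and threshold are illustrative assumptions, not the BioSentry algorithm):

        import numpy as np

        def classify(pattern, library, max_distance=0.15):
            # library: dict mapping organism name -> reference bio-optical signature
            best_name, best_d = "unknown", np.inf
            for name, ref in library.items():
                d = np.linalg.norm(pattern - ref) / np.linalg.norm(ref)
                if d < best_d:
                    best_name, best_d = name, d
            # patterns too far from every known signature are flagged, not discarded
            return best_name if best_d <= max_distance else "unknown"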

  6. Data processing and in-flight calibration systems for OMI-EOS-Aura

    NASA Astrophysics Data System (ADS)

    van den Oord, G. H. J.; Dobber, M.; van de Vegte, J.; van der Neut, I.; Som de Cerff, W.; Rozemeijer, N. C.; Schenkelaars, V.; ter Linden, M.

    2006-08-01

    The OMI instrument flies on the EOS Aura mission, which was launched in July 2004. OMI is a UV-VIS imaging spectrometer that measures in the 270-500 nm wavelength range and provides daily global coverage with high spatial resolution. During every 100-minute orbit, OMI generates about 0.5 GB of Level 0 data and 1.2 GB of Level 1 data. About half of the Level 1 data consists of in-flight calibration measurements. These data rates make it necessary to automate the process of in-flight calibration. For that purpose two facilities have been developed at KNMI in the Netherlands: the OMI Dutch Processing System (ODPS) and the Trend Monitoring and In-flight Calibration Facility (TMCF). A description of these systems is provided, with emphasis on their use for radiometric, spectral, and detector calibration and characterization. With the advance of detector technology and the need for higher spatial resolution, data rates will become even higher for future missions. To make effective use of automated systems like the TMCF, it is of paramount importance to integrate the instrument operations concept, the information contained in the Level 1 (meta-)data products, and the in-flight calibration software and system databases. In this way a robust but also flexible end-to-end system can be developed that serves the needs of the calibration staff, the scientific data users, and the processing staff. The way this has been implemented for OMI may serve as an example of a cost-effective and user-friendly solution for future missions. The basic system requirements for in-flight calibration are discussed, and examples are given of how these requirements have been implemented for OMI. Special attention is paid to supporting the Level 0-1 processing with timely and accurate calibration constants.

  7. Performance assessment of EMR systems based on post-relational database.

    PubMed

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all system users to access data, with a fast response time, anywhere and at any time. Performance tests of databases in EMR systems were carried out in both China and Japan. First, a comparison test was conducted between the post-relational database Caché and the relational database Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the same database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results showed that the post-relational database Caché works faster than the relational database Oracle and performed well in the real-time EMR system.
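
    A comparison test of this kind reduces to timing identical workloads against the two back ends; a minimal, database-agnostic sketch using Python's DB-API (the connection objects and query are placeholders, not the test suite used in the study):

        import time

        def mean_query_time(conn, sql, repeats=100):
            # average wall-clock time per execution of an identical query
            cur = conn.cursor()
            t0 = time.perf_counter()
            for _ in range(repeats):
                cur.execute(sql)
                cur.fetchall()
            return (time.perf_counter() - t0) / repeats

        # usage: compare mean_query_time(cache_conn, SQL) with
        # mean_query_time(oracle_conn, SQL) on the same data set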

  8. New approaches to merging multi-sensor satellite measurements of volcanic SO2 emissions

    NASA Astrophysics Data System (ADS)

    Carn, S. A.; Telling, J. W.; Krotkov, N. A.

    2015-12-01

    As part of the NASA MEaSUREs program, we are developing a unique long-term database of volcanic sulfur dioxide (SO2) emissions for use by the scientific community, using observations from multiple satellite instruments collected since 1978. Challenges to creating such a database include assessing data continuity between multiple satellite missions and SO2 retrieval algorithms and estimating measurement uncertainties. Here, we describe the approaches that we are using to merge multi-decadal SO2 measurements from the ultraviolet (UV) Total Ozone Mapping Spectrometer (TOMS), Ozone Monitoring Instrument (OMI) and Ozone Monitoring and Profiler Suite (OMPS) sensors. A particular challenge has involved accounting for the OMI row anomaly (ORA), a data gap in OMI measurements since 2008 that partially or wholly obscures some volcanic eruption clouds, whilst still profiting from the high OMI spatial resolution and data quality, and prior OMI SO2 validation. We present a new method to substitute missing SO2 information in the ORA with near-coincident SO2 data from OMPS, providing improved estimates of eruptive volcanic SO2 emissions. The technique can also be used to assess consistency between different satellite instruments and SO2 retrieval algorithms, investigate the impact of variable sensor spatial resolution, and estimate measurement uncertainties. It is particularly effective for larger eruptions producing extensive SO2 clouds where the ORA obscures the volcanic plume in multiple contiguous orbits. Application of the technique is demonstrated using recent volcanic eruptions including the 2015 eruption of Calbuco, Chile. We also provide an update on the status of the multi-satellite long-term volcanic SO2 database (MSVOLSO2L4).
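
    Conceptually, the substitution amounts to filling masked OMI pixels with near-coincident OMPS values regridded to the OMI grid; a hedged sketch (file names and units are assumptions for illustration):

        import numpy as np

        omi = np.load("omi_so2_du.npy")                # SO2 column (DU); NaN in the row anomaly
        omps = np.load("omps_so2_du_on_omi_grid.npy")  # collocated OMPS, same grid
        area = np.load("pixel_area_km2.npy")           # per-pixel area

        filled = np.where(np.isnan(omi), omps, omi)    # keep OMI where valid, OMPS elsewhere

        print("OMI-only SO2 burden:  ", np.nansum(omi * area), "DU km^2")
        print("gap-filled SO2 burden:", np.nansum(filled * area), "DU km^2")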

  9. WATCHDOG: A COMPREHENSIVE ALL-SKY DATABASE OF GALACTIC BLACK HOLE X-RAY BINARIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tetarenko, B. E.; Sivakoff, G. R.; Heinke, C. O.

    With the advent of more sensitive all-sky instruments, the transient universe is being probed in greater depth than ever before. Taking advantage of available resources, we have established a comprehensive database of black hole (and black hole candidate) X-ray binary (BHXB) activity between 1996 and 2015 as revealed by all-sky instruments, scanning surveys, and select narrow-field X-ray instruments on board the INTErnational Gamma-Ray Astrophysics Laboratory, Monitor of All-Sky X-ray Image, Rossi X-ray Timing Explorer, and Swift telescopes; the Whole-sky Alberta Time-resolved Comprehensive black-Hole Database Of the Galaxy or WATCHDOG. Over the past two decades, we have detected 132 transient outbursts, tracked and classified behavior occurring in 47 transient and 10 persistently accreting BHs, and performed a statistical study on a number of outburst properties across the Galactic population. We find that outbursts undergone by BHXBs that do not reach the thermally dominant accretion state make up a substantial fraction (∼40%) of the Galactic transient BHXB outburst sample over the past ∼20 years. Our findings suggest that this “hard-only” behavior, observed in transient and persistently accreting BHXBs, is neither a rare nor recent phenomenon and may be indicative of an underlying physical process, relatively common among binary BHs, involving the mass-transfer rate onto the BH remaining at a low level rather than increasing as the outburst evolves. We discuss how the larger number of these “hard-only” outbursts and detected outbursts in general have significant implications for both the luminosity function and mass-transfer history of the Galactic BHXB population.

  10. VizieR Online Data Catalog: WATCHDOG: an all-sky database of Galactic BHXBs (Tetarenko+, 2016)

    NASA Astrophysics Data System (ADS)

    Tetarenko, B. E.; Sivakoff, G. R.; Heinke, C. O.; Gladstone, J. C.

    2016-03-01

    With the advent of more sensitive all-sky instruments, the transient universe is being probed in greater depth than ever before. Taking advantage of available resources, we have established a comprehensive database of black hole (and black hole candidate) X-ray binary (BHXB) activity between 1996 and 2015 as revealed by all-sky instruments, scanning surveys, and select narrow-field X-ray instruments on board the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL), Monitor of All-Sky X-ray Image (MAXI), Rossi X-ray Timing Explorer (RXTE), and Swift telescopes; the Whole-sky Alberta Time-resolved Comprehensive black-Hole Database Of the Galaxy or WATCHDOG. Over the past two decades, we have detected 132 transient outbursts, tracked and classified behavior occurring in 47 transient and 10 persistently accreting BHs, and performed a statistical study on a number of outburst properties across the Galactic population. We find that outbursts undergone by BHXBs that do not reach the thermally dominant accretion state make up a substantial fraction (~40%) of the Galactic transient BHXB outburst sample over the past ~20 years. Our findings suggest that this "hard-only" behavior, observed in transient and persistently accreting BHXBs, is neither a rare nor recent phenomenon and may be indicative of an underlying physical process, relatively common among binary BHs, involving the mass-transfer rate onto the BH remaining at a low level rather than increasing as the outburst evolves. We discuss how the larger number of these "hard-only" outbursts and detected outbursts in general have significant implications for both the luminosity function and mass-transfer history of the Galactic BHXB population. (9 data files).

  11. WATCHDOG: A Comprehensive All-sky Database of Galactic Black Hole X-ray Binaries

    NASA Astrophysics Data System (ADS)

    Tetarenko, B. E.; Sivakoff, G. R.; Heinke, C. O.; Gladstone, J. C.

    2016-02-01

    With the advent of more sensitive all-sky instruments, the transient universe is being probed in greater depth than ever before. Taking advantage of available resources, we have established a comprehensive database of black hole (and black hole candidate) X-ray binary (BHXB) activity between 1996 and 2015 as revealed by all-sky instruments, scanning surveys, and select narrow-field X-ray instruments on board the INTErnational Gamma-Ray Astrophysics Laboratory, Monitor of All-Sky X-ray Image, Rossi X-ray Timing Explorer, and Swift telescopes; the Whole-sky Alberta Time-resolved Comprehensive black-Hole Database Of the Galaxy or WATCHDOG. Over the past two decades, we have detected 132 transient outbursts, tracked and classified behavior occurring in 47 transient and 10 persistently accreting BHs, and performed a statistical study on a number of outburst properties across the Galactic population. We find that outbursts undergone by BHXBs that do not reach the thermally dominant accretion state make up a substantial fraction (∼40%) of the Galactic transient BHXB outburst sample over the past ∼20 years. Our findings suggest that this “hard-only” behavior, observed in transient and persistently accreting BHXBs, is neither a rare nor recent phenomenon and may be indicative of an underlying physical process, relatively common among binary BHs, involving the mass-transfer rate onto the BH remaining at a low level rather than increasing as the outburst evolves. We discuss how the larger number of these “hard-only” outbursts and detected outbursts in general have significant implications for both the luminosity function and mass-transfer history of the Galactic BHXB population.

  12. Transcriptome analysis of the desert locust central nervous system: production and annotation of a Schistocerca gregaria EST database.

    PubMed

    Badisco, Liesbeth; Huybrechts, Jurgen; Simonet, Gert; Verlinden, Heleen; Marchal, Elisabeth; Huybrechts, Roger; Schoofs, Liliane; De Loof, Arnold; Vanden Broeck, Jozef

    2011-03-21

    The desert locust (Schistocerca gregaria) displays a fascinating type of phenotypic plasticity, designated as 'phase polyphenism'. Depending on environmental conditions, one genome can be translated into two highly divergent phenotypes, termed the solitarious and gregarious (swarming) phases. Although many of the underlying molecular events remain elusive, the central nervous system (CNS) is expected to play a crucial role in the phase transition process. Locusts have also proven to be interesting model organisms in a physiological and neurobiological research context. However, molecular studies in locusts are hampered by the fact that genome/transcriptome sequence information available for this branch of insects is still limited. We have generated 34,672 raw expressed sequence tags (ESTs) from the CNS of desert locusts in both phases. These ESTs were assembled into 12,709 unique transcript sequences, and nearly 4,000 sequences were functionally annotated. Moreover, the obtained S. gregaria EST information is highly complementary to the existing orthopteran transcriptomic data. Since many novel transcripts encode neuronal signaling and signal transduction components, this paper includes an overview of these sequences. Furthermore, several transcripts that are differentially represented in solitarious and gregarious locusts were retrieved from this EST database. The findings highlight the involvement of the CNS in the phase transition process and indicate that this novel annotated database may also add to the emerging knowledge of concomitant neuronal signaling and neuroplasticity events. In summary, we met the need for novel sequence data from the desert locust CNS. To our knowledge, we hereby also present the first insect EST database that is derived from the complete CNS. The obtained S. gregaria EST data constitute an important new source of information that will be instrumental in further unraveling the molecular principles of phase polyphenism, in further establishing locusts as valuable research model organisms, and in molecular evolutionary and comparative entomology.

  13. Examining the social determinants of children's developmental health: protocol for building a pan-Canadian population-based monitoring system for early childhood development

    PubMed Central

    Guhn, Martin; Janus, Magdalena; Enns, Jennifer; Brownell, Marni; Forer, Barry; Duku, Eric; Muhajarine, Nazeem; Raos, Rob

    2016-01-01

    Introduction: Early childhood is a key period to establish policies and practices that optimise children's health and development, but Canada lacks nationally representative data on social indicators of children's well-being. To address this gap, the Early Development Instrument (EDI), a teacher-administered questionnaire completed for kindergarten-age children, has been implemented across most Canadian provinces over the past 10 years. The purpose of this protocol is to describe the Canadian Neighbourhoods and Early Child Development (CanNECD) Study, the aims of which are to create a pan-Canadian EDI database to monitor trends over time in children's developmental health and to advance research examining the social determinants of health.
    Methods and analysis: Canada-wide EDI records from 2004 to 2014 (representing over 700,000 children) will be linked to Canada Census and Income Taxfiler data. Variables of socioeconomic status derived from these databases will be used to predict neighbourhood-level EDI vulnerability rates by conducting a series of regression analyses and latent variable models at provincial/territorial and national levels. Where data are available, we will measure the neighbourhood-level change in developmental vulnerability rates over time and model the socioeconomic factors associated with those trends.
    Ethics and dissemination: Ethics approval for this study was granted by the Behavioural Research Ethics Board at the University of British Columbia. Study findings will be disseminated to key partners, including provincial and federal ministries, schools and school districts, collaborative community groups and the early childhood development research community. The database created as part of this longitudinal population-level monitoring system will allow researchers to associate practices, programmes and policies at school and community levels with trends in developmental health outcomes. The CanNECD Study will guide future early childhood development action and policies, using the database as a tool for formative programme and policy evaluation. PMID:27130168

  14. The IAGOS Information System

    NASA Astrophysics Data System (ADS)

    Boulanger, D.; Thouret, V.

    2016-12-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European research infrastructure which aims at the provision of long-term, regular, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor, and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core and IAGOS-CARIBIC data. The IAGOS Data Portal (http://www.iagos.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles). New added-value products are available through the portal: back trajectories, origin of air masses, and co-location with satellite data. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de), and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, allows model outputs to be combined with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the next year IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network), and the ENVRI+ European project. Measurement traceability and quality metadata will be made available, and DOIs will be implemented.

  15. The development of large-scale de-identified biomedical databases in the age of genomics-principles and challenges.

    PubMed

    Dankar, Fida K; Ptitsyn, Andrey; Dankar, Samar K

    2018-04-10

    Contemporary biomedical databases include a wide range of information types from various observational and instrumental sources. Among the most important features that unite biomedical databases across the field are the high volume of information and the high potential to cause damage through data corruption, loss of performance, and loss of patient privacy. Thus, issues of data governance and privacy protection are essential for the construction of data repositories for biomedical research and healthcare. In this paper, we discuss various challenges of data governance in the context of population genome projects. These challenges, along with best practices and current research efforts, are discussed through the steps of data collection, storage, sharing, analysis, and knowledge dissemination.

  16. Mid-IR DIAL for high-resolution mapping of explosive precursors

    NASA Astrophysics Data System (ADS)

    Mitev, V.; Babichenko, S.; Bennes, J.; Borelli, R.; Dolfi-Bouteyre, A.; Fiorani, L.; Hespel, L.; Huet, T.; Palucci, A.; Pistilli, M.; Puiu, A.; Rebane, O.; Sobolev, I.

    2013-10-01

    A DIAL instrument on a moving platform is seen as a valuable remote sensing component in a sensor network for area monitoring, targeting sites involved in unauthorised explosive manufacturing. Such an instrument will map the vapour concentration of key substances known to be used as precursors in explosive fabrication, such as acetone and nitromethane. The IR spectra of acetone and nitromethane vapours have been characterized from available spectroscopy databases and from laboratory measurements, showing an optimal band for DIAL operation in the spectral range of 3.0 to 3.5 μm. The DIAL operation has been numerically simulated, with inputs based on the HITRAN database, the U.S. Standard Atmosphere, and the OPAC aerosol simulation software package. A combination of an OPO and an OPA has been chosen as the transmitter; the idler wavelength is used for probing, with the wavelengths tuned in sequence. A scanner mounted on top of the coaxially aligned laser and receiver is capable of covering almost 360 degrees horizontally and +/-30 degrees vertically. The detection is performed by a photovoltaic photodiode with 4-stage cooling, with signal digitization at 14-bit amplitude resolution and a 125 MS/s sampling rate. Here we present the development and the first test of the DIAL instrument.
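
    For reference, the retrieval behind such an instrument is the standard two-wavelength DIAL equation; a sketch of the number-density estimate for the range cell between R1 and R2 (symbol names follow the usual convention and are not specific to this instrument):

        import numpy as np

        def dial_number_density(p_on_r1, p_on_r2, p_off_r1, p_off_r2, dsigma, dr):
            """Mean number density (molecules/cm^3) in the cell [R1, R2].

            p_on_*, p_off_* : backscatter powers at the on/off wavelengths
            dsigma          : differential absorption cross section (cm^2)
            dr              : cell length R2 - R1 (cm)
            """
            return np.log((p_off_r2 * p_on_r1) /
                          (p_off_r1 * p_on_r2)) / (2.0 * dsigma * dr)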

  17. [Systematic review of studies on quality of life indexed on the SciELO database].

    PubMed

    Landeiro, Graziela Macedo Bastos; Pedrozo, Celine Cristina Raimundo; Gomes, Maria José; Oliveira, Elizabete Regina de Araújo

    2011-10-01

    Interest in the quality of life construct has increased in the same proportion as the output of instruments to measure it. In order to analyze the scientific literature on the subject and provide a reflection on this construct in Brazil, a systematic review of the SciELO database covering the period from January 2001 to December 2006 was conducted. It was divided into three phases: the first involving 180 publications, the second 124, and the third 10. Of the 180 publications, 77.4% were produced in the last three years of the period, with growth of 32.4% from 2001 to 2006. Of these, 124 were selected for methodological analysis and classified by category of study: 79 (63.9%) were instrument application articles; 25 (20.1%) concerned translation, validation, adaptation, or construction of a QOL instrument; 10 (8%) were qualitative studies on QOL; 5 (4%) were bibliographical reviews; and 5 (4%) addressed the quality of life concept. The next stage involved the use of questionnaires and/or interview scripts in order to obtain a broader consensus on perceived quality of life from the interviewees. There was significant scientific output in the period under scrutiny, with a diversification of approaches and methodologies, highlighting the complexity of the quality of life construct.

  18. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

    Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and to experiment repository data. The Orbiter thick- and thin-clients and the supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user-configurable interface. The primary Orbiter system goals are (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. Secure Orbiter SOA authentication and authorization are achieved through the Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts in 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services are currently available that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) full-text search of data repository NeXus file field/class names within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group-defined and shared metadata for data repository files, and (e) user, group, repository, and Web 2.0 based global positioning with additional service capabilities. The progress of integrating the SNS-based Orbiter SOA with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized, with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services and their best-practice implementations is presented.
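
    Role-based file access of the kind described reduces to checking a user's roles against the roles granted on a repository path; an illustrative sketch only, not the Orbiter VFS API:

        ROLE_PERMS = {"analyst": {"read"}, "instrument_team": {"read", "write"}}

        def can_access(user_roles, action, path_roles):
            # path_roles: the roles granted on this repository path
            return any(action in ROLE_PERMS.get(role, set())
                       for role in user_roles if role in path_roles)

        # e.g. a user holding only the hypothetical 'analyst' role may read but not write
        assert can_access({"analyst"}, "read", {"analyst", "instrument_team"})
        assert not can_access({"analyst"}, "write", {"analyst", "instrument_team"})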

  19. Barrow real-time sea ice mass balance data: ingestion, processing, dissemination and archival of multi-sensor data

    NASA Astrophysics Data System (ADS)

    Grimes, J.; Mahoney, A. R.; Heinrichs, T. A.; Eicken, H.

    2012-12-01

    Sensor data can be highly variable in nature and also varied depending on the physical quantity being observed, the sensor hardware, and the sampling parameters. The sea ice mass balance site (MBS) operated in Barrow by the University of Alaska Fairbanks (http://seaice.alaska.edu/gi/observatories/barrow_sealevel) is a multisensor platform consisting of a thermistor string, air and water temperature sensors, acoustic altimeters above and below the ice, and a humidity sensor. Each sensor has a unique specification and configuration, and the data from multiple sensors are combined to generate sea ice data products. For example, ice thickness is calculated from the positions of the upper and lower ice surfaces, which are determined using data from a downward-looking acoustic altimeter above the ice and an upward-looking one below it. As a data clearinghouse, the Geographic Information Network of Alaska (GINA) processes real-time data from many sources, including the Barrow MBS. Doing so requires a system that is easy to use, yet also offers the flexibility to handle data from multisensor observing platforms. In the case of the Barrow MBS, the metadata system needs to accommodate the addition of new sensors and the retirement of old ones from year to year, as well as instrument configuration changes caused by, for example, spring melt or inquisitive polar bears. We also require ease of use for both administrators and end users. Here we present the data and processing steps of a sensor data system, powered by the NoSQL storage engine MongoDB, that has been developed to ingest, process, disseminate, and archive data from the Barrow MBS. Storing sensor data from many different sources in a generalized format is a challenging task, especially for traditional SQL databases with a set schema. MongoDB is a NoSQL ('not only SQL') database that does not require a fixed schema, which gives it several advantages over traditional relational database management system (RDBMS) databases for this application. The lack of a required schema allows flexibility in how the data can be ingested. For example, MongoDB imposes no restrictions on field names: for researchers using the system, this means that the name they have chosen for a sensor is carried through the database, any processing, and the final output, helping to preserve data integrity. MongoDB also allows data to be pushed to it dynamically, meaning that field attributes can be defined at the point of ingestion. Any sensor record can therefore be ingested as a document, and this functionality carries through to the user interface, allowing greater adaptability to different use-case scenarios. In presenting the MongoDB data system being developed for the Barrow MBS, we demonstrate the versatility of this approach and its suitability as the foundation of a Barrow node of the Arctic Observing Network.
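
    A minimal sketch of schema-less ingestion with pymongo (the database, collection, and field names are invented examples; the point is that each sensor can carry its own fields without a migration):

        from datetime import datetime, timezone
        from pymongo import MongoClient

        readings = MongoClient()["barrow_mbs"]["sensor_readings"]

        # No fixed schema: each document carries whatever fields its sensor
        # produces, so sensors can be added or retired between seasons.
        readings.insert_one({
            "site": "barrow_mbs",
            "time": datetime.now(timezone.utc),
            "thermistor_string_degC": [-8.1, -7.6, -6.9, -5.2],
            "under_ice_altimeter_m": 1.42,
        })

        # Documents remain queryable by any field that happens to be present.
        for doc in readings.find({"under_ice_altimeter_m": {"$exists": True}}):
            print(doc["time"], doc["under_ice_altimeter_m"])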

  20. A review of instruments to measure interprofessional collaboration for chronic disease management for community-living older adults.

    PubMed

    Bookey-Bassett, Sue; Markle-Reid, Maureen; McKey, Colleen; Akhtar-Danesh, Noori

    2016-01-01

    It is acknowledged internationally that chronic disease management (CDM) for community-living older adults (CLOA) is an increasingly complex process. CDM for older adults, who are often living with multiple chronic conditions, requires coordination of various health and social services. Coordination is enabled through interprofessional collaboration (IPC) among individual providers, community organizations, and health sectors. Measuring IPC is complicated given there are multiple conceptualisations and measures of IPC. A literature review of several healthcare, psychological, and social science electronic databases was conducted to locate instruments that measure IPC at the team level and have published evidence of their reliability and validity. Five instruments met the criteria and were critically reviewed to determine their strengths and limitations as they relate to CDM for CLOA. A comparison of the characteristics, psychometric properties, and overall concordance of each instrument with salient attributes of IPC found the Collaborative Practice Assessment Tool to be the most appropriate instrument for measuring IPC for CDM in CLOA.

  1. Investigating the key indicators for evaluating post-disaster shelter.

    PubMed

    Nath, Ronita; Shannon, Harry; Kabali, Conrad; Oremus, Mark

    2017-07-01

    This study sought to identify the primary indicators for evaluating shelter assistance following natural disasters and then to develop a shelter evaluation instrument based on these indicators. Electronic databases and the 'grey' literature were scoured for publications with a relation to post-disaster shelter assistance. Indicators for evaluating such assistance were extracted from these publications. In total, 1,525 indicators were extracted from 181 publications. A preliminary evaluation instrument was designed from these 1,525 indicators. Shelter experts checked the instrument for face and content validity, and it was revised subsequently based on their input. The revised instrument comprises a version for use by shelter agencies (48 questions that assess 23 indicators) and a version for use by beneficiaries (52 questions that assess 22 indicators). The instrument can serve as a standardised tool to enable groups to gauge whether or not the shelter assistance that they supply meets the needs of disaster-affected populations. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.

  2. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment; it consists of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans, and Alfresco, together with system components previously developed at Tomsk State University's Faculty of Innovative Technology.

  3. Assessing Social Networks in Patients with Psychotic Disorders: A Systematic Review of Instruments.

    PubMed

    Siette, Joyce; Gulea, Claudia; Priebe, Stefan

    2015-01-01

    Evidence suggests that social networks of patients with psychotic disorders influence symptoms, quality of life and treatment outcomes. It is therefore important to assess social networks for which appropriate and preferably established instruments should be used. To identify instruments assessing social networks in studies of patients with psychotic disorders and explore their properties. A systematic search of electronic databases was conducted to identify studies that used a measure of social networks in patients with psychotic disorders. Eight instruments were identified, all of which had been developed before 1991. They have been used in 65 studies (total N of patients = 8,522). They assess one or more aspects of social networks such as their size, structure, dimensionality and quality. Most instruments have various shortcomings, including questionable inter-rater and test-retest reliability. The assessment of social networks in patients with psychotic disorders is characterized by a variety of approaches which may reflect the complexity of the construct. Further research on social networks in patients with psychotic disorders would benefit from advanced and more precise instruments using comparable definitions of and timescales for social networks across studies.

  4. An Expanded UV Irradiance Database from TOMS Including the Effects of Ozone, Clouds, and Aerosol Attenuation

    NASA Technical Reports Server (NTRS)

    Herman, J.; Krotkov, N.

    2003-01-01

    The TOMS UV irradiance database (1978 to 2003) has been expanded to include five new products (noon irradiance at 305, 310, 324, and 380 nm, and noon erythemal-weighted irradiance), in addition to the existing erythemal daily exposure, that permit direct comparisons with ground-based measurements from spectrometers and broadband instruments. The new data are available at http://toms.gsfc.nasa.gov. Comparisons of the TOMS estimated irradiances with ground-based instruments are given along with a review of the sources of known errors, especially the recent improvements in accounting for aerosol attenuation. Trend estimations from the new TOMS irradiances permit the clear separation of changes caused by ozone from those caused by aerosols and clouds. Systematic differences in cloud cover are shown to be the most important factor in determining regional differences in UV radiation reaching the ground for locations at the same latitude (e.g., the summertime differences between Australia and the US southwest).

  5. The measurement of collaboration within healthcare settings: a systematic review of measurement properties of instruments.

    PubMed

    Walters, Stephen John; Stern, Cindy; Robertson-Malt, Suzanne

    2016-04-01

    There is a growing call by consumers and governments for healthcare to adopt systems and approaches to care to improve patient safety. Collaboration within healthcare settings is an important factor for improving systems of care. By using validated measurement instruments, a standardized approach to assessing collaboration is possible; otherwise it is only an assumption that collaboration is occurring in any healthcare setting. The objective of this review was to evaluate and compare the measurement properties of instruments that measure collaboration within healthcare settings, specifically those which have been psychometrically tested and validated. Participants could be healthcare professionals, the patient, or any non-professional who contributes to a patient's care, for example, family members, chaplains, or orderlies. The term participant type means the designation of any one participant, for example 'nurse', 'social worker', or 'administrator'. The inclusion of more than two participant types was mandatory. The focus of this review was the validity of tools used to measure collaboration within healthcare settings. The types of studies considered for inclusion were validation studies, but quantitative study designs such as randomized controlled trials, controlled trials, and case studies were also eligible for inclusion. Studies that focused on Interprofessional Education, were published as an abstract only, contained patient self-reporting only, or were not about care delivery were excluded. The outcome of interest was the validation and interpretability of the instrument being assessed, including content validity, construct validity, and reliability. Interpretability is characterized by statistics, such as the mean and standard deviation, which can be translated into a qualitative meaning. The search strategy aimed to find both published and unpublished studies. A three-step search strategy was utilized in this review. The databases searched included PubMed, CINAHL, Embase, Cochrane Central Register of Controlled Trials, Emerald Fulltext, MD Consult Australia, PsycARTICLES, Psychology and Behavioural Sciences Collection, PsycINFO, Informit Health Databases, Scopus, UpToDate, and Web of Science. The search for unpublished studies included EThOS (Electronic Thesis Online Service), Index to Theses, and ProQuest Dissertations and Theses. The assessment of methodological quality of the included studies was undertaken using the COSMIN checklist, a validated tool that assesses the process of design and validation of healthcare measurement instruments. An Excel spreadsheet version of COSMIN was developed for data collection, which included a worksheet for extracting participant characteristics and interpretability data. Statistical pooling of data was not possible for this review; therefore, the findings are presented in narrative form, including tables and figures to aid in data presentation. To synthesize the assessments of methodological quality across the different studies, each instrument was rated by accounting for the number of studies performed with it, the appraisal of methodological quality, and the consistency of results between studies. Twenty-one studies of 12 instruments were included in the review. The studies were diverse in their theoretical underpinnings, target population/setting, and measurement objectives. Measurement objectives included: investigating beliefs, behaviors, attitudes, perceptions, and relationships associated with collaboration; measuring collaboration between different levels of care or within a multi-rater/target group; assessing collaboration across teams; or assessing internal participation of both teams and patients. Studies produced validity or interpretability data, but none of the studies assessed all validity and reliability properties. However, most of the included studies produced a factor structure or referred to a prior factor analysis. A narrative synthesis of the individual study factor structures was generated, consisting of nine headings: organizational settings, support structures, purpose and goals; communication; reflection on process; cooperation; coordination; role interdependence and partnership; relationships; newly created professional activities; and professional flexibility. Among the many instruments that measure collaboration within healthcare settings, the quality of each instrument varies; instruments are designed for specific populations and purposes, and are validated in various settings. Selecting an instrument requires careful consideration of the qualities of each. Therefore, referring to systematic reviews of measurement properties of instruments may be helpful to clinicians or researchers in instrument selection. Systematic reviews of measurement properties of instruments are valuable in aiding instrument selection. This systematic review may be useful in instrument selection for the measurement of collaboration within healthcare settings with a complex mix of participant types. Evaluating collaboration provides important information on the strengths and limitations of different healthcare settings and the opportunities for continuous improvement via any remedial actions initiated. Development of a tool that can be used to measure collaboration within teams of healthcare professionals and non-professionals is important for practice. The use of different statistical modelling techniques, such as Item Response Theory modelling and the translation of models into Computer Adaptive Tests, may prove useful. Measurement equivalence is an important consideration for future instrument development and validation. Further development of the COSMIN tool should include appraisal for measurement equivalence. Researchers developing and validating measurement tools should consider multi-method research designs.

  6. Environment/Health/Safety (EHS): Databases

    Science.gov Websites

    Hazard Documents Database; Biosafety Authorization System; CATS (Corrective Action Tracking System) (for findings 12/2005 to present); Chemical Management System; Electrical Safety; Ergonomics Database; Lessons Learned / Best Practices; REMS (Radiation Exposure Monitoring System); SJHA (Subcontractor Job Hazard Analysis) Database

  7. Cost of Lightning Strike Related Outages of Visual Navigational Aids at Airports in the United States

    NASA Astrophysics Data System (ADS)

    Rakas, J.; Nikolic, M.; Bauranov, A.

    2017-12-01

    Lightning storms are a serious hazard that can cause damage to vital human infrastructure. In aviation, lightning strikes cause outages to air traffic control equipment and facilities that result in major disruptions to the network, causing delays and financial costs measured in the millions of dollars. Failure of critical systems, such as Visual Navigational Aids (Visual NAVAIDS), is particularly dangerous since NAVAIDS are an essential part of landing procedures. Precision instrument approaches, used during poor visibility conditions, rely on several of these systems, and their failure leads to holding patterns and ultimately diversions to other airports. These disruptions lead to both ground and airborne delay. Accurate prediction of these outages and their costs is a key prerequisite for successful investment planning. The air traffic management and control sector needs accurate information to successfully plan maintenance and develop a more robust system under the threat of increasing lightning rates. To analyze the issue, we couple the Remote Monitoring and Logging System (RMLS) and Aviation System Performance Metrics (ASPM) databases to identify lightning-induced outages and connect them with weather conditions, demand, and landing runway to calculate the total delays induced by the outages, as well as the number of cancellations and diversions. The costs are then determined by calculating direct costs to aircraft operators and the cost of passengers' time for delays, cancellations, and diversions. The results indicate that 1) not all NAVAIDS are created equal, and 2) outside conditions matter. The cost of an outage depends on the importance of the failed system and the conditions that prevailed before, during, and after the failure. An outage that occurs during high demand and poor weather conditions is more likely to result in more delays and higher costs.

  8. [Artificial Intelligence in Drug Discovery].

    PubMed

    Fujiwara, Takeshi; Kamada, Mayumi; Okuno, Yasushi

    2018-04-01

    With the increase in data generated by analytical instruments, the application of artificial intelligence (AI) technology in the medical field has become indispensable. In particular, practical application of AI technology is strongly required in "genomic medicine" and "genomic drug discovery", which conduct medical practice and novel drug development based on individual genomic information. In our laboratory, we have been developing a database to integrate genome data and clinical information obtained by clinical genome analysis, and a computational support system for the clinical interpretation of variants using AI. In addition, with the aim of creating new therapeutic targets in genomic drug discovery, we have been working on the development of a binding-affinity prediction system for mutated proteins and drugs using molecular dynamics simulation on the supercomputer "Kei". We have also tackled problems in virtual drug screening: our AI technology has successfully generated a virtual compound library, and deep learning methods have enabled us to predict interactions between compounds and target proteins.

  9. Human Factors Considerations for Performance-Based Navigation

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Adams, Catherine A.

    2006-01-01

    A transition toward a performance-based navigation system is currently underway in both the United States and around the world. Performance-based navigation incorporates Area Navigation (RNAV) and Required Navigation Performance (RNP) procedures that do not rely on the location of ground-based navigation aids. These procedures offer significant benefits to both operators and air traffic managers. Under sponsorship from the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA) has undertaken a project to document human factors issues that have emerged during RNAV and RNP operations and propose areas for further consideration. Issues were found to include aspects of air traffic control and airline procedures, aircraft systems, and procedure design. Major findings suggest the need for human factors-specific instrument procedure design guidelines. Ongoing industry and government activities to address air-ground communication terminology, procedure design improvements, and chart-database commonality are strongly encouraged.

  10. Microcontroller-based real-time QRS detection.

    PubMed

    Sun, Y; Suppappola, S; Wrublewski, T A

    1992-01-01

    The authors describe the design of a system for real-time detection of QRS complexes in the electrocardiogram based on a single-chip microcontroller (Motorola 68HC811). A systematic analysis of the instrumentation requirements for QRS detection and of the various design techniques is also given. Detection algorithms using different nonlinear transforms for the enhancement of QRS complexes are evaluated by using the ECG database of the American Heart Association. The results show that the nonlinear transform involving multiplication of three adjacent, sign-consistent differences in the time domain gives a good performance and a quick response. When implemented with an appropriate sampling rate, this algorithm is also capable of rejecting pacemaker spikes. The eight-bit single-chip microcontroller provides sufficient throughput and shows a satisfactory performance. Implementation of multiple detection algorithms in the same system improves flexibility and reliability. The low chip count in the design also favors maintainability and cost-effectiveness.
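    The nonlinear transform described, multiplication of three adjacent sign-consistent differences, can be sketched as follows (one plausible reading of the method, not the authors' implementation):

    ```python
    import numpy as np

    def qrs_transform(ecg):
        # Product of three adjacent first differences, kept only where
        # all three agree in sign (a steep monotonic slope, as in a QRS
        # upstroke); zero elsewhere. Isolated noise spikes and pacemaker
        # artifacts rarely produce three sign-consistent differences.
        d = np.diff(np.asarray(ecg, dtype=float))
        d0, d1, d2 = d[:-2], d[1:-1], d[2:]
        consistent = (np.sign(d0) == np.sign(d1)) & (np.sign(d1) == np.sign(d2))
        return np.where(consistent, np.abs(d0 * d1 * d2), 0.0)
    ```

    A QRS complex would then be flagged wherever the transform output exceeds an adaptive threshold, for example a fraction of its recent peak value.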

  11. LIMS for Lasers 2015 for achieving long-term accuracy and precision of δ2H, δ17O, and δ18O of waters using laser absorption spectrometry

    USGS Publications Warehouse

    Coplen, Tyler B.; Wassenaar, Leonard I

    2015-01-01

    Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. METHODS: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. RESULTS: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2H (VSMOW) measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10 % variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. CONCLUSIONS: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ2H, δ17O, and δ18O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and perform long-term QA/QC audits easily.
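    Of the corrections listed, the VSMOW-SLAP normalization is the simplest to illustrate. The sketch below is not the LIMS for Lasers code; it shows only the standard two-point linear stretch defined by measurements of the two reference waters, using the accepted SLAP δ2H value of -428.0 ‰:

    ```python
    # Two-point VSMOW-SLAP normalization for one isotope (here δ2H).
    # Accepted values: VSMOW = 0.0 ‰ and SLAP δ2H = -428.0 ‰.
    def vsmow_slap_normalize(delta_measured, vsmow_measured, slap_measured,
                             slap_accepted=-428.0):
        # Linear stretch mapping the measured reference values onto their
        # accepted values; applied after memory and drift corrections.
        slope = slap_accepted / (slap_measured - vsmow_measured)
        return (delta_measured - vsmow_measured) * slope

    # e.g. an instrument that read VSMOW at +1.2 ‰ and SLAP at -425.0 ‰:
    print(vsmow_slap_normalize(30.5, 1.2, -425.0))  # ≈ +29.4 ‰
    ```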

  12. MSATT Workshop on Innovative Instrumentation for the In Situ Study of Atmosphere-Surface Interactions on Mars

    NASA Technical Reports Server (NTRS)

    Fegley, Bruce, Jr. (Editor); Waenke, Heinrich (Editor)

    1992-01-01

    Papers accepted for the Mars Surface and Atmosphere Through Time (MSATT) Workshop on Innovative Instruments for the In Situ Study of Atmosphere-Surface Interaction of Mars, 8-9 Oct. 1992 in Mainz, Germany are included. Topics covered include: a backscatter Moessbauer spectrometer (BaMS) for use on Mars; database of proposed payloads and instruments for SEI missions; determination of martian soil mineralogy and water content using the Thermal Analyzer for Planetary Soils (TAPS); in situ identification of the martian surface material and its interaction with the martian atmosphere using DTA/GC; mass spectrometer-pyrolysis experiment for atmospheric and soil sample analysis on the surface of Mars; and optical luminescence spectroscopy as a probe of the surface mineralogy of Mars.

  13. The CCD/Transit Instrument (CTI) data-analysis system

    NASA Technical Reports Server (NTRS)

    Cawson, M. G. M.; Mcgraw, J. T.; Keane, M. J.

    1995-01-01

    The automated software system for archiving, analyzing, and interrogating data from the CCD/Transit Instrument (CTI) is described. The CTI collects up to 450 Mbytes of image-data each clear night in the form of a narrow strip of sky observed in two colors. The large data-volumes and the scientific aims of the project make it imperative that the data are analyzed within the 24-hour period following the observations. To this end a fully automatic and self-evaluating software system has been developed. The data are collected from the telescope in real-time and then transported to Tucson for analysis. Verification is performed by visual inspection of random subsets of the data, and obvious cosmic rays are detected and removed before permanent archival to optical disc. The analysis phase is performed by a pair of linked algorithms, one operating on the absolute pixel-values and the other on the spatial derivative of the data. In this way both isolated and merged images are reliably detected in a single pass. In order to isolate the latter algorithm from the effects of noise spikes, a 3x3 Hanning filter is applied to the raw data before the analysis is run. The algorithms reduce the input pixel-data to a database of measured parameters for each image which has been found. A contrast filter is applied in order to assign a detection-probability to each image, and then x-y calibration and intensity calibration are performed using known reference stars in the strip. These are supplemented as necessary by secondary standards bootstrapped from the CTI data itself. The final stages involve merging the new data into the CTI Master-list and History-list and the automatic comparison of each new detection with a set of pre-defined templates in parameter-space to find interesting objects such as supernovae, quasars and variable stars. Each stage of the processing, from verification to interesting-image selection, is performed under a data-logging system which both controls the pipe-lining of data through the system and records key performance-monitor parameters built into the software. Furthermore, the data from each stage are stored in databases to facilitate evaluation, and all stages offer the facility to enter keyword-indexed free-format text into the data-logging system. In this way a large measure of certification is built into the system to provide the necessary confidence in the end results.
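    The 3x3 Hanning pre-filter mentioned above is a small separable smoothing kernel. A sketch, assuming the conventional Hanning weights (1/4, 1/2, 1/4) along each axis; the CTI pipeline's exact kernel may differ:

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    # 3x3 Hanning kernel: the outer product of the 1-D weights
    # [1/4, 1/2, 1/4], i.e. (1/16) * [[1,2,1],[2,4,2],[1,2,1]].
    w1d = np.array([0.25, 0.5, 0.25])
    kernel = np.outer(w1d, w1d)

    def presmooth(strip):
        # Spread a single-pixel noise spike over its neighbours so the
        # derivative-based detection pass is not triggered by it, while
        # extended stellar images are only slightly broadened.
        return convolve(np.asarray(strip, dtype=float), kernel, mode="nearest")
    ```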

  14. Adaptation of a nursing home culture change research instrument for frontline staff quality improvement use.

    PubMed

    Hartmann, Christine W; Palmer, Jennifer A; Mills, Whitney L; Pimentel, Camilla B; Allen, Rebecca S; Wewiorski, Nancy J; Dillon, Kristen R; Snow, A Lynn

    2017-08-01

    Enhanced interpersonal relationships and meaningful resident engagement in daily life are central to nursing home cultural transformation, yet these critical components of person-centered care may be difficult for frontline staff to measure using traditional research instruments. To address the need for easy-to-use instruments to help nursing home staff members evaluate and improve person-centered care, the psychometric method of cognitive-based interviewing was used to adapt a structured observation instrument originally developed for researchers and nursing home surveyors. Twenty-eight staff members from 2 Veterans Health Administration (VHA) nursing homes participated in 1 of 3 rounds of cognitive-based interviews, using the instrument in real-life situations. Modifications to the original instrument were guided by a cognitive processing model of instrument refinement. Following 2 rounds of cognitive interviews, pretesting of the revised instrument, and another round of cognitive interviews, the resulting set of 3 short instruments mirrored the concepts of the original longer instrument but were significantly easier for frontline staff to understand and use. Final results indicated frontline staff found the revised instruments feasible to use and clinically relevant in measuring and improving the lived experience of a changing culture. This article provides a framework for developing or adapting other measurement tools for frontline culture change efforts in nursing homes, in addition to reporting on a practical set of instruments to measure aspects of person-centered care. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. The PEACE project review of clinical instruments for hospice and palliative care.

    PubMed

    Hanson, Laura C; Scheunemann, Leslie P; Zimmerman, Sheryl; Rokoske, Franziska S; Schenck, Anna P

    2010-10-01

    Hospice and palliative care organizations are expanding their use of standardized instruments and other approaches to measure quality. We undertook a systematic review and evaluation of published patient-level instruments for potential application in hospice and palliative care clinical quality measurement. We searched prior reviews and computerized reference databases from 1990 through February 2007 for studies of instruments relevant to physical, psychological, social, cultural, spiritual, or ethical aspects of palliative care, or measuring prognosis, function or continuity of care. Publications were selected for full review if they provided evidence of psychometric properties or practical application of an instrument tested in or appropriate for a hospice or palliative care population. Selected instruments were evaluated and scored for scientific soundness and potential application in clinical quality measurement. The search found 1427 publications, with 229 selected for full manuscript review. Manuscripts provided information on 129 instruments which were evaluated using a structured scoring guide for psychometric properties. Thirty-nine instruments scoring near or above the 75th percentile were recommended. Most instruments covered multiple domains or focused on care for physical symptoms, psychological or social aspects of care. Few instruments were available to measure cultural aspects of care, structure and process of care, and continuity of care. Numerous patient-level instruments are available to measure physical, psychological and social aspects of palliative care with adequate evidence for scientific soundness and practical clinical use for quality improvement and research. Other aspects of palliative care may benefit from further instrument development research.

  16. Instrumental motives in negative emotion regulation in daily life: Frequency, consistency, and predictors.

    PubMed

    Kalokerinos, Elise K; Tamir, Maya; Kuppens, Peter

    2017-06-01

    People regulate their emotions not only for hedonic reasons but also for instrumental reasons, to attain the potential benefits of emotions beyond pleasure and pain. However, such instrumental motives have rarely been examined outside the laboratory as they naturally unfold in daily life. To assess whether and how instrumental motives operate outside the laboratory, it is necessary to examine them in response to real and personally relevant stimuli in ecologically valid contexts. In this research, we assessed the frequency, consistency, and predictors of instrumental motives in negative emotion regulation in daily life. Participants (N = 114) recalled the most negative event of their day each evening for 7 days and reported their instrumental motives and negative emotion goals in that event. Participants endorsed performance motives in approximately 1 in 3 events and social, eudaimonic, and epistemic motives in approximately 1 in 10 events. Instrumental motives had substantially higher within- than between-person variance, indicating that they were context-dependent. Indeed, although we found few associations between instrumental motives and personality traits, relationships between instrumental motives and contextual variables were more extensive. Performance, social, and epistemic motives were each predicted by a unique pattern of contextual appraisals. Our data demonstrate that instrumental motives play a role in daily negative emotion regulation as people encounter situations that pose unique regulatory demands. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Cognitive assessment instruments in Parkinson's disease patients undergoing deep brain stimulation

    PubMed Central

    Romann, Aline Juliane; Dornelles, Silvia; Maineri, Nicole de Liz; Rieder, Carlos Roberto de Mello; Olchik, Maira Rozenfeld

    2012-01-01

    Deep Brain Stimulation (DBS) is a widely used surgical technique in individuals with Parkinson's disease (PD) that can lead to significant reductions in motor symptoms. Objectives To determine, from publications, the most commonly used instruments for cognitive evaluation of individuals with PD undergoing DBS. Methods A systematic review of the databases: PubMed, Medline, EBECS, Scielo and LILACS was conducted, using the descriptors "Deep Brain Stimulation", "Verbal Fluency", "Parkinson Disease", "Executive Function", "Cognition" and "Cognitive Assessment" in combination. Results The Verbal Fluency test was found to be the most used instrument for this investigation in the studies, followed by the Boston Naming Test. References to the Stroop Test, Trail Making Test, and Rey's Auditory Verbal Learning Test were also found. Conclusions The validation of instruments for this population is needed as is the use of batteries offering greater specificity and sensitivity for the detection of cognitive impairment. PMID:29213766

  18. Component-Level Electronic-Assembly Repair (CLEAR) System Architecture

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Bradish, Martin A.; Juergens, Jeffrey R.; Lewis, Michael J.; Vrnak, Daniel R.

    2011-01-01

    This document captures the system architecture for a Component-Level Electronic-Assembly Repair (CLEAR) capability needed for electronics maintenance and repair of the Constellation Program (CxP). CLEAR is intended to improve flight system supportability and reduce the mass of spares required to maintain the electronics of human-rated spacecraft on long-duration missions. By necessity, it allows the crew to make repairs that would otherwise be performed by Earth-based repair depots. Because of the practical knowledge and skill limitations of small spaceflight crews, they must be augmented by Earth-based support crews and automated repair equipment. This system architecture covers the complete system from ground user to flight hardware and flight crew and defines an Earth segment and a Space segment. The Earth segment involves database management, operational planning, and remote equipment programming and validation processes. The Space segment involves the automated diagnostic, test, and repair equipment required for a complete repair process. This document defines three major subsystems: tele-operations that link the flight hardware to ground support, highly reconfigurable diagnostics and test instruments, and a CLEAR Repair Apparatus that automates the physical repair process.

  19. Reliability of robotic system during general surgical procedures in a university hospital.

    PubMed

    Buchs, Nicolas C; Pugin, François; Volonté, Francesco; Morel, Philippe

    2014-01-01

    Data concerning the reliability of robotic systems are scarce, especially for general surgery. The aim of this study was to assess the incidence and consequences of robotic malfunction in a teaching institution. From January 2006 to September 2012, 526 consecutive robotic general surgical procedures were performed. All failures were prospectively recorded in a computerized database and reviewed retrospectively. Robotic malfunctions occurred in 18 cases (3.4%). These dysfunctions concerned the robotic instruments in 9 cases, the robotic arms in 4 cases, the surgical console in 3 cases, and the optical system in 2 cases. Two malfunctions were considered critical, and 1 led to a laparoscopic conversion (conversion rate due to malfunction, .2%). Overall, there were more dysfunctions at the beginning of the study period (2006 to 2010) than more recently (2011 to 2012) (4.2% vs 2.6%, P = .35). The robotic system malfunction rate was low. Most malfunctions could be resolved during surgery, allowing the procedures to be completed safely. With increased experience, the system malfunction rate seems to be reduced. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. A Relational Database System for Student Use.

    ERIC Educational Resources Information Center

    Fertuck, Len

    1982-01-01

    Describes an APL implementation of a relational database system suitable for use in a teaching environment in which database development and database administration are studied, and discusses the functions of the user and the database administrator. An appendix illustrating system operation and an eight-item reference list are attached. (Author/JL)

  1. A systematic review of substance misuse assessment packages.

    PubMed

    Sweetman, Jennifer; Raistrick, Duncan; Mdege, Noreen D; Crosby, Helen

    2013-07-01

    Health-care systems globally are moving away from process measures of performance towards payments for outcomes achieved. It follows that there is a need for a selection of proven, quality tools suitable for undertaking comprehensive assessments and outcome assessments. This review aimed to identify and evaluate existing comprehensive assessment packages. The work is part of a national program in the UK, Collaborations in Leadership of Applied Health Research and Care. Systematic searches were carried out across major databases to identify instruments designed to assess substance misuse. For those instruments identified, searches were carried out using the Cochrane Library, Embase, Ovid MEDLINE(®) and PsycINFO to identify articles reporting psychometric data. From 595 instruments, six met the inclusion criteria: Addiction Severity Index; Chemical Use, Abuse and Dependence Scale; Form 90; Maudsley Addiction Profile; Measurements in the Addictions for Triage and Evaluation; and Substance Abuse Outcomes Module. The most common reasons for exclusion were that instruments were: (i) designed for a specific substance (239); (ii) not designed for use in addiction settings (136); (iii) not providing comprehensive assessment (89); and (iv) not suitable as an outcome measure (20). The six packages are very different and suited to different uses. No package had an adequate evaluation of its properties, so the emphasis should be on refining a small number of tools with very general application rather than creating new ones. An alternative to using 'off-the-shelf' packages is to create bespoke packages from well-validated, single-construct scales. © 2013 Australasian Professional Society on Alcohol and other Drugs.

  2. An automated calibration laboratory - Requirements and design approach

    NASA Technical Reports Server (NTRS)

    O'Neil-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  3. James Webb Space Telescope XML Database: From the Beginning to Today

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Fatig, Curtis C.

    2005-01-01

    The James Webb Space Telescope (JWST) Project has been defining, developing, and exercising the use of a common eXtensible Markup Language (XML) for the command and telemetry (C&T) database structure. JWST is the first large NASA space mission to use XML for databases. The JWST project started developing the concepts for the C&T database in 2002. The database will need to last at least 20 years, since it will be used beginning with flight software development, continuing through Observatory integration and test (I&T), and through operations. Also, a database tool kit has been provided to the 18 flight software development laboratories located in the United States, Europe, and Canada, allowing local users to create their own databases. Recently the JWST Project has been working with the Jet Propulsion Laboratory (JPL) and Object Management Group (OMG) XML Telemetry and Command Exchange (XTCE) personnel to provide all the information needed by JWST and JPL for exchanging database information using a standard XML structure. The lack of standardization requires custom ingest scripts for each ground system segment, increasing the cost of the total system. Providing a non-proprietary standard for the telemetry and command database definition format will allow dissimilar systems to communicate without the need for expensive mission-specific database tools and testing of the systems after database translation. The ground system components that would benefit from a standardized database are the telemetry and command systems, archives, simulators, and trending tools. JWST has successfully exchanged the XML database with the Eclipse, EPOCH, and ASIST ground systems, the Portable Spacecraft Simulator (PSS), a front-end system, and the Integrated Trending and Plotting System (ITPS). This paper discusses how JWST decided to use XML, the barriers to a new concept, experiences utilizing the XML structure, exchanging databases with other users, and issues encountered in creating databases for the C&T system.
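    The interoperability benefit is easy to see in code: any tool that can parse the shared XML applies the same parameter definitions. The fragment below reads a toy definition with Python's standard library; the element and attribute names are invented for illustration and are not the JWST C&T schema or the XTCE standard:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical C&T database entry: one telemetry parameter with a
    # linear calibration and alarm limits.
    doc = """
    <telemetryDatabase>
      <parameter name="BATT_VOLT" type="float">
        <conversion slope="0.02" offset="-5.0"/>
        <limits low="24.0" high="32.0"/>
      </parameter>
    </telemetryDatabase>
    """

    root = ET.fromstring(doc)
    for p in root.iter("parameter"):
        conv = p.find("conversion")
        # Every ground system segment parsing this file recovers the
        # identical calibration, with no custom ingest script.
        print(p.get("name"), float(conv.get("slope")), float(conv.get("offset")))
    ```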

  4. Systematic review of measurement properties of self-reported instruments for evaluating self-care in adults.

    PubMed

    Matarese, Maria; Lommi, Marzia; De Marinis, Maria Grazia

    2017-06-01

    The aims of this study were to identify instruments developed to assess self-care in healthy adults; to determine the theory on which they were based and their validity and reliability properties; and to synthesize the evidence on their measurement properties. Many instruments have been developed to assess self-care in different populations and conditions, and clinicians and researchers should select the most appropriate self-care instrument based on knowledge of its measurement properties. We conducted a systematic review of measurement instruments according to the protocol recommended by the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) panel. The PubMed, Embase, PsycINFO, Scopus, and CINAHL databases were searched from inception to December 2015. Studies testing measurement properties of self-report instruments assessing self-care in healthy adults, published in English in peer-reviewed journals, were selected. Two reviewers independently appraised the methodological quality of the studies with the COSMIN checklist and the quality of results using specific quality criteria. Twenty-six articles testing the measurement properties of nine instruments were included in the review. Seven instruments were based on Orem's self-care theory. Not all measurement properties were evaluated for the identified instruments, and no self-care instrument showed strong evidence supporting the evaluated measurement properties. Despite the development of several instruments to assess self-care in the adult population, no instrument can be fully recommended to clinical nurses and researchers. Further studies of high methodological quality are needed to confirm the measurement properties of these instruments. © 2016 John Wiley & Sons Ltd.

  5. Database Access Systems.

    ERIC Educational Resources Information Center

    Dalrymple, Prudence W.; Roderer, Nancy K.

    1994-01-01

    Highlights the changes that have occurred from 1987-93 in database access systems. Topics addressed include types of databases, including CD-ROMs; enduser interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…

  6. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  7. 78 FR 28756 - Defense Federal Acquisition Regulation Supplement: System for Award Management Name Changes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 0 3. Revise the...

  8. HIPPO Experiment Data Access and Subsetting System

    NASA Astrophysics Data System (ADS)

    Krassovski, Misha; Hook, Les; Christensen, Sigurd; Boden, Tom

    2014-05-01

    HIAPER Pole-to-Pole Observations (HIPPO) was an NSF- and NOAA-funded, multi-year global airborne research project to survey the latitudinal and vertical distribution of greenhouse and related gases, and aerosols. Project scientists and support staff flew five month-long missions over the Pacific Basin on the NSF/NCAR Gulfstream V, High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) aircraft between January 2009 and September 2011, spread throughout the annual cycle, from the surface to 14 km in altitude, and from 87°N to 67°S. The landmark study resulted in an extensive, highly detailed dataset of over 90 atmospheric species, from six categories, all with navigation and atmospheric structure data, including greenhouse gases and carbon cycle gases; ozone and water vapor; black carbon and aerosols; ozone-depleting substances and their replacements; light hydrocarbons and PAN; and sulfur gases/ocean-derived gases. A suite of specialized instruments on the aircraft made high-rate measurements as the plane flew, while several whole air samplers collected flasks of air for later analysis in laboratories around the U.S. Flights were conducted in a continuously profiling mode, with the aircraft alternately climbing or descending as it flew from its home base in Broomfield, Colorado north to Alaska and the Arctic, south down the middle of the Pacific Ocean to New Zealand and the Southern Ocean near Antarctica, and then back to the Arctic a second time before returning home. In all, the aircraft made 64 flights and flew 787 vertical profiles while covering 285,000 km. Instruments collected 434 hours of high-rate continuous measurements, and 4,235 flask samples were collected during the five HIPPO missions. Data from the HIPPO study of greenhouse gases and aerosols are now available to the atmospheric research community and the public. This comprehensive dataset provides the first high-resolution vertically resolved measurements of over 90 unique atmospheric species from nearly pole-to-pole over the Pacific Ocean across all seasons. The suite of atmospheric trace gases and aerosols is pertinent to understanding the carbon cycle and challenging global climate models. This dataset will provide opportunities for research across a broad spectrum of Earth sciences, including those analyzing the evolution in time and space of the greenhouse gases that affect global climate. The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) provides data management support for the HIPPO experiment, including long-term data storage and dissemination. CDIAC has developed a relational database to house HIPPO merged 10-second meteorology, atmospheric chemistry, and aerosol data. The data set provides measurements from all missions (1 through 5), which took place from January 2009 to September 2011. This presentation introduces the newly built database and web interface, reflecting the present state and functionality of the HIPPO Database and Exploration System as well as future plans for expansion and inclusion of combined discrete flask and GC sample GHG, halocarbon, and hydrocarbon data.
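    A relational layout of the merged 10-second data makes such subsetting straightforward to express. The sketch below uses an invented table and column layout (not CDIAC's actual schema) to show a typical request, all Mission 1 observations above 5 km:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE merged_10s (
        mission INTEGER, flight INTEGER, utc TEXT,
        lat REAL, lon REAL, alt_m REAL, co2_ppm REAL)""")
    # Illustrative sample row, not a real HIPPO observation.
    con.execute("INSERT INTO merged_10s VALUES "
                "(1, 3, '2009-01-12T02:15:30Z', 21.4, -157.9, 6200.0, 386.2)")

    # Subset request: Mission 1 observations above 5 km altitude.
    rows = con.execute(
        "SELECT utc, lat, lon, co2_ppm FROM merged_10s "
        "WHERE mission = 1 AND alt_m > 5000").fetchall()
    print(rows)
    ```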

  9. Mega-precovery and data mining of near-Earth asteroids and other Solar System objects

    NASA Astrophysics Data System (ADS)

    Popescu, M.; Vaduvescu, O.; Char, F.; Curelaru, L.; Euronear Team

    2014-07-01

    The vast collection of CCD images and photographic plate archives available from world-wide archives and telescopes is still insufficiently exploited. Within the EURONEAR project we designed two data-mining tools to search very large collections of archives for images which serendipitously include known asteroids or comets in their field, with the main aims of extending the observed arcs and improving the orbits. In this sense, ''Precovery'' (published in 2008, aiming to search all known NEAs in a few archives via IMCCE's SkyBoT server) and ''Mega-Precovery'' (published in 2010, querying the IMCCE's Miriade server) were made available to the community via the EURONEAR website (euronear.imcce.fr). Briefly, Mega-Precovery aims to search one or a few known asteroids or comets in a mega-collection including millions of images from some of the largest observatory archives: ESO (15 instruments served by the ESO Archive, including VLT), NVO (8 instruments served by the U.S. NVO Archive), CADC (11 instruments, including HST and Gemini), plus other important instrument archives: SDSS, CFHTLS, INT-WFC, Subaru-SuprimeCam and AAT-WFI, adding together 39 instruments and 4.3 million images (Mar 2014), and our Mega-Archive is growing. Here we present some of the most important results obtained with our data-mining software and some new planned search options for Mega-Precovery. In particular, the following capabilities will be added soon: the ING archive (all imaging cameras) will be included, and new search options will be made available (such as query by orbital elements and by observations) to be able to target new Solar System objects such as Virtual Impactors, bolides, planetary satellites, and TNOs (besides the comets added recently). In order to better characterize the archives, we introduce the ''AΩA'' factor (archival etendue), proportional to the AΩ (etendue) and the number of images in an archive. With the aim of enlarging the Mega-Archive database, we invite observatories (particularly those storing their images online, and also those that own plate archives which could be scanned on request) to contact us in order to add their instrument archives (consisting of an ASCII file with telescope pointings in a simple format) to our Mega-Precovery open project. We intend in the future to synchronise our service with the Virtual Observatory.
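    At the heart of such data mining is a per-image cone search against an ephemeris service, asking which known objects fall inside an archival image's footprint at its observation epoch. The fragment below illustrates the idea in Python; the SkyBoT endpoint URL and parameter names are assumptions for illustration and should be checked against the current IMCCE documentation:

    ```python
    import requests

    def known_objects_in_field(ra_deg, dec_deg, epoch_jd, radius_deg=0.5):
        # Endpoint and parameter names are assumed, not verified against
        # the current SkyBoT interface.
        params = {"-ep": epoch_jd, "-ra": ra_deg, "-dec": dec_deg,
                  "-rd": radius_deg, "-mime": "text", "-loc": "500"}
        r = requests.get("http://vo.imcce.fr/webservices/skybot/"
                         "skybotconesearch_query.php", params=params, timeout=30)
        r.raise_for_status()
        return r.text  # one record per known asteroid/comet in the field
    ```

    Run over millions of archive pointings, hits from such a query can then be measured on the original images to extend each object's observed arc.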

  10. Psychosocial measures used to assess the effectiveness of school-based nutrition education programs: review and analysis of self-report instruments for children 8 to 12 years old.

    PubMed

    Hernández-Garbanzo, Yenory; Brosh, Joanne; Serrano, Elena L; Cason, Katherine L; Bhattarai, Ranju

    2013-01-01

    To identify the psychometric properties of evaluation instruments that measure mediators of dietary behaviors in school-aged children. Systematic search of scientific databases limited to 1999-2010. Psychometric properties related to development and testing of self-report instruments for children 8-12 years old. Systematic search of 189 articles and review of 15 instruments (20 associated articles) meeting the inclusion criteria. Search terms used included children, school, nutrition, diet, nutrition education, and evaluation. Fourteen studies used a theoretical framework to guide the instrument's development. Knowledge and self-efficacy were the most commonly used psychosocial measures. Twelve instruments focused on specific nutrition-related behaviors. Eight instruments included over 40 items and used age-appropriate response formats. Acceptable reliability properties were most commonly reported for attitude and self-efficacy measures. Although most of the instruments were reviewed by experts (n = 8) and/or pilot-tested (n = 9), only 7 were tested using both rigorous types of validity and with low-income youth. Results from this review suggest that additional research is needed to develop more robust psychosocial measures for dietary behaviors, for low-income youth audiences. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  11. Development of a Homogenous Database of Bipolar Active Regions Spanning Four Cycles

    NASA Astrophysics Data System (ADS)

    Munoz-Jaramillo, A.; Werginz, Z. A.; Vargas-Acosta, J. P.; DeLuca, M. D.; Vargas-Dominguez, S.; Lamb, D. A.; DeForest, C. E.; Longcope, D. W.; Martens, P.

    2016-12-01

    The solar cycle can be understood as a process that alternates the large-scale magnetic field of the Sun between poloidal and toroidal configurations. Although the process that transitions the solar cycle between toroidal and poloidal phases is still not fully understood, theoretical studies and observational evidence suggest that this process is driven by the emergence and decay of bipolar magnetic regions (BMRs) at the photosphere. Furthermore, the emergence of BMRs at the photosphere is the main driver behind solar variability and solar activity in general, making the study of their properties doubly important for heliospheric physics. However, in spite of their critical role, there is still no unified catalog of BMRs spanning multiple instruments and covering the entire period of systematic measurement of the solar magnetic field (i.e. 1975 to present). In this presentation we discuss an ongoing project to address this deficiency by applying our Bipolar Active Region Detection (BARD) code to full-disk magnetograms measured by the 512 (1975-1993) and SPMG (1992-2003) instruments at the Kitt Peak Vacuum Telescope (KPVT), SOHO/MDI (1996-2011), and SDO/HMI (2010-present). First we will discuss the results of our revitalization of the 512 and SPMG KPVT data, then we will discuss how our BARD code operates, and finally we will report the results of our cross-calibration across instruments. The corrected and improved KPVT magnetograms will be made available through the National Solar Observatory (NSO) and Virtual Solar Observatory (VSO), including updated synoptic maps produced by running the corrected KPVT magnetograms through the SOLIS pipeline. The homogeneous active region database will be made public by the end of 2017, once it has reached a satisfactory level of quality and maturity. The Figure shows all bipolar active regions present in our database (as of Aug 2016), colored according to the instrument with which they were detected. The image also includes the names of the NSF-REU students in charge of the supervision of the detection algorithm and the year in which they worked on the catalog. Marker size is indicative of the total active region flux.

  12. HIPPO Experiment Data Access and Subsetting System

    NASA Astrophysics Data System (ADS)

    Krassovski, M.; Hook, L.; Boden, T.

    2014-12-01

    HIAPER Pole-to-Pole Observations (HIPPO) was an NSF- and NOAA-funded, multi-year global airborne research project to survey the latitudinal and vertical distribution of greenhouse and related gases, and aerosols. Project scientists and support staff flew five month-long missions over the Pacific Basin on the NSF/NCAR Gulfstream V, High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) aircraft between January 2009 and September 2011, spread throughout the annual cycle, from the surface to 14 km in altitude, and from 87°N to 67°S. Data from the HIPPO study of greenhouse gases and aerosols are now available to the atmospheric research community and the public. This comprehensive dataset provides the first high-resolution vertically resolved measurements of over 90 unique atmospheric species from nearly pole-to-pole over the Pacific Ocean across all seasons. The suite of atmospheric trace gases and aerosols is pertinent to understanding the carbon cycle and challenging global climate models. This dataset will provide opportunities for research across a broad spectrum of Earth sciences, including those analyzing the evolution in time and space of the greenhouse gases that affect global climate. The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) provides data management support for the HIPPO experiment, including long-term data storage and dissemination. CDIAC has developed a relational database to house HIPPO merged 10-second meteorology, atmospheric chemistry, and aerosol data. The data set provides measurements from all missions (1 through 5), which took place from January 2009 to September 2011. This presentation introduces the newly built database and web interface, reflecting the present state and functionality of the HIPPO Database and Exploration System as well as future plans for expansion and inclusion of combined discrete flask and GC sample GHG, halocarbon, and hydrocarbon data.

  13. StatsDB: platform-agnostic storage and understanding of next generation sequencing run metrics

    PubMed Central

    Ramirez-Gonzalez, Ricardo H.; Leggett, Richard M.; Waite, Darren; Thanki, Anil; Drou, Nizar; Caccamo, Mario; Davey, Robert

    2014-01-01

    Modern sequencing platforms generate enormous quantities of data in ever-decreasing amounts of time. Additionally, techniques such as multiplex sequencing allow one run to contain hundreds of different samples. With such data comes a significant challenge to understand its quality and to understand how the quality and yield are changing across instruments and over time. As well as the desire to understand historical data, sequencing centres often have a duty to provide clear summaries of individual run performance to collaborators or customers. We present StatsDB, an open-source software package for storage and analysis of next generation sequencing run metrics. The system has been designed for incorporation into a primary analysis pipeline, either at the programmatic level or via integration into existing user interfaces. Statistics are stored in an SQL database and APIs provide the ability to store and access the data while abstracting the underlying database design. This abstraction allows simpler, wider querying across multiple fields than is possible by the manual steps and calculation required to dissect individual reports, e.g. "provide metrics about nucleotide bias in libraries using adaptor barcode X, across all runs on sequencer A, within the last month". The software is supplied with modules for storage of statistics from FastQC, a commonly used tool for analysis of sequence reads, but the open nature of the database schema means it can be easily adapted to other tools. Currently at The Genome Analysis Centre (TGAC), reports are accessed through our LIMS system or through a standalone GUI tool, but the API and supplied examples make it easy to develop custom reports and to interface with other packages. PMID:24627795
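    The quoted query maps naturally onto SQL once runs, libraries, and metrics are held in separate tables. The schema below is invented for illustration; StatsDB's actual schema and API differ:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE runs (run_id INTEGER PRIMARY KEY, instrument TEXT, run_date TEXT);
    CREATE TABLE libraries (lib_id INTEGER PRIMARY KEY, run_id INTEGER, barcode TEXT);
    CREATE TABLE metrics (lib_id INTEGER, name TEXT, position INTEGER, value REAL);
    """)

    # "Nucleotide bias for barcode X on sequencer A in the last month"
    # becomes one join instead of dissecting per-run FastQC reports.
    rows = con.execute("""
        SELECT m.position, AVG(m.value)
        FROM metrics m
        JOIN libraries l ON l.lib_id = m.lib_id
        JOIN runs r ON r.run_id = l.run_id
        WHERE m.name = 'per_base_content' AND l.barcode = ?
          AND r.instrument = ? AND r.run_date >= date('now', '-1 month')
        GROUP BY m.position""", ("X", "sequencer_A")).fetchall()
    ```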

  14. Results of qualification tests on water-level sensing instruments, 1987

    USGS Publications Warehouse

    Olive, T.E.

    1989-01-01

    The U.S. Geological Survey's Hydrologic Instrumentation Facility at the Stennis Space Center, Mississippi, conducts qualification tests on water-level sensing instruments. Instrument systems that meet or exceed the Survey's minimum performance requirements are placed on the Survey's Qualified Products List. The qualification tests conducted in 1987 added two instrument systems to the Survey's Qualified Products List. One system met requirements for use at a daily-discharge station, and the other met requirements for a special-case station. The report is prepared for users of hydrologic instruments. It provides a list of instrument features, describes the instrument systems, summarizes test procedures, and presents test results for the two instrument systems that met the Survey's minimum performance standards in the 1987 round of qualification tests. (USGS)

  15. Joint Urban 2003: Study Overview And Instrument Locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allwine, K Jerry; Flaherty, Julia E.

    2006-08-16

    Quality-assured meteorological and tracer data sets are vital for establishing confidence that indoor and outdoor dispersion models used to simulate dispersal of potential toxic agents in urban atmospheres are giving trustworthy results. The U.S. Department of Defense-Defense Threat Reduction Agency and the U.S. Department of Homeland Security joined together to conduct the Joint Urban 2003 atmospheric dispersion study to provide this critically-needed high-resolution dispersion data. This major urban study was conducted from June 28 through July 31, 2003, in Oklahoma City, Oklahoma, with the participation of over 150 scientists and engineers from over 20 U.S. and foreign institutions. The Joint Urban 2003 lead scientist was Jerry Allwine (Pacific Northwest National Laboratory), who oversaw study design, logistical arrangements, and field operations with the help of Joe Shinn (Lawrence Livermore National Laboratory), Marty Leach (Lawrence Livermore National Laboratory), Ray Hosker (Atmospheric Turbulence and Diffusion Division), Leo Stockham (Northrop Grumman Information Technology), and Jim Bowers (Dugway Proving Ground). This report gives a brief overview of the field campaign, describing the scientific objectives, the dates of the intensive observation periods, and the instruments deployed. The data from this field study are available to the scientific community through an on-line database managed by Dugway Proving Ground. This report will be included in the database to provide its users with general information about the field study and specific information about the instrument coordinates. Appendix A of this document provides the definitive record of the instrument locations during this field campaign, and Appendix B lists all the study principal investigators and participants.

  16. Heterogeneous distributed databases: A case study

    NASA Technical Reports Server (NTRS)

    Stewart, Tracy R.; Mukkamala, Ravi

    1991-01-01

    Alternatives are reviewed for accessing distributed heterogeneous databases and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX 11/780 and has been implemented using VAX DBMS, a CODASYL-based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy, and each supports a different customer base. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.

  17. Total Hip Intraoperative Femur Fracture: Do the Design Enhancements of a Second-Generation Tapered-Wedge Stem Reduce the Incidence?

    PubMed

    Colacchio, Nicholas D; Robbins, Claire E; Aghazadeh, Mehran S; Talmo, Carl T; Bono, James V

    2017-10-01

    Intraoperative femur fracture (IFF) is a well-known complication in primary uncemented total hip arthroplasty (THA). Variations in implant instrumentation design and operative technique may influence the risk of IFF. This study compares the incidence of IFF between a standard uncemented tapered-wedge femoral stem and its second-generation successor, which has the following design changes: size-specific medial curvature, proportional incremental stem growth, modest reduction in stem length, and distal lateral relief. A single experienced surgeon's patient database was retrospectively queried for IFF occurring during primary uncemented THA using a standard tapered-wedge femoral stem system or a second-generation stem. All procedures were performed through a posterior approach using soft-tissue-preserving anatomic capsule repair. The primary outcome measure was IFF. A z-test of proportions was performed to determine whether the 2 stems differed significantly with respect to IFF. Patient demographics, Dorr classification, and implant characteristics were also examined. Forty-one of 1510 patients (2.72%) who received a standard tapered-wedge femoral stem sustained an IFF, whereas 5 of 800 patients (0.63%) receiving the second-generation stem incurred an IFF. No other significant associations were found. A standard tapered-wedge femoral stem instrumentation system resulted in more than 4 times the incidence of IFF of its second-generation successor in primary uncemented THA. Identifying risk factors for IFF is necessary to facilitate implant system improvements and thus maximize patient outcomes. Copyright © 2017. Published by Elsevier Inc.
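
    A z-test of proportions on these counts can be reproduced in a few lines; the sketch below is a standard pooled-proportion formulation written for this rewrite, not code from the study.

```python
# Two-proportion z-test applied to the fracture counts quoted above
# (41/1510 vs 5/800), using the pooled-variance formulation.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(41, 1510, 5, 800)  # z ≈ 3.42, p < 0.001
```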

  18. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to easily generate clinical research study populations with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as the Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
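
    As an illustration of the automatic score calculation mentioned, here is a minimal sketch of Child-Pugh scoring using the standard published cut-offs; the function signature and grading encoding are assumptions of this example, not the authors' implementation.

```python
# Child-Pugh scoring with the standard cut-offs; ascites and encephalopathy
# are graded 1 (absent) to 3 (severe). Class A: 5-6, B: 7-9, C: 10-15.
def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, encephalopathy):
    score = ascites + encephalopathy
    score += 1 if bilirubin_mg_dl < 2 else 2 if bilirubin_mg_dl <= 3 else 3
    score += 1 if albumin_g_dl > 3.5 else 2 if albumin_g_dl >= 2.8 else 3
    score += 1 if inr < 1.7 else 2 if inr <= 2.3 else 3
    return score, "A" if score <= 6 else "B" if score <= 9 else "C"

print(child_pugh(1.5, 3.8, 1.2, 1, 1))  # -> (5, 'A')
```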

  19. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for the initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could also use INSIGHT2 and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  20. Development and Operation of a Database Machine for Online Access and Update of a Large Database.

    ERIC Educational Resources Information Center

    Rush, James E.

    1980-01-01

    Reviews the development of a fault-tolerant database processor system which replaced OCLC's conventional file system. A general introduction to database management systems and the operating environment is followed by a description of the hardware selection, software processes, and system characteristics. (SW)

  1. 75 FR 18255 - Passenger Facility Charge Database System for Air Carrier Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Facility Charge Database System for Air Carrier Reporting AGENCY: Federal Aviation Administration (FAA... the Passenger Facility Charge (PFC) database system to report PFC quarterly report information. In... developed a national PFC database system in order to more easily track the PFC program on a nationwide basis...

  2. An Improved Database System for Program Assessment

    ERIC Educational Resources Information Center

    Haga, Wayne; Morris, Gerard; Morrell, Joseph S.

    2011-01-01

    This research paper presents a database management system for tracking course assessment data and reporting related outcomes for program assessment. It improves on a database system previously presented by the authors and in use for two years. The database system presented is specific to assessment for ABET (Accreditation Board for Engineering and…

  3. 76 FR 11465 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... separate systems of records: ``FHFA-OIG Audit Files Database,'' ``FHFA-OIG Investigative & Evaluative Files Database,'' ``FHFA-OIG Investigative & Evaluative MIS Database,'' and ``FHFA-OIG Hotline Database.'' These... Audit Files Database. FHFA-OIG-2: FHFA-OIG Investigative & Evaluative Files Database. FHFA-OIG-3: FHFA...

  4. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
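
    As a hedged illustration of the trigger mechanism described (the authors' platform and schema are not specified here), the following sketch uses SQLite, whose triggers suffice to show the pattern: a rule stored in the DBMS reacts to an incoming sensor reading without the data ever leaving the database. Table and column names are hypothetical.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE bed_sensor (ts TEXT, occupied INTEGER);
CREATE TABLE events (ts TEXT, event TEXT);

-- Active-database rule: react inside the DBMS when a reading arrives.
-- The night window is simplified to pre-midnight hours for brevity.
CREATE TRIGGER detect_bed_exit AFTER INSERT ON bed_sensor
WHEN NEW.occupied = 0
     AND time(NEW.ts) BETWEEN '22:00:00' AND '23:59:59'
BEGIN
    INSERT INTO events VALUES (NEW.ts, 'bed-exit');
END;
""")

db.execute("INSERT INTO bed_sensor VALUES ('2014-08-12 23:15:00', 0)")
print(db.execute("SELECT * FROM events").fetchall())
# -> [('2014-08-12 23:15:00', 'bed-exit')]
```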

  5. Improved Information Retrieval Performance on SQL Database Using Data Adapter

    NASA Astrophysics Data System (ADS)

    Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.

    2018-02-01

    NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications grows. Most systems still use relational databases (RDBs), but as data volumes increase each year, systems handle big data with NoSQL databases to analyze and access data more quickly. NoSQL emerged as a result of the exponential growth of the internet and the development of web applications. Because query syntax in a NoSQL database differs from that of an SQL database, migrating normally requires code changes in the application. A data adapter allows applications to keep their SQL query syntax unchanged: it provides methods that synchronize SQL databases with NoSQL databases, along with an interface through which applications can run SQL queries. This research applied a data adapter system to synchronize data between a MySQL database and Apache HBase using a direct-access query approach, in which the system allows the application to submit queries while synchronization is in progress. Tests of the data adapter showed that it can synchronize the SQL database (MySQL) with the NoSQL database (Apache HBase). The system consumed between 40% and 60% of memory resources, with processor utilization ranging from 10% to 90%. The tests also showed the NoSQL database outperforming the SQL database.
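
    A schematic sketch of this adapter pattern follows, under stated assumptions: the application keeps issuing plain SQL while the adapter mirrors committed rows into a key-value store, which is stubbed here in place of a real HBase client. All class and method names are illustrative, not the paper's API.

```python
import sqlite3

class HBaseStub:
    """Stand-in for an HBase client: row key -> column dict."""
    def __init__(self):
        self.rows = {}
    def put(self, row_key, columns):
        self.rows[row_key] = columns

class DataAdapter:
    def __init__(self, sql_conn, hbase):
        self.sql = sql_conn
        self.hbase = hbase
    def execute(self, query, params=()):
        # Direct-access approach: queries are accepted even while
        # synchronization is in progress.
        cur = self.sql.execute(query, params)
        self.sql.commit()
        return cur
    def sync_table(self, table, key_col):
        # One-way sync: copy current SQL rows into the NoSQL store.
        cur = self.sql.execute(f"SELECT * FROM {table}")
        cols = [d[0] for d in cur.description]
        for row in cur.fetchall():
            record = dict(zip(cols, row))
            self.hbase.put(f"{table}:{record[key_col]}", record)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
adapter = DataAdapter(conn, HBaseStub())
adapter.execute("INSERT INTO users VALUES (1, 'ada')")
adapter.sync_table("users", "id")
```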

  6. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  7. The EarthCARE Simulator (Invited)

    NASA Astrophysics Data System (ADS)

    Donovan, D. P.; van Zadelhoff, G.; Lajas, D.; Eisinger, M.; Franco, R.

    2009-12-01

    In recent years, the value of multisensor remote sensing techniques applied to cloud, aerosol, radiation and precipitation studies has become clear. For example, combinations of instruments including lidars and/or radars have proved very useful for profile retrievals of cloud macrophysical and microphysical properties. This is amply illustrated by various results from the ARM (and similar) sites as well as by results derived using the CloudSat/CALIPSO/A-Train combination of instruments. The Earth Clouds Aerosol and Radiation Explorer (EarthCARE) mission is a combined ESA/JAXA mission scheduled for launch in 2013 and has been designed with sensor synergy playing a driving role in its scientific applications. The EarthCARE payload consists of a cloud-profiling Doppler radar, a high-spectral-resolution lidar, a cloud/aerosol imager and a three-view broadband radiometer. As part of the mission development process, a detailed end-to-end multisensor simulation system has been developed. The EarthCARE Simulator (ECSIM) consists of a modular general framework populated by various models. The models within ECSIM are grouped according to the following scheme: 1) Scene creation models (3D atmospheric scene definition); 2) Orbit models (orbit and orientation of the platform as it overflies the scene); 3) Forward models (calculate the signal impinging on the telescope/antenna of the instrument(s) in question); 4) Instrument models (calculate the instrument response to the signals calculated by the forward models); 5) Retrieval models (invert the instrument signals to recover relevant geophysical information). Within the default ECSIM models, crude instrument-specific parameterizations (e.g. empirically based Z vs IWC relationships) are avoided. Instead, the radiative transfer forward models are kept as separate as possible from the instrument models. To accomplish this, the atmospheric scenes are specified in high detail (i.e. bin-resolved cloud size distributions are stored) and the relevant wavelength-dependent optical properties are stored in a separate database. This helps ensure that all the instruments involved in the simulation are treated in a consistent fashion and that the physical relationships between the various measurements are realistically captured (something that instrument-specific parameterizations cannot guarantee). As a consequence, ECSIM's modular structure makes it straightforward to add new instruments (thus expanding ECSIM beyond the EarthCARE instrument suite) and also makes ECSIM well-suited for physically based retrieval algorithm development. In this talk, we will introduce ECSIM and emphasize the philosophy behind its design. We will also give a brief overview of the various default models. Finally, we will present several examples of how ECSIM can be and is being used for purposes ranging from general radiative transfer calculations to instrument performance estimation and synergistic algorithm development and characterization.
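
    The five model groups lend themselves to a simple pipeline abstraction. Below is a toy sketch of that chain under our own simplifying assumptions (a single scalar "scene" and invented class names; ECSIM's real models are far richer), purely to make the data flow concrete:

```python
# Toy rendering of the ECSIM model chain; the numbers carry no
# physical meaning and exist only to show each stage's role.
class SceneCreation:
    def run(self, _):
        # Detailed 3D scene definition; here reduced to one optical property.
        return {"extinction": 0.1}

class OrbitModel:
    def run(self, scene):
        # Platform position/orientation over the scene (trivialized).
        return {**scene, "altitude_km": 400.0}

class ForwardModel:
    def run(self, state):
        # Signal impinging on the instrument, derived from the shared scene
        # optics rather than from instrument-specific parameterizations.
        return {"signal": 100.0 * state["extinction"]}

class InstrumentModel:
    def run(self, state):
        # Instrument response (e.g., detector gain) applied to the signal.
        return {"counts": 42.0 * state["signal"]}

class RetrievalModel:
    def run(self, state):
        # Invert the measurement back to a geophysical quantity.
        return {"retrieved_extinction": state["counts"] / (42.0 * 100.0)}

data = None
for stage in [SceneCreation(), OrbitModel(), ForwardModel(),
              InstrumentModel(), RetrievalModel()]:
    data = stage.run(data)
print(data)  # {'retrieved_extinction': 0.1}
```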

  8. Interorganizational Collaborative Capacity: Development of a Database to Refine Instrumentation and Explore Patterns

    DTIC Science & Technology

    2009-08-24

    Norms, defined with reference to "some defined population" (Thorndike, 1971, p. 533), would in this context allow an organization to understand its relative standing. Reference: Thorndike, R. L. (1971). Educational Measurement (2nd ed.). Washington, DC: American Council on Education.

  9. CARBON MONOXIDE EMISSIONS FROM ROAD DRIVING: EVIDENCE OF EMISSIONS DUE TO POWER ENRICHMENT

    EPA Science Inventory

    The paper examines one aspect of motor vehicle emissions behavior, i.e., emissions due to engine power enrichment, a factor not well represented in existing models. A database reflecting 46 instrumented vehicles was used to analyze the importance of enrichment emissions to overall v...

  10. An Antibiotic Resource Program for Students of the Health Professions.

    ERIC Educational Resources Information Center

    Tritz, Gerald J.

    1986-01-01

    Provides a description of a computer program developed to supplement instruction in testing of antibiotics on clinical isolates of microorganisms. The program is a simulation and database for interpretation of experimental data designed to enhance laboratory learning and prepare future physicians to use computerized diagnostic instrumentation and…

  11. Self-Report Measures of Juvenile Psychopathic Personality Traits: A Comparative Review

    ERIC Educational Resources Information Center

    Vaughn, Michael G.; Howard, Matthew O.

    2005-01-01

    The authors evaluated self-report instruments currently being used to assess children and adolescents with psychopathic personality traits with respect to their reliability, validity, and research utility. Comprehensive searches across multiple computerized bibliographic databases were conducted and supplemented with manual searches. A total of 30…

  12. A systematic review of psychometric testing of instruments that measure intention to work with older people.

    PubMed

    Che, Chong Chin; Hairi, Noran Naqiah; Chong, Mei Chan

    2017-09-01

    To review systematically the psychometric properties of instruments used to measure intention to work with older people. Nursing students are part of the future healthcare workforce; thus, being aware of their intention to work with older people would give valuable insights to nursing education and practice. Despite a plethora of research on measuring intention to work with older people, a valid and reliable instrument has not been identified. A systematic literature review of evidence and psychometric properties. Searches of eight databases were conducted for the period 2006-2016. English articles were selected based on inclusion and exclusion criteria. The COSMIN checklist was used to assess instruments reporting a psychometric evaluation of validity and reliability. Of 41 studies identified for full-text review, 36 met the inclusion criteria. Seven different types of instruments were identified for psychometric evaluation. Measures of reliability were reported in eight papers and validity in five papers. Evidence for each measurement property was limited, with each instrument demonstrating a lack of information on measurement properties. Based on the COSMIN checklist, the overall quality of the psychometric properties was rated as poor to good. No single instrument was found to be optimal for use. Studies of high methodological quality are needed to properly assess the measurement properties of the instruments that are currently available. Until such studies are available, we recommend using existing instruments with caution. © 2017 John Wiley & Sons Ltd.

  13. A review of instruments developed to measure food neophobia.

    PubMed

    Damsbo-Svendsen, Marie; Frøst, Michael Bom; Olsen, Annemarie

    2017-06-01

    Food choices are influenced by an individual's attitude towards foods. Food neophobia may be associated with less varied diets, inadequate nutrient intake and a high product failure rate for new food products entering the market. To quantify the extent of these challenges, instruments to measure food neophobia in different target groups are needed. Several such instruments with significantly different measurement outcomes and procedures have been developed. This review provides an overview and discusses strengths and weaknesses of these instruments. We evaluate strengths and weaknesses of previously developed instruments to measure neophobia and willingness to try unfamiliar foods. Literature was searched through the databases Web of Science and Google Scholar. We identified 255 studies concerning neophobia and willingness to try unfamiliar foods. Of these, 13 studies encompassing 13 instruments to measure neophobia and willingness to try unfamiliar foods were included in the review. Results are summarized and evaluated with a narrative approach. Across the 13 instruments, 113 to 16,644 subjects aged 2-65 years were involved, scales with 3-7 response categories were used, and behavioral validation tests were included in 6 studies. Several instruments to measure neophobia and willingness to try unfamiliar foods exist. We recommend selecting one or more among the 13 instruments reviewed in this paper to assess relevant aspects of neophobia. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Measuring women's childbirth experiences: a systematic review for identification and analysis of validated instruments.

    PubMed

    Nilvér, Helena; Begley, Cecily; Berg, Marie

    2017-06-29

    Women's childbirth experience can have immediate as well as long-term positive or negative effects on their life, well-being and health. When evaluating and drawing conclusions from research results, women's experiences of childbirth should be one aspect to consider. Researchers and clinicians need help in finding and selecting the most suitable instrument for their purpose. The aim of this study was therefore to systematically identify and present validated instruments measuring women's childbirth experience. A systematic review was conducted in January 2016 with a comprehensive search in the bibliographic databases PubMed, CINAHL, Scopus, The Cochrane Library and PsycINFO. Included instruments measured women's childbirth experiences. Papers were assessed independently by two reviewers for inclusion, and quality assessment of included instruments was made by two reviewers independently and in pairs using Terwee et al.'s criteria for evaluation of psychometric properties. In total 5189 citations were screened, of which 5106 were excluded by title and abstract. Eighty-three full-text papers were reviewed, and 37 papers were excluded, resulting in 46 included papers representing 36 instruments. These instruments demonstrated a wide range in purpose and content as well as in the quality of psychometric properties. This systematic review provides an overview of existing instruments measuring women's childbirth experiences and can support researchers in identifying appropriate instruments to be used, and perhaps adapted, in their specific contexts and research purposes.

  15. [Telematics equipment for poison control surveillance. Its applications in the health management of relevant chemical incidents].

    PubMed

    Butera, R; Locatelli, C; Gandini, C; Minuco, G; Mazzoleni, M C; Giordano, A; Zanuti, M; Varango, C; Petrolini, V; Candura, S M; Manzo, L

    1997-01-01

    Health management of major chemical incidents requires close collaboration between rescuers (on the disaster site and in the emergency department) and the poison center. The study tested telematic technologies allowing telepresence and teleconsulting, i.e., a real-time, continuous connection among the health care personnel and toxicologists involved in managing the emergency. The link between the poison center (PC) and the emergency department in the local hospital is provided by an ISDN videoconferencing system, while data transmission from the site of the accident to the PC is achieved with a personal computer and GSM cellular data transmission. Toxicological databases and risk assessment software are integrated in the system to support information sharing. To test these instruments under nearly realistic operating conditions, the main phase of the study implemented simulated chemical disasters in different locations in Italy. Instruments for telepresence and teleconsulting were effectively used to evaluate the scenario and the severity of the accident from a remote location, by inspecting either specific details or the whole scene; to enable the PC to guide the triage of the victims before and after hospitalization; to use and share data, such as intervention protocols or patient records; and to document all the activities. In summary, this experience shows that the telematic link allows the toxicologists of the poison center to rapidly understand the situation, and to correctly assess the conditions of patients with the help of images. The results of this study indicate the valuable benefits of telematic instruments for health care during major chemical disasters occurring in remote geographical locations or in areas lacking local toxicological experts, where specialized expertise can be provided through telematic technologies.

  16. Gregoriano cadastre (1818-35) from old maps to a GIS of historical landscape data

    NASA Astrophysics Data System (ADS)

    Frazzica, V.; Galletti, F.; Orciani, M.; Colosi, L.; Cartaro, A.

    2009-04-01

    Our analysis specifically covered an area located along the "internal Marche ridge" of the Apennines, in the province of Ancona (Marche Region, Italy). The cartographic work for our historical analysis was conducted by drawing on maps from the nineteenth-century Gregoriano Cadastre (Catasto Gregoriano) preserved in the State Archive of Rome, which were reproduced in digital format, georeferenced and vectorized. With the creation of a database, it was possible to link the maps with the information gathered from the property registers concerning crop production and socioeconomic variables, in order to set up a Geographical Information System (GIS). The combination of the database with the digitized maps made it possible to create an unambiguous relation between each parcel and the related historical data, yielding an information system that fully and completely presents the original cadastre data. It was also possible to create a three-dimensional model of the historical landscapes, which makes it possible to visualize the cultural diversification of that historical period. The integration into a Territorial Information System (SIT) of the historical information from the Gregoriano Cadastre, together with socio-economic analyses of business changes and, in parallel, the study of the transformations of the territorial framework, proved to be a very important instrument for area planning, allowing the identification of specific planning approaches not only for urban settlement but also for the restoration of the variety and complexity of the agricultural landscape. The work opens further research in various directions, identifying pilot areas in which to test new management models and foreseeing simulation of management impacts both on business profitability and on landscape configuration. The future development of the project also includes the upgrade and evolution of the database, followed by the acquisition of data related to subsequent historical periods. This will also allow improvement of the three-dimensional model (rendering) of the landscape described in the Gregoriano Cadastre.

  17. Rigorous noise test and calibration check of strong-motion instrumentation at the Conrad Observatory in Austria.

    NASA Astrophysics Data System (ADS)

    Steiner, R.; Costa, G.; Lenhardt, W.; Horn, N.; Suhadolc, P.

    2012-04-01

    In the framework of the European Interreg IV Italy/Austria project "HAREIA - Historical and Recent Earthquakes in Italy and Austria", the Central Institute for Meteorology and Geodynamics (ZAMG) and the Department of Mathematics and Geosciences of the University of Trieste (DMG) are upgrading the transfrontier seismic network of the South-Eastern Alps with 12 new accelerometric stations to enhance the strong-motion instrument density near the Austria/Italy border. Various public institutions of the provinces Alto Adige (Bolzano Province), Veneto (ARPAV) and Friuli Venezia Giulia (Regional Civil Defense) in Italy and of the Austrian province of Tyrol are involved in the project. The site selection was carried out to improve the present local network geometry, thus meeting the needs of the public institutions in the regions involved. In Tyrol and Alto Adige some strategic buildings (hospitals and public buildings) have been selected, whereas in Veneto and Friuli Venezia Giulia the sites are in the free field, mainly located near villages. The instruments will be installed in an innovative box, designed by ZAMG, that provides electrical and water isolation. The common choice regarding instrument selection was the new Kinemetrics Basalt® accelerograph, to guarantee homogeneity with the already installed instrumentation and compatibility with the software already in use at the different seismic institutions in the area. Prior to deployment, the equipment was tested at the Conrad Observatory and a common set-up was devised. The Conrad Observatory, a seismically particularly quiet site, permits analysis of both sensor and acquisition-system noise. The instruments were connected to the network and the data sent in real time to the ZAMG data center in Vienna and the DMG data center in Trieste. The data were collected in the database and analyzed using the signal-processing tools PQLX and MATLAB. The analysis of the recordings at the ultra-quiet Conrad Observatory revealed some differences in the seismic response of the 12 instruments, mostly within the tolerances stated by the factory; a few sensors can be optimized in order to guarantee comparably high-quality measurements.

  18. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security... Integrity Act of 1982, Public Law 97-255, provided authority for the system. The ATF database has been...

  19. Measuring quality of life of people with predementia and dementia and their caregivers: a systematic review protocol.

    PubMed

    Landeiro, Filipa; Walsh, Katie; Ghinai, Isaac; Mughal, Seher; Nye, Elsbeth; Wace, Helena; Roberts, Nia; Lecomte, Pascal; Wittenberg, Raphael; Wolstenholme, Jane; Handels, Ron; Roncancio-Diaz, Emilse; Potashman, Michele H; Tockhorn-Heidenreich, Antje; Gray, Alastair M

    2018-03-30

    Dementia is the fastest growing major cause of disability globally and may have a profound impact on the health-related quality of life (HRQoL) of both the patient with dementia and those who care for them. This review aims to systematically identify and synthesise measurements of HRQoL for people with dementia, and their caregivers, across the full spectrum of the disease from its preceding stage of predementia to end of life. A systematic literature review was conducted in the Medical Literature Analysis and Retrieval System Online (MEDLINE), Excerpta Medica dataBASE (EMBASE), the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials, the Database of Abstracts of Reviews of Effects, the National Health Service Economic Evaluation Database and PsycINFO between January 1990 and the end of April 2017. Two reviewers will independently assess each study for inclusion and disagreements will be resolved by a third reviewer. Data will be extracted using a predefined data extraction form following best practice. Study quality will be assessed with the Effective Public Health Practice Project quality assessment tool. HRQoL measurements will be presented separately for people with dementia and caregivers by instrument used and, when possible, HRQoL will be reported by disease type and stage of the disease. Descriptive statistics of the results will be provided. A narrative synthesis of studies will also be provided, discussing differences in HRQoL measurements by instrument used to estimate it, type of dementia and disease severity. This systematic literature review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a related peer-reviewed journal and presented at conferences. They will also contribute to the work developed in the Real World Outcomes across the Alzheimer's disease spectrum for better care: multimodal data access platform (ROADMAP). CRD42017071416. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. A flood geodatabase and its climatological applications: the case of Catalonia for the last century

    NASA Astrophysics Data System (ADS)

    Barnolas, M.; Llasat, M. C.

    2007-04-01

    Floods are the natural hazards that produce the highest number of casualties and material damage in the Western Mediterranean. An improvement in flood risk assessment and study of a possible increase in flooding occurrence are therefore needed. To carry out these tasks it is important to have at our disposal extensive knowledge on historical floods and to find an efficient way to manage this geographical data. In this paper we present a complete flood database spanning the 20th century for the whole of Catalonia (NE Spain), which includes documentary information (affected areas and damage) and instrumental information (meteorological and hydrological records). This geodatabase, named Inungama, has been implemented on a GIS (Geographical Information System) in order to display all the information within a given geographical scenario, as well as to carry out an analysis thereof using queries, overlays and calculus. Following a description of the type and amount of information stored in the database and the structure of the information system, the first applications of Inungama are presented. The geographical distribution of floods shows the localities which are more likely to be flooded, confirming that the most affected municipalities are the most densely populated ones in coastal areas. Regarding the existence of an increase in flooding occurrence, a temporal analysis has been carried out, showing a steady increase over the last 30 years.

  1. The HISTMAG database: combining historical, archaeomagnetic and volcanic data

    NASA Astrophysics Data System (ADS)

    Arneitz, Patrick; Leonhardt, Roman; Schnepp, Elisabeth; Heilig, Balázs; Mayrhofer, Franziska; Kovacs, Peter; Hejda, Pavel; Valach, Fridrich; Vadasz, Gergely; Hammerl, Christa; Egli, Ramon; Fabian, Karl; Kompein, Niko

    2017-09-01

    Records of the past geomagnetic field can be divided into two main categories: instrumental historical observations on the one hand, and field estimates based on the magnetization acquired by rocks, sediments and archaeological artefacts on the other. In this paper, a new database combining historical, archaeomagnetic and volcanic records is presented. HISTMAG is a relational database, implemented in MySQL, and can be accessed via a web-based interface (http://www.conrad-observatory.at/zamg/index.php/data-en/histmag-database). It combines available global historical data compilations covering the last ∼500 yr as well as archaeomagnetic and volcanic data collections from the last 50 000 yr. Furthermore, new historical and archaeomagnetic records, mainly from central Europe, have been acquired. In total, 190 427 records are currently available in the HISTMAG database, the majority of which are historical declination measurements (155 525). The original database structure has been complemented by new fields that allow for a detailed description of the different data types, and a user-comment function provides the possibility of scientific discussion about individual records. The HISTMAG database therefore supports thorough reliability and uncertainty assessments of the widely differing data sets, which are an essential basis for geomagnetic field reconstructions. A database analysis revealed a systematic offset, relative to other historical records, for declination records derived from compass roses on historical geographical maps, while maps created for mining activities represent a reliable source.
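
    Purely to make the relational access concrete, the sketch below queries a HISTMAG-like table for historical declination records; the table layout and column names are assumptions for this example (the published schema is richer), and SQLite stands in for MySQL.

```python
import sqlite3

# Hypothetical single-table layout for demonstration only.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE records (locality TEXT, year INTEGER, "
           "element TEXT, value REAL, source TEXT)")
db.execute("INSERT INTO records VALUES "
           "('Vienna', 1700, 'D', -12.5, 'observatory')")

# Retrieve historical declination ('D') records for a chosen period.
declinations = db.execute(
    "SELECT locality, year, value FROM records "
    "WHERE element = 'D' AND year BETWEEN 1500 AND 1900 "
    "ORDER BY year"
).fetchall()
print(declinations)  # [('Vienna', 1700, -12.5)]
```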

  2. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of clinical laboratory database management systems, the difference between a clinical laboratory information system and a clinical laboratory system is explained in this study. Three kinds of database management system (DBMS) models are presented, the relational, tree and network models; based on our experience and on the development of several clinical laboratory expert systems, the relational model was found to be the best suited to the clinical laboratory database. As future clinical laboratory database management systems, an IC card system connected to an automatic chemical analyzer is proposed for personal health data management, and a microscope/video system is proposed for dynamic data management of leukocytes and bacteria.

  3. Overview of the Novel Intelligent JAXA Active Rotor Program

    NASA Technical Reports Server (NTRS)

    Saito, Shigeru; Kobiki, Noboru; Tanabe, Yasutada; Johnson, Wayne; Yamauchi, Gloria K.; Young, Larry A.

    2010-01-01

    The Novel Intelligent JAXA Active Rotor (NINJA Rotor) program is a cooperative effort between JAXA and NASA, involving a test of a JAXA pressure-instrumented, active-flap rotor in the 40- by 80-Foot Wind Tunnel at Ames Research Center. The objectives of the program are to obtain an experimental database of a rotor with active flaps and blade pressure instrumentation, and to use that data to develop analyses to predict the aerodynamic and aeroacoustic performance of rotors with active flaps. An overview of the program is presented, including a description of the rotor and preliminary pretest calculations.

  4. Online Resource for Earth-Observing Satellite Sensor Calibration

    NASA Technical Reports Server (NTRS)

    McCorkel, J.; Czapla-Myers, J.; Thome, K.; Wenny, B.

    2015-01-01

    The Radiometric Calibration Test Site (RadCaTS) at Railroad Valley Playa, Nevada is being developed by the University of Arizona to enable improved accuracy and consistency for airborne and satellite sensor calibration. Primary instrumentation at the site consists of ground-viewing radiometers, a sun photometer, and a meteorological station. Measurements made by these instruments are used to calculate surface reflectance, atmospheric properties and a prediction for top-of-atmosphere reflectance and radiance. This work will leverage research for RadCaTS, and describe the requirements for an online database, associated data formats and quality control, and processing levels.

  5. Comparison of cyclic fatigue and torsional resistance in reciprocating single-file systems and continuous rotary instrumentation systems.

    PubMed

    da Frota, Matheus F; Espir, Camila G; Berbert, Fábio L C V; Marques, André A F; Sponchiado-Junior, Emílio C; Tanomaru-Filho, Mario; Garcia, Lucas F R; Bonetti-Filho, Idomeo

    2014-12-01

    As compared with continuous rotary systems, reciprocating motion is believed to increase the fatigue resistance of NiTi instruments. We compared the cyclic fatigue and torsional resistance of reciprocating single-file systems and continuous rotary instrumentation systems in simulated root canals. Eighty instruments from the ProTaper Universal, WaveOne, MTwo, and Reciproc systems (n = 20) were submitted to dynamic bending testing in stainless-steel simulated curved canals. Axial displacement of the simulated canals was performed with half of the instruments (n = 10), with back-and-forth movements in a range of 1.5 mm. Time until fracture was recorded, and the number of cycles until instrument fracture was calculated. Cyclic fatigue resistance was greater for reciprocating systems than for rotary systems (P < 0.05). Instruments from the Reciproc and WaveOne systems significantly differed only when axial displacement occurred (P < 0.05). Instruments of the ProTaper Universal and MTwo systems did not significantly differ (P > 0.05). Cyclic fatigue and torsional resistance were greater for reciprocating systems than for continuous rotary systems, irrespective of axial displacement.

  6. Systems budgets architecture and development for the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Mignot, Shan; Flagey, Nicolas; Szeto, Kei; Murowinski, Rick; McConnachie, Alan

    2016-08-01

    The Maunakea Spectroscopic Explorer (MSE) project is an enterprise to upgrade the existing Canada-France-Hawaii observatory into a spectroscopic facility based on a 10 meter-class telescope. As such, the project relies on engineering requirements not limited to its instruments (the low-, medium- and high-resolution spectrographs) but covering the whole observatory. The science requirements, the operations concept, the project management and the applicable regulations are the basis from which these requirements are initially derived, yet they do not form hierarchies, as each may serve several purposes, that is, pertain to several budgets. Completeness and consistency are hence the main systems engineering challenges for a project as large as MSE. Special attention is devoted to ensuring the traceability of requirements via parametric models, derivation documents and simulations, and finally by maintaining KAOS diagrams and a database under IBM Rational DOORS linking them together. This paper presents the architecture of the main budgets under development and the associated processes, highlights those that are interrelated, and shows how the system as a whole is then optimized by modelling and analysis of the pertinent system parameters.

  7. Automatic labeling and characterization of objects using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific databases are usually developed, managed and populated in a tedious, error-prone and self-limiting way in terms of what can be described in a relational Database Management System (DBMS). The next generation of Earth remote sensing platforms, i.e., the Earth Observing System (EOS), will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context, and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data are then dynamically allocated to an object-oriented database where they can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  8. Factors influencing nurses' attitudes towards healthcare information technology.

    PubMed

    Huryk, Laurie A

    2010-07-01

    This literature review examines the current trend in nurses' attitudes toward healthcare information technology (HIT). HIT implementation and expansion are at the core of global efforts to improve healthcare quality and patient safety. Because nurses form a large portion of the healthcare workforce, their attitudes towards HIT are likely to have a major impact on the electronic health record (EHR) implementation process. A search of the PubMed, CINAHL and Medline databases produced 1930 combined hits. Returned articles were scanned for relevancy and applicability. Thirteen articles met all criteria and were subsequently reviewed in their entirety. In accordance with two change theories, if HIT implementation projects are to be successful, nurses must recognize that incorporating EHRs into their daily practice is beneficial to patient outcomes. Overall, the attitudes of nurses toward HIT are positive. Increased computer experience is the main demographic indicator of positive attitudes. The most common detractors are poor system design, system slowdown and system downtime. Nurses are also fearful that the use of technology will dehumanize patient care. Involving nurses in system design is likely to improve post-implementation satisfaction. Creating a positive, supportive atmosphere appears to be instrumental to sustainability.

  9. Computerised tomography vs magnetic resonance imaging for modeling of patient-specific instrumentation in total knee arthroplasty.

    PubMed

    Stirling, Paul; Valsalan Mannambeth, Rejith; Soler, Agustin; Batta, Vineet; Malhotra, Rajeev Kumar; Kalairajah, Yegappan

    2015-03-18

    To summarise and compare currently available evidence regarding the accuracy of pre-operative imaging, which is one of the key choices for surgeons contemplating patient-specific instrumentation (PSI) surgery. The MEDLINE and EMBASE medical literature databases were searched, from January 1990 to December 2013, to identify relevant studies. The data from several clinical studies were assimilated to allow appreciation and comparison of the accuracy of each modality. The overall accuracy of each modality was calculated as the proportion of alignment outliers > 3° in the coronal plane for both computerised tomography (CT) and magnetic resonance imaging (MRI). Seven clinical studies matched our inclusion criteria for comparison and were included in our study for statistical analysis. Three of these reported series using MRI and four using CT. The overall percentage of outliers > 3° was 12.5% in patients with CT-based PSI systems vs 16.9% for MRI-based systems. These results were not statistically significant. Although many studies have been undertaken to determine the ideal pre-operative imaging modality, conclusions remain speculative in the absence of long-term data. Ultimately, information regarding the accuracy of CT and MRI will be the main determining factor. Increased accuracy of pre-operative imaging could result in longer-term savings, and a reduced accumulated dose of radiation, by eliminating the need for post-operative imaging and revision surgery.

  10. Computerised tomography vs magnetic resonance imaging for modeling of patient-specific instrumentation in total knee arthroplasty

    PubMed Central

    Stirling, Paul; Valsalan Mannambeth, Rejith; Soler, Agustin; Batta, Vineet; Malhotra, Rajeev Kumar; Kalairajah, Yegappan

    2015-01-01

    AIM: To summarise and compare currently available evidence regarding the accuracy of pre-operative imaging, which is one of the key choices for surgeons contemplating patient-specific instrumentation (PSI) surgery. METHODS: The MEDLINE and EMBASE medical literature databases were searched, from January 1990 to December 2013, to identify relevant studies. The data from several clinical studies were assimilated to allow appreciation and comparison of the accuracy of each modality. The overall accuracy of each modality was calculated as the proportion of alignment outliers > 3° in the coronal plane for both computerised tomography (CT) and magnetic resonance imaging (MRI). RESULTS: Seven clinical studies matched our inclusion criteria for comparison and were included in our study for statistical analysis. Three of these reported series using MRI and four using CT. The overall percentage of outliers > 3° was 12.5% in patients with CT-based PSI systems vs 16.9% for MRI-based systems. These results were not statistically significant. CONCLUSION: Although many studies have been undertaken to determine the ideal pre-operative imaging modality, conclusions remain speculative in the absence of long-term data. Ultimately, information regarding the accuracy of CT and MRI will be the main determining factor. Increased accuracy of pre-operative imaging could result in longer-term savings, and a reduced accumulated dose of radiation, by eliminating the need for post-operative imaging and revision surgery. PMID:25793170

  11. The Data Base and Decision Making in Public Schools.

    ERIC Educational Resources Information Center

    Hedges, William D.

    1984-01-01

    Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…

  12. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  13. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  14. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  15. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  16. Instruments measuring spirituality in clinical research: a systematic review.

    PubMed

    Monod, Stéfanie; Brennan, Mark; Rochat, Etienne; Martin, Estelle; Rochat, Stéphane; Büla, Christophe J

    2011-11-01

    Numerous instruments have been developed to assess spirituality and measure its association with health outcomes. This study's aims were to identify instruments used in clinical research that measure spirituality; to propose a classification of these instruments; and to identify those instruments that could provide information on the need for spiritual intervention. A systematic literature search in the MEDLINE, CINAHL, PsycINFO, ATLA, and EMBASE databases, using the terms "spirituality" and "adult$" and limited to journal articles, was performed to identify clinical studies that used a spiritual assessment instrument. For each instrument identified, measured constructs, intended goals, and data on psychometric properties were retrieved. A conceptual and a functional classification of instruments were developed. Thirty-five instruments were retrieved and classified into measures of general spirituality (N = 22), spiritual well-being (N = 5), spiritual coping (N = 4), and spiritual needs (N = 4) according to the conceptual classification. The instruments most frequently used in clinical research were the FACIT-Sp and the Spiritual Well-Being Scale. Data on psychometric properties were mostly limited to content validity and inter-item reliability. According to the functional classification, 16 instruments were identified that included at least one item measuring a current spiritual state, but only three of those appeared suitable for addressing the need for spiritual intervention. The instruments identified in this systematic review assess multiple dimensions of spirituality, and the proposed classifications should help clinical researchers interested in investigating the complex relationship between spirituality and health. The findings underscore the scarcity of instruments specifically designed to measure a patient's current spiritual state. Moreover, the relatively limited data available on the psychometric properties of these instruments highlight the need for additional research to determine whether they are suitable for identifying the need for spiritual interventions.

  17. Analysis of Six Reviews on the Quality of Instruments for the Evaluation of Interprofessional Education in German-Speaking Countries.

    PubMed

    Ehlers, Jan P; Kaap-Fröhlich, Sylvia; Mahler, Cornelia; Scherer, Theresa; Huber, Marion

    2017-01-01

    Background: More and more institutions worldwide and in German-speaking countries are developing and establishing interprofessional seminars in the undergraduate education of health professions. In order to evaluate the different didactic approaches and the different outcomes regarding the anticipated interprofessional competencies, it is necessary to apply appropriate instruments. Cross-cultural instruments are particularly helpful for international comparability. The Interprofessional Education working group of the German Medical Association (GMA) aims at identifying existing instruments for the evaluation of interprofessional education in order to make recommendations for German-speaking countries. Methods: A systematic literature search was performed on the websites of international interprofessional organisations (CAIPE, EIPEN, AIPEN), as well as in the PubMed and CINAHL databases. Reviews focusing on quantitative instruments to evaluate competencies according to the modified Kirkpatrick competency levels were searched for. Psychometrics, language/country, and the setting in which each instrument was applied were recorded. Results: Six reviews, out of 73 search hits, were included. A large number of instruments were identified; however, their psychometrics and the settings in which they were applied were very heterogeneous. The instruments can mainly be assigned to Kirkpatrick levels 1, 2a and 2b. Most instruments have been developed in English, but their psychometrics were not always reported rigorously. Only very few instruments are available in German. Conclusion: It is difficult to find appropriate instruments in German. Internationally, there are different approaches and objectives in the measurement and evaluation of interprofessional competencies. The question arises whether it makes sense to translate existing instruments or to go through the lengthy process of developing new ones. The evaluation of interprofessional seminars with quantitative instruments remains mainly on Kirkpatrick levels 1 and 2. Levels 3 and 4 can probably only be assessed with qualitative or mixed methods. German-language instruments are necessary.

  18. Analysis of Six Reviews on the Quality of Instruments for the Evaluation of Interprofessional Education in German-Speaking Countries

    PubMed Central

    Ehlers, Jan P.; Kaap-Fröhlich, Sylvia; Mahler, Cornelia; Scherer, Theresa; Huber, Marion

    2017-01-01

    Background: More and more institutions worldwide and in German-speaking countries are developing and establishing interprofessional seminars in the undergraduate education of health professions. In order to evaluate the different didactic approaches and the different outcomes regarding the anticipated interprofessional competencies, it is necessary to apply appropriate instruments. Cross-cultural instruments are particularly helpful for international comparability. The Interprofessional Education working group of the German Medical Association (GMA) aims at identifying existing instruments for the evaluation of interprofessional education in order to make recommendations for German-speaking countries. Methods: A systematic literature search was performed on the websites of international interprofessional organisations (CAIPE, EIPEN, AIPEN), as well as in the PubMed and CINAHL databases. Reviews focusing on quantitative instruments to evaluate competencies according to the modified Kirkpatrick competency levels were searched for. Psychometrics, language/country, and the setting in which each instrument was applied were recorded. Results: Six reviews, out of 73 search hits, were included. A large number of instruments were identified; however, their psychometrics and the settings in which they were applied were very heterogeneous. The instruments can mainly be assigned to Kirkpatrick levels 1, 2a and 2b. Most instruments have been developed in English, but their psychometrics were not always reported rigorously. Only very few instruments are available in German. Conclusion: It is difficult to find appropriate instruments in German. Internationally, there are different approaches and objectives in the measurement and evaluation of interprofessional competencies. The question arises whether it makes sense to translate existing instruments or to go through the lengthy process of developing new ones. The evaluation of interprofessional seminars with quantitative instruments remains mainly on Kirkpatrick levels 1 and 2. Levels 3 and 4 can probably only be assessed with qualitative or mixed methods. German-language instruments are necessary. PMID:28890927

  19. Advanced transportation system studies. Alternate propulsion subsystem concepts: Propulsion database

    NASA Technical Reports Server (NTRS)

    Levack, Daniel

    1993-01-01

    The Advanced Transportation System Studies alternate propulsion subsystem concepts propulsion database interim report is presented. The objective of the database development task is to produce a propulsion database which is easy to use and modify while also being comprehensive in the level of detail available. The database is to be available on the Macintosh computer system. The task extends across all three years of the contract; consequently, a significant fraction of the effort in this first year was devoted to developing the database structure to ensure a robust base for the following years' efforts. Nonetheless, significant point-design propulsion system descriptions and parametric models were also produced. Each of the two propulsion databases, the parametric propulsion database and the propulsion system database, is described. The descriptions include a user's guide to each code, write-ups for the models used, and sample output. The parametric database has models for LOX/H2 and LOX/RP liquid engines, solid rocket boosters using three different propellants, a hybrid rocket booster, and a NERVA-derived nuclear thermal rocket engine.
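
    The abstract does not spell out what a "parametric model" record looks like, so the following is only a minimal sketch, in Python rather than the report's Macintosh database format, of how such an entry might compute performance from design inputs. The class layout, field names, and the 450 s specific-impulse figure are illustrative assumptions, not the report's actual models.

```python
from dataclasses import dataclass

@dataclass
class ParametricEngineModel:
    """One hypothetical parametric-database entry; names and the scaling
    law are illustrative assumptions, not the report's actual models."""
    name: str          # e.g. 'LOX/H2 liquid engine'
    propellants: str   # e.g. 'LOX/H2', 'LOX/RP'
    isp_vac_s: float   # assumed vacuum specific impulse, seconds

    def vacuum_thrust_n(self, mdot_kg_s: float) -> float:
        """Ideal vacuum thrust from propellant mass flow: F = mdot * Isp * g0."""
        g0 = 9.80665  # standard gravity, m/s^2
        return mdot_kg_s * self.isp_vac_s * g0

# Usage with an assumed 450 s vacuum Isp and 100 kg/s propellant flow:
engine = ParametricEngineModel("LOX/H2 liquid engine", "LOX/H2", 450.0)
print(f"{engine.vacuum_thrust_n(100.0) / 1000:.0f} kN")  # ~441 kN
```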

  20. GeoInt: the first macroseismic intensity database for the Republic of Georgia

    NASA Astrophysics Data System (ADS)

    Varazanashvili, O.; Tsereteli, N.; Bonali, F. L.; Arabidze, V.; Russo, E.; Pasquaré Mariotto, F.; Gogoladze, Z.; Tibaldi, A.; Kvavadze, N.; Oppizzi, P.

    2018-05-01

    Our work presents the new macroseismic intensity database for the Republic of Georgia, here named GeoInt, which includes earthquakes from the historical era (from 1250 B.C. onwards) to the instrumental era. The database is composed of 111 selected earthquakes and the related 3944 intensity data points (IDPs) for 1509 different localities, reported on the Medvedev-Sponheuer-Karnik (MSK) scale. The earthquakes span the surface-wave magnitude (Ms) range 3.3-7 and the depth range 2-36 km. The entire set of IDPs is characterized by intensities ranging from 2-3 to 9-10 and covers an area spanning from 39.508° N to 45.043° N in a N-S direction and from 37.324° E to 48.500° E in an E-W direction, with some of the IDPs located outside the Georgian border, in the (i) Republic of Armenia, (ii) Russian Federation, (iii) Republic of Turkey, and (iv) Republic of Azerbaijan. We have revised every single IDP and have reevaluated and homogenized the intensity values to the MSK scale. Of the whole set of 3944 IDPs, 348 belong to the historical era (pre-1900) and 3596 to the instrumental era (post-1900). Of the 3596 instrumental-era IDPs, 105 are brand new (3%), the intensity values for 804 have been reevaluated (22%), and for 2687 (75%) the intensities have been confirmed from previous interpretations. We introduce this database as a key input for further improvements in seismic hazard modeling and seismic risk calculation for this region based on macroseismic intensity; we report all 111 earthquakes with available macroseismic information. The GeoInt database is also accessible online at http://www.enguriproject.unimib.it and will be kept updated in the future.
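
    As a reading aid, the sketch below encodes the geographic extent and era split quoted above as simple filters. The record layout (field names, year convention) is a hypothetical illustration, not GeoInt's actual schema.

```python
from dataclasses import dataclass

@dataclass
class IDP:
    """One intensity data point; a hypothetical layout, not GeoInt's schema."""
    event_id: int
    year: int    # negative values denote B.C.
    lat: float   # degrees N
    lon: float   # degrees E
    msk: str     # MSK intensity, e.g. '9-10'

def in_geoint_extent(p: IDP) -> bool:
    """Bounding box quoted above: 39.508-45.043 deg N, 37.324-48.500 deg E."""
    return 39.508 <= p.lat <= 45.043 and 37.324 <= p.lon <= 48.500

def split_by_era(points: list) -> tuple:
    """Historical (pre-1900) vs instrumental (post-1900) IDPs,
    mirroring the 348 / 3596 split reported above."""
    historical = [p for p in points if p.year < 1900]
    instrumental = [p for p in points if p.year >= 1900]
    return historical, instrumental
```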

  2. Recommendations on the most suitable quality-of-life measurement instruments for bariatric and body contouring surgery: a systematic review.

    PubMed

    de Vries, C E E; Kalff, M C; Prinsen, C A C; Coulman, K D; den Haan, C; Welbourn, R; Blazeby, J M; Morton, J M; van Wagensveld, B A

    2018-06-08

    The objective of this study is to systematically assess the quality of existing patient-reported outcome measures developed and/or validated for Quality of Life measurement in bariatric surgery (BS) and body contouring surgery (BCS). We conducted a systematic literature search in PubMed, EMBASE, PsycINFO, CINAHL, Cochrane Database Systematic Reviews and CENTRAL identifying studies on measurement properties of BS and BCS Quality of Life instruments. For all eligible studies, we evaluated the methodological quality of the studies by using the COnsensus-based Standards for the selection of health Measurement INstruments checklist and the quality of the measurement instruments by applying quality criteria. Four degrees of recommendation were assigned to validated instruments (A-D). Out of 4,354 articles, a total of 26 articles describing 24 instruments were included. No instrument met all requirements (category A). Seven instruments have the potential to be recommended depending on further validation studies (category B). Of these seven, the BODY-Q has the strongest evidence for content validity in BS and BCS. Two instruments had poor quality in at least one required quality criterion (category C). Fifteen instruments were minimally validated (category D). The BODY-Q, developed for BS and BCS, possessed the strongest evidence for quality of measurement properties and has the potential to be recommended in future clinical trials. © 2018 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity Federation.

  3. Body image in Brazil: recent advances in the state of knowledge and methodological issues

    PubMed Central

    Laus, Maria Fernanda; Kakeshita, Idalina Shiraishi; Costa, Telma Maria Braga; Ferreira, Maria Elisa Caputo; Fortes, Leonardo de Sousa; Almeida, Sebastião Sousa

    2014-01-01

    OBJECTIVE To analyze Brazilian literature on body image and the theoretical and methodological advances that have been made. METHODS A detailed review was undertaken of the Brazilian literature on body image, selecting published articles, dissertations and theses from the SciELO, SCOPUS, LILACS and PubMed databases and the CAPES thesis database. Google Scholar was also used. There was no start date for the search, which used the following search terms: “body image” AND “Brazil” AND “scale(s)”; “body image” AND “Brazil” AND “questionnaire(s)”; “body image” AND “Brazil” AND “instrument(s)”; “body image” limited to Brazil and “body image”. RESULTS The majority of measures available were intended to be used in college students, with half of them evaluating satisfaction/dissatisfaction with the body. Females and adolescents of both sexes were the most studied population. There has been a significant increase in the number of available instruments. Nevertheless, numerous published studies have used non-validated instruments, with much confusion in the use of the appropriate terms (e.g., perception, dissatisfaction, distortion). CONCLUSIONS Much more is needed to understand body image within the Brazilian population, especially in terms of evaluating different age groups and diversifying the components/dimensions assessed. However, interest in this theme is increasing, and important steps have been taken in a short space of time. PMID:24897056

  4. Quantitative instruments used to assess children's sense of smell: a review article.

    PubMed

    Moura, Raissa Gomes Fonseca; Cunha, Daniele Andrade; Gomes, Ana Carolina de Lima Gusmão; Silva, Hilton Justino da

    2014-01-01

    To systematically gather from the available literature the quantitative instruments used to assess the sense of smell in studies carried out with children. The study comprised a search of the PubMed and Bireme platforms and of the MedLine, Lilacs, regional SciELO and Web of Science databases, followed by selection and critical analysis of the retrieved articles. We selected original articles related to the topic in question, conducted only with children, in Portuguese, English, or Spanish. We excluded studies addressing other phases of human development, exclusively or concurrently with the pediatric population; studies on animals; literature review articles; dissertations; book chapters; case study articles; and editorials. A data extraction protocol was created for this study, including the following information: author, department, year, location, population/sample, age, purpose of the study, methods, and main results. We found 8,451 articles by searching keywords and descriptors. Of this total, 5,928 were excluded by title, 2,366 by abstract, and 123 after reading the full text. Thus, 34 articles were selected, of which 28 were repeated across the databases, leaving 6 articles analyzed in this review. We observed a lack of standardization of the quantitative instruments used to assess children's sense of smell, with great variability in the methodology of the tests, which reduces the effectiveness and reliability of the results.

  5. Meniscal shear stress for punching.

    PubMed

    Tuijthof, Gabrielle J M; Meulman, Hubert N; Herder, Just L; van Dijk, C Niek

    2009-01-01

    Experimental determination of the shear stress for punching meniscal tissue. Meniscectomy (surgical treatment of a lesion of one of the menisci) is the most frequently performed arthroscopic procedure. The performance of a meniscectomy is not optimal with the currently available instruments. To design new instruments, the punching force of meniscal tissue is an important parameter, but quantitative data are unavailable. The meniscal punching process was simulated by pushing a rod through meniscal tissue at constant speed. Three punching rods were tested: a solid rod of Ø 3.00 mm, and two hollow tubes (Ø 3.00-2.60 mm) with sharpened cutting edges 0.15 mm and 0.125 mm thick, respectively. Nineteen menisci acquired from 10 human cadaveric knee joints were punched (30 tests). The force and displacement were recorded, from which the maximum shear stress was determined (mean plus three times the standard deviation). The maximum shear stress for the solid rod was determined at 10.2 N/mm². This rod required a significantly lower punch force in comparison with the hollow tube having the 0.15 mm cutting edge (p < 0.01). The maximum shear stress for punching can be applied to the design of instruments and virtual reality training environments. This type of experiment is suitable for building a database of material properties of human tissue, similar to the databases used in the manufacturing industry.
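
    For a sense of the arithmetic, the sketch below computes a nominal punching shear stress as force over the sheared cylindrical surface (punch perimeter times tissue thickness). The abstract does not state the area formula or the tissue thickness used, so both are assumptions here; the example numbers are chosen only to reproduce the 10.2 N/mm² figure.

```python
import math

def punch_shear_stress(force_n: float, punch_diameter_mm: float,
                       tissue_thickness_mm: float) -> float:
    """Nominal punching shear stress: force divided by the sheared
    cylindrical surface (punch perimeter x tissue thickness)."""
    shear_area_mm2 = math.pi * punch_diameter_mm * tissue_thickness_mm
    return force_n / shear_area_mm2

# A hypothetical 96 N peak force on the 3.00 mm solid rod through 1.0 mm
# of tissue gives ~10.2 N/mm^2, the design value quoted in the abstract.
print(f"{punch_shear_stress(96.0, 3.00, 1.0):.1f} N/mm^2")
```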

  6. The new Mediterranean background monitoring station of Ersa, Cape Corsica: A long term Observatory component of the Chemistry-Aerosol Mediterranean Experiment (ChArMEx)

    NASA Astrophysics Data System (ADS)

    Dulac, Francois

    2013-04-01

    The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) is a French initiative supported by the MISTRALS program (Mediterranean Integrated Studies at Regional And Local Scales, http://www.mistrals-home.org). It aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The major stake is an understanding of the future of the Mediterranean region in a context of strong regional anthropogenic and climatic pressures. The target of ChArMEx is short-lived particulate and gaseous tropospheric trace species which cause poor air quality events, have two-way interactions with climate, or impact marine biogeochemistry. In order to fulfil these objectives, an important effort was made in 2012 to implement the infrastructure and instrumentation for a fully equipped background monitoring station at Ersa, Cape Corsica, a key location at the crossroads of dusty southerly air masses and polluted outflows from the European continent. Observations at this station began in June 2012 (in the context of the EMEP / ACTRIS / PEGASOS / ChArMEx campaigns). A broad spectrum of aerosol properties is measured at the station: chemical composition (off-line daily filter sampling in PM2.5/PM10, on-line Aerosol Chemical Speciation Monitor), ground-based optical properties (extinction/absorption/light-scattering coefficients with a 1-λ CAPS PMex monitor, 7-λ Aethalometer, and 3-λ Nephelometer), column-integrated and vertically resolved optical properties (4-λ Cimel sunphotometer and LIDAR, respectively), size distribution (N-AIS, SMPS, APS, and OPS instruments), mass (PM1/PM10 by TEOM/TEOM-FDMS), hygroscopicity (CCN), as well as total insoluble deposition. Real-time measurements of reactive gases (O3, CO, NO, NO2) and off-line VOC measurements (cylinders, cartridges) are also performed. A Kipp and Zonen system for monitoring direct and diffuse broadband radiative fluxes will be in operation soon, as well as an ICOS/RAMCES CO2 and CH4 monitoring instrument. Through this unprecedented effort and with support from the ChArMEx, ADEME, and CORSiCA programs (http://www.obs-mip.fr/corsica), this observatory is currently the most fully instrumented French atmospheric station for in-situ measurement of reactive gases and aerosols. It is operated not by a single laboratory but by a large number of them (see the list of co-authors). It provides "real time" information useful to the local air quality network (Qualitair Corse, http://www.qualitaircorse.org/) concerning EU-regulated parameters (O3, PMx). The station aims to provide a quality-controlled, climatically relevant gas/aerosol database following the recommendations of the EU-FP7 ACTRIS infrastructure and the EMEP and WMO-GAW programs. Atmospheric datasets are currently available from the MISTRALS database (http://mistrals.sedoo.fr/ChArMEx/) and will soon be available from the ACTRIS and GAW databases. After a brief presentation of the Cape Corsica station (location, climatology, instrumental settings ...), we present the first months of aerosol properties (optical / chemical / particle size) obtained at this station. Acknowledgements: the station is mainly supported by ADEME, CNRS-INSU, CEA, CTC, EMD, FEDER, and Météo-France.

  7. MOSAIC: An organic geochemical and sedimentological database for marine surface sediments

    NASA Astrophysics Data System (ADS)

    Tavagna, Maria Luisa; Usman, Muhammed; De Avelar, Silvania; Eglinton, Timothy

    2015-04-01

    Modern ocean sediments serve as the interface between the biosphere and the geosphere, play a key role in biogeochemical cycles and provide a window on how contemporary processes are written into the sedimentary record. Research over past decades has resulted in a wealth of information on the content and composition of organic matter in marine sediments, with ever-more sophisticated techniques continuing to yield information of greater detail at an accelerating pace. However, there has been no attempt to synthesize this wealth of information. We are establishing a new database that incorporates information relevant to local, regional and global-scale assessment of the content, source and fate of organic materials accumulating in contemporary marine sediments. In the MOSAIC (Modern Ocean Sediment Archive and Inventory of Carbon) database, particular emphasis is placed on molecular and isotopic information, coupled with contextual information (e.g., sedimentological properties) relevant to elucidating the factors that influence the efficiency and nature of organic matter burial. The main features of MOSAIC include: (i) emphasis on continental margin sediments as major loci of carbon burial, and as the interface between terrestrial and oceanic realms; (ii) bulk to molecular-level organic geochemical properties and parameters, including concentrations and isotopic compositions; (iii) inclusion of extensive contextual data regarding the depositional setting, in particular with respect to sedimentological and redox characteristics. The ultimate goal is to create an open-access instrument, available on the web, to be utilized for research and education by the international community, who can both contribute to and interrogate the database. Submission will be accomplished by means of a pre-configured table available on the MOSAIC webpage. The information in the completed tables will be checked and then imported, via Structured Query Language (SQL), into MOSAIC. MOSAIC is programmed with PostgreSQL, an open-source database management system. In order to locate the data geographically, each element/datum is associated with a latitude, longitude and depth, facilitating the creation of a geospatial database which can easily be interfaced with a Geographic Information System (GIS). In order to make the database broadly accessible, an HTML/PHP-based website will ultimately be created and linked to the database. Consulting the website will allow both data visualization and export of data in txt format for use with common software (e.g. ODV, Excel, Matlab, Python, Word, PPT, Illustrator…). At this very early stage, MOSAIC contains approximately 10,000 analyses conducted on more than 1,800 samples collected from over 1,600 different geographical locations around the world. Through participation of the international research community, MOSAIC will rapidly develop into a rich archive and versatile tool for investigating the distribution and composition of organic matter accumulating in seafloor sediments. The present contribution will outline the structure of MOSAIC, provide examples of data output, and solicit feedback on desirable features to be included in the database and associated software tools.
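
    A minimal sketch of the kind of geospatially anchored schema described, using Python's sqlite3 module for portability even though MOSAIC itself is built on PostgreSQL. Table and column names are hypothetical, not MOSAIC's actual schema.

```python
import sqlite3

# Every measurement is tied to a latitude, longitude and depth, as the
# abstract describes; names here are hypothetical, not MOSAIC's schema.
conn = sqlite3.connect(":memory:")  # MOSAIC itself runs on PostgreSQL
conn.executescript("""
CREATE TABLE sample (
    sample_id INTEGER PRIMARY KEY,
    latitude  REAL NOT NULL CHECK (latitude  BETWEEN -90 AND 90),
    longitude REAL NOT NULL CHECK (longitude BETWEEN -180 AND 180),
    depth_m   REAL NOT NULL,  -- water depth in metres
    setting   TEXT            -- e.g. 'continental margin'
);
CREATE TABLE analysis (
    analysis_id INTEGER PRIMARY KEY,
    sample_id   INTEGER REFERENCES sample(sample_id),
    property    TEXT,  -- e.g. 'TOC', 'd13C'
    value       REAL,
    unit        TEXT
);
""")
# Example query: all analyses on continental-margin samples above 200 m depth.
rows = conn.execute("""
    SELECT a.property, a.value, a.unit
    FROM analysis a JOIN sample s USING (sample_id)
    WHERE s.setting = 'continental margin' AND s.depth_m < 200
""").fetchall()
```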

  8. Near Real-Time Automatic Data Quality Controls for the AERONET Version 3 Database: An Introduction to the New Level 1.5V Aerosol Optical Depth Data Product

    NASA Astrophysics Data System (ADS)

    Giles, D. M.; Holben, B. N.; Smirnov, A.; Eck, T. F.; Slutsker, I.; Sorokin, M. G.; Espenak, F.; Schafer, J.; Sinyuk, A.

    2015-12-01

    The Aerosol Robotic Network (AERONET) has provided a database of aerosol optical depth (AOD) measured by surface-based Sun/sky radiometers for over 20 years. AERONET provides unscreened (Level 1.0) and automatically cloud-cleared (Level 1.5) AOD in near real-time (NRT), while manually inspected, quality-assured (Level 2.0) AOD becomes available after instrument field deployment (Smirnov et al., 2000). The need for quality-controlled NRT aerosol data has become increasingly important. Applications of AERONET NRT data include satellite evaluation (e.g., MODIS, VIIRS, MISR, OMI), data synergism (e.g., MPLNET), verification of aerosol forecast models and reanalyses (e.g., GOCART, ICAP, NAAPS, MERRA), input to meteorological models (e.g., NCEP, ECMWF), and field campaign support (e.g., KORUS-AQ, ORACLES). In response to user needs for quality-controlled NRT data sets, the new Version 3 (V3) Level 1.5V product was developed with quality controls similar to those applied by hand to the Version 2 (V2) Level 2.0 data set. The AERONET cloud-screened (Level 1.5) NRT AOD database can be significantly affected by data anomalies. The most significant anomalies include AOD diurnal dependence due to contamination or obstruction of the sensor head windows; anomalous AOD spectral dependence due to problems with filter degradation, instrument gains, or non-linear changes in calibration; and abnormal changes in temperature-sensitive wavelengths (e.g., 1020 nm) in response to anomalous sensor head temperatures. Other less common AOD anomalies result from loose filters, uncorrected clock shifts, connection and electronic issues, and various solar eclipse episodes. Automatic quality control algorithms are applied to the new V3 Level 1.5 database to remove NRT AOD anomalies and produce the new AERONET V3 Level 1.5V AOD product. Results of the quality control algorithms are presented and the V3 Level 1.5V AOD database is compared to the V2 Level 2.0 AOD database.
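
    The abstract names the anomaly classes but not the screening algorithms themselves, so the following is only an illustrative sketch of two such checks; the thresholds and function names are assumptions, not AERONET's actual Version 3 quality controls.

```python
import statistics

def screen_aod_day(times_h, aod_500nm, asymmetry_limit=1.5, spike_z=5.0):
    """Two illustrative checks; thresholds are assumptions, not AERONET's."""
    flags = []
    # Diurnal-dependence check: a strong morning/afternoon asymmetry can
    # indicate contamination or obstruction of the sensor head windows.
    am = [a for t, a in zip(times_h, aod_500nm) if t < 12]
    pm = [a for t, a in zip(times_h, aod_500nm) if t >= 12]
    if am and pm and min(statistics.mean(am), statistics.mean(pm)) > 0:
        ratio = statistics.mean(am) / statistics.mean(pm)
        if ratio > asymmetry_limit or ratio < 1 / asymmetry_limit:
            flags.append("diurnal_dependence")
    # Spike check: isolated values far from the day's median suggest
    # connection or electronic issues rather than real aerosol variability.
    med = statistics.median(aod_500nm)
    spread = statistics.pstdev(aod_500nm)
    if spread > 0 and any(abs(a - med) / spread > spike_z for a in aod_500nm):
        flags.append("spike")
    return flags

# Example: a day with a strong morning/afternoon AOD asymmetry is flagged.
print(screen_aod_day([9, 10, 11, 13, 14, 15], [0.9, 0.8, 0.85, 0.3, 0.28, 0.3]))
```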

  9. Historical seismometry database project: A comprehensive relational database for historical seismic records

    NASA Astrophysics Data System (ADS)

    Bono, Andrea

    2007-01-01

    The recovery and preservation of the heritage of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest is not purely historical but also scientific: the availability of a large amount of parametric information on the seismic activity of a given area is an undoubted help to seismological research. This article presents the new database project of the Sismos group of the National Institute of Geophysics and Volcanology in Rome. The structure of the new scheme distils the experience gained in five years of activity, and should be useful to those approaching computer-based recovery and reprocessing facilities. In past years several cataloguing attempts for Italian seismicity have followed one another, but they were almost never true databases: some succeeded because they were well designed and organized, while others were limited to supplying lists of events with their hypocentral parameters. What makes this project more interesting than previous work is the completeness and generality of the managed information: for a given historical earthquake it will be possible to view the hypocentral information, retrieve seismograms in raster, digital or digitized format, and query phase arrival times at the various stations, instrument characteristics, and so on. The modern relational model on which the archive is based allows all these operations to be carried out with little effort. The database described here will completely replace Sismos' current data bank. Some of its organizational principles are similar to those behind the real-time seismicity-monitoring databases in use at the principal international research institutions, introducing a modern design logic into a distinctly historical context. The various design phases are described, from the conceptual level to the physical implementation of the schema, highlighting at each step the guiding rules and technical-scientific considerations that lead to the final result: a state-of-the-art relational schema for historical data.
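
    A minimal sketch of the relational links described, again using Python's sqlite3 for illustration: one historical event connects to scanned seismograms, which connect to stations (with their instrument characteristics) and phase arrival times. All table and column names are hypothetical, not the actual Sismos schema.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE earthquake (
    event_id    INTEGER PRIMARY KEY,
    origin_time TEXT,
    latitude REAL, longitude REAL, depth_km REAL, magnitude REAL
);
CREATE TABLE station (
    station_code TEXT PRIMARY KEY,
    name TEXT,
    instrument TEXT  -- instrumental characteristics
);
CREATE TABLE seismogram (
    seismogram_id INTEGER PRIMARY KEY,
    event_id      INTEGER REFERENCES earthquake(event_id),
    station_code  TEXT REFERENCES station(station_code),
    format        TEXT CHECK (format IN ('raster', 'digital', 'digitized'))
);
CREATE TABLE phase_arrival (
    seismogram_id INTEGER REFERENCES seismogram(seismogram_id),
    phase         TEXT,  -- e.g. 'P', 'S'
    arrival_time  TEXT
);
""")
# One join then answers "what was recorded for this historical event?"
rows = db.execute("""
    SELECT s.station_code, s.instrument, g.format, p.phase, p.arrival_time
    FROM seismogram g
    JOIN station s USING (station_code)
    LEFT JOIN phase_arrival p USING (seismogram_id)
    WHERE g.event_id = ?
""", (1,)).fetchall()
```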

  10. New approaches in assessing food intake in epidemiology.

    PubMed

    Conrad, Johanna; Koch, Stefanie A J; Nöthlings, Ute

    2018-06-22

    A promising direction for improving dietary intake measurement in epidemiologic studies is the combination of short-term and long-term dietary assessment methods using statistical methods. Web-based instruments are particularly interesting here, as their application offers several potential advantages such as self-administration and a shorter completion time. The objective of this review is to provide an overview of new web-based short-term instruments and to describe their features. A number of web-based short-term dietary assessment tools for application in different countries and age groups have been developed so far. Particular attention should be paid to the underlying database and the search function of each tool. Moreover, web-based instruments can improve the estimation of portion sizes by offering several options to the user. Web-based dietary assessment methods are associated with lower costs and reduced burden for participants and researchers, and show validity comparable to that of traditional instruments. When there is a need for a web-based tool, researchers should consider adapting existing tools rather than developing new instruments. The combination of short-term and long-term instruments also seems more feasible with the use of new technology.

  11. [Shoulder disability questionnaires: a systematic review].

    PubMed

    Fayad, F; Mace, Y; Lefevre-Colau, M M

    2005-07-01

    To identify all available shoulder disability questionnaires designed to measure physical functioning and to examine those with satisfactory clinimetric quality. We used the Medline database and the "Guide des outils de mesure de l'évaluation en médecine physique et de réadaptation" textbook to search for questionnaires. Analysis took into account the development methodology, clinimetric quality of the instruments and frequency of their utilization. We classified the instruments according to the International Classification of Functioning, Disability and Health. Thirty-eight instruments have been developed to measure disease-, shoulder- or upper extremity-specific outcome. Four scales assess upper-extremity disability and 3 others shoulder disability. We found 6 scales evaluating disability and shoulder pain, 7 scales measuring the quality of life in patients with various conditions of the shoulder, 14 scales combining objective and subjective measures, 2 pain scales and 2 unclassified scales. Older instruments developed before the advent of modern measurement development methodology usually combine objective and subjective measures. Recent instruments were designed with appropriate methodology. Most are self-administered questionnaires. Numerous shoulder outcome measure instruments are available. There is no "gold standard" for assessing shoulder function outcome in the general population.

  12. Assessing the social dimension of frailty in old age: A systematic review.

    PubMed

    Bessa, Bruno; Ribeiro, Oscar; Coelho, Tiago

    2018-06-18

    Different concepts of frailty have resulted in different assessment tools covering distinct dimensions. Despite the growing recognition that there is an association between frailty and social factors, there is a lack of clarity about what is being assessed in terms of the "social aspects" of frailty. This paper provides a review of frailty assessment instruments (screening tools and severity measures) with a special focus on their social components. A systematic review was conducted of studies published in English between 2001 and March 2018 in the PubMed database, using a combination of MeSH terms and logical operators together with inclusion and exclusion criteria. A total of 27 assessment tools including at least one social question were identified. Three instruments focus exclusively on social frailty, whereas the weight of the social dimensions in the other instruments ranges between 5% and 43%. Social activities, social support, social network, loneliness and living alone were the social concepts most represented in the social components of the various frailty instruments. The social components of frailty vary from instrument to instrument and cover the concepts of social isolation, loneliness, social network, social support and social participation. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Medical student attitudes towards older people: a critical review of quantitative measures.

    PubMed

    Wilson, Mark A G; Kurrle, Susan; Wilson, Ian

    2018-01-24

    Further research into medical student attitudes towards older people is important and requires accurate and detailed evaluative methodology. The two objectives of this paper are: (1) to critically review, from the literature, instruments for measuring medical student attitudes towards older people; and (2) to recommend the most appropriate quantitative instrument for future research into medical student attitudes towards older people. A SCOPUS and Ovid cross search was performed using the keywords Attitude and medical student and aged or older or elderly. This search was supplemented by manual searching, guided by citations in articles identified by the initial literature search, using the SCOPUS and PubMed databases. International studies quantifying medical student attitudes have demonstrated neutral to positive attitudes towards older people, using various instruments. The most commonly used instruments are the Ageing Semantic Differential (ASD) and the University of California Los Angeles Geriatric Attitudes Scale, with several other measures occasionally used. All instruments used to date have inherent weaknesses. A reliable and valid instrument with which to quantify modern medical student attitudes towards older people has not yet been developed. Adaptation of the ASD for contemporary usage is recommended.

  14. Health-related quality of life questionnaires in lung cancer trials: a systematic literature review

    PubMed Central

    2013-01-01

    Background Lung cancer is one of the leading causes of cancer deaths. Treatment goals are the relief of symptoms and the increase of overall survival. With the rising number of treatment alternatives, the need for comparable assessments of health-related quality of life (HRQoL) parameters grows. The aim of this paper was to identify and describe the measurement instruments applied in lung cancer patients under drug therapy. Methods We conducted a systematic literature review at the beginning of 2011 using the electronic database PubMed. Results A total of 43 studies were included in the review. Seventeen different measurement instruments were identified, including 5 generic, 5 cancer-specific, 4 lung cancer-specific and 3 symptom-specific questionnaires. In 29 studies at least 2 instruments were used; in most cases these were cancer- and lung cancer-specific ones. The most frequently used instruments are the EORTC QLQ-C30 and its lung cancer modules LC13 or LC17. Only 5 studies combined (lung) cancer-specific questionnaires with generic instruments. Conclusions The EORTC-C30 and EORTC-LC13 are the most frequently used health-related quality of life measurement instruments in pharmacological lung cancer trials. PMID:23680096

  15. Measuring the youth bullying experience: a systematic review of the psychometric properties of available instruments.

    PubMed

    Vessey, Judith; Strout, Tania D; DiFazio, Rachel L; Walker, Allison

    2014-12-01

    Bullying is a significant problem in schools, and measuring this concept remains problematic. The purposes of this study were to (1) identify the published self-report measures developed to assess youth bullying; (2) evaluate their psychometric properties and instrument characteristics; and (3) evaluate the quality of the identified psychometric papers. A systematic review of the literature was conducted using 4 electronic databases. Data extraction and appraisal of identified instruments were completed using a standardized method and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Thirty-one articles describing 27 self-report instruments were evaluated in our analysis. Quality assessments ranged from 18% to 91%, with 6 papers reaching or exceeding a quality score of 75%. Evidence supporting the reliability, validity, responsiveness, and overall psychometric soundness of existing youth bullying measures was limited. Many measures were in early development, and additional evaluation is necessary to validate their psychometric properties. A pool of instruments possesses acceptable initial psychometric properties for selected assessment purposes. These findings have significant implications for assessing youth bullying and for designing and evaluating school-based interventions. © 2014, American School Health Association.

  16. Estimating functional cognition in older adults using observational assessments of task performance in complex everyday activities: A systematic review and evaluation of measurement properties.

    PubMed

    Wesson, Jacqueline; Clemson, Lindy; Brodaty, Henry; Reppermund, Simone

    2016-09-01

    Functional cognition is a relatively new concept in assessment of older adults with mild cognitive impairment or dementia. Instruments need to be reliable and valid, hence we conducted a systematic review of observational assessments of task performance used to estimate functional cognition in this population. Two separate database searches were conducted: firstly to identify instruments; and secondly to identify studies reporting on the psychometric properties of the instruments. Studies were analysed using a published checklist and their quality reviewed according to specific published criteria. Clinical utility was reviewed and the information formulated into a best evidence synthesis. We found 21 instruments and included 58 studies reporting on measurement properties. The majority of studies were rated as being of fair methodological quality and the range of properties investigated was restricted. Most instruments had studies reporting on construct validity (hypothesis testing), none on content validity and there were few studies reporting on reliability. Overall the evidence on psychometric properties is lacking and there is an urgent need for further evaluation of instruments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Assessing Social Networks in Patients with Psychotic Disorders: A Systematic Review of Instruments

    PubMed Central

    Priebe, Stefan

    2015-01-01

    Background Evidence suggests that social networks of patients with psychotic disorders influence symptoms, quality of life and treatment outcomes. It is therefore important to assess social networks for which appropriate and preferably established instruments should be used. Aims To identify instruments assessing social networks in studies of patients with psychotic disorders and explore their properties. Method A systematic search of electronic databases was conducted to identify studies that used a measure of social networks in patients with psychotic disorders. Results Eight instruments were identified, all of which had been developed before 1991. They have been used in 65 studies (total N of patients = 8,522). They assess one or more aspects of social networks such as their size, structure, dimensionality and quality. Most instruments have various shortcomings, including questionable inter-rater and test-retest reliability. Conclusions The assessment of social networks in patients with psychotic disorders is characterized by a variety of approaches which may reflect the complexity of the construct. Further research on social networks in patients with psychotic disorders would benefit from advanced and more precise instruments using comparable definitions of and timescales for social networks across studies. PMID:26709513

  18. Tools to assess living with a chronic illness: A systematic review.

    PubMed

    Ambrosio, Leire; Portillo, Mari Carmen

    2018-05-16

    To analyse the currently available instruments for assessing living with a chronic illness and related aspects. A review of the evidence was made using the databases Medline, CINAHL, PsycINFO, Cochrane Library, Embase and Cuiden. The search was limited by language (English and/or Spanish) and to studies carried out in adult populations; no limit was placed on year of publication. A total of 16 instruments were identified and analysed that apparently measured the concept of living with a chronic illness and/or related aspects. Judging by their names, four seemed to evaluate the concept of living with a chronic illness, while the rest evaluated aspects intrinsically related to the concept of "living with", such as attributes or the meaning of living with a chronic illness. Different instruments were identified to evaluate daily living for the chronically ill patient, as well as related aspects. According to this review, further validation studies are required in other populations and/or contexts in order to achieve valid and reliable instruments that can be used in clinical practice. Copyright © 2018 Elsevier España, S.L.U. All rights reserved.

  19. Organizational Capabilities for Integrating Care: A Review of Measurement Tools.

    PubMed

    Evans, Jenna M; Grudniewicz, Agnes; Baker, G Ross; Wodchis, Walter P

    2016-12-01

    The success of integrated care interventions is highly dependent on the internal and collective capabilities of the organizations in which they are implemented. Yet, organizational capabilities are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. Assessing these capabilities can contribute to understanding why some integrated care interventions are more effective than others. We identified, organized, and assessed survey instruments that measure the internal and collective organizational capabilities required for integrated care delivery. We conducted an expert consultation and searched Medline and Google Scholar databases for survey instruments measuring factors outlined in the Context and Capabilities for Integrating Care Framework. A total of 58 instruments were included in the review and assessed based on their psychometric properties, practical considerations, and applicability to integrated care efforts. This study provides a bank of psychometrically sound instruments for describing and comparing organizational capabilities. Greater use of these instruments across integrated care interventions and studies can enhance standardized comparative analyses and inform change management. Further research is needed to build an evidence base for these instruments and to explore the associations between organizational capabilities and integrated care processes and outcomes. © The Author(s) 2016.

  20. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
