Sample records for database integrity monitor

  1. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
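
    A minimal sketch of the kind of consistency check described: a LiDAR-synthesized elevation profile is differenced against the stored SVS terrain model and a simple statistic is thresholded. The mean-absolute-disparity statistic and the 3 m alert limit below are illustrative assumptions, not the paper's actual test statistic.

      import numpy as np

      def profile_disparity(synthesized, stored, alert_limit):
          """Compare a LiDAR-synthesized terrain profile (m) against the
          stored terrain model (m) at matched points; flag a potential
          integrity fault if mean absolute disparity exceeds the limit."""
          disparity = np.abs(np.asarray(synthesized, float) -
                             np.asarray(stored, float))
          statistic = disparity.mean()
          return statistic, statistic > alert_limit

      # A 5 m bias in the stored model trips a 3 m alert limit:
      stat, alert = profile_disparity([120, 125, 131], [115, 120, 126], 3.0)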

  2. Remote monitoring of patients with implanted devices: data exchange and integration.

    PubMed

    Van der Velde, Enno T; Atsma, Douwe E; Foeken, Hylke; Witteman, Tom A; Hoekstra, Wybo H G J

    2013-06-01

    Remote follow-up of implantable cardioverter defibrillators (ICDs) may offer a solution to the problem of overcrowded outpatient clinics, and may also be effective in detecting clinical events early. Data obtained from remote follow-up systems, as developed by all major device companies, are stored in a central database system, operated and owned by the device company. A problem now arises that the patient's clinical information is stored partly in the local electronic health record (EHR) system in the hospital and partly in the remote monitoring database, which may potentially result in patient safety issues. To address the requirement of integrating remote monitoring data into the local EHR, the Integrating the Healthcare Enterprise (IHE) Implantable Device Cardiac Observation (IDCO) profile has been developed. This IHE IDCO profile has been adopted by all major device companies. In our hospital, we have implemented the IHE IDCO profile to import data from the remote databases of two device vendors into the departmental Cardiology Information System (EPD-Vision). Data are exchanged via an HL7/XML communication protocol, as defined in the IHE IDCO profile. By implementing the IHE IDCO profile, we have been able to integrate the data from the remote monitoring databases into our local EHRs. It can be expected that remote monitoring systems will develop into dedicated monitoring and therapy platforms. Data retrieved from these systems should form an integral part of the electronic patient record as more and more outpatient clinic care shifts to personalized care provided at a distance, in other words at the patient's home.

  3. Real-Time Integrity Monitoring of Stored Geo-Spatial Data Using Forward-Looking Remote Sensing Technology

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt

    2002-01-01

    Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g., terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that have limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e., radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.
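
    As a sketch of the detection-theory framing, under the simplifying assumption (ours, not the paper's) that the similarity statistic is Gaussian when the database is fault-free, the alert threshold can be derived from a required false-alarm probability:

      from scipy.stats import norm

      def alert_threshold(mu_h0, sigma_h0, p_fa):
          """Threshold on the similarity statistic that yields false-alarm
          probability p_fa under H0 (database consistent with sensor)."""
          return norm.isf(p_fa, loc=mu_h0, scale=sigma_h0)

      # Declare a database fault whenever the statistic exceeds thr:
      thr = alert_threshold(mu_h0=0.0, sigma_h0=1.5, p_fa=1e-5)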

  4. Navigation integrity monitoring and obstacle detection for enhanced-vision systems

    NASA Astrophysics Data System (ADS)

    Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter

    2001-08-01

    Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g., provided by differential GPS (DGPS). The reliability of the synthetic vision highly depends on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible, and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g., other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and, consequently, in a validation of the integrity of database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrently with this investigation, radar-image-based navigation is performed, without using either precision navigation or detailed database information, to determine the aircraft's position relative to the runway. The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.

  5. Integrated cluster management at Manchester

    NASA Astrophysics Data System (ADS)

    McNab, Andrew; Forti, Alessandra

    2012-12-01

    We describe an integrated management system using third-party, open source components used in operating a large Tier-2 site for particle physics. This system tracks individual assets and records their attributes such as MAC and IP addresses; derives DNS and DHCP configurations from this database; creates each host's installation and re-configuration scripts; monitors the services on each host according to the records of what should be running; and cross-references tickets with asset records and per-asset monitoring pages. In addition, scripts which detect problems and automatically remove hosts record these new states in the database, where they are immediately available to operators through the same interface as tickets and monitoring.
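
    The paper includes no code; the hypothetical Python fragment below illustrates the pattern of deriving a service configuration (here, ISC DHCP host entries) from an asset database. The table layout and column names are assumptions for illustration, not the Manchester site's schema.

      import sqlite3  # stand-in for the site's actual asset database

      def dhcp_entries(db_path):
          """Yield one ISC-DHCP host block per asset record."""
          rows = sqlite3.connect(db_path).execute(
              "SELECT hostname, mac, ip FROM assets ORDER BY hostname")
          for hostname, mac, ip in rows:
              yield (f"host {hostname} {{\n"
                     f"  hardware ethernet {mac};\n"
                     f"  fixed-address {ip};\n"
                     f"}}")

      # print("\n".join(dhcp_entries("assets.db")))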

  6. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.

  7. Monitoring SLAC High Performance UNIX Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC

    2005-12-15

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
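
    As an illustration of the approach (not the script described in the paper), a collector can insert each parsed metric sample into MySQL so full-resolution history is kept rather than RRD's lossy consolidation; table and column names here are hypothetical.

      import pymysql  # assumes the PyMySQL client library

      def store_sample(conn, host, metric, value, ts):
          """Insert one Ganglia-style metric sample as a database row."""
          with conn.cursor() as cur:
              cur.execute(
                  "INSERT INTO metrics (host, metric, value, ts)"
                  " VALUES (%s, %s, %s, %s)",
                  (host, metric, value, ts))
          conn.commit()

      conn = pymysql.connect(host="localhost", user="ganglia",
                             password="secret", database="ganglia")
      store_sample(conn, "node42", "cpu_user", 37.5, "2005-12-15 12:00:00")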

  8. Ant-App-DB: a smart solution for monitoring arthropods activities, experimental data management and solar calculations without GPS in behavioral field studies.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Fleischmann, Pauline; Rössler, Wolfgang; Dandekar, Thomas

    2014-01-01

    Field studies on arthropod ecology and behaviour require simple and robust monitoring tools, preferably with direct access to an integrated database. We have developed and here present a database tool allowing smart-phone-based monitoring of arthropods. This smart phone application provides an easy solution to collect, manage and process data in the field, which has been a very difficult task for field biologists using traditional methods. To monitor our example species, the desert ant Cataglyphis fortis, we considered behavior, nest search runs, feeding habits and path segmentations, including detailed information on solar position and azimuth calculation, ant orientation and time of day. For this, we established a user-friendly database system integrating the Ant-App-DB with a smart phone and tablet application, combining experimental data manipulation with data management and providing solar position and timing estimations without any GPS or GIS system. Moreover, the new desktop application Dataplus allows efficient data extraction and conversion from the smart phone application to personal computers, for further ecological data analysis and sharing. All features, software code and database as well as the Dataplus application are made available completely free of charge and are sufficiently generic to be easily adapted to other field monitoring studies on arthropods or other migratory organisms. The software applications Ant-App-DB and Dataplus described here are developed using the Android SDK, Java, XML, C# and SQLite.
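
    The abstract does not give the solar algorithm used; the sketch below shows one standard low-accuracy approximation (declination and hour angle) for estimating solar elevation and azimuth from clock time and a fixed site location, the kind of GPS-free estimate described. The site latitude is an example, not taken from the paper.

      from math import radians, degrees, sin, cos, tan, asin, atan2, pi

      def solar_position(day_of_year, solar_hour, lat_deg):
          """Approximate solar elevation and azimuth in degrees (azimuth
          clockwise from north); accurate to roughly a degree."""
          lat = radians(lat_deg)
          decl = radians(-23.44) * cos(2 * pi / 365 * (day_of_year + 10))
          hour_angle = radians(15.0 * (solar_hour - 12.0))  # 15 deg/hour
          elev = asin(sin(lat) * sin(decl) +
                      cos(lat) * cos(decl) * cos(hour_angle))
          az = atan2(sin(hour_angle),
                     cos(hour_angle) * sin(lat) - tan(decl) * cos(lat))
          return degrees(elev), (degrees(az) + 180.0) % 360.0

      # Midsummer noon at an example field site near 34 deg N:
      print(solar_position(172, 12.0, 34.0))   # approx. (79.4, 180.0)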

  9. Ant-App-DB: a smart solution for monitoring arthropods activities, experimental data management and solar calculations without GPS in behavioral field studies

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Fleischmann, Pauline; Rössler, Wolfgang; Dandekar, Thomas

    2015-01-01

    Field studies on arthropod ecology and behaviour require simple and robust monitoring tools, preferably with direct access to an integrated database. We have developed and here present a database tool allowing smart-phone-based monitoring of arthropods. This smart phone application provides an easy solution to collect, manage and process data in the field, which has been a very difficult task for field biologists using traditional methods. To monitor our example species, the desert ant Cataglyphis fortis, we considered behavior, nest search runs, feeding habits and path segmentations, including detailed information on solar position and azimuth calculation, ant orientation and time of day. For this, we established a user-friendly database system integrating the Ant-App-DB with a smart phone and tablet application, combining experimental data manipulation with data management and providing solar position and timing estimations without any GPS or GIS system. Moreover, the new desktop application Dataplus allows efficient data extraction and conversion from the smart phone application to personal computers, for further ecological data analysis and sharing. All features, software code and database as well as the Dataplus application are made available completely free of charge and are sufficiently generic to be easily adapted to other field monitoring studies on arthropods or other migratory organisms. The software applications Ant-App-DB and Dataplus described here are developed using the Android SDK, Java, XML, C# and SQLite. PMID:25977753

  10. Using X-band Weather Radar Measurements to Monitor the Integrity of Digital Elevation Models for Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Young, Steve; UijtdeHaag, Maarten; Sayre, Jonathon

    2003-01-01

    Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data representing terrain, obstacles, and cultural features. As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. Further, updates to the databases may not be provided as changes occur. These issues limit the certification level and constrain the operational context of SVS for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound the integrity of Digital Elevation Models (DEMs) by using radar altimeter measurements during flight. This paper describes an extension of this concept to include X-band Weather Radar (WxR) measurements. This enables the monitor to detect additional classes of DEM errors and to reduce the exposure time associated with integrity threats. Feature extraction techniques are used along with a statistical assessment of similarity measures between the sensed and stored features that are detected. Recent flight-testing in the area around the Juneau, Alaska Airport (JNU) has resulted in a comprehensive set of sensor data that is being used to assess the feasibility of the proposed monitor technology. Initial results of this assessment are presented.

  11. 8.0 Integrating the effect of terrestrial ecosystem health and land use on the hydrology, habitat, and water quality of the Delaware River and estuary

    Treesearch

    Peter S. Murdoch; John L. Hom; Yude Pan; Jeffrey M. Fischer

    2008-01-01

    To complete the collaborative monitoring study of forested landscapes within the DRB, a regional perspective on the cumulative effects of different disturbances on overall ecosystem health is needed. This section describes two modeling activities used as integrating tools for the CEMRI database and a validation system that used nested river monitoring stations.

  12. [Computerized monitoring for integrated cervical screening. Rationale, methods and indicators of participation].

    PubMed

    Bucchi, L; Pierri, C; Caprara, L; Cortecchia, S; De Lillo, M; Bondi, A

    2003-02-01

    This paper presents a computerised system for the monitoring of integrated cervical screening, i.e. the integration of spontaneous Pap smear practice into organised screening. The general characteristics of the system are described, including background and rationale (integrated cervical screening in European countries, impact of integration on monitoring, decentralised organization of screening and levels of monitoring), general methods (definitions, sections, software description, and setting of application), and indicators of participation (distribution by time interval since previous Pap smear, distribution by screening sector--organised screening centres vs public and private clinical settings--, distribution by time interval between the last two Pap smears, and movement of women between the two screening sectors). Also, the paper reports the results of the application of these indicators in the general database of the Pathology Department of Imola Health District in northern Italy.

  13. MIMIC II: a massive temporal ICU patient database to support research in intelligent patient monitoring

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.

    2002-01-01

    Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data is continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records including over 120 gigabytes of data and is growing. A customized archiving system was used to store continuously up to four waveforms and 30 different parameters from ICU patient monitors. An integrated user-friendly relational database was developed for browsing of patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge-discovery.
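
    A hypothetical query sketch shows the style of browsing such a relational ICU archive; the table and column names below are illustrative only, not the actual MIMIC II schema.

      import sqlite3  # illustrative; MIMIC II itself used a larger RDBMS

      conn = sqlite3.connect("icu_demo.db")
      # Find elevated lactate results joined to their patients, in time order:
      rows = conn.execute("""
          SELECT p.patient_id, l.test_name, l.value, l.charted_at
          FROM patients p
          JOIN lab_results l ON l.patient_id = p.patient_id
          WHERE l.test_name = 'lactate' AND l.value > 4.0
          ORDER BY l.charted_at
      """).fetchall()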

  14. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  15. [Computerised monitoring of integrated cervical screening. Indicators of diagnostic performance].

    PubMed

    Bucchi, L; Pierri, C; Amadori, A; Folicaldi, S; Ghidoni, D; Nannini, R; Bondi, A

    2003-12-01

    In a previous issue of this journal, we presented the background, rationale, general methods, and indicators of participation of a computerised system for the monitoring of integrated cervical screening, i.e. the integration of spontaneous Pap smear practice into organised screening. We also reported the results of the application of those indicators in the general database of the Pathology Department of Imola Health District in northern Italy. In the current paper, we present the rationale and definitions of indicators of diagnostic performance (total Pap smears and rate of unsatisfactory Pap smears, distribution by cytology class reported, rate of patients without timely follow-up, detection rate, positive predictive value, distribution of cytology classes reported by histology diagnosis, and distribution of cases of CIN and carcinoma registered by detection modality) as well as the results of their application in the same database as above.

  16. Security Controls in the Stockpoint Logistics Integrated Communications Environment (SPLICE).

    DTIC Science & Technology

    1985-03-01

    call programs as authorized after checks by the Terminal Management Subsystem on SAS databases. SAS overlays the TANDEM GUARDIAN operating system to ... Security Access Profile database (SAP) and a query capability generating various security reports. SAS operates with the System Monitor (SMON) subsystem ... system to DDN and other components. The first SAS component to be reviewed is the SAP database. SAP is organized into two types of files. Relational

  17. IDESSA: An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas

    NASA Astrophysics Data System (ADS)

    Meyer, Hanna; Authmann, Christian; Dreber, Niels; Hess, Bastian; Kellner, Klaus; Morgenthal, Theunis; Nauss, Thomas; Seeger, Bernhard; Tsvuura, Zivanai; Wiegand, Kerstin

    2017-04-01

    Bush encroachment is a syndrome of land degradation that occurs in many savannas including those of southern Africa. The increase in density, cover or biomass of woody vegetation often has negative effects on a range of ecosystem functions and services, which are hardly reversible. However, despite its importance, neither the causes of bush encroachment, nor the consequences of different resource management strategies to combat or mitigate related shifts in savanna states are fully understood. The project "IDESSA" (An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas) aims to improve the understanding of the complex interplay between land use, climate patterns and vegetation dynamics and to implement an integrative monitoring and decision-support system for the sustainable management of different savanna types. For this purpose, IDESSA follows an innovative approach that integrates local knowledge, botanical surveys, remote-sensing and machine-learning based time-series of atmospheric and land-cover dynamics, spatially explicit simulation modeling and analytical database management. The integration of the heterogeneous data will be implemented in a user-oriented database infrastructure and scientific workflow system. Accessible via web-based interfaces, this database and analysis system will allow scientists to manage and analyze monitoring data and scenario computations, as well as allow stakeholders (e.g., land users, policy makers) to retrieve current ecosystem information and seasonal outlooks. We present the concept of the project and show preliminary results of the realization steps towards the integrative savanna management and decision-support system.

  18. Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor

    NASA Technical Reports Server (NTRS)

    deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert

    2001-01-01

    This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for civil aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Utilization of DEMs, such as the digital terrain elevation data (DTED), requires a DEM integrity check and timely integrity alerts to the pilots when used for flight-critical terrain displays; otherwise the DEM may provide hazardous misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.
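
    The core consistency check lends itself to a compact illustration: the synthesized terrain elevation is DGPS altitude minus radar-altimeter height above ground, and its residual against the DEM is tested. The mean-residual statistic and the 10 m limit below are illustrative assumptions, not the paper's algorithm.

      import numpy as np

      def dem_consistency(gps_alt_msl, radalt_agl, dem_elev, alert_limit):
          """All inputs are 1-D arrays in metres along the flight path."""
          # Synthesized terrain elevation under the aircraft:
          synthesized = np.asarray(gps_alt_msl) - np.asarray(radalt_agl)
          residual = synthesized - np.asarray(dem_elev)
          return residual.mean(), abs(residual.mean()) > alert_limit

      mean_err, alert = dem_consistency([1500, 1510, 1523],
                                        [420, 418, 425],
                                        [1078, 1090, 1101],
                                        alert_limit=10.0)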

  19. Remote monitoring of cardiovascular implanted electronic devices: a paradigm shift for the 21st century.

    PubMed

    Cronin, Edmond M; Varma, Niraj

    2012-07-01

    Traditional follow-up of cardiac implantable electronic devices involves the intermittent download of largely nonactionable data. Remote monitoring represents a paradigm shift from episodic office-based follow-up to continuous monitoring of device performance and patient and disease state. This lessens device clinical burden and may also lead to cost savings, although data on economic impact are only beginning to emerge. Remote monitoring technology has the potential to improve the outcomes through earlier detection of arrhythmias and compromised device integrity, and possibly predict heart failure hospitalizations through integration of heart failure diagnostics and hemodynamic monitors. Remote monitoring platforms are also huge databases of patients and devices, offering unprecedented opportunities to investigate real-world outcomes. Here, the current status of the field is described and future directions are predicted.

  20. Chesapeake Bay Program Water Quality Database

    EPA Pesticide Factsheets

    The Chesapeake Information Management System (CIMS), designed in 1996, is an integrated, accessible information management system for the Chesapeake Bay Region. CIMS is an organized, distributed library of information and software tools designed to increase basin-wide public access to Chesapeake Bay information. The information delivered by CIMS includes technical and public information, educational material, environmental indicators, policy documents, and scientific data. Through the use of relational databases, web-based programming, and web-based GIS a large number of Internet resources have been established. These resources include multiple distributed on-line databases, on-demand graphing and mapping of environmental data, and geographic searching tools for environmental information. Baseline monitoring data, summarized data, and environmental indicators that document ecosystem status and trends and confirm linkages between water quality, habitat quality and abundance, and the distribution and integrity of biological populations are also available. One of the major features of the CIMS network is the Chesapeake Bay Program's Data Hub, providing users access to a suite of long-term water quality and living resources databases. Chesapeake Bay mainstem and tidal tributary water quality, benthic macroinvertebrates, toxics, plankton, and fluorescence data can be obtained for a network of over 800 monitoring stations.

  1. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.

  2. Database of episode-integrated solar energetic proton fluences

    NASA Astrophysics Data System (ADS)

    Robinson, Zachary D.; Adams, James H.; Xapsos, Michael A.; Stauffer, Craig A.

    2018-04-01

    A new database of proton episode-integrated fluences is described. This database contains data from two different instruments on multiple satellites. The data are from instruments on the Interplanetary Monitoring Platform-8 (IMP8) and the Geostationary Operational Environmental Satellites (GOES) series. A method to normalize one set of data to the other is presented to create a seamless database spanning 1973 to 2016. A discussion of some of the characteristics that episodes exhibit is presented, including episode duration and number of peaks. As an example of what can be understood about episodes, the July 4, 2012 episode is examined in detail. The coronal mass ejections and solar flares that caused many of the fluctuations of the proton flux seen at Earth are associated with peaks in the proton flux during this episode. The reasoning for each choice is laid out to provide a reference for how CME and solar flare associations are made.
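
    Two operations implied by the abstract, cross-normalizing one instrument's flux series to the other over an overlap period and time-integrating flux into an episode fluence, can be sketched as follows (variable names and the mean-ratio normalization are illustrative, not the paper's method):

      import numpy as np

      def normalize(flux_a, flux_a_overlap, flux_b_overlap):
          """Scale series A onto series B using their mean ratio over an
          overlap period when both instruments observed."""
          return np.asarray(flux_a) * (np.mean(flux_b_overlap) /
                                       np.mean(flux_a_overlap))

      def episode_fluence(times_s, flux):
          """Integrate flux over an episode's duration (trapezoid rule)
          to obtain the episode-integrated fluence."""
          return np.trapz(flux, times_s)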

  3. R2 Water Quality Portal Monitoring Stations

    EPA Pesticide Factsheets

    The Water Quality Data Portal (WQP) provides an easy way to access data stored in various large water quality databases. The WQP provides various input parameters on the form, including location, site, sampling, and date parameters, to filter and customize the returned results. The Water Quality Portal (WQP) is a cooperative service sponsored by the United States Geological Survey (USGS), the Environmental Protection Agency (EPA) and the National Water Quality Monitoring Council (NWQMC) that integrates publicly available water quality data from the USGS National Water Information System (NWIS), the EPA STOrage and RETrieval (STORET) Data Warehouse, and the USDA ARS Sustaining The Earth's Watersheds - Agricultural Research Database System (STEWARDS).
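
    The portal exposes web services; a request of the following shape retrieves station data. The endpoint and parameter names follow the public WQP service documentation, but should be verified against the current service description.

      import requests

      resp = requests.get(
          "https://www.waterqualitydata.us/data/Station/search",
          params={
              "statecode": "US:36",              # New York
              "characteristicName": "Nitrate",
              "mimeType": "csv",
          },
          timeout=60,
      )
      resp.raise_for_status()
      with open("stations.csv", "wb") as f:
          f.write(resp.content)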

  4. YUCSA: A CLIPS expert database system to monitor academic performance

    NASA Technical Reports Server (NTRS)

    Toptsis, Anestis A.; Ho, Frankie; Leindekar, Milton; Foon, Debra Low; Carbonaro, Mike

    1991-01-01

    The York University CLIPS Student Administrator (YUCSA), an expert database system implemented in C Language Integrated Processing System (CLIPS), for monitoring the academic performance of undergraduate students at York University, is discussed. The expert system component in the system has already been implemented for two major departments, and it is under testing and enhancement for more departments. Also, more elaborate user interfaces are under development. We describe the design and implementation of the system, problems encountered, and immediate future plans. The system has excellent maintainability and it is very efficient, taking less than one minute to complete an assessment of one student.

  5. Integrated technologies for solid waste bin monitoring system.

    PubMed

    Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda

    2011-06-01

    Communication technologies such as radio frequency identification (RFID), global positioning system (GPS), general packet radio system (GPRS), and geographic information system (GIS) are integrated with a camera to construct a solid waste monitoring system. The aim is to improve the way of responding to customers' inquiries and emergency cases and to estimate the solid waste amount without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on the bin, an RFID reader in the truck, GPRS/GSM as web server, and GIS as map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via the GPS. This information is transferred continuously through GPRS to a central database. The users are able to view the current location of each truck in the collection stage via a web-based application and thereby manage the fleet. The truck positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, the solid waste of the bin and the truck are being monitored using the developed system.

  6. Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring system.

    PubMed

    Hannan, M A; Arebey, Maher; Begum, R A; Basri, Hassan

    2011-12-01

    This paper deals with the integration of Radio Frequency Identification (RFID) and communication technologies for a solid waste bin and truck monitoring system. RFID, GPS, GPRS and GIS, along with camera technologies, have been integrated to develop the bin and truck intelligent monitoring system. A new kind of integrated theoretical framework, hardware architecture and interface algorithm has been introduced between the technologies for the successful implementation of the proposed system. In this system, bin and truck databases have been developed in such a way that the information on bin and truck ID, date and time of waste collection, bin status, amount of waste and bin and truck GPS coordinates, etc., is compiled and stored for monitoring and management activities. The results showed that the real-time image processing, histogram analysis, waste estimation and other bin information are displayed in the GUI of the monitoring system. The real-time tests and experimental results showed that the performance of the developed system was stable and satisfied the requirements of the monitoring system with high practicability and validity. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities, including, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and thereby enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as the bridge engineering model and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The information stored can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 Corridor in the state of Michigan.

  8. 76 FR 63288 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-12

    ... ED to monitor CSP grant performance and analyze data related to accountability for academic performance, financial integrity, and program effectiveness. Copies of the proposed information collection... database of current CSP-funded charter schools and award amounts; ED merges performance information...

  9. Flight Testing an Integrated Synthetic Vision System

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Arthur, Jarvis J., III; Bailey, Randall E.; Prinzel, Lawrence J., III

    2005-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications to eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance for transport aircraft. The SVS concept being developed at NASA encompasses the integration of tactical and strategic Synthetic Vision Display Concepts (SVDC) with Runway Incursion Prevention System (RIPS) alerting and display concepts, real-time terrain database integrity monitoring equipment (DIME), and Enhanced Vision Systems (EVS) and/or improved Weather Radar for real-time object detection and database integrity monitoring. A flight test evaluation was jointly conducted (in July and August 2004) by NASA Langley Research Center and an industry partner team under NASA's Aviation Safety and Security, Synthetic Vision System project. A Gulfstream GV aircraft was flown over a 3-week period in the Reno/Tahoe International Airport (NV) local area and an additional 3-week period in the Wallops Flight Facility (VA) local area to evaluate integrated Synthetic Vision System concepts. The enabling technologies (RIPS, EVS and DIME) were integrated into the larger SVS concept design. This paper presents experimental methods and the high level results of this flight test.

  10. Development of an Intelligent Monitoring System for Geological Carbon Sequestration (GCS) Systems

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Jeong, H.; Xu, W.; Hovorka, S. D.; Zhu, T.; Templeton, T.; Arctur, D. K.

    2016-12-01

    Providing stakeholders with timely evidence that GCS repositories are operating safely and efficiently requires integrated monitoring to assess the performance of the storage reservoir as the CO2 plume moves within it. GCS projects can be data intensive as a result of the proliferation of digital instrumentation and smart-sensing technologies. GCS projects are also resource intensive, often requiring multidisciplinary teams performing different monitoring, verification, and accounting (MVA) tasks throughout the lifecycle of a project to ensure secure containment of injected CO2. How can an anomaly detected by one sensor be correlated with events observed by other devices to verify leakage incidents? How can resources be optimally allocated for task-oriented monitoring if reservoir integrity is in question? These are issues that warrant further investigation before real integration can take place. In this work, we are building a web-based data integration, assimilation, and learning framework for geologic carbon sequestration projects (DIAL-GCS). DIAL-GCS will be an intelligent monitoring system (IMS) for automating GCS closed-loop management by leveraging recent developments in high-throughput database, complex event processing, data assimilation, and machine learning technologies. Results will be demonstrated using realistic data and models derived from a GCS site.

  11. A development and integration of database code-system with a compilation of comparator, k0 and absolute methods for INAA using microsoft access

    NASA Astrophysics Data System (ADS)

    Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong

    2013-05-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users to choose either the comparator method, the k0-method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0 data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal to epithermal neutron flux ratio (f). Calculations involved in determining the concentration were net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal to epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRMs and SRMs were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing sample concentrations between the code-system and the experiment. The experimental concentration values agreed with those from the ECC-UKM database code-system with good accuracy.
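
    For orientation, the comparator (relative) method the code-system supports reduces, in its simplest form, to a ratio of specific count rates between the sample and a co-irradiated standard. The sketch below deliberately omits the decay, flux (f, α) and efficiency (ɛp) corrections that the ECC-UKM code-system applies.

      def comparator_concentration(np_sample, mass_sample,
                                   np_std, mass_std, conc_std):
          """Np values are net peak areas; masses in g; conc_std in mg/kg.
          Assumes identical irradiation, decay and counting conditions
          (a simplification; real INAA corrects for all of these)."""
          return conc_std * ((np_sample / mass_sample) /
                             (np_std / mass_std))

      # Example with made-up numbers:
      c = comparator_concentration(12500, 0.150, 9800, 0.100, 52.0)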

  12. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are introduced. An approach is then put forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and acoustic emission (AE) signal information of the wheel-dressing process, the reason for machining quality fluctuation has been obtained. The experimental results indicate that the approach is suitable for the status monitoring and analysis of the machining process.
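
    The statistical-process-control half of the described fusion can be illustrated with a Shewhart individuals chart: control limits are estimated from an in-control reference run, and later quality measurements falling outside them are flagged. This is a generic SPC sketch, not the MoniSysOnline implementation.

      import numpy as np

      def control_limits(reference):
          """3-sigma limits estimated from an in-control reference sample."""
          mu = np.mean(reference)
          sigma = np.std(reference, ddof=1)
          return mu - 3 * sigma, mu + 3 * sigma

      def out_of_control(samples, lcl, ucl):
          """Return the measurements that fall outside the control limits."""
          return [x for x in samples if x < lcl or x > ucl]

      lcl, ucl = control_limits([10.02, 9.98, 10.01, 10.00, 9.99])
      flagged = out_of_control([10.00, 10.07, 9.99], lcl, ucl)  # [10.07]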

  13. Design of wideband solar ultraviolet radiation intensity monitoring and control system

    NASA Astrophysics Data System (ADS)

    Ye, Linmao; Wu, Zhigang; Li, Yusheng; Yu, Guohe; Jin, Qi

    2009-08-01

    Based on the principles of single-chip microcomputer (SCM) and computer communication techniques, the system is composed of chips such as the Atmel AT89C51 and ADC0809, integrated circuits, and UV radiation sensors, and is designed to monitor and control the UV index. The system can automatically collect UV index data, analyze and check the historical database, and study the pattern of UV radiation in the region.

  14. Electrochemical Impedance Sensors for Monitoring Trace Amounts of NO3 in Selected Growing Media.

    PubMed

    Ghaffari, Seyed Alireza; Caron, William-O; Loubier, Mathilde; Normandeau, Charles-O; Viens, Jeff; Lamhamedi, Mohammed S; Gosselin, Benoit; Messaddeq, Younes

    2015-07-21

    With the advent of smart cities and big data, precision agriculture allows the feeding of sensor data into online databases for continuous crop monitoring, production optimization, and data storage. This paper describes a low-cost, compact, and scalable nitrate sensor based on electrochemical impedance spectroscopy for monitoring trace amounts of NO3- in selected growing media. The nitrate sensor can be integrated with conventional microelectronics to perform online nitrate sensing continuously over a wide concentration range from 0.1 ppm to 100 ppm, with a response time of about 1 min, and feed data into a database for storage and analysis. The paper describes the structural design, the Nyquist impedance response, the measurement sensitivity and accuracy, and the field testing of the nitrate sensor performed within tree nursery settings under ISO/IEC 17025 certifications.
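
    The abstract does not give the sensor's transfer function; one plausible calibration step for an impedance-based sensor is a linear fit between measured impedance and log10(concentration) over the 0.1-100 ppm range, inverted for unknown samples. The readings below are invented for illustration.

      import numpy as np

      ppm = np.array([0.1, 1.0, 10.0, 100.0])             # standards
      z_ohm = np.array([9200.0, 7100.0, 5050.0, 2980.0])  # invented readings

      # Fit log10(concentration) as a linear function of impedance:
      slope, intercept = np.polyfit(z_ohm, np.log10(ppm), 1)

      def estimate_ppm(impedance_ohm):
          """Invert the calibration line for a new impedance reading."""
          return 10 ** (slope * impedance_ohm + intercept)

      print(estimate_ppm(6000.0))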

  15. Electrochemical Impedance Sensors for Monitoring Trace Amounts of NO3 in Selected Growing Media

    PubMed Central

    Ghaffari, Seyed Alireza; Caron, William-O.; Loubier, Mathilde; Normandeau, Charles-O.; Viens, Jeff; Lamhamedi, Mohammed S.; Gosselin, Benoit; Messaddeq, Younes

    2015-01-01

    With the advent of smart cities and big data, precision agriculture allows the feeding of sensor data into online databases for continuous crop monitoring, production optimization, and data storage. This paper describes a low-cost, compact, and scalable nitrate sensor based on electrochemical impedance spectroscopy for monitoring trace amounts of NO3− in selected growing media. The nitrate sensor can be integrated with conventional microelectronics to perform online nitrate sensing continuously over a wide concentration range from 0.1 ppm to 100 ppm, with a response time of about 1 min, and feed data into a database for storage and analysis. The paper describes the structural design, the Nyquist impedance response, the measurement sensitivity and accuracy, and the field testing of the nitrate sensor performed within tree nursery settings under ISO/IEC 17025 certifications. PMID:26197322

  16. Utilizing Non-Contact Stress Measurement System (NSMS) as a Health Monitor

    NASA Technical Reports Server (NTRS)

    Hayes, Terry; Hayes, Bryan; Bynum, Ken

    2011-01-01

    Continuously monitor all 156 blades throughout the entire operating envelope without adversely affecting tunnel conditions or compromising compressor shell integrity; calculate dynamic response and identify the frequency/mode to determine individual blade deflection amplitudes, natural frequencies, phase, and damping (Q); log static deflection to build a database of deflection values at certain compressor conditions to use as a basis for a real-time online blade stack monitor; monitor for stall, surge, flutter, and blade damage; and operate with limited user input, low maintenance cost, safe illumination of probes, easy probe replacement, and little or no access to the compressor.

  17. Homeland Security 2002: Evolving the Homeland Defense Infrastructure. Executive Summary Report (Conference Proceedings June 25 - 26, 2002) Volume 1, No. 2)

    DTIC Science & Technology

    2002-09-01

    ... initiatives. The federal government has 55 databases that deal with security threats, but inter-agency access depends on establishing agreements through ... which that information can be shared. True cooperation also will require government-wide commitment to enterprise architecture, integrated ...

  18. CLOUD: general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment depends on the ability to store and deliver data and information to all participating parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general-purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data-taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running Python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server + agents + interface + database) comes in easy, ready-to-use packages that can be installed on any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations where data require homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.

  19. Geodiametris: an integrated geoinformatic approach for monitoring land pollution from the disposal of olive oil mill wastes

    NASA Astrophysics Data System (ADS)

    Alexakis, Dimitrios D.; Sarris, Apostolos; Papadopoulos, Nikos; Soupios, Pantelis; Doula, Maria; Cavvadias, Victor

    2014-08-01

    The olive-oil industry is one of the most important sectors of agricultural production in Greece, the world's third-largest olive-oil-producing country. Olive oil mill wastes (OOMW) constitute a major factor in pollution in olive-growing regions and an important problem to be solved for the agricultural industry. OOMW are normally deposited in tanks, directly on the soil, or even in adjacent torrents, rivers and lakes, posing a high risk of environmental pollution and a threat to community health. The GEODIAMETRIS project aims to develop integrated geoinformatic methodologies for monitoring land pollution from the disposal of OOMW on the island of Crete, Greece. These methodologies integrate GPS surveys, satellite remote sensing and risk assessment analysis in a GIS environment, the application of in situ and laboratory geophysical methodologies, as well as soil and water physicochemical analyses. Concerning the project's preliminary results, all the operating OOMW sites located on Crete have already been registered through extensive GPS field campaigns. Their spatial and attribute information has been stored in an integrated GIS database, and an overall OOMW spectral signature database has been constructed through the analysis of multi-temporal Landsat-8 OLI satellite images. In addition, a specific OOMW area located in the village of Alikianos (Chania, Crete) has been selected as one of the main case study areas. Various geophysical methodologies, such as electrical resistivity tomography, induced polarization, multifrequency electromagnetic and self-potential measurements, and ground-penetrating radar, have already been applied. Soil as well as liquid samples have been collected for physicochemical analysis. The preliminary results have already contributed to the gradual development of an integrated environmental monitoring tool for studying and understanding environmental degradation from the disposal of OOMW.

  20. PCEIS - THE PACIFIC COAST ECOSYSTEM INFORMATION SYSTEM, CHANGING THE WAY SCIENTISTS VIEW THE NATURAL HISTORY OF SPECIES

    EPA Science Inventory

    The Pacific Coast Ecosystem Information System (PCEIS) is a database that provides biological, ecological and geospatial information for over 8100 species from Alaska to Baja. PCEIS goes beyond capturing species’ taxonomic information by integrating monitoring information from Co...

  1. [Design and implementation of supply security monitoring and analysis system for Chinese patent medicines supply in national essential medicines].

    PubMed

    Wang, Hui; Zhang, Xiao-Bo; Huang, Lu-Qi; Guo, Lan-Ping; Wang, Ling; Zhao, Yu-Ping; Yang, Guang

    2017-11-01

    The supply of Chinese patent medicines is influenced by the price of raw materials (Chinese herbal medicines) and the stock of resources. On the one hand, raw material prices show cyclical volatility and even irreversible surges, leaving Chinese patent medicine prices unstable and, in some cases, inverted below production costs. On the other hand, some Chinese patent medicines have been forced out of production owing to resource shortages. Based on a micro-service architecture and a Redis cluster deployment, the supply security monitoring and analysis system for Chinese patent medicines in the national essential medicines list has realized dynamic monitoring and intelligent warning for herbs and Chinese patent medicines by connecting and integrating the database of Chinese medicine resources, the dynamic monitoring system of traditional Chinese medicine resources and the basic medicine database of Chinese patent medicines. Copyright© by the Chinese Pharmaceutical Association.
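
    As a sketch of the dynamic-monitoring idea on a Redis deployment (key names below are hypothetical, not the system's actual schema), the latest raw-material price per herb can be cached and compared against a stored warning threshold:

      import redis  # assumes the redis-py client

      r = redis.Redis(host="localhost", port=6379, decode_responses=True)

      def check_price(herb, latest_price):
          """Cache the latest price and queue a warning if it exceeds
          the configured threshold for this herb."""
          r.hset("price:latest", herb, latest_price)
          threshold = float(r.hget("price:threshold", herb) or "inf")
          if latest_price > threshold:
              r.rpush("warnings", f"{herb}: {latest_price} > {threshold}")
              return True
          return False

      r.hset("price:threshold", "danshen", 85.0)
      check_price("danshen", 92.5)   # True: warning queued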

  2. Wearable Health Monitoring Systems

    NASA Technical Reports Server (NTRS)

    Bell, John

    2015-01-01

    The shrinking size and weight of electronic circuitry has given rise to a new generation of smart clothing that enables biological data to be measured and transmitted. As the variation in the number and type of deployable devices and sensors increases, technology must allow their seamless integration so they can be electrically powered, operated, and recharged over a digital pathway. Nyx Illuminated Clothing Company has developed a lightweight health monitoring system that integrates medical sensors, electrodes, electrical connections, circuits, and a power supply into a single wearable assembly. The system is comfortable, bendable in three dimensions, durable, waterproof, and washable. The innovation will allow astronaut health monitoring in a variety of real-time scenarios, with data stored in digital memory for later use in a medical database. Potential commercial uses are numerous, as the technology enables medical personnel to noninvasively monitor patient vital signs in a multitude of health care settings and applications.

  3. Database assessment of CMIP5 and hydrological models to determine flood risk areas

    NASA Astrophysics Data System (ADS)

    Limlahapun, Ponthip; Fukui, Hiromichi

    2016-11-01

    Water-related disasters may not be solved with a single scientific method. Based on this premise, we combined logical conceptions, sequentially linked results amongst models, and database applications in an attempt to analyse historical and future flooding scenarios. The three main models used in this study are (1) the fifth phase of the Coupled Model Intercomparison Project (CMIP5), to derive precipitation; (2) the Integrated Flood Analysis System (IFAS), to extract the amount of discharge; and (3) the Hydrologic Engineering Center (HEC) model, to generate inundated areas. This research notably focused on integrating data regardless of system-design complexity; database approaches are significantly flexible, manageable, and well supported for system data transfer, which makes them suitable for monitoring a flood. The resulting flood map, together with real-time stream data, can help local communities identify areas at risk of flooding in advance.
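    A minimal sketch of the database-centred chaining the authors describe, with SQLite standing in for their database layer and two trivial transfer functions standing in for IFAS and HEC; the table layout and numbers are placeholders, not the actual models.

        # Passing results between model stages through a database:
        # precipitation -> discharge -> inundation, each stage reading the
        # previous stage's output from a shared table.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE stage (step TEXT, t INTEGER, value REAL)")

        def store(step, series):
            conn.executemany("INSERT INTO stage VALUES (?, ?, ?)",
                             [(step, t, v) for t, v in enumerate(series)])

        def load(step):
            rows = conn.execute(
                "SELECT value FROM stage WHERE step = ? ORDER BY t", (step,))
            return [v for (v,) in rows]

        store("precipitation_mm", [0.0, 12.5, 30.1, 8.4])  # CMIP5-derived input
        store("discharge_m3s",                              # IFAS stand-in
              [3.0 * p for p in load("precipitation_mm")])
        store("inundation_m",                               # HEC stand-in
              [max(0.0, q - 40.0) / 10.0 for q in load("discharge_m3s")])
        print(load("inundation_m"))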

  4. The evolution of a health hazard assessment database management system for military weapons, equipment, and materiel.

    PubMed

    Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C

    2002-04-01

    During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.

  5. USGS Integration of New Science and Technology, Appendix A

    USGS Publications Warehouse

    Brey, Marybeth; Knights, Brent C.; Cupp, Aaron R.; Amberg, Jon J.; Chapman, Duane C.; Calfee, Robin D.; Duncker, James J.

    2017-01-01

    This product summarizes the USGS plans for integration of new science and technology into Asian Carp control efforts for 2017. This includes 1) the implementation and evaluation of new tactics and behavioral information for monitoring, surveillance, control and containment; 2) understanding the behavior and reproduction of Asian carp in established and emerging populations to inform deterrent deployment, rapid response, and removal efforts; and 3) the development and evaluation of databases, decision support tools and performance measures.

  6. Flight Test Evaluation of Situation Awareness Benefits of Integrated Synthetic Vision System Technology for Commercial Aircraft

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Arthur, Jarvis J., III

    2005-01-01

    Research was conducted onboard a Gulfstream G-V aircraft to evaluate integrated Synthetic Vision System concepts during flight tests over a 6-week period at the Wallops Flight Facility and Reno/Tahoe International Airport. The NASA Synthetic Vision System incorporates database integrity monitoring, runway incursion prevention alerting, surface maps, enhanced vision sensors, and advanced pathway guidance and synthetic terrain presentation. The paper details the goals and objectives of the flight test with a focus on the situation awareness benefits of integrating synthetic vision system enabling technologies for commercial aircraft.

  7. Experience with Multi-Tier Grid MySQL Database Service Resiliency at BNL

    NASA Astrophysics Data System (ADS)

    Wlodek, Tomasz; Ernst, Michael; Hover, John; Katramatos, Dimitrios; Packard, Jay; Smirnov, Yuri; Yu, Dantong

    2011-12-01

    We describe the use of F5's BIG-IP smart switch technology (3600 Series and Local Traffic Manager v9.0) to provide load balancing and automatic fail-over to multiple Grid services (GUMS, VOMS) and their associated back-end MySQL databases. This resiliency is introduced in front of the external application servers and also for the back-end database systems, which is what makes it "multi-tier". The combination of solutions chosen to ensure high availability of the services, in particular the database replication and fail-over mechanism, is discussed in detail. The paper explains the design and configuration of the overall system, including virtual servers, machine pools, and health monitors (which govern routing), as well as the master-slave database scheme and fail-over policies and procedures. Pre-deployment planning and stress testing are outlined. Integration of the systems with our Nagios-based facility monitoring and alerting is also described, as are the application characteristics of GUMS and VOMS that enable effective clustering. We then summarize our practical experiences and real-world scenarios resulting from operating a major US Grid center, and assess the applicability of our approach to other Grid services in the future.
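    The routing role of a health monitor can be sketched in a few lines. The snippet below is a generic TCP-probe pool with ordered fail-over, assuming placeholder host names; an actual BIG-IP monitor is configured on the switch itself, so this is only an illustration of the logic.

        # A TCP-connect health check with first-healthy routing, the
        # simplest form of the pool/monitor scheme described above.
        import socket

        def healthy(host: str, port: int, timeout: float = 1.0) -> bool:
            """Probe a pool member by opening (and closing) a TCP connection."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        db_pool = [("mysql-master.example.org", 3306),   # preferred master
                   ("mysql-slave.example.org", 3306)]    # fail-over slave

        def pick_backend(pool):
            """Route to the first healthy member; fail over down the list."""
            for host, port in pool:
                if healthy(host, port):
                    return host, port
            raise RuntimeError("no healthy backend in pool")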

  8. Information technology model for evaluating emergency medicine teaching

    NASA Astrophysics Data System (ADS)

    Vorbach, James; Ryan, James

    1996-02-01

    This paper describes work in progress to develop an Information Technology (IT) model and supporting information system for the evaluation of clinical teaching in the Emergency Medicine (EM) Department of North Shore University Hospital. In the academic hospital setting student physicians, i.e. residents, and faculty function daily in their dual roles as teachers and students respectively, and as health care providers. Databases exist that are used to evaluate both groups in either academic or clinical performance, but rarely has this information been integrated to analyze the relationship between academic performance and the ability to care for patients. The goal of the IT model is to improve the quality of teaching of EM physicians by enabling the development of integrable metrics for faculty and resident evaluation. The IT model will include (1) methods for tracking residents in order to develop experimental databases; (2) methods to integrate lecture evaluation, clinical performance, resident evaluation, and quality assurance databases; and (3) a patient flow system to monitor patient rooms and the waiting area in the Emergency Medicine Department, to record and display status of medical orders, and to collect data for analyses.

  9. An "EAR" on environmental surveillance and monitoring: A ...

    EPA Pesticide Factsheets

    Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on chemical concentration alone, it can be difficult to identify which compounds may be of toxicological concern and should be prioritized for further monitoring or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high throughput screening data like the ToxCast™ database, which contains data for over 9,000 compounds screened through up to 1,100 assays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin along with the ToxCast™ effects database were used to calculate exposure-activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast™ database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Biological pathways were then linked to adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts. Anthropogenic contaminants are frequently reported in environm

  10. A Mobile Food Record For Integrated Dietary Assessment*

    PubMed Central

    Ahmad, Ziad; Kerr, Deborah A.; Bosch, Marc; Boushey, Carol J.; Delp, Edward J.; Khanna, Nitin; Zhu, Fengqing

    2017-01-01

    This paper presents an integrated dietary assessment system based on food image analysis that uses mobile devices or smartphones. We describe two components of our integrated system: a mobile application and an image-based food nutrient database that is connected to the mobile application. An easy-to-use mobile application user interface is described that was designed based on user preferences as well as the requirements of the image analysis methods. The user interface is validated by user feedback collected from several studies. Food nutrient and image databases are also described, which facilitate image-based dietary assessment and enable dietitians and other healthcare professionals to monitor patients' dietary intake in real time. The system has been tested and validated in several user studies involving more than 500 users who took more than 60,000 food images under controlled and community-dwelling conditions. PMID:28691119

  11. Managing troubled data: Coastal data partnerships smooth data integration

    USGS Publications Warehouse

    Hale, S.S.; Hale, Miglarese A.; Bradley, M.P.; Belton, T.J.; Cooper, L.D.; Frame, M.T.; Friel, C.A.; Harwell, L.M.; King, R.E.; Michener, W.K.; Nicolson, D.T.; Peterjohn, B.G.

    2003-01-01

    Understanding the ecology, condition, and changes of coastal areas requires data from many sources. Broad-scale and long-term ecological questions, such as global climate change, biodiversity, and cumulative impacts of human activities, must be addressed with databases that integrate data from several different research and monitoring programs. Various barriers, including widely differing data formats, codes, directories, systems, and metadata used by individual programs, make such integration troublesome. Coastal data partnerships, by helping overcome technical, social, and organizational barriers, can lead to a better understanding of environmental issues, and may enable better management decisions. Characteristics of successful data partnerships include a common need for shared data, strong collaborative leadership, committed partners willing to invest in the partnership, and clear agreements on data standards and data policy. Emerging data and metadata standards that become widely accepted are crucial. New information technology is making it easier to exchange and integrate data. Data partnerships allow us to create broader databases than would be possible for any one organization to create by itself.

  12. Managing troubled data: coastal data partnerships smooth data integration.

    PubMed

    Hale, Stephen S; Miglarese, Anne Hale; Bradley, M Patricia; Belton, Thomas J; Cooper, Larry D; Frame, Michael T; Friel, Christopher A; Harwell, Linda M; King, Robert E; Michener, William K; Nicolson, David T; Peterjohn, Bruce G

    2003-01-01

    Understanding the ecology, condition, and changes of coastal areas requires data from many sources. Broad-scale and long-term ecological questions, such as global climate change, biodiversity, and cumulative impacts of human activities, must be addressed with databases that integrate data from several different research and monitoring programs. Various barriers, including widely differing data formats, codes, directories, systems, and metadata used by individual programs, make such integration troublesome. Coastal data partnerships, by helping overcome technical, social, and organizational barriers, can lead to a better understanding of environmental issues, and may enable better management decisions. Characteristics of successful data partnerships include a common need for shared data, strong collaborative leadership, committed partners willing to invest in the partnership, and clear agreements on data standards and data policy. Emerging data and metadata standards that become widely accepted are crucial. New information technology is making it easier to exchange and integrate data. Data partnerships allow us to create broader databases than would be possible for any one organization to create by itself.

  13. Regular monitoring of breast-feeding rates: feasible and sustainable. The Emilia-Romagna experience.

    PubMed

    Di Mario, Simona; Borsari, Silvana; Verdini, Eleonora; Battaglia, Sergio; Cisbani, Luca; Sforza, Stefano; Cuoghi, Chiara; Basevi, Vittorio

    2017-08-01

    An efficient breast-feeding monitoring system should be in place in every country to assist policy makers and health professionals plan activities to reach optimal breast-feeding rates. Design/Setting/Subjects: From March to June 2015, breast-feeding rates at 3 and 5 months of age were monitored in Emilia-Romagna, an Italian region, using four questions added to a newly developed paediatric immunization database with single records for each individual. Data were collected at primary-care centres. Breast-feeding definitions and 24 h recall as recommended by the WHO were used. Direct age standardization was applied to breast-feeding rates. Record linkage with the medical birth database was attempted to identify maternal, pregnancy and delivery factors associated with full breast-feeding rates at 3 and 5 months of age. Data on breast-feeding were collected for 14044 infants. The mean regional full breast-feeding rate at 3 months was 52%; differences between local health authorities ranged from 42 to 62%. At 5 months of age, the mean regional full breast-feeding rate dropped to 33% (range between local health authorities: 26 to 46%). Record linkage with the birth certificate database was successful for 93% of records. Total observations more than doubled with respect to the previous regional survey. The new monitoring system implemented in 2015 in the Emilia-Romagna region, totally integrated with the immunization database, has proved to be feasible, sustainable and more efficient than the previous one. This system can be a model for other regions and countries where the vast majority of mothers obtain vaccinations from public health facilities and that already have an immunization database in place.
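    Direct age standardization is a simple weighted mean of stratum-specific rates over a standard population. The sketch below shows the computation with invented strata and counts, not the survey's actual data.

        # Direct standardization: weight each stratum rate by a standard
        # population so rates are comparable across local health authorities.
        def direct_standardized_rate(rates, standard_pop):
            """Weighted mean of stratum-specific rates over a standard population."""
            total = sum(standard_pop)
            return sum(r * w for r, w in zip(rates, standard_pop)) / total

        maternal_age_rates = [0.44, 0.55, 0.51]    # e.g. <30, 30-34, >=35 years
        standard_population = [4200, 5600, 4200]   # reference counts per stratum
        print(f"{direct_standardized_rate(maternal_age_rates, standard_population):.2%}")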

  14. An architecture for integrating distributed and cooperating knowledge-based Air Force decision aids

    NASA Technical Reports Server (NTRS)

    Nugent, Richard O.; Tucker, Richard W.

    1988-01-01

    MITRE has been developing a Knowledge-Based Battle Management Testbed for evaluating the viability of integrating independently-developed knowledge-based decision aids in the Air Force tactical domain. The primary goal for the testbed architecture is to permit a new system to be added to a testbed with little change to the system's software. Each system that connects to the testbed network declares that it can provide a number of services to other systems. When a system wants to use another system's service, it does not address the server system by name, but instead transmits a request to the testbed network asking for a particular service to be performed. A key component of the testbed architecture is a common database which uses a relational database management system (RDBMS). The RDBMS provides a database update notification service to requesting systems. Normally, each system is expected to monitor data relations of interest to it. Alternatively, a system may broadcast an announcement message to inform other systems that an event of potential interest has occurred. Current research is aimed at dealing with issues resulting from integration efforts, such as dealing with potential mismatches of each system's assumptions about the common database, decentralizing network control, and coordinating multiple agents.

  15. Groundwater modeling in integrated water resources management--visions for 2020.

    PubMed

    Refsgaard, Jens Christian; Højberg, Anker Lajer; Møller, Ingelise; Hansen, Martin; Søndergaard, Verner

    2010-01-01

    Groundwater modeling is undergoing a change from traditional stand-alone studies toward being an integrated part of holistic water resources management procedures. This is illustrated by the development in Denmark, where comprehensive national databases for geologic borehole data, groundwater-related geophysical data, geologic models, as well as a national groundwater-surface water model have been established and integrated to support water management. This has enhanced the benefits of using groundwater models. Based on insight gained from this Danish experience, a scientifically realistic scenario for the use of groundwater modeling in 2020 has been developed, in which groundwater models will be a part of sophisticated databases and modeling systems. The databases and numerical models will be seamlessly integrated, and the tasks of monitoring and modeling will be merged. Numerical models for atmospheric, surface water, and groundwater processes will be coupled in one integrated modeling system that can operate at a wide range of spatial scales. Furthermore, the management systems will be constructed with a focus on building credibility of model and data use among all stakeholders and on facilitating a learning process whereby data and models, as well as stakeholders' understanding of the system, are updated to currently available information. The key scientific challenges for achieving this are (1) developing new methodologies for integration of statistical and qualitative uncertainty; (2) mapping geological heterogeneity and developing scaling methodologies; (3) developing coupled model codes; and (4) developing integrated information systems, including quality assurance and uncertainty information that facilitate active stakeholder involvement and learning.

  16. BBN technical memorandum W1291 infrasound model feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrell, T., BBN Systems and Technologies

    1998-05-01

    The purpose of this study is to determine the need and level of effort required to add existing atmospheric databases and infrasound propagation models to the DOE's Hydroacoustic Coverage Assessment Model (HydroCAM) [1,2]. The rationale for the study is that the performance of the infrasound monitoring network will be an important factor for both the International Monitoring System (IMS) and US national monitoring capability. Many of the technical issues affecting the design and performance of the infrasound network are directly related to the variability of the atmosphere and the corresponding uncertainties in infrasound propagation. It is clear that the study of these issues will be enhanced by the availability of software tools for easy manipulation and interfacing of various atmospheric databases and infrasound propagation models. In addition, since there are many similarities between propagation in the oceans and in the atmosphere, it is anticipated that much of the software infrastructure developed for hydroacoustic database manipulation and propagation modeling in HydroCAM will be directly extendible to an infrasound capability. The study approach was to talk to the acknowledged domain experts in the infrasound monitoring area to determine: 1. The major technical issues affecting infrasound monitoring network performance. 2. The need for an atmospheric database/infrasound propagation modeling capability similar to HydroCAM. 3. The state of existing infrasound propagation codes and atmospheric databases. 4. A recommended approach for developing the required capabilities. A list of the people who contributed information to this study is provided in Table 1. We also relied on our knowledge of oceanographic and meteorological data sources to determine the availability of atmospheric databases and the feasibility of incorporating this information into the existing HydroCAM geographic database software. This report presents a summary of the need for an integrated infrasound modeling capability in Section 2.0. Section 3.0 provides a recommended approach for developing this capability in two stages; a basic capability and an extended capability. This section includes a discussion of the available static and dynamic databases, and the various modeling tools which are available or could be developed under such a task. The conclusions and recommendations of the study are provided in Section 4.0.

  17. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.
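    As one concrete way to run such a query test from Python, the sketch below issues an aggregate query through the impyla client; the host, port and table names are placeholders, not CERN's actual configuration.

        # Timing an aggregate query against an Impala daemon (impyla client).
        import time
        from impala.dbapi import connect

        conn = connect(host="impalad.example.cern.ch", port=21050)
        cur = conn.cursor()

        t0 = time.time()
        cur.execute("""
            SELECT variable_name, avg(value)
            FROM accelerator_log
            WHERE ts BETWEEN '2015-01-01' AND '2015-02-01'
            GROUP BY variable_name
        """)
        rows = cur.fetchall()
        print(f"{len(rows)} rows in {time.time() - t0:.1f}s")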

  18. A data management infrastructure for bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.

    2015-04-01

    This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue not only for storing the sensor data but also for integrating with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective data management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state-of-the-art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies, which have been shown to be useful in handling high-volume time-series data and in dealing flexibly with unstructured data schemas. Specifically, Apache Cassandra and MongoDB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
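    A minimal sketch of the NoSQL storage pattern, using pymongo: time-stamped sensor documents under a compound index, then a range query. The database, collection and field names are illustrative rather than those of the actual framework.

        # Time-series sensor storage in MongoDB with an index on
        # (sensor_id, timestamp), followed by a one-hour range query.
        from datetime import datetime, timedelta
        from pymongo import MongoClient, ASCENDING

        db = MongoClient("mongodb://localhost:27017")["bridge_monitoring"]
        readings = db["readings"]
        readings.create_index([("sensor_id", ASCENDING), ("t", ASCENDING)])

        readings.insert_one({
            "sensor_id": "ACC-07",      # e.g. an accelerometer on a girder
            "t": datetime.utcnow(),
            "value": 0.0123,
            "unit": "g",
        })

        since = datetime.utcnow() - timedelta(hours=1)
        recent = readings.find({"sensor_id": "ACC-07",
                                "t": {"$gte": since}}).sort("t")
        print([doc["value"] for doc in recent])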

  19. Design and implementation of a wearable healthcare monitoring system.

    PubMed

    Sagahyroon, Assim; Raddy, Hazem; Ghazy, Ali; Suleman, Umair

    2009-01-01

    A wearable healthcare monitoring unit that integrates various technologies was developed to provide patients with the option of leading a healthy and independent life without risks or confinement to medical facilities. The unit consists of various sensors integrated with a microcontroller and attached to the patient's body, reading vital signs and transmitting these readings via a Bluetooth link to the patient's mobile phone. Short-Messaging-Service (SMS) is incorporated in the design to alert a physician in emergency cases. Additionally, an application program running on the mobile phone uses the internet to update (at regular intervals) the patient records in a hospital database with the most recent readings. To reduce development costs, the components used were both off-the-shelf and affordable.

  20. Integrating molecular diagnostic and flow cytometric reporting for improved longitudinal monitoring of HIV patients.

    PubMed Central

    Asare, A. L.; Huda, H.; Klimczak, J. C.; Caldwell, C. W.

    1998-01-01

    Studies have shown that monitoring HIV-infected patients undergoing antiretroviral therapy is best represented by combined measurement of plasma HIV-1 RNA and CD4+ T-lymphocytes [1]. This pilot study at the University of Missouri-Columbia integrates molecular diagnostic and flow cytometric data reporting to provide current and historical HIV-1 RNA levels and CD4+ T-cell counts. The development of a single database for storage and retrieval of these values facilitates composite report generation that includes longitudinal HIV-1 RNA levels and CD4+ T-cell counts for all patients. Results are displayed in tables and plotted graphically within a web browser. This method of data presentation converts individual data points to more useful medical information and could provide clinicians with decision support for improved monitoring of HIV patients undergoing antiretroviral therapy. PMID:9929359

  1. YPED: An Integrated Bioinformatics Suite and Database for Mass Spectrometry-based Proteomics Research

    PubMed Central

    Colangelo, Christopher M.; Shifman, Mark; Cheung, Kei-Hoi; Stone, Kathryn L.; Carriero, Nicholas J.; Gulcicek, Erol E.; Lam, TuKiet T.; Wu, Terence; Bjornson, Robert D.; Bruce, Can; Nairn, Angus C.; Rinehart, Jesse; Miller, Perry L.; Williams, Kenneth R.

    2015-01-01

    We report a significantly enhanced bioinformatics suite and database for proteomics research called Yale Protein Expression Database (YPED) that is used by investigators at more than 300 institutions worldwide. YPED meets the data management, archival, and analysis needs of high-throughput mass spectrometry-based proteomics research, ranging from a single laboratory, to a group of laboratories within and beyond an institution, to the entire proteomics community. The current version is a significant improvement over the first version in that it contains new modules for liquid chromatography–tandem mass spectrometry (LC–MS/MS) database search results, label and label-free quantitative proteomic analysis, and several scoring outputs for phosphopeptide site localization. In addition, we have added both peptide and protein comparative analysis tools to enable pairwise analysis of distinct peptides/proteins in each sample and of overlapping peptides/proteins between all samples in multiple datasets. We have also implemented a targeted proteomics module for automated multiple reaction monitoring (MRM)/selective reaction monitoring (SRM) assay development. We have linked YPED’s database search results and both label-based and label-free fold-change analysis to the Skyline Panorama repository for online spectra visualization. In addition, we have built enhanced functionality to curate peptide identifications into an MS/MS peptide spectral library for all of our protein database search identification results. PMID:25712262

  2. YPED: an integrated bioinformatics suite and database for mass spectrometry-based proteomics research.

    PubMed

    Colangelo, Christopher M; Shifman, Mark; Cheung, Kei-Hoi; Stone, Kathryn L; Carriero, Nicholas J; Gulcicek, Erol E; Lam, TuKiet T; Wu, Terence; Bjornson, Robert D; Bruce, Can; Nairn, Angus C; Rinehart, Jesse; Miller, Perry L; Williams, Kenneth R

    2015-02-01

    We report a significantly enhanced bioinformatics suite and database for proteomics research called Yale Protein Expression Database (YPED) that is used by investigators at more than 300 institutions worldwide. YPED meets the data management, archival, and analysis needs of high-throughput mass spectrometry-based proteomics research, ranging from a single laboratory, to a group of laboratories within and beyond an institution, to the entire proteomics community. The current version is a significant improvement over the first version in that it contains new modules for liquid chromatography-tandem mass spectrometry (LC-MS/MS) database search results, label and label-free quantitative proteomic analysis, and several scoring outputs for phosphopeptide site localization. In addition, we have added both peptide and protein comparative analysis tools to enable pairwise analysis of distinct peptides/proteins in each sample and of overlapping peptides/proteins between all samples in multiple datasets. We have also implemented a targeted proteomics module for automated multiple reaction monitoring (MRM)/selective reaction monitoring (SRM) assay development. We have linked YPED's database search results and both label-based and label-free fold-change analysis to the Skyline Panorama repository for online spectra visualization. In addition, we have built enhanced functionality to curate peptide identifications into an MS/MS peptide spectral library for all of our protein database search identification results. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  3. An “EAR” on environmental surveillance and monitoring: A case study on the use of Exposure–Activity Ratios (EARs) to prioritize sites, chemicals, and bioactivities of concern in Great Lakes waters

    USGS Publications Warehouse

    Blackwell, Brett R.; Ankley, Gerald T.; Corsi, Steven; DeCicco, Laura; Houck, Kieth A.; Judson, Richard S.; Li, Shibin; Martin, Matthew T.; Murphy, Elizabeth; Schroeder, Anthony L.; Smith, Edwin R.; Swintek, Joe; Villeneuve, Daniel L.

    2017-01-01

    Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on concentration alone, it can be difficult to identify which compounds may be of toxicological concern and should be prioritized for further monitoring, in-depth testing, or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high-throughput screening (HTS) data, such as the ToxCast database, which contains information for over 9000 compounds screened through up to 1100 bioassays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin along with the ToxCast effects database were used to calculate exposure–activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Prioritized bioactivities from the EAR analysis were linked to discrete adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts.
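    The EAR itself is a simple ratio of a measured environmental concentration to the concentration active in a screening assay. The sketch below computes and ranks EARs from invented detection and assay-activity concentrations; the chemical names and values are illustrative only.

        # Exposure-activity ratio (EAR) as a prioritization metric.
        def ear(env_conc_um: float, assay_acc_um: float) -> float:
            """EAR = environmental concentration / assay activity concentration."""
            return env_conc_um / assay_acc_um

        detections_um = {"bisphenol_a": 0.004, "atrazine": 0.009}  # site detections
        acc_um = {"bisphenol_a": 0.35, "atrazine": 4.1}            # assay ACC values

        prioritized = sorted(
            ((chem, ear(c, acc_um[chem])) for chem, c in detections_um.items()),
            key=lambda kv: kv[1], reverse=True)
        for chem, value in prioritized:
            print(f"{chem}: EAR = {value:.1e}")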

  4. Introducing the Phytophthora database: an integrated resource for detecting, monitoring, and managing Phytophthora diseases

    Treesearch

    Kelly L. Ivors; Frank Martin; Michael Coffey; Izabela Makalowska; David M. Geiser; Seogchan Kang

    2008-01-01

    Its virulence and ability to spread rapidly throughout the world by various means establishes Phytophthora as one of the most important groups of plant pathogens. Discoveries of interspecific hybridization among Phytophthora species in nature, which could yield novel pathogens, further underscore the threat posed by members of this genus. The ability...

  5. Raspberry Pi in-situ network monitoring system of groundwater flow and temperature integrated with OpenGeoSys

    NASA Astrophysics Data System (ADS)

    Park, Chan-Hee; Lee, Cholwoo

    2016-04-01

    The Raspberry Pi series comprises low-cost, smaller-than-credit-card-sized computers to which various operating systems, such as Linux and recently even Windows 10, have been ported. Thanks to mass production and rapid technology development, the price of the various sensors that can be attached to a Raspberry Pi has been dropping at an increasing speed. Therefore, the device can be an economical choice as a small portable computer for monitoring temporal hydrogeological data in the field. In this study, we present a Raspberry Pi system that measures the flow rate and temperature of groundwater at sites, stores them in a MySQL database, and produces interactive figures and tables, such as Google Charts online or Bokeh offline, for further monitoring and analysis. Since all the data are monitored over the Internet, any computer or mobile device can serve as a convenient monitoring tool. The measured data are further integrated with OpenGeoSys, a hydrogeological model that has also been ported to the Raspberry Pi series. This enables on-site hydrogeological modeling fed by temporal sensor data to meet various needs.
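    A minimal sketch of such an acquisition loop, assuming the mysql-connector-python client and a stub in place of the real flow and temperature sensor drivers; the table schema, credentials and sampling interval are invented for illustration.

        # Periodic sensor sampling on a Pi, persisted to a MySQL table.
        import time
        import random
        import mysql.connector

        def read_sensor():
            """Stand-in for the real flow/temperature sensor drivers."""
            return {"flow_lpm": 2.0 + random.random(),
                    "temp_c": 11.5 + random.random()}

        conn = mysql.connector.connect(host="localhost", user="pi",
                                       password="secret", database="groundwater")
        cur = conn.cursor()
        cur.execute("""CREATE TABLE IF NOT EXISTS readings
                       (ts TIMESTAMP, flow_lpm DOUBLE, temp_c DOUBLE)""")

        while True:
            s = read_sensor()
            cur.execute("INSERT INTO readings VALUES (NOW(), %s, %s)",
                        (s["flow_lpm"], s["temp_c"]))
            conn.commit()
            time.sleep(60)  # one-minute sampling interval (illustrative)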

  6. Low-Cost, Distributed Environmental Monitors for Factory Worker Health

    PubMed Central

    Thomas, Geb W.; Sousan, Sinan; Tatum, Marcus; Liu, Xiaoxing; Zuidema, Christopher; Fitzpatrick, Mitchell; Koehler, Kirsten A.; Peters, Thomas M.

    2018-01-01

    An integrated network of environmental monitors was developed to continuously measure several airborne hazards in a manufacturing facility. The monitors integrated low-cost sensors to measure particulate matter, carbon monoxide, ozone and nitrogen dioxide, noise, temperature and humidity. The monitors were developed and tested in situ for three months in several overlapping deployments, before a full cohort of 40 was deployed in a heavy vehicle manufacturing facility for a year of data collection. The monitors collect data from each sensor and report them to a central database every 5 min. The work includes an experimental validation of the particle, gas and noise monitors. The R2 for the particle sensor ranges between 0.98 and 0.99 for particle mass densities up to 300 μg/m3. The R2 for the carbon monoxide sensor is 0.99 for concentrations up to 15 ppm. The R2 for the oxidizing gas sensor is 0.98 over the sensitive range from 20 to 180 ppb. The noise monitor is precise within 1% between 65 and 95 dBA. This work demonstrates the capability of distributed monitoring as a means to examine exposure variability in both space and time, building an important preliminary step towards a new approach for workplace hazard monitoring. PMID:29751534
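    The reported R2 values come from regressing low-cost sensor readings against a reference instrument during co-location. A sketch of that validation computation, on synthetic stand-in data rather than the study's measurements:

        # Coefficient of determination for a least-squares sensor-vs-reference fit.
        import numpy as np

        def r_squared(reference: np.ndarray, sensor: np.ndarray) -> float:
            """R^2 of a linear fit of sensor readings to the reference."""
            slope, intercept = np.polyfit(reference, sensor, 1)
            predicted = slope * reference + intercept
            ss_res = np.sum((sensor - predicted) ** 2)
            ss_tot = np.sum((sensor - sensor.mean()) ** 2)
            return 1.0 - ss_res / ss_tot

        ref = np.linspace(10, 300, 30)                       # reference, ug/m3
        low_cost = ref * 0.96 + 4 + np.random.normal(0, 5, ref.size)
        print(f"R2 = {r_squared(ref, low_cost):.3f}")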

  7. An integrated biomedical telemetry system for sleep monitoring employing a portable body area network of sensors (SENSATION).

    PubMed

    Astaras, Alexander; Arvanitidou, Marina; Chouvarda, Ioanna; Kilintzis, Vassilis; Koutkias, Vassilis; Sanchez, Eduardo Monton; Stalidis, George; Triantafyllidis, Andreas; Maglaveras, Nicos

    2008-01-01

    A flexible, scaleable and cost-effective medical telemetry system is described for monitoring sleep-related disorders in the home environment. The system was designed and built for real-time data acquisition and processing, allowing for additional use in intensive care unit scenarios where rapid medical response is required in case of emergency. It comprises a wearable body area network of Zigbee-compatible wireless sensors worn by the subject, a central database repository residing in the medical centre and thin client workstations located at the subject's home and in the clinician's office. The system supports heterogeneous setup configurations, involving a variety of data acquisition sensors to suit several medical applications. All telemetry data is securely transferred and stored in the central database under the clinicians' ownership and control.

  8. Vehicle Integrated Prognostic Reasoner (VIPR) 2010 Annual Final Report

    NASA Technical Reports Server (NTRS)

    Hadden, George D.; Mylaraswamy, Dinkar; Schimmel, Craig; Biswas, Gautam; Koutsoukos, Xenofon; Mack, Daniel

    2011-01-01

    Honeywell's Central Maintenance Computer Function (CMCF) and Aircraft Condition Monitoring Function (ACMF) represent the state-of-the art in integrated vehicle health management (IVHM). Underlying these technologies is a fault propagation modeling system that provides nose-to-tail coverage and root cause diagnostics. The Vehicle Integrated Prognostic Reasoner (VIPR) extends this technology to interpret evidence generated by advanced diagnostic and prognostic monitors provided by component suppliers to detect, isolate, and predict adverse events that affect flight safety. This report describes year one work that included defining the architecture and communication protocols and establishing the user requirements for such a system. Based on these and a set of ConOps scenarios, we designed and implemented a demonstration of communication pathways and associated three-tiered health management architecture. A series of scripted scenarios showed how VIPR would detect adverse events before they escalate as safety incidents through a combination of advanced reasoning and additional aircraft data collected from an aircraft condition monitoring system. Demonstrating VIPR capability for cases recorded in the ASIAS database and cross linking them with historical aircraft data is planned for year two.

  9. Evaluating Land-Atmosphere Interactions with the North American Soil Moisture Database

    NASA Astrophysics Data System (ADS)

    Giles, S. M.; Quiring, S. M.; Ford, T.; Chavez, N.; Galvan, J.

    2015-12-01

    The North American Soil Moisture Database (NASMD) is a high-quality observational soil moisture database that was developed to study land-atmosphere interactions. It includes over 1,800 monitoring stations the United States, Canada and Mexico. Soil moisture data are collected from multiple sources, quality controlled and integrated into an online database (soilmoisture.tamu.edu). The period of record varies substantially and only a few of these stations have an observation record extending back into the 1990s. Daily soil moisture observations have been quality controlled using the North American Soil Moisture Database QAQC algorithm. The database is designed to facilitate observationally-driven investigations of land-atmosphere interactions, validation of the accuracy of soil moisture simulations in global land surface models, satellite calibration/validation for SMOS and SMAP, and an improved understanding of how soil moisture influences climate on seasonal to interannual timescales. This paper provides some examples of how the NASMD has been utilized to enhance understanding of land-atmosphere interactions in the U.S. Great Plains.

  10. DIMA.Tools: An R package for working with the database for inventory, monitoring, and assessment

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring, and Assessment (DIMA) is a Microsoft Access database used to collect, store and summarize monitoring data. This database is used by both local and national monitoring efforts within the National Park Service, the Forest Service, the Bureau of Land Management, ...

  11. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford

    2014-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up generators and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
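    The three-part structure of a fault signature (asset type, fault type, feature set) is easy to render in code. Below is a minimal, invented example of signatures and a containment-based matcher; it is a reading of the idea, not FW-PHM content.

        # Fault signatures and a matcher that returns every signature whose
        # symptom set is contained in the observed symptoms.
        from dataclasses import dataclass

        @dataclass
        class FaultSignature:
            asset_type: str
            fault_type: str
            features: frozenset  # symptoms indicative of the fault

        SIGNATURES = [
            FaultSignature("emergency_diesel_generator", "injector_fouling",
                           frozenset({"high_exhaust_temp", "power_droop"})),
            FaultSignature("generator_step_up_transformer", "winding_hotspot",
                           frozenset({"high_top_oil_temp", "dissolved_gas_rise"})),
        ]

        def diagnose(asset_type: str, observed: set):
            """List candidate faults whose features all appear in `observed`."""
            return [s for s in SIGNATURES
                    if s.asset_type == asset_type and s.features <= observed]

        print(diagnose("emergency_diesel_generator",
                       {"high_exhaust_temp", "power_droop", "vibration"}))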

  12. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.
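    Multi-packet transport for arbitrarily large images reduces to chunking and ordered reassembly, sketched below with an invented packet layout rather than the actual Antelope orbserver packet format.

        # Split image bytes into numbered chunks; reassemble them in order.
        def packetize(image: bytes, chunk: int = 512):
            total = (len(image) + chunk - 1) // chunk
            for seq in range(total):
                yield {"seq": seq, "total": total,
                       "payload": image[seq * chunk:(seq + 1) * chunk]}

        def reassemble(packets) -> bytes:
            ordered = sorted(packets, key=lambda p: p["seq"])
            assert len(ordered) == ordered[0]["total"], "missing packets"
            return b"".join(p["payload"] for p in ordered)

        image = bytes(range(256)) * 10            # stand-in for a JPEG
        assert reassemble(list(packetize(image))) == image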

  13. Technical literature review.

    PubMed

    Nußbeck, Gunnar; Gök, Murat

    2013-01-01

    This review gives a comprehensive overview of the technical perspective on personal health monitoring. It is designed to build a shared basis for the project partners of the PHM-Ethics project. A literature search was conducted to screen pertinent literature databases for relevant publications. All review papers that were retrieved were analyzed. The increasing number of publications per year shows that the field of personal health monitoring is of growing interest in the research community. Most publications deal with telemonitoring, which thus forms the core technology of personal health monitoring. Measured parameters, fields of application, participants and stakeholders are described. Moreover, an outlook is provided on information and communication technologies that foster the integration of personal health monitoring into decision making and the remote monitoring of individual people's health. The removal of technological barriers opens new perspectives in health and health care delivery using home monitoring applications.

  14. Processing of the WLCG monitoring data using NoSQL

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  15. EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet

    EPA Pesticide Factsheets

    This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including: Assessment Cleanup and Redevelopment Exchange System (ACRES): brownfields sites; Air Facility System (AFS): stationary sources of air pollution; Air Quality System (AQS): ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA): schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS): market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS): hazardous waste sites; Integrated Compliance Information System (ICIS): integrated enforcement and compliance information; National Compliance Database (NCDB): Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS: NPDES surface water permits; Radiation Information Database (RADINFO): radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC): best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo): tracks generators, transporters, treaters, storers, and disposers of haz

  16. EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet Download

    EPA Pesticide Factsheets

    This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including: Assessment Cleanup and Redevelopment Exchange System (ACRES): brownfields sites; Air Facility System (AFS): stationary sources of air pollution; Air Quality System (AQS): ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA): schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS): market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS): hazardous waste sites; Integrated Compliance Information System (ICIS): integrated enforcement and compliance information; National Compliance Database (NCDB): Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS: NPDES surface water permits; Radiation Information Database (RADINFO): radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC): best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo): tracks generators, transporters, treaters, storers, and disposers

  17. EPA Facility Registry Service (FRS): Facility Interests Dataset Download

    EPA Pesticide Factsheets

    This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including: Assessment Cleanup and Redevelopment Exchange System (ACRES): brownfields sites; Air Facility System (AFS): stationary sources of air pollution; Air Quality System (AQS): ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA): schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS): market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS): hazardous waste sites; Integrated Compliance Information System (ICIS): integrated enforcement and compliance information; National Compliance Database (NCDB): Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS: NPDES surface water permits; Radiation Information Database (RADINFO): radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC): best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo): tracks generators, transporters, treaters, storers, and disposers

  18. EPA Facility Registry Service (FRS): Facility Interests Dataset

    EPA Pesticide Factsheets

    This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in the FRS individual feature layers. The layers comprise the FRS major program databases, including: Assessment Cleanup and Redevelopment Exchange System (ACRES): brownfields sites; Air Facility System (AFS): stationary sources of air pollution; Air Quality System (AQS): ambient air pollution data from monitoring stations; Bureau of Indian Affairs (BIA): schools data on Indian land; Base Realignment and Closure (BRAC) facilities; Clean Air Markets Division Business System (CAMDBS): market-based air pollution control programs; Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS): hazardous waste sites; Integrated Compliance Information System (ICIS): integrated enforcement and compliance information; National Compliance Database (NCDB): Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA); National Pollutant Discharge Elimination System (NPDES) module of ICIS: NPDES surface water permits; Radiation Information Database (RADINFO): radiation and radioactivity facilities; RACT/BACT/LAER Clearinghouse (RBLC): best available air pollution technology requirements; Resource Conservation and Recovery Act Information System (RCRAInfo): tracks generators, transporters, treaters, storers, and disposers of haz

  19. The Genomes OnLine Database (GOLD) v.5: a metadata management system based on a four level (meta)genome project classification

    PubMed Central

    Reddy, T.B.K.; Thomas, Alex D.; Stamatis, Dimitri; Bertsch, Jon; Isbandi, Michelle; Jansson, Jakob; Mallajosyula, Jyothi; Pagani, Ioanna; Lobos, Elizabeth A.; Kyrpides, Nikos C.

    2015-01-01

    The Genomes OnLine Database (GOLD; http://www.genomesonline.org) is a comprehensive online resource to catalog and monitor genetic studies worldwide. GOLD provides up-to-date status on complete and ongoing sequencing projects along with a broad array of curated metadata. Here we report version 5 (v.5) of the database. The newly designed database schema and web user interface supports several new features including the implementation of a four level (meta)genome project classification system and a simplified intuitive web interface to access reports and launch search tools. The database currently hosts information for about 19 200 studies, 56 000 Biosamples, 56 000 sequencing projects and 39 400 analysis projects. More than just a catalog of worldwide genome projects, GOLD is a manually curated, quality-controlled metadata warehouse. The problems encountered in integrating disparate and varying quality data into GOLD are briefly highlighted. GOLD fully supports and follows the Genomic Standards Consortium (GSC) Minimum Information standards. PMID:25348402

  20. S.I.I.A for monitoring crop evolution and anomaly detection in Andalusia by remote sensing

    NASA Astrophysics Data System (ADS)

    Rodriguez Perez, Antonio Jose; Louakfaoui, El Mostafa; Munoz Rastrero, Antonio; Rubio Perez, Luis Alberto; de Pablos Epalza, Carmen

    2004-02-01

    A new remote sensing application was developed and incorporated into the Agrarian Integrated Information System (S.I.I.A), a project that integrates the regional farming databases from a geographical point of view, adding new value and uses to the original information. The project is supported by the Studies and Statistical Service of the Regional Government Ministry of Agriculture and Fisheries (CAP). The process integrates NDVI values from daily NOAA-AVHRR and monthly IRS-WIFS images with crop-class location maps. Local agrarian and meteorological information is being included in the working process to produce a synergistic effect. An updated crop-growing evaluation state is obtained for each 10-day period, crop class, sensor type (including data fusion) and administrative geographical border. The crop database of the last ten years (1992-2002) has been organized according to these variables. The crop-class database can be accessed through an application that helps users with crop statistical analysis. Multi-temporal and multi-geographical comparative analyses can be done by the user, not only for a single year but also from a historical point of view. Moreover, real-time crop anomalies can be detected and analyzed. Most of the output products will be available on the Internet in the near future through an on-line application.
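    The NDVI underlying these products is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red). A sketch of the computation on a small reflectance grid, where the two arrays stand in for calibrated NOAA-AVHRR or IRS-WIFS bands:

        # NDVI from near-infrared and red reflectance; values lie in [-1, 1],
        # with dense healthy vegetation approaching 1.0.
        import numpy as np

        def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
            """NDVI = (NIR - Red) / (NIR + Red), guarding against zero sums."""
            return (nir - red) / np.clip(nir + red, 1e-6, None)

        nir = np.array([[0.45, 0.50], [0.30, 0.60]])
        red = np.array([[0.10, 0.12], [0.20, 0.08]])
        print(ndvi(nir, red))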

  1. GIS Application System Design Applied to Information Monitoring

    NASA Astrophysics Data System (ADS)

    Qun, Zhou; Yujin, Yuan; Yuena, Kang

    A natural-environment information management system involves on-line instrument monitoring, data communications, database construction, information management software development and more. Its core task is to collect reliable environmental information, to increase the utilization and sharing of that information through advanced information technology, and to provide a timely, scientific foundation for environmental monitoring and management. This thesis adopts C# plug-in application development, using a complete set of embedded GIS component and tool libraries provided by the GIS Engine to build the core of a plug-in GIS application framework: the design and implementation of the framework host program, of each functional plug-in, and of the plug-in GIS application framework platform as a whole. By exploiting dynamic plug-in loading and configuration, GIS applications can be established quickly through visual, component-based collaborative modeling, realizing GIS application integration. The resulting platform is applicable to any integration effort involving GIS applications on the ESRI platform and can serve as a base platform for GIS application development.
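
    The dynamic plug-in loading idea can be sketched briefly. The thesis works in C# against the ESRI GIS Engine; the following Python sketch only illustrates the host-program/plug-in pattern, with `create_plugin` as an assumed entry-point convention:

```python
import importlib
from typing import Dict, List, Protocol

class Plugin(Protocol):
    name: str
    def activate(self, host: "Host") -> None: ...

class Host:
    """Framework host program: functional plug-ins are listed in a
    configuration and loaded at run time, so new GIS functions can be
    added without changing the host."""

    def __init__(self) -> None:
        self.plugins: Dict[str, Plugin] = {}

    def load(self, module_names: List[str]) -> None:
        for module_name in module_names:
            module = importlib.import_module(module_name)  # dynamic loading
            plugin: Plugin = module.create_plugin()        # assumed entry point
            self.plugins[plugin.name] = plugin
            plugin.activate(self)

# host = Host()
# host.load(["map_view_plugin", "layer_query_plugin"])  # hypothetical modules
```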

  2. Innovative Methods for Integrating Knowledge for Long-Term Monitoring of Contaminated Groundwater Sites: Understanding Microorganism Communities and their Associated Hydrochemical Environment

    NASA Astrophysics Data System (ADS)

    Mouser, P. J.; Rizzo, D. M.; Druschel, G.; O'Grady, P.; Stevens, L.

    2005-12-01

    This interdisciplinary study integrates hydrochemical and genome-based data to estimate the redox processes occurring at long-term monitoring sites. Groundwater samples have been collected from a well-characterized landfill-leachate contaminated aquifer in northeastern New York. Primers targeting the 16S rDNA gene were used to amplify Bacteria and Archaea in groundwater taken from monitoring wells located in clean, fringe, and contaminated locations within the aquifer. PCR-amplified rDNA were digested with restriction enzymes to evaluate terminal restriction fragment length polymorphism (T-RFLP) community profiles. The rDNA was cloned and sequenced, and partial sequences were matched against known organisms using the NCBI BLAST database. Phylogenetic trees and bootstrapping were used to classify organisms and compare the communities from clean, fringe, and contaminated locations. We used Artificial Neural Network (ANN) models to integrate the microbial data with hydrochemical information and improve our understanding of subsurface processes.
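
    As a rough illustration of the ANN step, here is a sketch using scikit-learn on synthetic data; the feature layout and class labels are assumptions, since the study's actual model inputs are not detailed in the abstract:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Hypothetical feature layout: T-RFLP fragment abundances concatenated with
# hydrochemical measurements (e.g. Fe(II), sulfate, DO) for each well.
rng = np.random.default_rng(1)
X = rng.random((30, 12))         # 30 wells x 12 combined features
y = rng.integers(0, 3, size=30)  # 0 = clean, 1 = fringe, 2 = contaminated

X_std = StandardScaler().fit_transform(X)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_std, y)
print(clf.predict(X_std[:5]))    # predicted zone class for five wells
```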

  3. A Semantic Sensor Web for Environmental Decision Support Applications

    PubMed Central

    Gray, Alasdair J. G.; Sadler, Jason; Kit, Oles; Kyzirakos, Kostis; Karpathiotakis, Manos; Calbimonte, Jean-Paul; Page, Kevin; García-Castro, Raúl; Frazer, Alex; Galpin, Ixent; Fernandes, Alvaro A. A.; Paton, Norman W.; Corcho, Oscar; Koubarakis, Manolis; De Roure, David; Martinez, Kirk; Gómez-Pérez, Asunción

    2011-01-01

    Sensing devices are increasingly being deployed to monitor the physical world around us. One class of application for which sensor data is pertinent is environmental decision support systems, e.g., flood emergency response. For these applications, the sensor readings need to be put in context by integrating them with other sources of data about the surrounding environment. Traditional systems for predicting and detecting floods rely on methods that need significant human resources. In this paper we describe a semantic sensor web architecture for integrating multiple heterogeneous datasets, including live and historic sensor data, databases, and map layers. The architecture provides mechanisms for discovering datasets, defining integrated views over them, continuously receiving data in real-time, and visualising on screen and interacting with the data. Our approach makes extensive use of web service standards for querying and accessing data, and semantic technologies to discover and integrate datasets. We demonstrate the use of our semantic sensor web architecture in the context of a flood response planning web application that uses data from sensor networks monitoring the sea-state around the coast of England. PMID:22164110
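
    A toy illustration of semantic dataset discovery with rdflib; the vocabulary and station data are invented stand-ins for the ontologies and endpoints the architecture actually uses:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Simplified vocabulary loosely inspired by SSN-style sensor descriptions.
EX = Namespace("http://example.org/sensors#")

g = Graph()
station = URIRef("http://example.org/stations/chimet")
g.add((station, RDF.type, EX.SensorStation))
g.add((station, EX.observes, EX.WaveHeight))
g.add((station, EX.locatedIn, Literal("English Channel")))

# Discover stations observing wave height, as a flood decision-support
# application might do when assembling relevant datasets.
query = """
PREFIX ex: <http://example.org/sensors#>
SELECT ?station ?region WHERE {
    ?station a ex:SensorStation ;
             ex:observes ex:WaveHeight ;
             ex:locatedIn ?region .
}
"""
for row in g.query(query):
    print(row.station, row.region)
```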

  4. The European general thoracic surgery database project.

    PubMed

    Falcoz, Pierre Emmanuel; Brunelli, Alessandro

    2014-05-01

    The European Society of Thoracic Surgeons (ESTS) Database is a free registry created by ESTS in 2001. The current online version was launched in 2007. It currently runs on a Dendrite platform with extensive data security and frequent backups. The main features are a specialty-specific, procedure-specific, prospectively maintained, periodically audited, web-based electronic database designed for quality control and performance monitoring, which allows for the collection of all general thoracic procedures. Data collection is the "backbone" of the ESTS database. It includes many risk factors, processes of care and outcomes, which are specially designed for quality control and performance audit. Users can download and export their own data and use them for internal analyses and quality control audits. The ESTS database represents the gold standard of clinical data collection for European general thoracic surgery. Over the past years, the ESTS database has achieved many accomplishments. In particular, it has hit two major milestones: it now includes more than 235 participating centers and 70,000 surgical procedures. The ESTS database is a snapshot of surgical practice that aims at improving patient care. In other words, data capture should become integral to routine patient care, with the final objective of improving quality of care within Europe.

  5. Medication-use evaluation with a Web application.

    PubMed

    Burk, Muriel; Moore, Von; Glassman, Peter; Good, Chester B; Emmendorfer, Thomas; Leadholm, Thomas C; Cunningham, Francesca

    2013-12-15

    A Web-based application for coordinating medication-use evaluation (MUE) initiatives within the Veterans Affairs (VA) health care system is described. The MUE Tracker (MUET) software program was created to improve VA's ability to conduct national medication-related interventions throughout its network of 147 medical centers. MUET initiatives are centrally coordinated by the VA Center for Medication Safety (VAMedSAFE), which monitors the agency's integrated databases for indications of suboptimal prescribing or drug therapy monitoring and adverse treatment outcomes. When a pharmacovigilance signal is detected, VAMedSAFE identifies "trigger groups" of at-risk veterans and uploads patient lists to the secure MUET application, where locally designated personnel (typically pharmacists) can access and use the data to target risk-reduction efforts. Local data on patient-specific interventions are stored in a centralized database and regularly updated to enable tracking and reporting for surveillance and quality-improvement purposes; aggregated data can be further analyzed for provider education and benchmarking. In a three-year pilot project, the MUET program was found effective in promoting improved prescribing of erythropoiesis-stimulating agents (ESAs) and enhanced laboratory monitoring of ESA-treated patients in all specified trigger groups. The MUET initiative has since been expanded to target other high-risk drugs, and efforts are underway to refine the tool for broader utility. The MUET application has enabled the increased standardization of medication safety initiatives across the VA system and may serve as a useful model for the development of pharmacovigilance tools by other large integrated health care systems.

  6. Role of data warehousing in healthcare epidemiology.

    PubMed

    Wyllie, D; Davies, J

    2015-04-01

    Electronic storage of healthcare data, including individual-level risk factors for both infectious and other diseases, is increasing. These data can be integrated at hospital, regional and national levels. Data sources that contain risk factor and outcome information for a wide range of conditions offer the potential for efficient epidemiological analysis of multiple diseases. Opportunities may also arise for monitoring healthcare processes. Integrating diverse data sources presents epidemiological, practical, and ethical challenges. For example, diagnostic criteria, outcome definitions, and ascertainment methods may differ across the data sources. Data volumes may be very large, requiring sophisticated computing technology. Given the large populations involved, perhaps the most challenging aspect is how informed consent can be obtained for the development of integrated databases, particularly when it is not easy to demonstrate their potential. In this article, we discuss some of the ups and downs of recent projects as well as the potential of data warehousing for antimicrobial resistance monitoring.

  7. A new comprehensive database of global volcanic gas analyses

    NASA Astrophysics Data System (ADS)

    Clor, L. E.; Fischer, T. P.; Lehnert, K. A.; McCormick, B.; Hauri, E. H.

    2013-12-01

    Volcanic volatiles are the driving force behind eruptions and powerful indicators of magma provenance; they present localized hazards and have implications for climate. Studies of volcanic emissions are necessary for understanding volatile cycling from the mantle to the atmosphere. Gas compositions vary with volcanic activity, making it important to track their chemical variability over time. As studies become increasingly interdisciplinary, it is critical to have a mechanism to integrate decades of gas studies across disciplines. Despite the value of this research to a variety of fields, there is currently no integrated network to house all volcanic and hydrothermal gas data, making spatial, temporal, and interdisciplinary comparison studies time-consuming. To remedy this, we are working to establish a comprehensive database of volcanic gas emissions and compositions worldwide, as part of the Deep Carbon Observatory's DECADE (Deep Carbon Degassing) initiative. Volcanic gas data have been divided into two broad categories: 1) chemical analyses from samples collected directly at the volcanic source, and 2) measurements of gas concentrations and fluxes, such as remotely by mini-DOAS or satellite, or in-plume such as by multiGAS. The gas flux database effort is realized by the Global Volcanism Program of the Smithsonian Institution (abstract by Brendan McCormick, this meeting). The direct-sampling data are the subject of this presentation. Data from direct techniques include samples of gases collected at the volcanic source from fumaroles and springs, tephras analyzed for gas contents, filter pack samples of gases collected in a plume, and any other data types that involve collection of a sample. Data are incorporated into the existing framework of the Petrological Database, PetDB. Association with PetDB is advantageous as it will allow volcanic gas data to be linked to chemical data from lava or tephra samples, forming more complete ties between the eruptive products and the source magma. Eventually our goal is to have a seamless gas database that allows the user to easily access all gas data ever collected at volcanoes. This database will be useful in a variety of science applications: 1) correlating volcanic gas composition to volcanic activity; 2) establishing a characteristic gas composition or total volatile budget for a volcano or region in studies of global chemical cycles; 3) better quantifying the flux and source of volcanic carbon to the atmosphere. The World Organization of Volcano Observatories is populating a volcano monitoring database, WOVOdat, which centers on data collected during times of volcanic unrest for monitoring and hazard purposes. The focus of our database is to gain insight into volcanic degassing specifically, during both eruptive and quiescent times. Coordination of the new database with WOVOdat will allow comparison studies of gas compositions with seismic and other monitoring data during times of unrest, as well as promote comprehensive and cross-disciplinary questions about volcanic degassing.

  8. The Northern California Earthquake Management System: A Unified System From Realtime Monitoring to Data Distribution

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.

    2006-12-01

    The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing from triggering and locating through magnitude and moment tensor calculation and Shakemap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves realtime processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.

  9. The FAO/NASA/NLR Artemis system - An integrated concept for environmental monitoring by satellite in support of food/feed security and desert locust surveillance

    NASA Technical Reports Server (NTRS)

    Hielkema, J. U.; Howard, J. A.; Tucker, C. J.; Van Ingen Schenau, H. A.

    1987-01-01

    The African real time environmental monitoring using imaging satellites (Artemis) system, which should monitor precipitation and vegetation conditions on a continental scale, is presented. The hardware and software characteristics of the system are illustrated and the Artemis databases are outlined. Plans for the system include the use of hourly digital Meteosat data and daily NOAA/AVHRR data to study environmental conditions. Planned mapping activities include monthly rainfall anomaly maps, normalized difference vegetation index maps for ten day and monthly periods with a spatial resolution of 7.6 km, ten day crop/rangeland moisture availability maps, and desert locust potential breeding activity factor maps for a plague prevention program.

  10. Report: EPA Needs to Strengthen Financial Database Security Oversight and Monitor Compliance

    EPA Pesticide Factsheets

    Report #2007-P-00017, March 29, 2007. Weaknesses in how EPA offices monitor databases for known security vulnerabilities, communicate the status of critical system patches, and monitor the access to database administrator accounts and privileges.

  11. Informatics infrastructure for syndrome surveillance, decision support, reporting, and modeling of critical illness.

    PubMed

    Herasevich, Vitaly; Pickering, Brian W; Dong, Yue; Peters, Steve G; Gajic, Ognjen

    2010-03-01

    To develop and validate an informatics infrastructure for syndrome surveillance, decision support, reporting, and modeling of critical illness. Using open-schema data feeds imported from electronic medical records (EMRs), we developed a near-real-time relational database (Multidisciplinary Epidemiology and Translational Research in Intensive Care Data Mart). Imported data domains included physiologic monitoring, medication orders, laboratory and radiologic investigations, and physician and nursing notes. Open database connectivity supported the use of Boolean combinations of data that allowed authorized users to develop syndrome surveillance, decision support, and reporting (data "sniffers") routines. Random samples of database entries in each category were validated against corresponding independent manual reviews. The Multidisciplinary Epidemiology and Translational Research in Intensive Care Data Mart accommodates, on average, 15,000 admissions to the intensive care unit (ICU) per year and 200,000 vital records per day. Agreement between database entries and manual EMR audits was high for sex, mortality, and use of mechanical ventilation (kappa, 1.0 for all) and for age and laboratory and monitored data (Bland-Altman mean difference +/- SD, 1(0) for all). Agreement was lower for interpreted or calculated variables, such as specific syndrome diagnoses (kappa, 0.5 for acute lung injury), duration of ICU stay (mean difference +/- SD, 0.43+/-0.2), or duration of mechanical ventilation (mean difference +/- SD, 0.2+/-0.9). Extraction of essential ICU data from a hospital EMR into an open, integrative database facilitates process control, reporting, syndrome surveillance, decision support, and outcome research in the ICU.
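
    The Boolean "sniffer" idea can be illustrated with a toy rule; the threshold and text phrases below are illustrative, not the validated Mayo criteria:

```python
def ali_sniffer(pao2: float, fio2: float, cxr_report: str) -> bool:
    """Toy Boolean 'sniffer' combining a blood-gas ratio with a free-text
    radiology screen; threshold and phrases are illustrative only."""
    hypoxemia = (pao2 / fio2) < 300  # qualifying PaO2/FiO2 ratio
    infiltrates = any(
        phrase in cxr_report.lower()
        for phrase in ("bilateral infiltrates", "bilateral opacities")
    )
    return hypoxemia and infiltrates

# A surveillance loop would evaluate the rule on each new data-feed row.
print(ali_sniffer(pao2=80, fio2=0.5, cxr_report="Bilateral infiltrates noted."))
```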

  12. A web-based relational database for monitoring and analyzing mosquito population dynamics.

    PubMed

    Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric

    2008-07-01

    Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provide the capacity for the user to move data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
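
    The normalization step described above amounts to dividing weekly counts by the number of traps running. A minimal sketch with invented records:

```python
from collections import defaultdict

# Hypothetical light-trap records: (year, week, species, count, traps_running)
records = [
    (2007, 27, "Culex pipiens", 340, 17),
    (2007, 27, "Aedes vexans", 120, 17),
    (2006, 27, "Culex pipiens", 410, 20),
]

# Normalize abundance to mosquitoes per trap so that weeks with different
# numbers of operating traps remain comparable.
per_trap = defaultdict(dict)
for year, week, species, count, traps in records:
    per_trap[(year, week)][species] = count / traps

print(per_trap[(2007, 27)]["Culex pipiens"])  # 20.0 mosquitoes per trap
```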

  13. Extending Glacier Monitoring into the Little Ice Age and Beyond

    NASA Astrophysics Data System (ADS)

    Nussbaumer, S. U.; Gärtner-Roer, I.; Zemp, M.; Zumbühl, H. J.; Masiokas, M. H.; Espizua, L. E.; Pitte, P.

    2011-12-01

    Glaciers are among the best natural proxies of climatic changes and, as such, a key variable within the international climate observing system. The worldwide monitoring of glacier distribution and fluctuations has been internationally coordinated for more than a century. Direct measurements of seasonal and annual glacier mass balance are available for the past six decades. Regular observations of glacier front variations have been carried out since the late 19th century. Information on glacier fluctuations before the onset of regular in situ measurements has to be reconstructed from moraines, historical evidence, and a wide range of dating methods. The majority of the corresponding data is not available to the scientific community, which challenges the reproducibility and direct comparison of the results. Here, we present a first approach towards the standardization of reconstructed Holocene glacier front variations as well as the integration of the corresponding data series into the database of the World Glacier Monitoring Service (www.wgms.ch), within the framework of the Global Terrestrial Network for Glaciers (www.gtn-g.org). The concept for the integration of these reconstructed front variations into the relational glacier database of the WGMS was jointly elaborated and tested by experts of both fields (natural and historical sciences), based on reconstruction series of 15 glaciers in Europe (western/central Alps and southern Norway) and 9 in southern South America. The reconstructed front variation series extend the direct measurements of the 20th century by two centuries in Norway and by four in the Alps and in South America. The storage of the records within the international glacier databases guarantees the long-term availability of the data series and increases the visibility of the scientific research which - in historical glaciology - is often the work of a lifetime. The standardized collection of reconstructed glacier front variations from southern Norway, the western Alps and the southern Andes allows a direct comparison between different glaciers. It is a first step towards a worldwide compilation and free dissemination of Holocene glacier fluctuation series within the internationally coordinated glacier monitoring.

  14. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

    To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. Information component (IC), the internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations including known signals were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system can be used for the detection of significant associations from the Guangdong drug-monitoring database and could be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs for the first time in China.
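
    The information component is the log-ratio of observed to expected co-reporting of a drug and a reaction. A minimal sketch of the basic estimator; the production BCPNN adds Bayesian shrinkage for small counts, which is omitted here:

```python
import math

def information_component(n_xy: int, n_x: int, n_y: int, n: int) -> float:
    """Basic information component IC = log2(P(x,y) / (P(x) * P(y))),
    estimated from report counts."""
    p_xy = n_xy / n  # joint probability of drug x with reaction y
    p_x = n_x / n    # probability of drug x over all reports
    p_y = n_y / n    # probability of reaction y over all reports
    return math.log2(p_xy / (p_x * p_y))

# Example: 20 drug-reaction pairs among 10,000 reports, where the drug
# appears in 200 reports and the reaction in 300; IC > 0 flags a signal.
print(information_component(n_xy=20, n_x=200, n_y=300, n=10_000))  # ~1.74
```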

  15. A multimedia perioperative record keeper for clinical research.

    PubMed

    Perrino, A C; Luther, M A; Phillips, D B; Levin, F L

    1996-05-01

    To develop a multimedia perioperative recordkeeper that provides: 1. synchronous, real-time acquisition of multimedia data, 2. on-line access to the patient's chart data, and 3. advanced data analysis capabilities through integrated, multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry-standard hardware components and graphical software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by the critiques from numerous users. As a result of this process, a suite of custom software applications was created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.

  16. The ESO astronomical site monitor upgrade

    NASA Astrophysics Data System (ADS)

    Chiozzi, Gianluca; Sommer, Heiko; Sarazin, Marc; Bierwirth, Thomas; Dorigo, Dario; Vera Sequeiros, Ignacio; Navarrete, Julio; Del Valle, Diego

    2016-08-01

    Monitoring and prediction of astronomical observing conditions are essential for planning and optimizing observations. For this purpose, ESO, in the 90s, developed the concept of an Astronomical Site Monitor (ASM) as a facility fully integrated in the operations of the VLT observatory [1]. Identical systems were installed at Paranal and La Silla, providing comprehensive local weather information. By now, we had very good reasons for a major upgrade:
    • The need to introduce new features satisfying the requirements of observing with the Adaptive Optics Facility and benefiting other Adaptive Optics systems.
    • Managing hardware and software obsolescence.
    • Making the system more maintainable and expandable by integrating off-the-shelf hardware solutions.
    The new ASM integrates:
    • A new Differential Image Motion Monitor (DIMM) paired with a Multi Aperture Scintillation Sensor (MASS) to measure the vertical distribution of turbulence in the high atmosphere and its characteristic velocity.
    • A new SLOpe Detection And Ranging (SLODAR) telescope for measuring the altitude and intensity of turbulent layers in the low atmosphere.
    • A water vapour radiometer to monitor the water vapour content of the atmosphere.
    • The old weather tower, which is being refurbished with new sensors.
    The telescopes and integrated devices are commercial products, and we have used the vendors' control systems as much as possible. The existing external interfaces, based on the VLT standards, have been maintained for full backward compatibility. All data produced by the system are fed directly in real time into a relational database. A completely new web-based display replaces the obsolete plots based on HP-UX RTAP. We analyse here the architectural and technological choices and discuss the motivations and trade-offs.

  17. SITHON: A Wireless Network of in Situ Optical Cameras Applied to the Early Detection-Notification-Monitoring of Forest Fires

    PubMed Central

    Tsiourlis, Georgios; Andreadakis, Stamatis; Konstantinidis, Pavlos

    2009-01-01

    The SITHON system, a fully wireless optical imaging system integrating a network of in-situ optical cameras linked to a multi-layer GIS database operated by Control Operating Centres, has been developed in response to the need for early detection, notification and monitoring of forest fires. This article presents in detail the architecture and the components of SITHON, and demonstrates the first encouraging results of an experimental test with small controlled fires over the Sithonia Peninsula in Northern Greece. The system has already been scheduled for installation in several fire-prone areas of Greece. PMID:22408536

  18. Assessing air quality in Aksaray with time series analysis

    NASA Astrophysics Data System (ADS)

    Kadilar, Gamze Özel; Kadilar, Cem

    2017-04-01

    Sulphur dioxide (SO2) is a major air pollutant caused by the dominant usage of diesel, petrol and other fuels by vehicles and industries. One of the most air-polluted cities in Turkey is Aksaray. Hence, in this study, the level of SO2 in Aksaray is analyzed based on the database recorded at the city's air quality monitoring station. A Seasonal Autoregressive Integrated Moving Average (SARIMA) approach is used to forecast the level of the SO2 air quality parameter. The results indicate that the seasonal ARIMA model provides reliable and satisfactory predictions for the air quality parameters and is expected to become an alternative tool for practical assessment and justification.
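
    A brief sketch of fitting a seasonal ARIMA to a monthly pollutant series with statsmodels; the data and model orders are synthetic stand-ins, since the Aksaray series and the study's chosen orders are not given in the abstract:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly SO2 series standing in for the station data.
rng = np.random.default_rng(42)
months = pd.date_range("2010-01", periods=84, freq="MS")
so2 = 40 + 15 * np.sin(2 * np.pi * months.month.to_numpy() / 12)
so2 = so2 + rng.normal(0, 3, len(months))
series = pd.Series(so2, index=months)

# SARIMA(p,d,q)(P,D,Q)s with a 12-month seasonal cycle.
model = SARIMAX(series, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
print(fit.forecast(steps=12))  # next year's monthly SO2 forecast
```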

  19. A database application for wilderness character monitoring

    Treesearch

    Ashley Adams; Peter Landres; Simon Kingston

    2012-01-01

    The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...

  20. Seamless personal health information system in cloud computing.

    PubMed

    Chung, Wan-Young; Fong, Ee May

    2014-01-01

    Noncontact ECG measurement has gained popularity because it is noninvasive and convenient for daily use. This approach does not require any direct contact between the patient's skin and the sensor for physiological signal measurement. The noncontact ECG measurement is integrated with a mobile healthcare system for health status monitoring. The mobile phone acts as the personal health information system, displaying health status and tracking body mass index (BMI). It also plays an important role as medical guidance, providing a medical knowledge database including a symptom checker and health fitness guidance. At the same time, the system features unique medical functions that cater to the daily needs of patients and users, including regular medication reminders, an alert alarm, medical guidance, and appointment scheduling. Lastly, we demonstrate the mobile healthcare system with a web application for extended use: health data are stored in a cloud-based web server and web database. This allows easy remote health status monitoring and thus promotes a cost-effective personal healthcare system.

  1. DIMA quick start, database for inventory, monitoring and assessment

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly-customized Microsoft Access database for collecting data electronically in the field and for organizing, storing and reporting those data for monitoring and assessment. While DIMA can be used for any number of different monito...

  2. Remote monitoring system for the cryogenic system of superconducting magnets in the SuperKEKB interaction region

    NASA Astrophysics Data System (ADS)

    Aoki, K.; Ohuchi, N.; Zong, Z.; Arimoto, Y.; Wang, X.; Yamaoka, H.; Kawai, M.; Kondou, Y.; Makida, Y.; Hirose, M.; Endou, T.; Iwasaki, M.; Nakamura, T.

    2017-12-01

    A remote monitoring system was developed based on the software infrastructure of the Experimental Physics and Industrial Control System (EPICS) for the cryogenic system of superconducting magnets in the interaction region of the SuperKEKB accelerator. SuperKEKB has been constructed to conduct high-energy physics experiments at KEK. The superconducting magnets comprise three apparatuses: the Belle II detector solenoid and the QCSL and QCSR accelerator magnets. They are contained in three cryostats cooled by dedicated helium cryogenic systems. The monitoring system was developed to read data from the EX-8000, an integrated instrumentation system that controls all cryogenic components. The monitoring system uses the I/O control tools of the EPICS software for TCP/IP, archiving techniques based on a relational database, and a simple human-computer interface. Using this monitoring system, it is possible to remotely monitor all real-time data of the superconducting magnets and cryogenic systems. It is also convenient to share data among multiple groups.
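
    The EPICS-to-relational-database flow might look like the following sketch using pyepics and SQLite; the process-variable names are invented, the EX-8000 gateway details are site-specific, and a production system would use channel-access monitors rather than polling:

```python
import sqlite3
import time

from epics import caget  # pyepics Channel Access client

# Hypothetical process-variable names; actual SuperKEKB PV naming differs.
PVS = ["CRYO:QCSL:HE_LEVEL", "CRYO:QCSR:TEMP_4K", "CRYO:SOL:PRESSURE"]

db = sqlite3.connect("cryo_archive.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, pv TEXT, value REAL)")

# Poll each PV and archive the reading into the relational store.
for _ in range(3):
    now = time.time()
    for pv in PVS:
        value = caget(pv, timeout=2.0)
        if value is not None:
            db.execute(
                "INSERT INTO readings VALUES (?, ?, ?)", (now, pv, float(value))
            )
    db.commit()
    time.sleep(10)
```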

  3. The Genomes OnLine Database (GOLD) v.5: a metadata management system based on a four level (meta)genome project classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Tatiparthi B. K.; Thomas, Alex D.; Stamatis, Dimitri

    The Genomes OnLine Database (GOLD; http://www.genomesonline.org) is a comprehensive online resource to catalog and monitor genetic studies worldwide. GOLD provides up-to-date status on complete and ongoing sequencing projects along with a broad array of curated metadata. Within this paper, we report version 5 (v.5) of the database. The newly designed database schema and web user interface support several new features including the implementation of a four level (meta)genome project classification system and a simplified intuitive web interface to access reports and launch search tools. The database currently hosts information for about 19 200 studies, 56 000 Biosamples, 56 000 sequencing projects and 39 400 analysis projects. More than just a catalog of worldwide genome projects, GOLD is a manually curated, quality-controlled metadata warehouse. The problems encountered in integrating disparate and varying quality data into GOLD are briefly highlighted. Lastly, GOLD fully supports and follows the Genomic Standards Consortium (GSC) Minimum Information standards.

  4. Comprehensive Environmental Informatics System (CEIS) Integrating Crew and Vehicle Environmental Health

    NASA Technical Reports Server (NTRS)

    Nall, Mark E.

    2006-01-01

    Integrated Vehicle Health Management (IVHM) systems have been pursued as highly integrated systems that include smart sensors and diagnostic and prognostic software for assessments of real-time and life-cycle vehicle health information. Inclusive to such a system is the requirement to monitor the environmental health within the vehicle and the occupants of the vehicle. In this regard, an enterprise approach to informatics is used to develop a methodology entitled the Comprehensive Environmental Informatics System (CEIS). The hardware and software technologies integrated into this system will be embedded in the vehicle subsystems and maintenance operations to provide both real-time and life-cycle health information of the environment within the vehicle cabin and of its occupants. This comprehensive information database will enable informed decision making and logistics management. One key element of the CEIS is interoperability for data acquisition and archiving between environment and human system monitoring. With comprehensive components, the data acquired in this system will use model-based reasoning systems for subsystem and system level managers, and advanced on-board and ground-based mission and maintenance planners to assess system functionality. Knowledge databases of the vehicle health state will be continuously updated and reported for critical failure modes, and routinely updated and reported for life-cycle condition trending. Sufficient intelligence, including evidence-based engineering practices which are analogous to evidence-based medicine practices, will be included in the CEIS to enable more rapid recognition of off-nominal operation and quicker corrective actions. This will result from better information (rather than just data) for improved crew/operator situational awareness, which will produce significant vehicle and crew safety improvements, as well as increasing the chance of mission success and supporting future mission planning and training. Other benefits include improved reliability, increased operational safety, and reduced cost of operations. The cost benefits stem from significantly reduced processing and operations manpower and predictive maintenance for systems and subjects. The improvements in vehicle functionality and cost will result from increased prognostic and diagnostic capability due to the detailed total human exploration system health knowledge from the CEIS. A collateral benefit is that there will be closer observation of the vehicle occupants, as wrist-watch-sized devices are worn for continuous health monitoring. Additional database acquisition will stem from activities in countermeasure practices to ensure peak performance capability by occupants of the vehicle. The CEIS will provide data from advanced sensing technologies and informatics modeling which will be useful in problem troubleshooting and in improving NASA's awareness of systems during operation.

  5. Model-data integration for developing the Cropland Carbon Monitoring System (CCMS)

    NASA Astrophysics Data System (ADS)

    Jones, C. D.; Bandaru, V.; Pnvr, K.; Jin, H.; Reddy, A.; Sahajpal, R.; Sedano, F.; Skakun, S.; Wagle, P.; Gowda, P. H.; Hurtt, G. C.; Izaurralde, R. C.

    2017-12-01

    The Cropland Carbon Monitoring System (CCMS) has been initiated to improve regional estimates of carbon fluxes from croplands in the conterminous United States through integration of terrestrial ecosystem modeling, use of remote-sensing products and publicly available datasets, and development of improved landscape and management databases. In order to develop these improved carbon flux estimates, experimental datasets are essential for evaluating the skill of estimates, characterizing the uncertainty of these estimates, characterizing parameter sensitivities, and calibrating specific modeling components. Experiments were sought that included flux tower measurement of CO2 fluxes under production of major agronomic crops. Currently data have been collected from 17 experiments comprising 117 site-years from 12 unique locations. Calibration of terrestrial ecosystem model parameters using available crop productivity and net ecosystem exchange (NEE) measurements resulted in improvements in RMSE of NEE predictions of between 3.78% and 7.67%, while improvements in RMSE for yield ranged from -1.85% to 14.79%. Model sensitivities were dominated by parameters related to leaf area index (LAI) and spring growth, demonstrating considerable capacity for model improvement through development and integration of remote-sensing products. Subsequent analyses will assess the impact of such integrated approaches on the skill of cropland carbon flux estimates.

  6. Scaling up health knowledge at European level requires sharing integrated data: an approach for collection of database specification.

    PubMed

    Menditto, Enrica; Bolufer De Gea, Angela; Cahir, Caitriona; Marengoni, Alessandra; Riegler, Salvatore; Fico, Giuseppe; Costa, Elisio; Monaco, Alessandro; Pecorelli, Sergio; Pani, Luca; Prados-Torres, Alexandra

    2016-01-01

    Computerized health care databases have been widely described as an excellent opportunity for research. The availability of "big data" has brought about a wave of innovation in projects conducting health services research. Most of the available secondary data sources are restricted to the geographical scope of a given country and present heterogeneous structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on "adherence to prescription and medical plans" identified the use of observational and large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners with the aim of improving data sharing at a European level. A total of six databases belonging to three different European countries (Spain, the Republic of Ireland, and Italy) were included in the analysis. Preliminary results suggest that there are some similarities. However, these results should be applied in different contexts and European countries, supporting the idea that large European studies should be designed in order to get the most out of already available databases.

  7. The NIFFTE Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Qu, Hai; Niffte Collaboration

    2011-10-01

    The Neutron Induced Fission Fragment Tracking Experiment (NIFFTE) will employ a novel, high granularity, pressurized Time Projection Chamber to measure fission cross-sections of the major actinides to high precision over a wide incident neutron energy range. These results will improve nuclear data accuracy and benefit the fuel cycle in the future. The NIFFTE data acquisition system (DAQ) has been designed and implemented on the prototype TPC. Lessons learned from engineering runs have been incorporated into some design changes that are being implemented before the next run cycle. A fully instrumented sextant of EtherDAQ cards (16 sectors, 496 channels) will be used for the next run cycle. The Maximum Integrated Data Acquisition System (MIDAS) has been chosen and customized to configure and run the experiment. It also meets the requirement for remote control and monitoring of the system. The integration of the MIDAS online database with the persistent PostgreSQL database has been implemented for experiment usage. The detailed design and current status of the DAQ system will be presented.

  8. CMS users data management service integration and first experiences with its NoSQL data storage

    NASA Astrophysics Data System (ADS)

    Riahi, H.; Spiga, D.; Boccali, T.; Ciangottini, D.; Cinquilli, M.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Santocchia, A.

    2014-06-01

    The distributed data analysis workflow in CMS assumes that jobs run in a different location to where their results are finally stored. Typically the user outputs must be transferred from one site to another by a dedicated CMS service, AsyncStageOut. This new service was originally developed to address the inefficient use of CMS computing resources that occurs when analysis job outputs are transferred synchronously from the execution node to the remote site as soon as they are produced. The AsyncStageOut is designed as a thin application relying only on a NoSQL database (CouchDB) for input and data storage. It has progressed from a limited prototype to a highly adaptable service which manages and monitors all the user file steps, namely file transfer and publication. The AsyncStageOut is integrated with the Common CMS/ATLAS Analysis Framework. It is expected to manage nearly 200k user files per day from close to 1000 individual users per month with minimal delays, while providing real-time monitoring and reports to users and service operators and remaining highly available. The associated data volume represents a new set of challenges in the areas of database scalability and service performance and efficiency. In this paper, we present an overview of the AsyncStageOut model and the integration strategy with the Common Analysis Framework. The motivations for using the NoSQL technology are also presented, as well as the data design and the techniques used for efficient indexing and monitoring of the data. We describe the deployment model for the high availability and scalability of the service. We also discuss the hardware requirements and the results achieved, as they were determined by testing with actual data and realistic loads during the commissioning and the initial production phase with the Common Analysis Framework.
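
    Because CouchDB exposes a plain HTTP/JSON interface, a transfer task can be modeled as a document whose state field advances as the file moves through the system. A sketch with an invented document shape, not AsyncStageOut's actual schema:

```python
import requests

# CouchDB speaks plain HTTP/JSON; the database name and document fields
# here are illustrative.
COUCH = "http://localhost:5984/asyncstageout"

task = {
    "_id": "transfer-user42-file_001",
    "state": "new",  # new -> acquired -> done/failed
    "source_lfn": "/store/temp/user/file_001.root",
    "destination": "T2_IT_Pisa",
    "user": "user42",
}

# Create the transfer document, then advance its state after the transfer.
resp = requests.put(f"{COUCH}/{task['_id']}", json=task)
rev = resp.json()["rev"]

task["_rev"] = rev  # CouchDB requires the current revision on update
task["state"] = "acquired"
requests.put(f"{COUCH}/{task['_id']}", json=task)
```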

  9. Recent advances in the Lesser Antilles observatories Part 2 : WebObs - an integrated web-based system for monitoring and networks management

    NASA Astrophysics Data System (ADS)

    Beauducel, François; Bosson, Alexis; Randriamora, Frédéric; Anténor-Habazac, Christian; Lemarchand, Arnaud; Saurel, Jean-Marie; Nercessian, Alexandre; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

    2010-05-01

    Seismological and volcanological observatories have common needs, and often common practical problems, for multidisciplinary data monitoring applications. Access to integrated data in real time and estimation of measurement uncertainties are key to efficient interpretation, but the variety of instruments and the heterogeneity of data sampling and acquisition systems lead to difficulties that may hinder crisis management. At the Guadeloupe observatory, we have developed over the last years an operational system that attempts to address these issues in the context of a multi-instrumental observatory. Based on a single computer server, open source scripts (Matlab, Perl, Bash, Nagios) and a Web interface, the system offers: an extended database for the management of networks, stations and sensors (maps, station files with log history, technical characteristics, metadata, photos and associated documents); web-form interfaces for manual data input/editing and export (such as geochemical analyses and some of the deformation measurements); routine data processing with dedicated automatic scripts for each technique, production of validated data outputs, static graphs on preset moving time intervals, and optional e-mail alarms; and automatic status checks of computers, acquisition processes, stations and individual sensors using simple criteria (file updates and signal quality), displayed as synthetic pages for technical control. In the special case of seismology, WebObs includes a digital stripchart multichannel continuous seismogram associated with the EarthWorm acquisition chain (see companion paper Part 1), an event classification database, location scripts, automatic shakemaps and a regional catalog with associated hypocenter maps accessed through a user request form. This system provides real-time Internet access for integrated monitoring, has become a strong support for exchanges between scientists and technicians, and is widely open to interdisciplinary real-time modeling. It has been set up at the Martinique observatory, and installation is planned this year at the Montserrat Volcano Observatory. It is also in production at the geomagnetic observatory of Addis Abeba in Ethiopia.

  10. Ground Control Point - Wireless System Network for UAV-based environmental monitoring applications

    NASA Astrophysics Data System (ADS)

    Mejia-Aguilar, Abraham

    2016-04-01

    In recent years, Unmanned Aerial Vehicles (UAVs) have seen widespread civil application, including use for survey and monitoring services in areas such as agriculture, construction and civil engineering, private surveillance and reconnaissance services, and cultural heritage management. Most aerial monitoring services require the integration of information acquired during the flight (such as imagery) with ground-based information (such as GPS data) for improved ground-truth validation. For example, to obtain an accurate 3D model and Digital Elevation Model based on aerial imagery, it is necessary to include ground-based coordinate points, which are normally acquired with surveying methods based on Global Positioning Systems (GPS). However, GPS surveys are very time consuming, and especially for longer time series of monitoring data, repeated GPS surveys are necessary. In order to improve the speed of data collection and integration, this work presents an autonomous system based on Waspmote technology, built on single nodes interlinked in a Wireless Sensor Network (WSN) star topology, for ground-based information collection and later integration with surveying data obtained by UAV. Nodes are designed to be visible from the air and to resist extreme weather conditions with low power consumption. In addition, nodes are equipped with GPS as well as an Inertial Measurement Unit (IMU), accelerometer, temperature and soil moisture sensors, and thus provide significant advantages in a broad range of applications for environmental monitoring. For our purpose, the WSN transmits the environmental data via 3G/GPRS to a database on a regular time basis. This project provides a detailed case study and implementation of a Ground Control Point System Network for UAV-based vegetation monitoring of dry mountain grassland in the Matsch valley, Italy.

  11. Retrovirus Integration Database (RID): a public database for retroviral insertion sites into host genomes.

    PubMed

    Shao, Wei; Shan, Jigui; Kearney, Mary F; Wu, Xiaolin; Maldarelli, Frank; Mellors, John W; Luke, Brian; Coffin, John M; Hughes, Stephen H

    2016-07-04

    The NCI Retrovirus Integration Database is a MySQL-based relational database created for storing and retrieving comprehensive information about retroviral integration sites, primarily, but not exclusively, HIV-1. The database is accessible to the public for submission or extraction of data originating from experiments aimed at collecting information related to retroviral integration sites, including: the site of integration into the host genome, the virus family and subtype, the origin of the sample, gene exons/introns associated with integration, and proviral orientation. Information about the references from which the data were collected is also stored in the database. Tools are built into the website that can be used to map the integration sites to the UCSC genome browser, to plot the integration site patterns on a chromosome, and to display provirus LTRs in their inserted genome sequence. The website is robust, user friendly, and allows users to query the database and analyze the data dynamically. Available at https://rid.ncifcrf.gov or http://home.ncifcrf.gov/hivdrp/resources.htm.
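
    The flavor of such a repository can be sketched with a small relational example; SQLite stands in for MySQL, and the table and column names are hypothetical rather than RID's published schema:

```python
import sqlite3

# Minimal stand-in for the kind of relational schema described above.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE integration_sites (
        site_id INTEGER PRIMARY KEY,
        virus TEXT, chromosome TEXT, position INTEGER,
        gene TEXT, orientation TEXT  -- provirus vs. gene orientation
    )
""")
db.executemany(
    "INSERT INTO integration_sites VALUES (?, ?, ?, ?, ?, ?)",
    [
        (1, "HIV-1", "chr17", 43044295, "BRCA1", "same"),
        (2, "HIV-1", "chr6", 31575565, "TNF", "opposite"),
    ],
)

# Typical query: all HIV-1 integration sites on a chromosome, ordered by
# position, ready for mapping onto a genome-browser track.
for row in db.execute(
    "SELECT position, gene, orientation FROM integration_sites "
    "WHERE virus = ? AND chromosome = ? ORDER BY position",
    ("HIV-1", "chr17"),
):
    print(row)
```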

  12. Ontology based heterogeneous materials database integration and semantic query

    NASA Astrophysics Data System (ADS)

    Zhao, Shuai; Qian, Quan

    2017-10-01

    Materials digital data, high throughput experiments and high throughput computations are regarded as the three key pillars of materials genome initiatives. With the fast growth of materials data, integrating and sharing these data has become urgent and has gradually become a hot topic in materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates a semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be executed using SPARQL. In the experiments, two well-known first-principles computational databases, OQMD and the Materials Project, are used as the integration targets, which shows the feasibility and effectiveness of our method.
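
    The core move, lifting relational rows into an ontology so that one semantic query spans all sources, can be sketched with rdflib; the namespace, mapping, and row values below are illustrative, and the paper's schema extraction does this semi-automatically rather than by hand:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

MAT = Namespace("http://example.org/materials#")
g = Graph()

row = {"id": "entry-001", "formula": "Si", "band_gap_eV": 1.1}  # invented row

subject = URIRef(f"http://example.org/materials/{row['id']}")
g.add((subject, RDF.type, MAT.Material))  # table name -> ontology class
g.add((subject, MAT.formula, Literal(row["formula"])))
g.add((subject, MAT.bandGap, Literal(row["band_gap_eV"])))

# Once several heterogeneous sources populate the same graph, one SPARQL
# query spans them all.
q = """
PREFIX mat: <http://example.org/materials#>
SELECT ?m ?gap WHERE { ?m a mat:Material ; mat:bandGap ?gap . }
"""
for material, gap in g.query(q):
    print(material, gap)
```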

  13. Integrity Constraint Monitoring in Software Development: Proposed Architectures

    NASA Technical Reports Server (NTRS)

    Fernandez, Francisco G.

    1997-01-01

    In the development of complex software systems, designers are required to obtain from many sources and manage vast amounts of knowledge of the system being built, and to communicate this information to personnel with a variety of backgrounds. Knowledge concerning the properties of the system, including the structure of, relationships between, and limitations of the data objects in the system, becomes increasingly more vital as the complexity of the system and the number of knowledge sources increase. Ensuring that violations of these properties do not occur becomes steadily more challenging. One approach toward managing the enforcement of system properties, called context monitoring, uses a centralized repository of integrity constraints and a constraint satisfiability mechanism for dynamic verification of property enforcement during program execution. The focus of this paper is to describe possible software architectures that define a mechanism for dynamically checking the satisfiability of a set of constraints on a program. The next section describes the context monitoring approach in general. Section 3 gives an overview of the work currently being done toward the addition of an integrity constraint satisfiability mechanism to a high-level programming language, SequenceL, and demonstrates how this model is being examined to develop a general software architecture. Section 4 describes possible architectures for a general constraint satisfiability mechanism, as well as an alternative approach that uses embedded database queries in lieu of an external monitor. The paper concludes with a brief summary outlining the current state of the research and future work.
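
    A toy version of the context-monitoring idea: constraints are held in a central registry and re-checked on every state update. This is only a sketch of the general architecture, not the SequenceL mechanism the paper describes:

```python
from typing import Any, Callable, Dict, List, Tuple

Constraint = Callable[[Dict[str, Any]], bool]

class ContextMonitor:
    """Centralized constraint registry with dynamic verification:
    every update to monitored state re-checks all constraints."""

    def __init__(self) -> None:
        self.constraints: List[Tuple[str, Constraint]] = []
        self.state: Dict[str, Any] = {}

    def add_constraint(self, name: str, predicate: Constraint) -> None:
        self.constraints.append((name, predicate))

    def update(self, key: str, value: Any) -> None:
        self.state[key] = value
        for name, predicate in self.constraints:
            if not predicate(self.state):  # dynamic verification step
                raise ValueError(f"integrity constraint violated: {name}")

monitor = ContextMonitor()
monitor.add_constraint(
    "tank level within bounds",
    lambda s: 0 <= s.get("tank_level", 0) <= 100,
)
monitor.update("tank_level", 42)     # passes
# monitor.update("tank_level", 140)  # would raise a violation
```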

  14. Importance of Data Management in a Long-term Biological Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty

    2011-01-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishing of standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.

  15. Importance of Data Management in a Long-Term Biological Monitoring Program

    NASA Astrophysics Data System (ADS)

    Christensen, Sigurd W.; Brandt, Craig C.; McCracken, Mary K.

    2011-06-01

    The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to meeting this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program when an existing relational database was adapted and extended to handle biological data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishing of standards for sampling site names, taxonomic identification, flagging, and other components. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. We also discuss some limitations to our implementation. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.

  16. Functional integration of automated system databases by means of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Dubovoi, Volodymyr M.; Nikitenko, Olena D.; Kalimoldayev, Maksat; Kotyra, Andrzej; Gromaszek, Konrad; Iskakova, Aigul

    2017-08-01

    The paper presents approaches for the functional integration of automated system databases by means of artificial intelligence. The peculiarities of exploiting the databases in systems that use fuzzy implementations of functions are analyzed. Requirements for the normalization of such databases are defined. The question of data equivalence under conditions of uncertainty, and of collisions arising during functional integration of the databases, is considered, and a model to reveal their possible occurrence is devised. The paper also presents a method for evaluating the normalization of the integrated database.

  17. Earthquake forecasting studies using radon time series data in Taiwan

    NASA Astrophysics Data System (ADS)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data, interpreted as geochemical precursory signals of impending earthquakes, and radon has been identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous time series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that supports our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP) to create a website that effectively displays and helps us manage the real-time database.
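
    The station-side screening such a real-time database might perform can be illustrated briefly. The authors work in R; the sketch below shows the same kind of rolling-baseline test in Python, with the window length, threshold, and example readings all invented for the illustration.

        # Flag radon readings that deviate from a trailing baseline by more
        # than k standard deviations; window, k, and data are illustrative.
        import pandas as pd

        def flag_radon_anomalies(series: pd.Series, window: int = 144, k: float = 3.0):
            past = series.shift(1)  # exclude the current reading from its own baseline
            baseline = past.rolling(window, min_periods=window // 2).mean()
            spread = past.rolling(window, min_periods=window // 2).std()
            return (series - baseline).abs() > k * spread

        readings = pd.Series(
            [12.1, 12.3, 11.9, 12.0, 30.5, 12.2, 12.1],
            index=pd.date_range("2016-01-01", periods=7, freq="h"),
            name="radon_signal",
        )
        print(readings[flag_radon_anomalies(readings, window=4)])  # flags the 30.5 spike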

  18. Much ado about SEA/SA monitoring: The performance of English Regional Spatial Strategies, and some German comparisons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanusch, Marie; Glasson, John

    Strategic Environmental Assessment (SEA) seeks to better integrate environmental considerations into the preparation and decision-making process of plans and programmes with a view to promoting sustainable development. Further to application of the European Directive 2001/42/EC (SEA Directive) in 2004, the body of practical SEA experience, and parallel research, has increased steadily. Yet there is a crucial element of SEA which cannot build on much experience but whose importance will grow over time - namely that of SEA monitoring. The paper explores the application of SEA monitoring for English Regional Spatial Strategies (RSSs). It briefly introduces the role of SEA monitoring and its legal requirements, the English approach of integrating SEA into Sustainability Appraisal (SA) and the nature of the current English Regional Planning context. The main part presents the research findings and discusses how practitioners cope with the challenges of SEA/SA monitoring - with guiding questions: why, what, who, how, when, and with what outcomes? Reflecting that monitoring is just about to start, the paper draws on measures envisaged for monitoring in the SA reports prepared for RSS, and on expert interviews. It identifies monitoring trends and highlights workable approaches as well as shortcomings. For a critical reflection the findings are mirrored briefly with SEA monitoring approaches of German Regional Plans. Although it is still early days for such monitoring, the findings indicate that there is a danger that some of the specific requirements and objectives of SEA/SA monitoring are not fully met, mainly due to insufficient databases, inappropriate institutional conditions and limited personnel and financial resources. Some recommendations are offered in conclusion.

  19. Integrating SAR with Optical and Thermal Remote Sensing for Operational Near Real-Time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.

    2013-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, like the Alaska Volcano Observatory (AVO), nowadays rely heavily on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above-mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows for rapid and seamless data access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced. These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams and models to create near real-time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the volcanoes Okmok and Augustine, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.
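
    To make the change-detection step concrete, the snippet below applies the textbook log-ratio test between two SAR intensity images. This is a generic illustration of the technique, not the actual AVO/ASF production pipeline, and the threshold and sample values are invented.

        # Generic SAR change-detection kernel: the log-ratio of backscatter
        # intensity between two acquisitions, thresholded to a change mask.
        import numpy as np

        def log_ratio_change(before: np.ndarray, after: np.ndarray, thresh_db: float = 3.0):
            eps = 1e-10                               # avoid log of zero
            ratio_db = 10.0 * np.log10((after + eps) / (before + eps))
            return np.abs(ratio_db) > thresh_db       # True where backscatter changed

        before = np.array([[1.0, 1.1], [0.9, 1.0]])   # pre-eruption intensity
        after = np.array([[1.0, 1.0], [0.9, 8.0]])    # fresh deposits brighten one cell
        print(log_ratio_change(before, after))        # flags the lower-right pixel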

  20. TPS In-Flight Health Monitoring Project Progress Report

    NASA Technical Reports Server (NTRS)

    Kostyk, Chris; Richards, Lance; Hudston, Larry; Prosser, William

    2007-01-01

    Progress in the development of new thermal protection systems (TPS) is reported. New approaches use lightweight, sensitive, fiber optic strain and temperature sensors embedded within the TPS. Goals of the program are to develop and demonstrate a prototype TPS health monitoring system, develop a thermal-based damage detection algorithm, characterize limits of sensor/system performance, and develop a methodology transferable to new designs of TPS health monitoring systems. Tasks completed during the project helped establish confidence in the understanding of both the test setup and the model, and validated system/sensor performance in a simple TPS structure. Other progress included completion of initial system testing, commencement of the algorithm development effort, generation of a database of damaged thermal response characteristics, initial development of a test plan for integration testing of proven FBG sensors in a simple TPS structure, and development of partnerships to apply the technology.

  1. Using Commercially available Tools for multi-faceted health assessment: Data Integration Lessons Learned

    PubMed Central

    Wilamowska, Katarzyna; Le, Thai; Demiris, George; Thompson, Hilaire

    2013-01-01

    Health monitoring data collected from multiple available intake devices provide a rich resource to support older adult health and wellness. Though large amounts of data can be collected, there is currently a lack of understanding of how to integrate these various data sources using commercially available products. This article describes an inexpensive approach to integrating data from multiple sources from a recently completed pilot project that assessed older adult wellness, and demonstrates challenges and benefits in pursuing data integration using commercially available products. The data in this project were sourced from a) electronically captured participant intake surveys, and existing commercial software output for b) vital signs and c) cognitive function. All the software used for data integration in this project was freeware and was chosen for its ease of comprehension by novice database users. The methods and results of this approach provide a model for researchers with similar data integration needs to replicate this effort easily at a low cost. PMID:23728444

  2. Multisource Data-Based Integrated Agricultural Drought Monitoring in the Huai River Basin, China

    NASA Astrophysics Data System (ADS)

    Sun, Peng; Zhang, Qiang; Wen, Qingzhi; Singh, Vijay P.; Shi, Peijun

    2017-10-01

    Drought monitoring is critical for early warning of drought hazard. This study attempted to develop an integrated remote sensing drought monitoring index (IRSDI), based on meteorological data for 2003-2013 from 40 meteorological stations and soil moisture data from 16 observatory stations, as well as Moderate Resolution Imaging Spectroradiometer data, using a linear trend detection method and the standardized precipitation evapotranspiration index. The objective was to investigate drought conditions across the Huai River basin in both space and time. Results indicate that (1) the proposed IRSDI monitors and describes drought conditions across the Huai River basin reasonably well in both space and time; (2) droughts and severe droughts occur most frequently during April-May and July-September. The northeastern and eastern parts of the Huai River basin are dominated by frequent droughts and intensified drought events. These regions are dominated by dry croplands, grasslands, and a highly dense population and are hence more sensitive to drought hazards; (3) intensified droughts are detected during almost all months except January, August, October, and December. In addition, significant intensification of droughts is discerned mainly in the eastern and western Huai River basin. The duration and extent of intensified drought events will challenge water resources management in view of agricultural and other activities in these regions in a changing climate.
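
    The abstract does not give the IRSDI formula itself. Purely as an illustration of what an integrated drought index can look like, the sketch below blends normalized drought indicators with fixed weights; the indicator names and weights are assumptions, not the paper's actual construction.

        # Illustrative integrated drought index: a weighted linear blend of
        # normalized indicators (all names and weights invented for the example).
        def integrated_drought_index(indicators: dict, weights: dict) -> float:
            assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
            return sum(weights[k] * indicators[k] for k in weights)

        pixel = {"spei": -1.2, "soil_moisture_anom": -0.8, "vegetation_anom": -0.5}
        w = {"spei": 0.4, "soil_moisture_anom": 0.35, "vegetation_anom": 0.25}
        print(round(integrated_drought_index(pixel, w), 3))  # more negative = drier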

  3. Assuring image authenticity within a data grid using lossless digital signature embedding and a HIPAA-compliant auditing system

    NASA Astrophysics Data System (ADS)

    Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.

    2008-03-01

    A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory, USC, to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and the public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid access points, there is still no method to guarantee that sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure involves a private 64-byte signature that is embedded into each original DICOM image volume, whereby on the receiving end the signature can be extracted and verified following the DICOM transmission. This digital signature method has also been developed at the IPILab. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS system federates the logs of transmission and authentication events at each grid access point and stores them in a HIPAA-compliant database. The auditing toolkit is installed at the local grid access point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the H-CAS centralized database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without loss of image quality.
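
    A heavily simplified stand-in for that sign-then-verify workflow is sketched below: hash the image volume's pixel bytes, sign the digest before transmission, and verify on receipt. The real IPILab procedure embeds the signature losslessly inside the 3-D volume itself; here the signature merely travels alongside the data, and the key and byte strings are placeholders.

        # Sign-then-verify over pixel data, simplified: the receiver recomputes
        # the signature and compares; any in-transit alteration breaks the match.
        import hashlib, hmac

        SECRET = b"shared-grid-key"                  # placeholder key material

        def sign_volume(pixel_bytes: bytes) -> bytes:
            return hmac.new(SECRET, hashlib.sha256(pixel_bytes).digest(),
                            hashlib.sha256).digest()

        def verify_volume(pixel_bytes: bytes, signature: bytes) -> bool:
            return hmac.compare_digest(sign_volume(pixel_bytes), signature)

        volume = b"\x00\x01\x02\x03"                 # stand-in for DICOM pixel data
        sig = sign_volume(volume)
        print(verify_volume(volume, sig))            # True: intact
        print(verify_volume(volume + b"\xff", sig))  # False: altered in transit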

  4. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    NASA Astrophysics Data System (ADS)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based platform for a smart city system. The built-in database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes that the various applications to be built can share. The method used in this study is to choose an appropriate database logical structure (patterns of data) and to build the relational database model (database design), then test the resulting design with some prototype apps and analyze system performance with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help the admin, manager, and operator manage the application easily and efficiently. The Android-based app is built on a dynamic client-server model where data are extracted from an external MySQL database, so if data change in the database, the data in the Android applications also change. The app assists users in searching for information related to Yogyakarta (as a smart city), especially culture, government, hotels, and transportation.
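
    A toy version of the integration-database idea can be shown in a few lines: one shared schema serves several client applications, and a change committed by any one of them is immediately visible to the rest. The table and column names below are invented for the example (the project itself used MySQL behind an Android client), and SQLite stands in for the server database.

        # One shared schema, many client apps: any committed change is visible
        # to every application reading the same integration database.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE place (
            place_id   INTEGER PRIMARY KEY,
            name       TEXT NOT NULL,
            category   TEXT NOT NULL CHECK (category IN
                       ('culture', 'government', 'hotel', 'transportation')),
            latitude   REAL,
            longitude  REAL
        );
        CREATE TABLE app_usage (                      -- which client app read what
            app_name   TEXT NOT NULL,
            place_id   INTEGER REFERENCES place(place_id),
            viewed_at  TEXT DEFAULT CURRENT_TIMESTAMP
        );
        """)
        conn.execute("INSERT INTO place VALUES (1, 'Kraton Palace', 'culture', -7.805, 110.364)")
        for row in conn.execute("SELECT name, category FROM place"):
            print(row)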

  5. [Development of fixed-base full task space flight training simulator].

    PubMed

    Xue, Liang; Chen, Shan-quang; Chang, Tian-chun; Yang, Hong; Chao, Jian-gang; Li, Zhi-peng

    2003-01-01

    The fixed-base full task flight training simulator is a critical and important integrated training facility. It is mostly used for training integrated skills and tasks, such as running the flight program of manned space flight, dealing with faults, operating and controlling spacecraft flight, and communicating information between spacecraft and ground. The simulator was made up of several subsystems, including spacecraft simulation, simulated cabin, visual image, acoustics, main control computer, instructor station and assistant support. It implemented many simulation functions, such as spacecraft environment, spacecraft movement, information communication between spacecraft and ground, typical faults, manual control and operation training, training control, training monitoring, training database management, training data recording, system detection and so on.

  6. Estimation of Planetary Wave Parameters from the Data of the 1981 Ocean Acoustic Tomography Experiment.

    DTIC Science & Technology

    1985-10-01

    can monitor a larger region and provide a larger database with fewer moorings, and its averaging (integrating) process can filter out undesirable small... as the eikonal equation, relating the phase φ to the perturbed sound-speed field c̄ + δc and the flow field v during a transmission by (∇φ)² = (c* − v·∇φ)²/(c̄ + δc)²... should consult Spiesberger et al. (1980) for ray identifications. Ugincius (1970) solved the eikonal equation using the method of characteristics

  7. User Centric Job Monitoring - a redesign and novel approach in the STAR experiment

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Zulkarneeva, Y.

    2014-06-01

    User Centric Monitoring (UCM) has been a long awaited feature in STAR, whereby programs, workflows and system "events" can be logged, broadcast and later analyzed. UCM allows available job monitoring information to be collected and filtered from various resources and presented to users in a user-centric view rather than an administrative-centric point of view. The first implementation of a UCM approach was made in STAR in 2004 using a log4cxx plug-in back-end; it then evolved with an attempt to push toward a scalable database back-end (2006) and finally a Web-Service approach (2010, CSW4DB SBIR). The latter proved to be incomplete, failing to address the evolving needs of the experiment, where streamlined messages for online (data acquisition) purposes as well as continuous support for data mining and event analysis need to coexist and be unified in a seamless approach. The code also proved hard to maintain. This paper presents the next evolutionary step of the UCM toolkit, a redesign and redirection of our latest attempt that acknowledges and integrates recent technologies in a simpler, maintainable and yet scalable manner. The extended version of the job logging package is built upon a three-tier approach based on Task, Job and Event, and features a Web-Service based logging API, a responsive AJAX-powered user interface, and a database back-end relying on MongoDB, which is uniquely suited to STAR's needs. In addition, we present details of the integration of this logging package with the STAR offline and online software frameworks. Leveraging the reported experience of the ATLAS and CMS experiments with the ESPER engine, we discuss and show how such an approach has been implemented in STAR for meta-data event-triggering stream processing and filtering. An ESPER-based solution seems to fit well into the online data acquisition system, where many systems are monitored.
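
    The Task/Job/Event hierarchy maps naturally onto document collections; the sketch below shows one plausible shape for it with pymongo. The collection and field names are guesses rather than STAR's actual schema, and running it requires a reachable MongoDB instance.

        # Three-tier Task -> Job -> Event logging on MongoDB; a user-centric
        # query walks from one user's task down to its jobs and their events.
        from datetime import datetime, timezone
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")
        db = client["ucm"]

        task_id = db.tasks.insert_one({"user": "alice", "name": "dst2024"}).inserted_id
        job_id = db.jobs.insert_one({"task_id": task_id, "node": "rcf-042"}).inserted_id
        db.events.insert_one({
            "job_id": job_id,
            "level": "INFO",
            "msg": "reconstruction started",
            "ts": datetime.now(timezone.utc),
        })

        # User-centric view: every event belonging to one user's task.
        for job in db.jobs.find({"task_id": task_id}):
            for ev in db.events.find({"job_id": job["_id"]}):
                print(ev["level"], ev["msg"])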

  8. Monitoring Wildlife Interactions with Their Environment: An Interdisciplinary Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles-Smith, Lauren E.; Domnguez, Ignacio X.; Fornaro, Robert J.

    In a rapidly changing world, wildlife ecologists strive to correctly model and predict complex relationships between animals and their environment, which facilitates management decisions impacting public policy to conserve and protect delicate ecosystems. Recent advances in monitoring systems span scientific domains, including animal and weather monitoring devices and landscape classification mapping techniques. The current challenge is how to combine and use detailed output from various sources to address questions spanning multiple disciplines. The WolfScout wildlife and weather tracking system is a software tool capable of filling this niche. WolfScout automates integration of the latest technological advances in wildlife GPS collars, weather stations, drought conditions, severe weather reports, and animal demographic information. The WolfScout database stores a variety of classified landscape maps including natural and manmade features. Additionally, WolfScout's spatial database management system allows users to calculate distances between animals' locations and landscape characteristics, which are linked to the best approximation of environmental conditions at the animal's location during the interaction. Through a secure website, data are exported in formats compatible with multiple software programs including R and ArcGIS. The WolfScout design promotes interoperability of data between researchers and software applications while standardizing analyses of animal interactions with their environment.

  9. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the datasets is made possible by four technologies: cloud computing, relational database processing, support from a NoSQL database, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
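
    One conventional definition of a component reliability index is the capacity-demand margin in standard-deviation units, β = (μ_R − μ_S) / √(σ_R² + σ_S²). Whether the framework in the paper uses exactly this form is an assumption, and the numbers below are invented for the illustration.

        # Reliability index beta: how many combined standard deviations separate
        # mean capacity R from mean demand S (normal, independent R and S assumed).
        import math

        def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
            return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

        # Example: girder capacity vs. live-load demand estimated from sensor streams.
        print(round(reliability_index(mu_r=1200.0, sigma_r=90.0,
                                      mu_s=800.0, sigma_s=120.0), 2))  # -> 2.67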

  10. Scaling up health knowledge at European level requires sharing integrated data: an approach for collection of database specification

    PubMed Central

    Menditto, Enrica; Bolufer De Gea, Angela; Cahir, Caitriona; Marengoni, Alessandra; Riegler, Salvatore; Fico, Giuseppe; Costa, Elisio; Monaco, Alessandro; Pecorelli, Sergio; Pani, Luca; Prados-Torres, Alexandra

    2016-01-01

    Computerized health care databases have been widely described as an excellent opportunity for research. The availability of "big data" has brought about a wave of innovation in health services research projects. Most of the available secondary data sources are restricted to the geographical scope of a given country and present heterogeneous structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on "adherence to prescription and medical plans" identified the use of observational and large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners with the aim of improving data sharing at a European level. A total of six databases belonging to three different European countries (Spain, Republic of Ireland, and Italy) were included in the analysis. Preliminary results suggest that there are some similarities; however, the approach should be applied in further contexts and European countries, supporting the idea that large European studies should be designed in order to get the most out of already available databases. PMID:27358570

  11. MIRO and IRbase: IT Tools for the Epidemiological Monitoring of Insecticide Resistance in Mosquito Disease Vectors

    PubMed Central

    Dialynas, Emmanuel; Topalis, Pantelis; Vontas, John; Louis, Christos

    2009-01-01

    Background Monitoring of insect vector populations with respect to their susceptibility to one or more insecticides is a crucial element of the strategies used for the control of arthropod-borne diseases. This management task can nowadays be achieved more efficiently when assisted by IT (Information Technology) tools, ranging from modern integrated databases to GIS (Geographic Information System). Here we describe an application ontology that we developed de novo, and a specially designed database that, based on this ontology, can be used for the purpose of controlling mosquitoes and, thus, the diseases that they transmit. Methodology/Principal Findings The ontology, named MIRO for Mosquito Insecticide Resistance Ontology, developed using the OBO-Edit software, describes all pertinent aspects of insecticide resistance, including specific methodology and mode of action. MIRO, then, forms the basis for the design and development of a dedicated database, IRbase, constructed using open source software, which can be used to retrieve data on mosquito populations in a temporally and spatially separate way, as well as to map the output using a Google Earth interface. The dependency of the database on the MIRO allows for a rational and efficient hierarchical search possibility. Conclusions/Significance The fact that the MIRO complies with the rules set forward by the OBO (Open Biomedical Ontologies) Foundry introduces cross-referencing with other biomedical ontologies and, thus, both MIRO and IRbase are suitable as parts of future comprehensive surveillance tools and decision support systems that will be used for the control of vector-borne diseases. MIRO is downloadable from and IRbase is accessible at VectorBase, the NIAID-sponsored open access database for arthropod vectors of disease. PMID:19547750

  12. Distribution Grid Integration Unit Cost Database | Solar Research | NREL

    Science.gov Websites

    NREL's Distribution Grid Integration Unit Cost Database contains unit cost information for different components that may be used to address distribution-grid impacts associated with PV. It includes information from the California utility unit cost guides on traditional equipment.

  13. Integrated knowledge-based tools for documenting and monitoring damages to built heritage

    NASA Astrophysics Data System (ADS)

    Cacciotti, R.

    2015-08-01

    The advancements of information technologies as applied to the most diverse fields of science define a breakthrough in the accessibility and processing of data for both expert and non-expert users. Nowadays there is an increasingly relevant research effort in domains, such as cultural heritage protection, in which knowledge mapping and sharing constitute critical prerequisites for accomplishing complex professional tasks. The aim of this paper is to outline the main results and outputs of the MONDIS research project. This project focusses on the development of integrated knowledge-based tools grounded on an ontological representation of the field of heritage conservation. The goal is to overcome the limitations of earlier databases by applying modern semantic technologies able to integrate, organize and process useful information concerning damages to built heritage objects. In particular, MONDIS addresses the need to support a diverse range of stakeholders (e.g. administrators, owners and professionals) in documenting and monitoring damages to historical constructions and in finding related remedies. The paper concentrates on the presentation of the following integrated knowledge-based components developed within the project: (I) MONDIS mobile application (plus desktop version), (II) MONDIS record explorer, (III) Ontomind profiles, (IV) knowledge matrix and (V) terminology editor. An example of practical application of the MONDIS integrated system is also provided and discussed.

  14. Individual based, long term monitoring of acacia trees in hyper arid zone: Integration of a field survey and a remote sensing approach

    NASA Astrophysics Data System (ADS)

    Isaacson, Sivan; Blumberg, Dan G.; Ginat, Hanan; Shalmon, Benny

    2013-04-01

    Vegetation in hyper-arid zones is inherently very sparse. Monitoring vegetation changes in hyper-arid zones is important because any reduction in vegetation cover in these areas can lead to a considerable reduction in the carrying capacity of the ecological system. This study focuses on the impact of climate fluctuations on the acacia population in the southern Arava valley, Israel. The period of this survey includes a sequence of dry years with no flash floods in most of the plots, ending in two years with vast floods. Arid zone acacia trees play a significant role in the desert ecosystem by moderating the extreme environmental conditions, including radiation, temperature, humidity and precipitation. The trees also provide nutrients for the desert dwellers. Therefore, acacia trees in arid zones are considered 'keystone species', because they have a major influence over both plant and animal species, i.e., biodiversity. Long term monitoring of the acacia tree population in this area can provide insights into long term impacts of climate fluctuations on ecosystems in arid zones. Since 2000, a continuous yearly survey of the three acacia species in seven different plots has been conducted in the southern Arava (established by Shalmon, ecologist of the Israel Nature and Parks Authority). The seven plots represent different ecosystems and hydrological regimes. The yearly population monitoring enabled us to determine the mortality and recruitment rates of the acacia populations as well as the growth rates of individual trees. This survey provides a unique database of acacia population dynamics during a sequence of dry years that ended in a vast flood event during the winter of 2010. A lack of quantitative, nondestructive methods to estimate and monitor the stress status of the acacia trees led us to integrate remote sensing tools (ground and air-based) with conventional field measurements in order to develop long term monitoring of acacia trees in hyper-arid zones. This study includes further work on the development of ground based remote sensing as a new tool to monitor stress indicators as part of long term ecological research. Since acacia trees are long lived, we were able to identify individual trees in satellite images from 1968 (Corona) and expand our monitoring "into the past". Remote sensing expands the spatial and temporal database and is thus a powerful tool for long term monitoring in arid zones, where access is limited and long-term ground data are rare.

  15. Building Capacity for a Long-Term, in-Situ, National-Scale Phenology Monitoring Network: Successes, Challenges and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Browning, D. M.

    2014-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) is a national-scale science and monitoring initiative focused on phenology - the study of seasonal life-cycle events such as leafing, flowering, reproduction, and migration - as a tool to understand the response of biodiversity to environmental variation and change. USA-NPN provides a hierarchical, national monitoring framework that enables other organizations to leverage the capacity of the Network for their own applications - minimizing investment and duplication of effort - while promoting interoperability. Network participants can leverage: (1) Standardized monitoring protocols that have been broadly vetted, tested and published; (2) A centralized National Phenology Database (NPDb) for maintaining, archiving and replicating data, with standard metadata, terms-of-use, web-services, and documentation of QA/QC, plus tools for discovery, visualization and download of raw data and derived data products; and/or (3) A national in-situ, multi-taxa phenological monitoring system, Nature's Notebook, which enables participants to observe and record phenology of plants and animals - based on the protocols and information management system (IMS) described above - via either web or mobile applications. The protocols, NPDb and IMS, and Nature's Notebook represent a hierarchy of opportunities for involvement by a broad range of interested stakeholders, from individuals to agencies. For example, some organizations have adopted (e.g., the National Ecological Observatory Network or NEON) -- or are considering adopting (e.g., the Long-Term Agroecosystems Network or LTAR) -- the USA-NPN standardized protocols, but will develop their own database and IMS with web services to promote sharing of data with the NPDb. Other organizations (e.g., the Inventory and Monitoring Programs of the National Wildlife Refuge System and the National Park Service) have elected to use Nature's Notebook to support their phenological monitoring programs. We highlight the challenges and benefits of integrating phenology monitoring within existing and emerging national monitoring networks, and showcase opportunities that exist when standardized protocols are adopted and implemented to promote data interoperability and sharing.

  16. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system, including the data, logic and presentation tiers. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. The PostgreSQL database employs a spatial extension; this object-relational database was chosen over a purely relational one to tie spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local levels. While the spatial database hinders raster-data processing, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptions (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO 19115 standard. XML-structured information for the SLD and metadata is stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data. The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
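
    A reduced model of that tagging idea is sketched below: every dataset carries spatial, thematic, and temporal reference tags, so "all data for district X and theme Y" becomes a single indexed lookup. The schema and names are illustrative only, and SQLite stands in for the project's PostgreSQL/XML stack.

        # Datasets tagged with spatial/thematic/temporal reference values;
        # retrieval by administrative unit plus theme is a simple join.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE dataset (id INTEGER PRIMARY KEY, title TEXT);
        CREATE TABLE tag (
            dataset_id INTEGER REFERENCES dataset(id),
            kind       TEXT CHECK (kind IN ('spatial', 'thematic', 'temporal')),
            value      TEXT
        );
        """)
        conn.execute("INSERT INTO dataset VALUES (1, 'Salinity survey 2009')")
        conn.executemany("INSERT INTO tag VALUES (?, ?, ?)", [
            (1, "spatial", "Can Tho province"),
            (1, "thematic", "water quality"),
            (1, "temporal", "2009"),
        ])
        query = """
        SELECT d.title FROM dataset d
        JOIN tag s ON s.dataset_id = d.id AND s.kind='spatial'  AND s.value=?
        JOIN tag t ON t.dataset_id = d.id AND t.kind='thematic' AND t.value=?
        """
        print(conn.execute(query, ("Can Tho province", "water quality")).fetchall())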

  17. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    NASA Astrophysics Data System (ADS)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without compromising the autonomy of the underlying databases.

  18. Role of HIS/RIS DICOM interfaces in the integration of imaging into the Department of Veterans Affairs healthcare enterprise

    NASA Astrophysics Data System (ADS)

    Kuzmak, Peter M.; Dayhoff, Ruth E.

    1998-07-01

    The U.S. Department of Veterans Affairs is integrating imaging into the healthcare enterprise using the Digital Imaging and Communication in Medicine (DICOM) standard protocols. Image management is directly integrated into the VistA Hospital Information System (HIS) software and clinical database. Radiology images are acquired via DICOM and are stored directly in the HIS database. Images can be displayed on low-cost clinician's workstations throughout the medical center. High-resolution diagnostic quality multi-monitor VistA workstations with specialized viewing software can be used for reading radiology images. DICOM has played a critical role in the ability to integrate imaging functionality into the healthcare enterprise. Because of its openness, it allows the integration of system components from commercial and non-commercial sources to work together to provide functional, cost-effective solutions (see Figure 1). Two approaches are used to acquire and handle images within the radiology department. At some VA medical centers, DICOM is used to interface a commercial Picture Archiving and Communications System (PACS) to the VistA HIS. At other medical centers, DICOM is used to interface the image-producing modalities directly to the image acquisition and display capabilities of VistA itself. Both of these approaches use a small set of DICOM services implemented by VistA to allow patient and study text data to be transmitted to image-producing modalities and the commercial PACS, and to enable images and study data to be transferred back.

  19. A Multi-Index Integrated Change detection method for updating the National Land Cover Database

    USGS Publications Warehouse

    Jin, Suming; Yang, Limin; Xian, George Z.; Danielson, Patrick; Homer, Collin G.

    2010-01-01

    Land cover change is typically captured by comparing two or more dates of imagery and associating spectral change with true thematic change. A new change detection method, Multi-Index Integrated Change (MIIC), has been developed to capture a full range of land cover disturbance patterns for updating the National Land Cover Database (NLCD). Specific indices typically specialize in identifying only certain types of disturbances; for example, the Normalized Burn Ratio (NBR) has been widely used for monitoring fire disturbance. Recognizing the potential complementary nature of multiple indices, we integrated four indices into one model to more accurately detect true change between two NLCD time periods. The four indices are NBR, Normalized Difference Vegetation Index (NDVI), Change Vector (CV), and a newly developed index called the Relative Change Vector (RCV). The model is designed to provide both change location and change direction (e.g. biomass increase or biomass decrease). The integrated change model has been tested on five image pairs from different regions exhibiting a variety of disturbance types. Compared with a simple change vector method, MIIC can better capture the desired change without introducing additional commission errors. The model is particularly accurate at detecting forest disturbances, such as forest harvest, forest fire, and forest regeneration. Agreement between the initial change areas derived from MIIC and the final retained land cover change areas is showcased for the pilot test sites.
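
    The standard spectral indices behind the model are easy to state: NBR and NDVI are the usual band ratios, and CV is the magnitude of the spectral difference between dates. RCV is the paper's new index, so it is deliberately not sketched here. The band values below are invented.

        # Standard index definitions used as MIIC inputs; differencing an index
        # between two dates localizes disturbance (large drop => likely change).
        import numpy as np

        def ndvi(nir, red):
            return (nir - red) / (nir + red)

        def nbr(nir, swir):
            return (nir - swir) / (nir + swir)

        def change_vector(t1: np.ndarray, t2: np.ndarray):
            """Magnitude of spectral change between two band stacks (bands, y, x)."""
            return np.sqrt(((t2 - t1) ** 2).sum(axis=0))

        nir1, red1 = np.array([0.45]), np.array([0.08])   # healthy forest
        nir2, red2 = np.array([0.20]), np.array([0.15])   # after harvest
        print(ndvi(nir1, red1) - ndvi(nir2, red2))        # large NDVI drop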

  20. Environmental monitoring techniques and wave energy potential assessment: an integrated approach for planning marine energy conversion schemes in the northern Tyrrhenian sea, Italy

    NASA Astrophysics Data System (ADS)

    Scanu, Sergio; Peviani, Maximo; Carli, Filippo Maria; Paladini de Mendoza, Francesco; Piermattei, Viviana; Bonamano, Simone; Marcelli, Marco

    2015-04-01

    This work proposes a multidisciplinary approach in which wave power potential maps are used as a baseline for the application of environmental monitoring techniques identified through the use of a Database for Environmental Monitoring Techniques and Equipment (DEMTE), developed in the frame of the project "Marine Renewables Infrastructure Network for Emerging Energy Technologies" (Marinet - FP7). This approach aims to standardize the monitoring of the marine environment during the installation, operation and decommissioning of Marine Energy Conversion Systems. The database has been obtained through the collection of techniques and instrumentation available among the partners of the consortium, in relation to all marine environmental compartments potentially affected by any impacts. Furthermore, in order to plan marine energy conversion schemes, the wave potential was assessed at regional and local scales using a numerical-modelling downscaling methodology. The regional scale led to the elaboration of the Italian Wave Power Atlas, while the local scale led to the definition of nearshore hot spots useful for planning device installations along the Latium coast. The present work focuses on the application of environmental monitoring techniques identified in the DEMTE at the hotspots derived from the wave potential maps, with particular reference to the biological interactions of the devices and the management of marine space. The obtained results are the basis for the development of standardized procedures aiming at an effective application of marine environmental monitoring techniques during the installation, operation and decommissioning of Marine Energy Conversion Systems. The present work gives a substantial contribution to overcoming non-technological barriers in concession procedures as far as the protection of the marine environment is concerned.

  1. A clinical database management system for improved integration of the Veterans Affairs Hospital Information System.

    PubMed

    Andrews, R D; Beauchamp, C

    1989-12-01

    The Department of Veterans Affairs (VA) Decentralized Hospital Computer Program (DHCP) contains data modules derived from separate ancillary services (e.g., Lab, Pharmacy and Radiology). It is currently difficult to integrate information between the modules. A prototype is being developed aimed at integrating ancillary data by storing clinical data oriented to the patient, so that data from multiple services interact easily. A set of program utilities provides for user-defined functions of decision support, queries, and reports. Information can be used to monitor quality of care by providing feedback in the form of reports and reminders. Initial testing has indicated that the prototype's design and implementation are feasible (in terms of space requirements, speed, and ease of use) in outpatient and inpatient settings. The design, development, and clinical use of this prototype are described.

  2. Integration of population census and water point mapping data-A case study of Cambodia, Liberia and Tanzania.

    PubMed

    Yu, Weiyu; Wardrop, Nicola A; Bain, Robert; Wright, Jim A

    2017-07-01

    Sustainable Development Goal (SDG) 6 has expanded the Millennium Development Goals' focus from improved drinking-water to safely managed water services. This expanded focus, which includes issues such as water quality, requires richer monitoring data and potentially the integration of datasets from different sources. Relevant data sets include water point mapping (WPM) data, the survey of boreholes, wells and other water points, census data and household survey data. This study examined inconsistencies between population census and WPM datasets for Cambodia, Liberia and Tanzania, and identified potential barriers to integrating the two datasets to meet monitoring needs. Published figures on the number of people served per water point were used to convert WPM data to population served by water-source type per area, which was then compared with census reports. For Cambodia and Tanzania, discrepancies with census data suggested incomplete WPM coverage. In Liberia, where the data sets were consistent, WPM-derived data on the functionality, quantity and quality of drinking water were further combined with census area statistics to generate an enhanced drinking-water access measure for protected wells and springs. The process revealed barriers to integrating census and WPM data, including the exclusion of water points not used for drinking by households; the matching of census and WPM source types; temporal mismatches between data sources; data quality issues such as missing or implausible data values; and underlying assumptions about the population served by different water point technologies. However, integration of these two data sets could be used to identify and rectify gaps in WPM coverage. If WPM databases become more complete and the above barriers are addressed, integration could also be used to develop more realistic measures of household drinking-water access for monitoring.
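
    The cross-check described above can be shown in miniature: convert water-point counts to an implied user population via per-technology service norms, then compare with the census count for the same area. The service norms, counts, and tolerance below are illustrative assumptions, not figures from the study.

        # Convert WPM counts to implied population served and compare with census.
        SERVED_PER_POINT = {"borehole": 300, "protected_well": 200, "protected_spring": 150}

        def implied_population(wpm_counts: dict) -> int:
            return sum(SERVED_PER_POINT[t] * n for t, n in wpm_counts.items())

        district_wpm = {"borehole": 12, "protected_well": 30, "protected_spring": 5}
        census_served = 14000                      # census-reported improved-source users

        implied = implied_population(district_wpm)
        print(implied, "implied vs", census_served, "census")
        if implied < 0.8 * census_served:          # arbitrary tolerance for the demo
            print("possible WPM under-coverage in this district")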

  3. Automated monitoring of medical protocols: a secure and distributed architecture.

    PubMed

    Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F

    2003-03-01

    The control of the right application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully, or semi, autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are independent of negotiation processes and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.

  4. Web-based credential monitoring instantly flags health professionals with fraudulent licenses or criminal backgrounds.

    PubMed

    Haddad, Matthew

    2009-01-01

    An alarming number of practicing medical professionals and healthcare staffers across the nation may have criminal backgrounds, jeopardizing the health of hundreds of millions of patients and compromising the integrity of healthcare in this country. An investigation conducted by The Los Angeles Times found that an extraordinary number of nurses in California with criminal backgrounds had been allowed to continue working in healthcare facilities for years--their crimes virtually swept under the rug. This article suggests that continuous monitoring of healthcare credentials can mitigate the potential harm posed by credentialing fraud, recommending 24/7 monitoring in real-time as opposed to once every year or two as is the current practice. This would include verification of provider licenses, Drug Enforcement Administration certification, Office of Inspector General status, and criminal offenses. Automatic and continuous monitoring of licenses and other databases for changes and lapses, and reports on issues that are uncovered, help to prevent harmful acts on the part of healthcare providers with questionable backgrounds.

  5. CSHM: Web-based safety and health monitoring system for construction management.

    PubMed

    Cheung, Sai On; Cheung, Kevin K W; Suen, Henry C H

    2004-01-01

    This paper describes a web-based system for monitoring and assessing construction safety and health performance, entitled the Construction Safety and Health Monitoring (CSHM) system. The design and development of CSHM integrates internet and database systems, with the intent of creating a fully automated safety and health management tool. A list of safety and health performance parameters was devised for the management of safety and health in construction. A conceptual framework of the four key components of CSHM is presented: (a) Web-based Interface (templates); (b) Knowledge Base; (c) Output Data; and (d) Benchmark Group. The combined effect of these components results in a system that enables speedy performance assessment of safety and health activities on construction sites. With the CSHM's built-in functions, important management decisions can theoretically be made and corrective actions can be taken before potential hazards turn into fatal or injurious occupational accidents. As such, the CSHM system will accelerate the monitoring and assessment of safety and health management tasks.

  6. Integrated Geo Hazard Management System in Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can degrade environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing the geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operation needed to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and can be commanded remotely to collect and control data using "cloud" computing. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud" system. This system will later be used as part of the development activities, with the aim of minimizing the frequency of geo-hazards and the risk in the research area.

  7. Can the Millennium Development Goals database be used to measure the effects of globalisation on women's health in Sub-Saharan Africa? A critical analysis.

    PubMed

    Wamala, Sarah; Breman, Anna; Richardson, Matt X; Loewenson, Rene

    2010-03-01

    Africa has had poor returns from integration with world markets in globalisation, has experienced worsening poverty and malnutrition and has high burdens of HIV and communicable disease, with particular burdens on women. It is therefore essential to describe the impact of globalisation on women's health. Indicators such as the Millennium Development Goals (MDGs) are presented as having a major role in measuring this impact, but an assessment of the adequacy of the aggregate national indicators used in monitoring the MDGs for this purpose is lacking. The Millennium Development Goals panel database for 2000 to 2006 was used to investigate the association between globalisation and women's health in Sub-Saharan Africa based on various determinants of health. Of the 148 countries classified as developing countries, 48 were in Sub-Saharan Africa. Results suggest that developing countries are becoming more integrated with world markets through some lowering of trade barriers. At the same time, women's occupational roles are changing, which could affect their health status. However, it is difficult to measure the impact of globalisation on women's health from the MDG database. First, data on trade liberalization are aggregated at the regional level and do not hold any information on individual countries. Second, too few indicators in the MDG database are disaggregated by sex, making it difficult to separate the effects on women from those on men. The MDG database is not adequate to assess the effects of globalisation on women's health in Sub-Saharan Africa. We recommend that researchers aiming to address this research question find other data sources or turn to case studies. We hope that results from this study will stimulate research on globalisation and health using reliable sources.

  8. A global organism detection and monitoring system for non-native species

    USGS Publications Warehouse

    Graham, J.; Newman, G.; Jarnevich, C.; Shory, R.; Stohlgren, T.J.

    2007-01-01

    Harmful invasive non-native species are a significant threat to native species and ecosystems, and the costs associated with non-native species in the United States are estimated at over $120 billion/year. While some local or regional databases exist for some taxonomic groups, there are no effective geographic databases designed to detect and monitor all species of non-native plants, animals, and pathogens. We developed a web-based solution called the Global Organism Detection and Monitoring (GODM) system to provide real-time data from a broad spectrum of users on the distribution and abundance of non-native species, including attributes of their habitats for predictive spatial modeling of current and potential distributions. The four major subsystems of GODM provide dynamic links between the organism data, web pages, spatial data, and modeling capabilities. The core survey database tables for recording invasive species survey data are organized into three categories: "Where, Who & When, and What." Organisms are identified with Taxonomic Serial Numbers from the Integrated Taxonomic Information System. To allow users to immediately see a map of their data combined with other users' data, a custom geographic information system (GIS) Internet solution was required. The GIS solution provides an unprecedented level of flexibility in database access, allowing users to display maps of invasive species distributions or abundances based on various criteria including taxonomic classification (i.e., phylum or division, order, class, family, genus, species, subspecies, and variety), a specific project, a range of dates, and a range of attributes (percent cover, age, height, sex, weight). This is a significant paradigm shift from "map servers" to true Internet-based GIS solutions. The remainder of the system was created with a mix of commercial products, open source software, and custom software. Custom GIS libraries were created where required for processing large datasets, accessing the operating system, and using existing libraries in C++, R, and other languages to develop the tools to track harmful species in space and time. The GODM database and system are crucial for early detection and rapid containment of invasive species.
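
    The "Where, Who & When, What" organization can be reduced to a runnable toy schema, shown below. Field names are guesses rather than GODM's actual layout; the one detail taken from the abstract is that species are keyed by ITIS Taxonomic Serial Numbers, and the TSN value used here is purely illustrative.

        # Toy version of the three core survey-table categories.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE location (          -- Where
            loc_id    INTEGER PRIMARY KEY,
            latitude  REAL, longitude REAL
        );
        CREATE TABLE visit (             -- Who & When
            visit_id  INTEGER PRIMARY KEY,
            loc_id    INTEGER REFERENCES location(loc_id),
            observer  TEXT, visited_on TEXT
        );
        CREATE TABLE observation (       -- What
            visit_id  INTEGER REFERENCES visit(visit_id),
            itis_tsn  INTEGER,           -- Integrated Taxonomic Information System key
            pct_cover REAL
        );
        """)
        conn.execute("INSERT INTO location VALUES (1, 40.57, -105.08)")
        conn.execute("INSERT INTO visit VALUES (1, 1, 'observer01', '2006-07-14')")
        conn.execute("INSERT INTO observation VALUES (1, 36335, 12.5)")  # illustrative TSN
        print(conn.execute("""
            SELECT v.visited_on, o.itis_tsn, o.pct_cover
            FROM observation o JOIN visit v USING (visit_id)
        """).fetchall())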

  9. [A web-based integrated clinical database for laryngeal cancer].

    PubMed

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer that provides an information platform for clinical and fundamental research and meets the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards and Apache+PHP+MySQL technology, incorporating laryngeal cancer specialty characteristics and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system uses clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database forms integrate laryngeal cancer specialty characteristics and tumor genetic information. The database offers comprehensive specialty information, strong expandability, and high technical feasibility, and it conforms to the clinical characteristics of the laryngeal cancer specialty. By applying clinical data standards and structured handling of clinical data, the database can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative. Users can access and manipulate the database conveniently and quickly via the Internet.

  10. Quebec Trophoblastic Disease Registry: how to make an easy-to-use dynamic database.

    PubMed

    Sauthier, Philippe; Breguet, Magali; Rozenholc, Alexandre; Sauthier, Michaël

    2015-05-01

    To create an easy-to-use dynamic database designed specifically for the Quebec Trophoblastic Disease Registry (RMTQ). It is now well established that much of the success in managing trophoblastic diseases comes from the development of national and regional reference centers. Computerized databases allow the optimal use of data stored in these centers. We have created an electronic data registration system by producing a database using FileMaker Pro 12. It uses 11 external tables associated with a unique identification number for each patient. Each table allows specific data to be recorded, incorporating demographics, diagnosis, automated staging, laboratory values, pathological diagnosis, and imaging parameters. From January 1, 2009, to December 31, 2013, we used our database to register 311 patients with 380 diseases and have seen a 39.2% increase in registrations each year between 2009 and 2012. This database allows the automatic generation of semilogarithmic curves of β-hCG values as a function of time, complete with graphic markers for applied treatments (chemotherapy, radiotherapy, or surgery). It also generates a summary sheet that provides a synthetic overview in real time. We have created, at a low cost, an easy-to-use database specific to trophoblastic diseases that dynamically integrates staging and monitoring. We propose a 10-step procedure for a successful trophoblastic database. It improves patient care, research, and education on trophoblastic diseases in Quebec and leads to an opportunity for collaboration on a national Canadian registry.
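
    As an illustration of the semilogarithmic follow-up curves described above, the sketch below plots invented β-hCG values on a logarithmic axis against time, with a dashed marker for a hypothetical treatment start. It is a generic matplotlib rendering for illustration, not the registry's FileMaker implementation.

        import matplotlib.pyplot as plt
        import numpy as np

        # Invented follow-up data: beta-hCG typically falls by orders of
        # magnitude after treatment, hence the log axis.
        days = np.array([0, 7, 14, 21, 28, 42, 56])
        beta_hcg = np.array([120000, 30000, 9000, 2500, 600, 40, 5])  # IU/L

        fig, ax = plt.subplots()
        ax.semilogy(days, beta_hcg, marker="o")
        ax.axvline(14, linestyle="--", label="chemotherapy started")  # treatment marker
        ax.set_xlabel("Days since diagnosis")
        ax.set_ylabel("beta-hCG (IU/L, log scale)")
        ax.legend()
        plt.show()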

  11. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  12. Bio-optical data integration based on a 4D database system approach

    NASA Astrophysics Data System (ADS)

    Imai, N. N.; Shimabukuro, M. H.; Carmo, A. F. C.; Alcantara, E. H.; Rodrigues, T. W. P.; Watanabe, F. S. Y.

    2015-04-01

    Bio-optical characterization of water bodies requires spatio-temporal data about Inherent Optical Properties and Apparent Optical Properties, which allow the comprehension of the underwater light field and support the development of models for monitoring water quality. Measurements are taken to represent optical properties along a column of water, so the spectral data must be related to depth. However, the spatial positions of measurement may differ because collecting instruments vary, and the records may not refer to the same wavelengths. An additional difficulty is that distinct instruments store data in different formats. A data integration approach is needed to make these large, multi-source data sets suitable for analysis, so that semi-empirical models can be evaluated, even automatically, preceded by preliminary quality-control tasks. In this work we present a solution for this scenario based on a spatial (geographic) database approach, adopting an object-relational Database Management System (DBMS) because of its ability to represent all data collected in the field together with data obtained by laboratory analysis and remote sensing images taken at the time of field data collection. This data integration approach leads to a 4D representation, since the coordinate system includes three spatial coordinates (planimetric position and depth) and the time at which each datum was acquired. The PostgreSQL DBMS, extended with the PostGIS module, was adopted to provide the ability to manage spatial/geospatial data. A prototype was developed that provides the main tools an analyst needs to prepare the data sets for analysis.
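
    A minimal sketch of how such a 4D measurement record might be declared with PostgreSQL/PostGIS from Python. The connection string, table, and column names are hypothetical, chosen only to show planimetric coordinates as PostGIS geometry with depth and acquisition time as the third and fourth dimensions; a PostGIS-enabled database is assumed.

        import psycopg2

        # Placeholder connection parameters; assumes the PostGIS extension
        # is installed in the target database.
        conn = psycopg2.connect("dbname=biooptics user=analyst")
        with conn, conn.cursor() as cur:
            cur.execute("""
                CREATE TABLE IF NOT EXISTS radiometric_measurement (
                    id            SERIAL PRIMARY KEY,
                    geom          GEOMETRY(Point, 4326),  -- planimetric position
                    depth_m       DOUBLE PRECISION,       -- third spatial coordinate
                    measured_at   TIMESTAMPTZ,            -- fourth (time) coordinate
                    wavelength_nm DOUBLE PRECISION,
                    value         DOUBLE PRECISION
                );
            """)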

  13. The SARVIEWS Project: Automated SAR Processing in Support of Operational Near Real-time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.

    2016-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its robustness. We will also show the benefit of integrating SAR with data from other sensors to support volcano monitoring. For historic eruptions at Okmok and Augustine volcano, both located in the North Pacific, we will demonstrate that the addition of SAR can lead to a significant improvement in activity detection and eruption forecasting.
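
    To make the change-detection product named above concrete, here is a generic log-ratio detector over two co-registered, synthetic amplitude images. The log-ratio test is a textbook SAR technique shown only for illustration; it is not claimed to be the algorithm SARVIEWS implements.

        import numpy as np

        # Synthetic pre- and post-event amplitude images; the "event" is a
        # simulated bright deposit in the post image.
        pre = np.random.gamma(shape=4.0, scale=25.0, size=(512, 512))
        post = pre.copy()
        post[200:260, 300:380] *= 4.0

        # Classic log-ratio change detector with a simple global threshold.
        log_ratio = np.log(post + 1e-6) - np.log(pre + 1e-6)
        threshold = log_ratio.mean() + 3 * log_ratio.std()
        change_map = log_ratio > threshold      # boolean hazard mask
        print(f"changed pixels: {change_map.sum()}")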

  14. An intelligent service matching method for mechanical equipment condition monitoring using the fibre Bragg grating sensor network

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Zhou, Zude; Liu, Quan; Xu, Wenjun

    2017-02-01

    Due to the advantages of being able to function under harsh environmental conditions and serving as a distributed source of condition information in a networked monitoring system, the fibre Bragg grating (FBG) sensor network has attracted considerable attention for online equipment condition monitoring. To provide an overall view of the condition of mechanical equipment in operation, a networked service-oriented condition monitoring framework based on FBG sensing is proposed, together with an intelligent matching method to support monitoring service management. In the novel framework, three classes of progressive service matching approaches, including service-chain knowledge database service matching, multi-objective constrained service matching, and workflow-driven human-interactive service matching, are developed and integrated with an enhanced particle swarm optimisation (PSO) algorithm as well as a workflow-driven mechanism. Moreover, the manufacturing domain ontology, FBG sensor network structure, and monitoring object are considered to facilitate the automatic matching of condition monitoring services and to overcome the limitations of traditional service processing methods. The experimental results demonstrate that FBG monitoring services can be selected intelligently and that the developed condition monitoring system can be rebuilt rapidly as new equipment joins the framework. The effectiveness of the service matching method is also verified by implementing a prototype system together with its performance analysis.
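
    A minimal PSO sketch in the spirit of the matching step: each particle is a candidate solution vector and the swarm minimises a stand-in cost. The objective, dimensionality, and coefficients are invented for illustration and do not reproduce the paper's enhanced PSO.

        import random

        def cost(x):
            # Stand-in matching objective (e.g., response time plus accuracy loss).
            return sum((xi - 0.7) ** 2 for xi in x)

        DIM, SWARM, ITERS = 5, 20, 100
        particles = [[random.random() for _ in range(DIM)] for _ in range(SWARM)]
        velocities = [[0.0] * DIM for _ in range(SWARM)]
        pbest = [p[:] for p in particles]          # personal bests
        gbest = min(pbest, key=cost)               # global best

        for _ in range(ITERS):
            for i, p in enumerate(particles):
                for d in range(DIM):
                    r1, r2 = random.random(), random.random()
                    velocities[i][d] = (0.7 * velocities[i][d]
                                        + 1.5 * r1 * (pbest[i][d] - p[d])
                                        + 1.5 * r2 * (gbest[d] - p[d]))
                    p[d] += velocities[i][d]
                if cost(p) < cost(pbest[i]):
                    pbest[i] = p[:]
            gbest = min(pbest + [gbest], key=cost)

        print(f"best cost: {cost(gbest):.4f}")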

  15. Internationally coordinated glacier monitoring - a timeline since 1894

    NASA Astrophysics Data System (ADS)

    Nussbaumer, Samuel U.; Armstrong, Richard; Fetterer, Florence; Gärtner-Roer, Isabelle; Hoelzle, Martin; Machguth, Horst; Mölg, Nico; Paul, Frank; Raup, Bruce H.; Zemp, Michael

    2016-04-01

    Changes in glaciers and ice caps provide some of the clearest evidence of climate change, with impacts on sea-level variations, regional hydrological cycles, and natural hazard situations. Therefore, glaciers have been recognized as an Essential Climate Variable (ECV). Internationally coordinated collection and distribution of standardized information about the state and change of glaciers and ice caps was initiated in 1894 and is today organized within the Global Terrestrial Network for Glaciers (GTN-G). GTN-G ensures the continuous development and adaptation of the international strategies to the long-term needs of users in science and policy. A GTN-G Steering Committee coordinates, supports, and advises the operational bodies responsible for international glacier monitoring, which are the World Glacier Monitoring Service (WGMS), the US National Snow and Ice Data Center (NSIDC), and the Global Land Ice Measurements from Space (GLIMS) initiative. In this presentation, we trace the development of the internationally coordinated glacier monitoring since its beginning in the 19th century. Today, several online databases containing a wealth of diverse data types with different levels of detail and global coverage provide fast access to continuously updated information on glacier fluctuation and inventory data. All glacier datasets are made freely available through the respective operational bodies within GTN-G, and can be accessed through the GTN-G Global Glacier Browser (http://www.gtn-g.org/data_browser.html). Glacier inventory data (e.g., digital outlines) are available for about 180,000 glaciers (GLIMS database, RGI - Randolph Glacier Inventory, WGI - World Glacier Inventory). Glacier front variations with about 45,000 entries since the 17th century and about 6,200 glaciological and geodetic mass (volume) change observations dating back to the 19th century are available in the Fluctuations of Glaciers (FoG) database. These datasets reveal clear evidence that glacier retreat and mass loss are global phenomena. Glaciological and geodetic observations show that the rates of 21st-century mass loss are unprecedented on a global scale, for the time period observed, and probably also for recorded history, as indicated in glacier reconstructions from written and illustrated documents. The databases are supplemented by specific index datasets (e.g., glacier thickness data) and a dataset containing information on special events including glacier surges, glacier lake outbursts, ice avalanches, eruptions of ice-clad volcanoes, etc. related to about 200 glaciers. A special database of glacier photographs (GPC - Glacier Photograph Collection) contains more than 15,000 pictures from around 500 glaciers, some of them dating back to the mid-19th century. Current efforts aim to close remaining observational gaps in data from both in-situ measurements and remote sensing, to establish a well-distributed baseline for sound estimates of climate-related glacier changes and their impacts. Within the framework of dedicated capacity building and twinning activities, disrupted long-term mass balance programmes in Central Asia have recently been resumed, and the continuation of mass balance measurements in the Tropical Andes has been supported. New data also emerge from several research projects using NASA and ESA sensors and are actively integrated into the GTN-G databases.
Key tasks for the future include the quantitative assessment of uncertainties of available measurements, and their representativeness for changes in the respective mountain ranges. For this, a well-considered integration of in-situ measurements, remotely sensed observations, and numerical modelling is required.

  16. 49 CFR 384.229 - Skills test examiner auditing and monitoring.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... overt monitoring must be performed at least once every year; (c) Establish and maintain a database to...; (d) Establish and maintain a database of all third party testers and examiners, which at a minimum... examiner; (e) Establish and maintain a database of all State CDL skills examiners, which at a minimum...

  17. 49 CFR 384.229 - Skills test examiner auditing and monitoring.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... overt monitoring must be performed at least once every year; (c) Establish and maintain a database to...; (d) Establish and maintain a database of all third party testers and examiners, which at a minimum... examiner; (e) Establish and maintain a database of all State CDL skills examiners, which at a minimum...

  18. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  19. Demonstrating Change with Astronaut Photography Using Object Based Image Analysis

    NASA Technical Reports Server (NTRS)

    Hollier, Andi; Jagge, Amy

    2017-01-01

    Every day, hundreds of images of Earth flood the Crew Earth Observations database as astronauts use hand held digital cameras to capture spectacular frames from the International Space Station. The variety of resolutions and perspectives provide a template for assessing land cover change over decades. We will focus on urban growth in the second fastest growing city in the nation, Houston, TX, using Object-Based Image Analysis. This research will contribute to the land change science community, integrated resource planning, and monitoring of the rapid rate of urban sprawl.

  20. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies.

    PubMed

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.
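
    The redistribution step at the heart of the PSM can be sketched as a weighted allocation of a regional census total across grid cells. The land-cover classes and density weights below are invented for illustration; the actual model uses natural and socio-economic variables.

        import numpy as np

        census_total = 125_000                      # people in one distribution region

        # Toy 1 km grid of LULC classes: 0 = water, 1 = cropland, 2 = urban.
        lulc = np.array([[2, 2, 1],
                         [1, 1, 0],
                         [1, 0, 0]])
        weights = np.array([0.0, 1.0, 8.0])[lulc]   # relative population density per class

        # Allocate the regional total in proportion to cell weights.
        population = census_total * weights / weights.sum()
        print(population.round())                   # gridded population; sums to the total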

  1. Query Monitoring and Analysis for Database Privacy - A Security Automata Model Approach

    PubMed Central

    Kumar, Anand; Ligatti, Jay; Tu, Yi-Cheng

    2015-01-01

    Privacy and usage restriction issues are important when valuable data are exchanged or acquired by different organizations. Standard access control mechanisms either restrict or completely grant access to valuable data. On the other hand, data obfuscation limits the overall usability and may result in loss of total value. There are no standard policy enforcement mechanisms for data acquired through mutual and copyright agreements. In practice, many different types of policies can be enforced in protecting data privacy. Hence there is a need for a unified framework that encapsulates multiple suites of policies to protect the data. We present our vision of an architecture named security automata model (SAM) to enforce privacy-preserving policies and usage restrictions. SAM analyzes the input queries and their outputs to enforce various policies, liberating data owners from the burden of monitoring data access. SAM allows administrators to specify various policies and enforces them to monitor queries and control the data access. Our goal is to address the problems of data usage control and protection through privacy policies that can be defined, enforced, and integrated with the existing access control mechanisms using SAM. In this paper, we lay out the theoretical foundation of SAM, which is based on automata called Mandatory Result Automata. We also discuss the major challenges of implementing SAM in a real-world database environment as well as ideas to meet such challenges. PMID:26997936

  2. Query Monitoring and Analysis for Database Privacy - A Security Automata Model Approach.

    PubMed

    Kumar, Anand; Ligatti, Jay; Tu, Yi-Cheng

    2015-11-01

    Privacy and usage restriction issues are important when valuable data are exchanged or acquired by different organizations. Standard access control mechanisms either restrict or completely grant access to valuable data. On the other hand, data obfuscation limits the overall usability and may result in loss of total value. There are no standard policy enforcement mechanisms for data acquired through mutual and copyright agreements. In practice, many different types of policies can be enforced in protecting data privacy. Hence there is a need for a unified framework that encapsulates multiple suites of policies to protect the data. We present our vision of an architecture named security automata model (SAM) to enforce privacy-preserving policies and usage restrictions. SAM analyzes the input queries and their outputs to enforce various policies, liberating data owners from the burden of monitoring data access. SAM allows administrators to specify various policies and enforces them to monitor queries and control the data access. Our goal is to address the problems of data usage control and protection through privacy policies that can be defined, enforced, and integrated with the existing access control mechanisms using SAM. In this paper, we lay out the theoretical foundation of SAM, which is based on automata called Mandatory Result Automata. We also discuss the major challenges of implementing SAM in a real-world database environment as well as ideas to meet such challenges.
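
    The two records above describe the same architecture, so one sketch serves both: a toy query monitor whose internal state and policies loosely mimic a security automaton inspecting query outputs before release. The states, policies, and record format are invented; the paper's Mandatory Result Automata are more general.

        # Toy security-automaton-style monitor: state advances as rows are
        # released, and transitions that would violate policy are suppressed.
        class QueryMonitor:
            def __init__(self, max_rows_per_session=100):
                self.rows_released = 0                 # automaton state
                self.max_rows = max_rows_per_session

            def filter_result(self, rows):
                """Release rows only while the session stays within policy."""
                allowed = []
                for row in rows:
                    if self.rows_released >= self.max_rows:
                        break                          # policy: cap total disclosure
                    if row.get("sensitive"):
                        continue                       # policy: drop flagged records
                    allowed.append(row)
                    self.rows_released += 1
                return allowed

        monitor = QueryMonitor(max_rows_per_session=2)
        result = [{"id": 1}, {"id": 2, "sensitive": True}, {"id": 3}, {"id": 4}]
        print(monitor.filter_result(result))  # [{'id': 1}, {'id': 3}]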

  3. Integration of the EventIndex with other ATLAS systems

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Gallas, E. J.; Prokoshin, F.

    2015-12-01

    The ATLAS EventIndex System, developed for use in LHC Run 2, is designed to index every processed event in ATLAS, replacing the TAG System used in Run 1. Its storage infrastructure, based on the Hadoop open-source software framework, necessitates revamping how information in this system relates to other ATLAS systems. It will store more indexes, since the fundamental mechanisms for retrieving these indexes will be better integrated into all stages of data processing, allowing more events from later stages of processing to be indexed than was possible with the previous system. Connections with other systems (conditions database, monitoring) are fundamentally critical to assess dataset completeness, identify data duplication, and check data integrity, and also enhance access to information in the EventIndex through user and system interfaces. This paper gives an overview of the ATLAS systems involved and the relevant metadata, and describes the technologies we are deploying to complete these connections.

  4. Web services for ecosystem services management and poverty alleviation

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Baez, S.; Veliz Rosas, C.

    2011-12-01

    Over the last decades, near real-time environmental observation, technical advances in computer power and cyber-infrastructure, and the development of environmental software algorithms have increased dramatically. The integration of these evolutions is one of the major challenges of the next decade for environmental sciences. Worldwide, many coordinated activities are ongoing to make this integration a reality. However, far less attention is paid to the question of how these developments can benefit environmental services management in a poverty alleviation context. Such projects are typically faced with issues of large predictive uncertainties, limited resources, and limited local scientific capacity. At the same time, the complexity of the socio-economic contexts requires a very strong bottom-up oriented and interdisciplinary approach to environmental data collection and processing. Here, we present the results of two projects on integrated environmental monitoring and scenario analysis aimed at poverty alleviation in the Peruvian Andes and Amazon. In the upper Andean highlands, farmers are monitoring the water cycle of headwater catchments to analyse the impact of land-use changes on stream flow and potential consequences for downstream irrigation. In the Amazon, local communities are monitoring the dynamics of turtle populations and their relations with river levels. In both cases, the use of online databases and web processing services enables real-time analysis of the data and scenario analysis. The system provides both physical and social indicators to assess the impact of land-use management options on local socio-economic development.

  5. Composite use of numerical groundwater flow modeling and geoinformatics techniques for monitoring Indus Basin aquifer, Pakistan.

    PubMed

    Ahmad, Zulfiqar; Ashraf, Arshad; Fryar, Alan; Akhter, Gulraiz

    2011-02-01

    The integration of the Geographic Information System (GIS) with groundwater modeling and satellite remote sensing capabilities has provided an efficient way of analyzing and monitoring groundwater behavior and its associated land conditions. A 3-dimensional finite element model (Feflow) has been used for regional groundwater flow modeling of Upper Chaj Doab in Indus Basin, Pakistan. The approach of using GIS techniques that partially fulfill the data requirements and define the parameters of existing hydrologic models was adopted. The numerical groundwater flow model is developed to configure the groundwater equipotential surface, hydraulic head gradient, and estimation of the groundwater budget of the aquifer. GIS is used for spatial database development, integration with remote sensing, and numerical groundwater flow modeling capabilities. The thematic layers of soils, land use, hydrology, infrastructure, and climate were developed using GIS. The Arcview GIS software is used as an auxiliary tool to develop supportive data for numerical groundwater flow modeling and for the integration and presentation of image processing and modeling results. The groundwater flow model was calibrated to simulate future changes in piezometric heads for the period 2006 to 2020. Different scenarios were developed to study the impact of extreme climatic conditions (drought/flood) and variable groundwater abstraction on the regional groundwater system. The model results indicated a significant response in the water table due to external influencing factors. The developed model provides an effective tool for evaluating better management options for monitoring future groundwater development in the study area.

  6. A Data Management Framework for Real-Time Water Quality Monitoring

    NASA Astrophysics Data System (ADS)

    Mulyono, E.; Yang, D.; Craig, M.

    2007-12-01

    CSU East Bay operates two in-situ, near-real-time water quality monitoring stations in San Francisco Bay as a member of the Center for Integrative Coastal Ocean Observation, Research, and Education (CICORE) and the Central and Northern California Ocean Observing System (CeNCOOS). We have been operating stations at Dumbarton Pier and San Leandro Marina for the past two years. At each station, a sonde measures seven water quality parameters every six minutes. During the first year of operation, we retrieved data from the sondes every few weeks by visiting the sites and uploading data to a handheld logger. Last year we implemented a telemetry system utilizing a cellular CDMA modem to transfer data from the field to our data center on an hourly basis. Data from each station are initially stored in monthly files in native format. We import data from these files into a SQL database every hour. Database access is handled by Django, an open-source web framework. Django provides a user-friendly web user interface (UI) to administer the data. We utilized parts of the Django UI for our database web front-end, which allows users to access our database via the World Wide Web and perform basic queries. We also serve our data to other aggregating sites, including the central CICORE website and NOAA's National Data Buoy Center (NDBC). Since Django is written in Python, it allows us to integrate other Python modules into our software, such as the matplotlib library for scientific graphics. We store our code in a Subversion repository, which keeps track of software revisions. Code is tested using Python's unittest and doctest modules within Django's testing facility, which warns us when our code modifications cause other parts of the software to break. During the past two years of data acquisition, we have incrementally updated our data model to accommodate changes in physical hardware, including equipment moves, instrument replacements, and sensor upgrades that affected data format.
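
    A hypothetical Django models.py sketch for the sonde readings described above (seven parameters per station, every six minutes). All field names and the station name are invented, and a real deployment would define all seven parameters; the file is assumed to live inside a configured Django app.

        from django.db import models

        class Station(models.Model):
            name = models.CharField(max_length=100)        # e.g. "Dumbarton Pier"

        class Reading(models.Model):
            station = models.ForeignKey(Station, on_delete=models.CASCADE)
            measured_at = models.DateTimeField(db_index=True)  # six-minute cadence
            temperature_c = models.FloatField(null=True)
            salinity_psu = models.FloatField(null=True)
            dissolved_oxygen_mg_l = models.FloatField(null=True)
            # ...remaining sonde parameters omitted for brevity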

  7. Web-based biobank system infrastructure monitoring using Python, Perl, and PHP.

    PubMed

    Norling, Martin; Kihara, Absolomon; Kemp, Steve

    2013-12-01

    The establishment and maintenance of biobanks is only as worthwhile as the security and logging of the biobank contents. We have designed a monitoring system that continuously measures temperature and gas content, records the movement of samples in and out of the biobank, and also records the opening and closing of the freezers, storing the results and images in a database. We have also incorporated an early-warning feature that sends out alerts, via SMS and email, to responsible persons if any measurement is recorded outside the acceptable limits, guaranteeing the integrity of biobanked samples as well as reagents used in sample analysis. A surveillance system like this increases the value of any biobank, as the initial investment is small and the value of having trustworthy samples for future research is high.
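
    The early-warning behaviour described above might look like the following sketch: a reading is compared against acceptable limits, and an email alert is sent when it falls outside them. The thresholds, addresses, and SMTP host are placeholders, and the actual system also sends SMS alerts.

        import smtplib
        from email.message import EmailMessage

        # Placeholder acceptable limits, e.g. an ultra-low-temperature freezer.
        LIMITS = {"freezer_temp_c": (-86.0, -70.0)}

        def check_and_alert(sensor, value, smtp_host="localhost"):
            low, high = LIMITS[sensor]
            if low <= value <= high:
                return False                       # within limits, nothing to do
            msg = EmailMessage()
            msg["Subject"] = f"ALERT: {sensor} out of range ({value})"
            msg["From"] = "biobank-monitor@example.org"
            msg["To"] = "responsible-person@example.org"
            msg.set_content(f"{sensor} = {value}; acceptable range is [{low}, {high}].")
            with smtplib.SMTP(smtp_host) as smtp:  # assumes a reachable SMTP server
                smtp.send_message(msg)
            return True

        check_and_alert("freezer_temp_c", -60.0)   # out of range: triggers an alert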

  8. Hanford Environmental Information System (HEIS) Operator's Manual. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreck, R.I.

    1991-10-01

    The Hanford Environmental Information System (HEIS) is a consolidated set of automated resources that effectively manage the data gathered during environmental monitoring and restoration of the Hanford Site. The HEIS includes an integrated database that provides consistent and current data to all users and promotes sharing of data by the entire user community. This manual describes the facilities available to the operational user who is responsible for data entry, processing, scheduling, reporting, and quality assurance. A companion manual, the HEIS User's Manual, describes the facilities available to the scientist, engineer, or manager who uses the system for environmental monitoring, assessment, and restoration planning; and to the regulator who is responsible for reviewing Hanford Site operations against regulatory requirements and guidelines.

  9. Monitoring by Use of Clusters of Sensor-Data Vectors

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    2007-01-01

    The inductive monitoring system (IMS) is a system of computer hardware and software for automated monitoring of the performance, operational condition, physical integrity, and other aspects of the health of a complex engineering system (e.g., an industrial process line or a spacecraft). The input to the IMS consists of streams of digitized readings from sensors in the monitored system. The IMS determines the type and amount of any deviation of the monitored system from a nominal or normal (healthy) condition on the basis of a comparison between (1) vectors constructed from the incoming sensor data and (2) corresponding vectors in a database of nominal or normal behavior. The term inductive reflects the use of a process reminiscent of traditional mathematical induction to learn about normal operation and build the nominal-condition database. The IMS offers two major advantages over prior computational monitoring systems: The computational burden of the IMS is significantly smaller, and there is no need for abnormal-condition sensor data for training the IMS to recognize abnormal conditions. The figure schematically depicts the relationships among the computational processes effected by the IMS. Training sensor data are gathered during normal operation of the monitored system, detailed computational simulation of operation of the monitored system, or both. The training data are formed into vectors that are used to generate the database. The vectors in the database are clustered into regions that represent normal or nominal operation. Once the database has been generated, the IMS compares the vectors of incoming sensor data with vectors representative of the clusters. The monitored system is deemed to be operating normally or abnormally, depending on whether the vector of incoming sensor data is or is not, respectively, sufficiently close to one of the clusters. For this purpose, a distance between two vectors is calculated by a suitable metric (e.g., Euclidean distance) and "sufficiently close" signifies lying at a distance less than a specified threshold value. It must be emphasized that although the IMS is intended to detect off-nominal or abnormal performance or health, it is not necessarily capable of performing a thorough or detailed diagnosis. Limited diagnostic information may be available under some circumstances. For example, the distance of a vector of incoming sensor data from the nearest cluster could serve as an indication of the severity of a malfunction. The identity of the nearest cluster may be a clue as to the identity of the malfunctioning component or subsystem. It is possible to decrease the IMS computation time by use of a combination of cluster-indexing and -retrieval methods. For example, in one method, the distances between each cluster and two or more reference vectors can be used for the purpose of indexing and retrieval. The clusters are sorted into a list according to these distance values, typically in ascending order of distance. When a set of input data arrives and is to be tested, the data are first arranged as an ordered set (that is, a vector). The distances from the input vector to the reference points are computed. The search of clusters from the list can then be limited to those clusters lying within a certain distance range from the input vector; the computation time is reduced by not searching the clusters at a greater distance.
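
    A compact sketch of the monitoring loop described above, under stated assumptions: nominal training vectors are clustered (here with a simple k-means stand-in), and an incoming vector is flagged when its distance to the nearest cluster centroid exceeds a threshold. The cluster count and threshold are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        nominal = rng.normal(0.0, 1.0, size=(500, 4))   # nominal training vectors

        # Basic k-means as a stand-in for the IMS clustering step.
        k = 5
        centroids = nominal[rng.choice(len(nominal), k, replace=False)]
        for _ in range(20):
            labels = np.argmin(
                np.linalg.norm(nominal[:, None] - centroids, axis=2), axis=1)
            centroids = np.array([
                nominal[labels == j].mean(axis=0) if np.any(labels == j)
                else centroids[j]
                for j in range(k)])

        THRESHOLD = 4.0  # distance beyond which a vector is deemed off-nominal

        def is_abnormal(x):
            # Distance to the nearest cluster centroid drives the decision.
            return np.linalg.norm(centroids - x, axis=1).min() > THRESHOLD

        print(is_abnormal(np.zeros(4)))      # False: near the nominal clusters
        print(is_abnormal(np.full(4, 9.0)))  # True: far from every cluster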

  10. An operational information systems architecture for assessing sustainable transportation planning: principles and design.

    PubMed

    Borzacchiello, Maria Teresa; Torrieri, Vincenzo; Nijkamp, Peter

    2009-11-01

    This paper describes an integrated information system framework for the assessment of transportation planning and management. After an introductory exposition, the first part of the paper gives a broad overview of international experiences with information systems on transportation, focusing in particular on the relationship between monitoring of transportation system performance and the decision-making process, and on the importance of this connection in the evaluation and planning process in Italian and European cases. Next, the methodological design of an information system to support efficient and sustainable transportation planning and management, aiming to integrate inputs from several different data sources, is presented. The resulting framework deploys modular, integrated databases that include data stemming from different national or regional data banks and integrate information belonging to different transportation fields. For this reason, it allows public administrations to account for many strategic elements that influence their decisions regarding transportation, from both a systemic and an infrastructural point of view.

  11. Advanced integrated enhanced vision systems

    NASA Astrophysics Data System (ADS)

    Kerr, J. R.; Luk, Chiu H.; Hammerstrom, Dan; Pavel, Misha

    2003-09-01

    In anticipation of its ultimate role in transport, business and rotary wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal, Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.

  12. Development of Hydrometeorological Monitoring and Forecasting as AN Essential Component of the Early Flood Warning System:

    NASA Astrophysics Data System (ADS)

    Manukalo, V.

    2012-12-01

    Defining issue River inundations are the most common and destructive natural hazards in Ukraine. Among non-structural flood management and protection measures, the creation of an Early Flood Warning System is extremely important for timely recognition of dangerous situations in flood-prone areas. Hydrometeorological information and forecasts are of core importance in this system. The primary factors affecting the reliability and lead time of forecasts include the accuracy, speed, and reliability with which real-time data are collected. The existing fragmented conception of monitoring and forecasting has created a need to reconsider the concept as an integrated monitoring and forecasting approach, from "sensors to database and forecasters". Result presentation The Project: "Development of Flood Monitoring and Forecasting in the Ukrainian part of the Dniester River Basin" is presented. The project is developed by the Ukrainian Hydrometeorological Service in conjunction with the Water Management Agency and the Energy Company "Ukrhydroenergo". The implementation of the Project is funded by the Ukrainian Government and the World Bank. The author is the person responsible for coordinating the activity of the organizations involved in the Project. The term of Project implementation is 2012-2014. The principal objectives of the Project are: a) designing an integrated automatic hydrometeorological measurement network (including remote sensing technologies); b) constructing a hydrometeorological GIS database coupled with electronic maps for flood risk assessment; c) constructing interfaces between the classic numerical database, GIS, satellite images, and radar data collection; d) providing real-time data dissemination from observation points to forecasting centers; e) developing hydrometeorological forecasting methods; f) providing flood hazard risk assessment at different temporal and spatial scales; g) providing automatic dissemination of current information, forecasts, and warnings to consumers. Besides scientific and technical issues, the implementation of these objectives requires the solution of a number of organizational issues. Thus, as a result of the increased complexity of types of hydrometeorological data, and in order to develop forecasting methods, a reconsideration of meteorological and hydrological measurement networks should be carried out. The "optimal density of measuring networks" is proposed taking into account two principal terms: a) minimizing uncertainty in characterizing the spatial distribution of hydrometeorological parameters; b) minimizing the Total Life Cycle Cost of creation and maintenance of measurement networks. Much attention will be given to training Ukrainian disaster management authorities from the Ministry of Emergencies and the Water Management Agency to identify the flood hazard risk level and to indicate the best protection measures on the basis of continuous monitoring and forecasts of the evolution of meteorological and hydrological conditions in the river basin.

  13. Integrated Primary Care Information Database (IPCI)

    Cancer.gov

    The Integrated Primary Care Information Database is a longitudinal observational database that was created specifically for pharmacoepidemiological and pharmacoeconomic studies, including data from computer-based patient records supplied voluntarily by general practitioners.

  14. Construction and validation of a population-based bone densitometry database.

    PubMed

    Leslie, William D; Caetano, Patricia A; Macwilliam, Leonard R; Finlayson, Gregory S

    2005-01-01

    Utilization of dual-energy X-ray absorptiometry (DXA) for the initial diagnostic assessment of osteoporosis and in monitoring treatment has risen dramatically in recent years. Population-based studies of the impact of DXA and osteoporosis remain challenging because of incomplete and fragmented test data that exist in most regions. Our aim was to create and assess completeness of a database of all clinical DXA services and test results for the province of Manitoba, Canada and to present descriptive data resulting from testing. A regionally based bone density program for the province of Manitoba, Canada was established in 1997. Subsequent DXA services were prospectively captured in a program database. This database was retrospectively populated with earlier DXA results dating back to 1990 (the year that the first DXA scanner was installed) by integrating multiple data sources. A random chart audit was performed to assess completeness and accuracy of this dataset. For comparison, testing rates determined from the DXA database were compared with physician administrative claims data. There was a high level of completeness of this database (>99%) and accurate personal identifier information sufficient for linkage with other health care administrative data (>99%). This contrasted with physician billing data that were found to be markedly incomplete. Descriptive data provide a profile of individuals receiving DXA and their test results. In conclusion, the Manitoba bone density database has great potential as a resource for clinical and health policy research because it is population based with a high level of completeness and accuracy.

  15. Mass spectrometry-based protein identification by integrating de novo sequencing with database searching.

    PubMed

    Wang, Penghao; Wilson, Susan R

    2013-01-01

    Mass spectrometry-based protein identification is a very challenging task. The main identification approaches include de novo sequencing and database searching. Both approaches have shortcomings, so an integrative approach has been developed. The integrative approach first infers partial peptide sequences, known as tags, directly from tandem spectra through de novo sequencing, and then puts these sequences into a database search to see if a close peptide match can be found. However, the current implementation of this integrative approach has several limitations. First, simplistic de novo sequencing is applied and only very short sequence tags are used. Second, most integrative methods apply an algorithm similar to BLAST to search for exact sequence matches and do not accommodate sequence errors well. Third, under these methods the integrated de novo sequencing makes a limited contribution to the scoring model, which remains largely based on database searching. We have developed a new integrative protein identification method which integrates de novo sequencing more efficiently into database searching. Evaluated on large real datasets, our method outperforms popular identification methods.
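
    The tag-then-search idea, with error tolerance, can be sketched as follows. The tag, the miniature protein database, and the one-substitution rule are toy stand-ins for the paper's actual scoring model.

        # Shortlist database peptides that contain a de novo tag, allowing one
        # substitution so that sequencing errors do not discard true hits.
        def matches_with_one_error(tag, peptide):
            """True if `tag` occurs in `peptide` with at most one substitution."""
            for i in range(len(peptide) - len(tag) + 1):
                window = peptide[i:i + len(tag)]
                if sum(a != b for a, b in zip(tag, window)) <= 1:
                    return True
            return False

        database = ["MKWVTFISLLFLFSSAYS", "GLSDGEWQLVLNVWGK", "VEADIAGHGQEVLIR"]
        tag = "GHGQ"  # short tag inferred de novo from a tandem spectrum

        candidates = [p for p in database if matches_with_one_error(tag, p)]
        print(candidates)  # ['VEADIAGHGQEVLIR']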

  16. Implementation and application of an interactive user-friendly validation software for RADIANCE

    NASA Astrophysics Data System (ADS)

    Sundaram, Anand; Boonn, William W.; Kim, Woojin; Cook, Tessa S.

    2012-02-01

    RADIANCE extracts CT dose parameters from dose sheets using optical character recognition and stores the data in a relational database. To facilitate validation of RADIANCE's performance, a simple user interface was initially implemented and about 300 records were evaluated. Here, we extend this interface to achieve a wider variety of functions and perform a larger-scale validation. The validator uses some data from the RADIANCE database to prepopulate quality-testing fields, such as correspondence between calculated and reported total dose-length product. The interface also displays relevant parameters from the DICOM headers. A total of 5,098 dose sheets were used to test the performance accuracy of RADIANCE in dose data extraction. Several search criteria were implemented. All records were searchable by accession number, study date, or dose parameters beyond chosen thresholds. Validated records were searchable according to additional criteria from validation inputs. An error rate of 0.303% was demonstrated in the validation. Dose monitoring is increasingly important and RADIANCE provides an open-source solution with a high level of accuracy. The RADIANCE validator has been updated to enable users to test the integrity of their installation and verify that their dose monitoring is accurate and effective.
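
    One of the quality tests mentioned above, correspondence between calculated and reported total dose-length product (DLP), reduces to a tolerance check like the sketch below. The per-series values and the 1% tolerance are invented for illustration.

        # Does the sum of per-series DLP values match the reported total?
        def dlp_consistent(series_dlp_mgy_cm, reported_total, tol=0.01):
            calculated = sum(series_dlp_mgy_cm)
            return abs(calculated - reported_total) <= tol * max(reported_total, 1e-9)

        series = [312.5, 47.0, 198.3]          # per-series DLP from OCR extraction
        print(dlp_consistent(series, 557.8))   # True: totals agree within 1%
        print(dlp_consistent(series, 700.0))   # False: flags the record for review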

  17. Using Web Ontology Language to Integrate Heterogeneous Databases in the Neurosciences

    PubMed Central

    Lam, Hugo Y.K.; Marenco, Luis; Shepherd, Gordon M.; Miller, Perry L.; Cheung, Kei-Hoi

    2006-01-01

    Integrative neuroscience involves the integration and analysis of diverse types of neuroscience data involving many different experimental techniques. These data will increasingly be distributed across many heterogeneous databases that are web-accessible. Currently, these databases do not expose their schemas (database structures) and their contents to web applications/agents in a standardized, machine-friendly way. This limits database interoperation. To address this problem, we describe a pilot project that illustrates how neuroscience databases can be expressed using the Web Ontology Language, which is a semantically rich ontological language, as a common data representation language to facilitate complex cross-database queries. In this pilot project, an existing tool called “D2RQ” was used to translate two neuroscience databases (NeuronDB and CoCoDat) into OWL, and the resulting OWL ontologies were then merged. An OWL-based reasoner (Racer) was then used to provide a sophisticated query language (nRQL) to perform integrated queries across the two databases based on the merged ontology. This pilot project is one step toward exploring the use of semantic web technologies in the neurosciences. PMID:17238384
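
    For flavour, the merge-then-query pattern can be reproduced with rdflib and SPARQL, swapping those tools in for the pilot's actual D2RQ/Racer/nRQL stack. The file names and property URI below are hypothetical.

        from rdflib import Graph

        # Parsing both ontologies into one Graph effectively merges them.
        g = Graph()
        g.parse("neurondb.owl")   # first database expressed as OWL (hypothetical file)
        g.parse("cocodat.owl")    # second database expressed as OWL (hypothetical file)

        # A cross-database query over the merged graph; the property is invented.
        results = g.query("""
            PREFIX ex: <http://example.org/neuro#>
            SELECT ?neuron ?property
            WHERE { ?neuron ex:hasMembraneProperty ?property . }
        """)
        for row in results:
            print(row.neuron, row.property)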

  18. Specification and Enforcement of Semantic Integrity Constraints in Microsoft Access

    ERIC Educational Resources Information Center

    Dadashzadeh, Mohammad

    2007-01-01

    Semantic integrity constraints are business-specific rules that limit the permissible values in a database. For example, a university rule dictating that an "incomplete" grade cannot be changed to an A constrains the possible states of the database. To maintain database integrity, business rules should be identified in the course of database…

  19. Building a transnational biosurveillance network using semantic web technologies: requirements, design, and preliminary evaluation.

    PubMed

    Teodoro, Douglas; Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-05-29

    Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1 × 10² seconds). Clinical pertinence assessment showed that resistance trends automatically calculated by ARTEMIS had a strong positive correlation with the European Antimicrobial Resistance Surveillance Network (EARS-Net) (ρ = .86, P < .001) and the Sentinel Surveillance of Antibiotic Resistance in Switzerland (SEARCH) (ρ = .84, P < .001) systems. Furthermore, mean resistance rates extracted by ARTEMIS were not significantly different from those of either EARS-Net (∆ = ±0.130; 95% confidence interval -0 to 0.030; P < .001) or SEARCH (∆ = ±0.042; 95% confidence interval -0.004 to 0.028; P = .004). We introduce a distributed monitoring architecture that can be used to build transnational antimicrobial resistance surveillance networks. Results indicated that the Semantic Web-based approach provided an efficient and reliable solution for development of eHealth architectures that enable online antimicrobial resistance monitoring from heterogeneous data sources. In future, we expect that more health care institutions can join the ARTEMIS network so that it can provide a large European and wider biosurveillance network that can be used to detect emerging bacterial resistance in a multinational context and support public health actions.

  20. Building a Transnational Biosurveillance Network Using Semantic Web Technologies: Requirements, Design, and Preliminary Evaluation

    PubMed Central

    Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-01-01

    Background Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. Objective To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Methods Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. Results We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1×10² seconds). Clinical pertinence assessment showed that resistance trends automatically calculated by ARTEMIS had a strong positive correlation with the European Antimicrobial Resistance Surveillance Network (EARS-Net) (ρ = .86, P < .001) and the Sentinel Surveillance of Antibiotic Resistance in Switzerland (SEARCH) (ρ = .84, P < .001) systems. Furthermore, mean resistance rates extracted by ARTEMIS were not significantly different from those of either EARS-Net (∆ = ±0.130; 95% confidence interval –0 to 0.030; P < .001) or SEARCH (∆ = ±0.042; 95% confidence interval –0.004 to 0.028; P = .004). Conclusions We introduce a distributed monitoring architecture that can be used to build transnational antimicrobial resistance surveillance networks. Results indicated that the Semantic Web-based approach provided an efficient and reliable solution for development of eHealth architectures that enable online antimicrobial resistance monitoring from heterogeneous data sources. In future, we expect that more health care institutions can join the ARTEMIS network so that it can provide a large European and wider biosurveillance network that can be used to detect emerging bacterial resistance in a multinational context and support public health actions. PMID:22642960

  1. Integrative literature review of the reported uses of serological tests in leprosy management.

    PubMed

    Fabri, Angélica da Conceição Oliveira Coelho; Carvalho, Ana Paula Mendes; Vieira, Nayara Figueiredo; Bueno, Isabela de Caux; Rodrigues, Rayssa Nogueira; Monteiro, Thayenne Barrozo Mota; Correa-Oliveira, Rodrigo; Duthie, Malcolm S; Lana, Francisco Carlos Félix

    2016-04-01

    An integrative literature review was conducted to synthesize available publications regarding the potential use of serological tests in leprosy programs. We searched the databases Literatura Latino-Americana e do Caribe em Ciências da Saúde, Índice Bibliográfico Espanhol em Ciências da Saúde, Acervo da Biblioteca da Organização Pan-Americana da Saúde, Medical Literature Analysis and Retrieval System Online, Hanseníase, National Library of Medicine, Scopus, Ovid, Cinahl, and Web of Science for articles investigating the use of serological tests for antibodies against phenolic glycolipid-I (PGL-I), ML0405, ML2331, leprosy IDRI diagnostic-1 (LID-1), and natural disaccharide octyl-leprosy IDRI diagnostic-1 (NDO-LID). From an initial pool of 3,514 articles, 40 full-length articles fulfilled our inclusion criteria. Based on these papers, we concluded that these antibodies can be used to assist in diagnosing leprosy, detecting neuritis, monitoring therapeutic efficacy, and monitoring household contacts or at-risk populations in leprosy-endemic areas. Thus, available data suggest that serological tests could contribute substantially to leprosy management.

  2. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has reduced the time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include Erasmus, NASA's internal reporting system and project performance dashboard, an enterprise management tool that enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; the Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query-Based Document Management (QBDM), is a tool that enables content or context searches, either simple or hierarchical, across a variety of databases. The system enables users to specify notification subscriptions in which they associate "contexts of interest" and "events of interest" with one or more documents or collections of documents. Based on these subscriptions, users receive notification when the events of interest occur within the contexts of interest for the associated documents or collections. Users can also associate at least one notification time as part of the subscription, with at least one option for the time period of notifications.
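
    The subscription mechanism lends itself to a compact illustration. The sketch below is a toy model, not NASA's code: the subscription fields and the matching rule are assumptions based only on the description above.

        from dataclasses import dataclass

        @dataclass
        class Subscription:
            user: str
            context: str      # a "context of interest", e.g. a topic within the documents
            event: str        # an "event of interest", e.g. "modified", "reviewed"
            documents: set    # document ids the subscription covers

        subscriptions = [
            Subscription("alice", "thermal-analysis", "modified", {"doc-17", "doc-42"}),
        ]

        def notify(doc_id, context, event):
            """Deliver a notification to every subscriber whose triple matches."""
            for sub in subscriptions:
                if (doc_id in sub.documents
                        and sub.context == context and sub.event == event):
                    print(f"notify {sub.user}: {event} in {context} of {doc_id}")

        notify("doc-42", "thermal-analysis", "modified")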

  3. Integrated monitoring and information systems for managing aquatic invasive species in a changing climate.

    PubMed

    Lee, Henry; Reusser, Deborah A; Olden, Julian D; Smith, Scott S; Graham, Jim; Burkett, Virginia; Dukes, Jeffrey S; Piorkowski, Robert J; McPhedran, John

    2008-06-01

    Changes in temperature, precipitation, and other climatic drivers and sea-level rise will affect populations of existing native and non-native aquatic species and the vulnerability of aquatic environments to new invasions. Monitoring surveys provide the foundation for assessing the combined effects of climate change and invasions by providing baseline biotic and environmental conditions, although the utility of a survey depends on whether the results are quantitative or qualitative, and other design considerations. The results from a variety of monitoring programs in the United States are available in integrated biological information systems, although many include only non-native species, not native species. Besides including natives, we suggest these systems could be improved through the development of standardized methods that capture habitat and physiological requirements and link regional and national biological databases into distributed Web portals that allow drawing information from multiple sources. Combining the outputs from these biological information systems with environmental data would allow the development of ecological-niche models that predict the potential distribution or abundance of native and non-native species on the basis of current environmental conditions. Environmental projections from climate models can be used in these niche models to project changes in species distributions or abundances under altered climatic conditions and to identify potential high-risk invaders. There are, however, a number of challenges, such as uncertainties associated with projections from climate and niche models and difficulty in integrating data with different temporal and spatial granularity. Even with these uncertainties, integration of biological and environmental information systems, niche models, and climate projections would improve management of aquatic ecosystems under the dual threats of biotic invasions and climate change.

  4. Integrated monitoring and information systems for managing aquatic invasive species in a changing climate

    USGS Publications Warehouse

    Lee, Henry; Reusser, Deborah A.; Olden, Julian D.; Smith, Scott S.; Graham, Jim; Burkett, Virginia; Dukes, Jeffrey S.; Piorkowski, Robert J.; Mcphedran, John

    2008-01-01

    Changes in temperature, precipitation, and other climatic drivers and sea-level rise will affect populations of existing native and non-native aquatic species and the vulnerability of aquatic environments to new invasions. Monitoring surveys provide the foundation for assessing the combined effects of climate change and invasions by providing baseline biotic and environmental conditions, although the utility of a survey depends on whether the results are quantitative or qualitative, and other design considerations. The results from a variety of monitoring programs in the United States are available in integrated biological information systems, although many include only non-native species, not native species. Besides including natives, we suggest these systems could be improved through the development of standardized methods that capture habitat and physiological requirements and link regional and national biological databases into distributed Web portals that allow drawing information from multiple sources. Combining the outputs from these biological information systems with environmental data would allow the development of ecological-niche models that predict the potential distribution or abundance of native and non-native species on the basis of current environmental conditions. Environmental projections from climate models can be used in these niche models to project changes in species distributions or abundances under altered climatic conditions and to identify potential high-risk invaders. There are, however, a number of challenges, such as uncertainties associated with projections from climate and niche models and difficulty in integrating data with different temporal and spatial granularity. Even with these uncertainties, integration of biological and environmental information systems, niche models, and climate projections would improve management of aquatic ecosystems under the dual threats of biotic invasions and climate change.

  5. Longitudinal study of radiation exposure in computed tomography with an in-house developed dose monitoring system

    NASA Astrophysics Data System (ADS)

    Renger, Bernhard; Rummeny, Ernst J.; Noël, Peter B.

    2013-03-01

    Over the last decades, reducing radiation exposure, especially in diagnostic computed tomography, has been one of the most explored topics. At the same time, it remains challenging to quantify the long-term clinical dose reduction attributable to new hardware and software solutions. To address this challenge, we developed a Dose Monitoring System (DMS) that collects information from PACS, RIS, MPPS, and structured reports. Integrating all of these sources overcomes the weaknesses of any single system. To gather all available information, we integrated an optical character recognition step to extract, for example, information from the CT dose report. All collected data are transferred to a database for further evaluation, e.g., for calculation of effective as well as organ doses. The DMS provides a single database for tracking all essential study- and patient-specific information across different modalities and vendors. As an initial study, we longitudinally investigated the dose reduction in CT examinations when employing a noise-suppressing reconstruction algorithm. For this examination type, a significant long-term reduction in radiation exposure is reported compared with a CT system using standard reconstruction. In summary, our DMS not only enables us to track radiation exposure on a daily basis but also to analyze the long-term effect of new dose-saving strategies. In the future, statistical analysis of all the retrospective data available in a modern imaging department will provide a unique overview of advances in the reduction of radiation exposure.
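
    As a rough sketch of the structured-report harvesting such a system performs, the snippet below walks a DICOM radiation dose structured report with pydicom and collects DLP values; the file name is hypothetical, real reports vary by vendor, and production code would match coded concepts rather than code meanings.

        import pydicom

        def find_numeric(item, wanted="DLP"):
            """Recursively collect numeric SR content items whose concept name matches."""
            values = []
            for content in getattr(item, "ContentSequence", []):
                name = content.ConceptNameCodeSequence[0].CodeMeaning
                if content.ValueType == "NUM" and wanted in name:
                    values.append(float(content.MeasuredValueSequence[0].NumericValue))
                values.extend(find_numeric(content, wanted))   # descend into nested items
            return values

        ds = pydicom.dcmread("ct_dose_report.dcm")   # hypothetical RDSR file
        print("total DLP:", sum(find_numeric(ds)))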

  6. Classification of time series patterns from complex dynamic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  7. Specification of parameters for development of a spatial database for drought monitoring and famine early warning in the African Sahel

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    Parameters were described for a spatial database to facilitate drought monitoring and famine early warning in the African Sahel. The proposed system, referred to as the African Drought and Famine Information System (ADFIS), is ultimately recommended for implementation with the NASA/FEMA Spatial Analysis and Modeling System (SAMS), a GIS/dynamic-modeling software package currently under development. SAMS is derived from FEMA's Integrated Emergency Management Information System (IEMIS) and the Pacific Northwest Laboratory's/Engineering Topographic Laboratory's Airland Battlefield Environment (ALBE) GIS. SAMS is primarily intended for disaster planning and resource management applications within developing countries. Sources of data for the system would include the Developing Economics Branch of the U.S. Dept. of Agriculture, the World Bank, Tulane University School of Public Health and Tropical Medicine's Famine Early Warning Systems (FEWS) Project, USAID's Foreign Disaster Assistance Section, the World Resources Institute, the World Meteorological Organization, the USGS, the UNFAO, UNICEF, and the United Nations Disaster Relief Organization (UNDRO). Satellite imagery would include decadal AVHRR imagery and Normalized Difference Vegetation Index (NDVI) values from 1981 to the present for the African continent and selected Landsat scenes for the Sudan pilot study. The system is initially conceived for the MicroVAX 2/GPX, running VMS. To facilitate comparative analysis, a global time-series database (1950 to 1987) is included for a basic set of 125 socio-economic variables per country per year. A more detailed database for the Sahelian countries includes soil type, water resources, agricultural production, agricultural imports and exports, food aid, and consumption. A pilot dataset for the Sudan, with over 2,500 variables from the World Bank's ANDREX system, also includes epidemiological data on the incidence of kwashiorkor, marasmus, other nutritional deficiencies, and synergistically related infectious diseases.

  8. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies.

    PubMed

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-02-02

    In recent years, many studies have focused on the application of advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates databases and geometry into a digital model that provides visualization across the whole construction lifecycle. This paper integrates BIM and WSN into a single system that enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); when an abnormal status is detected in any region, the BIM model flags that region while an on-site alarm and ventilator start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions for further applications are summarized.
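
    A minimal sketch of the alert rule just described, with invented thresholds, zone identifiers, and actuator stubs standing in for the paper's WSN and BIM components:

        # Assumed permissible limits; real limits depend on the gas and site rules.
        GAS_LIMIT_PPM = 50.0
        TEMP_LIMIT_C = 40.0

        def start_alarm(zone_id):
            print(f"alarm sounding in {zone_id}")

        def start_ventilator(zone_id):
            print(f"ventilator running in {zone_id}")

        class BimModel:
            """Stand-in for the BIM interface; a real system colours model zones."""
            def highlight(self, zone_id, colour):
                print(f"zone {zone_id} shown in {colour}")

        def check_zone(zone_id, readings, bim):
            """Flag the zone and fire actuators when any reading is abnormal."""
            abnormal = (readings["gas_ppm"] > GAS_LIMIT_PPM
                        or readings["temp_c"] > TEMP_LIMIT_C)
            if abnormal:
                bim.highlight(zone_id, colour="red")
                start_alarm(zone_id)
                start_ventilator(zone_id)
            return abnormal

        check_zone("B1-east", {"gas_ppm": 63.2, "temp_c": 31.5}, BimModel())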

  9. Development of Climate Change Adaptation Platform using Spatial Information

    NASA Astrophysics Data System (ADS)

    Lee, J.; Oh, K. Y.; Lee, M. J.; Han, W. J.

    2014-12-01

    Climate change adaptation has attracted growing attention with the recent extreme weather conditions that affect people around the world. More and more countries, including the Republic of Korea, have begun to develop adaptation plans to address these concerns. All of them have noted, meanwhile, that the first step should be to integrate climate information across all analyzed areas, because climate information is not produced independently by a single source; rather, it is interconnected in complicated ways. That is why an integrated climate change adaptation platform must be established before a climate change adaptation plan is drawn up. A large-scale project to this end has therefore been launched. To date, we have reviewed 620 publications and interviewed 51 government organizations. Based on the results of these reviews and interviews, we identified 2,725 impacts relevant to vulnerability assessment in domains such as monitoring and forecasting, health, disaster, agriculture, forest, water management, ecosystem, ocean/fisheries, and industry/energy. Of these 2,725 impacts, 995 have so far been entered into a database. The database is organized into the three sub-categories presented by the IPCC: climate exposure, sensitivity, and adaptive capacity. Based on this database, vulnerability assessments were carried out to evaluate the climate change capacity of local governments across the country, using a web-based vulnerability assessment tool newly developed within the project. The results show that metropolitan areas such as Seoul, Pusan, and Inchon face risks more than twice as high as rural areas. Acknowledgements: The authors appreciate the support that this study has received from "Development of integrated model for climate change impact and vulnerability assessment and strengthening the framework for model implementation", an initiative of the Korea Environmental & Industry Technology Institute.
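
    The IPCC's exposure/sensitivity/adaptive-capacity decomposition can be illustrated with a toy index. One common formulation (an assumption here; the project's actual aggregation method is not described above) scores vulnerability as rising with exposure and sensitivity and falling with adaptive capacity:

        def vulnerability(exposure, sensitivity, adaptive_capacity,
                          w_e=1.0, w_s=1.0, w_a=1.0):
            """Toy weighted index; all inputs assumed normalized to 0..1."""
            return w_e * exposure + w_s * sensitivity - w_a * adaptive_capacity

        # Invented example values: a dense metropolitan district vs. a rural one.
        print(vulnerability(0.8, 0.7, 0.4))   # -> 1.1
        print(vulnerability(0.5, 0.4, 0.5))   # -> about 0.4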

  10. The USA National Phenology Network: A national observatory for assessment of biotic response to environmental variation

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; USA National Phenology Network National Coordinating Office

    2011-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org), established in 2007, is a national science and monitoring initiative focused on phenology as a tool to understand how plants, animals and landscapes respond to climatic variability and change. Core functions of the National Coordinating Office (NCO) of USA-NPN are to provide a national information management system including databases, develop and implement internationally standardized phenology monitoring protocols, create partnerships with a variety of organizations including field stations for implementation, facilitate research and the development of decision support tools, and promote education and outreach activities related to phenology and climate change. This presentation will describe programs, tools and materials developed by USA-NPN to facilitate science, management and education related to phenology of plants, animals and landscapes within protected areas at local, regional and national scales. Particular emphasis will be placed on the on-line integrated animal and plant monitoring program, Nature's Notebook, which provides standardized protocols for phenological status monitoring and data management for over 500 animal and plant species. The monitoring system facilitates collection of sampling intensity, absence data, and considerable metadata (from the site level down to individual observations). We recently added functionality for recording estimates of animal abundance and plant canopy development. Real-time raw data for plants (from 2009 to present) and animals (from 2010 to present), including FGDC-compliant metadata and documented methodology, are now available for download from the website. A new data exploration tool, premiered in spring 2010, allows sophisticated graphical visualization of integrated phenological and meteorological data. The network seeks to develop partnerships with other organizations interested in (1) implementing vetted, standardized protocols for phenological or ecological monitoring, and (2) using phenology data and information for a variety of modeling applications.

  11. Anti-Runaway Prevention System with Wireless Sensors for Intelligent Track Skates at Railway Stations.

    PubMed

    Jiang, Chaozhe; Xu, Yibo; Wen, Chao; Chen, Dilin

    2017-12-19

    Anti-runaway prevention of rolling stock at a railway station is essential in railway safety management. Traditional track skates for anti-runaway prevention of rolling stock have some disadvantages, since they are operated and monitored completely manually. This paper describes an anti-runaway prevention system (ARPS) based on intelligent track skates equipped with sensors and a real-time monitoring and management system. This system, updated from the traditional track skates, comprises four parts: intelligent track skates, a signal reader, a database station, and a monitoring system. It can monitor the real-time situation of the track skates without changing their anti-runaway workflow, and thus realizes integrated management of anti-runaway prevention information. The system was successfully tested and put into practice at Sunjia Station of the Harbin Railway Bureau in 2014, and the results confirmed that it reflected the usage status of the track skates with 100% accuracy. The system can meet practical demands, as it is highly reliable and supports long-distance communication.

  12. Reliability of Beam Loss Monitors System for the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2004-11-01

    The employment of superconducting magnets in high energy colliders opens challenging failure scenarios and brings new criticalities for the whole system protection. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems, which contribute to the final SIL value, are the dump system, the interlock system, the beam loss monitors system and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses, while for medium and longer loss durations it is assisted by other systems, such as the quench protection system and the cryogenic system. For the BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, temperature and radiation damage experimental data, as well as standard databases. All the data have been processed by reliability software (Isograph). The analysis ranges from component-level data to the system configuration.
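
    For readers unfamiliar with the SIL approach, the helper below (illustrative, not from the paper) maps an average probability of failure on demand to a SIL band using the IEC 61508 low-demand ranges; the example PFD value is invented.

        def sil_from_pfd(pfd):
            """Return the SIL band for a given average PFD (low-demand mode)."""
            if 1e-5 <= pfd < 1e-4:
                return 4
            if 1e-4 <= pfd < 1e-3:
                return 3
            if 1e-3 <= pfd < 1e-2:
                return 2
            if 1e-2 <= pfd < 1e-1:
                return 1
            return None   # outside the defined SIL bands

        print(sil_from_pfd(3.2e-4))   # -> 3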

  13. Anti-Runaway Prevention System with Wireless Sensors for Intelligent Track Skates at Railway Stations

    PubMed Central

    Jiang, Chaozhe; Xu, Yibo; Chen, Dilin

    2017-01-01

    Anti-runaway prevention of rolling stock at a railway station is essential in railway safety management. Traditional track skates for anti-runaway prevention of rolling stock have some disadvantages, since they are operated and monitored completely manually. This paper describes an anti-runaway prevention system (ARPS) based on intelligent track skates equipped with sensors and a real-time monitoring and management system. This system, updated from the traditional track skates, comprises four parts: intelligent track skates, a signal reader, a database station, and a monitoring system. It can monitor the real-time situation of the track skates without changing their anti-runaway workflow, and thus realizes integrated management of anti-runaway prevention information. The system was successfully tested and put into practice at Sunjia Station of the Harbin Railway Bureau in 2014, and the results confirmed that it reflected the usage status of the track skates with 100% accuracy. The system can meet practical demands, as it is highly reliable and supports long-distance communication. PMID:29257108

  14. [Integrated DNA barcoding database for identifying Chinese animal medicine].

    PubMed

    Shi, Lin-Chun; Yao, Hui; Xie, Li-Fang; Zhu, Ying-Jie; Song, Jing-Yuan; Zhang, Hui; Chen, Shi-Lin

    2014-06-01

    In order to construct an integrated DNA barcoding database for identifying Chinese animal medicines, the authors and their collaborators have carried out extensive research on identifying Chinese animal medicines using DNA barcoding technology. Sequences from GenBank were analyzed simultaneously. Three different methods (BLAST, barcoding gap analysis, and tree building) were used to confirm the reliability of the barcode records in the database. The integrated DNA barcoding database for identifying Chinese animal medicine was constructed from three different parts: specimen, sequence, and literature information. The database contains about 800 animal medicines together with their adulterants and closely related species. Unknown specimens can be identified by pasting their sequence record into the window on the ID page of the species identification system for traditional Chinese medicine (www.tcmbarcode.cn). The integrated DNA barcoding database for identifying Chinese animal medicine is important for animal species identification, conservation of rare and endangered species, and sustainable utilization of animal resources.
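
    A minimal sketch of the sequence-lookup step, using Biopython's public NCBI BLAST interface rather than the tcmbarcode.cn pipeline (which is not described in detail above); the query sequence is a placeholder and the call requires network access.

        from Bio.Blast import NCBIWWW, NCBIXML

        # Placeholder barcode sequence; a real query would be a COI or cytb read.
        barcode = "ACGTACGTACGTACGTACGTACGT"

        # Submit to NCBI's public BLAST service against the nt database.
        handle = NCBIWWW.qblast("blastn", "nt", barcode)
        record = NCBIXML.read(handle)

        # Report the top candidate matches and their E-values.
        for alignment in record.alignments[:5]:
            best_hsp = alignment.hsps[0]
            print(alignment.title, best_hsp.expect)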

  15. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background Integration of metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can help perform more effective and more efficient systems biology research on understanding regulation in metabolic networks. Therefore, the tasks of (a) integrating regulatory metabolic networks and existing models under a single database environment, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and an API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data from selected biological data sources on the web (currently, the BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions The PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  16. E-MSD: an integrated data resource for bioinformatics.

    PubMed

    Velankar, S; McNeil, P; Mittard-Runte, V; Suarez, A; Barrell, D; Apweiler, R; Henrick, K

    2005-01-01

    The Macromolecular Structure Database (MSD) group (http://www.ebi.ac.uk/msd/) continues to enhance the quality and consistency of macromolecular structure data in the worldwide Protein Data Bank (wwPDB) and to work towards the integration of various bioinformatics data resources. One of the major obstacles to improved integration of structural databases such as MSD and sequence databases like UniProt is the absence of up-to-date and well-maintained mappings between corresponding entries. We have worked closely with the UniProt group at the EBI to clean up the taxonomy and sequence cross-reference information in the MSD and UniProt databases. This information is vital for the reliable integration of sequence family databases such as Pfam and InterPro with the structure-oriented databases SCOP and CATH. This information has been made available to the eFamily group (http://www.efamily.org.uk/) and now forms the basis of the regular interchange of information between the member databases (MSD, UniProt, Pfam, InterPro, SCOP and CATH). This exchange of annotation information has enriched the structural information in the MSD database with annotation from wider sequence-oriented resources. This work was carried out under the 'Structure Integration with Function, Taxonomy and Sequences (SIFTS)' initiative (http://www.ebi.ac.uk/msd-srv/docs/sifts) in the MSD group.

  17. A complete history of everything

    NASA Astrophysics Data System (ADS)

    Lanclos, Kyle; Deich, William T. S.

    2012-09-01

    This paper discusses Lick Observatory's local solution for retaining a complete history of everything. Leveraging our existing deployment of a publish/subscribe communications model that is used to broadcast the state of all systems at Lick Observatory, a monitoring daemon runs on a dedicated server that subscribes to and records all published messages. Our success with this system is a testament to the power of simple, straightforward approaches to complex problems. The solution itself is written in Python, and the initial version required about a week of development time; the data are stored in PostgreSQL database tables using a distinctly simple schema. Over time, we addressed scaling issues as the data set grew, which involved reworking the PostgreSQL database schema on the back-end. We also duplicate the data in flat files to enable recovery or migration of the data from one server to another. This paper will cover both the initial design as well as the solutions to the subsequent deployment issues, the trade-offs that motivated those choices, and the integration of this history database with existing client applications.
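
    A condensed sketch of such a recording daemon in Python, assuming a hypothetical subscribe() generator in place of the observatory's publish/subscribe client, and an invented three-column table in the spirit of the "distinctly simple schema" mentioned above:

        import psycopg2

        conn = psycopg2.connect(dbname="history")   # assumed local database
        cur = conn.cursor()
        cur.execute("""CREATE TABLE IF NOT EXISTS history (
                           stamp   timestamptz,
                           keyword text,
                           value   text)""")

        def subscribe():
            """Stand-in for the site's publish/subscribe client."""
            yield ("2012-09-01 03:14:15+00", "dome.azimuth", "181.3")

        # Record every published message; committing per message keeps the
        # history durable at the cost of write throughput.
        for stamp, keyword, value in subscribe():
            cur.execute("INSERT INTO history VALUES (%s, %s, %s)",
                        (stamp, keyword, value))
            conn.commit()

    Mirroring each insert to a flat file, as the paper describes, would cover recovery and migration between servers.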

  18. The Monitoring Erosion of Agricultural Land and spatial database of erosion events

    NASA Astrophysics Data System (ADS)

    Kapicka, Jiri; Zizala, Daniel

    2013-04-01

    The Monitoring of Erosion of Agricultural Land began in the Czech Republic in 2011 as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of erosion events on agricultural land and to evaluate them. The central idea is the creation of a spatial database serving as a source of data and information for evaluating and modeling erosion processes, for proposing preventive measures, and for measures to reduce the negative impacts of erosion events. The subjects of monitoring are manifestations of water erosion, wind erosion, and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as the tool for keeping and browsing information about monitored events. SLO employees carry out the record keeping. RISWC acts as the specialist institute: it maintains the spatial database, runs the website, manages the record keeping, analyses the causes of events, and statistically evaluates recorded events and proposed measures. Records are inserted into the database through the website's user interface, which includes a map server component. The website is built on PostgreSQL with the PostGIS extension and UMN MapServer. Each record is spatially localized in the database by a drawing and contains descriptive information about the character of the event (date, description of the situation, etc.) as well as information about land cover and the crops grown. Each record also includes photo documentation taken during a field reconnaissance performed within two days of an event being reported, together with precipitation data from accessible rain gauges. The website supports simple spatial analyses such as area calculation, slope calculation, and the percentage representation of GAEC. The database structure was designed based on an analysis of the inputs required by mathematical models, which are used for detailed analysis of selected erosion events, including soil analysis. By the end of 2012 the database contained 135 events. Its content continues to grow, giving rise to an extensive source of data usable for testing mathematical models.
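
    As an illustration of the kind of spatial analysis mentioned above, the snippet below runs a PostGIS area query from Python; the database, table, and column names are assumptions, not the project's actual schema.

        import psycopg2

        conn = psycopg2.connect(dbname="erosion")        # hypothetical database
        cur = conn.cursor()
        cur.execute("""
            SELECT event_id, ST_Area(geom) AS area_m2    -- PostGIS area in map units
            FROM erosion_events
            WHERE event_date >= %s
            ORDER BY area_m2 DESC
        """, ("2012-01-01",))

        for event_id, area_m2 in cur.fetchall():
            print(event_id, round(area_m2, 1), "m2")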

  19. Historic Bim: a New Repository for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Barazzetti, L.; Previtali, M.; Roncoroni, F.

    2017-05-01

    Recent developments in Building Information Modelling (BIM) technologies are facilitating the management of historic complex structures through new applications. This paper proposes a generative method combining the morphological and typological aspects of historic buildings (H-BIM) with a set of monitoring information. This combination of 3D digital survey, parametric modelling, and monitoring datasets allows for the development of a system for archiving and visualizing structural health monitoring (SHM) data. The availability of a BIM database makes it possible to integrate different kinds of data, stored in different ways (e.g. reports, tables, graphs, etc.), with a representation directly connected to the 3D model of the structure at appropriate levels of detail (LoD). Data can be accessed interactively by selecting specific objects of the BIM, i.e. connecting the 3D position of the installed sensors with additional digital documentation. Such innovative BIM objects, which form a new BIM family for SHM, can then be reused in other projects, facilitating the archiving and exploitation of acquired and processed data. The application of advanced modeling techniques reduces the time and costs of the generation process and supports cooperation between different disciplines using a central workspace. However, it also reveals new challenges for parametric software and exchange formats. The case study presented is the medieval bridge Azzone Visconti in Lecco (Italy), for which multi-temporal vertical movements during load testing were integrated into the H-BIM.

  20. Reliability of Beam Loss Monitor Systems for the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2005-06-01

    The increase of beam energy and beam intensity, together with the use of superconducting magnets, opens new failure scenarios and brings new criticalities for the whole accelerator protection system. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems, which contribute to the final SIL value, are the dump system, the interlock system, the beam loss monitors system, and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses at 7 TeV and is assisted by the Fast Beam Current Decay Monitors at 450 GeV. For medium and longer loss durations it is assisted by other systems, such as the quench protection system and the cryogenic system. For the BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, temperature and radiation damage experimental data, as well as standard databases. All the data have been processed by reliability software (Isograph). The analysis spans from component-level data to the system configuration.

  1. POPSCAN: A CNES Geo-Information Study for Re-Entry Risk Assessment

    NASA Astrophysics Data System (ADS)

    Fuentes, N.; Tholey, N.; Battiston, S.; Montabord, M.; Studer, M.

    2013-09-01

    Within the framework of the FSOA, the French Space Operations Act (referred to in French as the "Loi relative aux Opérations Spatiales", or LOS), which includes in particular the monitoring of safety requirements for people and property, one major parameter to consider is Geographic Information (GI) on population distribution, human activity, and land occupation. This article gives an overview of the set of geographic and demographic data examined for CNES control offices, outlining the advantages and limits of each one: coverage, precision, update frequency, availability, distribution, ... It focuses on the two major available global population databases: GPW-GRUMP from CIESIN of Columbia University and LandScan from ORNL. The work on POPSCAN includes digital analysis of these two world population grids as well as comparisons with other databases such as GLOBAL-INSIGHT, VMAP0, ESRI, DMSP-ISA, GLOBCOVER, OpenFlights, ... for urban areas, communication networks, sensitive human activities, and land use.

  2. The Comprehensive Antibiotic Resistance Database

    PubMed Central

    McArthur, Andrew G.; Waglechner, Nicholas; Nizam, Fazmin; Yan, Austin; Azad, Marisa A.; Baylay, Alison J.; Bhullar, Kirandeep; Canova, Marc J.; De Pascale, Gianfranco; Ejim, Linda; Kalan, Lindsay; King, Andrew M.; Koteva, Kalinka; Morar, Mariya; Mulvey, Michael R.; O'Brien, Jonathan S.; Pawlowski, Andrew C.; Piddock, Laura J. V.; Spanogiannopoulos, Peter; Sutherland, Arlene D.; Tang, Irene; Taylor, Patricia L.; Thaker, Maulik; Wang, Wenliang; Yan, Marie; Yu, Tennison

    2013-01-01

    The field of antibiotic drug discovery and the monitoring of new antibiotic resistance elements have yet to fully exploit the power of the genome revolution. Despite the fact that the first genomes sequenced of free-living organisms were those of bacteria, there have been few specialized bioinformatic tools developed to mine the growing amount of genomic data associated with pathogens. In particular, there are few tools to study the genetics and genomics of antibiotic resistance and how it impacts bacterial populations, ecology, and the clinic. We have initiated development of such tools in the form of the Comprehensive Antibiotic Resistance Database (CARD; http://arpcard.mcmaster.ca). The CARD integrates disparate molecular and sequence data, provides a unique organizing principle in the form of the Antibiotic Resistance Ontology (ARO), and can quickly identify putative antibiotic resistance genes in new unannotated genome sequences. This unique platform provides an informatic tool that bridges antibiotic resistance concerns in health care, agriculture, and the environment. PMID:23650175

  3. SIDD: A Semantically Integrated Database towards a Global View of Human Disease

    PubMed Central

    Cheng, Liang; Wang, Guohua; Li, Jie; Zhang, Tianjiao; Xu, Peigang; Wang, Yadong

    2013-01-01

    Background A number of databases have been developed to collect disease-related molecular, phenotypic and environmental features (DR-MPEs), such as genes, non-coding RNAs, genetic variations, drugs, phenotypes and environmental factors. However, each of the current databases focuses on only one or two DR-MPEs. There is an urgent demand to develop an integrated database that can establish semantic associations among disease-related databases and link them to provide a global view of human disease at the biological level. Such a database, once developed, will allow researchers to query various DR-MPEs by disease and to investigate disease mechanisms from different types of data. Methodology To establish an integrated disease-associated database, disease vocabularies used in different databases are mapped to Disease Ontology (DO) through semantic matching. 4,284 and 4,186 disease terms from Medical Subject Headings (MeSH) and Online Mendelian Inheritance in Man (OMIM), respectively, are mapped to DO. Then, the relationships between DR-MPEs and diseases are extracted and merged from different source databases to reduce data redundancy. Conclusions A semantically integrated disease-associated database (SIDD) is developed, which integrates 18 disease-associated databases, allowing researchers to browse multiple types of DR-MPEs in a single view. A web interface allows easy navigation for querying information through browsing a disease ontology tree or searching a disease term. Furthermore, a network visualization tool using the Cytoscape Web plugin has been implemented in SIDD. This enhances SIDD's usability when viewing the relationships between diseases and DR-MPEs. The current version of SIDD (Jul 2013) documents 4,465,131 entries relating to 139,365 DR-MPEs and to 3,824 human diseases. The database can be freely accessed from: http://mlg.hit.edu.cn/SIDD. PMID:24146757

  4. Real-Time Payload Control and Monitoring on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)

    1998-01-01

    World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on the Java Development Kit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS)-compatible inference engine provides the back-end intelligent data processing capability, while the Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefits.

  5. Monitoring safety in a phase III real-world effectiveness trial: use of novel methodology in the Salford Lung Study

    PubMed Central

    Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F.; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P.; McCrae, John; McCorkindale, Sheila; Leather, David

    2016-01-01

    Abstract Background The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. Objective The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. Achievements The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Conclusion Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. PMID:27804174

  6. Monitoring safety in a phase III real-world effectiveness trial: use of novel methodology in the Salford Lung Study.

    PubMed

    Collier, Sue; Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P; McCrae, John; McCorkindale, Sheila; Leather, David

    2017-03-01

    The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd.

  7. GenomewidePDB 2.0: A Newly Upgraded Versatile Proteogenomic Database for the Chromosome-Centric Human Proteome Project.

    PubMed

    Jeong, Seul-Ki; Hancock, William S; Paik, Young-Ki

    2015-09-04

    Since the launch of the Chromosome-centric Human Proteome Project (C-HPP) in 2012, the number of "missing" proteins has fallen to 2932, down from the ∼5932 first counted in 2011. We compared the characteristics of missing proteins with those of already annotated proteins with respect to transcriptional expression pattern and the time periods in which newly identified proteins were annotated. We learned that missing proteins commonly exhibit lower levels of transcriptional expression and less tissue-specific expression compared with already annotated proteins. This makes it more difficult to identify missing proteins as time goes on. One of the C-HPP goals is to identify alternatively spliced products of proteins (ASPs), which are usually difficult to find by shotgun proteomic methods due to their sequence similarity to the representative proteins. To resolve this problem, it may be necessary to use a targeted proteomics approach (e.g., selected and multiple reaction monitoring [S/MRM] assays) and an innovative bioinformatics platform that enables the selection of target peptides for rarely expressed missing proteins or ASPs. Given that the success of efforts to identify missing proteins may rely on more informative public databases, it was necessary to upgrade the available integrative databases. To this end, we attempted to improve the features and utility of GenomewidePDB by integrating transcriptomic information (e.g., alternatively spliced transcripts), annotated peptide information, and an advanced search interface that can find proteins of interest when applying a targeted proteomics strategy. This upgraded version of the database, GenomewidePDB 2.0, may not only expedite identification of the remaining missing proteins but also enhance the exchange of information among the proteome community. GenomewidePDB 2.0 is available publicly at http://genomewidepdb.proteomix.org/.

  8. Implementation of integrated care for diabetes mellitus type 2 by two Dutch care groups: a case study.

    PubMed

    Busetto, Loraine; Luijkx, Katrien; Huizing, Anna; Vrijhoef, Bert

    2015-08-21

    Even though previous research has demonstrated improved outcomes of integrated care initiatives, it is not clear why and when integrated care works. This study aims to contribute to filling this knowledge gap by examining the implementation of integrated care for type 2 diabetes by two Dutch care groups. An embedded single case study was conducted including 26 interviews with management staff, care purchasers and health professionals. The Context + Mechanism = Outcome Model was used to study the relationship between context factors, mechanisms and outcomes. Dutch integrated care involves care groups, bundled payments, patient involvement, health professional cooperation and task substitution, evidence-based care protocols and a shared clinical information system. Community involvement is not (yet) part of Dutch integrated care. Barriers to the implementation of integrated care included insufficient integration between the patient databases, decreased earnings for some health professionals, patients' insufficient medical and policy-making expertise, resistance by general practitioner assistants due to perceived competition, too much care provided by practice nurses instead of general practitioners and the funding system incentivising the provision of care exactly as described in the care protocols. Facilitators included performance monitoring via the care chain information system, increased earnings for some health professionals, increased focus on self-management, innovators in primary and secondary care, diabetes nurses acting as integrators and financial incentives for guideline adherence. Economic and political context and health IT-related barriers were discussed as the most problematic areas of integrated care implementation. The implementation of integrated care led to improved communication and cooperation but also to insufficient and unnecessary care provision and deteriorated preconditions for person-centred care. Dutch integrated diabetes care is still a work in progress, in the academic and the practice setting. This makes it difficult to establish whether overall quality of care has improved. Future efforts should focus on areas that this study found to be problematic or to not have received enough attention yet. Increased efforts are needed to improve the interoperability of the patient databases and to keep the negative consequences of the bundled payment system in check. Moreover, patient and community involvement should be incorporated.

  9. GIS-project: geodynamic globe for global monitoring of geological processes

    NASA Astrophysics Data System (ADS)

    Ryakhovsky, V.; Rundquist, D.; Gatinsky, Yu.; Chesalova, E.

    2003-04-01

    A multilayer geodynamic globe at the scale 1:10,000,000 was created at the end of the nineties in the GIS Center of the Vernadsky Museum. A special software-and-hardware complex was elaborated for its visualization, with a set of multitarget, object-oriented databases. The globe includes separate thematic covers represented by digital sets of spatial geological, geochemical, and geophysical information (maps, schemes, profiles, stratigraphic columns, arranged databases, etc.). At present the largest databases included in the globe are connected with petrochemical and isotopic data on magmatic rocks of the World Ocean and with large and superlarge mineral deposits. Software by the Environmental Systems Research Institute (ESRI), USA, as well as the ArcScan vectorizer, was used for digitizing the covers and adapting the databases (ARC/INFO 7.0, 8.0). All layers of the geoinformational project were obtained by scanning separate objects and transferring them to real geographic co-ordinates in an equidistant conic projection. The covers were then projected onto plane degree-system geographic co-ordinates. Attributive databases were formed for each thematic layer, and in the last stage all covers were combined into a single information system. Separate digital covers represent mathematical descriptions of geological objects and the relations between them, such as the Earth's altimetry, active fault systems, seismicity, etc. Principles of cartographic generalization were taken into consideration when compiling the covers, with projection and co-ordinate systems chosen to match the given scale. The globe allows us to carry out, in an interactive regime, the formation of mutually coordinated, object-oriented databases and the thematic covers directly connected with them. These can cover the whole Earth and the near-Earth space, as well as the best known parts of the divergent and convergent boundaries of the lithospheric plates. Such covers and time series reflect in diagram form the total combination and dynamics of data on the geological structure, geophysical fields, seismicity, geomagnetism, composition of rock complexes, and metallogeny of different areas of the Earth's surface. They allow scaling, detailing, and 3D spatial visualization. The covers can be replenished with new data, both in existing and in newly created databases. Integrated analysis of the data allows us to refine our ideas on regularities in the development of lithosphere and mantle inhomogeneities using original technologies. It also enables us to work out 3D digital models of the geodynamic development of tectonic zones at convergent and divergent plate boundaries, with the purpose of integrated monitoring of mineral resources and of establishing correlations between seismicity, magmatic activity, and metallogeny in time-spatial co-ordinates. The multilayer geoinformation system thus created makes it possible to carry out integrated analysis of geoinformation flows in an interactive regime and, in particular, to establish regularities in the time-spatial distribution and dynamics of the main structural units of the lithosphere, as well as to illuminate the connection between stages of their development and epochs of formation of large and superlarge mineral deposits. We are now trying to use the system to predict large oil and gas concentrations in the main sedimentary basins.
The work was supported by RFBR, (grants 93-07-14680, 96-07-89499, 99-07-90030, 00-15-98535, 02-07-90140) and MTC.

  10. Implementation of medical monitor system based on networks

    NASA Astrophysics Data System (ADS)

    Yu, Hui; Cao, Yuzhen; Zhang, Lixin; Ding, Mingshi

    2006-11-01

    This paper analyzes development trends in medical monitoring systems, among which portability and network functionality are becoming increasingly popular across all kinds of monitoring devices. The architecture of a network-based medical monitoring solution is presented, and design and implementation details of the medical monitor terminal, the monitor center software, the distributed medical database, and two kinds of medical information terminal are discussed in particular. A Rabbit3000 system is used in the medical monitor terminal to implement secure network data transfer, the human-machine interface, power management, and the DSP interface, while the DSP chip TMS5402 is used for signal analysis and data compression. The distributed medical database is designed for the hospital center according to the DICOM information model and the HL7 standard. A pocket medical information terminal based on an ARM9 embedded platform has also been developed to interact with the center database over the network. Two kernels based on Windows CE were customized, and corresponding terminal software was developed for nurses' routine care and doctors' auxiliary diagnosis. An invention patent for the monitor terminal has been granted, manufacturing and clinical test plans are scheduled, and invention patent applications have also been filed for the two medical information terminals.
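
    As an illustration of the HL7 side of such a design, the sketch below composes a bare-bones HL7 v2 ORU^R01 observation message as pipe-delimited segments; the field values are invented and many fields a real interface would require are omitted for brevity.

        CR = "\r"   # HL7 v2 segments are separated by carriage returns

        def oru_message(patient_id, obs_id, value, units):
            """Build a minimal ORU^R01 message: header, patient, order, observation."""
            msh = "MSH|^~\\&|MONITOR|WARD1|HIS|HOSPITAL|20061101120000||ORU^R01|MSG0001|P|2.3"
            pid = f"PID|||{patient_id}"
            obr = f"OBR|1|||{obs_id}"
            obx = f"OBX|1|NM|{obs_id}||{value}|{units}|||||F"
            return CR.join([msh, pid, obr, obx]) + CR

        msg = oru_message("123456", "HR^HeartRate", 72, "bpm")
        print(msg.replace(CR, "\n"))   # newline view for readability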

  11. Emission & Generation Resource Integrated Database (eGRID)

    EPA Pesticide Factsheets

    The Emissions & Generation Resource Integrated Database (eGRID) is an integrated source of data on environmental characteristics of electric power generation. Twelve federal databases are represented by eGRID, which provides air emission and resource mix information for thousands of power plants and generating companies. eGRID allows direct comparison of the environmental attributes of electricity from different plants, companies, States, or regions of the power grid.

  12. Integrating workplace exposure databases for occupational medicine services and epidemiologic studies at a former nuclear weapons facility.

    PubMed

    Ruttenber, A J; McCrea, J S; Wade, T D; Schonbeck, M F; LaMontagne, A D; Van Dyke, M V; Martyny, J W

    2001-02-01

    We outline methods for integrating epidemiologic and industrial hygiene data systems for the purpose of exposure estimation, exposure surveillance, worker notification, and occupational medicine practice. We present examples of these methods from our work at the Rocky Flats Plant--a former nuclear weapons facility that fabricated plutonium triggers for nuclear weapons and is now being decontaminated and decommissioned. The weapons production processes exposed workers to plutonium, gamma photons, neutrons, beryllium, asbestos, and several hazardous chemical agents, including chlorinated hydrocarbons and heavy metals. We developed a job exposure matrix (JEM) for estimating exposures to 10 chemical agents in 20 buildings for 120 different job categories over a production history spanning 34 years. With the JEM, we estimated lifetime chemical exposures for about 12,000 of the 16,000 former production workers. We show how the JEM database is used to estimate cumulative exposures over different time periods for epidemiological studies and to provide notification and determine eligibility for a medical screening program developed for former workers. We designed an industrial hygiene data system for maintaining exposure data for current cleanup workers. We describe how this system can be used for exposure surveillance and linked with the JEM and databases on radiation doses to develop lifetime exposure histories and to determine appropriate medical monitoring tests for current cleanup workers. We also present time-line-based graphical methods for reviewing and correcting exposure estimates and reporting them to individual workers.
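
    The cumulative-exposure computation described above reduces to a lookup-and-sum over a worker's job history. A minimal sketch, assuming exposure intensity is stored per (job, building, period) cell of the matrix; the jobs, buildings, and values below are invented and are not the Rocky Flats JEM.

      # Job exposure matrix (JEM) lookup: cumulative exposure is intensity
      # times years, summed over a worker's job history. All names and
      # values are hypothetical.
      JEM = {
          # (job, building, period): exposure intensity in assumed units/year
          ("machinist", "B771", "1958-1969"): 2.5,
          ("machinist", "B771", "1970-1991"): 1.1,
          ("chemist",   "B881", "1958-1969"): 0.8,
      }

      def cumulative_exposure(history):
          """history: list of (job, building, period, years_worked)."""
          total = 0.0
          for job, building, period, years in history:
              total += JEM.get((job, building, period), 0.0) * years
          return total

      worker = [("machinist", "B771", "1958-1969", 6.0),
                ("machinist", "B771", "1970-1991", 12.0)]
      print(cumulative_exposure(worker))  # 2.5*6 + 1.1*12 = 28.2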

  13. An integrated approach for monitoring efficiency and investments of activated sludge-based wastewater treatment plants at large spatial scale.

    PubMed

    De Gisi, Sabino; Sabia, Gianpaolo; Casella, Patrizia; Farina, Roberto

    2015-08-01

    WISE, the Water Information System for Europe, is the web portal of the European Commission (EU) that disseminates the quality state of receiving water bodies and the efficiency of municipal wastewater treatment plants (WWTPs) in order to monitor advances in the application of both the Water Framework Directive (WFD) and the Urban Wastewater Treatment Directive (UWWTD). With the intention of developing WISE applications, the aim of this work was to define and apply an integrated approach capable of monitoring the efficiency and investments of activated sludge-based WWTPs located in a large spatial area, providing the following outcomes useful to decision-makers: (i) the identification of critical facilities and their critical processes by means of a Performance Assessment System (PAS); (ii) the choice of the most suitable upgrading actions, through a scenario analysis; (iii) the assessment of the investment costs needed to upgrade the critical WWTPs; and (iv) the prioritization of the critical facilities by means of a multi-criteria approach that includes stakeholder involvement, along with the integration of technical, environmental, economic, and health aspects. The application of the proposed approach to a large number of municipal WWTPs showed that the PAS was able to identify critical processes, and was particularly effective at identifying critical nutrient-removal processes. In addition, a simplified approach that considers the costs related to a basic configuration and those for WWTP integration made it possible to link the identified critical processes to the investment costs. Finally, the questionnaire for data acquisition (such as that provided by the Italian Institute of Statistics), the PAS defined, and the cost database, if properly adapted, may allow the integrated approach to be extended to the EU scale, providing useful information to water utilities as well as institutions. Copyright © 2015 Elsevier B.V. All rights reserved.
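
    The prioritization step in (iv) is commonly a weighted aggregation of normalized criteria scores. A minimal sketch under that assumption; the criteria, weights, and plant scores below are invented, and the paper's actual multi-criteria method may differ.

      # Weighted-sum multi-criteria prioritization of critical WWTPs.
      # Weights would come from stakeholder involvement; scores are
      # normalized to [0, 1]. All values here are illustrative.
      weights = {"technical": 0.3, "environmental": 0.3,
                 "economic": 0.2, "health": 0.2}

      plants = {
          "WWTP-A": {"technical": 0.7, "environmental": 0.9,
                     "economic": 0.4, "health": 0.6},
          "WWTP-B": {"technical": 0.5, "environmental": 0.6,
                     "economic": 0.8, "health": 0.3},
      }

      def priority(scores):
          return sum(weights[c] * scores[c] for c in weights)

      for name, scores in sorted(plants.items(), key=lambda kv: -priority(kv[1])):
          print(f"{name}: {priority(scores):.2f}")  # WWTP-A: 0.68, WWTP-B: 0.55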

  14. A Case Study in Software Adaptation

    DTIC Science & Technology

    2002-01-01

    Giuseppe Valetto, Telecom Italia Lab, Via Reiss Romoli 274, 10148 Turin, Italy. ...configuration of the service; monitoring of database connectivity from within the service; monitoring of crashes and shutdowns of IM servers; monitoring of... of the IM server all share a relational database and a common runtime state repository, which make up the backend tier, and allow replicas to...

  15. Heterogeneous database integration in biomedicine.

    PubMed

    Sujansky, W

    2001-08-01

    The rapid expansion of biomedical knowledge, reduction in computing costs, and spread of internet access have created an ocean of electronic data. The decentralized nature of our scientific community and healthcare system, however, has resulted in a patchwork of diverse, or heterogeneous, database implementations, making access to and aggregation of data across databases very difficult. The database heterogeneity problem applies equally to clinical data describing individual patients and biological data characterizing our genome. Specifically, databases are highly heterogeneous with respect to the data models they employ, the data schemas they specify, the query languages they support, and the terminologies they recognize. Heterogeneous database systems attempt to unify disparate databases by providing uniform conceptual schemas that resolve representational heterogeneities, and by providing querying capabilities that aggregate and integrate distributed data. Research in this area has applied a variety of database and knowledge-based techniques, including semantic data modeling, ontology definition, query translation, query optimization, and terminology mapping. Existing systems have addressed heterogeneous database integration in the realms of molecular biology, hospital information systems, and application portability.
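
    The mediation pattern sketched in this abstract (a uniform conceptual schema plus query translation and terminology mapping) can be illustrated compactly. The sources, field maps, and terms below are hypothetical; a real system would also translate between query languages and optimize the distributed plan.

      # A minimal mediator: one uniform query is translated into
      # source-specific queries via per-source schema and terminology maps,
      # then the native queries would be dispatched and results aggregated.
      SOURCES = {
          "hospital_db": {"field_map": {"diagnosis": "dx_code"},
                          "term_map": {"myocardial infarction": "ICD9:410"}},
          "research_db": {"field_map": {"diagnosis": "condition_name"},
                          "term_map": {"myocardial infarction": "heart attack"}},
      }

      def translate(field, term, source):
          cfg = SOURCES[source]
          return cfg["field_map"][field], cfg["term_map"].get(term, term)

      def mediated_query(field, term):
          queries = []
          for source in SOURCES:
              native_field, native_term = translate(field, term, source)
              # A real mediator would execute this against the source here.
              queries.append((source, f"SELECT * WHERE {native_field} = '{native_term}'"))
          return queries

      for source, query in mediated_query("diagnosis", "myocardial infarction"):
          print(source, "->", query)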

  16. Elements of an integrated health monitoring framework

    NASA Astrophysics Data System (ADS)

    Fraser, Michael; Elgamal, Ahmed; Conte, Joel P.; Masri, Sami; Fountain, Tony; Gupta, Amarnath; Trivedi, Mohan; El Zarki, Magda

    2003-07-01

    Internet technologies are increasingly facilitating real-time monitoring of bridges and highways. The advances in wireless communications, for instance, are allowing practical deployments for large extended systems. Sensor data, including video signals, can be used for long-term condition assessment, traffic-load regulation, emergency response, and seismic safety applications. Computer-based automated signal-analysis algorithms routinely process the incoming data and determine anomalies based on pre-defined response thresholds and more involved signal-analysis techniques. Upon authentication, appropriate action may be authorized for maintenance, early warning, and/or emergency response. In such a strategy, data from thousands of sensors can be analyzed with near-real-time and long-term assessment and decision-making implications. Addressing the above, a flexible and scalable (e.g., for an entire highway system, or a portfolio of networked civil infrastructure) software architecture/framework is being developed and implemented. This framework will network and integrate real-time heterogeneous sensor data, database and archiving systems, computer vision, data analysis and interpretation, physics-based numerical simulation of complex structural systems, visualization, reliability and risk analysis, and rational statistical decision-making procedures. Thus, within this framework, data is converted into information, information into knowledge, and knowledge into decision at the end of the pipeline. Such a decision-support system contributes to the vitality of our economy, as rehabilitation, renewal, replacement, and/or maintenance of this infrastructure are estimated to require expenditures in the trillion-dollar range nationwide, including issues of homeland security and natural disaster mitigation. A pilot website (http://bridge.ucsd.edu/compositedeck.html) currently depicts some basic elements of the envisioned integrated health monitoring analysis framework.
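
    The routine threshold screening described above is straightforward to picture in code. A minimal sketch; the channel names, units, and threshold values are assumptions, and production systems layer far more involved signal analysis on top of this first pass.

      # Flag sensor samples that exceed pre-defined response thresholds;
      # flagged events would then trigger authentication and action.
      # Channels and limits are invented for illustration.
      THRESHOLDS = {"strain_gauge_12": 350.0, "accel_deck_03": 0.25}

      def screen(channel, samples):
          limit = THRESHOLDS[channel]
          return [(i, v) for i, v in enumerate(samples) if abs(v) > limit]

      for index, value in screen("accel_deck_03", [0.02, 0.04, 0.31, 0.05]):
          print(f"accel_deck_03 sample {index}: {value} exceeds threshold")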

  17. Development and Feasibility Testing of a Critical Care EEG Monitoring Database for Standardized Clinical Reporting and Multicenter Collaborative Research.

    PubMed

    Lee, Jong Woo; LaRoche, Suzette; Choi, Hyunmi; Rodriguez Ruiz, Andres A; Fertig, Evan; Politsky, Jeffrey M; Herman, Susan T; Loddenkemper, Tobias; Sansevere, Arnold J; Korb, Pearce J; Abend, Nicholas S; Goldstein, Joshua L; Sinha, Saurabh R; Dombrowski, Keith E; Ritzl, Eva K; Westover, Michael B; Gavvala, Jay R; Gerard, Elizabeth E; Schmitt, Sarah E; Szaflarski, Jerzy P; Ding, Kan; Haas, Kevin F; Buchsbaum, Richard; Hirsch, Lawrence J; Wusthoff, Courtney J; Hopp, Jennifer L; Hahn, Cecil D

    2016-04-01

    The rapid expansion of the use of continuous critical care electroencephalogram (cEEG) monitoring and resulting multicenter research studies through the Critical Care EEG Monitoring Research Consortium has created the need for a collaborative data sharing mechanism and repository. The authors describe the development of a research database incorporating the American Clinical Neurophysiology Society standardized terminology for critical care EEG monitoring. The database includes flexible report generation tools that allow for daily clinical use. Key clinical and research variables were incorporated into a Microsoft Access database. To assess its utility for multicenter research data collection, the authors performed a 21-center feasibility study in which each center entered data from 12 consecutive intensive care unit monitoring patients. To assess its utility as a clinical report generating tool, three large volume centers used it to generate daily clinical critical care EEG reports. A total of 280 subjects were enrolled in the multicenter feasibility study. The duration of recording (median, 25.5 hours) varied significantly between the centers. The incidence of seizure (17.6%), periodic/rhythmic discharges (35.7%), and interictal epileptiform discharges (11.8%) was similar to previous studies. The database was used as a clinical reporting tool by 3 centers that entered a total of 3,144 unique patients covering 6,665 recording days. The Critical Care EEG Monitoring Research Consortium database has been successfully developed and implemented with a dual role as a collaborative research platform and a clinical reporting tool. It is now available for public download to be used as a clinical data repository and report generating tool.

  18. A dedicated database system for handling multi-level data in systems biology.

    PubMed

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete, and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that addresses the vital issues in data management and thereby facilitates data integration, modeling, and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment operated by this database system. Two applications were implemented to demonstrate the extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts address two sample cases: 1) detecting the pheromone pathway in protein interaction networks; and 2) finding metabolic reactions regulated by the Snf1 kinase. In this study we present the design of a database system that offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure, and reliable transactions. The two sample cases on the integrated yeast data clearly demonstrate the value of a single database environment for systems biology research.
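
    Sample case 1 (detecting the pheromone pathway in a protein interaction network) amounts to a path search over interaction edges. The sketch below uses breadth-first search over a toy edge list that only gestures at the yeast pheromone pathway; it is not the repository's data or query machinery.

      # Find a shortest interaction path between two proteins by BFS.
      # The edge list is a toy stand-in for stored interaction data.
      from collections import deque

      edges = [("STE2", "STE4"), ("STE4", "STE5"), ("STE5", "STE11"),
               ("STE11", "STE7"), ("STE7", "FUS3")]
      graph = {}
      for a, b in edges:
          graph.setdefault(a, set()).add(b)
          graph.setdefault(b, set()).add(a)

      def find_path(start, goal):
          queue, seen = deque([[start]]), {start}
          while queue:
              path = queue.popleft()
              if path[-1] == goal:
                  return path
              for nxt in graph.get(path[-1], set()) - seen:
                  seen.add(nxt)
                  queue.append(path + [nxt])
          return None

      print(find_path("STE2", "FUS3"))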

  19. IMGMD: A platform for the integration and standardisation of In silico Microbial Genome-scale Metabolic Models.

    PubMed

    Ye, Chao; Xu, Nan; Dong, Chuan; Ye, Yuannong; Zou, Xuan; Chen, Xiulai; Guo, Fengbiao; Liu, Liming

    2017-04-07

    Genome-scale metabolic models (GSMMs) constitute a platform that combines genome sequences and detailed biochemical information to quantify microbial physiology at the system level. To improve the unity, integrity, correctness, and format of data in published GSMMs, the consensus IMGMD database was built on a LAMP (Linux + Apache + MySQL + PHP) stack by integrating and standardizing 328 GSMMs constructed for 139 microorganisms. The IMGMD database can help microbial researchers download manually curated GSMMs, rapidly reconstruct standard GSMMs, design pathways, and identify metabolic targets for strain-improvement strategies. Moreover, the IMGMD database facilitates the integration of wet-lab and in silico data to gain additional insight into microbial physiology. The IMGMD database is freely available, without any registration requirements, at http://imgmd.jiangnan.edu.cn/database.

  20. Telematics and smart cards in integrated health information system.

    PubMed

    Sicurello, F; Nicolosi, A

    1997-01-01

    Telematics and information technology are the base on which it will be possible to build an integrated health information system to support the population and improve its quality of life. This system should be based on the record linkage of all data arising from the interactions of patients with health structures, such as general practitioners, specialists, health institutes and hospitals, pharmacies, etc. Record linkage can provide the connection and integration of these various records thanks to the use of telematic technology (local or wide-area networks, such as the Internet) and electronic data cards. Particular emphasis should be placed on the introduction of smart cards, such as portable health cards, which will contain a standardized data set sufficient to access the different databases found in the various health services. The interoperability of the social-health records (including multimedia types) and of the smart cards, which is one of the most important prerequisites for the homogenization and wide diffusion of these cards at a European level, should be strongly taken into consideration. In this framework, a project is being developed aiming at the integration of various territorially distributed databases, from the reading software and the updating of the smart cards to the complete management of patients' evaluation records, the quality of the services offered, and health planning. The applications developed will support epidemiological investigation software and data analysis. The interconnection of all the databases of the various structures involved will take place through a coordination center, the most important system of which we will call "record linkage" or "integrated database". Smart cards will be distributed to a sample group of possible users, and the necessary smart card management tools will be installed in all the structures involved. All the final users (the patients) in the whole network of services involved will be monitored for the duration of the project. The system users will also include general practitioners, social workers, physicians, health operators, pharmacists, laboratory workers, and administrative personnel of the municipality and of the health structures concerned.

  1. E-MSD: an integrated data resource for bioinformatics

    PubMed Central

    Velankar, S.; McNeil, P.; Mittard-Runte, V.; Suarez, A.; Barrell, D.; Apweiler, R.; Henrick, K.

    2005-01-01

    The Macromolecular Structure Database (MSD) group (http://www.ebi.ac.uk/msd/) continues to enhance the quality and consistency of macromolecular structure data in the worldwide Protein Data Bank (wwPDB) and to work towards the integration of various bioinformatics data resources. One of the major obstacles to the improved integration of structural databases such as MSD and sequence databases like UniProt is the absence of up-to-date and well-maintained mapping between corresponding entries. We have worked closely with the UniProt group at the EBI to clean up the taxonomy and sequence cross-reference information in the MSD and UniProt databases. This information is vital for the reliable integration of the sequence family databases such as Pfam and Interpro with the structure-oriented databases of SCOP and CATH. This information has been made available to the eFamily group (http://www.efamily.org.uk/) and now forms the basis of the regular interchange of information between the member databases (MSD, UniProt, Pfam, Interpro, SCOP and CATH). This exchange of annotation information has enriched the structural information in the MSD database with annotation from wider sequence-oriented resources. This work was carried out under the ‘Structure Integration with Function, Taxonomy and Sequences (SIFTS)’ initiative (http://www.ebi.ac.uk/msd-srv/docs/sifts) in the MSD group. PMID:15608192

  2. Integrated G and C Implementation within IDOS: A Simulink Based Reusable Launch Vehicle Simulation

    NASA Technical Reports Server (NTRS)

    Fisher, Joseph E.; Bevacqua, Tim; Lawrence, Douglas A.; Zhu, J. Jim; Mahoney, Michael

    2003-01-01

    The implementation of multiple Integrated Guidance and Control (IG&C) algorithms per flight phase within a vehicle simulation poses a daunting task to coordinate algorithm interactions with the other G&C components and with vehicle subsystems. Currently being developed by Universal Space Lines LLC (USL) under contract from NASA, the Integrated Development and Operations System (IDOS) contains a high-fidelity Simulink vehicle simulation, which provides a means to test cutting-edge G&C technologies. Combining the modularity of this vehicle simulation with Simulink's built-in primitive blocks provides a quick way to implement algorithms. To add discrete-event functionality to the unfinished IDOS simulation, Vehicle Event Manager (VEM) and Integrated Vehicle Health Monitoring (IVHM) subsystems were created to provide discrete-event and pseudo-health monitoring processing capabilities. Matlab's Stateflow is used to create the IVHM and Event Manager subsystems and to implement a supervisory logic controller, referred to as the Auto-commander, as part of the IG&C to coordinate the control system adaptation and reconfiguration and to select the control and guidance algorithms for a given flight phase. Manual creation of the Stateflow charts for all of these subsystems is a tedious and time-consuming process. The Stateflow Auto-builder was developed as a Matlab-based software tool for the automatic generation of a Stateflow chart from information contained in a database. This paper describes the IG&C, VEM and IVHM implementations in IDOS. In addition, this paper describes the Stateflow Auto-builder.
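
    The Auto-builder concept, generating a state chart from records held in a database, can be mimicked in miniature: read transition rows, build a transition map, and drive it with events. The flight phases and events below are invented, and Python stands in for Matlab/Stateflow.

      # Build a toy state machine from database-style transition records,
      # then step it through a sequence of events. Names are hypothetical.
      transition_table = [
          # (state, event, next_state)
          ("ASCENT", "MECO", "COAST"),
          ("COAST", "REENTRY_START", "REENTRY"),
          ("REENTRY", "TAEM_START", "TAEM"),
      ]

      def build_machine(rows):
          machine = {}
          for state, event, next_state in rows:
              machine.setdefault(state, {})[event] = next_state
          return machine

      machine = build_machine(transition_table)
      state = "ASCENT"
      for event in ["MECO", "REENTRY_START"]:
          state = machine.get(state, {}).get(event, state)
          print(event, "->", state)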

  3. A sensor simulation framework for the testing and evaluation of external hazard monitors and integrated alerting and notification functions

    NASA Astrophysics Data System (ADS)

    Uijt de Haag, Maarten; Venable, Kyle; Bezawada, Rajesh; Adami, Tony; Vadlamani, Ananth K.

    2009-05-01

    This paper discusses a sensor simulator/synthesizer framework that can be used to test and evaluate various sensor integration strategies for the implementation of an External Hazard Monitor (EHM) and Integrated Alerting and Notification (IAN) function as part of NASA's Integrated Intelligent Flight Deck (IIFD) project. The IIFD project under the NASA's Aviation Safety program "pursues technologies related to the flight deck that ensure crew workload and situational awareness are both safely optimized and adapted to the future operational environment as envisioned by NextGen." Within the simulation framework, various inputs to the IIFD and its subsystems, the EHM and IAN, are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. Sensors and avionics included in this framework are TCAS, ADS-B, Forward-Looking Infrared, Vision cameras, GPS, Inertial navigators, EGPWS, Laser Detection and Ranging sensors, altimeters, communication links with ATC, and weather radar. The framework is implemented in Simulink, a modeling language developed by The Mathworks. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft. Specifically, this paper addresses the architecture of the simulator, the sensor model interfaces, the timing and database (environment) aspects of the sensor models, the user interface of the modeling environment, and the various avionics implementations.

  4. WWW.NMDB.EU: The real-time Neutron Monitor database

    NASA Astrophysics Data System (ADS)

    Klein, Karl-Ludwig; Steigies, Christian T.; NMDB Consortium

    2010-05-01

    The Real-time database for high-resolution neutron monitor measurements (NMDB), which was supported by the 7th Framework Programme of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. It offers a variety of applications, ranging from the representation and retrieval of cosmic ray data, through solar energetic particle alerts, to the calculation of ionisation doses in the atmosphere and radiation dose rates at aircraft altitudes. Furthermore, the website comprises public outreach pages in several languages and offers training material on cosmic rays for university students and for researchers and engineers who want to become familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the provided services and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.

  5. A Web-based tool for UV irradiance data: predictions for European and Southeast Asian sites.

    PubMed

    Kift, Richard; Webb, Ann R; Page, John; Rimmer, John; Janjai, Serm

    2006-01-01

    There are a range of UV models available, but significant pre-existing knowledge and experience are needed in order to use them. In this article, a comparatively simple Web-based model developed for the SoDa (Integration and Exploitation of Networked Solar Radiation Databases for Environment Monitoring) project is presented. This is a clear-sky model with modifications for cloud effects. To determine whether the model produces realistic UV data, the output is compared with one-year sets of hourly measurements at sites in the United Kingdom and Thailand. The accuracy of the output depends on the input, but reasonable results were obtained using the default database inputs, and these improved when pyranometer data instead of modeled data provided the global radiation input needed to estimate the UV. The average modeled values of UV for the UK site were found to be within 10% of measurements. For the tropical sites in Thailand, the average modeled values were within 11-20% of measurements for the four sites when the default SoDa database values were used. These results improved when pyranometer data and TOMS ozone data from 2002 replaced the standard SoDa database values, reducing the error range for all four sites to less than 15%.

  6. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    NASA Technical Reports Server (NTRS)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many users. It quantifies the model inputs through a Level of Evidence (LOE) ranking, based on the highest value of the data, and a Quality of Evidence (QOE) score that provides an assessment of the evidence base for each medical condition. The IMM evidence base has already been able to provide invaluable information for designers, and for other uses.

  7. Applications of TRMM-based Multi-Satellite Precipitation Estimation for Global Runoff Simulation: Prototyping a Global Flood Monitoring System

    NASA Technical Reports Server (NTRS)

    Hong, Yang; Adler, Robert F.; Huffman, George J.; Pierce, Harold

    2008-01-01

    Advances in flood monitoring/forecasting have been constrained by the difficulty in estimating rainfall continuously over space (catchment-, national-, continental-, or even global-scale areas) and at flood-relevant time scales. With the recent availability of satellite rainfall estimates at fine time and space resolution, this paper describes a prototype research framework for global flood monitoring by combining real-time satellite observations with a database of global terrestrial characteristics through a hydrologically relevant modeling scheme. Four major components included in the framework are (1) real-time precipitation input from the NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA); (2) a central geospatial database to preprocess the land surface characteristics: water divides, slopes, soils, land use, flow directions, flow accumulation, drainage network, etc.; (3) a modified distributed hydrological model to convert rainfall to runoff and route the flow through the stream network in order to predict the timing and severity of the flood wave; and (4) an open-access web interface to quickly disseminate flood alerts for potential decision-making. Retrospective simulations for 1998-2006 demonstrate that the Global Flood Monitor (GFM) system performs consistently at both station and catchment levels. The GFM website (experimental version) has been running in near real-time in an effort to offer a cost-effective solution to the ultimate challenge of building natural disaster early warning systems for the data-sparse regions of the world. The interactive GFM website shows close-up maps of the flood risks overlaid on topography/population or integrated with the Google-Earth visualization tool. One additional capability, which extends forecast lead-time by assimilating QPF into the GFM, also will be implemented in the future.
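
    Component (3), converting rainfall to runoff and routing it through the stream network, can be caricatured in a few lines. The sketch assumes a runoff coefficient per cell and a single downstream pass; the network, coefficients, and rainfall are invented, and the real distributed model is far richer (infiltration, travel times, routing physics).

      # Toy rainfall-to-runoff conversion and downstream accumulation
      # along flow directions. All values are hypothetical.
      downstream = {"A": "C", "B": "C", "C": "D"}  # cell -> next cell; D is outlet
      runoff_coeff = {"A": 0.3, "B": 0.5, "C": 0.4, "D": 0.4}
      rain_mm = {"A": 20.0, "B": 35.0, "C": 10.0, "D": 0.0}

      flow = {cell: runoff_coeff[cell] * rain_mm[cell] for cell in rain_mm}
      # Single topological pass: push each cell's flow to its neighbour,
      # ordered upstream-to-downstream.
      for cell in ["A", "B", "C"]:
          flow[downstream[cell]] += flow[cell]

      print(f"flow at outlet D: {flow['D']:.1f} mm-equivalent")  # 27.5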

  8. Success of HIS DICOM interfaces in the integration of the healthcare enterprise at the Department of Veterans Affairs

    NASA Astrophysics Data System (ADS)

    Kuzmak, Peter M.; Dayhoff, Ruth E.

    1999-07-01

    The US Department of Veterans Affairs (VA) is integrating imaging into the healthcare enterprise using the Digital Imaging and Communications in Medicine (DICOM) standard protocols. Image management is directly integrated into the VistA Hospital Information System (HIS) software and the clinical database. Radiology images are acquired via DICOM and are stored directly in the HIS database. Images can be displayed on low-cost clinicians' workstations throughout the medical center. High-resolution, diagnostic-quality multi-monitor VistA workstations with specialized viewing software can be used for reading radiology images. Two approaches are used to acquire and handle images within the radiology department. Some sites have a commercial Picture Archiving and Communications System (PACS) interfaced to the VistA HIS, while other sites use the direct image acquisition and integrated diagnostic reading capabilities of VistA itself. A small set of DICOM services has been implemented by VistA to allow patient and study text data to be transmitted to image-producing modalities and the commercial PACS, and to enable images and study data to be transferred back. The VistA DICOM capabilities are now used to interface seven different commercial PACS products and over twenty different radiology modalities. The communications capabilities of DICOM and the VA wide area network are being used to support reading of radiology images from remote sites. DICOM has been the cornerstone of the ability to integrate imaging functionality into the healthcare enterprise. Because of its openness, it allows the integration of system components from commercial and non-commercial sources to work together to provide functional, cost-effective solutions. As DICOM expands to non-radiology devices, integration must occur with the specialty information subsystems that handle orders and reports, their associated DICOM image capture systems, and the computer-based patient record. The model and concepts of the DICOM standard can be extended to these other areas, but some adjustments may be required.

  9. Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Young, Steven D.

    2005-01-01

    In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low-visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically-derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.
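
    This is not the SHADE algorithm itself, but a minimal sketch in the same spirit: compare a sensed terrain profile against the stored database profile and raise an integrity flag when a disparity statistic exceeds a bound. The elevations and the threshold are invented numbers.

      # Consistency check between sensed elevations and the stored terrain
      # model: flag the database when the mean absolute disparity exceeds
      # an assumed integrity bound.
      sensed_m = [412.0, 415.5, 421.0, 430.2, 444.8]   # e.g. sensor-derived
      database_m = [411.2, 416.0, 420.1, 431.0, 446.1]  # stored terrain model

      def mean_abs_disparity(a, b):
          return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

      THRESHOLD_M = 5.0  # assumed bound, not a certified value
      stat = mean_abs_disparity(sensed_m, database_m)
      print(f"test statistic = {stat:.2f} m; integrity flag = {stat > THRESHOLD_M}")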

  10. Diagnostic grade wireless ECG monitoring.

    PubMed

    Garudadri, Harinath; Chi, Yuejie; Baker, Steve; Majumdar, Somdeb; Baheti, Pawan K; Ballard, Dan

    2011-01-01

    In remote monitoring of Electrocardiogram (ECG), it is very important to ensure that the diagnostic integrity of signals is not compromised by sensing artifacts and channel errors. It is also important for the sensors to be extremely power efficient to enable wearable form factors and long battery life. We present an application of Compressive Sensing (CS) as an error mitigation scheme at the application layer for wearable, wireless sensors in diagnostic grade remote monitoring of ECG. In our previous work, we described an approach to mitigate errors due to packet losses by projecting ECG data to a random space and recovering a faithful representation using sparse reconstruction methods. Our contributions in this work are twofold. First, we present an efficient hardware implementation of random projection at the sensor. Second, we validate the diagnostic integrity of the reconstructed ECG after packet loss mitigation. We validate our approach on MIT and AHA databases comprising more than 250,000 normal and abnormal beats using EC57 protocols adopted by the Food and Drug Administration (FDA). We show that sensitivity and positive predictivity of a state-of-the-art ECG arrhythmia classifier is essentially invariant under CS based packet loss mitigation for both normal and abnormal beats even at high packet loss rates. In contrast, the performance degrades significantly in the absence of any error mitigation scheme, particularly for abnormal beats such as Ventricular Ectopic Beats (VEB).
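
    The random-projection-plus-sparse-reconstruction idea can be demonstrated end to end on a synthetic sparse signal. The sketch below uses orthogonal matching pursuit as the recovery step; the sizes, sparsity, and choice of solver are assumptions for illustration, since a real ECG is only approximately sparse in a transform domain.

      # Compressive sensing sketch: random projection at the "sensor",
      # greedy sparse recovery at the "receiver".
      import numpy as np

      rng = np.random.default_rng(0)
      n, m, k = 128, 48, 4                 # signal length, measurements, sparsity
      x = np.zeros(n)
      x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

      Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random projection
      y = Phi @ x                                     # transmitted measurements

      def omp(Phi, y, k):
          """Orthogonal matching pursuit: pick the best-correlated atom k times."""
          residual, support = y.copy(), []
          for _ in range(k):
              support.append(int(np.argmax(np.abs(Phi.T @ residual))))
              coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
              residual = y - Phi[:, support] @ coeffs
          x_hat = np.zeros(Phi.shape[1])
          x_hat[support] = coeffs
          return x_hat

      x_hat = omp(Phi, y, k)
      print("max reconstruction error:", np.max(np.abs(x - x_hat)))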

  11. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    NASA Astrophysics Data System (ADS)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years as well as the advantages through a specific application. The presented case study illustrates how the high production slopes of a mine that exceed depths of 100-120 m were successfully mined with an average displacement rate of 10- 20 mm/day, approaching an almost slow to moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series data correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.

  12. BioWarehouse: a bioinformatics database warehouse toolkit

    PubMed Central

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David WJ; Tenenbaum, Jessica D; Karp, Peter D

    2006-01-01

    Background: This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results: We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion: BioWarehouse embodies significant progress on the database integration problem for bioinformatics. PMID:16556315
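
    Once loader tools have mapped the source databases into one relational schema, a single SQL statement can span entities that originated in different resources. A toy sketch with an in-memory SQLite database; the tables, columns, and rows are invented and do not reflect the actual BioWarehouse schema.

      # One query over a common schema holding entities loaded from
      # different source databases. Schema and data are hypothetical.
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.executescript("""
      CREATE TABLE protein (id INTEGER PRIMARY KEY, name TEXT, source_db TEXT);
      CREATE TABLE reaction (id INTEGER PRIMARY KEY, ec_number TEXT, protein_id INTEGER);
      INSERT INTO protein VALUES (1, 'AcnB', 'UniProt'), (2, 'SdhA', 'KEGG');
      INSERT INTO reaction VALUES (10, '4.2.1.3', 1), (11, '1.3.5.1', 2);
      """)

      rows = db.execute("""
          SELECT p.name, p.source_db, r.ec_number
          FROM protein p JOIN reaction r ON r.protein_id = p.id
          ORDER BY p.name
      """).fetchall()
      for name, source, ec in rows:
          print(f"{name} ({source}): EC {ec}")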

  13. BioWarehouse: a bioinformatics database warehouse toolkit.

    PubMed

    Lee, Thomas J; Pouliot, Yannick; Wagner, Valerie; Gupta, Priyanka; Stringer-Calvert, David W J; Tenenbaum, Jessica D; Karp, Peter D

    2006-03-23

    This article addresses the problem of interoperation of heterogeneous bioinformatics databases. We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL) but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and JAVA languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. BioWarehouse embodies significant progress on the database integration problem for bioinformatics.

  14. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

    We present an approach and methodology for developing Government-to-Government (G2G) Systems of Technology Database Systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams, and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing, and modifying components of technology database systems. G2G will interoperate information and knowledge distributed across the organizational entities involved, which is ideal for NASA's future Exploration Enterprise. Key contributions of the G2G system will include the creation of an integrated approach that sustains effective management of technology investments while allowing the various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective Systems of Technology Database systems.

  15. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    PubMed

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km²). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.

  16. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    USGS Publications Warehouse

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lotting, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.

  17. "Spice," "kryptonite," "black mamba": an overview of brand names and marketing strategies of novel psychoactive substances on the web.

    PubMed

    Corazza, Ornella; Valeriani, Giuseppe; Bersani, Francesco Saverio; Corkery, John; Martinotti, Giovanni; Bersani, Giuseppe; Schifano, Fabrizio

    2014-01-01

    Introduction: Novel Psychoactive Substances (NPSs) are often sold online as "legal" and "safer" alternatives to International Controlled Drugs (ICDs) with captivating marketing strategies. Our aim was to review and summarize such strategies in terms of the appearance of the products, the brand names, and the latest trends in the illicit online marketplaces. Scientific data were searched in PsychInfo and Pubmed databases; results were integrated with an extensive monitoring of Internet (websites, online shops, chat rooms, fora, social networks) and media sources in nine languages (English, French, Farsi, Portuguese, Arabic, Russian, Spanish, and Chinese simplified/traditional) available from secure databases of the Global Public Health Intelligence Network. Evolving strategies for the online diffusion and the retail of NPSs have been identified, including discounts and periodic offers on chosen products. Advertisements and new brand names have been designed to attract customers, especially young people. An increased number of retailers have been recorded as well as new Web platforms and privacy systems. NPSs represent an unprecedented challenge in the field of public health with social, cultural, legal, and political implications. Web monitoring activities are essential for mapping the diffusion of NPSs and for supporting innovative Web-based prevention programmes.

  18. The Danish Schizophrenia Registry

    PubMed Central

    Baandrup, Lone; Cerqueira, Charlotte; Haller, Lea; Korshøj, Lene; Voldsgaard, Inge; Nordentoft, Merete

    2016-01-01

    Aim of database: To systematically monitor and improve the quality of treatment and care of patients with schizophrenia in Denmark. In addition, the database is accessible as a resource for research. Study population: Patients diagnosed with schizophrenia and receiving mental health care in psychiatric hospitals or outpatient clinics. During the first year after the diagnosis, patients are classified as incident patients, and after this period as prevalent patients. Main variables: The registry currently contains 21 clinical quality measures in relation to the following domains: diagnostic evaluation, antipsychotic treatment including adverse reactions, cardiovascular risk factors including laboratory values, family intervention, psychoeducation, postdischarge mental health care, assessment of suicide risk in relation to discharge, and assessment of global functioning. Descriptive data: The recorded data are available electronically for the reporting clinicians and responsible administrative personnel, and they are updated monthly. The registry publishes the national and regional results of all included quality measures in the annual audit reports. External researchers may obtain access to the data for use in specific research projects by applying to the steering committee. Conclusion: The Danish Schizophrenia Registry represents a valuable source of informative data for monitoring and improving the quality of care of patients with schizophrenia in Denmark. However, continuous resources and devoted time are necessary to maintain the integrity of the registry and the validity of the data. PMID:27843348

  19. Improved earthquake monitoring in the central and eastern United States in support of seismic assessments for critical facilities

    USGS Publications Warehouse

    Leith, William S.; Benz, Harley M.; Herrmann, Robert B.

    2011-01-01

    Evaluation of seismic monitoring capabilities in the central and eastern United States for critical facilities - including nuclear power plants - focused on specific improvements to better understand the seismic hazards in the region. The report is not an assessment of seismic safety at nuclear plants. To accomplish the evaluation and to provide suggestions for improvements using funding from the American Recovery and Reinvestment Act of 2009, the U.S. Geological Survey examined the addition of new strong-motion seismic stations in areas of seismic activity and of new seismic stations near nuclear power-plant locations, along with the integration of data from the Transportable Array of some 400 mobile seismic stations. Some 38 and 68 stations, respectively, were suggested for addition in active seismic zones and at near-power-plant locations. Expansion of databases for strong-motion and other earthquake source-characterization data was also evaluated. Recognizing the pragmatic limitations of station deployment, augmentation of existing deployments improves source characterization by quantifying near-source attenuation in regions where larger earthquakes are expected. That augmentation also supports systematic data collection from existing networks. The report further applies modeling procedures and processing algorithms, together with the additional stations and the improved seismic databases, to leverage the capabilities of existing and expanded seismic arrays.

  20. Monitoring of services with non-relational databases and map-reduce framework

    NASA Astrophysics Data System (ADS)

    Babik, M.; Souto, F.

    2012-12-01

    Service Availability Monitoring (SAM) is a well-established monitoring framework that performs regular measurements of the core site services and reports the corresponding availability and reliability of the Worldwide LHC Computing Grid (WLCG) infrastructure. One of the existing extensions of SAM is Site Wide Area Testing (SWAT), which gathers monitoring information from the worker nodes via instrumented jobs. This generates quite a lot of monitoring data to process, as there are several data points for every job and several million jobs are executed every day. The recent uptake of non-relational databases opens a new paradigm in the large-scale storage and distributed processing of systems with heavy read-write workloads. For SAM this brings new possibilities to improve its model, from performing aggregation of measurements to storing raw data and subsequent re-processing. Both SAM and SWAT are currently tuned to run at top performance, reaching some of the limits in storage and processing power of their existing Oracle relational database. We investigated the usability and performance of non-relational storage together with its distributed data processing capabilities. For this, several popular systems have been compared. In this contribution we describe our investigation of the existing non-relational databases suited for monitoring systems covering Cassandra, HBase and MongoDB. Further, we present our experiences in data modeling and prototyping map-reduce algorithms focusing on the extension of the already existing availability and reliability computations. Finally, possible future directions in this area are discussed, analyzing the current deficiencies of the existing Grid monitoring systems and proposing solutions to leverage the benefits of the non-relational databases to get more scalable and flexible frameworks.
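
    The map-reduce reprocessing mentioned above can be pictured with the availability computation itself: map each test result to a (service, ok/total) pair and reduce per service. The record layout and service names are assumptions; a real deployment would run this over HBase/Cassandra/MongoDB-backed data rather than an in-memory list.

      # Minimal map-reduce-shaped availability computation over raw test
      # results. All records are invented for illustration.
      from collections import defaultdict

      results = [("CE-siteA", "ok"), ("CE-siteA", "ok"), ("CE-siteA", "critical"),
                 ("SRM-siteB", "ok"), ("SRM-siteB", "ok")]

      def mapper(record):
          service, status = record
          return service, (1 if status == "ok" else 0, 1)

      def reducer(pairs):
          totals = defaultdict(lambda: [0, 0])
          for service, (ok, count) in pairs:
              totals[service][0] += ok
              totals[service][1] += count
          return {service: ok / n for service, (ok, n) in totals.items()}

      print(reducer(map(mapper, results)))  # CE-siteA: 0.67, SRM-siteB: 1.0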

  1. Work Coordination Engine

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    The Work Coordination Engine (WCE) is a Java application integrated into the Service Management Database (SMDB), which coordinates the dispatching and monitoring of a work order system. WCE de-queues work orders from SMDB and orchestrates the dispatching of work to a registered set of software worker applications distributed over a set of local, or remote, heterogeneous computing systems. WCE monitors the execution of work orders once dispatched, and accepts the results of the work order by storing to the SMDB persistent store. The software leverages the use of a relational database, Java Messaging System (JMS), and Web Services using Simple Object Access Protocol (SOAP) technologies to implement an efficient work-order dispatching mechanism capable of coordinating the work of multiple computer servers on various platforms working concurrently on different, or similar, types of data or algorithmic processing. Existing (legacy) applications can be wrapped with a proxy object so that no changes to the application are needed to make them available for integration into the work order system as "workers." WCE automatically reschedules work orders that fail to be executed by one server to a different server if available. From initiation to completion, the system manages the execution state of work orders and workers via a well-defined set of events, states, and actions. It allows for configurable work-order execution timeouts by work-order type. This innovation eliminates a current processing bottleneck by providing a highly scalable, distributed work-order system used to quickly generate products needed by the Deep Space Network (DSN) to support space flight operations. WCE is driven by asynchronous messages delivered via JMS indicating the availability of new work or workers. It runs completely unattended in support of the lights-out operations concept in the DSN.
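
    A condensed, hypothetical model of the dispatch-and-reschedule behaviour described above: work orders are dequeued, offered to registered workers, and re-dispatched to another worker on failure. Plain Python stands in for the JMS/SOAP machinery, and the worker names and orders are invented.

      # Toy work-order dispatcher with automatic rescheduling on failure.
      from collections import deque

      workers = {"srv1": lambda order: order != "bad-order",  # returns success flag
                 "srv2": lambda order: True}

      def dispatch(work_orders):
          queue = deque(work_orders)
          while queue:
              order = queue.popleft()
              for name, run in workers.items():
                  if run(order):
                      print(f"{order}: completed on {name}")
                      break
                  print(f"{order}: failed on {name}, rescheduling")
              else:
                  print(f"{order}: exhausted all workers")

      dispatch(["orbit-products", "bad-order"])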

  2. Integrating Remote Sensing and Disease Surveillance to Forecast Malaria Epidemics

    NASA Astrophysics Data System (ADS)

    Wimberly, M. C.; Beyane, B.; DeVos, M.; Liu, Y.; Merkord, C. L.; Mihretie, A.

    2015-12-01

    Advance information about the timing and locations of malaria epidemics can facilitate the targeting of resources for prevention and emergency response. Early detection methods can detect incipient outbreaks by identifying deviations from expected seasonal patterns, whereas early warning approaches typically forecast future malaria risk based on lagged responses to meteorological factors. A critical limiting factor for implementing either of these approaches is the need for timely and consistent acquisition, processing and analysis of both environmental and epidemiological data. To address this need, we have developed EPIDEMIA - an integrated system for surveillance and forecasting of malaria epidemics. The EPIDEMIA system includes a public health interface for uploading and querying weekly surveillance reports as well as algorithms for automatically validating incoming data and updating the epidemiological surveillance database. The newly released EASTWeb 2.0 software application automatically downloads, processes, and summarizes remotely-sensed environmental data from multiple earth science data archives. EASTWeb was implemented as a component of the EPIDEMIA system, which combines the environmental monitoring data and epidemiological surveillance data into a unified database that supports both early detection and early warning models. Dynamic linear models implemented with Kalman filtering were used to carry out forecasting and model updating. Preliminary forecasts have been disseminated to public health partners in the Amhara Region of Ethiopia and will be validated and refined as the EPIDEMIA system ingests new data. In addition to continued model development and testing, future work will involve updating the public health interface to provide a broader suite of outbreak alerts and data visualization tools that are useful to our public health partners. The EPIDEMIA system demonstrates a feasible approach to synthesizing the information from epidemiological surveillance systems and remotely-sensed environmental monitoring systems to improve malaria epidemic detection and forecasting.
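
    The forecasting core, a dynamic linear model updated by the Kalman filter, fits in a dozen lines for the simplest local-level case. The variances and the toy weekly series below are assumptions; the operational EPIDEMIA models are richer (seasonality and lagged environmental covariates).

      # Local-level DLM with Kalman filtering: one-step-ahead forecast
      # followed by an update with each new weekly count. All numbers
      # are invented for illustration.
      cases = [14, 18, 15, 22, 30, 41]   # weekly malaria case counts (toy)
      W, V = 4.0, 25.0                   # assumed state and observation variances

      m, C = cases[0], 10.0              # initial level estimate and its variance
      for y in cases[1:]:
          a, R = m, C + W                # forecast step: prior for next level
          f, Q = a, R + V                # one-step-ahead predictive mean/variance
          K = R / Q                      # Kalman gain
          m, C = a + K * (y - f), (1 - K) * R   # update with the new observation
          print(f"obs={y:3d}  forecast={f:6.2f}  updated level={m:6.2f}")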

  3. Integrative dynamic therapy for bulimia nervosa: An evidence-based case study.

    PubMed

    Richards, Lauren K; Shingleton, Rebecca M; Goldman, Rachel; Siegel, Deborah; Thompson-Brenner, Heather

    2016-06-01

    Both cognitive-behavioral therapy (CBT) and psychodynamic psychotherapy are commonly used to treat eating disorders. To further investigate the effectiveness of integrative dynamic therapy (IDT) for bulimia nervosa (BN), our research group undertook a randomized, controlled pilot study comparing IDT with CBT for BN. The case described here was selected from a sample of N = 38 female patients with the symptoms of BN who enrolled in the study. IDT incorporated aspects of the first 4-week stage of CBT, including psychoeducation, self-monitoring, and regular eating. Subsequently, the treatment focused on emotional expression, emotion regulation (defenses), intrapsychic conflict, and interpersonal relationships. The objectives of the report are to demonstrate the effectiveness of an integrative approach to the treatment of eating disorders to address the symptoms of BN and personality issues using pre-, mid-, and posttreatment data, and to illustrate the patient and clinician reactions to each approach to treatment using excerpts from session transcripts and alliance data. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. PGSB/MIPS PlantsDB Database Framework for the Integration and Analysis of Plant Genome Data.

    PubMed

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai; Gundlach, Heidrun; Mayer, Klaus F X

    2017-01-01

    Plant Genome and Systems Biology (PGSB), formerly Munich Institute for Protein Sequences (MIPS) PlantsDB, is a database framework for the integration and analysis of plant genome data, developed and maintained for more than a decade now. Major components of that framework are genome databases and analysis resources focusing on individual (reference) genomes providing flexible and intuitive access to data. Another main focus is the integration of genomes from both model and crop plants to form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny). Data exchange and integrated search functionality with/over many plant genome databases is provided within the transPLANT project.

  5. Consistent Query Answering of Conjunctive Queries under Primary Key Constraints

    ERIC Educational Resources Information Center

    Pema, Enela

    2014-01-01

    An inconsistent database is a database that violates one or more of its integrity constraints. In reality, violations of integrity constraints arise frequently under several different circumstances. Inconsistent databases have long posed the challenge to develop suitable tools for meaningful query answering. A principled approach for querying…
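    To make the notion concrete: under a primary-key constraint, a repair keeps exactly one tuple per key, and a value is a consistent answer only if every repair returns it. A minimal sketch of that check over a hypothetical two-column relation follows; the relation and query are invented for illustration.

```java
import java.util.*;

/** Consistent answers to "SELECT value WHERE key = k" under a primary-key
 *  constraint: the answer exists only if all tuples sharing k agree. */
public class ConsistentAnswers {
    public static void main(String[] args) {
        // Hypothetical inconsistent relation emp(id -> city); id is the key.
        String[][] emp = {{"1", "Reno"}, {"1", "Reno"}, {"2", "Dayton"}, {"2", "Akron"}};

        Map<String, Set<String>> byKey = new HashMap<>();
        for (String[] t : emp)
            byKey.computeIfAbsent(t[0], k -> new HashSet<>()).add(t[1]);

        for (Map.Entry<String, Set<String>> e : byKey.entrySet()) {
            // One distinct value: every repair agrees, so it is a consistent answer.
            String answer = e.getValue().size() == 1
                    ? e.getValue().iterator().next() : "(no consistent answer)";
            System.out.println("id=" + e.getKey() + " -> " + answer);
        }
    }
}
```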

  6. Enhancing Knowledge Integration: An Information System Capstone Project

    ERIC Educational Resources Information Center

    Steiger, David M.

    2009-01-01

    This database project focuses on learning through knowledge integration; i.e., sharing and applying specialized (database) knowledge within a group, and combining it with other business knowledge to create new knowledge. Specifically, the Tiny Tots, Inc. project described below requires students to design, build, and instantiate a database system…

  7. Women's experiences of continuous fetal monitoring - a mixed-methods systematic review.

    PubMed

    Crawford, Alexandra; Hayes, Dexter; Johnstone, Edward D; Heazell, Alexander E P

    2017-12-01

    Antepartum stillbirth is often preceded by detectable signs of fetal compromise, including changes in fetal heart rate and movement. It is hypothesized that continuous fetal monitoring could detect these signs more accurately and objectively than current forms of fetal monitoring and allow for timely intervention. This systematic review aimed to explore available evidence on women's experiences of continuous fetal monitoring to investigate its acceptability before clinical implementation and to inform clinical studies. Systematic searching of four electronic databases (Embase, PsycINFO, MEDLINE and CINAHL), using key terms defined by initial scoping searches, identified a total of 35 studies. Following title and abstract screening by two independent researchers, five studies met the inclusion criteria. Studies were not excluded based on language, methodology or quality assessment. An integrative methodology was used to synthesize qualitative and quantitative data together. Forms of continuous fetal monitoring used included Monica AN24 monitors (n = 4) and phonocardiography (n = 1). Four main themes were identified: practical limitations of the device, negative emotions, positive perceptions, and device implementation. Continuous fetal monitoring was reported to have high levels of participant satisfaction and was preferred by women to intermittent cardiotocography. This review suggests that continuous fetal monitoring is accepted by women. However, it has also highlighted both the paucity and heterogeneity of current studies and suggests that further research should be conducted into women's experiences of continuous fetal monitoring before such devices can be used clinically. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.

  8. A Recommender System in the Cyber Defense Domain

    DTIC Science & Technology

    2014-03-27

The host monitoring software is a Java-based program that sends updates to the database on the sensor machine and gathers information about the host. A MySQL database located on the sensor machine acts as the storage for the sensors on the network, holding Snort, Nmap, and vulnerability-score data. The machine with the IDS and the recommender is labeled "sensor"; the recommender system code is written in Java and compiled using Java version 1.6.0_24.

  9. Environment/Health/Safety (EHS): Databases

    Science.gov Websites

Databases listed include: the Hazard Documents Database; the Biosafety Authorization System; CATS (Corrective Action Tracking System, for findings 12/2005 to present); the Chemical Management System; Electrical Safety; the Ergonomics Database; Lessons Learned / Best Practices; REMS (Radiation Exposure Monitoring System); and the SJHA (Subcontractor Job Hazard Analysis) Database.

  10. Optimum-AIV: A planning and scheduling system for spacecraft AIV

    NASA Technical Reports Server (NTRS)

    Arentoft, M. M.; Fuchs, Jens J.; Parrod, Y.; Gasquet, Andre; Stader, J.; Stokes, I.; Vadon, H.

    1991-01-01

A project undertaken for the European Space Agency (ESA) is presented. The project is developing a knowledge-based software system for planning and scheduling of activities for spacecraft assembly, integration, and verification (AIV). The system extends into the monitoring of plan execution and the plan repair phase. The objectives are to develop an operational kernel of a planning, scheduling, and plan repair tool, called OPTIMUM-AIV, and to provide facilities which will allow individual projects to customize the kernel to suit their specific needs. The kernel shall consist of a set of software functionalities for assistance in initial specification of the AIV plan, in verification and generation of valid plans and schedules for the AIV activities, and in interactive monitoring and execution problem recovery for the detailed AIV plans. Embedded in OPTIMUM-AIV are external interfaces which allow integration with alternative scheduling systems and project databases. The current status of the OPTIMUM-AIV project, as of Jan. 1991, is that further analysis of the AIV domain has taken place through interviews with satellite AIV experts, a software requirements document (SRD) for the full operational tool has been approved, and an architectural design document (ADD) for the kernel, excluding external interfaces, is ready for review.

  11. Combining GPS, GIS, and accelerometry: methodological issues in the assessment of location and intensity of travel behaviors.

    PubMed

    Oliver, Melody; Badland, Hannah; Mavoa, Suzanne; Duncan, Mitch J; Duncan, Scott

    2010-01-01

    Global positioning systems (GPS), geographic information systems (GIS), and accelerometers are powerful tools to explain activity within a built environment, yet little integration of these tools has taken place. This study aimed to assess the feasibility of combining GPS, GIS, and accelerometry to understand transport-related physical activity (TPA) in adults. Forty adults wore an accelerometer and portable GPS unit over 7 consecutive days and completed a demographics questionnaire and 7-day travel log. Accelerometer and GPS data were extracted for commutes to/from workplace and integrated into a GIS database. GIS maps were generated to visually explore physical activity intensity, GPS speeds and routes traveled. GPS, accelerometer, and survey data were collected for 37 participants. Loss of GPS data was substantial due to a range of methodological issues, such as low battery life, signal drop out, and participant noncompliance. Nonetheless, greater travel distances and significantly higher speeds were observed for motorized trips when compared with TPA. Pragmatic issues of using GPS monitoring to understand TPA behaviors and methodological recommendations for future research were identified. Although methodologically challenging, the combination of GPS monitoring, accelerometry and GIS technologies holds promise for understanding TPA within the built environment.
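    One recurring integration step in such studies is aligning accelerometer epochs with GPS fixes by timestamp. The sketch below pairs each epoch with the nearest fix inside a tolerance window; the coordinates, epoch times, and 30-second tolerance are illustrative assumptions, not values from the study.

```java
import java.util.*;

/** Pair accelerometer epochs with the nearest GPS fix within a tolerance. */
public class GpsAccelMatcher {
    public static void main(String[] args) {
        // GPS fixes: epoch seconds -> {lat, lon}; values are made up.
        TreeMap<Long, double[]> gps = new TreeMap<>();
        gps.put(1000L, new double[]{39.52, -119.81});
        gps.put(1060L, new double[]{39.53, -119.80});

        long[] accelEpochs = {1005L, 1900L};   // accelerometer epoch start times
        long toleranceSec = 30;

        for (long t : accelEpochs) {
            Map.Entry<Long, double[]> best = nearest(t, gps.floorEntry(t), gps.ceilingEntry(t));
            if (best != null && Math.abs(best.getKey() - t) <= toleranceSec)
                System.out.printf("epoch %d -> fix at %d%n", t, best.getKey());
            else
                System.out.printf("epoch %d -> no fix (GPS dropout)%n", t);
        }
    }

    private static Map.Entry<Long, double[]> nearest(long t,
            Map.Entry<Long, double[]> a, Map.Entry<Long, double[]> b) {
        if (a == null) return b;
        if (b == null) return a;
        return (t - a.getKey() <= b.getKey() - t) ? a : b;
    }
}
```

    Epochs with no fix inside the window fall out of the joined dataset, which is exactly how the signal dropout and battery issues noted above translate into data loss.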

  12. A Real-Time Construction Safety Monitoring System for Hazardous Gas Integrating Wireless Sensor Network and Building Information Modeling Technologies

    PubMed Central

    Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng

    2018-01-01

In recent years, many studies have focused on the application of advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model that provides visualization across the whole construction lifecycle. This paper integrates BIM and WSN into a unique system that enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an alarm and a ventilator on site start automatically to give warning and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications. PMID:29393887
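    The core alerting logic of such a system reduces to comparing sensor readings against per-gas thresholds and driving the BIM display and ventilator when a zone goes abnormal. Below is a minimal sketch of that check; the gas names, threshold values, and zone labels are illustrative assumptions, not the paper's calibration.

```java
import java.util.Map;

/** Threshold check for WSN gas readings; actuates alarm/ventilator per zone. */
public class GasAlertMonitor {
    // Illustrative alarm thresholds (ppm); real limits depend on site regulations.
    private static final Map<String, Double> LIMITS =
            Map.of("CO", 35.0, "H2S", 10.0, "CH4", 5000.0);

    public static void check(String zone, String gas, double ppm) {
        Double limit = LIMITS.get(gas);
        if (limit != null && ppm > limit) {
            System.out.printf("zone %s: %s=%.0f ppm exceeds %.0f -> alarm + ventilator%n",
                    zone, gas, ppm, limit);
            // Here the BIM model would color the zone and the ventilator would start.
        }
    }

    public static void main(String[] args) {
        check("B1-East", "CO", 12.0);   // normal reading
        check("B1-East", "CO", 58.0);   // abnormal: triggers the response
    }
}
```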

  13. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    PubMed

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and SDBMSs can save time and cost in producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs when using the map products of photogrammetric workstations. By means of these integrated systems, it is also possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated and different levels of integration are described. Finally, the design, implementation, and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  14. Virtual Manufacturing Techniques Designed and Applied to Manufacturing Activities in the Manufacturing Integration and Technology Branch

    NASA Technical Reports Server (NTRS)

    Shearrow, Charles A.

    1999-01-01

One of the identified goals of EM3 is to implement virtual manufacturing by the end of the year 2000. To realize this goal of a true virtual manufacturing enterprise, the initial development of a machinability database and the supporting infrastructure must be completed. This will consist of containing the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day-to-day operations, a parallel virtual manufacturing machinability database, a virtual manufacturing database, a virtual manufacturing paradigm, an implementation/integration procedure, and testable verification models must be constructed. Common and virtual machinability databases will include the four distinct areas of machine tools, available tooling, common machine tool loads, and a materials database. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include material thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, a virtual manufacturing paradigm for NASA-JSC, an implementation timeline, a VNC model of one bridge mill, and troubleshooting of existing software and hardware problems with EM-NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases, bringing JSC's EM3 into a position of becoming a clearinghouse for NASA's digital manufacturing needs and creating a true virtual manufacturing enterprise.

  15. Implementing OpenMRS for patient monitoring in an HIV/AIDS care and treatment program in rural Mozambique.

    PubMed

    Manders, Eric-Jan; José, Eurico; Solis, Manuel; Burlison, Janeen; Nhampossa, José Leopoldo; Moon, Troy

    2010-01-01

    We have adopted the Open Medical Record System (OpenMRS) framework to implement an electronic patient monitoring system for an HIV care and treatment program in Mozambique. The program provides technical assistance to the Ministry of Health supporting the scale up of integrated HIV care and support services in health facilities in rural resource limited settings. The implementation is in use for adult and pediatric programs, with ongoing roll-out to cover all supported sites. We describe early experiences in adapting the system to the program needs, addressing infrastructure challenges, creating a regional support team, training data entry staff, migrating a legacy database, deployment, and current use. We find that OpenMRS offers excellent prospects for in-country development of health information systems, even in severely resource limited settings. However, it also requires considerable organizational infrastructure investment and technical capacity building to ensure continued local support.

  16. State of the Science in Heart Failure Symptom Perception Research: An Integrative Review.

    PubMed

    Lee, Solim; Riegel, Barbara

    Heart failure (HF) is a common condition requiring self-care to maintain physical stability, prevent hospitalization, and improve quality of life. Symptom perception, a domain of HF self-care newly added to the Situation-Specific Theory of HF Self-Care, is defined as a comprehensive process of monitoring and recognizing physical sensations and interpreting and labeling the meaning of the sensations. The purpose of this integrative review was to describe the research conducted on HF symptom perception to further understanding of this new concept. A literature search was conducted using 8 databases. The search term of HF was combined with symptom, plus symptom perception subconcepts of monitoring, somatic awareness, detection, recognition, interpretation, and appraisal. Only peer-reviewed original articles published in English with full-text availability were included. No historical limits were imposed. Study subjects were adults. Twenty-one studies met the inclusion criteria. Each study was categorized into either symptom monitoring or symptom recognition and interpretation. Although daily weighing and HF-related symptom-monitoring behaviors were insufficient in HF patients, use of a symptom diary improved HF self-care, symptom distress and functional class, and decreased mortality, hospital stay, and medical costs. Most HF patients had trouble recognizing an exacerbation of symptoms. Aging, comorbid conditions, and gradual symptom progression made it difficult to recognize and correctly interpret a symptom exacerbation. Living with others, higher education, higher uncertainty, shorter symptom duration, worse functional class, and an increased number of previous hospitalizations were positively associated with symptom recognition. Existing research fails to capture all of the elements in the theoretical definition of symptom perception.

  17. Design and development of a new facility for teaching and research in clinical anatomy.

    PubMed

    Greene, John Richard T

    2009-01-01

This article discusses factors in the design, commissioning, project management, and intellectual property protection of developments within a new clinical anatomy facility in the United Kingdom. The project was aimed at creating cost-effective facilities that would address widespread concerns over anatomy teaching, and support other activities central to the university mission, namely research and community interaction. The new facilities comprise an engaging learning environment and were designed to support a range of pedagogies appropriate to the needs of healthcare professionals at different stages of their careers. Specific innovations include integrated workstations, each comprising a dissection table with removable top sections, an overhead operating light, and a ceiling-mounted camera. The tables incorporate waterproof touch-screen monitors to display images from the camera, an endoscope, or a database of images, videos, and tutorials. The screens work independently so that instructors can run different teaching sessions simultaneously and students can progress at different speeds to suit themselves. Further, database access is provided from within an integrated anatomy and pathology museum and display units dedicated to the correlation of cross-sectional anatomy with medical imaging. A new functional neuroanatomy modeling system, called the BrainTower, has been developed to aid integration of anatomy with physiology and clinical neurology. Many aspects of the new facility are reproduced within a Mobile Teaching Unit, which can be driven to hospitals, colleges, and schools to provide appropriate work-based education and community interaction. (c) 2009 American Association of Anatomists

  18. ECLSS Integration Analysis: Advanced ECLSS Subsystem and Instrumentation Technology Study for the Space Exploration Initiative

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In his July 1989 space policy speech, President Bush proposed a long range continuing commitment to space exploration and development. Included in his goals were the establishment of permanent lunar and Mars habitats and the development of extended duration space transportation. In both cases, a major issue is the availability of qualified sensor technologies for use in real-time monitoring and control of integrated physical/chemical/biological (p/c/b) Environmental Control and Life Support Systems (ECLSS). The purpose of this study is to determine the most promising instrumentation technologies for future ECLSS applications. The study approach is as follows: 1. Precursor ECLSS Subsystem Technology Trade Study - A database of existing and advanced Atmosphere Revitalization (AR) and Water Recovery and Management (WRM) ECLSS subsystem technologies was created. A trade study was performed to recommend AR and WRM subsystem technologies for future lunar and Mars mission scenarios. The purpose of this trade study was to begin defining future ECLSS instrumentation requirements as a precursor to determining the instrumentation technologies that will be applicable to future ECLS systems. 2. Instrumentation Survey - An instrumentation database of Chemical, Microbial, Conductivity, Humidity, Flowrate, Pressure, and Temperature sensors was created. Each page of the sensor database report contains information for one type of sensor, including a description of the operating principles, specifications, and the reference(s) from which the information was obtained. This section includes a cursory look at the history of instrumentation on U.S. spacecraft. 3. Results and Recommendations - Instrumentation technologies were recommended for further research and optimization based on a consideration of both of the above sections. A sensor or monitor technology was recommended based on its applicability to future ECLS systems, as defined by the ECLSS Trade Study (1), and on whether its characteristics were considered favorable relative to similar instrumentation technologies (competitors), as determined from the Instrumentation Survey (2). The instrumentation technologies recommended by this study show considerable potential for development and promise significant returns if research efforts are invested.

  19. Database usage and performance for the Fermilab Run II experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonham, D.; Box, D.; Gallas, E.

    2004-12-01

The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has represented major challenges to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described.

  20. Evaluation of NoSQL databases for DIRAC monitoring and beyond

    NASA Astrophysics Data System (ADS)

    Mathe, Z.; Casajus Ramo, A.; Stagni, F.; Tomassetti, L.

    2015-12-01

Nowadays, many database systems are available, but they may not be optimized for storing time series data. Monitoring DIRAC jobs would be better done using a database optimized for storing time series data. So far this was done using a MySQL database, which is not well suited for such an application. Therefore, alternatives have been investigated. Choosing an appropriate database for storing huge amounts of time series data is not trivial, as one must take into account different aspects such as manageability, scalability and extensibility. We compared the performance of the Elasticsearch, OpenTSDB (based on HBase) and InfluxDB NoSQL databases, using the same set of machines and the same data. We also evaluated the effort required for maintaining them. Using the LHCb Workload Management System (WMS), based on DIRAC, as a use case, we set up a new monitoring system in parallel with the current MySQL system and stored the same data into the databases under test. We evaluated the Grafana (for OpenTSDB) and Kibana (for Elasticsearch) metrics and graph editors for creating dashboards, in order to have a clear picture of the usability of each candidate. In this paper we present the results of this study and the performance of the selected technology. We also give an outlook of other potential applications of NoSQL databases within the DIRAC project.
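    For flavor, the sketch below writes one time-series point over InfluxDB's 1.x HTTP line protocol, the kind of ingestion path compared above. The host, database name, measurement, and tag values are assumptions for illustration, not the DIRAC setup.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Write one point to a local InfluxDB 1.x instance via the line protocol. */
public class InfluxWrite {
    public static void main(String[] args) throws Exception {
        // Assumed local instance and database; create with: CREATE DATABASE jobs_test
        URL url = new URL("http://localhost:8086/write?db=jobs_test&precision=s");
        // Line protocol: measurement,tag=value field=value timestamp
        String point = "job_status,site=LCG.CERN.ch running=42i "
                + (System.currentTimeMillis() / 1000);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(point.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode()); // 204 on success
    }
}
```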

  1. Data Auditor: Analyzing Data Quality Using Pattern Tableaux

    NASA Astrophysics Data System (ADS)

    Srivastava, Divesh

    Monitoring databases maintain configuration and measurement tables about computer systems, such as networks and computing clusters, and serve important business functions, such as troubleshooting customer problems, analyzing equipment failures, planning system upgrades, etc. These databases are prone to many data quality issues: configuration tables may be incorrect due to data entry errors, while measurement tables may be affected by incorrect, missing, duplicate and delayed polls. We describe Data Auditor, a tool for analyzing data quality and exploring data semantics of monitoring databases. Given a user-supplied constraint, such as a boolean predicate expected to be satisfied by every tuple, a functional dependency, or an inclusion dependency, Data Auditor computes "pattern tableaux", which are concise summaries of subsets of the data that satisfy or fail the constraint. We discuss the architecture of Data Auditor, including the supported types of constraints and the tableau generation mechanism. We also show the utility of our approach on an operational network monitoring database.
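    A simplified version of the tableau idea: group tuples by a pattern attribute, measure what fraction of each group satisfies the constraint, and keep only patterns meeting a confidence threshold. The relation, predicate, and threshold below are illustrative assumptions, not Data Auditor's actual interface.

```java
import java.util.*;

/** Toy pattern tableau: which 'router' patterns satisfy "latency < 100" often enough? */
public class PatternTableau {
    public static void main(String[] args) {
        // Tuples: {router, latencyMs}; made-up monitoring data.
        Object[][] polls = {{"r1", 40}, {"r1", 55}, {"r1", 300}, {"r2", 500}, {"r2", 480}};
        double minConfidence = 0.6;

        Map<String, int[]> stats = new LinkedHashMap<>(); // pattern -> {satisfied, total}
        for (Object[] t : polls) {
            int[] s = stats.computeIfAbsent((String) t[0], k -> new int[2]);
            if ((int) t[1] < 100) s[0]++;   // the user-supplied predicate
            s[1]++;
        }
        for (Map.Entry<String, int[]> e : stats.entrySet()) {
            double conf = (double) e.getValue()[0] / e.getValue()[1];
            System.out.printf("pattern router=%s confidence=%.2f %s%n",
                    e.getKey(), conf, conf >= minConfidence ? "KEEP" : "drop");
        }
    }
}
```

    The kept patterns form a concise summary of where the data satisfies the constraint, while the dropped ones point an analyst at the subsets that fail it.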

  2. Integration of Information Retrieval and Database Management Systems.

    ERIC Educational Resources Information Center

    Deogun, Jitender S.; Raghavan, Vijay V.

    1988-01-01

    Discusses the motivation for integrating information retrieval and database management systems, and proposes a probabilistic retrieval model in which records in a file may be composed of attributes (formatted data items) and descriptors (content indicators). The details and resolutions of difficulties involved in integrating such systems are…

  3. A new web-based system to improve the monitoring of snow avalanche hazard in France

    NASA Astrophysics Data System (ADS)

    Bourova, Ekaterina; Maldonado, Eric; Leroy, Jean-Baptiste; Alouani, Rachid; Eckert, Nicolas; Bonnefoy-Demongeot, Mylene; Deschatres, Michael

    2016-05-01

    Snow avalanche data in the French Alps and Pyrenees have been recorded for more than 100 years in several databases. The increasing amount of observed data required a more integrative and automated service. Here we report the comprehensive web-based Snow Avalanche Information System newly developed to this end for three important data sets: an avalanche chronicle (Enquête Permanente sur les Avalanches, EPA), an avalanche map (Carte de Localisation des Phénomènes d'Avalanche, CLPA) and a compilation of hazard and vulnerability data recorded on selected paths endangering human settlements (Sites Habités Sensibles aux Avalanches, SSA). These data sets are now integrated into a common database, enabling full interoperability between all different types of snow avalanche records: digitized geographic data, avalanche descriptive parameters, eyewitness reports, photographs, hazard and risk levels, etc. The new information system is implemented through modular components using Java-based web technologies with Spring and Hibernate frameworks. It automates the manual data entry and improves the process of information collection and sharing, enhancing user experience and data quality, and offering new outlooks to explore and exploit the huge amount of snow avalanche data available for fundamental research and more applied risk assessment.
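    Since the system is built on Spring and Hibernate, a record type in such a design would typically be a mapped entity. The sketch below is a hypothetical JPA entity for one avalanche-chronicle (EPA) record, illustrating the mapping style only; the class and column names are invented, not the actual schema.

```java
import javax.persistence.*;
import java.time.LocalDate;

/** Hypothetical Hibernate/JPA entity for one avalanche-chronicle (EPA) record. */
@Entity
@Table(name = "epa_event")
public class AvalancheEvent {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(name = "path_code", nullable = false)
    private String pathCode;            // avalanche path identifier

    @Column(name = "event_date")
    private LocalDate eventDate;

    @Column(name = "deposit_volume_m3")
    private Double depositVolumeM3;

    @Column(length = 2000)
    private String eyewitnessReport;

    protected AvalancheEvent() { }      // no-arg constructor required by JPA

    public AvalancheEvent(String pathCode, LocalDate eventDate) {
        this.pathCode = pathCode;
        this.eventDate = eventDate;
    }
}
```

    Mapping all record types into one schema like this is what gives the interoperability the abstract describes: geographic data, descriptive parameters, and eyewitness reports become jointly queryable rows rather than separate files.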

  4. Development of system decision support tools for behavioral trends monitoring of machinery maintenance in a competitive environment

    NASA Astrophysics Data System (ADS)

    Adeyeri, Michael Kanisuru; Mpofu, Khumbulani

    2017-06-01

The article centres on software development for a manufacturing company that produces polyethylene bags using mostly conventional machines in a competitive environment. The software is meant to assist in gaining market share and in taking maintenance and production decisions, through the dynamism and flexibility embedded in the package, as customer demand varies. The production and machine condition monitoring software (PMCMS) is programmed in C# and designed to support hardware integration, real-time machine condition monitoring based on a condition-based maintenance approach, maintenance decision suggestions, and suitable production strategies as product demand keeps changing in a highly competitive environment. PMCMS works with an embedded device that feeds it data from the various machines being monitored at the workstation; the data are read at the base station via a wireless transceiver and stored in a database. A case study was used to implement the developed system, and the results show that it can monitor machine health effectively by displaying machine health status, suggesting repairs for probable faults, and deciding strategy for both production and maintenance, and can thus enhance maintenance performance.

  5. The NCBI BioSystems database.

    PubMed

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.
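    Entrez databases are generally scriptable through NCBI's E-utilities. Assuming the BioSystems database is addressable there under db=biosystems (an assumption; the database name and term syntax should be verified against current NCBI documentation), a lookup sketch would be:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

/** Sketch: search the (assumed) Entrez 'biosystems' database via E-utilities. */
public class BioSystemsSearch {
    public static void main(String[] args) throws Exception {
        String term = URLEncoder.encode("glycolysis", StandardCharsets.UTF_8.name());
        URL url = new URL("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
                + "?db=biosystems&term=" + term);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null)
                System.out.println(line);   // XML list of matching BioSystem IDs
        }
    }
}
```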

  6. Environmental geochemistry at the global scale

    USGS Publications Warehouse

    Plant, J.; Smith, D.; Smith, B.; Williams, L.

    2000-01-01

    Land degradation and pollution caused by population pressure and economic development pose a threat to the sustainability of the Earth's surface, especially in tropical regions where a long history of chemical weathering has made the surface environment particularly fragile. Systematic baseline geochemical data provide a means of monitoring the state of the environment and identifying problem areas. Regional surveys have already been carried out in some countries, and with increased national and international funding they can be extended to cover the rest of the land surface of the globe. Preparations have been made, under the auspices of the IUGS, for the establishment of just such an integrated global database.

  7. Evaluation of Online Information Sources on Alien Species in Europe: The Need of Harmonization and Integration

    NASA Astrophysics Data System (ADS)

    Gatto, Francesca; Katsanevakis, Stelios; Vandekerkhove, Jochen; Zenetos, Argyro; Cardoso, Ana Cristina

    2013-06-01

    Europe is severely affected by alien invasions, which impact biodiversity, ecosystem services, economy, and human health. A large number of national, regional, and global online databases provide information on the distribution, pathways of introduction, and impacts of alien species. The sufficiency and efficiency of the current online information systems to assist the European policy on alien species was investigated by a comparative analysis of occurrence data across 43 online databases. Large differences among databases were found which are partially explained by variations in their taxonomical, environmental, and geographical scopes but also by the variable efforts for continuous updates and by inconsistencies on the definition of "alien" or "invasive" species. No single database covered all European environments, countries, and taxonomic groups. In many European countries national databases do not exist, which greatly affects the quality of reported information. To be operational and useful to scientists, managers, and policy makers, online information systems need to be regularly updated through continuous monitoring on a country or regional level. We propose the creation of a network of online interoperable web services through which information in distributed resources can be accessed, aggregated and then used for reporting and further analysis at different geographical and political scales, as an efficient approach to increase the accessibility of information. Harmonization, standardization, conformity on international standards for nomenclature, and agreement on common definitions of alien and invasive species are among the necessary prerequisites.

  8. Rapid HIS, RIS, PACS Integration Using Graphical CASE Tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Breant, Claudine M.; Stepczyk, Frank M.; Kho, Hwa T.; Valentino, Daniel J.; Tashima, Gregory H.; Materna, Anthony T.

    1994-05-01

    We describe the clinical requirements of the integrated federation of databases and present our client-mediator-server design. The main body of the paper describes five important aspects of integrating information systems: (1) global schema design, (2) establishing sessions with remote database servers, (3) development of schema translators, (4) integration of global system triggers, and (5) development of job workflow scripts.

  9. A Model of Object-Identities and Values

    DTIC Science & Technology

    1990-02-23

The formalism includes the expression of integrity constraints in its constructs, which provides a natural integration of the logical database model and the object-oriented database model; the two portions are integrated by a simple commutative diagram of modeling functions.

  10. Prescriber Compliance With Liver Monitoring Guidelines for Pazopanib in the Postapproval Setting: Results From a Distributed Research Network.

    PubMed

    Shantakumar, Sumitra; Nordstrom, Beth L; Hall, Susan A; Djousse, Luc; van Herk-Sukel, Myrthe P P; Fraeman, Kathy H; Gagnon, David R; Chagin, Karen; Nelson, Jeanenne J

    2017-04-20

Pazopanib received US Food and Drug Administration approval in 2009 for advanced renal cell carcinoma. During clinical development, liver chemistry abnormalities and adverse hepatic events were observed, leading to a boxed warning for hepatotoxicity and detailed label prescriber guidelines for liver monitoring. As part of postapproval regulatory commitments, a cohort study was conducted to assess prescriber compliance with liver monitoring guidelines. Over a 4-year period, a distributed network approach was used across 3 databases: the US Veterans Affairs Healthcare System, a US outpatient oncology community practice database, and the Dutch PHARMO Database Network. Measures of prescriber compliance were designed using the original pazopanib label guidelines for liver monitoring. Results from the VA (n = 288) and oncology databases (n = 283) indicate that prescriber liver chemistry monitoring was less than 100%: 73% to 74% compliance with baseline testing and 37% to 39% compliance with testing every 4 weeks. Compliance was highest near drug initiation and decreased over time. Among patients who should have had weekly testing, compliance was 56% in both databases. The more serious elevations examined, including combinations of liver enzyme elevations meeting the laboratory definition of Hy's law, were infrequent but always led to appropriate discontinuation of pazopanib. Only 4 patients were identified for analysis in the Dutch database; none had recorded baseline testing. In this population-based study, prescriber compliance was reasonable near pazopanib initiation but low during subsequent weeks of treatment. This study provides information from real-world community practice settings and offers feedback to regulators on the effectiveness of label monitoring guidelines.

  11. Electroconvulsive Therapy Practice in the Province of Quebec: Linked Health Administrative Data Study from 1996 to 2013.

    PubMed

    Lemasson, Morgane; Haesebaert, Julie; Rochette, Louis; Pelletier, Eric; Lesage, Alain; Patry, Simon

    2017-01-01

    As part of a quality improvement process, we propose a model of routinely monitoring electroconvulsive therapy (ECT) in Canadian provinces using linked health administrative databases to generate provincial periodic reports, influence policy, and standardise ECT practices. ECT practice in Quebec was studied from 1996 to 2013, using longitudinal data from the Quebec Integrated Chronic Disease Surveillance System of the Institut National de Santé Publique du Québec, which links 5 health administrative databases. The population included all persons, aged 18 y and over, eligible for the health insurance registry, who received an ECT treatment at least once during the year. Among recorded cases, 75% were identified by physician claims and hospitalisation files, 19% exclusively by physician claims, and 6% by hospitalisation files. From 1996 to 2013, 8,149 persons in Quebec received ECT with an annual prevalence rate of 13 per 100,000. A decline was observed, which was more pronounced in women and in older persons. On average, each patient received 9.7 treatments of ECT annually. The proportion of acute ECT decreased whereas maintenance treatment proportions increased. A wide variation in the use of ECT was observed among regions and psychiatrists. This study demonstrates the profitable use of administrative data to monitor ECT use in Quebec, and provides a reliable method that could be replicated in other Canadian provinces. Although Quebec has one of the lowest utilisation rates reported in industrialized countries, regional disparities highlighted the need for a deeper examination of the quality and monitoring of ECT care and services.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, H.; Chen, K.; Jusko, M.

The Packaging Certification Program (PCP) of the U.S. Department of Energy (DOE) Environmental Management (EM), Office of Packaging and Transportation (EM-14), has developed a radio frequency identification (RFID) tracking and monitoring system for the management of nuclear materials during storage and transportation. The system, developed by the PCP team at Argonne National Laboratory, consists of hardware (Mk-series sensor tags, fixed and handheld readers, form factor for multiple drum types, seal integrity sensors, and enhanced battery management), software (application programming interface, ARG-US software for local and remote/web applications, secure server and database management), and cellular/satellite communication interfaces for vehicle tracking and item monitoring during transport. The ability of the above system to provide accurate, real-time tracking and monitoring of the status of multiple, certified containers of nuclear materials has been successfully demonstrated in a week-long, 1,700-mile DEMO performed in April 2008. While the feedback from the approximately fifty (50) stakeholders who participated in and/or observed the DEMO progression was very positive and encouraging, two major areas of further improvement - system integration and web application enhancement - were identified in the post-DEMO evaluation. The principal purpose of the MiniDemo described in this report was to verify these two specific improvements. The MiniDemo was conducted on August 28, 2009. In terms of system integration, a hybrid communication interface - combining the RFID item-monitoring features and a commercial vehicle tracking system by Qualcomm - was developed and implemented. In the MiniDemo, the new integrated system worked well in reporting tag status and vehicle location accurately and promptly. There was no incompatibility of components. The robust commercial communication gear, as expected, helped improve system reliability. The MiniDemo confirmed that system integration is technically feasible and reliable with the existing RFID and Qualcomm satellite equipment. In terms of web application, improvements in mapping, tracking, data presentation, and post-incident spatial query reporting were implemented in ARG-US, the application software that manages the dataflow among the RFID tags, readers, and servers. These features were tested in the MiniDemo and found to be satisfactory. The resulting web application is both informative and user-friendly. A joint developmental project is being planned between the PCP and the DOE TRANSCOM that uses the Qualcomm gear in vehicles for tracking and communication of radioactive material shipments across the country. Adding an RFID interface to TRANSCOM is a significant enhancement to the DOE infrastructure for tracking and monitoring shipments of radioactive materials.

  13. The ChArMEx database

    NASA Astrophysics Data System (ADS)

    Ferré, Hélène; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Cloché, Sophie; Descloitres, Jacques; Fleury, Laurence; Focsa, Loredana; Henriot, Nicolas; Mière, Arnaud; Ramage, Karim; Vermeulen, Anne; Boulanger, Damien

    2015-04-01

The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. Therefore ChArMEx scientists produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, distribution system and services, such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the ICARE, IPSL and OMP data centers and has been set up in the framework of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. ChArMEx data, either produced or used by the project, are documented and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. The website offers the usual but user-friendly functionalities: data catalog, user registration procedure, search tool to select and access data... The metadata (data description) are standardized, and comply with international standards (ISO 19115-19139; INSPIRE European Directive; Global Change Master Directory Thesaurus). A Digital Object Identifier (DOI) assignment procedure allows datasets to be registered automatically, in order to make them easier to access, cite, reuse and verify. At present, the ChArMEx database contains about 120 datasets, including more than 80 in situ datasets (2012, 2013 and 2014 summer campaigns, background monitoring station of Ersa...), 25 model output sets (dust model intercomparison, MEDCORDEX scenarios...), a high resolution emission inventory over the Mediterranean... Many in situ datasets have been inserted in a relational database, in order to enable more accurate selection and download of different datasets in a shared format. Many dedicated satellite products (SEVIRI, TRMM, PARASOL...) are processed and will soon be accessible through the database website. In order to meet the operational needs of the airborne and ground based observational teams during the ChArMEx campaigns, a day-to-day chart display website has been developed and operated: http://choc.sedoo.org. It offers a convenient way to browse weather conditions and chemical composition during the campaign periods. Every scientist is invited to visit the ChArMEx websites, to register and to request data. Feel free to contact charmex-database@sedoo.fr for any questions.

  14. The ChArMEx database

    NASA Astrophysics Data System (ADS)

    Ferré, Hélène; Descloitres, Jacques; Fleury, Laurence; Boichard, Jean-Luc; Brissebrat, Guillaume; Focsa, Loredana; Henriot, Nicolas; Mastrorillo, Laurence; Mière, Arnaud; Vermeulen, Anne

    2013-04-01

The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. Therefore ChArMEx scientists produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, distribution system and services such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the OMP and ICARE data centres and falls within the scope of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. All the data produced by or of interest for the ChArMEx community will be documented in the data catalogue and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. The database website offers different tools: a registration procedure which enables any scientist to accept the data policy and apply for a user database account; forms to document observations or products that will be provided to the database in compliance with international metadata standards (ISO 19115-19139; INSPIRE; Global Change Master Directory Thesaurus); a search tool to browse the catalogue using thematic, geographic and/or temporal criteria; sorted lists of the datasets by thematic keywords, measured parameters, instruments or platform type; a shopping-cart web interface to order in situ data files (at present, datasets from the background monitoring station of Ersa, Cape Corsica and from the 2012 ChArMEx pre-campaign are available); and user-friendly access to satellite products (SEVIRI, TRMM, PARASOL...) stored in the ICARE data archive using the OPeNDAP protocol. The website will soon offer new facilities. In particular, many in situ datasets will be homogenized and inserted in a relational database, in order to enable more accurate data selection and download of different datasets in a shared format. In order to meet the operational needs of the airborne and ground based observational teams during the ChArMEx 2012 pre-campaign and the 2013 experiment, a day-to-day quick look and report display website has been developed too: http://choc.sedoo.org. It offers a convenient way to browse weather conditions and chemical composition during the campaign periods.

  15. Development and implementation of a web-based system to study children with malnutrition.

    PubMed

    Syed-Mohamad, Sharifah-Mastura

    2009-01-01

    To develop and implement a collective web-based system to monitor child growth in order to study children with malnutrition. The system was developed using prototyping system development methodology. The implementation was carried out using open-source technologies that include Apache Web Server, PHP scripting, and MySQL database management system. There were four datasets collected by the system: demographic data, measurement data, parent data, and food program data. The system was designed to be used by two groups of users, the clinics and the researchers. The Growth Monitor System was successfully developed and used for the study, "Geoinformation System (GIS) and Remote Sensing in Mapping of Children with Malnutrition." Data collection was implemented in public clinics from two districts in the state of Kelantan, Malaysia. The development of an integrated web-based system, Growth Monitor, for the study of children with malnutrition has been achieved. This system can be expanded to new partners who are involved in the study of children with malnutrition in other parts of Malaysia as well as other countries.

  16. The non-contact heart rate measurement system for monitoring HRV.

    PubMed

    Huang, Ji-Jer; Yu, Sheng-I; Syu, Hao-Yi; See, Aaron Raymond

    2013-01-01

A noncontact ECG monitoring and analysis system was developed using a capacitively coupled device integrated into a home sofa. Electrodes were placed on the backrest of the sofa, separated from the body by only the chair covering and the user's clothing. The study also compared measurements using different fabric materials, and a pure cotton material was chosen to cover the chair's backrest to improve the signal-to-noise ratio. The system is implemented on a home sofa and is able to measure noncontact ECG through thin cotton clothing and perform heart rate analysis to calculate heart rate variability (HRV) parameters. It was also tested under different conditions; reading and sleeping both yielded a stable ECG. Subsequently, the calculated HRV results were found to be identical to those of a commercially available HRV analyzer. However, HRV parameters are easily affected by motion artifacts generated during drinking or eating, with the latter producing a more severe disturbance. Lastly, the measured parameters are saved in a cloud database, providing users with long-term monitoring and recording of physiological information.
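    Two of the standard time-domain HRV parameters such a system reports, SDNN and RMSSD, are straightforward to compute from successive RR intervals. A small sketch with made-up interval data:

```java
/** Time-domain HRV from RR intervals (ms): SDNN and RMSSD. */
public class HrvMetrics {
    public static void main(String[] args) {
        double[] rr = {812, 845, 790, 830, 818, 802};   // made-up RR intervals (ms)

        double mean = 0;
        for (double x : rr) mean += x;
        mean /= rr.length;

        double var = 0;
        for (double x : rr) var += (x - mean) * (x - mean);
        double sdnn = Math.sqrt(var / (rr.length - 1));   // std. dev. of all intervals

        double sumSqDiff = 0;
        for (int i = 1; i < rr.length; i++) {
            double d = rr[i] - rr[i - 1];
            sumSqDiff += d * d;
        }
        double rmssd = Math.sqrt(sumSqDiff / (rr.length - 1)); // successive differences

        System.out.printf("SDNN = %.1f ms, RMSSD = %.1f ms%n", sdnn, rmssd);
    }
}
```

    Because both metrics depend on clean RR detection, the motion artifacts noted above (drinking, eating) corrupt them directly, which is why artifact rejection matters in such systems.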

  17. Arctic BioMap: Building Participatory Technologies for Community-Specific Environmental Monitoring and Decision Making in the North

    NASA Astrophysics Data System (ADS)

    Murray, M. S.; Panikkar, B.; Liang, S.; Kutz, S.

    2016-12-01

The Arctic continues to undergo unprecedented and accelerated system-wide environmental change. For people who live in the north this presents challenges to resource management, subsistence, health and well-being, and yet there is very little community-specific data on wildlife (including wildlife health), local environmental conditions and emerging hazards in Northern Canada. A novel approach that integrates community expertise with developing technologies can simplify data collection and improve understanding of current and future conditions. It can also improve our ability to manage and adapt to the rapidly transforming Arctic. Arctic BioMap is a data platform for real-time monitoring and a geospatial informational database of wildlife and environmental information useful for assessment, research, management, and education. It enables monitoring of wildlife and environmental variables, including hazards, to inform decision-making at multiple scales. Using participatory technologies, Arctic BioMap incorporates indigenous research needs, and the ensuing data can be used to inform policy making. Arctic BioMap provides a forum for continuous exchange and communication among community members, scientists, resource managers, and other stakeholders.

  18. Participatory Carbon Monitoring System in Community Forests of Nepal

    NASA Astrophysics Data System (ADS)

    Karki, S.

    2016-12-01

With the adoption of the climate change agreement, Reducing Emissions from Deforestation and forest Degradation (REDD) has advanced as a performance-based policy instrument for curtailing deforestation and forest degradation. Developing countries are working to become REDD-ready. However, the readiness assessment process entails criteria such as meeting REDD+ safeguards and having verified monitoring and reporting of emission reductions (MRV) in place. For countries putting MRV in place, technical know-how on measuring forest carbon and human resource capacity are limited. The International Centre for Integrated Mountain Development (ICIMOD), together with its national partners, implemented a REDD+ pilot project from 2009-2013 in 105 community forests (CF) of three watersheds, namely Charnawati, Kayarkhola and Ludikhola, in Nepal. This paper discusses a prototype of the participatory carbon monitoring and measurement approach tested in these 105 CFs that is systematic, transparent, and cost effective. Additionally, it demonstrates how the carbon stock data from 2010-2013, assembled under the ICIMOD Regional Database Initiative, are made freely available. Such an application can be scaled up or used in decision making for performance-based payment schemes.

  19. [Patient readmission for surgical site infection: integrative review].

    PubMed

    Machado, Lilian; Turrini, Ruth N T; Siqueira, Ana L

    2013-02-01

Surgical site infections (SSI) represent an inherent risk after surgical procedures, associated both with the surgical procedure itself and with the patient's clinical condition. The objective was to analyze, in an integrative review, studies related to patient readmission due to SSI. The review was carried out using the LILACS, CINAHL, MEDLINE and COCHRANE databases, and articles published from 1966 to 2010 were selected. Thirteen studies were analyzed, classified as cross-sectional (7), cohort (4) and longitudinal (2). Few studies analyzed only the readmissions related to SSI. The time window used to define readmission ranged from 28 to 90 days after surgery, and studies related to orthopedic procedures were the most frequent. SSI readmission rates were lower than 5%. The main aetiological agents isolated from SSI were Staphylococcus aureus and coagulase-negative staphylococci. Monitoring readmissions due to SSI could help quantify the occurrence of post-discharge SSI, since about half of post-discharge SSI were diagnosed at the moment of readmission.

  20. Combining Remote Sensing imagery of both fine and coarse spatial resolution to Estimate Crop Evapotranspiration and quantifying its Influence on Crop Growth Monitoring.

    NASA Astrophysics Data System (ADS)

    Sepulcre-Cantó, Guadalupe; Gellens-Meulenberghs, Françoise; Arboleda, Alirio; Duveiller, Gregory; Piccard, Isabelle; de Wit, Allard; Tychon, Bernard; Bakary, Djaby; Defourny, Pierre

    2010-05-01

This study has been carried out in the framework of the GLOBAM (Global Agricultural Monitoring system by integration of earth observation and modeling techniques) project, whose objective is to fill the methodological gap between the state of the art of local crop monitoring and the operational requirements of global monitoring system programs. To achieve this goal, the research aims to develop an integrated approach using remote sensing and crop growth modeling. Evapotranspiration (ET) is a valuable parameter in the crop monitoring context since it provides information on plant water stress status, which strongly influences crop development and, by extension, crop yield. To assess crop evapotranspiration over the GLOBAM study areas (300x300 km sites in Northern Europe and Central Ethiopia), a Soil-Vegetation-Atmosphere Transfer (SVAT) model forced with remote sensing and numerical weather prediction data has been used. This model runs at a pre-operational level in the framework of the EUMETSAT LSA-SAF (Land Surface Analysis Satellite Application Facility) using SEVIRI and ECMWF data, as well as the ECOCLIMAP database to characterize the vegetation. The model generates ET images at the Meteosat Second Generation (MSG) spatial resolution (3 km at the subsatellite point), with a temporal resolution of 30 min, and monitors the entire MSG disk, which covers Europe, Africa and part of South America. The SVAT model was run for 2007 using two approaches. The first approach is the standard pre-operational mode. The second incorporates remote sensing information at various spatial resolutions, ranging from LANDSAT (30 m) to SEVIRI (3-5 km), including AWiFS (56 m) and MODIS (250 m). The fine-spatial-resolution data consist of a crop type classification that enables the identification of areas where pure, crop-specific MODIS time series can be compiled and used to derive Leaf Area Index estimates for the most important crops (wheat and maize). The use of this information allowed the type of vegetation and its state of development to be characterized more accurately than with the ECOCLIMAP database. Finally, the CASA method was applied, using the evapotranspiration images together with FAPAR (Fraction of Absorbed Photosynthetically Active Radiation) images from LSA-SAF, to obtain Dry Matter Productivity (DMP) and crop yield. The potential of using evapotranspiration obtained from remote sensing in crop growth modeling is studied and discussed. Results of comparing the estimated evapotranspiration with ground truth data are shown, as well as the influence of using high-resolution information to characterize the vegetation in the evapotranspiration estimation. The values of DMP and yield obtained with the CASA method are compared with those obtained using crop growth modeling and field data, showing the potential of this simplified remote sensing method for crop monitoring and yield forecasting. This methodology could be applied operationally to the entire MSG disk, allowing continuous crop growth monitoring.
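    The production estimate at the end of that chain follows the light-use-efficiency logic of the CASA approach, in which dry matter production scales with absorbed radiation and is reduced by a water-stress term derived from evapotranspiration. The sketch below shows that arithmetic under assumed coefficient values; it is a generic illustration, not the study's parameterization.

```java
/** Light-use-efficiency sketch: DMP from PAR, FAPAR and an ET-based stress term. */
public class DmpEstimate {
    public static void main(String[] args) {
        double par = 9.5;          // photosynthetically active radiation, MJ/m2/day
        double fapar = 0.62;       // fraction of absorbed PAR (e.g., from satellite)
        double epsilonMax = 2.5;   // assumed maximum light-use efficiency, g/MJ
        double etActual = 3.1;     // actual evapotranspiration, mm/day (SVAT output)
        double etPotential = 4.8;  // potential evapotranspiration, mm/day

        // Water stress scalar in [0,1]: production drops as ETa falls below ETp.
        double waterStress = Math.min(1.0, etActual / etPotential);

        double dmp = par * fapar * epsilonMax * waterStress;   // g dry matter/m2/day
        System.out.printf("DMP = %.1f g/m2/day (stress = %.2f)%n", dmp, waterStress);
    }
}
```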

  1. The NCBI BioSystems database

    PubMed Central

    Geer, Lewis Y.; Marchler-Bauer, Aron; Geer, Renata C.; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H.

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI’s Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets. PMID:19854944

  2. The integrated web service and genome database for agricultural plants with biotechnology information.

    PubMed

    Kim, Changkug; Park, Dongsuk; Seol, Youngjoo; Hahn, Jangho

    2011-01-01

    The National Agricultural Biotechnology Information Center (NABIC) constructed an agricultural biology-based infrastructure and developed a Web-based relational database for agricultural plants with biotechnology information. The NABIC has concentrated on functional genomics of major agricultural plants, building an integrated biotechnology database for agro-biotech information that focuses on genomics of major agricultural resources. This genome database provides annotated genome information from 1,039,823 records mapped to rice, Arabidopsis, and Chinese cabbage.

  3. toxoMine: an integrated omics data warehouse for Toxoplasma gondii systems biology research

    PubMed Central

    Rhee, David B.; Croken, Matthew McKnight; Shieh, Kevin R.; Sullivan, Julie; Micklem, Gos; Kim, Kami; Golden, Aaron

    2015-01-01

    Toxoplasma gondii (T. gondii) is an obligate intracellular parasite that must monitor for changes in the host environment and respond accordingly; however, it is still not fully known which genetic or epigenetic factors are involved in regulating virulence traits of T. gondii. There are on-going efforts to elucidate the mechanisms regulating the stage transition process via the application of high-throughput epigenomics, genomics and proteomics techniques. Given the range of experimental conditions and the typical yield from such high-throughput techniques, a new challenge arises: how to effectively collect, organize and disseminate the generated data for subsequent data analysis. Here, we describe toxoMine, which provides a powerful interface to support sophisticated integrative exploration of high-throughput experimental data and metadata, providing researchers with a more tractable means toward understanding how genetic and/or epigenetic factors play a coordinated role in determining pathogenicity of T. gondii. As a data warehouse, toxoMine allows integration of high-throughput data sets with public T. gondii data. toxoMine is also able to execute complex queries involving multiple data sets with straightforward user interaction. Furthermore, toxoMine allows users to define their own parameters during the search process that gives users near-limitless search and query capabilities. The interoperability feature also allows users to query and examine data available in other InterMine systems, which would effectively augment the search scope beyond what is available to toxoMine. toxoMine complements the major community database ToxoDB by providing a data warehouse that enables more extensive integrative studies for T. gondii. Given all these factors, we believe it will become an indispensable resource to the greater infectious disease research community. Database URL: http://toxomine.org PMID:26130662

  4. CyanOmics: an integrated database of omics for the model cyanobacterium Synechococcus sp. PCC 7002.

    PubMed

    Yang, Yaohua; Feng, Jie; Li, Tao; Ge, Feng; Zhao, Jindong

    2015-01-01

    Cyanobacteria are an important group of organisms that carry out oxygenic photosynthesis and play vital roles in both the carbon and nitrogen cycles of the Earth. The annotated genome of Synechococcus sp. PCC 7002, an ideal model cyanobacterium, is available. A series of transcriptomic and proteomic studies of Synechococcus sp. PCC 7002 cells grown under different conditions have been reported. However, no database of such integrated omics studies has been constructed. Here we present CyanOmics, a database based on the results of Synechococcus sp. PCC 7002 omics studies. CyanOmics comprises one genomic dataset, 29 transcriptomic datasets and one proteomic dataset, and should prove useful for systematic and comprehensive analysis of all those data. Powerful browsing and searching tools are integrated to help users directly access information of interest, with enhanced visualization of the analytical results. Furthermore, BLAST is included for sequence-based similarity searching, and Cluster 3.0 as well as the R hclust function are provided for cluster analyses, to increase the usefulness of CyanOmics. To the best of our knowledge, it is the first integrated omics analysis database for cyanobacteria. This database should further the understanding of the transcriptional patterns and proteomic profiles of Synechococcus sp. PCC 7002 and other cyanobacteria. Additionally, the entire database framework is applicable to any sequenced prokaryotic genome and could be applied to other integrated omics analysis projects. Database URL: http://lag.ihb.ac.cn/cyanomics. © The Author(s) 2015. Published by Oxford University Press.

  5. ExPASy: SIB bioinformatics resource portal.

    PubMed

    Artimo, Panu; Jonnalagedda, Manohar; Arnold, Konstantin; Baratin, Delphine; Csardi, Gabor; de Castro, Edouard; Duvaud, Séverine; Flegel, Volker; Fortier, Arnaud; Gasteiger, Elisabeth; Grosdidier, Aurélien; Hernandez, Céline; Ioannidis, Vassilios; Kuznetsov, Dmitry; Liechti, Robin; Moretti, Sébastien; Mostaguir, Khaled; Redaschi, Nicole; Rossier, Grégoire; Xenarios, Ioannis; Stockinger, Heinz

    2012-07-01

    ExPASy (http://www.expasy.org) has a worldwide reputation as one of the main bioinformatics resources for proteomics. It has now evolved into an extensible and integrative portal that provides access to many scientific resources, databases and software tools in different areas of life sciences. Scientists can now seamlessly access a wide range of resources in many different domains, such as proteomics, genomics, phylogeny/evolution, systems biology, population genetics and transcriptomics. The individual resources (databases, web-based and downloadable software tools) are hosted in a 'decentralized' way by different groups of the SIB Swiss Institute of Bioinformatics and partner institutions. Specifically, a single web portal provides a common entry point to a wide range of resources developed and operated by different SIB groups and external institutions. The portal features a search function across 'selected' resources. Additionally, the availability and usage of resources are monitored. The portal is aimed at both expert users and people who are not familiar with a specific domain in life sciences. The new web interface provides, in particular, visual guidance for newcomers to ExPASy.

  6. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  7. Populating a Control Point Database: A cooperative effort between the USGS, Grand Canyon Monitoring and Research Center and the Grand Canyon Youth Organization

    NASA Astrophysics Data System (ADS)

    Brown, K. M.; Fritzinger, C.; Wharton, E.

    2004-12-01

    The Grand Canyon Monitoring and Research Center measures the effects of Glen Canyon Dam operations on the resources along the Colorado River from Glen Canyon Dam to Lake Mead in support of the Grand Canyon Adaptive Management Program. Control points are integral for geo-referencing the myriad data collected in the Grand Canyon, including aerial photography and the topographic and bathymetric data used for classification and change-detection analysis of physical, biologic and cultural resources. The survey department has compiled a list of 870 control points installed by various organizations needing to establish a consistent reference for data collected at field sites along the 240-mile stretch of the Colorado River in the Grand Canyon. This list is the foundation for the Control Point Database, established primarily so that researchers can locate control points and independently geo-reference collected field data. The database has the potential to be a valuable mapping tool that helps researchers easily locate a control point and reduces the chance of unknowingly installing new control points in close proximity to an existing one. The database was, however, missing photographs and accurate site description information: the existing site descriptions did not accurately define the location of a point but referred to the project that used the point, or to some other incidental fact associated with it. The Grand Canyon Monitoring and Research Center (GCMRC) resolved this problem by turning the data collection effort into an educational exercise for the participants of Grand Canyon Youth, a non-profit organization providing experiential education for middle and high school aged youth. GCMRC and Grand Canyon Youth formed a partnership in which GCMRC provided the logistical support, equipment, and training to conduct the field work, and Grand Canyon Youth provided the time and personnel to complete it. Two data collection efforts were conducted during the summer of 2004, giving 40 youth the opportunity to contribute valuable information to the Control Point Database. This information included verification of point existence, photographs, and accurate site descriptions concisely describing the location of each point, how to reach it, the specific point location, and detailed bearings to visible and obvious landmarks. The youth learned to locate themselves and find the points using 1:1000 airphotos, write detailed site descriptions, take bearings with a compass, measure vertical and horizontal distances, and use a digital camera. The youth found information for 252 control points (29% of the total).

  8. Loopedia, a database for loop integrals

    NASA Astrophysics Data System (ADS)

    Bogner, C.; Borowka, S.; Hahn, T.; Heinrich, G.; Jones, S. P.; Kerner, M.; von Manteuffel, A.; Michel, M.; Panzer, E.; Papara, V.

    2018-04-01

    Loopedia is a new database at loopedia.org for information on Feynman integrals, intended to provide both bibliographic information and results made available by the community. Its bibliometry is complementary to that of INSPIRE or arXiv in the sense that it admits searching for integrals by graph-theoretical objects, e.g. by topology.

  9. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.

    PubMed

    May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk

    2009-05-04

    The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available at http://chlamycyc.mpimp-golm.mpg.de.

  10. Integrating stations from the North America Gravity Database into a local GPS-based land gravity survey

    USGS Publications Warehouse

    Shoberg, Thomas G.; Stoddard, Paul R.

    2013-01-01

    The ability to augment local gravity surveys with additional gravity stations from easily accessible national databases can greatly increase the areal coverage and spatial resolution of a survey. It is, however, necessary to integrate such data seamlessly with the local survey. One challenge to overcome in integrating data from national databases is that these data are typically of unknown quality. This study presents a procedure for the evaluation and seamless integration of gravity data of unknown quality from a national database with data from a local Global Positioning System (GPS)-based survey. The starting components include the latitude, longitude, elevation and observed gravity at each station location. Interpolated surfaces of the complete Bouguer anomaly are used as a means of quality control and comparison. The result is an integrated dataset of varying quality with many stations having GPS accuracy and other reliable stations of unknown origin, yielding a wider coverage and greater spatial resolution than either survey alone.
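
    The quality-control idea described above, using interpolated surfaces of the complete Bouguer anomaly to evaluate stations of unknown quality, can be sketched as follows. This is an illustrative sketch, not the authors' procedure: the station coordinates, anomaly values and acceptance threshold are all invented.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical station data: (longitude, latitude, complete Bouguer anomaly in mGal).
local = np.array([[-89.00, 41.00, -35.2], [-88.90, 41.10, -34.8],
                  [-88.80, 41.00, -33.9], [-88.95, 41.05, -34.5]])
national = np.array([[-89.05, 41.02, -35.6], [-88.85, 41.08, -34.1],
                     [-88.90, 40.98, -34.9]])

# Interpolate the local survey's anomaly surface at the national stations;
# large residuals flag stations of questionable quality. Stations outside
# the local survey's convex hull interpolate to NaN and are rejected.
predicted = griddata(local[:, :2], local[:, 2], national[:, :2], method="linear")
residuals = national[:, 2] - predicted
keep = np.abs(residuals) < 1.0   # hypothetical 1 mGal acceptance threshold
print(residuals, keep)
```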

  11. The EBI SRS server-new features.

    PubMed

    Zdobnov, Evgeny M; Lopez, Rodrigo; Apweiler, Rolf; Etzold, Thure

    2002-08-01

    Here we report on recent developments at the EBI SRS server (http://srs.ebi.ac.uk). SRS has become an integration system for both data retrieval and sequence analysis applications. The EBI SRS server is a primary gateway to major databases in the field of molecular biology produced and supported at the EBI, as well as the European public access point to the MEDLINE database provided by the US National Library of Medicine (NLM). It is a reference server for the latest developments in data and application integration. The new additions include: the concept of virtual databases; integration of XML databases such as the Integrated Resource of Protein Domains and Functional Sites (InterPro), Gene Ontology (GO), MEDLINE and metabolic pathways; user-friendly data representation in 'Nice views'; and SRSQuickSearch bookmarklets. SRS6 is a licensed product of LION Bioscience AG, freely available for academics. The EBI SRS server (http://srs.ebi.ac.uk) is a free central resource for molecular biology data as well as a reference server for the latest developments in data integration.

  12. Towards an integrated European strong motion data distribution

    NASA Astrophysics Data System (ADS)

    Luzi, Lucia; Clinton, John; Cauzzi, Carlo; Puglia, Rodolfo; Michelini, Alberto; Van Eck, Torild; Sleeman, Reinhoud; Akkar, Sinan

    2013-04-01

    Recent decades have seen a significant increase in the quality and quantity of strong motion data collected in Europe, as dense, often real-time and continuously monitored broadband strong motion networks have been constructed in many nations. There has been a concurrent increase in demand for access to strong motion data, not only from researchers for engineering and seismological studies, but also from civil authorities and seismic networks for the rapid assessment of ground motion and shaking intensity following significant earthquakes (e.g. ShakeMaps). Aside from a few notable exceptions on the national scale, the databases providing access to strong motion data have not kept pace with these developments. In the framework of the EC infrastructure project NERA (2010-2014), which integrates key research infrastructures in Europe for monitoring earthquakes and assessing their hazard and risk, the network activity NA3 deals with the networking of acceleration networks and strong motion data. Within the NA3 activity, two infrastructures are being constructed: i) a Rapid Response Strong Motion (RRSM) database, which, following a strong event, automatically parameterises all available on-scale waveform data within the European Integrated waveform Data Archives (EIDA) and makes the waveforms easily available to the seismological community within minutes of an event; and ii) a European Strong Motion (ESM) database of accelerometric records, with associated metadata relevant to the earthquake engineering and seismology research communities, using standard, manual processing that reflects the state of the art and research needs in these fields. These two separate repositories form the core infrastructures being built to distribute strong motion data in Europe in order to guarantee rapid and long-term availability of high-quality waveform data to both the international scientific community and the hazard mitigation communities. These infrastructures will provide access to strong motion data in an eventual EPOS seismological service. A working group on strong motion data is being created at ORFEUS in 2013. This body, consisting of experts in strong motion data collection, processing and research from across Europe, will provide the umbrella organisation that will 1) have the political clout to negotiate data sharing agreements with strong motion data providers and 2) manage the software during the transition from the end of NERA to the EPOS community. We expect the community providing data to the RRSM and ESM to grow gradually, under the supervision of ORFEUS, and eventually include strong motion data from networks in all European countries that can have an open data policy.

  13. Building An Integrated Neurodegenerative Disease Database At An Academic Health Center

    PubMed Central

    Xie, Sharon X.; Baek, Young; Grossman, Murray; Arnold, Steven E.; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M.-Y.; Trojanowski, John Q.

    2010-01-01

    Background It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer’s disease (AD), Parkinson’s disease (PD), amyotrophic lateral sclerosis (ALS), and frontotemporal lobar degeneration (FTLD). These comparative studies rely on powerful database tools to quickly generate data sets that match diverse and complementary criteria set by the studies. Methods In this paper, we present a novel Integrated NeuroDegenerative Disease (INDD) database developed at the University of Pennsylvania (Penn) through a consortium of Penn investigators. Since these investigators work on AD, PD, ALS and FTLD, this allowed us to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used Microsoft SQL Server as the platform, with built-in “backwards” functionality to provide Access as a front-end client to interface with the database. We used PHP Hypertext Preprocessor to create the “front end” web interface and then integrated the individual neurodegenerative disease databases using a master lookup table. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Results We compare the results of a biomarker study using the INDD database to those of an alternative approach querying the individual databases separately. Conclusions We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies across several neurodegenerative diseases. PMID:21784346
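
    A minimal sketch of the master-lookup-table pattern described above, using SQLite instead of SQL Server for portability; the table names, columns and IDs are hypothetical, not the Penn INDD schema.

```python
import sqlite3

# Disease-specific tables keep their own local records, and a master table
# maps a global patient ID to each local ID, so one query spans databases.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ad (local_id TEXT, mmse INTEGER);        -- Alzheimer's cohort
CREATE TABLE pd (local_id TEXT, updrs INTEGER);       -- Parkinson's cohort
CREATE TABLE master (indd_id TEXT, source TEXT, local_id TEXT);
INSERT INTO ad VALUES ('AD-7', 22);
INSERT INTO pd VALUES ('PD-3', 31);
INSERT INTO master VALUES ('INDD-001', 'ad', 'AD-7'), ('INDD-002', 'pd', 'PD-3');
""")

# Query through the master table rather than each database separately.
rows = con.execute("""
SELECT m.indd_id, a.mmse
FROM master m JOIN ad a ON m.local_id = a.local_id
WHERE m.source = 'ad'
""").fetchall()
print(rows)   # [('INDD-001', 22)]
```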

  14. DBGC: A Database of Human Gastric Cancer

    PubMed Central

    Wang, Chao; Zhang, Jun; Cai, Mingdeng; Zhu, Zhenggang; Gu, Wenjie; Yu, Yingyan; Zhang, Xiaoyan

    2015-01-01

    The Database of Human Gastric Cancer (DBGC) is a comprehensive database that integrates various human gastric cancer-related data resources. Human gastric cancer-related transcriptomics projects, proteomics projects, mutations, biomarkers and drug-sensitive genes from different sources were collected and unified in this database. Moreover, epidemiological statistics of gastric cancer patients in China and clinicopathological information annotated with gastric cancer cases were also integrated into the DBGC. We believe that this database will greatly facilitate research regarding human gastric cancer in many fields. DBGC is freely available at http://bminfor.tongji.edu.cn/dbgc/index.do PMID:26566288

  15. The integrated web service and genome database for agricultural plants with biotechnology information

    PubMed Central

    Kim, ChangKug; Park, DongSuk; Seol, YoungJoo; Hahn, JangHo

    2011-01-01

    The National Agricultural Biotechnology Information Center (NABIC) constructed an agricultural biology-based infrastructure and developed a Web-based relational database for agricultural plants with biotechnology information. The NABIC has concentrated on functional genomics of major agricultural plants, building an integrated biotechnology database for agro-biotech information that focuses on genomics of major agricultural resources. This genome database provides annotated genome information from 1,039,823 records mapped to rice, Arabidopsis, and Chinese cabbage. PMID:21887015

  16. An Integrated Korean Biodiversity and Genetic Information Retrieval System

    PubMed Central

    Lim, Jeongheui; Bhak, Jong; Oh, Hee-Mock; Kim, Chang-Bae; Park, Yong-Ha; Paek, Woon Kee

    2008-01-01

    Background On-line biodiversity information databases are growing quickly and being integrated into general bioinformatics systems due to the advances of fast gene sequencing technologies and the Internet. These can reduce the cost and effort of performing biodiversity surveys and genetic searches, which allows scientists to spend more time researching and less time collecting and maintaining data. This will increase the rate of knowledge build-up and improve conservation. The biodiversity databases in Korea have been scattered among several institutes and local natural history museums with incompatible data types. Therefore, a comprehensive database and a nationwide web portal for biodiversity information are necessary in order to integrate diverse information resources, including molecular and genomic databases. Results The Korean Natural History Research Information System (NARIS) was built and serviced as the central biodiversity information system to collect and integrate the biodiversity data of various institutes and natural history museums in Korea. This database aims to be an integrated resource that contains additional biological information, such as genome sequences and molecular-level diversity. Currently, twelve institutes and museums in Korea are integrated via the DiGIR (Distributed Generic Information Retrieval) protocol, with the Darwin Core 2.0 format as its metadata standard for data exchange. Data quality control and statistical analysis functions have been implemented. In particular, integration of molecular and genetic information from the National Center for Biotechnology Information (NCBI) databases with NARIS was recently accomplished. NARIS can also be extended to accommodate other institutes abroad, and the whole system can be exported to establish local biodiversity management servers. Conclusion A Korean data portal, NARIS, has been developed to efficiently manage and utilize biodiversity data, which includes genetic resources. NARIS aims to be integral in maximizing bio-resource utilization for conservation, management, research, education, industrial applications, and integration with other bioinformation data resources. It can be found at . PMID:19091024

  17. PICKLE 2.0: A human protein-protein interaction meta-database employing data integration via genetic information ontology

    PubMed Central

    Gioutlakis, Aris; Klapa, Maria I.

    2017-01-01

    It has been acknowledged that source databases recording experimentally supported human protein-protein interactions (PPIs) exhibit limited overlap. Thus, the reconstruction of a comprehensive PPI network requires appropriate integration of multiple heterogeneous primary datasets, presenting the PPIs at various genetic reference levels. Existing PPI meta-databases perform integration via normalization; namely, PPIs are merged after converted to a certain target level. Hence, the node set of the integrated network depends each time on the number and type of the combined datasets. Moreover, the irreversible a priori normalization process hinders the identification of normalization artifacts in the integrated network, which originate from the nonlinearity characterizing the genetic information flow. PICKLE (Protein InteraCtion KnowLedgebasE) 2.0 implements a new architecture for this recently introduced human PPI meta-database. Its main novel feature over the existing meta-databases is its approach to primary PPI dataset integration via genetic information ontology. Building upon the PICKLE principles of using the reviewed human complete proteome (RHCP) of UniProtKB/Swiss-Prot as the reference protein interactor set, and filtering out protein interactions with low probability of being direct based on the available evidence, PICKLE 2.0 first assembles the RHCP genetic information ontology network by connecting the corresponding genes, nucleotide sequences (mRNAs) and proteins (UniProt entries) and then integrates PPI datasets by superimposing them on the ontology network without any a priori transformations. Importantly, this process allows the resulting heterogeneous integrated network to be reversibly normalized to any level of genetic reference without loss of the original information, the latter being used for identification of normalization biases, and enables the appraisal of potential false positive interactions through PPI source database cross-checking. The PICKLE web-based interface (www.pickle.gr) allows for the simultaneous query of multiple entities and provides integrated human PPI networks at either the protein (UniProt) or the gene level, at three PPI filtering modes. PMID:29023571
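
    The ontology-based integration step described above can be illustrated with a toy graph: PPIs recorded at the protein level are superimposed on a gene-to-mRNA-to-protein network and projected to the gene level only on demand, so the original record survives. A sketch under invented identifiers follows (this is not PICKLE code).

```python
import networkx as nx

# Hypothetical genetic-information ontology network: gene -> mRNA -> protein.
onto = nx.DiGraph()
onto.add_edges_from([
    ("gene:TP53", "mRNA:NM_000546"), ("mRNA:NM_000546", "prot:P04637"),
    ("gene:MDM2", "mRNA:NM_002392"), ("mRNA:NM_002392", "prot:Q00987"),
])

# A primary PPI reported at the protein (UniProt) level is superimposed
# on the ontology network without any a priori transformation.
ppi = [("prot:P04637", "prot:Q00987")]

def to_gene(node):
    """Walk ontology edges backwards from a protein to its gene."""
    while True:
        preds = list(onto.predecessors(node))
        if not preds:
            return node
        node = preds[0]

# Reversible normalization: project the PPI to the gene level on demand,
# keeping the original protein-level record intact.
gene_level = [(to_gene(a), to_gene(b)) for a, b in ppi]
print(gene_level)   # [('gene:TP53', 'gene:MDM2')]
```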

  18. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced database management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL), which is capable of detecting and acting upon complex patterns of events that are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
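
    The core idea of EPL, detecting an ordered pattern of events within a time limit, can be sketched in a few lines. This is an illustrative re-implementation of the concept, not EPL itself; the event types and window below are hypothetical.

```python
from datetime import datetime, timedelta

def match_sequence(events, pattern, window):
    """Return True if the event types in `pattern` occur in order,
    all within `window` of the first matching event.

    `events` is a list of (type, timestamp) tuples.
    """
    events = sorted(events, key=lambda e: e[1])
    for i, (etype, t0) in enumerate(events):
        if etype != pattern[0]:
            continue
        needed, deadline = list(pattern[1:]), t0 + window
        for etype2, t in events[i + 1:]:
            if t > deadline:
                break
            if needed and etype2 == needed[0]:
                needed.pop(0)
        if not needed:
            return True  # full pattern seen within the time limit
    return False

# Hypothetical "suspicious bank transactions" pattern within one hour:
log = [("deposit", datetime(2024, 1, 1, 9, 0)),
       ("withdraw", datetime(2024, 1, 1, 9, 20)),
       ("withdraw", datetime(2024, 1, 1, 9, 40))]
print(match_sequence(log, ["deposit", "withdraw", "withdraw"], timedelta(hours=1)))
```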

  19. The Perfect Marriage: Integrated Word Processing and Data Base Management Programs.

    ERIC Educational Resources Information Center

    Pogrow, Stanley

    1983-01-01

    Discussion of database integration and how it operates includes recommendations on compatible brand name word processing and database management programs, and a checklist for evaluating essential and desirable features of the available programs. (MBR)

  20. Integration of air traffic databases : a case study

    DOT National Transportation Integrated Search

    1995-03-01

    This report describes a case study to show the benefits from maximum utilization of existing air traffic databases. The study demonstrates the utility of integrating available data through developing and demonstrating a methodology addressing the iss...

  1. SUPERSITES INTEGRATED RELATIONAL DATABASE (SIRD)

    EPA Science Inventory

    As part of EPA's Particulate Matter (PM) Supersites Program (Program), the University of Maryland designed and developed the Supersites Integrated Relational Database (SIRD). Measurement data in SIRD include comprehensive air quality data from the 7 Supersite program locations f...

  2. Realization of Real-Time Clinical Data Integration Using Advanced Database Technology

    PubMed Central

    Yoo, Sooyoung; Kim, Boyoung; Park, Heekyong; Choi, Jinwook; Chun, Jonghoon

    2003-01-01

    As information & communication technologies have advanced, interest in mobile health care systems has grown. In order to obtain information seamlessly from distributed and fragmented clinical data from heterogeneous institutions, we need solutions that integrate data. In this article, we introduce a method for information integration based on real-time message communication using trigger and advanced database technologies. Messages were devised to conform to HL7, a standard for electronic data exchange in healthcare environments. The HL7 based system provides us with an integrated environment in which we are able to manage the complexities of medical data. We developed this message communication interface to generate and parse HL7 messages automatically from the database point of view. We discuss how easily real time data exchange is performed in the clinical information system, given the requirement for minimum loading of the database system. PMID:14728271
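
    HL7 v2 messages of the kind exchanged here are pipe-delimited segments separated by carriage returns. A minimal sketch of generating and parsing such a message follows; the segment content is invented and the parser is deliberately simplified (it keeps only one segment per type).

```python
# Segments are separated by carriage returns and fields by '|'; the field
# layout follows HL7 v2 conventions, but the message content is invented.
def build_hl7(patient_id, name, obs_value):
    segments = [
        "MSH|^~\\&|LABSYS|HOSP_A|CIS|HOSP_A|20240101120000||ORU^R01|MSG0001|P|2.3",
        f"PID|1||{patient_id}||{name}",
        f"OBX|1|NM|GLU^Glucose||{obs_value}|mg/dL",
    ]
    return "\r".join(segments)

def parse_hl7(message):
    """Return a dict mapping segment type to its list of fields."""
    return {seg.split("|")[0]: seg.split("|") for seg in message.split("\r")}

msg = build_hl7("123456", "DOE^JOHN", "105")
parsed = parse_hl7(msg)
print(parsed["OBX"][5], parsed["OBX"][6])   # observation value and units
```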

  3. Building a structured monitoring and evaluating system of postmarketing drug use in Shanghai.

    PubMed

    Du, Wenmin; Levine, Mitchell; Wang, Longxing; Zhang, Yaohua; Yi, Chengdong; Wang, Hongmin; Wang, Xiaoyu; Xie, Hongjuan; Xu, Jianglong; Jin, Huilin; Wang, Tongchun; Huang, Gan; Wu, Ye

    2007-01-01

    In order to understand a drug's full profile in the post-marketing environment, information is needed regarding utilization patterns, beneficial effects, ADRs and economic value. China, the most populated country in the world, has the largest number of people who are taking medications. To begin to appreciate the impact of these medications, a multifunctional evaluation and surveillance system was developed, the Shanghai Drug Monitoring and Evaluative System (SDMES). Set up by the Shanghai Center for Adverse Drug Reaction Monitoring in 2001, the SDMES contains three databases: a population health database of middle-aged and elderly persons; hospital patient medical records; and a spontaneous ADR reporting database. Each person has a unique identification and Medicare number, which permits record-linkage within and between these three databases. After more than three years in development, the population health database has comprehensive data for more than 320,000 residents. The hospital database has two years of inpatient medical records from five major hospitals, and will be increased to 10 hospitals in 2007. The spontaneous ADR reporting database has collected 20,205 cases since 2001 from approximately 295 sources, including hospitals, pharmaceutical companies, drug wholesalers and pharmacies. The SDMES has the potential to become an important national and international pharmacoepidemiology resource for drug evaluation.

  4. DICOM index tracker enterprise: advanced system for enterprise-wide quality assurance and patient safety monitoring

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Pavlicek, William; Panda, Anshuman; Langer, Steve G.; Morin, Richard; Fetterly, Kenneth A.; Paden, Robert; Hanson, James; Wu, Lin-Wei; Wu, Teresa

    2015-03-01

    DICOM Index Tracker (DIT) is an integrated platform that harvests the rich information available from Digital Imaging and Communications in Medicine (DICOM) to improve quality assurance in radiology practices. It is designed to capture and maintain longitudinal patient-specific exam indices of interest for all diagnostic and procedural uses of imaging modalities, thus effectively serving as a quality assurance and patient safety monitoring tool. The foundation of DIT is an intelligent database system which stores the information accepted and parsed via a DICOM receiver and parser; the database system enables basic dosimetry analysis. The success of the DIT implementation at Mayo Clinic Arizona calls for deployment at the enterprise level, which requires significant improvements. First, for geographically distributed multi-site implementation, one bottleneck is the communication (network) delay and another is the scalability of the DICOM parser to handle the large volume of exams from different sites; to address this issue, the DICOM receiver and parser are separated and decentralized by site. Second, a notable challenge for enterprise-wide quality assurance (QA) is the great diversity of manufacturers, modalities and software versions; as a solution, DIT Enterprise provides standardization tools for device naming, protocol naming, and physician naming across sites. Third, advanced analytic engines are implemented online to support proactive QA in DIT Enterprise.
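
    The receive-parse-store pipeline at the heart of DIT can be sketched with the pydicom library and SQLite. This is an illustrative sketch only; the chosen tags, schema and file path are hypothetical rather than DIT's actual design (real dose indices typically come from modality-specific dose reports).

```python
import sqlite3
import pydicom

def index_dicom(path, con):
    """Read a DICOM file, pull a few exam-level attributes, and store
    them in a local index table (tag choices are illustrative)."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    con.execute(
        "INSERT INTO exam_index VALUES (?, ?, ?, ?)",
        (str(getattr(ds, "PatientID", "")),
         str(getattr(ds, "Modality", "")),
         str(getattr(ds, "StationName", "")),
         str(getattr(ds, "StudyDate", ""))),
    )

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE exam_index "
            "(patient_id TEXT, modality TEXT, station TEXT, study_date TEXT)")
# index_dicom("exam.dcm", con)   # the file path is hypothetical
```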

  5. The ChArMEx database

    NASA Astrophysics Data System (ADS)

    Ferré, Helene; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Descloitres, Jacques; Fleury, Laurence; Focsa, Loredana; Henriot, Nicolas; Mastrorillo, Laurence; Mière, Arnaud; Vermeulen, Anne

    2014-05-01

    The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long-term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. ChArMEx scientists therefore produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, the distribution system and services, such as facilitating the exchange of information and stimulating collaboration between researchers within the ChArMEx community and beyond. The database relies on a strong collaboration between the OMP and ICARE data centres and has been set up in the framework of the Mediterranean Integrated Studies at Regional And Locals Scales (MISTRALS) program data portal. All the data produced by or of interest for the ChArMEx community will be documented in the data catalogue and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. At present, the ChArMEx database contains about 75 datasets, including 50 in situ datasets (2012 and 2013 campaigns, Ersa background monitoring station), 25 model outputs (dust model intercomparison, MEDCORDEX scenarios), and a high-resolution emission inventory over the Mediterranean. Many in situ datasets have been inserted in a relational database, in order to enable more accurate data selection and the download of different datasets in a shared format. The database website offers several tools: a registration procedure which enables any scientist to accept the data policy and apply for a user database account; a data catalogue that complies with international metadata standards (ISO 19115-19139; INSPIRE European Directive; Global Change Master Directory Thesaurus); metadata forms to document observations or products that will be provided to the database; a search tool to browse the catalogue using thematic, geographic and/or temporal criteria; a shopping-cart web interface to order in situ data files; and a web interface to select and access homogenized datasets. Interoperability between the two data centres is being set up using the OPeNDAP protocol. The data portal will soon propose user-friendly access to satellite products managed by the ICARE data centre (SEVIRI, TRMM, PARASOL...). In order to meet the operational needs of the airborne and ground-based observational teams during the ChArMEx 2012 and 2013 campaigns, a day-to-day chart and report display website has also been developed: http://choc.sedoo.org. It offers a convenient way to browse weather conditions and chemical composition during the campaign periods.

  6. The new Mediterranean background monitoring station of Ersa, Cape Corsica: A long term Observatory component of the Chemistry-Aerosol Mediterranean Experiment (ChArMEx)

    NASA Astrophysics Data System (ADS)

    Dulac, Francois

    2013-04-01

    The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) is a French initiative supported by the MISTRALS program (Mediterranean Integrated Studies at Regional And Locals Scales, http://www.mistrals-home.org). It aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The major stake is an understanding of the future of the Mediterranean region in a context of strong regional anthropogenic and climatic pressures. The target of ChArMEx is short-lived particulate and gaseous tropospheric trace species which cause poor air quality events, have two-way interactions with climate, or impact the marine biogeochemistry. In order to fulfill these objectives, important efforts were made in 2012 to implement the infrastructure and instrumentation for a fully equipped background monitoring station at Ersa, Cape Corsica, a key location at the crossroads of dusty southerly air masses and polluted outflows from the European continent. The observations at this station began in June 2012 (in the context of the EMEP / ACTRIS / PEGASOS / ChArMEx campaigns). A broad spectrum of aerosol properties is measured at the station: chemical composition (off-line daily filter sampling in PM2.5/PM10, on-line Aerosol Chemical Speciation Monitor), ground optical properties (extinction/absorption/light scattering coefficients with a 1-wavelength CAPS PMex monitor, a 7-wavelength Aethalometer and a 3-wavelength Nephelometer), integrated and vertically resolved optical properties (a 4-wavelength Cimel sunphotometer and a LIDAR, respectively), size distribution properties (N-AIS, SMPS, APS, and OPS instruments), mass (PM1/PM10 by TEOM/TEOM-FDMS), hygroscopicity (CCN), as well as total insoluble deposition. So far, real-time measurements of reactive gases (O3, CO, NO, NO2) and off-line VOC measurements (cylinders, cartridges) are also performed. A Kipp and Zonen system for monitoring direct and diffuse broadband radiative fluxes will soon be in operation, as well as an ICOS/RAMCES CO2 and CH4 monitoring instrument. Through this unprecedented effort and with support from the ChArMEx, ADEME, and CORSiCA programs (http://www.obs-mip.fr/corsica), this observatory is so far the most complete French atmospheric station, with the best set of instruments for measuring in-situ reactive gases and aerosols. It stands out as the station not of one laboratory but of a large number (see list of co-authors). It provides "real time" information useful to the local air quality network (Qualitair Corse, http://www.qualitaircorse.org/) concerning EU-regulated parameters (O3, PMx). This station aims at providing a quality-controlled, climatically relevant gas/aerosol database following the recommendations of the EU-FP7 ACTRIS infrastructure and the EMEP and WMO-GAW programs. Atmospheric datasets are currently available at the MISTRALS database (http://mistrals.sedoo.fr/ChArMEx/) and soon at the ACTRIS & GAW databases. After a brief presentation of the Cape Corsica station (location, climatology, instrumental settings ...), we present here the first months of aerosol properties (optical / chemical / particle size) obtained at this station. Acknowledgements: the station is mainly supported by ADEME, CNRS-INSU, CEA, CTC, EMD, FEDER, and Météo-France.

  7. Hydroacoustic propagation grids for the CTBT knowledge databases: BBN technical memorandum W1303

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Angell

    1998-05-01

    The Hydroacoustic Coverage Assessment Model (HydroCAM) has been used to develop components of the hydroacoustic knowledge database required by operational monitoring systems, particularly the US National Data Center (NDC). The database, which consists of travel time, amplitude correction and travel time standard deviation grids, is planned to support source location, discrimination and estimation functions of the monitoring network. The grids will also be used under the current BBN subcontract to support an analysis of the performance of the International Monitoring System (IMS) and national sensor systems. This report describes the format and contents of the hydroacoustic knowledgebase grids, and the procedures and model parameters used to generate these grids. Comparisons between the knowledge grids, measured data and other modeled results are presented to illustrate the strengths and weaknesses of the current approach. A recommended approach for augmenting the knowledge database with a database of expected spectral/waveform characteristics is provided in the final section of the report.
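
    Using such knowledge grids operationally means interpolating a stored grid at an arbitrary source location. A sketch of a bilinear travel-time lookup follows; the grid spacing and values are invented, not taken from the HydroCAM database.

```python
import numpy as np

# Hypothetical travel-time grid for one receiver, on a regular lat/lon mesh.
lats = np.linspace(-10, 10, 5)
lons = np.linspace(100, 120, 5)
travel_time = np.random.default_rng(0).uniform(1000, 1400, (5, 5))  # seconds

def lookup(lat, lon):
    """Bilinear interpolation of the travel-time grid at (lat, lon)."""
    i = np.searchsorted(lats, lat) - 1
    j = np.searchsorted(lons, lon) - 1
    fy = (lat - lats[i]) / (lats[i + 1] - lats[i])
    fx = (lon - lons[j]) / (lons[j + 1] - lons[j])
    return (travel_time[i, j] * (1 - fy) * (1 - fx)
            + travel_time[i + 1, j] * fy * (1 - fx)
            + travel_time[i, j + 1] * (1 - fy) * fx
            + travel_time[i + 1, j + 1] * fy * fx)

print(f"{lookup(2.5, 107.0):.1f} s")
```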

  8. MitBASE: a comprehensive and integrated mitochondrial DNA database. The present status

    PubMed Central

    Attimonelli, M.; Altamura, N.; Benne, R.; Brennicke, A.; Cooper, J. M.; D’Elia, D.; Montalvo, A. de; Pinto, B. de; De Robertis, M.; Golik, P.; Knoop, V.; Lanave, C.; Lazowska, J.; Licciulli, F.; Malladi, B. S.; Memeo, F.; Monnerot, M.; Pasimeni, R.; Pilbout, S.; Schapira, A. H. V.; Sloof, P.; Saccone, C.

    2000-01-01

    MitBASE is an integrated and comprehensive database of mitochondrial DNA data which collects, under a single interface, databases for Plant, Vertebrate, Invertebrate, Human, Protist and Fungal mtDNA, and a pilot database on nuclear genes involved in mitochondrial biogenesis in Saccharomyces cerevisiae. MitBASE reports all available information from different organisms and from intraspecies variants and mutants. Data have been drawn from the primary databases and from the literature; value-adding information has been structured, e.g., editing information on protist mtDNA genomes, pathological information for human mtDNA variants, etc. The different databases, some of which are structured using commercial packages (Microsoft Access, File Maker Pro) while others use a flat-file format, have been integrated under ORACLE. Ad hoc retrieval systems have been devised for some of the above-listed databases, taking into account their peculiarities. The database is resident at the EBI and is available at the following site: http://www3.ebi.ac.uk/Research/Mitbase/mitbase.pl . The project is intended to have an impact on both basic and applied research: the study of mitochondrial genetic diseases and mitochondrial DNA intraspecies diversity are key topics in several biotechnological fields. The database has been funded within the EU Biotechnology programme. PMID:10592207

  9. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology and databases. To appear in the Journal of Database Management.

  10. An integrated database-pipeline system for studying single nucleotide polymorphisms and diseases.

    PubMed

    Yang, Jin Ok; Hwang, Sohyun; Oh, Jeongsu; Bhak, Jong; Sohn, Tae-Kwon

    2008-12-12

    Studies on the relationship between disease and genetic variations such as single nucleotide polymorphisms (SNPs) are important, since genetic variations can cause disease by influencing important biological regulation processes. Despite the need to analyze correlations between SNPs and disease, most existing databases provide information only on functional variants at specific locations on the genome, or deal with only a few genes associated with disease. There is no combined resource that broadly supports gene-, SNP-, and disease-related information and captures the relationships among such data. Therefore, we developed an integrated database-pipeline system for studying SNPs and diseases. To implement the pipeline system for the integrated database, we first unified complicated and redundant disease terms and gene names using the Unified Medical Language System (UMLS) for classification and noun modification, and the HUGO Gene Nomenclature Committee (HGNC) and NCBI gene databases. Next, we collected and integrated representative databases for three categories of information. For genes and proteins, we examined the NCBI mRNA, UniProt, UCSC Table Track and MitoDat databases. For genetic variants, we used the dbSNP, JSNP, ALFRED, and HGVbase databases. For disease, we employed the OMIM, GAD, and HGMD databases. The database-pipeline system provides a disease thesaurus, including genes and SNPs associated with disease. The search results for these categories are available on the web page http://diseasome.kobic.re.kr/, and a genome browser is also available to highlight findings, as well as to permit the convenient review of potentially deleterious SNPs among genes strongly associated with specific diseases and clinical phenotypes. Our system is designed to capture the relationships between SNPs associated with disease and disease-causing genes. The integrated database-pipeline provides a list of candidate genes and SNP markers for evaluation in both epidemiological and molecular biological approaches to disease-gene association studies. Furthermore, researchers can then semi-automatically select the data sets for association studies while considering the relationships between genetic variation and diseases. The database can also make disease-association studies more economical and facilitate an understanding of the processes that cause disease. Currently, the database contains 14,674 SNP records and 109,715 gene records associated with human diseases, and it is updated at regular intervals.
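
    The term-unification step described above (collapsing redundant disease terms before linking genes, SNPs and diseases) can be illustrated with a toy synonym map; the mapping below is hypothetical and merely stands in for the UMLS-based normalization.

```python
# Hypothetical synonym map standing in for the UMLS-based unification step:
# variant disease spellings collapse to one canonical term before linking.
SYNONYMS = {
    "nsclc": "lung carcinoma, non-small-cell",
    "non-small cell lung cancer": "lung carcinoma, non-small-cell",
    "type ii diabetes": "diabetes mellitus, type 2",
}

def normalize(term):
    key = term.strip().lower()
    return SYNONYMS.get(key, key)

# Gene-SNP-disease associations from two sources, using different wording:
records = [("EGFR", "rs712829", "NSCLC"),
           ("EGFR", "rs712829", "Non-small cell lung cancer")]

# After normalization, the two records collapse to a single association.
unified = {(gene, snp, normalize(disease)) for gene, snp, disease in records}
print(unified)
```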

  11. Development of an alarm sound database and simulator.

    PubMed

    Takeuchi, Akihiro; Hirose, Minoru; Shinbo, Toshiro; Imai, Megumi; Mamorita, Noritaka; Ikeda, Noriaki

    2006-10-01

    The purpose of this study was to develop an interactive software package of alarm sounds to present, recognize and share problems concerning alarm sounds among medical staff and medical manufacturers. The alarm sounds were recorded under variable alarm conditions into WAV files; the alarm conditions were arbitrarily induced by modifying attachments of various medical devices. The software package, which integrated an alarm sound database and a simulator, was used to assess the ability of medical staff to identify the monitor that sounded an alarm. Eighty alarm sound files (40 MB in total) were recorded from 41 medical devices made by 28 companies. There were three pairs of similar alarm sounds that could not easily be distinguished, and two alarm sounds which had a different priority (either low or high). The alarm sound database was created in an Excel file (ASDB.xls, 170 kB; 40 MB with photos) and included a list of file names that were hyperlinked to the alarm sound files. An alarm sound simulator (AlmSS) was constructed with two modules, one for simultaneously playing alarm sound files and one for designing new alarm sounds. The AlmSS was used in an assessment procedure to determine whether 19 clinical engineers could identify 13 alarm sounds only by their distinctive sounds. They were asked to choose from a list of devices and to rate the priority of each alarm. The overall correct identification rate of the alarm sounds was 48%, and six characteristic alarm sounds were correctly recognized by between 63% and 100% of the subjects. The overall recognition rate of the alarm sound priority was only 27%. We have developed an interactive software package of alarm sounds by integrating the database and the alarm sound simulator (URL: http://info.ahs.kitasato-u.ac.jp/tkweb/alarm/asdb.html ). The AlmSS was useful for replaying multiple alarm sounds simultaneously and designing new alarm sounds interactively.

  12. GraphSAW: a web-based system for graphical analysis of drug interactions and side effects using pharmaceutical and molecular data.

    PubMed

    Shoshi, Alban; Hoppe, Tobias; Kormeier, Benjamin; Ogultarhan, Venus; Hofestädt, Ralf

    2015-02-28

    Adverse drug reactions are one of the most common causes of death in industrialized Western countries. Nowadays, empirical data from clinical studies for the approval and monitoring of drugs, as well as molecular databases, are available. The integration of database information is a promising method for providing well-founded knowledge to help avoid adverse drug reactions. This paper presents our web-based decision support system GraphSAW, which analyzes and evaluates drug interactions and side effects based on data from two commercial and two freely available molecular databases. The system is able to analyze single and combined drug-drug interactions and drug-molecule interactions, as well as single and cumulative side effects. In addition, it allows exploring associative networks of drugs, molecules, metabolic pathways, and diseases in an intuitive way. The molecular medication analysis includes the capabilities of all of the above features. A statistical evaluation of the integrated data and of the top 20 drugs with respect to drug interactions and side effects was performed. The results of the data analysis give an overview of all theoretically possible drug interactions and side effects. The evaluation shows a mismatch between pharmaceutical and molecular databases: the concordance was about 12% for drug interactions and about 9% for drug side effects. An application case with prescription data from 11 patients is presented in order to demonstrate the functionality of the system under real conditions. For each patient, at least two interactions occurred in every medication, and about 8% of all diseases were possibly induced by drug therapy. GraphSAW (http://tunicata.techfak.uni-bielefeld.de/graphsaw/) is meant to be a web-based system for health professionals and researchers. GraphSAW provides comprehensive drug-related knowledge and an improved medication analysis which may support efforts to reduce the risk of medication errors and numerous drastic side effects.
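
    The concordance figures quoted above are set-overlap statistics over unordered drug pairs. A sketch of one plausible way to compute them follows; the exact definition used by the authors is not specified in the abstract, and the drug pairs below are invented.

```python
# Each source contributes a set of unordered drug pairs; concordance here
# is the overlap relative to the union of the two sets.
def concordance(pairs_a, pairs_b):
    a = {frozenset(p) for p in pairs_a}
    b = {frozenset(p) for p in pairs_b}
    return len(a & b) / len(a | b) if a | b else 0.0

pharma = [("warfarin", "aspirin"), ("simvastatin", "clarithromycin")]
molecular = [("aspirin", "warfarin"), ("digoxin", "verapamil")]
print(f"{concordance(pharma, molecular):.0%}")   # 33% for this toy example
```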

  13. Extraction, integration and analysis of alternative splicing and protein structure distributed information

    PubMed Central

    D'Antonio, Matteo; Masseroli, Marco

    2009-01-01

    Background Alternative splicing has been demonstrated to affect most human genes; different isoforms from the same gene encode proteins which differ by a limited number of residues, thus yielding similar structures. This suggests possible correlations between alternative splicing and protein structure. In order to support the investigation of such relationships, we have developed the Alternative Splicing and Protein Structure Scrutinizer (PASS), a Web application to automatically extract, integrate and analyze human alternative splicing and protein structure data sparsely available in the Alternative Splicing Database, the Ensembl databank and the Protein Data Bank. Primary data from these databases have been integrated and analyzed using the Protein Identifier Cross-Reference, BLAST, CLUSTALW and FeatureMap3D software tools. Results A database has been developed to store the considered primary data and the results of their analysis; a system of Perl scripts has been implemented to automatically create and update the database and analyze the integrated data; a Web interface has been implemented to make the analyses easily accessible; a database has been created to manage user access to the PASS Web application and store users' data and searches. Conclusion PASS automatically integrates data from the Alternative Splicing Database with protein structure data from the Protein Data Bank. Additionally, it comprehensively analyzes the integrated data with publicly available, well-known bioinformatics tools in order to generate structural information for isoform pairs. Further analysis of this valuable information might reveal interesting relationships between alternative splicing and protein structure differences, which may be significantly associated with different functions. PMID:19828075

  14. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently in csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user; for example, the software can be run once per day to analyze data collected on the previous day. HTML-formatted reports can be sent via email or hosted on a website. To compare the performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
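
    A minimal sketch of the DPM workflow described above: load a time-indexed CSV, apply user-defined quality control tests, and emit a summary table. The file name, column handling and limits are placeholders, not DPM's actual configuration.

```python
import pandas as pd

def run_qc(csv_path, lower, upper):
    """Run two simple QC tests on every column of a time-indexed CSV:
    a missing-data check and a range check. Returns a summary table."""
    df = pd.read_csv(csv_path, index_col=0, parse_dates=True)
    report = {}
    for col in df.columns:
        s = df[col]
        report[col] = {
            "missing": int(s.isna().sum()),
            "below_lower": int((s < lower).sum()),
            "above_upper": int((s > upper).sum()),
        }
    return pd.DataFrame(report).T

# Hypothetical usage for a photovoltaic dataset (W/m^2 irradiance limits):
# print(run_qc("pv_system_day1.csv", lower=0.0, upper=1500.0))
```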

  15. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
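
    The record-versus-concept distinction at the core of KaBOB can be illustrated with a few RDF triples: identifiers from different source databases are linked to the biomedical concepts they denote, and queries are phrased in terms of concepts. The namespaces and predicate names below are invented for illustration, not KaBOB's actual vocabulary.

```python
from rdflib import Graph, Namespace, RDF

# Toy version of the record-vs-concept distinction: a database record node
# is linked to the biomedical concept it denotes, so identifiers from
# several sources can aggregate on one concept.
EX = Namespace("http://example.org/kabob-sketch/")
g = Graph()

g.add((EX["record/uniprot_P04637"], RDF.type, EX.DatabaseRecord))
g.add((EX["record/ncbigene_7157"], RDF.type, EX.DatabaseRecord))
g.add((EX["record/uniprot_P04637"], EX.denotes, EX["concept/p53_protein"]))
g.add((EX["record/ncbigene_7157"], EX.denotes, EX["concept/p53_gene"]))
g.add((EX["concept/p53_gene"], EX.hasGeneProduct, EX["concept/p53_protein"]))

# Query in terms of concepts rather than source-specific records.
for record in g.subjects(EX.denotes, EX["concept/p53_protein"]):
    print(record)
```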

  16. ACCESS, Absolute Color Calibration Experiment for Standard Stars: Integration, Test, and Ground Performance

    NASA Astrophysics Data System (ADS)

    Kaiser, Mary Elizabeth; Morris, Matthew; Aldoroty, Lauren; Kurucz, Robert; McCandliss, Stephan; Rauscher, Bernard; Kimble, Randy; Kruk, Jeffrey; Wright, Edward L.; Feldman, Paul; Riess, Adam; Gardner, Jonathon; Bohlin, Ralph; Deustua, Susana; Dixon, Van; Sahnow, David J.; Perlmutter, Saul

    2018-01-01

    Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. Systematic errors associated with astrophysical data used to probe fundamental astrophysical questions, such as SNeIa observations used to constrain dark energy theories, now exceed the statistical errors associated with merged databases of these measurements. ACCESS, the “Absolute Color Calibration Experiment for Standard Stars”, is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards, with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35–1.7 μm bandpass. To achieve this goal, ACCESS (1) observes HST/Calspec stars (2) above the atmosphere to eliminate telluric spectral contaminants (e.g. OH), (3) using a single optical path and (HgCdTe) detector (4) that is calibrated to NIST laboratory standards and (5) monitored on the ground and in flight using an on-board calibration monitor. The observations are (6) cross-checked and extended through the generation of stellar atmosphere models for the targets. The ACCESS telescope and spectrograph have been designed, fabricated, and integrated, and subsystems have been tested. Performance results for the subsystems, operations testing, and the integrated spectrograph will be presented. NASA sounding rocket grant NNX17AC83G supports this work.

  17. Heterogeneous Biomedical Database Integration Using a Hybrid Strategy: A p53 Cancer Research Database

    PubMed Central

    Bichutskiy, Vadim Y.; Colman, Richard; Brachmann, Rainer K.; Lathrop, Richard H.

    2006-01-01

    Complex problems in life science research give rise to multidisciplinary collaboration, and hence, to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and a small drug-like molecule with the ability to restore native function to cancerous p53 mutants is a long-held medical goal of cancer treatment. The Cancer Research DataBase (CRDB) was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity, and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable, http://www.igb.uci.edu/research/research.html.) PMID:19458771

  18. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    NASA Technical Reports Server (NTRS)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen, along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables as well as RMS error were collected. An automated mood survey provided 'state' information on six scales: anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  19. Network-based drug discovery by integrating systems biology and computational technologies

    PubMed Central

    Leung, Elaine L.; Cao, Zhi-Wei; Jiang, Zhi-Hong; Zhou, Hua

    2013-01-01

    Network-based intervention has become a trend in treating systemic diseases, but it relies on regimen optimization and valid multi-target actions of the drugs. The complex multi-component nature of medicinal herbs may serve as a valuable resource for network-based multi-target drug discovery due to its potential treatment effects by synergy. Recently, multiple systems biology platforms have proven powerful for uncovering molecular mechanisms and connections between drugs and their targeted dynamic networks. However, optimization methods for drug combinations remain insufficient, owing to the lack of tighter integration across multiple '-omics' databases. Newly developed algorithm- and network-based computational models can tightly integrate '-omics' databases and optimize combinational regimens of drug development, which encourages the development of medicinal herbs into a new wave of network-based multi-target drugs. Nevertheless, challenges to further integration of medicinal herb databases with multiple systems biology platforms for multi-target drug optimization remain, owing to the uncertain reliability of individual data sets and the limited breadth, depth and degree of standardization of herbal medicine. Standardization of the methodology and terminology of multiple systems biology and herbal databases would facilitate this integration, as would enhancing publicly accessible databases and increasing the number of studies applying systems biology platforms to herbal medicine. Further integration across various '-omics' platforms and computational tools would accelerate the development of network-based drug discovery and network medicine. PMID:22877768

  20. A geo-spatial data management system for potentially active volcanoes—GEOWARN project

    NASA Astrophysics Data System (ADS)

    Gogu, Radu C.; Dietrich, Volker J.; Jenny, Bernhard; Schwandner, Florian M.; Hurni, Lorenz

    2006-02-01

    Integrated studies of active volcanic systems for the purpose of long-term monitoring and forecast and short-term eruption prediction require large numbers of data-sets from various disciplines. A modern database concept has been developed for managing and analyzing multi-disciplinary volcanological data-sets. The GEOWARN project (choosing the "Kos-Yali-Nisyros-Tilos volcanic field, Greece" and the "Campi Flegrei, Italy" as test sites) is oriented toward potentially active volcanoes situated in regions of high geodynamic unrest. This article describes the volcanological database of the spatial and temporal data acquired within the GEOWARN project. As a first step, a spatial database embedded in a Geographic Information System (GIS) environment was created. Digital data of different spatial resolution, and time-series data collected at different intervals or periods, were unified in a common, four-dimensional representation of space and time. The database scheme comprises various information layers containing geographic data (e.g. seafloor and land digital elevation model, satellite imagery, anthropogenic structures, land-use), geophysical data (e.g. from active and passive seismicity, gravity, tomography, SAR interferometry, thermal imagery, differential GPS), geological data (e.g. lithology, structural geology, oceanography), and geochemical data (e.g. from hydrothermal fluid chemistry and diffuse degassing features). As a second step based on the presented database, spatial data analysis has been performed using custom-programmed interfaces that execute query scripts resulting in a graphical visualization of data. These query tools were designed and compiled following scenarios of known "behavior" patterns of dormant volcanoes and first candidate signs of potential unrest. The spatial database and query approach is intended to facilitate scientific research on volcanic processes and phenomena, and volcanic surveillance.

  1. BIOSPIDA: A Relational Database Translator for NCBI.

    PubMed

    Hagen, Matthew S; Lee, Eva K

    2010-11-13

    As the volume and availability of biological databases continue their widespread growth, it has become increasingly difficult for research scientists to identify all relevant information for biological entities of interest. Details of nucleotide sequences, gene expression, molecular interactions, and three-dimensional structures are maintained across many different databases. Retrieving all necessary information requires an integrated system that can query multiple databases with minimized overhead. This paper introduces a universal parser and relational schema translator that can be utilized for all NCBI databases in Abstract Syntax Notation (ASN.1). The data models for OMIM, Entrez-Gene, Pubmed, MMDB and GenBank have been successfully converted into relational databases, and all are easily linkable, helping to answer complex biological questions. These tools enable research scientists to locally integrate databases from NCBI without significant workload or development time.

  2. Building an integrated neurodegenerative disease database at an academic health center.

    PubMed

    Xie, Sharon X; Baek, Young; Grossman, Murray; Arnold, Steven E; Karlawish, Jason; Siderowf, Andrew; Hurtig, Howard; Elman, Lauren; McCluskey, Leo; Van Deerlin, Vivianna; Lee, Virginia M-Y; Trojanowski, John Q

    2011-07-01

    It is becoming increasingly important to study common and distinct etiologies, clinical and pathological features, and mechanisms related to neurodegenerative diseases such as Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration. These comparative studies rely on powerful database tools to quickly generate data sets that match the diverse and complementary criteria set by the investigators. In this article, we present a novel integrated neurodegenerative disease (INDD) database, which was developed at the University of Pennsylvania (Penn) with the help of a consortium of Penn investigators. Because the work of these investigators covers Alzheimer's disease, Parkinson's disease, amyotrophic lateral sclerosis, and frontotemporal lobar degeneration, we were able to achieve the goal of developing an INDD database for these major neurodegenerative disorders. We used Microsoft SQL Server as the platform, with built-in "backwards" functionality to provide Access as a frontend client to interface with the database. We used PHP Hypertext Preprocessor to create the "frontend" web interface and then used a master lookup table to integrate the individual neurodegenerative disease databases. We also present methods of data entry, database security, database backups, and database audit trails for this INDD database. Using the INDD database, we compared the results of a biomarker study with those obtained using an alternative approach of querying individual databases separately. We have demonstrated that the Penn INDD database has the ability to query multiple database tables from a single console with high accuracy and reliability. The INDD database provides a powerful tool for generating data sets in comparative studies on several neurodegenerative diseases. Copyright © 2011 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
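
    A minimal sketch of the master-lookup-table integration described above, using sqlite3 so the example is self-contained (the Penn system runs on Microsoft SQL Server); all table and column names here are hypothetical, not the INDD schema.

```python
# Sketch: a master lookup table maps a global patient ID to records in
# individual disease databases, so one query spans all of them.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE master_lookup (global_id INTEGER, disease_db TEXT, local_id INTEGER);
CREATE TABLE ad_db (local_id INTEGER, csf_tau REAL);       -- hypothetical
CREATE TABLE pd_db (local_id INTEGER, updrs_score INTEGER); -- hypothetical
INSERT INTO master_lookup VALUES (1, 'ad_db', 101), (2, 'pd_db', 202);
INSERT INTO ad_db VALUES (101, 612.5);
INSERT INTO pd_db VALUES (202, 34);
""")

# One console query spanning both disease databases via the lookup table.
cur.execute("""
SELECT m.global_id, a.csf_tau, p.updrs_score
FROM master_lookup m
LEFT JOIN ad_db a ON m.disease_db = 'ad_db' AND m.local_id = a.local_id
LEFT JOIN pd_db p ON m.disease_db = 'pd_db' AND m.local_id = p.local_id
""")
for row in cur.fetchall():
    print(row)
```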

  3. Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)

    NASA Astrophysics Data System (ADS)

    Budi Santoso, Halim; Delima, Rosa

    2018-03-01

    The Integrated Agriculture Information System (IAIS) is a system developed to process data, information, and knowledge in the agriculture sector. It brings valuable information to farmers: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; and (7) innovation in agricultural processing. The system comprises nine subsystems. To deliver integrated information to users and stakeholders, it needs an integrated database approach. This paper therefore describes the data entities and the matrix relating them to the subsystems of IAIS. As a result, 47 data entities are identified for a single, integrated database.

  4. [Implementation of Oncomelania hupensis monitoring system based on Baidu Map].

    PubMed

    Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang

    2017-10-25

    To construct an Oncomelania hupensis snail monitoring system based on Baidu Map, basic environmental information about historical and existing snail habitats was collected together with monitoring data on different kinds of O. hupensis snails, and the snail monitoring system was then built. Geographic Information System (GIS) technology, electronic fence technology and the Application Program Interface (API) were applied to set up electronic fences around the snail surveillance environments, and the electronic fences were connected to the snail surveillance database. The O. hupensis snail monitoring system based on Baidu Map was built with three modules: the O. hupensis Snail Monitoring Environmental Database, the Dynamic Monitoring Platform and the Electronic Map. Information about O. hupensis snail monitoring can be obtained through both computers and smartphones. The system is a visible platform for following the processes of snail searching and mollusciciding.
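
    At its core, the electronic-fence idea above reduces to testing whether a reported survey coordinate lies inside a fenced surveillance polygon. The following Python sketch shows a standard ray-casting point-in-polygon test with an invented fence; the actual system performs this through the Baidu Map API rather than in local code.

```python
# Ray-casting point-in-polygon test for an electronic fence.
# The fence coordinates below are invented for illustration.
def point_in_fence(lng, lat, fence):
    """fence: list of (lng, lat) vertices of a closed polygon."""
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        # Does a horizontal ray from (lng, lat) cross edge (i, i+1)?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lng < x_cross:
                inside = not inside
    return inside

fence = [(120.10, 31.30), (120.20, 31.30), (120.20, 31.40), (120.10, 31.40)]
print(point_in_fence(120.15, 31.35, fence))  # True: inside the fence
```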

  5. Simulation of CNT-AFM tip based on finite element analysis for targeted probe of the biological cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousefi, Amin Termeh, E-mail: at.tyousefi@gmail.com; Miyake, Mikio, E-mail: miyakejaist@gmail.com; Ikeda, Shoichiro, E-mail: sho16.ikeda@gmail.com

    Carbon nanotubes (CNTs) are potentially ideal tips for atomic force microscopy (AFM) due to their robust mechanical properties, nanoscale diameter and ability to be functionalized with chemical and biological components at the tip ends. This contribution develops the idea of using CNTs as an AFM tip in computational analysis of biological cells. Finite element analysis was employed for each section, and the displacement of nodes located in the contact area was monitored using an output database (ODB). This integration of the CNT-AFM tip process provides a new class of high-performance nanoprobes for single-cell biological analysis.

  6. SISSY: An example of a multi-threaded, networked, object-oriented databased application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scipioni, B.; Liu, D.; Song, T.

    1993-05-01

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.

  7. The roles of nearest neighbor methods in imputing missing data in forest inventory and monitoring databases

    Treesearch

    Bianca N. I. Eskelson; Hailemariam Temesgen; Valerie Lemay; Tara M. Barrett; Nicholas L. Crookston; Andrew T. Hudak

    2009-01-01

    Almost universally, forest inventory and monitoring databases are incomplete, ranging from missing data for only a few records and a few variables, common for small land areas, to missing data for many observations and many variables, common for large land areas. For a wide variety of applications, nearest neighbor (NN) imputation methods have been developed to fill in...
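
    As a toy illustration of the NN imputation approach discussed above (not the authors' implementation), the sketch below fills a missing plot attribute by copying it from the most similar complete record; the plot data and the variables used as distance axes are invented.

```python
# 1-nearest-neighbor imputation: a record with a missing attribute copies it
# from the closest complete record, by Euclidean distance over observed
# variables. All values are invented; real uses would rescale the axes.
import math

plots = [
    {"elevation": 320.0, "basal_area": 24.1, "volume": 310.0},
    {"elevation": 455.0, "basal_area": 30.5, "volume": 415.0},
    {"elevation": 330.0, "basal_area": 25.0, "volume": None},  # missing
]

def impute_volume(target, reference):
    donors = [p for p in reference if p["volume"] is not None]
    def distance(p):
        return math.hypot(p["elevation"] - target["elevation"],
                          p["basal_area"] - target["basal_area"])
    nearest = min(donors, key=distance)   # donor plot most similar to target
    target["volume"] = nearest["volume"]  # copy the donor's observed value

impute_volume(plots[2], plots)
print(plots[2])  # volume imputed from the closest complete plot
```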

  8. Federated Search Tools in Fusion Centers: Bridging Databases in the Information Sharing Environment

    DTIC Science & Technology

    2012-09-01

    ...considerable variation in how fusion centers plan for, gather requirements, select and acquire federated search tools to bridge disparate databases... ...centers, when considering integrating federated search tools; by evaluating the importance of the planning, requirements gathering, selection and acquisition processes for integrating federated search tools; by acknowledging the challenges faced by some fusion centers during these integration processes...

  9. The Epimed Monitor ICU Database®: a cloud-based national registry for adult intensive care unit patients in Brazil.

    PubMed

    Zampieri, Fernando Godinho; Soares, Márcio; Borges, Lunna Perdigão; Salluh, Jorge Ibrain Figueira; Ranzani, Otávio Tavares

    2017-01-01

    To describe the Epimed Monitor Database®, a Brazilian intensive care unit quality improvement database. We described the Epimed Monitor® Database, including its structure and core data. We presented aggregated informative data from intensive care unit admissions from 2010 to 2016 using descriptive statistics. We also described the expansion and growth of the database along with the geographical distribution of participating units in Brazil. The core data from the database includes demographic, administrative and physiological parameters, as well as specific report forms used to gather detailed data regarding the use of intensive care unit resources, infectious episodes, adverse events and checklists for adherence to best clinical practices. As of the end of 2016, 598 adult intensive care units in 318 hospitals totaling 8,160 intensive care unit beds were participating in the database. Most units were located at private hospitals in the southeastern region of the country. The number of yearly admissions rose during this period and included a predominance of medical admissions. The proportion of admissions due to cardiovascular disease declined, while admissions due to sepsis or infections became more common. Illness severity (Simplified Acute Physiology Score - SAPS 3 - 62 points), patient age (mean = 62 years) and hospital mortality (approximately 17%) remained reasonably stable during this time period. A large private database of critically ill patients is feasible and may provide relevant nationwide epidemiological data for quality improvement and benchmarking purposes among the participating intensive care units. This database is useful not only for administrative reasons but also for the improvement of daily care by facilitating the adoption of best practices and use for clinical research.

  10. Monitoring product safety in the postmarketing environment.

    PubMed

    Sharrar, Robert G; Dieck, Gretchen S

    2013-10-01

    The safety profile of a medicinal product may change in the postmarketing environment. Safety issues not identified in clinical development may be seen and need to be evaluated. Methods of evaluating spontaneous adverse experience reports and identifying new safety risks include a review of individual reports, a review of a frequency distribution of a list of the adverse experiences, the development and analysis of a case series, and various ways of examining the database for signals of disproportionality, which may suggest a possible association. Regulatory agencies monitor product safety through a variety of mechanisms including signal detection of the adverse experience safety reports in databases and by requiring and monitoring risk management plans, periodic safety update reports and postauthorization safety studies. The United States Food and Drug Administration is working with public, academic and private entities to develop methods for using large electronic databases to actively monitor product safety. Important identified risks will have to be evaluated through observational studies and registries.
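
    One widely used disproportionality statistic of the kind mentioned above is the proportional reporting ratio (PRR). The sketch below computes it from a 2x2 contingency table of spontaneous reports; the counts are invented for illustration and this is not any agency's implementation.

```python
# Proportional reporting ratio (PRR) from a 2x2 table of spontaneous reports.
def proportional_reporting_ratio(a, b, c, d):
    """
    a: reports with the drug of interest AND the event of interest
    b: reports with the drug of interest, other events
    c: reports with other drugs AND the event of interest
    d: reports with other drugs, other events
    """
    rate_drug = a / (a + b)    # event rate among reports for the drug
    rate_other = c / (c + d)   # event rate among all other reports
    return rate_drug / rate_other

# A PRR well above 1 suggests the event is reported disproportionately
# often for this drug and may warrant evaluation as a signal.
print(proportional_reporting_ratio(a=20, b=380, c=100, d=19500))
```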

  11. Participation in a national nursing outcomes database: monitoring outcomes over time.

    PubMed

    Loan, Lori A; Patrician, Patricia A; McCarthy, Mary

    2011-01-01

    The current and future climates in health care require increased accountability of health care organizations for the quality of the care they provide. Never before in the history of health care in America has this focus on quality been so critical. The imperative to measure nursing's impact without fully developed and tested monitoring systems is a critical issue for nurse executives and managers alike. This article describes a project to measure nursing structure, process, and outcomes in the military health system: the Military Nursing Outcomes Database project. Here we review the effectiveness of this project in monitoring changes over time and in satisfying the expectations of nurse leaders in participating hospitals, and we evaluate the potential budgetary impacts of such a system. We conclude that the Military Nursing Outcomes Database did meet the needs of a monitoring system that is sensitive to changes in outcomes over time and provides interpretable data for nurse leaders, and that it could result in cost benefits and patient care improvements in organizations.

  12. A scientific database for real-time Neutron Monitor measurements - taking Neutron Monitors into the 21st century

    NASA Astrophysics Data System (ADS)

    Steigies, Christian

    2012-07-01

    The Neutron Monitor Database project, www.nmdb.eu, was funded in 2008 and 2009 by the European Commission's 7th framework program (FP7). Neutron monitors (NMs) have been in use worldwide since the International Geophysical Year (IGY) in 1957, and cosmic ray data from the IGY and the improved NM64 NMs have been distributed since that time, but a common data format existed only for data with one-hour resolution. These data were first distributed in printed books, later via the World Data Center ftp server. In the 1990s the first NM stations started to record data at higher resolutions (typically 1 minute) and publish it on their web pages. However, every NM station chose its own format, making it cumbersome to work with this distributed data. In NMDB, all European and some neighboring NM stations came together to agree on a common format for high-resolution data and made it available via a centralized database. The goal of NMDB is to make all data from all NM stations available in real time. The original NMDB network has recently been joined by the Bartol Research Institute (Newark DE, USA), the National Autonomous University of Mexico and North-West University (Potchefstroom, South Africa). The data are accessible to everyone via an easy-to-use web interface, but expert users can also access the database directly to build applications such as real-time space weather alerts. Even though SQL databases are used today by most web services (blogs, wikis, social media, e-commerce), the power of an SQL database has not yet been fully realized by the scientific community. In training courses, we teach how to make use of NMDB, how to join NMDB, and how to ensure data quality. The present status of the extended NMDB will be presented. The consortium welcomes further data providers to help increase the scientific contributions of the worldwide neutron monitor network to heliospheric physics and space weather.
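
    To suggest what direct SQL access of the kind described above looks like, the following sketch queries a stand-in table with sqlite3. NMDB itself is a MySQL service, and the table and column names here are assumptions for illustration, not the NMDB schema.

```python
# Stand-in for direct SQL access to 1-minute neutron monitor data:
# aggregate minute-resolution counts to hourly averages per station.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE nm_1min (station TEXT, ts TEXT, corrected_counts REAL)")
cur.executemany("INSERT INTO nm_1min VALUES (?, ?, ?)", [
    ("KIEL", "2012-07-01 00:00", 168.2),   # invented sample values
    ("KIEL", "2012-07-01 00:01", 167.9),
    ("OULU", "2012-07-01 00:00", 104.5),
])

# Hourly averages: the resolution of the historical printed data sets.
cur.execute("""
SELECT station, substr(ts, 1, 13) AS hour, AVG(corrected_counts)
FROM nm_1min GROUP BY station, hour
""")
print(cur.fetchall())
```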

  13. Integrated Database And Knowledge Base For Genomic Prospective Cohort Study In Tohoku Medical Megabank Toward Personalized Prevention And Medicine.

    PubMed

    Ogishima, Soichi; Takai, Takako; Shimokawa, Kazuro; Nagaie, Satoshi; Tanaka, Hiroshi; Nakaya, Jun

    2015-01-01

    The Tohoku Medical Megabank project is a national project for the revitalization of the area of the Tohoku region struck by the Great East Japan Earthquake, and it has conducted a large-scale prospective genome-cohort study. Along with this study, we have developed an integrated database and knowledge base, which will be a key resource for realizing personalized prevention and medicine.

  14. The national response for preventing healthcare-associated infections: data and monitoring.

    PubMed

    Kahn, Katherine L; Weinberg, Daniel A; Leuschner, Kristin J; Gall, Elizabeth M; Siegel, Sari; Mendel, Peter

    2014-02-01

    Historically, the ability to accurately track healthcare-associated infections (HAIs) was hindered by a lack of coordination among data sources and shortcomings in individual data sources. This paper presents the results of the evaluation of the HAI data and monitoring component of the Action Plan, focusing on context (goals), inputs, and processes. We used the Context-Input-Process-Product framework, together with the HAI prevention system framework, to describe the transformative processes associated with data and monitoring efforts. Six HAI priority conditions in the 2009 Action Plan created a focus for the selection of goals and activities. Key Action Plan decisions included a phased-in data and monitoring approach, commitment to linking the selection of priority HAIs to highly visible national 5-year prevention targets, and the development of a comprehensive HAI database inventory. Remaining challenges relate to data validation, resources, and the opportunity to integrate electronic health and laboratory records with other provider data systems. The Action Plan's data and monitoring program has developed a sound infrastructure that builds upon technological advances and embodies a firm commitment to prioritization, coordination and alignment, accountability and incentives, stakeholder engagement, and an awareness of the need for predictable resources. With time and adequate resources, it is likely that the investment in data-related infrastructure during the Action Plan's initial years will reap great rewards.

  15. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases.

    PubMed

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-07-01

    Global cloud frameworks for bioinformatics research databases have become huge and heterogeneous; solutions face various diametric challenges comprising cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN had published 192 mammalian, plant and protein life sciences databases holding 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data covered by this database integration framework is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL, and actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface, which provides secure access to each fragment of linked and raw life sciences data from programming languages popular among bioinformaticians, such as Perl and Ruby. Researchers have successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents like ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.
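
    A hypothetical sketch of calling such a lightweight JSON service from Python (the paper targets Perl and Ruby users); the endpoint path and query parameters below are assumptions for illustration, not the documented Semantic-JSON API.

```python
# Fetch JSON from a lightweight web service endpoint and iterate records.
# The URL path and parameters are hypothetical stand-ins.
import json
import urllib.request

def fetch_json(url):
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# Assumed query shape: records linked to a given identifier, returned as JSON.
url = "http://semanticjson.org/api?query=linked_records&id=EXAMPLE_ID"
try:
    for record in fetch_json(url):
        print(record)
except OSError as err:
    print("service unreachable:", err)
```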

  16. Integration of Oracle and Hadoop: Hybrid Databases Affordable at Scale

    NASA Astrophysics Data System (ADS)

    Canali, L.; Baranowski, Z.; Kothuri, P.

    2017-10-01

    This work reports on the activities aimed at integrating Oracle and Hadoop technologies for the use cases of CERN database services, and in particular on the development of solutions for offloading data and queries from Oracle databases into Hadoop-based systems. The goal of this investigation is to increase the scalability and optimize the cost/performance footprint of some of our largest Oracle databases. These concepts have been applied, among others, to build offline copies of the CERN accelerator controls and logging databases. The tested solution makes it possible to run reports on the controls data offloaded into Hadoop without affecting the critical production database, providing both performance benefits and cost reduction for the underlying infrastructure. Other use cases discussed include building hybrid database solutions with Oracle and Hadoop, offering the combined advantages of a mature relational database system and a scalable analytics engine.
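
    A sketch of the offloading pattern described above using Spark's JDBC reader: copy a table out of Oracle into a Hadoop-resident Parquet copy that reports can query instead of the production database. The connection string, table name and output path are placeholders, and this is not the CERN implementation.

```python
# Offload an Oracle table to Parquet on HDFS so reports hit the copy,
# not the production database. Requires an Oracle JDBC driver on the
# Spark classpath; all identifiers below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-offload").getOrCreate()

df = (spark.read.format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/service")
      .option("dbtable", "ACC_LOGGING.MEASUREMENTS")  # placeholder table
      .option("user", "reader")
      .option("password", "secret")
      .load())

# Write an offline copy; downstream reports query this Parquet data set.
df.write.mode("overwrite").parquet("hdfs:///offload/measurements")
```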

  17. A cloud-based information repository for bridge monitoring applications

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Zhang, Yilan; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2016-04-01

    This paper describes an information repository to support bridge monitoring applications on a cloud computing platform. Bridge monitoring, particularly with instrumented sensors, collects a significant amount of data. In addition to sensor data, a wide variety of information, such as bridge geometry, analysis models and sensor descriptions, needs to be stored, and data management plays an important role in facilitating data utilization and data sharing. While bridge information modeling (BrIM) technologies and standards have been proposed as a means to enable integration and facilitate interoperability, current BrIM standards mostly support information about bridge geometry. In this study, we extend the BrIM schema to include analysis models and sensor information. Specifically, using the OpenBrIM standards as the base, we draw on CSI Bridge, a commercial software package widely used for bridge analysis and design, and SensorML, a standard schema for sensor definition, to define the data entities necessary for bridge monitoring applications. NoSQL database systems are employed for the data repository, and cloud service infrastructure is deployed to enhance the scalability, flexibility and accessibility of the data management system. The data model and systems are tested using the bridge model and the sensor data collected at the Telegraph Road Bridge, Monroe, Michigan.
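
    A minimal sketch of storing an extended-BrIM sensor description in a NoSQL document store, here MongoDB via pymongo. The document combines geometry, analysis-model and sensor metadata in the spirit of the paper, but the exact fields shown are assumptions, not the authors' schema.

```python
# Store and retrieve a sensor description document in MongoDB.
# Assumes a MongoDB server on localhost; field names are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["bridge_repo"]["sensors"]

sensor_doc = {
    "bridge": "Telegraph Road Bridge",
    "sensor_id": "ACC-03",
    "type": "accelerometer",              # SensorML-style description
    "location": {"girder": "G2", "station_m": 14.6},
    "linked_model_node": 1042,            # node in the structural analysis model
}
collection.insert_one(sensor_doc)
print(collection.find_one({"sensor_id": "ACC-03"}))
```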

  18. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  19. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. © Health Research and Educational Trust.

  20. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data.

    PubMed

    Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.

  1. Developing a ubiquitous health management system with healthy diet control for metabolic syndrome healthcare in Taiwan.

    PubMed

    Kan, Yao-Chiang; Chen, Kai-Hong; Lin, Hsueh-Chun

    2017-06-01

    Self-management in healthcare allows patients to manage their health data anytime and anywhere for the prevention of chronic diseases. This study established a prototype ubiquitous health management system (UHMS) with healthy diet control (HDC) for people who need metabolic syndrome healthcare services in Taiwan. The system infrastructure comprises three portals and a database tier with mutually supportive components that provide diet diaries, nutrition guides, and health risk assessments for self-health management. With the diet, nutrition, and personal health database, the design enables analytical diagrams on an interactive interface that supports a mobile application for the diet diary, a Web-based platform for health management, and research and development modules for medical care. For database integrity, dietary data can be stored in offline mode prior to synchronization between the mobile device and the server in online mode. The UHMS-HDC was developed with open source technology for ubiquitous health management with personalized dietary criteria. The system integrates mobile, internet, and electronic healthcare services with the diet diary functions to manage users' healthy diet behaviors. Virtual patients were used to simulate the self-health management procedure, and the assessment functions were verified by capturing screen snapshots during the procedure. The approach details an expandable framework with collaborative components for the self-developed UHMS-HDC. The multi-disciplinary applications for self-health management can help healthcare professionals reduce medical resources and improve healthcare effects for patients who require monitoring of their personal health condition with diet control. The proposed system can be put into practice for intervention in hospitals. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data

    PubMed Central

    Vaccarino, Anthony L.; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R.; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G.; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F. Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M.; Stuss, Donald T.; Theriault, Elizabeth; Evans, Kenneth R.

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute’s “Brain-CODE” is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care. PMID:29875648

  3. Integration of Web-based and PC-based clinical research databases.

    PubMed

    Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M

    2004-01-01

    We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.

  4. The limitations of some European healthcare databases for monitoring the effectiveness of pregnancy prevention programmes as risk minimisation measures.

    PubMed

    Charlton, R A; Bettoli, V; Bos, H J; Engeland, A; Garne, E; Gini, R; Hansen, A V; de Jong-van den Berg, L T W; Jordan, S; Klungsøyr, K; Neville, A J; Pierini, A; Puccini, A; Sinclair, M; Thayer, D; Dolk, H

    2018-04-01

    Pregnancy prevention programmes (PPPs) exist for some medicines known to be highly teratogenic. It is increasingly recognised that the impact of these risk minimisation measures requires periodic evaluation. This study aimed to assess the extent to which some of the data needed to monitor the effectiveness of PPPs may be present in European healthcare databases. An inventory was completed for databases contributing to EUROmediCAT capturing pregnancy and prescription data in Denmark, Norway, the Netherlands, Italy (Tuscany/Emilia Romagna), Wales and the rest of the UK, to determine the extent of data collected that could be used to evaluate the impact of PPPs. Data availability varied between databases. All databases could be used to identify the frequency and duration of prescriptions to women of childbearing age from primary care, but there were specific issues with availability of data from secondary care and private care. To estimate the frequency of exposed pregnancies, all databases could be linked to pregnancy data, but the accuracy of timing of the start of pregnancy was variable, and data on pregnancies ending in induced abortions were often not available. Data availability on contraception to estimate compliance with contraception requirements was variable and no data were available on pregnancy tests. Current electronic healthcare databases do not contain all the data necessary to fully monitor the effectiveness of PPP implementation, and thus, special data collection measures need to be instituted.

  5. An affinity-structure database of helix-turn-helix: DNA complexes with a universal coordinate system.

    PubMed

    AlQuraishi, Mohammed; Tang, Shengdong; Xia, Xide

    2015-11-19

    Molecular interactions between proteins and DNA molecules underlie many cellular processes, including transcriptional regulation, chromosome replication, and nucleosome positioning. Computational analyses of protein-DNA interactions rely on experimental data characterizing known protein-DNA interactions structurally and biochemically. While many databases exist that contain either structural or biochemical data, few integrate these two data sources in a unified fashion. Such integration is becoming increasingly critical with the rapid growth of structural and biochemical data, and the emergence of algorithms that rely on the synthesis of multiple data types to derive computational models of molecular interactions. We have developed an integrated affinity-structure database in which the experimental and quantitative DNA binding affinities of helix-turn-helix proteins are mapped onto the crystal structures of the corresponding protein-DNA complexes. This database provides access to: (i) protein-DNA structures, (ii) quantitative summaries of protein-DNA binding affinities using position weight matrices, and (iii) raw experimental data of protein-DNA binding instances. Critically, this database establishes a correspondence between experimental structural data and quantitative binding affinity data at the single basepair level. Furthermore, we present a novel alignment algorithm that structurally aligns the protein-DNA complexes in the database and creates a unified residue-level coordinate system for comparing the physico-chemical environments at the interface between complexes. Using this unified coordinate system, we compute the statistics of atomic interactions at the protein-DNA interface of helix-turn-helix proteins. We provide an interactive website for visualization, querying, and analyzing this database, and a downloadable version to facilitate programmatic analysis. This database will facilitate the analysis of protein-DNA interactions and the development of programmatic computational methods that capitalize on integration of structural and biochemical datasets. The database can be accessed at http://ProteinDNA.hms.harvard.edu.
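
    To make the position weight matrix (PWM) summaries mentioned above concrete, the sketch below builds a log-odds PWM from a handful of invented aligned binding sites and scores a candidate site. Real PWMs in the database are derived from experimental affinity data, so this is a generic illustration of the representation, not the authors' pipeline.

```python
# Build a log-odds position weight matrix from aligned binding sites and
# score a candidate site. Sites and pseudocounts are invented.
from math import log2

sites = ["TGACGT", "TGACGC", "TGATGT", "TGACGT"]
bases = "ACGT"
length = len(sites[0])

pwm = []
for pos in range(length):
    column = [site[pos] for site in sites]
    row = {}
    for base in bases:
        # Frequency with a 0.25 pseudocount per base to avoid log(0).
        freq = (column.count(base) + 0.25) / (len(sites) + 1)
        row[base] = log2(freq / 0.25)  # log-odds vs. uniform background
    pwm.append(row)

# Score a candidate site: sum of per-position log-odds contributions.
candidate = "TGACGT"
print(sum(pwm[i][base] for i, base in enumerate(candidate)))
```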

  6. Coastal monitoring through video systems: best practices and architectural design of a new video monitoring station in Jesolo (Veneto, Italy)

    NASA Astrophysics Data System (ADS)

    Archetti, Renata; Vacchi, Matteo; Carniel, Sandro; Benetazzo, Alvise

    2013-04-01

    Measuring the location of the shoreline and monitoring foreshore changes through time represent a fundamental task for correct coastal management at many sites around the world. Several authors have demonstrated video systems to be an essential tool for increasing the amount of data available for coastline management. These systems typically sample at least once per hour and can provide long-term datasets showing variations over days, events, months, seasons and years. In the past few years, due to the wide availability of video cameras at relatively low prices, the use of video cameras and video image analysis for environmental control has increased significantly. Although video monitoring systems were often used in the research field, they are most often applied for practical purposes, including: i) identification and quantification of shoreline erosion; ii) assessment of coastal protection structure and/or beach nourishment performance; iii) basic input to engineering design in the coastal zone; and iv) support for integrated numerical model validation. Here we present the guidelines for the creation of a new video monitoring network near the Jesolo beach (NW Adriatic Sea, Italy). Within this 10 km-long tourist district, several engineering structures have been built in recent years with the aim of solving urgent local erosion problems; as a result, almost all types of protection structures are present at this site, including groynes and detached breakwaters. The area investigated has experienced severe coastal erosion in the past decades, including a major event in November 2012. The activity is planned within the framework of the RITMARE project, which also includes other monitoring and scientific activities (bathymetry surveys, wave and current measurements, hydrodynamic and morphodynamic modeling). This contribution focuses on best practices to be adopted in the creation of the video monitoring system, and briefly describes the architectural design of the network, the creation of a database of images, the information extracted by the video monitoring and its integration with other data.

  7. Suspect screening and non-targeted analysis of drinking water using point-of-use filters.

    PubMed

    Newton, Seth R; McMahen, Rebecca L; Sobus, Jon R; Mansouri, Kamel; Williams, Antony J; McEachran, Andrew D; Strynar, Mark J

    2018-03-01

    Monitored contaminants in drinking water represent a small portion of the total compounds present, many of which may be relevant to human health. To understand the totality of human exposure to compounds in drinking water, broader monitoring methods are imperative. In an effort to more fully characterize the drinking water exposome, point-of-use water filtration devices (Brita® filters) were employed to collect time-integrated drinking water samples in a pilot study of nine North Carolina homes. A suspect screening analysis was performed by matching high resolution mass spectra of unknown features to molecular formulas from EPA's DSSTox database. Candidate compounds with those formulas were retrieved from EPA's CompTox Chemistry Dashboard, a recently developed data hub for approximately 720,000 compounds. To prioritize the compounds most relevant for human health, toxicity data from the US federal collaborative Tox21 program and the EPA ToxCast program, as well as exposure estimates from EPA's ExpoCast program, were used in conjunction with sample detection frequency and abundance to calculate a "ToxPi" score for each candidate compound. From ~15,000 molecular features in the raw data, 91 candidate compounds were ultimately grouped into the highest priority class for follow-up study. Fifteen of these compounds were confirmed using analytical standards, including the highest priority compound, 1,2-benzisothiazolin-3-one, which appeared in 7 out of 9 samples. The majority of the other high priority compounds are not targets of routine monitoring, highlighting major gaps in our understanding of drinking water exposures. General product-use categories from EPA's CPCat database revealed that several of the high priority chemicals are used in industrial processes, indicating that the drinking water in central North Carolina may be impacted by local industries. Published by Elsevier Ltd.
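
    A hedged sketch of a ToxPi-style prioritization of the kind described above: each compound receives a weighted sum of normalized slices (toxicity, exposure, detection frequency, abundance), and compounds are ranked by the total. The weights and values are invented; the study's actual ToxPi construction may differ.

```python
# ToxPi-style prioritization: weighted sum of normalized slices per compound.
# All weights and slice values are invented for illustration.
compounds = {
    "compound_A": {"toxicity": 0.8, "exposure": 0.4, "frequency": 7 / 9, "abundance": 0.6},
    "compound_B": {"toxicity": 0.2, "exposure": 0.9, "frequency": 3 / 9, "abundance": 0.3},
}
weights = {"toxicity": 0.4, "exposure": 0.2, "frequency": 0.2, "abundance": 0.2}

def toxpi_score(slices):
    # Each slice is assumed pre-normalized to [0, 1].
    return sum(weights[name] * value for name, value in slices.items())

# Rank compounds from highest to lowest priority.
for name, slices in sorted(compounds.items(), key=lambda kv: -toxpi_score(kv[1])):
    print(name, round(toxpi_score(slices), 3))
```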

  8. BIOSPIDA: A Relational Database Translator for NCBI

    PubMed Central

    Hagen, Matthew S.; Lee, Eva K.

    2010-01-01

    As the volume and availability of biological databases continue their widespread growth, it has become increasingly difficult for research scientists to identify all relevant information for biological entities of interest. Details of nucleotide sequences, gene expression, molecular interactions, and three-dimensional structures are maintained across many different databases. Retrieving all necessary information requires an integrated system that can query multiple databases with minimized overhead. This paper introduces a universal parser and relational schema translator that can be utilized for all NCBI databases in Abstract Syntax Notation (ASN.1). The data models for OMIM, Entrez-Gene, Pubmed, MMDB and GenBank have been successfully converted into relational databases, and all are easily linkable, helping to answer complex biological questions. These tools enable research scientists to locally integrate databases from NCBI without significant workload or development time. PMID:21347013

  9. How can the research potential of the clinical quality databases be maximized? The Danish experience.

    PubMed

    Nørgaard, M; Johnsen, S P

    2016-02-01

    In Denmark, the need for monitoring of clinical quality and patient safety with feedback to the clinical, administrative and political systems has resulted in the establishment of a network of more than 60 publicly financed nationwide clinical quality databases. Although primarily devoted to monitoring and improving quality of care, the potential of these databases as data sources in clinical research is increasingly being recognized. In this review, we describe these databases focusing on their use as data sources for clinical research, including their strengths and weaknesses as well as future concerns and opportunities. The research potential of the clinical quality databases is substantial but has so far only been explored to a limited extent. Efforts related to technical, legal and financial challenges are needed in order to take full advantage of this potential. © 2016 The Association for the Publication of the Journal of Internal Medicine.

  10. MagnaportheDB: a federated solution for integrating physical and genetic map data with BAC end derived sequences for the rice blast fungus Magnaporthe grisea.

    PubMed

    Martin, Stanton L; Blackmon, Barbara P; Rajagopalan, Ravi; Houfek, Thomas D; Sceeles, Robert G; Denn, Sheila O; Mitchell, Thomas K; Brown, Douglas E; Wing, Rod A; Dean, Ralph A

    2002-01-01

    We have created a federated database for genome studies of Magnaporthe grisea, the causal agent of rice blast disease, by integrating end sequence data from BAC clones, genetic marker data and BAC contig assembly data. A library of 9216 BAC clones providing >25-fold coverage of the entire genome was end sequenced and fingerprinted by HindIII digestion. The Image/FPC software package was then used to generate an assembly of 188 contigs covering >95% of the genome. The database contains the results of this assembly integrated with hybridization data of genetic markers to the BAC library. AceDB was used for the core database engine, and a MySQL relational database, populated with numerical representations of BAC clones within FPC contigs, was used to create appropriately scaled images. The database is being used to facilitate sequencing efforts. The database also allows researchers mapping known genes or other sequences of interest rapid and easy access to the fundamental organization of the M. grisea genome. This database, MagnaportheDB, can be accessed on the web at http://www.cals.ncsu.edu/fungal_genomics/mgdatabase/int.htm.

  11. BNDB - the Biochemical Network Database.

    PubMed

    Küntzer, Jan; Backes, Christina; Blum, Torsten; Gerasch, Andreas; Kaufmann, Michael; Kohlbacher, Oliver; Lenhof, Hans-Peter

    2007-10-02

    Technological advances in high-throughput techniques and efficient data acquisition methods have resulted in a massive amount of life science data. The data is stored in numerous databases that have been established over the last decades and are essential resources for scientists nowadays. However, the diversity of the databases and the underlying data models make it difficult to combine this information for solving complex problems in systems biology. Currently, researchers typically have to browse several, often highly focused, databases to obtain the required information. Hence, there is a pressing need for more efficient systems for integrating, analyzing, and interpreting these data. The standardization and virtual consolidation of the databases is a major challenge; meeting it yields unified access to a variety of data sources. We present the Biochemical Network Database (BNDB), a powerful relational database platform allowing a complete semantic integration of an extensive collection of external databases. BNDB is built upon a comprehensive and extensible object model called BioCore, which is powerful enough to model most known biochemical processes and at the same time easily extensible to be adapted to new biological concepts. Besides a web interface for the search and curation of the data, a Java-based viewer (BiNA) provides a powerful platform-independent visualization and navigation of the data. BiNA uses sophisticated graph layout algorithms for an interactive visualization and navigation of BNDB. BNDB allows a simple, unified access to a variety of external data sources. Its tight integration with the biochemical network library BN++ offers the possibility for import, integration, analysis, and visualization of the data. BNDB is freely accessible at http://www.bndb.org.

  12. Antipsychotic-induced hyperprolactinemia: synthesis of world-wide guidelines and integrated recommendations for assessment, management and future research.

    PubMed

    Grigg, Jasmin; Worsley, Roisin; Thew, Caroline; Gurvich, Caroline; Thomas, Natalie; Kulkarni, Jayashri

    2017-11-01

    Hyperprolactinemia is a highly prevalent adverse effect of many antipsychotic agents, with potentially serious health consequences. Several guidelines have been developed for the management of this condition; yet, their concordance has not been evaluated. The objectives of this paper were (1) to review current clinical guidelines; (2) to review key systematic evidence for management; and (3) based on our findings, to develop an integrated management recommendation specific to male and female patients who are otherwise clinically stabilised on antipsychotics. We performed searches of Medline and EMBASE, supplemented with guideline-specific database and general web searches, to identify clinical guidelines containing specific recommendations for antipsychotic-induced hyperprolactinemia, produced or updated between 01/01/2010 and 15/09/2016. A separate systematic search was performed to identify emerging management approaches described in reviews and meta-analyses published in 2010 or later. There is some consensus among guidelines relating to baseline prolactin (PRL) screening (8/12 guidelines), screening for differential diagnosis (7/12) and discontinuing/switching the PRL-raising agent (7/12). Guidelines otherwise diverge substantially regarding most aspects of screening, monitoring and management (e.g. treatment with dopamine agonists), and clear sex-specific recommendations are absent. The systematic literature on management approaches is promising, but more research is needed. An integrated management recommendation is presented to guide sex-specific clinical response to antipsychotic-induced hyperprolactinemia. Key aspects include monitoring of asymptomatic hyperprolactinemia and fertility considerations with PRL normalisation. Further empirical work is key to shaping robust guidelines for antipsychotic-induced hyperprolactinemia. The integrated management recommendation can assist clinician and patient decision-making, with the goal of balancing effective psychiatric treatment while minimising PRL-related adverse health effects in male and female patients.

  13. A plan for the North American Bat Monitoring Program (NABat)

    USGS Publications Warehouse

    Loeb, Susan C.; Rodhouse, Thomas J.; Ellison, Laura E.; Lausen, Cori L.; Reichard, Jonathan D.; Irvine, Kathryn M.; Ingersoll, Thomas E.; Coleman, Jeremy; Thogmartin, Wayne E.; Sauer, John R.; Francis, Charles M.; Bayless, Mylea L.; Stanley, Thomas R.; Johnson, Douglas H.

    2015-01-01

    The purpose of the North American Bat Monitoring Program (NABat) is to create a continent-wide program to monitor bats at local to rangewide scales that will provide reliable data to promote effective conservation decisionmaking and the long-term viability of bat populations across the continent. This is an international, multiagency program. Four approaches will be used to gather monitoring data to assess changes in bat distributions and abundances: winter hibernaculum counts, maternity colony counts, mobile acoustic surveys along road transects, and acoustic surveys at stationary points. These monitoring approaches are described along with methods for identifying species recorded by acoustic detectors. Other chapters describe the sampling design, the database management system (Bat Population Database), and statistical approaches that can be used to analyze data collected through this program.

  14. Landsat-4 and Landsat-5 thematic mapper band 6 historical performance and calibration

    USGS Publications Warehouse

    Barsi, J.A.; Chander, G.; Markham, B.L.; Higgs, N.; ,

    2005-01-01

    Launched in 1982 and 1984, respectively, the Landsat-4 and -5 Thematic Mappers (TM) are the backbone of an extensive archive of moderate-resolution Earth imagery. However, these sensors and their data products were not subjected to the type of intensive monitoring that has been part of the Landsat-7 system since its launch in 1999. With Landsat-4's 11-year and Landsat-5's 20+ year data records, there is a need to understand the historical behavior of the instruments in order to verify the scientific integrity of the archive and processed products. Performance indicators of the Landsat-4 and -5 thermal bands have recently been extracted from a processing-system database, allowing for a more complete study of thermal band characteristics and calibration than was previously possible. The database records responses to the internal calibration system, instrument temperatures, and applied gains and offsets for each band for every scene processed through the National Landsat Archive Production System (NLAPS). Analysis of this database has allowed for greater understanding of the calibration and improvement in the processing system. This paper covers the trends in the Landsat-4 and -5 thermal bands, the effect of the changes seen in the trends, and how these trends affect the use of the thermal data.

  15. Community health center integration: experience in the State of Ohio.

    PubMed

    McAlearney, John S; McAlearney, Ann Scheck

    2006-02-01

    In the face of severe financial challenges and demands to improve quality and service to patients, many community health centers (CHCs) have aligned or integrated with other CHCs, physician groups, or hospitals. Yet the nature of and rationale for these organizational decisions are not well understood. Our research applied an organizational theoretical framework to test whether strategic adaptation theory or institutional theory best describes the integration activity of CHCs in Ohio. We collected primary data from case studies of seven CHCs selected for geographic representation and studied December 2000-January 2001. Semi-structured interviews and a case study database supported our chain of evidence. We found that CHC integration activity was substantial (five of seven CHCs integrated) and extremely varied. Consistent with strategic adaptation theory, we determined that CHC integration actions were predominantly center-specific, rational responses to environmental challenges and were initiated to improve operations or financial performance. Rarely did CHCs initiate major organizational change merely to mimic other CHC actions, as might have been expected of highly institutionalized organizations. Understanding the basis for CHCs' strategic decisions while monitoring financial health will remain critical as lawmakers and administrators work to develop policies that both maintain progress made and improve primary care access for the poor, the uninsured, and those with special health care needs served by these important safety net providers.

  16. A Practical Standardized Composite Nutrition Score Based on Lean Tissue Index: Application in Nutrition Screening and Prediction of Outcome in Hemodialysis Population.

    PubMed

    Chen, Huan-Sheng; Cheng, Chun-Ting; Hou, Chun-Cheng; Liou, Hung-Hsiang; Chang, Cheng-Tsung; Lin, Chun-Ju; Wu, Tsai-Kun; Chen, Chang-Hsu; Lim, Paik-Seong

    2017-07-01

    Rapid screening and monitoring of nutritional status is mandatory in the hemodialysis population because of the increasingly encountered nutritional problems. Considering the limitations of previous composite nutrition scores applied in this population, we tried to develop a standardized composite nutrition score (SCNS) using low lean tissue index as a marker of protein wasting, to facilitate clinical screening and monitoring and to predict outcome. This retrospective cohort study used two databases of dialysis populations from Taiwan between 2011 and 2014. The first database, consisting of data from 629 maintenance hemodialysis patients, was used to develop the SCNS; the second, containing data from 297 maintenance hemodialysis patients, was used to validate the developed score. The SCNS, comprising albumin, creatinine, potassium, and body mass index, was developed from the first database using low lean tissue index as a marker of protein wasting. When applying this score in the original database, a significantly higher risk of developing protein wasting was found for patients with lower SCNS (odds ratio 1.38 [middle tertile vs highest tertile, P < .0001] and 2.40 [lowest tertile vs middle tertile, P < .0001]). The risk of death was also shown to be higher for patients with lower SCNS (hazard ratio 4.45 [below median level vs above median level, P < .0001]). These results were validated in the second database. We developed an SCNS consisting of 4 easily available parameters. This kind of scoring system can be easily applied in different dialysis facilities for screening and monitoring of protein wasting. The wide application of body composition monitors in the dialysis population will also facilitate the development of specific nutrition scoring models for individual facilities. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
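
    The abstract reports tertile-based risk estimates but not the scoring formula itself, so the sketch below only illustrates generic tertile-based composite scoring over the four named parameters; the cohort values, equal weighting and function names are assumptions for illustration, not the published SCNS:

```python
import statistics

# Illustration only: assign 0/1/2 points by cohort tertile for each of the
# four parameters named in the abstract and sum them. Lower totals stand for
# higher protein-wasting risk, matching the direction reported above.
def tertile_score(value: float, cohort: list) -> int:
    """0, 1 or 2 points depending on which cohort tertile the value falls in."""
    low_cut, high_cut = statistics.quantiles(cohort, n=3)  # two cut points
    return 0 if value < low_cut else (1 if value < high_cut else 2)

cohort = {  # hypothetical cohort distributions, not trial data
    "albumin":    [3.2, 3.5, 3.8, 4.0, 4.1, 4.3],
    "creatinine": [6.0, 8.0, 9.5, 10.5, 11.0, 12.5],
    "potassium":  [3.8, 4.2, 4.5, 4.8, 5.0, 5.4],
    "bmi":        [19.0, 21.0, 22.5, 24.0, 26.0, 28.0],
}
patient = {"albumin": 3.4, "creatinine": 7.2, "potassium": 4.0, "bmi": 20.5}

score = sum(tertile_score(patient[k], cohort[k]) for k in cohort)
print(f"composite score: {score} / 8 (lower = higher protein-wasting risk)")
```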

  17. Young students, satellites aid understanding of climate-biosphere link

    NASA Astrophysics Data System (ADS)

    White, Michael A.; Schwartz, Mark D.; Running, Steven W.

    Data collected by young students from kindergarten through high school are being combined with satellite data to develop a more consistent understanding of the intimate connection between climate dynamics and the terrestrial biosphere. Comparison of the two sets of data involving the onset of budburst among trees and other vegetation has been extremely encouraging. Surface-atmosphere interactions involving exchanges of carbon, water, and energy are strongly affected by interannual variability in the timing and length of the vegetation growing season, and satellite remote sensing has the unique ability to consistently monitor global spatiotemporal variability in growing season dynamics. But without a clear picture of how satellite information relates to ground conditions, the application of satellite growing season estimates for monitoring of climate-vegetation interactions, calculation of energy budgets, and large-scale ecological modeling is extremely limited. The integrated phenological analysis of field data, satellite observations, and climate advocated by Schwartz [1998], for example, has been primarily limited by the lack of geographically extensive and consistently measured phenology databases.

  18. Tools for distributed application management

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Cooper, Robert; Wood, Mark; Birman, Kenneth P.

    1990-01-01

    Distributed application management consists of monitoring and controlling an application as it executes in a distributed environment. It encompasses such activities as configuration, initialization, performance monitoring, resource scheduling, and failure response. The Meta system (a collection of tools for constructing distributed application management software) is described. Meta provides the mechanism, while the programmer specifies the policy for application management. The policy is manifested as a control program which is a soft real-time reactive program. The underlying application is instrumented with a variety of built-in and user-defined sensors and actuators. These define the interface between the control program and the application. The control program also has access to a database describing the structure of the application and the characteristics of its environment. Some of the more difficult problems for application management occur when preexisting, nondistributed programs are integrated into a distributed application for which they may not have been intended. Meta allows management functions to be retrofitted to such programs with a minimum of effort.
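
    As a rough illustration of the sensor/actuator pattern described above (and not Meta's actual interface, which the abstract does not detail), consider a control program that polls a sensor and reacts through an actuator when a policy threshold is breached:

```python
import random
import time

# Sketch of the pattern only: sensors expose application state, a soft
# real-time control program polls them and reacts through actuators
# according to a policy. Names and the queue example are hypothetical.
class Sensor:
    def __init__(self, name, read_fn):
        self.name, self.read = name, read_fn

class Actuator:
    def __init__(self, name, act_fn):
        self.name, self.act = name, act_fn

# Hypothetical instrumented application: a work queue whose depth we can
# read and whose worker pool we can grow.
queue_depth = lambda: random.randint(0, 120)
add_worker  = lambda: print("actuator: spawned extra worker")

sensors   = {"queue_depth": Sensor("queue_depth", queue_depth)}
actuators = {"add_worker": Actuator("add_worker", add_worker)}

def control_program(cycles=5, threshold=100):
    """Reactive policy: poll the sensor each cycle, fire the actuator on breach."""
    for _ in range(cycles):
        depth = sensors["queue_depth"].read()
        print(f"sensor: queue_depth = {depth}")
        if depth > threshold:
            actuators["add_worker"].act()
        time.sleep(0.1)

control_program()
```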

  20. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    With its integrated laboratory information management system (LIMS), nmrshiftdb2 supports electronic lab administration and management in academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. For lab users, this freely available system allows the submission of orders for measurement, transfers recorded data automatically or manually, enables download of spectra via a web interface, and provides integrated access to the prediction, search, and assignment tools of the NMR database. For staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, statistics functionality for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database use a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Spatial Data Integration Using Ontology-Based Approach

    NASA Astrophysics Data System (ADS)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, spatial data has become so crucial for various organizations that many of them have begun producing it themselves. In some circumstances, the need for real-time integrated data requires a sustainable mechanism for real-time integration; disaster management situations, for instance, require obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between different organizations' data. To solve this issue, we introduce an ontology-based method that provides sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps. In the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and the ontology of each database is created. In the second step, this ontology is inserted into the database, and the relationships of each ontology class are inserted into newly created columns in the database tables. The last step is a platform based on service-oriented architecture, which allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy; the data remains unchanged, so legacy applications can still be used.
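
    A minimal sketch of the first step, under the assumption that each table maps to an ontology class and each column to a property (the authors' actual mapping rules are not given in the abstract), might look like this using rdflib:

```python
import sqlite3
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Sketch, not the authors' code: derive a crude ontology from relational
# metadata by treating each table as a class and each column as a property.
EX = Namespace("http://example.org/ontology#")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hydrant (id INTEGER PRIMARY KEY, lat REAL, lon REAL)")

g = Graph()
g.bind("ex", EX)
for (table,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"):
    cls = EX[table.capitalize()]
    g.add((cls, RDF.type, RDFS.Class))            # table -> ontology class
    for _, col, coltype, *_ in conn.execute(f"PRAGMA table_info({table})"):
        prop = EX[f"{table}_{col}"]
        g.add((prop, RDF.type, RDF.Property))     # column -> property
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.comment, Literal(f"SQL type: {coltype}")))

print(g.serialize(format="turtle"))
```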

  2. The integration of digital orthophotographs with GISs in a microcomputer environment

    NASA Technical Reports Server (NTRS)

    Steiner, David R.

    1992-01-01

    The issues involved in the use of orthoimages as a data source for GIS databases are examined. The integration of digital photographs into a GIS is discussed. A prototype PC-based program for the production of GIS databases using orthoimages is described.

  3. Operational monitoring of turbidity in rivers: how satellites can contribute

    NASA Astrophysics Data System (ADS)

    Hucke, Dorothee; Hillebrand, Gudrun; Winterscheid, Axel; Kranz, Susanne; Baschek, Björn

    2016-10-01

    The applications of remote sensing in hydrology are diverse and offer significant benefits for water monitoring. Up to now, operational river monitoring and sediment management in Germany have mainly relied on in-situ measurements and on results obtained from numerical modelling. Remote sensing by satellites has great potential to supplement existing data with two-dimensional information on near-surface turbidity distributions at greater spatial scales than in-situ measurements can offer. Within the project WasMon-CT (WaterMonitoring-Chlorophyll/Turbidity), the Federal Institute of Hydrology (BfG) aims at the implementation of an operational monitoring of turbidity distributions based on satellite images (especially Sentinel-2 and Landsat 7 and 8). Initially, selected federal inland and estuarine waterways will be addressed: Rhine, Elbe, Ems, Weser. WasMon-CT is funded within the German Copernicus activities. Within the project, a database of atmospherically corrected, geo-referenced turbidity data will be assembled. The corresponding metadata will include aspects of the satellite data as well as hydrological data, e.g. cloud cover and river run-off. Based on this catalogue of spatially linked metadata, the satellite data will be selected by e.g. cloud cover or run-off. The permanently updated database will include past as well as recent satellite images. It is designed with a long-term perspective to optimize the existing in-situ measurement network, which will serve partly for calibration and partly as a validation data set. The aim is to extend, but not to substitute, the existing frequent point measurements with spatially extensive, satellite-derived data from the near-surface part of the water column. Here, turbidity is used as a proxy for corresponding suspended sediment concentrations; for this, the relationship between turbidity and suspended sediment concentrations will be investigated. Products such as longitudinal profiles or virtual measurement stations will be developed from an application toolbox to specifically match the requirements of operational monitoring tasks and to allow for better integration into the existing monitoring system. The toolbox demonstrates the benefits of remote sensing by applying the established processing chain to diverse hydrological questions, such as the investigation of tidally affected sediment loads or mixing processes at river confluences. This new application will be of great value to assess, evaluate and monitor the status or change of large-scale sediment processes at the system level. Accordingly, the satellite-derived turbidity data will strongly enhance federal consulting activities and thus ensure modern river monitoring of Germany's federal waterways.

  4. TransWatL - Crowdsourced water level transmission via short message service within the Sondu River Catchment, Kenya

    NASA Astrophysics Data System (ADS)

    Weeser, Björn; Jacobs, Suzanne; Breuer, Lutz; Butterbach-Bahl, Klaus; Rufino, Mariana

    2016-04-01

    The fast economic development in East African countries causes an increasing need for water and farmland. Ongoing changes in land use and climate may affect the function of water tower areas such as the Mau Forest complex, an important water source and tropical montane forest in Kenya. Reliable models and predictions are necessary to ensure sustainable and adequate water resource management. The calibration and validation of these models require solid data, based on widespread monitoring in both space and time, which is a time-consuming and expensive exercise. Countries with emerging economies often do not have the technical capacity and resources to operate monitoring networks, although both the government and citizens are aware of the importance of sustainable water management. Our research focuses on the implementation and testing of a crowdsourced database as a low-cost method to assess water quantity within the Sondu river catchment in Kenya. Twenty to thirty water level gauges will be installed and equipped with instructional signage. Citizens are invited to read and transmit the water level and the station number to the database using a simple text message and their cell phone. The text message service is easy to use, stable, inexpensive and an established way of communication in East African countries. The simplicity of the method ensures broad access for interested citizens and the integration of locals in water monitoring all over the catchment. Furthermore, the system allows direct and fast feedback to the users, which likely increases awareness of water flow changes in the test region. A Raspberry Pi 2 Model B equipped with a mobile broadband modem will be used as a server receiving and storing incoming text messages. The received raw data will be quality checked and formatted by a Python script and afterwards written to a database; this ensures flexible and standardized access for postprocessing and data visualization, for which a web-based database is foreseen. For validation of the method, TransWatL stations will also be installed next to permanent gauging stations to compare the quality of citizens' readings against permanent readings.
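
    A minimal sketch of the server-side step, assuming a hypothetical message format of station identifier followed by a level reading (the project's actual message format is not specified in the abstract):

```python
import re
import sqlite3
from datetime import datetime, timezone

# Sketch of the quality-check-and-store step described above. The message
# format ("<station> <level_cm>") and table layout are assumptions.
MSG_RE = re.compile(r"^\s*(?P<station>[A-Za-z0-9]+)\s+(?P<level>\d+(\.\d+)?)\s*$")

db = sqlite3.connect("water_levels.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings
              (station TEXT, level_cm REAL, sender TEXT, received_utc TEXT)""")

def handle_sms(sender: str, text: str) -> bool:
    """Validate one incoming message; store it if it parses, reject otherwise."""
    m = MSG_RE.match(text)
    if not m:
        return False  # malformed message: flag for manual review
    db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
               (m["station"].upper(), float(m["level"]), sender,
                datetime.now(timezone.utc).isoformat()))
    db.commit()
    return True

print(handle_sms("+254700000001", "SONDU07 134.5"))  # True: stored
print(handle_sms("+254700000001", "hello"))          # False: rejected
```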

  5. MEPD: a Medaka gene expression pattern database

    PubMed Central

    Henrich, Thorsten; Ramialison, Mirana; Quiring, Rebecca; Wittbrodt, Beate; Furutani-Seiki, Makoto; Wittbrodt, Joachim; Kondoh, Hisato

    2003-01-01

    The Medaka Expression Pattern Database (MEPD) stores and integrates information on gene expression during embryonic development of the small freshwater fish Medaka (Oryzias latipes). Expression patterns of genes identified by ESTs are documented by images and by descriptions through parameters such as staining intensity, category and comments, and through a comprehensive, hierarchically organized dictionary of anatomical terms. Sequences of the ESTs are available and searchable through BLAST. ESTs in the database are clustered upon entry and have been blasted against public databases. The BLAST results are updated regularly, stored within the database and searchable. The MEPD is a project within the Medaka Genome Initiative (MGI) and entries will be interconnected to integrated genomic map databases. MEPD is accessible through the WWW at http://medaka.dsp.jst.go.jp/MEPD. PMID:12519950

  6. Heterogenous database integration in a physician workstation.

    PubMed

    Annevelink, J; Young, C Y; Tang, P C

    1991-01-01

    We discuss the integration of a variety of data and information sources in a Physician Workstation (PWS), focusing on the integration of data from DHCP, the Veterans Administration's Decentralized Hospital Computer Program. We designed a logically centralized, object-oriented data schema, used by end users and applications to explore the data accessible through an object-oriented database using a declarative query language. We emphasize the use of procedural abstraction to transparently integrate a variety of information sources into the data schema.

  7. Heterogenous database integration in a physician workstation.

    PubMed Central

    Annevelink, J.; Young, C. Y.; Tang, P. C.

    1991-01-01

    We discuss the integration of a variety of data and information sources in a Physician Workstation (PWS), focusing on the integration of data from DHCP, the Veterans Administration's Decentralized Hospital Computer Program. We designed a logically centralized, object-oriented data schema, used by end users and applications to explore the data accessible through an object-oriented database using a declarative query language. We emphasize the use of procedural abstraction to transparently integrate a variety of information sources into the data schema. PMID:1807624

  8. Ultra-Structure database design methodology for managing systems biology data and analyses

    PubMed Central

    Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C

    2009-01-01

    Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogenous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
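
    A toy illustration of the core idea, storing rules as table rows and interpreting them with one small general procedure, might look like the following; the single "is_a" ruleform here is far simpler than real Ultra-Structure ruleforms:

```python
import sqlite3

# Toy illustration of the Ultra-Structure principle that both data and
# process live in rule tables interpreted by small, general procedures.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE is_a_rules (entity TEXT, parent TEXT)")
db.executemany("INSERT INTO is_a_rules VALUES (?, ?)", [
    ("BRCA1 peptide",    "peptide"),
    ("peptide",          "protein fragment"),
    ("protein fragment", "biomolecule"),
])

def ancestors(entity: str) -> list:
    """General deduction procedure: follow is_a rules transitively.
    Changing system behaviour means editing table rows, not this code."""
    out, frontier = [], [entity]
    while frontier:
        cur = frontier.pop()
        for (parent,) in db.execute(
                "SELECT parent FROM is_a_rules WHERE entity = ?", (cur,)):
            out.append(parent)
            frontier.append(parent)
    return out

print(ancestors("BRCA1 peptide"))
# ['peptide', 'protein fragment', 'biomolecule']
```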

  9. Monitoring of IaaS and scientific applications on the Cloud using the Elasticsearch ecosystem

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.

    2015-05-01

    The private Cloud at the Torino INFN computing centre offers IaaS services to different scientific computing applications. The infrastructure is managed with the OpenNebula cloud controller. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BES-III collaboration, plus an increasing number of other small tenants. Besides keeping track of the usage, the automation of dynamic allocation of resources to tenants requires detailed monitoring and accounting of resource usage. As a first investigation towards this, we set up a monitoring system to inspect the site activities both in terms of IaaS and of the applications running on the hosted virtual instances. For this purpose we used the Elasticsearch, Logstash and Kibana stack. In the current implementation, the heterogeneous accounting information is fed to different MySQL databases and sent to Elasticsearch via a custom Logstash plugin. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS-level information gathered through the API is sent to the MySQL database through an ad hoc developed RESTful web service, which is also used for other accounting purposes. Concerning the application level, we used the ROOT plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BES-III virtual instances used to be monitored with Zabbix; as a proof of concept, we also retrieve the information contained in the Zabbix database. Each of these three cases is indexed separately in Elasticsearch. We are now considering dismissing the intermediate level provided by the SQL database and evaluating a NoSQL option as a unique central database for all the monitoring information. We set up Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case. In this way we have achieved a uniform monitoring interface for both the IaaS and the scientific applications, mostly leveraging off-the-shelf tools.
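
    A minimal sketch of the SQL-to-Elasticsearch step, using the plain Elasticsearch REST API rather than the custom Logstash plugin the authors describe; the index name and document fields are assumptions, and sqlite3 stands in for the site's MySQL accounting databases:

```python
import sqlite3   # stand-in for the MySQL accounting databases
import requests

# Sketch of the pipeline shape only: read accounting rows from SQL and index
# them into Elasticsearch (assumed to be running at ES_URL).
ES_URL = "http://localhost:9200"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE vm_accounting (tenant TEXT, cpu_hours REAL, ts TEXT)")
db.execute("INSERT INTO vm_accounting VALUES ('alice-t2', 42.5, '2015-01-01T00:00:00')")

for tenant, cpu_hours, ts in db.execute("SELECT * FROM vm_accounting"):
    doc = {"tenant": tenant, "cpu_hours": cpu_hours, "@timestamp": ts}
    # POST /<index>/_doc creates a document with an auto-generated id
    r = requests.post(f"{ES_URL}/iaas-accounting/_doc", json=doc, timeout=5)
    r.raise_for_status()
```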

  10. Use of electronic medical record data for quality improvement in schizophrenia treatment.

    PubMed

    Owen, Richard R; Thrush, Carol R; Cannon, Dale; Sloan, Kevin L; Curran, Geoff; Hudson, Teresa; Austen, Mark; Ritchie, Mona

    2004-01-01

    An understanding of the strengths and limitations of automated data is valuable when using administrative or clinical databases to monitor and improve the quality of health care. This study discusses the feasibility and validity of using data electronically extracted from the Veterans Health Administration (VHA) computer database (VistA) to monitor guideline performance for inpatient and outpatient treatment of schizophrenia. The authors also discuss preliminary results and their experience in applying these methods to monitor antipsychotic prescribing using the South Central VA Healthcare Network (SCVAHCN) Data Warehouse as a tool for quality improvement.

  11. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevins, N; Vanderhoek, M; Lang, S

    2014-06-15

    Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
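
    For reference, the GSDF that such lookup tables target is defined in DICOM PS3.14 as a rational polynomial in the logarithm of the just-noticeable-difference (JND) index; a sketch follows, with the coefficients as published in PS3.14 (verify against the standard before any clinical use):

```python
import math

# DICOM Grayscale Standard Display Function: luminance as a function of the
# JND index j, valid for 1 <= j <= 1023. Coefficients per DICOM PS3.14.
A, B, C, D, E = -1.3011877, -2.5840191e-2, 8.0242636e-2, -1.0320229e-1, 1.3646699e-1
F, G, H, K, M = 2.8745620e-2, -2.5468404e-2, -3.1978977e-3, 1.2992634e-4, 1.3635334e-3

def gsdf_luminance(j: float) -> float:
    """Luminance in cd/m^2 for JND index j per the GSDF rational polynomial."""
    x = math.log(j)
    num = A + C * x + E * x**2 + G * x**3 + M * x**4
    den = 1 + B * x + D * x**2 + F * x**3 + H * x**4 + K * x**5
    return 10 ** (num / den)

# A calibration tool samples this curve between a display's measured minimum
# and maximum luminance to build the lookup table entries.
for j in (1, 100, 500, 1023):
    print(f"j = {j:4d}  ->  L = {gsdf_luminance(j):9.3f} cd/m^2")
```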

  12. Inductive monitoring system constructed from nominal system data and its use in real-time system monitoring

    NASA Technical Reports Server (NTRS)

    Iverson, David L. (Inventor)

    2008-01-01

    The present invention relates to an Inductive Monitoring System (IMS), its software implementations, hardware embodiments and applications. Training data is received, typically nominal system data acquired from sensors in normally operating systems or from detailed system simulations. The training data is formed into vectors that are used to generate a knowledge database having clusters of nominal operating regions therein. IMS monitors a system's performance or health by comparing cluster parameters in the knowledge database with incoming sensor data from a monitored system formed into vectors. Nominal performance is concluded when a monitored-system vector is determined to lie within a nominal operating region cluster or lies sufficiently close to such a cluster as determined by a threshold value and a distance metric. Some embodiments of IMS include cluster indexing and retrieval methods that increase the execution speed of IMS.
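
    A much-simplified sketch of the IMS idea: learn axis-aligned min/max clusters from nominal training vectors, then flag monitored vectors that lie too far from all of them. The thresholds and data are illustrative, not taken from the patent:

```python
# Sketch only: clusters are (mins, maxs) boxes grown greedily from nominal
# training vectors; monitoring measures distance to the nearest box.
def _distance(v, cluster):
    """Euclidean distance to the box; 0 if the vector lies inside it."""
    mins, maxs = cluster
    return sum(max(lo - x, 0, x - hi) ** 2
               for x, lo, hi in zip(v, mins, maxs)) ** 0.5

def train(vectors, expand=1.5):
    """Extend the nearest box if the vector is close, else start a new box."""
    clusters = []
    for v in vectors:
        best = min(clusters, key=lambda c: _distance(v, c), default=None)
        if best is not None and _distance(v, best) <= expand:
            best[0][:] = [min(lo, x) for lo, x in zip(best[0], v)]
            best[1][:] = [max(hi, x) for hi, x in zip(best[1], v)]
        else:
            clusters.append(([*v], [*v]))
    return clusters

def monitor(v, clusters, threshold=0.5):
    """Nominal if the vector is within threshold of some learned cluster."""
    return "nominal" if min(_distance(v, c) for c in clusters) <= threshold else "anomaly"

nominal = [(1.0, 20.0), (1.1, 21.0), (0.9, 19.5), (5.0, 40.0), (5.2, 41.0)]
knowledge = train(nominal)
print(monitor((1.05, 20.2), knowledge))  # nominal
print(monitor((3.0, 90.0), knowledge))   # anomaly
```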

  13. Maintaining data integrity in a rural clinical trial.

    PubMed

    Van den Broeck, Jan; Mackay, Melanie; Mpontshane, Nontobeko; Kany Kany Luabeya, Angelique; Chhagan, Meera; Bennish, Michael L

    2007-01-01

    Clinical trials conducted in rural resource-poor settings face special challenges in ensuring quality of data collection and handling. The variable nature of these challenges, ways to overcome them, and the resulting data quality are rarely reported in the literature. Our objectives were to provide a detailed example of establishing local data handling capacity for a clinical trial conducted in a rural area, to highlight challenges and solutions in establishing such capacity, and to report the data quality obtained by the trial. We provide a descriptive case study of a data system for biological samples and questionnaire data, and the problems encountered during its implementation. To determine the quality of data, we analyzed test-retest studies using Kappa statistics of inter- and intra-observer agreement on categorical data. We calculated Technical Errors of Measurement for anthropometric measurements, audit trail analysis was done to assess error correction rates, and residual error rates were calculated by database-to-source-document comparison. Initial difficulties included the unavailability of experienced research nurses, programmers and data managers in this rural area and the difficulty of designing new software tools and a complex database while making them error-free. National and international collaboration and external monitoring helped ensure good data handling and implementation of good clinical practice. Data collection, fieldwork supervision and query handling depended on streamlined transport over large distances. The involvement of a community advisory board was helpful in addressing cultural issues and establishing community acceptability of data collection methods. Data accessibility for safety monitoring required special attention. Kappa values and Technical Errors of Measurement showed acceptable values, and residual error rates in key variables were low. The article describes the experience of a single-site trial and does not address challenges particular to multi-site trials. Obtaining and maintaining data integrity in rural clinical trials is feasible, can result in acceptable data quality and can be used to develop capacity in developing-country sites. It does, however, involve special challenges and requirements.

  14. A DBMS-based medical teleconferencing system.

    PubMed

    Chun, J; Kim, H; Lee, S; Choi, J; Cho, H

    2001-01-01

    This article presents the design of a medical teleconferencing system that is integrated with a multimedia patient database and incorporates easy-to-use tools and functions to effectively support collaborative work between physicians in remote locations. The design provides a virtual workspace that allows physicians to collectively view various kinds of patient data. By integrating the teleconferencing function into this workspace, physicians are able to conduct conferences using the same interface and have real-time access to the database during conference sessions. The authors have implemented a prototype based on this design. The prototype uses a high-speed network test bed and a manually created substitute for the integrated patient database.

  15. A DBMS-based Medical Teleconferencing System

    PubMed Central

    Chun, Jonghoon; Kim, Hanjoon; Lee, Sang-goo; Choi, Jinwook; Cho, Hanik

    2001-01-01

    This article presents the design of a medical teleconferencing system that is integrated with a multimedia patient database and incorporates easy-to-use tools and functions to effectively support collaborative work between physicians in remote locations. The design provides a virtual workspace that allows physicians to collectively view various kinds of patient data. By integrating the teleconferencing function into this workspace, physicians are able to conduct conferences using the same interface and have real-time access to the database during conference sessions. The authors have implemented a prototype based on this design. The prototype uses a high-speed network test bed and a manually created substitute for the integrated patient database. PMID:11522766

  16. Conceptual Design for the Amphibian Research and Monitoring Initiative (ARMI)

    NASA Astrophysics Data System (ADS)

    Battaglin, W. A.; Langtimm, C. A.; Adams, M. J.; Gallant, A. L.; James, D. L.

    2001-12-01

    In 2000, the President of the United States (US) and Congress directed Department of Interior (DOI) agencies to develop a program for monitoring trends in amphibian populations on DOI lands and to conduct research into causes of declines. The U.S. Geological Survey (USGS) was given lead responsibility for planning and implementing the Amphibian Research and Monitoring Initiative (ARMI) in cooperation with the National Park Service (NPS), Fish and Wildlife Service, and Bureau of Land Management. The program objectives are to (1) establish a network for monitoring the status and distribution of amphibian species on DOI lands; (2) identify and monitor environmental conditions known to affect amphibian populations; (3) conduct research on causes of amphibian population change and malformations; and (4) provide information to resource managers, policy makers, and the public in support of amphibian conservation. The ARMI program will integrate research efforts of USGS, other Federal, and non-federal herpetologists, hydrologists, and geographers across the Nation. ARMI will conduct a small number (~20) of intensive research efforts (for example, studies linking amphibian population changes to hydrologic conditions) and a larger number (~50) of more generalized inventory and monitoring studies encompassing broader areas such as NPS units. ARMI will coordinate with and try to augment other amphibian inventory studies such as the National Amphibian Atlas and the North American Amphibian Monitoring Program. ARMI will develop and test protocols for the standardized collection of amphibian data and provide a centrally managed database designed to simplify data entry, retrieval, and analysis. ARMI pilot projects are underway at locations across the US.

  17. Unified Desktop for Monitoring & Control Applications - The Open Navigator Framework Applied for Control Centre and EGSE Applications

    NASA Astrophysics Data System (ADS)

    Brauer, U.

    2007-08-01

    The Open Navigator Framework (ONF) was developed to provide a unified and scalable platform for user interface integration. The main objective of the framework was to raise the usability of monitoring and control consoles and to enable reuse of software components in different application areas. ONF is currently applied for the Columbus onboard crew interface, the commanding application for the Columbus Control Centre, the specialized user interfaces of the Columbus user facilities, the Mission Execution Crew Assistant (MECA) study and EADS Astrium internal R&D projects. ONF provides a well-documented and proven middleware for GUI components (Java plugin interface, simplified concept similar to Eclipse). The overall application configuration is performed within a graphical user interface for layout and component selection; the end user does not have to work in the underlying XML configuration files. ONF was optimized to provide harmonized user interfaces for monitoring and command consoles. It provides many convenience functions designed together with flight controllers and onboard crew:
    - user-defined workspaces, incl. support for multiple screens
    - an efficient communication mechanism between the components
    - integrated web browsing and documentation search & viewing
    - consistent and integrated menus and shortcuts
    - common logging and application configuration (properties)
    - a supervision interface for remote plugin GUI access (web based)
    A large number of operationally proven ONF components have been developed:
    - Command Stack & History: release commands and follow up the command acknowledgements
    - System Message Panel: browse, filter and search system messages/events
    - Unified Synoptic System: generic synoptic display system
    - Situational Awareness: show overall subsystem status based on monitoring of key parameters
    - System Model Browser: browse mission database definitions (measurements, commands, events)
    - Flight Procedure Executor: execute checklist and logical-flow interactive procedures
    - Web Browser: integrated browser for reference documentation and operations data
    - Timeline Viewer: view the master timeline as a Gantt chart
    - Search: local search of operations products (e.g. documentation, procedures, displays)
    All GUI components access the underlying spacecraft data (commanding, reporting data, events, command history) via a common library providing adaptors for the current deployments (Columbus MCS, Columbus onboard Data Management System, Columbus Trainer raw packet protocol). New adaptors are easy to develop; an adaptor to SCOS 2000 is currently being developed as part of a study for the ESTEC standardization section ("USS for ESTEC Reference Facility").

  18. Reviewing effectiveness of ankle assessment techniques for use in robot-assisted therapy.

    PubMed

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Shane

    2014-01-01

    This article provides a comprehensive review of studies that investigated ankle assessment techniques, to better understand those that can be used for real-time monitoring of rehabilitation progress in conjunction with robot-assisted therapy. Seventy-six publications published between January 1980 and August 2013 were selected from searches of eight databases. They were divided into two main categories (16 qualitative and 60 quantitative studies): 13 goniometer studies, 18 dynamometer studies, and 29 studies of innovative techniques. A total of 465 subjects participated in the 29 quantitative studies of innovative measurement techniques that may potentially be integrated into a real-time monitoring device; 19 of these studies included fewer than 10 participants. Results show that qualitative ankle assessment methods are not suitable for real-time monitoring in robot-assisted therapy, though they are reliable for certain patients, while quantitative methods show great potential. The majority of quantitative techniques are reliable in measuring ankle kinematics and kinetics but are usually available only for use in the sagittal plane. Few studies determine kinematics and kinetics in all three planes (sagittal, transverse, and frontal) where motions of the ankle joint and the subtalar joint actually occur.

  19. Monitoring HIV-Related Laws and Policies: Lessons for AIDS and Global Health in Agenda 2030.

    PubMed

    Torres, Mary Ann; Gruskin, Sofia; Buse, Kent; Erkkola, Taavi; Bendaud, Victoria; Alfvén, Tobias

    2017-07-01

    The National Commitments and Policy Instrument (NCPI) has been used to monitor AIDS-related laws and policies for over 10 years. What can be learnt from this process? Analyses draw on NCPI questionnaires, NCPI responses, the UNAIDS Law Database, survey data and responses to a 2014 survey on the NCPI. The NCPI provides the first and only systematic data on country self-reported national HIV laws and policies. High NCPI reporting rates and survey responses suggest the majority of countries consider the process relevant. Combined civil society and government engagement and reporting is integral to the NCPI. NCPI experience demonstrates its importance in describing the political and legal environment for the HIV response, for programmatic reviews and to stimulate dialogue among stakeholders, but there is a need for updating and in some instances to complement results with more objective quantitative data. We identify five areas that need to be updated in the next iteration of the NCPI and argue that the NCPI approach is relevant to participatory monitoring of targets in the health and other goals of the UN 2030 Agenda for Sustainable Development.

  20. Monitoring the Quality of Medicines: Results from Africa, Asia, and South America

    PubMed Central

    Hajjou, Mustapha; Krech, Laura; Lane-Barlow, Christi; Roth, Lukas; Pribluda, Victor S.; Phanouvong, Souly; El-Hadri, Latifa; Evans, Lawrence; Raymond, Christopher; Yuan, Elaine; Siv, Lang; Vuong, Tuan-Anh; Boateng, Kwasi Poku; Okafor, Regina; Chibwe, Kennedy M.; Lukulay, Patrick H.

    2015-01-01

    Monitoring the quality of medicines plays a crucial role in an integrated medicines quality assurance system. In a publicly available medicines quality database (MQDB), the U.S. Pharmacopeial Convention (USP) reports results of data collected from medicines quality monitoring (MQM) activities spanning the period of 2003–2013 in 17 countries of Africa, Asia, and South America. The MQDB contains information on 15,063 samples collected and tested using Minilab® screening methods and/or pharmacopeial methods. Approximately 71% of the samples reported came from Asia, 23% from Africa, and 6% from South America. The samples collected and tested include mainly antibiotic, antimalarial, and antituberculosis medicines. A total of 848 samples, representing 5.6% of total samples, failed the quality test. The failure proportion per region was 11.5%, 10.4%, and 2.9% for South America, Africa, and Asia, respectively. Eighty-one counterfeit medicines were reported, 86.4% of which were found in Asia and 13.6% in Africa. Additional analysis of the data shows the distribution of poor-quality medicines per region and by therapeutic indication as well as possible trends of counterfeit medicines. PMID:25897073

  1. Using Statistics for Database Management in an Academic Library.

    ERIC Educational Resources Information Center

    Hyland, Peter; Wright, Lynne

    1996-01-01

    Collecting statistical data about database usage by library patrons aids in the management of CD-ROM and database offerings, collection development, and evaluation of training programs. Two approaches to data collection are presented which should be used together: an automated or nonintrusive method which monitors search sessions while the…

  2. DSSTox EPA Integrated Risk Information System Structure-Index Locator File: SDF File and Documentation

    EPA Science Inventory

    EPA's Integrated Risk Information System (IRIS) database was developed and is maintained by EPA's Office of Research and Developement, National Center for Environmental Assessment. IRIS is a database of human health effects that may result from exposure to various substances fou...

  3. A systematic review of the cost of data collection for performance monitoring in hospitals.

    PubMed

    Jones, Cheryl; Gannon, Brenda; Wakai, Abel; O'Sullivan, Ronan

    2015-04-01

    Key performance indicators (KPIs) are used to identify where organisational performance is meeting desired standards and where performance requires improvement. Valid and reliable KPIs depend on the availability of high-quality data, specifically the relevant minimum data set (MDS; the core data identified as the minimum required to measure performance for a KPI) elements. However, the feasibility of collecting the relevant MDS elements is always a limitation of performance monitoring using KPIs. Preferably, data should be integrated into service delivery, and, where additional data are required that are not currently collected as part of routine service delivery, there should be an economic evaluation to determine the cost of data collection. The aim of this systematic review was to synthesise the evidence base concerning the costs of data collection in hospitals for performance monitoring using KPIs, and to identify hospital data collection systems that have proven to be cost minimising. We searched MEDLINE (1946 to May week 4, 2014), Embase (1974 to May week 2, 2014), and CINAHL (1937 to date). The database searches were supplemented by searching for grey literature through the OpenGrey database. Data were extracted, tabulated, and summarised as part of a narrative synthesis. The searches yielded a total of 1,135 publications; after assessing each identified study against specific inclusion/exclusion criteria, only eight studies were deemed relevant for this review. The studies evaluate different types of data collection interventions, including the installation of information communication technology (ICT), improvements to current ICT systems, and how different analysis techniques may be used to monitor performance. The evaluation methods used to measure the costs and benefits of data collection interventions are inconsistent across the identified literature. Overall, the results weakly indicate that collection of hospital data and improvements in data recording can be cost-saving. Given the limitations of this systematic review, it is difficult to conclude whether improvements in data collection systems can save money, increase quality of care, and assist performance monitoring of hospitals. That said, the results are positive and suggest that data collection improvements may lead to cost savings and aid quality of care. PROSPERO CRD42014007450.

  4. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases

    PubMed Central

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-01-01

    Global cloud frameworks for bioinformatics research databases become huge and heterogeneous; solutions face various diametric challenges comprising cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN published 192 mammalian, plant and protein life sciences databases having 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely under the control of programming languages popularly used by bioinformaticians such as Perl and Ruby. Researchers successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents like ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org. PMID:21632604

  5. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project Network of Research Infrastructures for European Seismology (NERIES) implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include among others:
    - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from >53 observatories. This data is continuously monitored, quality controlled and archived in the European Integrated Distributed waveform Archive (EIDA).
    - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and the actual software are included in the database.
    - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000-1963, M ≥ 5.8), including analysis tools.
    - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) within the general SEED format, thus creating the core integrated database for ocean-, sea- and land-based seismological observatories.
    Tools to facilitate analysis and data mining of the RI datasets are:
    - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently being adapted on a global scale.
    - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and the core hazard portal developed along the same technologies as the NERIES data portal.
    - Homogeneous shakemap estimation tools implemented at several large European observatories and a complementary new loss estimation software tool.
    - A comprehensive set of new techniques for geotechnical site characterization, with the relevant software packages documented and maintained (www.geopsy.org).
    - A set of software packages for data mining, data reduction, data exchange and information management in seismology, serving as research and observatory analysis tools.
    NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010-2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI 2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010-2014) is also funded by the EC.

  6. Long-term ecosystem monitoring and change detection: the Sonoran initiative

    Treesearch

    Robert Lozar; Charles Ehlschlaeger

    2005-01-01

    Ecoregional Systems Heritage and Encroachment Monitoring (ESHEM) examines issues of land management at an ecosystem level using remote sensing. The Engineer Research and Development Center (ERDC), in partnership with Western Illinois University, has developed an ecoregional database and monitoring capability covering the Sonoran region. The monitoring time horizon will...

  7. The Comprehensive, Powerful, Academic Database (CPAD): An Evaluative Study of a Predictive Tool Designed for Elementary School Personnel in Identifying At-Risk Students through Progress, Curriculum, and Performance Monitoring

    ERIC Educational Resources Information Center

    Chavez-Gibson, Sarah

    2013-01-01

    The purpose of this study is to examine, in depth, the Comprehensive, Powerful, Academic Database (CPAD), a data decision-making tool that determines and identifies students at risk of dropping out of school, and how the CPAD assists administrators and teachers at an elementary campus in monitoring progress, curriculum, and performance to improve student…

  8. Free text databases in an Integrated Academic Information System (IAIMS) at Columbia Presbyterian Medical Center.

    PubMed Central

    Clark, A. S.; Shea, S.

    1991-01-01

    The use of Folio Views, a PC DOS-based product for free-text databases, is explored in three applications in an Integrated Academic Information System (IAIMS): (1) a telephone directory, (2) a grants and contracts newsletter, and (3) nursing care plans. PMID:1666967

  9. IN SILICO METHODOLOGIES FOR PREDICTIVE EVALUATION OF TOXICITY BASED ON INTEGRATION OF DATABASES

    EPA Science Inventory

    In silico methodologies for predictive evaluation of toxicity based on integration of databases

    Chihae Yang1 and Ann M. Richard2, 1LeadScope, Inc. 1245 Kinnear Rd. Columbus, OH. 43212 2National Health & Environmental Effects Research Lab, U.S. EPA, Research Triangle Park, ...

  10. Integrative medicine for managing the symptoms of lupus nephritis: A protocol for systematic review and meta-analysis.

    PubMed

    Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo

    2018-03-01

    Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have been performed on the application of integrative medicine for lupus nephritis in patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (Korea Med, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KM base], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. PROSPERO 2018 CRD42018085205.

  11. Community integration after traumatic brain injury: a systematic review of the clinical implications of measurement and service provision for older adults.

    PubMed

    Ritchie, Linda; Wright-St Clair, Valerie A; Keogh, Justin; Gray, Marion

    2014-01-01

    To explore the scope, reliability, and validity of community integration measures for older adults after traumatic brain injury (TBI). A search of peer-reviewed articles in English from 1990 to April 2011 was conducted using the EBSCO Health and Scopus databases. Search terms included community integration, traumatic brain injury or TBI, 65 plus or older adults, and assessment. Forty-three eligible articles were identified, with 11 selected for full review using a standardized critical review method. Common community integration measures were identified and ranked for relevance and psychometric properties. Of the 43 eligible articles, studies reporting community integration outcomes post-TBI were identified and critically reviewed. Older adults' community integration needs post-TBI from high-quality studies were summarized. There is a relative lack of evidence pertaining to older adults post-TBI, but indicators are that older adults have poorer outcomes than their younger counterparts. The Community Integration Questionnaire (CIQ) is the most widely used community integration measure in research on people with TBI. Because of some limitations, many studies have used the CIQ in conjunction with other measures to better quantify and/or monitor changes in community integration. Enhancing integration of older adults after TBI into their community of choice, with particular emphasis on social integration and quality of life, should be a primary rehabilitation goal. However, more research is needed to inform best practice guidelines to meet the needs of this growing TBI population. It is recommended that subjective tools, such as quality of life measures, be used in conjunction with well-established community integration measures, such as the CIQ, during the assessment process. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. blend4php: a PHP API for galaxy

    PubMed Central

    Wytko, Connor; Soto, Brian; Ficklin, Stephen P.

    2017-01-01

    Galaxy is a popular framework for execution of complex analytical pipelines, typically for large data sets, and is commonly used for (but not limited to) genomic, genetic and related biological analyses. It provides a web front-end and integrates with high performance computing resources. Here we report the development of the blend4php library that wraps Galaxy’s RESTful API into a PHP-based library. PHP-based web applications can use blend4php to automate execution, monitoring and management of a remote Galaxy server, including its users, workflows, jobs and more. The blend4php library was specifically developed for the integration of Galaxy with Tripal, the open-source toolkit for the creation of online genomic and genetic web sites. However, it was designed as an independent library for use by any application, and is freely available under version 3 of the GNU Lesser General Public License (LGPL v3.0) at https://github.com/galaxyproject/blend4php. Database URL: https://github.com/galaxyproject/blend4php PMID:28077564
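
    blend4php itself is a PHP library, but the same call pattern can be shown compactly in Python with BioBlend, the galaxyproject Python client for the Galaxy RESTful API; the server URL and API key below are placeholders.

```python
# Illustration of the automation tasks blend4php exposes to PHP
# applications, written here with BioBlend (the galaxyproject Python
# client). URL and key are placeholders for a real Galaxy server.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

# Enumerate workflows and histories on the remote Galaxy server.
for wf in gi.workflows.get_workflows():
    print(wf["id"], wf["name"])
for hist in gi.histories.get_histories():
    print(hist["id"], hist["name"])
```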

  13. Managing operational documentation in the ALICE Detector Control System

    NASA Astrophysics Data System (ADS)

    Lechman, M.; Augustinus, A.; Bond, P.; Chochula, P.; Kurepin, A.; Pinazza, O.; Rosinsky, P.

    2012-12-01

    ALICE (A Large Ion Collider Experiment) is one of the big LHC (Large Hadron Collider) experiments at CERN in Geneva, Switzerland. The experiment is composed of 18 sub-detectors controlled by an integrated Detector Control System (DCS) that is implemented using the commercial SCADA package PVSSII. The DCS includes over 1200 network devices, over 1,000,000 monitored parameters and numerous custom-made software components that are prepared by over 100 developers from all around the world. This complex system is controlled by a single operator via a central user interface. One of his/her main tasks is recovery from anomalies and errors that may occur during operation. Therefore, clear, complete and easily accessible documentation is essential to guide the shifter through the expert interfaces of the different subsystems. This paper describes the idea of managing the operational documentation in ALICE using a generic repository that is built on a relational database and is integrated with the control system. The experience gained and the conclusions drawn from the project are also presented.

  14. Using artificial intelligence to automate remittance processing.

    PubMed

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  15. On the way toward systems biology of Aspergillus fumigatus infection.

    PubMed

    Albrecht, Daniela; Kniemeyer, Olaf; Mech, Franziska; Gunzer, Matthias; Brakhage, Axel; Guthke, Reinhard

    2011-06-01

    Pathogenicity of Aspergillus fumigatus is multifactorial. Thus, global studies are essential for understanding the infection process. Therefore, a data warehouse was established where genome sequence, transcriptome and proteome data are stored. These data are analyzed for the elucidation of virulence determinants. The data analysis workflow starts with pre-processing, including imputation of missing values and normalization. The last step is the identification of differentially expressed genes/proteins as interesting candidates for further analysis, in particular for functional categorization and correlation studies. Sequence data and other prior knowledge extracted from databases are integrated to support the inference of gene regulatory networks associated with pathogenicity. This knowledge-assisted data analysis aims at establishing mathematical models with predictive strength to assist further experimental work. Recently, first steps were taken to extend the integrative data analysis and computational modeling by evaluating spatio-temporal data (movies) that monitor interactions of A. fumigatus morphotypes (e.g. conidia) with host immune cells. Copyright © 2011 Elsevier GmbH. All rights reserved.
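
    A minimal sketch of the pre-processing step described above (imputation of missing values followed by normalization); the median-imputation and log/z-score choices are generic stand-ins, not necessarily the warehouse's exact pipeline.

```python
# Generic expression-matrix pre-processing: impute, then normalize.
# These specific choices are illustrative assumptions.
import numpy as np
import pandas as pd


def preprocess(expr: pd.DataFrame) -> pd.DataFrame:
    """Pre-process a genes x samples expression matrix."""
    # Impute each gene's missing values with that gene's median.
    expr = expr.apply(lambda row: row.fillna(row.median()), axis=1)
    # Log-transform, then standardize each sample (column) to
    # zero mean and unit variance.
    logged = np.log2(expr + 1.0)
    return (logged - logged.mean()) / logged.std()
```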

  16. Quantitative Measurement of Integrated Band Intensities of Isoprene and Formaldehyde

    NASA Astrophysics Data System (ADS)

    Brauer, Carolyn S.; Johnson, Timothy J.; Blake, Thomas A.; Sams, Robert L.

    2013-06-01

    The OH-initiated oxidation of isoprene, which is one of the primary volatile organic compounds produced by vegetation, is a major source of atmospheric formaldehyde and other oxygenated organics. Both molecules are also known products of biomass burning. Absorption coefficients and integrated band intensities for isoprene and formaldehyde are reported in the 600 - 6500 cm^{-1} region. The pressure-broadened (1 atmosphere N_2) spectra were recorded at 278, 298 and 323 K in a 19.96 cm path length cell at 0.112 cm^{-1} resolution, using a Bruker 66V FTIR. Composite spectra are composed of a minimum of seven pressures at each temperature for both molecules. These data are part of the PNNL Spectral Database, which contains quantitative spectra of over 600 molecules. These quantitative spectra facilitate atmospheric monitoring for both remote and in situ sensing, and such applications will be discussed. Timothy J. Johnson, Luisa T. M. Profeta, Robert L. Sams, David W. T. Griffith, Robert L. Yokelson, Vibrational Spectroscopy 53(1), 97-102 (2010).
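
    For readers unfamiliar with the quantity being reported: an integrated band intensity is the area under the absorption coefficient across a band, which can be approximated numerically as in the hedged sketch below (grid, band limits and data are placeholders).

```python
# Numerical band integration via the trapezoidal rule. The grid,
# band limits and zero-filled spectrum are placeholders for real data.
import numpy as np

wavenumber = np.linspace(600.0, 6500.0, 52679)   # cm^-1, ~0.112 cm^-1 step
absorption = np.zeros_like(wavenumber)           # stand-in coefficients

band = (wavenumber > 2700.0) & (wavenumber < 2900.0)  # hypothetical limits
intensity = np.trapz(absorption[band], wavenumber[band])
print(intensity)  # units follow the absorption coefficient used
```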

  17. Dietary Exposure Potential Model

    EPA Science Inventory

    Existing food consumption and contaminant residue databases, typically products of nutrition and regulatory monitoring, contain useful information to characterize dietary intake of environmental chemicals. A PC-based model with resident database system, termed the Die...

  18. HOWDY: an integrated database system for human genome research

    PubMed Central

    Hirakawa, Mika

    2002-01-01

    HOWDY is an integrated database system for accessing and analyzing human genomic information (http://www-alis.tokyo.jst.go.jp/HOWDY/). HOWDY stores information about relationships between genetic objects and the data extracted from a number of databases. HOWDY consists of an Internet-accessible user interface that allows thorough searching of the human genomic databases using gene symbols and their aliases. It also permits flexible editing of the sequence data. The database can be searched using simple words, and the search can be restricted to a specific cytogenetic location. Linear maps displaying markers and genes on contig sequences are available, from which an object can be chosen. Any search starting point identifies all the information matching the query. HOWDY provides a convenient search environment for human genomic data for scientists unsure which database is most appropriate for their search. PMID:11752279

  19. [Technical improvement of cohort constitution in administrative health databases: Providing a tool for integration and standardization of data applicable in the French National Health Insurance Database (SNIIRAM)].

    PubMed

    Ferdynus, C; Huiart, L

    2016-09-01

    Administrative health databases such as the French National Health Insurance Database - SNIIRAM - are a major tool for answering numerous public health research questions. However, the use of such data requires complex and time-consuming data management. Our objective was to develop and make available a tool to optimize cohort constitution within administrative health databases. We developed a process to extract, transform and load (ETL) data from various heterogeneous sources into a standardized data warehouse. This data warehouse is architected as a star schema corresponding to an i2b2 star schema model. We then evaluated the performance of this ETL using data from a pharmacoepidemiology research project conducted in the SNIIRAM database. The ETL we developed comprises a set of functionalities for creating SAS scripts. Data can be integrated into a standardized data warehouse. As part of the performance assessment of this ETL, we achieved integration of a dataset from the SNIIRAM comprising more than 900 million lines in less than three hours using a desktop computer. This enables patient selection from the standardized data warehouse within seconds of the request. The ETL described in this paper provides a tool which is effective and compatible with all administrative health databases, without requiring complex database servers. This tool should simplify cohort constitution in health databases; the standardization of warehouse data facilitates collaborative work between research teams. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
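
    The ETL itself generates SAS scripts; purely to illustrate the target layout, the toy sketch below loads rows into an i2b2-style star schema (one observation_fact table surrounded by dimension tables) using SQLite, with invented source rows.

```python
# Toy load step into an i2b2-style star schema. Table and column names
# follow the i2b2 convention; the source rows are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient_dimension (patient_num INTEGER PRIMARY KEY, birth_date TEXT);
CREATE TABLE concept_dimension (concept_cd TEXT PRIMARY KEY, name_char TEXT);
CREATE TABLE observation_fact (
    patient_num INTEGER REFERENCES patient_dimension,
    concept_cd  TEXT    REFERENCES concept_dimension,
    start_date  TEXT);
""")

# Extract (literal rows here), transform (standardized codes), load.
source = [(1, "ATC:C07AB02", "2015-03-02"), (1, "ATC:N02BE01", "2015-04-11")]
con.executemany("INSERT OR IGNORE INTO patient_dimension VALUES (?, NULL)",
                {(p,) for p, _, _ in source})
con.executemany("INSERT OR IGNORE INTO concept_dimension VALUES (?, NULL)",
                {(c,) for _, c, _ in source})
con.executemany("INSERT INTO observation_fact VALUES (?, ?, ?)", source)

# Cohort selection then reduces to a simple query on the fact table.
print(con.execute("SELECT COUNT(*) FROM observation_fact "
                  "WHERE concept_cd LIKE 'ATC:C07%'").fetchone())
```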

  20. Discovering Knowledge from AIS Database for Application in VTS

    NASA Astrophysics Data System (ADS)

    Tsou, Ming-Cheng

    The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable to absorb and analyze it quickly and effectively. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relationship Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides marine traffic managers with a useful strategic planning resource.

  1. Low-complexity R-peak detection in ECG signals: a preliminary step towards ambulatory fetal monitoring.

    PubMed

    Rooijakkers, Michiel; Rabotti, Chiara; Bennebroek, Martijn; van Meerbergen, Jef; Mischi, Massimo

    2011-01-01

    Non-invasive fetal health monitoring during pregnancy has become increasingly important. Recent advances in signal processing technology have enabled fetal monitoring during pregnancy using abdominal ECG recordings. Ubiquitous ambulatory monitoring for continuous fetal health measurement is, however, still infeasible due to the computational complexity of noise-robust solutions. In this paper, a low-complexity ECG R-peak detection algorithm for ambulatory use is proposed, as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity while increasing R-peak detection quality compared to existing R-peak detection schemes. Validation of the algorithm is performed on two manually annotated datasets, the MIT/BIH Arrhythmia database and an in-house abdominal database. Both R-peak detection quality and computational complexity are compared to state-of-the-art algorithms as described in the literature. With a detection error rate of 0.22% and 0.12% on the MIT/BIH Arrhythmia and in-house databases, respectively, the quality of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
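
    As a hedged sketch of the low-complexity approach this class of algorithms takes (not the authors' exact method), the following differentiates and squares the ECG to emphasize QRS energy, then picks peaks above a simple adaptive threshold; all parameters are illustrative.

```python
# Generic low-complexity R-peak detection: slope energy plus an
# adaptive threshold. Parameters are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks


def detect_r_peaks(ecg: np.ndarray, fs: float) -> np.ndarray:
    """Return approximate sample indices of R-peak candidates."""
    energy = np.diff(ecg) ** 2              # slope energy highlights QRS
    threshold = 4.0 * np.mean(energy)       # simple adaptive threshold
    peaks, _ = find_peaks(energy,
                          height=threshold,
                          distance=int(0.25 * fs))  # >=250 ms apart
    return peaks

# Example call on a recorded trace sampled at the MIT/BIH rate:
# peaks = detect_r_peaks(signal, fs=360.0)
```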

  2. Processing and Quality Monitoring for the ATLAS Tile Hadronic Calorimeter Data

    NASA Astrophysics Data System (ADS)

    Burghgrave, Blake; ATLAS Collaboration

    2017-10-01

    An overview is presented of Data Processing and Data Quality (DQ) Monitoring for the ATLAS Tile Hadronic Calorimeter. Calibration runs are monitored from a data quality perspective and used as a cross-check for physics runs. Data quality in physics runs is monitored extensively and continuously. Any problems are reported and immediately investigated. The DQ efficiency achieved was 99.6% in 2012 and 100% in 2015, after the detector maintenance in 2013-2014. Changes to detector status or calibrations are entered into the conditions database (DB) during a brief calibration loop between the end of a run and the beginning of bulk processing of data collected in it. Bulk processed data are reviewed and certified for the ATLAS Good Run List if no problem is detected. Experts maintain the tools used by DQ shifters and the calibration teams during normal operation, and prepare new conditions for data reprocessing and Monte Carlo (MC) production campaigns. Conditions data are stored in 3 databases: Online DB, Offline DB for data and a special DB for Monte Carlo. Database updates can be performed through a custom-made web interface.

  3. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement the following QA/QC procedures:
    1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency-domain and spectral calibration procedures, using tissue-simulating phantoms and reflectance standards, respectively.)
    2. Standardize and validate data acquisition, processing and visualization (optimize the instrument software, EZDOS; centralize data processing).
    3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology).
    4. Standardize and coordinate trial data entry (from individual sites) into a centralized database.
    5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants, ensuring "calibration".
    This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  4. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    PubMed

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of the Resource Description Framework (RDF) and made it available through a SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. In addition, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
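
    The query pattern the abstract describes might look like the following, where a script asks a SPARQL endpoint for the members of one ortholog group; the endpoint URL and the OrthO predicate names are placeholders rather than the published vocabulary.

```python
# Querying a SPARQL endpoint for ortholog group members. The endpoint
# URL and the orth: predicates below are placeholder assumptions, not
# the published OrthO vocabulary.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://example.org/mbgd/sparql")  # placeholder
endpoint.setQuery("""
PREFIX orth: <http://example.org/ortho#>
SELECT ?gene ?organism WHERE {
    ?group a orth:OrthologGroup ;
           orth:member ?gene .
    ?gene  orth:organism ?organism .
} LIMIT 20
""")
endpoint.setReturnFormat(JSON)

# Each binding links a gene to its organism via the ortholog-group hub.
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["gene"]["value"], row["organism"]["value"])
```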

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, H. C.; Chen, K.; Liu, Y. Y.

    The US Department of Energy (DOE) [Environmental Management (EM), Office of Packaging and Transportation (EM-45)] Packaging Certification Program (PCP) has developed a radiofrequency identification (RFID) tracking and monitoring system for the management of nuclear materials packages during storage and transportation. The system, developed by the PCP team at Argonne National Laboratory, involves hardware modification, application software development, secured database and web server development, and irradiation experiments. In April 2008, Argonne tested key features of the RFID tracking and monitoring system in a weeklong, 1700 mile (2736 km) demonstration employing 14 empty type B fissile material drums of three designs (models 9975, 9977 and ES-3100) that have been certified for shipment by the DOE and the US Nuclear Regulatory Commission. The demonstration successfully integrated global positioning system (GPS) technology for vehicle tracking, satellite/cellular (general packet radio service, or GPRS) technologies for wireless communication, and active RFID tags with multiple sensors (seal integrity, shock, temperature, humidity and battery status) on drums. In addition, the demonstration integrated geographic information system (GIS) technology with automatic alarm notifications of incidents and generated buffer zone reports for emergency response and management of staged incidents. The demonstration was sponsored by EM and the US National Nuclear Security Administration, with the participation of Argonne, Savannah River and Oak Ridge National Laboratories. Over 50 authorised stakeholders across the country observed the demonstration via secured Internet access. The DOE PCP and national laboratories are working on several RFID system implementation projects at selected DOE sites, as well as continuing device and systems development and widening applications beyond DOE sites and possibly beyond nuclear materials to include other radioactive materials.

  6. RegNetwork: an integrated database of transcriptional and post-transcriptional regulatory networks in human and mouse

    PubMed Central

    Liu, Zhi-Ping; Wu, Canglin; Miao, Hongyu; Wu, Hulin

    2015-01-01

    Transcriptional and post-transcriptional regulation of gene expression is of fundamental importance to numerous biological processes. Nowadays, an increasing number of gene regulatory relationships has been documented in various databases and literature. However, to more efficiently exploit such knowledge for biomedical research and applications, it is necessary to construct a genome-wide regulatory network database to integrate the information on gene regulatory relationships that are widely scattered in many different places. Therefore, in this work, we build a knowledge-based database, named ‘RegNetwork’, of gene regulatory networks for human and mouse by collecting and integrating the documented regulatory interactions among transcription factors (TFs), microRNAs (miRNAs) and target genes from 25 selected databases. Moreover, we also inferred and incorporated potential regulatory relationships based on transcription factor binding site (TFBS) motifs into RegNetwork. As a result, RegNetwork contains a comprehensive set of experimentally observed or predicted transcriptional and post-transcriptional regulatory relationships, and the database framework is flexibly designed for potential extensions to include gene regulatory networks for other organisms in the future. Based on RegNetwork, we characterized the statistical and topological properties of genome-wide regulatory networks for human and mouse, and we extracted and interpreted simple yet important network motifs that involve the interplays between TF-miRNA and their targets. In summary, RegNetwork provides an integrated resource on the prior information for gene regulatory relationships, and it enables us to further investigate context-specific transcriptional and post-transcriptional regulatory interactions based on domain-specific experimental data. Database URL: http://www.regnetworkweb.org PMID:26424082

  7. iMETHYL: an integrative database of human DNA methylation, gene expression, and genomic variation.

    PubMed

    Komaki, Shohei; Shiwa, Yuh; Furukawa, Ryohei; Hachiya, Tsuyoshi; Ohmomo, Hideki; Otomo, Ryo; Satoh, Mamoru; Hitomi, Jiro; Sobue, Kenji; Sasaki, Makoto; Shimizu, Atsushi

    2018-01-01

    We launched an integrative multi-omics database, iMETHYL (http://imethyl.iwate-megabank.org). iMETHYL provides whole-DNA methylation (~24 million autosomal CpG sites), whole-genome (~9 million single-nucleotide variants), and whole-transcriptome (>14 000 genes) data for CD4+ T-lymphocytes, monocytes, and neutrophils collected from approximately 100 subjects. These data were obtained from whole-genome bisulfite sequencing, whole-genome sequencing, and whole-transcriptome sequencing, making iMETHYL a comprehensive database.

  8. The Forest Inventory and Analysis Database Version 4.0: Database Description and Users Manual for Phase 3

    Treesearch

    Christopher W. Woodall; Barbara L. Conkling; Michael C. Amacher; John W. Coulston; Sarah Jovan; Charles H. Perry; Beth Schulz; Gretchen C. Smith; Susan Will Wolf

    2010-01-01

    Describes the structure of the Forest Inventory and Analysis Database (FIADB) 4.0 for phase 3 indicators. The FIADB structure provides a consistent framework for storing forest health monitoring data across all ownerships for the entire United States. These data are available to the public.

  9. RadNet Databases and Reports

    EPA Pesticide Factsheets

    EPA’s RadNet data are available for viewing in a searchable database or as PDF reports. Historical and current RadNet monitoring data are used to estimate long-term trends in environmental radiation levels.

  10. Tomato functional genomics database (TFGD): a comprehensive collection and analysis package for tomato functional genomics

    USDA-ARS?s Scientific Manuscript database

    Tomato Functional Genomics Database (TFGD; http://ted.bti.cornell.edu) provides a comprehensive systems biology resource to store, mine, analyze, visualize and integrate large-scale tomato functional genomics datasets. The database is expanded from the previously described Tomato Expression Database...

  11. An integrative strategy to identify the entire protein coding potential of prokaryotic genomes by proteogenomics.

    PubMed

    Omasits, Ulrich; Varadarajan, Adithi R; Schmid, Michael; Goetze, Sandra; Melidis, Damianos; Bourqui, Marc; Nikolayeva, Olga; Québatte, Maxime; Patrignani, Andrea; Dehio, Christoph; Frey, Juerg E; Robinson, Mark D; Wollscheid, Bernd; Ahrens, Christian H

    2017-12-01

    Accurate annotation of all protein-coding sequences (CDSs) is an essential prerequisite to fully exploit the rapidly growing repertoire of completely sequenced prokaryotic genomes. However, large discrepancies among the number of CDSs annotated by different resources, missed functional short open reading frames (sORFs), and overprediction of spurious ORFs represent serious limitations. Our strategy toward accurate and complete genome annotation consolidates CDSs from multiple reference annotation resources, ab initio gene prediction algorithms and in silico ORFs (a modified six-frame translation considering alternative start codons) in an integrated proteogenomics database (iPtgxDB) that covers the entire protein-coding potential of a prokaryotic genome. By extending the PeptideClassifier concept of unambiguous peptides for prokaryotes, close to 95% of the identifiable peptides imply one distinct protein, largely simplifying downstream analysis. Searching a comprehensive Bartonella henselae proteomics data set against such an iPtgxDB allowed us to unambiguously identify novel ORFs uniquely predicted by each resource, including lipoproteins, differentially expressed and membrane-localized proteins, novel start sites and wrongly annotated pseudogenes. Most novelties were confirmed by targeted, parallel reaction monitoring mass spectrometry, including unique ORFs and single amino acid variations (SAAVs) identified in a re-sequenced laboratory strain that are not present in its reference genome. We demonstrate the general applicability of our strategy for genomes with varying GC content and distinct taxonomic origin. We release iPtgxDBs for B. henselae, Bradyrhizobium diazoefficiens and Escherichia coli and the software to generate both proteogenomics search databases and integrated annotation files that can be viewed in a genome browser for any prokaryote. © 2017 Omasits et al.; Published by Cold Spring Harbor Laboratory Press.
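
    To make the in silico ORF idea concrete, the sketch below scans all six reading frames for ORFs while allowing alternative start codons; the start-codon set and minimum length are common prokaryotic assumptions, not necessarily the iPtgxDB rules.

```python
# Six-frame ORF scan with alternative start codons. The start-codon
# set and the 30-codon minimum are illustrative assumptions.
STARTS = {"ATG", "GTG", "TTG"}
STOPS = {"TAA", "TAG", "TGA"}


def revcomp(seq: str) -> str:
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]


def orfs(seq: str, min_codons: int = 30):
    """Yield (strand, frame, start, end) for ORFs in all six frames."""
    for strand, s in (("+", seq), ("-", revcomp(seq))):
        for frame in range(3):
            start = None
            for i in range(frame, len(s) - 2, 3):
                codon = s[i:i + 3]
                if start is None and codon in STARTS:
                    start = i           # first start codon opens the ORF
                elif start is not None and codon in STOPS:
                    if (i - start) // 3 >= min_codons:
                        yield strand, frame, start, i + 3
                    start = None        # stop codon closes the ORF

print(list(orfs("ATG" + "GCA" * 40 + "TAA")))
```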

  12. Evaluation of 3-D Air Quality System Remotely-Sensed Aerosol Optical Depth for the Baltimore/Washington Metropolitan Air Shed

    NASA Astrophysics Data System (ADS)

    Weber, S. A.; Engel-Cox, J. A.; Hoff, R. M.; Prados, A.; Zhang, H.

    2008-12-01

    Integrating satellite- and ground-based aerosol optical depth (AOD) observations with surface total fine particulate (PM2.5) and sulfate concentrations allows for a more comprehensive understanding of local- and urban-scale air quality. This study evaluates the utility of integrated databases being developed for NOAA and EPA through the 3D-AQS project by examining the relationship between remotely-sensed AOD and PM2.5 concentrations for each platform for the summer of 2004 and the entire year of 2005. We compare results for the Baltimore, MD/Washington, DC metropolitan air shed, incorporating AOD products from the Terra and GOES-12 satellites, AERONET sunphotometer, and ground-based lidar, and PM2.5 concentrations from five surface monitoring sites. The satellite-derived products include AOD from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging Spectroradiometer (MISR), as well as the GOES Aerosol/Smoke Product (GASP). The vertical profile of lidar backscatter is used to retrieve the planetary boundary layer (PBL) height in an attempt to capture only that fraction of the AOD arising from near surface aerosols. Adjusting the AOD data using platform- and season-specific ratios, calculated using the parameters of the regression equations, for two case studies resulted in a more accurate representation of surface PM2.5 concentrations when compared to a constant ratio that is currently being used in the NOAA IDEA product. This work demonstrates that quantitative relationships between remotely-sensed and in-situ aerosol observations in an integrated database can be computed and applied to improve the use of remotely-sensed observations for estimating surface concentrations.
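
    The platform- and season-specific adjustment rests on a fitted AOD-to-PM2.5 relationship; a minimal version of that fit, with synthetic placeholder values, is sketched below.

```python
# Linear AOD-to-PM2.5 fit, one per platform/season in practice.
# All values are synthetic placeholders, not study data.
import numpy as np

aod = np.array([0.12, 0.25, 0.31, 0.44, 0.52, 0.61])   # satellite AOD
pm25 = np.array([8.0, 14.5, 17.2, 24.9, 28.3, 33.1])   # ug/m^3 at monitors

slope, intercept = np.polyfit(aod, pm25, 1)            # season-specific fit
estimate = slope * 0.38 + intercept                    # new AOD retrieval
print(round(estimate, 1))
```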

  13. ILDgenDB: integrated genetic knowledge resource for interstitial lung diseases (ILDs).

    PubMed

    Mishra, Smriti; Shah, Mohammad I; Sarkar, Malay; Asati, Nimisha; Rout, Chittaranjan

    2018-01-01

    Interstitial lung diseases (ILDs) are a diverse group of ∼200 acute and chronic pulmonary disorders that are characterized by variable amounts of inflammation, fibrosis and architectural distortion with substantial morbidity and mortality. Inaccurate and delayed diagnoses increase the risk, especially in developing countries. Studies have indicated the significant roles of genetic elements in ILDs pathogenesis. Therefore, the first genetic knowledge resource, ILDgenDB, has been developed with an objective to provide ILDs genetic data and their integrated analyses for the better understanding of disease pathogenesis and identification of diagnostics-based biomarkers. This resource contains literature-curated disease candidate genes (DCGs) enriched with various regulatory elements that have been generated using an integrated bioinformatics workflow of database searches, literature mining and DCG-microRNA (miRNA)-single nucleotide polymorphism (SNP) association analyses. To provide statistical significance to disease-gene associations, ILD-specificity index and hypergeometric test scores were also incorporated. Association analyses of miRNAs, SNPs and pathways responsible for the pathogenesis of different sub-classes of ILDs were also incorporated. Manually verified 299 DCGs and their significant associations with 1932 SNPs, 2966 miRNAs and 9170 miR-polymorphisms were also provided. Furthermore, 216 literature-mined and proposed biomarkers were identified. The ILDgenDB resource provides user-friendly browsing and extensive query-based information retrieval systems. Additionally, this resource also facilitates graphical views of predicted DCG-SNP/miRNA and literature-associated DCG-ILD interactions for each ILD to facilitate efficient data interpretation. Outcomes of analyses suggested the significant involvement of immune system and defense mechanisms in ILDs pathogenesis. This resource may potentially facilitate genetic-based disease monitoring and diagnosis. Database URL: http://14.139.240.55/ildgendb/index.php.
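
    The hypergeometric test mentioned above scores a disease-gene association as the probability of seeing at least k associated genes in a selection by chance; a short worked example with invented counts follows.

```python
# Hypergeometric enrichment test with invented counts: a population of
# M genes contains n ILD-associated genes; a tool flags N genes, k of
# which are ILD-associated. P-value = P(X >= k).
from scipy.stats import hypergeom

M, n, N, k = 20000, 300, 150, 12
p_value = hypergeom.sf(k - 1, M, n, N)   # survival function gives P(X >= k)
print(p_value)
```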

  14. Time-critical Database Condition Data Handling in the CMS Experiment During the First Data Taking Period

    NASA Astrophysics Data System (ADS)

    Cavallari, Francesca; de Gruttola, Michele; Di Guida, Salvatore; Govi, Giacomo; Innocente, Vincenzo; Pfeiffer, Andreas; Pierro, Antonio

    2011-12-01

    Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation, the database monitor is used to provide simple timing information and the history of all transactions for all database accounts; in the case of faults, it is used to return simple error messages and more complete debugging information.

  15. Multi-parameter vital sign database to assist in alarm optimization for general care units.

    PubMed

    Welch, James; Kanter, Benjamin; Skora, Brooke; McCombie, Scott; Henry, Isaac; McCombie, Devin; Kennedy, Rosemary; Soller, Babs

    2016-12-01

    Continual vital sign assessment on the general care, medical-surgical floor is expected to provide early indication of patient deterioration and increase the effectiveness of rapid response teams. However, there is concern that continual, multi-parameter vital sign monitoring will produce alarm fatigue. The objective of this study was the development of a methodology to help care teams optimize alarm settings. An on-body wireless monitoring system was used to continually assess heart rate, respiratory rate, SpO2 and noninvasive blood pressure in the general ward of ten hospitals between April 1, 2014 and January 19, 2015. These data, 94,575 h for 3430 patients, are contained in a large database, accessible with cloud computing tools. Simulation scenarios assessed the total alarm rate as a function of threshold and annunciation delay (in seconds). The total alarm rate of ten alarms/patient/day predicted from the cloud-hosted database was the same as the total alarm rate for a 10 day evaluation (1550 h for 36 patients) in an independent hospital. Plots of vital sign distributions in the cloud-hosted database were similar to other large databases published by different authors. The cloud-hosted database can be used to run simulations for various alarm thresholds and annunciation delays to predict the total alarm burden experienced by nursing staff. This methodology might, in the future, be used to help reduce alarm fatigue without sacrificing the ability to continually monitor all vital signs.
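
    A simulation of the kind described might count an alarm only when a limit is violated continuously for the whole annunciation delay; the sketch below implements that rule for SpO2 under assumed sampling and threshold values, not the study's actual simulator.

```python
# Count alarms for one candidate (threshold, annunciation delay) pair.
# Sampling interval and the SpO2 limit are illustrative assumptions.
import numpy as np


def alarm_count(spo2: np.ndarray, threshold: float,
                delay_s: float, sample_s: float = 15.0) -> int:
    """Count distinct alarms for SpO2 < threshold sustained >= delay_s."""
    need = max(1, int(round(delay_s / sample_s)))  # samples of sustained violation
    alarms, run = 0, 0
    for violated in (spo2 < threshold):
        run = run + 1 if violated else 0
        if run == need:            # fire exactly once per excursion
            alarms += 1
    return alarms

# Sweeping thresholds and delays over the whole database would yield the
# predicted alarms-per-patient-day surface used to tune settings.
```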

  16. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department that has occurred during the last 5 years, resulted in a reliable, high performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDaV endpoints. Besides these services, an Oracle database facility is in production characterized by an effective level of parallelism, redundancy and availability. This facility is running databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook to forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  17. Design and implementation of an audit trail in compliance with US regulations.

    PubMed

    Jiang, Keyuan; Cao, Xiang

    2011-10-01

    Audit trails have been used widely to ensure quality of study data and have been implemented in computerized clinical trials data systems. Increasingly, there is a need to audit access to study participant identifiable information to provide assurance that study participant privacy is protected and confidentiality is maintained. Our objective was to describe the development and implementation of a comprehensive audit trail system that meets the regulatory requirements of assuring data quality and integrity and protecting participant privacy, and that is also easy to implement and maintain. The audit trail system was designed and developed after we examined regulatory requirements, data access methods, prevailing application architecture, and good security practices. Our comprehensive audit trail system was developed and implemented at the database level using a commercially available database management software product. It captures both data access and data changes with the correct user identifier. Documentation of access is initiated automatically in response to either data retrieval or data change at the database level. Currently, our system has been implemented only on one commercial database management system. Although our audit trail algorithm does not allow for logging aggregate operations, aggregation does not reveal sensitive private participant information. Careful consideration must be given to data items selected for monitoring because selection of all data items using our system can dramatically increase the requirements for computer disk space. Evaluating the criticality and sensitivity of individual data items selected can control the storage requirements for clinical trial audit trail records. Our audit trail system is capable of logging data access and data change operations to satisfy regulatory requirements. Our approach is applicable to virtually any data that can be stored in a relational database.
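
    The paper's system was built on a commercial DBMS and also logs read access, which ordinary triggers cannot capture; as a toy illustration of the data-change half only, the following uses a SQLite trigger with an invented schema.

```python
# Database-level change auditing with a trigger (SQLite, toy schema).
# Read-access logging, which the paper also covers, needs DBMS-specific
# features and is not shown here.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE subject (id INTEGER PRIMARY KEY, weight_kg REAL);
CREATE TABLE audit_log (
    ts      TEXT DEFAULT CURRENT_TIMESTAMP,
    action  TEXT,
    row_id  INTEGER,
    old_val REAL,
    new_val REAL);
CREATE TRIGGER subject_update AFTER UPDATE OF weight_kg ON subject
BEGIN
    INSERT INTO audit_log (action, row_id, old_val, new_val)
    VALUES ('UPDATE', OLD.id, OLD.weight_kg, NEW.weight_kg);
END;
""")

con.execute("INSERT INTO subject VALUES (1, 71.2)")
con.execute("UPDATE subject SET weight_kg = 70.4 WHERE id = 1")
print(con.execute("SELECT * FROM audit_log").fetchall())
```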

  18. Relationship mapping

    NASA Astrophysics Data System (ADS)

    Benachenhou, D.

    2009-04-01

    Information-technology departments in large enterprises spend 40% of their budgets on information integration: combining information from different data sources into a coherent form. IDC, a market-intelligence firm, estimates that the market for data integration and access software (which includes the key enabling technology for information integration) was about $2.5 billion in 2007, and is expected to grow to $3.8 billion in 2012. This is only the cost estimate for structured, or traditional, database information integration. Just imagine the market for transforming text into structured information and subsequent fusion with traditional databases.

  19. The role of non-technical skills in surgery

    PubMed Central

    Agha, Riaz A.; Fowler, Alexander J.; Sevdalis, Nick

    2015-01-01

    Non-technical skills are of increasing importance in surgery and surgical training. A traditional focus on technical skills acquisition and competence is no longer enough for the delivery of a modern, safe surgical practice. This review discusses the importance of non-technical skills and the values that underpin successful modern surgical practice. This narrative review used a number of sources, both written and online; there was no specific search strategy or defined set of databases. Modern surgical practice requires: technical and non-technical skills, evidence-based practice, an emphasis on lifelong learning, monitoring of outcomes, and a supportive institutional and health service framework. Finally, these requirements need to be combined with a number of personal and professional values including integrity, professionalism and compassionate, patient-centred care. PMID:26904193

  20. A Random Forest-based ensemble method for activity recognition.

    PubMed

    Feng, Zengtao; Mo, Lingfei; Li, Meng

    2015-01-01

    This paper presents a multi-sensor ensemble approach to human physical activity (PA) recognition using random forests. We designed an ensemble learning algorithm which integrates several independent Random Forest classifiers based on different sensor feature sets to build a more stable, more accurate and faster classifier for human activity recognition. To evaluate the algorithm, PA data collected from PAMAP (Physical Activity Monitoring for Aging People), a standard, publicly available database, were utilized for training and testing. The experimental results show that the algorithm is able to correctly recognize 19 PA types with an accuracy of 93.44%, while training is faster than that of competing methods. The ensemble classifier system based on the RF (Random Forest) algorithm can achieve high recognition accuracy and fast calculation.
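
    A hedged sketch of the ensemble idea, one Random Forest per sensor feature set combined by averaging class probabilities, is shown below; the feature-set split, data and hyperparameters are illustrative, not the paper's configuration.

```python
# One Random Forest per sensor feature set, combined by soft voting.
# Data, feature-set split and hyperparameters are invented placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 9))        # stand-in for PAMAP-style features
y = rng.integers(0, 3, size=300)     # stand-in activity labels
feature_sets = [slice(0, 3), slice(3, 6), slice(6, 9)]  # e.g. 3 IMU locations

forests = [RandomForestClassifier(n_estimators=50, random_state=0)
           .fit(X[:, cols], y) for cols in feature_sets]


def predict(x_row: np.ndarray) -> int:
    """Average class probabilities across the per-sensor forests."""
    probs = [f.predict_proba(x_row[cols].reshape(1, -1))[0]
             for f, cols in zip(forests, feature_sets)]
    return int(np.argmax(np.mean(probs, axis=0)))

print(predict(X[0]))
```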

  1. Research and realization of key technology in HILS interactive system

    NASA Astrophysics Data System (ADS)

    Liu, Che; Lu, Huiming; Wang, Fankai

    2018-03-01

    This paper presents the design of a HILS (Hardware In the Loop Simulation) interactive system based on the xPC platform. Through an interface between C++ and the MATLAB engine, the system establishes a seamless data connection with Simulink, handles data exchange between the interactive system and Simulink, and implements model configuration, parameter modification and offline simulation. Data communication between the host and target machines is established over TCP/IP to support model download and real-time simulation. A database stores the simulation data, enabling real-time simulation monitoring and simulation data management, and system functions are integrated using the Qt graphical interface library and dynamic link libraries. Finally, a typical control system is used as an example to verify the feasibility of the HILS interactive system.

  2. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2017-06-01

    [Report documentation page; recoverable metadata: Award Number W81XWH-13-1-0095; reporting period 08 MAR 2016 – 07 MAR 2017; title: Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI; Psychological Health research program.]

  3. EPA Facility Registry Service (FRS): PCS_NPDES

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Permit Compliance System (PCS) or the National Pollutant Discharge Elimination System (NPDES) module of the Integrated Compliance Information System (ICIS). PCS tracks NPDES surface water permits issued under the Clean Water Act. This system is being incrementally replaced by the NPDES module of ICIS. Under NPDES, all facilities that discharge pollutants from any point source into waters of the United States are required to obtain a permit. The permit will likely contain limits on what can be discharged, impose monitoring and reporting requirements, and include other provisions to ensure that the discharge does not adversely affect water quality. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using rigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to NPDES facilities once the PCS or ICIS-NPDES data has been integrated into the FRS database. Additional information on FRS is available

  4. Socio-economic impacts of major floods in Italy from 1951 to 2003

    NASA Astrophysics Data System (ADS)

    Lastoria, B.; Simonetti, M. R.; Casaioli, M.; Mariani, S.; Monacelli, G.

    2006-03-01

    Meteorological and hydrological monitoring and modeling, with particular regard to extreme hydrological events, are important activities carried out by the Hydrological and Inland Waters Service of the Italian Agency for Environmental Protection and Technical Services (APAT). Recently, a study on the socio-economic effects of floods was published in the Italian Environmental Data Yearbook by APAT. It is based on processed data related to the major floods (i.e., events with at least one casualty or economic damages greater than 0.001% of the Gross Domestic Product) that struck Italy between 1951 and 2003. Information was gathered from technical reports and/or databases belonging to APAT, the Italian Regional Environmental Agencies (ARPAs), central and local authorities, research institutions and newspaper reports. These data are collected in tables reporting the number of flood events and casualties and the amount of financial resources required for environmental restoration and/or risk mitigation. For 2003, when APAT began systematic monitoring of flood events in Italy, data concerning rainfall, the number of persons involved, evacuations and the urgent measures introduced to face each event (laws and acts) are also included. In this way, it was possible to build a new database collecting flood events that caused the declaration of a state of emergency. Because of the difficulty of finding sufficiently reliable data for the period before World War II, the collection of historical data starts from 1951. During this period, about 50% of the flood events examined caused at least 5 victims each, and about 10% more than 100; these figures highlight the considerable social impact of flood events and suggest the importance of creating an integrated database collecting information about flood events across all of Europe. These two databases (the historical and updating archives) could be useful for assessing changing anthropic impacts over time and the real effectiveness of protection measures already implemented, and could represent a valid reference for further interventions.

  5. Integrating forensic information in a crime intelligence database.

    PubMed

    Rossy, Quentin; Ioset, Sylvain; Dessimoz, Damien; Ribaux, Olivier

    2013-07-10

    Since 2008, intelligence units of six states of the western part of Switzerland have been sharing a common database for the analysis of high-volume crimes. On a daily basis, events reported to the police are analysed, filtered and classified to detect crime repetitions and interpret the crime environment. Several forensic outcomes are integrated in the system, such as matches of traces with persons, and links between scenes detected by the comparison of forensic case data. Systematic procedures have been settled to integrate links assumed mainly through DNA profiles, shoemark patterns and images. A statistical overview of a retrospective dataset of series from 2009 to 2011 in the database shows, for instance, the number of repetitions detected or confirmed and augmented by forensic case data. The time needed to obtain forensic intelligence, in relation to the type of marks treated, is seen as a critical issue. Furthermore, the underlying integration of forensic intelligence into the crime intelligence database raised several difficulties regarding the acquisition of data and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis of further research aimed at developing forensic intelligence models. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  6. An affinity-structure database of helix-turn-helix: DNA complexes with a universal coordinate system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AlQuraishi, Mohammed; Tang, Shengdong; Xia, Xide

    Molecular interactions between proteins and DNA molecules underlie many cellular processes, including transcriptional regulation, chromosome replication, and nucleosome positioning. Computational analyses of protein-DNA interactions rely on experimental data characterizing known protein-DNA interactions structurally and biochemically. While many databases exist that contain either structural or biochemical data, few integrate these two data sources in a unified fashion. Such integration is becoming increasingly critical with the rapid growth of structural and biochemical data, and the emergence of algorithms that rely on the synthesis of multiple data types to derive computational models of molecular interactions. We have developed an integrated affinity-structure database in which the experimental and quantitative DNA binding affinities of helix-turn-helix proteins are mapped onto the crystal structures of the corresponding protein-DNA complexes. This database provides access to: (i) protein-DNA structures, (ii) quantitative summaries of protein-DNA binding affinities using position weight matrices, and (iii) raw experimental data of protein-DNA binding instances. Critically, this database establishes a correspondence between experimental structural data and quantitative binding affinity data at the single basepair level. Furthermore, we present a novel alignment algorithm that structurally aligns the protein-DNA complexes in the database and creates a unified residue-level coordinate system for comparing the physico-chemical environments at the interface between complexes. Using this unified coordinate system, we compute the statistics of atomic interactions at the protein-DNA interface of helix-turn-helix proteins. We provide an interactive website for visualization, querying, and analyzing this database, and a downloadable version to facilitate programmatic analysis. Lastly, this database will facilitate the analysis of protein-DNA interactions and the development of programmatic computational methods that capitalize on integration of structural and biochemical datasets. The database can be accessed at http://ProteinDNA.hms.harvard.edu.
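
    As a short worked example of the position weight matrix summaries such a database exposes, the sketch below scores a candidate site as a sum of per-position log-odds; the 4 x 4 matrix is invented for illustration.

```python
# Scoring a DNA site against a position weight matrix (PWM): the score
# is the sum of per-position log-odds. The matrix values are invented.
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}
pwm = np.array([   # rows: positions, columns: A C G T (log-odds scores)
    [ 1.2, -0.8, -0.5, -1.1],
    [-1.0,  1.4, -0.9, -0.7],
    [-0.6, -1.2,  1.1, -0.4],
    [-0.9, -0.5, -1.3,  1.5],
])


def pwm_score(site: str) -> float:
    """Sum the log-odds of each base at its position."""
    return float(sum(pwm[i, BASES[b]] for i, b in enumerate(site)))

print(pwm_score("ACGT"))   # 1.2 + 1.4 + 1.1 + 1.5 = 5.2
```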

  7. An affinity-structure database of helix-turn-helix: DNA complexes with a universal coordinate system

    DOE PAGES

    AlQuraishi, Mohammed; Tang, Shengdong; Xia, Xide

    2015-11-19

    Molecular interactions between proteins and DNA molecules underlie many cellular processes, including transcriptional regulation, chromosome replication, and nucleosome positioning. Computational analyses of protein-DNA interactions rely on experimental data characterizing known protein-DNA interactions structurally and biochemically. While many databases exist that contain either structural or biochemical data, few integrate these two data sources in a unified fashion. Such integration is becoming increasingly critical with the rapid growth of structural and biochemical data, and the emergence of algorithms that rely on the synthesis of multiple data types to derive computational models of molecular interactions. We have developed an integrated affinity-structure database in which the experimental and quantitative DNA binding affinities of helix-turn-helix proteins are mapped onto the crystal structures of the corresponding protein-DNA complexes. This database provides access to: (i) protein-DNA structures, (ii) quantitative summaries of protein-DNA binding affinities using position weight matrices, and (iii) raw experimental data of protein-DNA binding instances. Critically, this database establishes a correspondence between experimental structural data and quantitative binding affinity data at the single basepair level. Furthermore, we present a novel alignment algorithm that structurally aligns the protein-DNA complexes in the database and creates a unified residue-level coordinate system for comparing the physico-chemical environments at the interface between complexes. Using this unified coordinate system, we compute the statistics of atomic interactions at the protein-DNA interface of helix-turn-helix proteins. We provide an interactive website for visualization, querying, and analyzing this database, and a downloadable version to facilitate programmatic analysis. Lastly, this database will facilitate the analysis of protein-DNA interactions and the development of programmatic computational methods that capitalize on integration of structural and biochemical datasets. The database can be accessed at http://ProteinDNA.hms.harvard.edu.

  8. Lessons learned while building the Deepwater Horizon Database: Toward improved data sharing in coastal science

    NASA Astrophysics Data System (ADS)

    Thessen, Anne E.; McGinnis, Sean; North, Elizabeth W.

    2016-02-01

    Process studies and coupled-model validation efforts in geosciences often require integration of multiple data types across time and space. For example, improved prediction of hydrocarbon fate and transport is an important societal need which fundamentally relies upon synthesis of oceanography and hydrocarbon chemistry. Yet, there are no publicly accessible databases which integrate these diverse data types in a georeferenced format, nor are there guidelines for developing such a database. The objective of this research was to analyze the process of building one such database to provide baseline information on data sources and data sharing and to document the challenges and solutions that arose during this major undertaking. The resulting Deepwater Horizon Database was approximately 2.4 GB in size and contained over 8 million georeferenced data points collected from industry, government databases, volunteer networks, and individual researchers. The major technical challenge that was overcome was the reconciliation of terms, units, and quality flags, which was necessary to effectively integrate the disparate data sets. Assembling this database required the development of relationships with individual researchers and data managers, which often involved extensive e-mail contact. The average number of e-mails exchanged per data set was 7.8. Of the 95 relevant data sets that were discovered, 38 (40%) were obtained, either in whole or in part. Over one third (36%) of the requests for data went unanswered. The majority of responses were received after the first request (64%) and within the first week of the first request (67%). Although fewer than half of the potentially relevant datasets were incorporated into the database, the level of sharing (40%) was high compared to some other disciplines where sharing can be as low as 10%. Our suggestions for building integrated databases include budgeting significant time for e-mail exchanges, being cognizant of the cost versus benefits of pursuing reticent data providers, and building trust through clear, respectful communication and with flexible and appropriate attributions.

  9. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, use of automatic information extraction tools, web technology and databases.

  10. The Relationship between Treatment Integrity and Acceptability of Reading Interventions for Children with Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Mautone, Jennifer A.; DuPaul, George J.; Jitendra, Asha K.; Tresco, Katy E.; Junod, Rosemary Vile; Volpe, Robert J.

    2009-01-01

    This study examined the relationship between treatment integrity and acceptability for reading interventions across two consultation models, intensive data-based academic intervention (IDAI) and traditional data-based academic intervention (TDAI). Participants included 83 first- through fourth-grade students who met research criteria for…

  11. The Problem with the Delta Cost Project Database

    ERIC Educational Resources Information Center

    Jaquette, Ozan; Parra, Edna

    2016-01-01

    The Integrated Postsecondary Education Data System (IPEDS) collects data on Title IV institutions. The Delta Cost Project (DCP) integrated data from multiple IPEDS survey components into a public-use longitudinal dataset. The DCP Database was the basis for dozens of journal articles and a series of influential policy reports. Unfortunately, a flaw in…

  12. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 4 : web-based bridge information database--visualization analytics and distributed sensing.

    DOT National Transportation Integrated Search

    2012-03-01

    This report introduces the design and implementation of a Web-based bridge information visual analytics system. This project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result combines a GIS ...

  13. Wearable Performance Devices in Sports Medicine.

    PubMed

    Li, Ryan T; Kling, Scott R; Salata, Michael J; Cupp, Sean A; Sheehan, Joseph; Voos, James E

    2016-01-01

    Wearable performance devices and sensors are becoming more readily available to the general population and athletic teams. Advances in technology have allowed individual endurance athletes, sports teams, and physicians to monitor functional movements, workloads, and biometric markers to maximize performance and minimize injury. Movement sensors include pedometers, accelerometers/gyroscopes, and global positioning satellite (GPS) devices. Physiologic sensors include heart rate monitors, sleep monitors, temperature sensors, and integrated sensors. The purpose of this review is to familiarize health care professionals and team physicians with the various available types of wearable sensors, discuss their current utilization, and present future applications in sports medicine. Data were obtained from peer-reviewed literature through a search of the PubMed database. Included studies searched development, outcomes, and validation of wearable performance devices such as GPS, accelerometers, and physiologic monitors in sports. Study design: clinical review. Level of evidence: Level 4. Wearable sensors provide a method of monitoring real-time physiologic and movement parameters during training and competitive sports. These parameters can be used to detect position-specific patterns in movement, design more efficient sports-specific training programs for performance optimization, and screen for potential causes of injury. More recent advances in movement sensors have improved accuracy in detecting high-acceleration movements during competitive sports. Wearable devices are valuable instruments for the improvement of sports performance. Evidence for use of these devices in professional sports is still limited. Future developments are needed to establish training protocols using data from wearable devices. © 2015 The Author(s).

  14. External branch of the superior laryngeal nerve monitoring during thyroid and parathyroid surgery: International Neural Monitoring Study Group standards guideline statement.

    PubMed

    Barczyński, Marcin; Randolph, Gregory W; Cernea, Claudio R; Dralle, Henning; Dionigi, Gianlorenzo; Alesina, Piero F; Mihai, Radu; Finck, Camille; Lombardi, Davide; Hartl, Dana M; Miyauchi, Akira; Serpell, Jonathan; Snyder, Samuel; Volpi, Erivelto; Woodson, Gayle; Kraimps, Jean Louis; Hisham, Abdullah N

    2013-09-01

    Intraoperative neural monitoring (IONM) during thyroid surgery has gained widespread acceptance as an adjunct to the gold standard of visual identification of the recurrent laryngeal nerve (RLN). Contrary to routine dissection of the RLN, most surgeons tend to avoid rather than routinely expose and identify the external branch of the superior laryngeal nerve (EBSLN) during thyroidectomy or parathyroidectomy. IONM has the potential to be utilized for identification of the EBSLN and functional assessment of its integrity; therefore, IONM might contribute to voice preservation following thyroidectomy or parathyroidectomy. We reviewed the literature and the cumulative experience of the multidisciplinary International Neural Monitoring Study Group (INMSG) with IONM of the EBSLN. A systematic search of the MEDLINE database (from 1950 to the present) with predefined search terms (EBSLN, superior laryngeal nerve, stimulation, neuromonitoring, identification) was undertaken and supplemented by personal communication between members of the INMSG to identify relevant publications in the field. The hypothesis explored in this review is that the use of a standardized approach to the functional preservation of the EBSLN can be facilitated by application of IONM resulting in improved preservation of voice following thyroidectomy or parathyroidectomy. These guidelines are intended to improve the practice of neural monitoring of the EBSLN during thyroidectomy or parathyroidectomy and to optimize clinical utility of this technique based on available evidence and consensus of experts. Level of evidence: 5. Copyright © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  15. Wearable Performance Devices in Sports Medicine

    PubMed Central

    Li, Ryan T.; Kling, Scott R.; Salata, Michael J.; Cupp, Sean A.; Sheehan, Joseph; Voos, James E.

    2016-01-01

    Context: Wearable performance devices and sensors are becoming more readily available to the general population and athletic teams. Advances in technology have allowed individual endurance athletes, sports teams, and physicians to monitor functional movements, workloads, and biometric markers to maximize performance and minimize injury. Movement sensors include pedometers, accelerometers/gyroscopes, and global positioning satellite (GPS) devices. Physiologic sensors include heart rate monitors, sleep monitors, temperature sensors, and integrated sensors. The purpose of this review is to familiarize health care professionals and team physicians with the various available types of wearable sensors, discuss their current utilization, and present future applications in sports medicine. Evidence Acquisition: Data were obtained from peer-reviewed literature through a search of the PubMed database. Included studies searched development, outcomes, and validation of wearable performance devices such as GPS, accelerometers, and physiologic monitors in sports. Study Design: Clinical review. Level of Evidence: Level 4. Results: Wearable sensors provide a method of monitoring real-time physiologic and movement parameters during training and competitive sports. These parameters can be used to detect position-specific patterns in movement, design more efficient sports-specific training programs for performance optimization, and screen for potential causes of injury. More recent advances in movement sensors have improved accuracy in detecting high-acceleration movements during competitive sports. Conclusion: Wearable devices are valuable instruments for the improvement of sports performance. Evidence for use of these devices in professional sports is still limited. Future developments are needed to establish training protocols using data from wearable devices. PMID:26733594

  16. Atlas - a data warehouse for integrative bioinformatics.

    PubMed

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire M S; Ling, John; Ouellette, B F Francis

    2005-02-21

    We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/
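
    To make the warehouse idea above concrete, a minimal sketch in Python (the record's own APIs are in C++, Java, and Perl) of the two-level integration described: common relational models plus a thin toolbox-style retrieval function. The table and column names are hypothetical simplifications, and sqlite3 stands in for the production RDBMS.

```python
import sqlite3

# Hypothetical, simplified slice of a common data model: genes and
# interactions from different source databases share one schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE gene (id INTEGER PRIMARY KEY, symbol TEXT, source TEXT);
CREATE TABLE interaction (gene_a INTEGER, gene_b INTEGER, source TEXT,
    FOREIGN KEY (gene_a) REFERENCES gene(id),
    FOREIGN KEY (gene_b) REFERENCES gene(id));
""")
conn.executemany("INSERT INTO gene VALUES (?, ?, ?)",
                 [(1, "TP53", "RefSeq"), (2, "MDM2", "RefSeq")])
conn.execute("INSERT INTO interaction VALUES (1, 2, 'BIND')")

def interactions_for(symbol: str):
    """Toolbox-style retrieval: partners of a gene across all loaded sources."""
    return conn.execute("""
        SELECT g2.symbol, i.source
        FROM gene g1
        JOIN interaction i ON i.gene_a = g1.id
        JOIN gene g2 ON g2.id = i.gene_b
        WHERE g1.symbol = ?""", (symbol,)).fetchall()

print(interactions_for("TP53"))  # [('MDM2', 'BIND')]
```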

  17. Atlas – a data warehouse for integrative bioinformatics

    PubMed Central

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire MS; Ling, John; Ouellette, BF Francis

    2005-01-01

    Background We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. Description The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. Conclusion The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/ PMID:15723693

  18. MEDCIS: Multi-Modality Epilepsy Data Capture and Integration System

    PubMed Central

    Zhang, Guo-Qiang; Cui, Licong; Lhatoo, Samden; Schuele, Stephan U.; Sahoo, Satya S.

    2014-01-01

    Sudden Unexpected Death in Epilepsy (SUDEP) is the leading mode of epilepsy-related death and is most common in patients with intractable, frequent, and continuing seizures. A statistically significant cohort of patients for SUDEP study requires meticulous, prospective follow-up of a large population that is at an elevated risk, best represented by the Epilepsy Monitoring Unit (EMU) patient population. Multiple EMUs need to collaborate and share data to build a larger cohort of potential SUDEP patients using a state-of-the-art informatics infrastructure. To address the challenges of data integration and data access from multiple EMUs, we developed the Multi-Modality Epilepsy Data Capture and Integration System (MEDCIS) that combines retrospective clinical free-text processing using NLP, prospective structured data capture using an ontology-driven interface, and interfaces for cohort search and signal visualization, all in a single integrated environment. A dedicated Epilepsy and Seizure Ontology (EpSO) has been used to streamline the user interfaces, enhance usability, and enable mappings across distributed databases so that federated queries can be executed. MEDCIS contained 936 patient data sets from the EMUs of University Hospitals Case Medical Center (UH CMC) in Cleveland and Northwestern Memorial Hospital (NMH) in Chicago. Patients from UH CMC and NMH were stored in different databases and then federated through MEDCIS using EpSO and our mapping module. More than 77 GB of multi-modal signal data were processed using the Cloudwave pipeline and made available for rendering through the web interface. About 74% of the 40 open clinical questions of interest were answerable accurately using the EpSO-driven VISual AGgregator and Explorer (VISAGE) interface. Questions not directly answerable were due either to their inherent computational complexity, the unavailability of primary information, or the scope of the concepts formulated in the existing EpSO terminology system. PMID:25954436
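
    A minimal sketch of the ontology-mediated federation this record describes: each site keeps its own column names, a mapping module translates them to shared ontology terms, and a federated query merges the results. All column names, concept identifiers, and records below are hypothetical, not taken from EpSO or MEDCIS.

```python
# Per-site mappings from local column names to shared ontology concepts.
SITE_MAPPINGS = {
    "UH_CMC": {"sz_type": "epso:SeizureType", "onset": "epso:AgeOfOnset"},
    "NMH":    {"seizure_class": "epso:SeizureType", "age_onset": "epso:AgeOfOnset"},
}

# Stand-ins for the two physically separate site databases.
SITE_DATA = {
    "UH_CMC": [{"sz_type": "focal", "onset": 12}],
    "NMH":    [{"seizure_class": "generalized", "age_onset": 30}],
}

def federated_query(concept: str):
    """Return (site, value) pairs for one ontology concept across all sites."""
    results = []
    for site, mapping in SITE_MAPPINGS.items():
        # Invert the local-column -> concept mapping for this site.
        local_cols = [col for col, term in mapping.items() if term == concept]
        for record in SITE_DATA[site]:
            results.extend((site, record[col]) for col in local_cols)
    return results

print(federated_query("epso:SeizureType"))
# [('UH_CMC', 'focal'), ('NMH', 'generalized')]
```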

  19. An experimental system for flood risk forecasting at global scale

    NASA Astrophysics Data System (ADS)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude and the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
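
    A toy numerical sketch (not from GloFAS) of the risk step described above: combine a forecast flood-depth grid with an exposure grid to estimate affected population and expected damage. The grids, depth threshold, asset value, and depth-damage curve are all illustrative assumptions.

```python
import numpy as np

depth = np.array([[0.0, 0.2, 1.5],
                  [0.4, 2.0, 0.0],
                  [0.0, 0.8, 3.1]])       # forecast water depth (m)
population = np.array([[120,  80,  40],
                       [200, 150,  90],
                       [ 60, 110,  30]])  # residents per grid cell

flooded = depth > 0.5                     # hazard threshold (assumed)
affected_population = int(population[flooded].sum())

# Simple depth-damage curve: damage fraction saturates at 1.0 above 2 m.
damage_fraction = np.clip(depth / 2.0, 0.0, 1.0)
cell_value = 1e5                          # assumed asset value per cell
expected_damage = float((damage_fraction * cell_value)[flooded].sum())

print(affected_population)  # 330 people in cells above the threshold
print(expected_damage)      # 315000.0 in this toy example
```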

  20. OVERSEER: An Expert System Monitor for the Psychiatric Hospital

    PubMed Central

    Bronzino, Joseph D.; Morelli, Ralph A.; Goethe, John W.

    1988-01-01

    In order to improve patient care, comply with regulatory guidelines and decrease potential liability, psychiatric hospitals and clinics have been searching for computer systems to monitor the management and treatment of patients. This paper describes OVERSEER: a knowledge-based system that monitors the treatment of psychiatric patients in real time. Based on procedures and protocols developed in the psychiatric setting, OVERSEER monitors the clinical database and issues alerts when standard clinical practices are not followed or when laboratory results or other clinical indicators are abnormal. Written in PROLOG, OVERSEER is designed to interface directly with the hospital's database and thereby utilizes all available pharmacy and laboratory data. Moreover, unlike the interactive expert systems developed for the psychiatric clinic, OVERSEER does not require extensive data entry by the clinician. Consequently, the chief benefit of OVERSEER's monitoring approach is the unobtrusive manner in which it evaluates treatment and patient responses and provides information regarding patient management.
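
    A minimal Python sketch of the rule-based monitoring idea (OVERSEER itself is written in PROLOG); the rule texts, field names, and thresholds below are illustrative assumptions, not the hospital's actual protocols.

```python
# Each rule pairs an alert message with a predicate over a patient record.
ALERT_RULES = [
    ("lithium level above typical therapeutic range",
     lambda p: p.get("lithium_mEq_L", 0) > 1.2),   # threshold assumed
    ("antipsychotic prescribed without baseline labs",
     lambda p: p.get("on_antipsychotic") and not p.get("baseline_labs_done")),
]

def monitor(patient: dict) -> list:
    """Evaluate every rule against one patient record; collect alert texts."""
    return [message for message, rule in ALERT_RULES if rule(patient)]

record = {"name": "patient-001", "lithium_mEq_L": 1.6,
          "on_antipsychotic": True, "baseline_labs_done": False}
for alert in monitor(record):
    print(f"ALERT [{record['name']}]: {alert}")
```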

  1. MIPS PlantsDB: a database framework for comparative plant genome research.

    PubMed

    Nussbaumer, Thomas; Martis, Mihaela M; Roessner, Stephan K; Pfeifer, Matthias; Bader, Kai C; Sharma, Sapna; Gundlach, Heidrun; Spannagl, Manuel

    2013-01-01

    The rapidly increasing amount of plant genome (sequence) data enables powerful comparative analyses and integrative approaches and also requires structured and comprehensive information resources. Databases are needed for both model and crop plant organisms and both intuitive search/browse views and comparative genomics tools should communicate the data to researchers and help them interpret it. MIPS PlantsDB (http://mips.helmholtz-muenchen.de/plant/genomes.jsp) was initially described in NAR in 2007 [Spannagl,M., Noubibou,O., Haase,D., Yang,L., Gundlach,H., Hindemitt, T., Klee,K., Haberer,G., Schoof,H. and Mayer,K.F. (2007) MIPSPlantsDB-plant database resource for integrative and comparative plant genome research. Nucleic Acids Res., 35, D834-D840] and was set up from the start to provide data and information resources for individual plant species as well as a framework for integrative and comparative plant genome research. PlantsDB comprises database instances for tomato, Medicago, Arabidopsis, Brachypodium, Sorghum, maize, rice, barley and wheat. Building up on that, state-of-the-art comparative genomics tools such as CrowsNest are integrated to visualize and investigate syntenic relationships between monocot genomes. Results from novel genome analysis strategies targeting the complex and repetitive genomes of triticeae species (wheat and barley) are provided and cross-linked with model species. The MIPS Repeat Element Database (mips-REdat) and Catalog (mips-REcat) as well as tight connections to other databases, e.g. via web services, are further important components of PlantsDB.

  2. MIPS PlantsDB: a database framework for comparative plant genome research

    PubMed Central

    Nussbaumer, Thomas; Martis, Mihaela M.; Roessner, Stephan K.; Pfeifer, Matthias; Bader, Kai C.; Sharma, Sapna; Gundlach, Heidrun; Spannagl, Manuel

    2013-01-01

    The rapidly increasing amount of plant genome (sequence) data enables powerful comparative analyses and integrative approaches and also requires structured and comprehensive information resources. Databases are needed for both model and crop plant organisms and both intuitive search/browse views and comparative genomics tools should communicate the data to researchers and help them interpret it. MIPS PlantsDB (http://mips.helmholtz-muenchen.de/plant/genomes.jsp) was initially described in NAR in 2007 [Spannagl,M., Noubibou,O., Haase,D., Yang,L., Gundlach,H., Hindemitt, T., Klee,K., Haberer,G., Schoof,H. and Mayer,K.F. (2007) MIPSPlantsDB–plant database resource for integrative and comparative plant genome research. Nucleic Acids Res., 35, D834–D840] and was set up from the start to provide data and information resources for individual plant species as well as a framework for integrative and comparative plant genome research. PlantsDB comprises database instances for tomato, Medicago, Arabidopsis, Brachypodium, Sorghum, maize, rice, barley and wheat. Building up on that, state-of-the-art comparative genomics tools such as CrowsNest are integrated to visualize and investigate syntenic relationships between monocot genomes. Results from novel genome analysis strategies targeting the complex and repetitive genomes of triticeae species (wheat and barley) are provided and cross-linked with model species. The MIPS Repeat Element Database (mips-REdat) and Catalog (mips-REcat) as well as tight connections to other databases, e.g. via web services, are further important components of PlantsDB. PMID:23203886

  3. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
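
    A sketch of programmatic access to the Allen Cell Types Database using the allensdk package (pip install allensdk), in the spirit of the tools this record describes; exact method availability may vary across allensdk versions, so treat this as an assumption-laden example rather than the authors' code.

```python
from allensdk.core.cell_types_cache import CellTypesCache

# The cache downloads data on demand and records it in a local manifest.
ctc = CellTypesCache(manifest_file="cell_types/manifest.json")

cells = ctc.get_cells()                 # metadata for all cells
print(f"{len(cells)} cells available")

cell_id = cells[0]["id"]
data_set = ctc.get_ephys_data(cell_id)  # fetches the NWB file for one cell
sweep_numbers = data_set.get_sweep_numbers()
sweep = data_set.get_sweep(sweep_numbers[0])
print(sweep["sampling_rate"], len(sweep["response"]))
```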

  4. Flexible data registration and automation in semiconductor production

    NASA Astrophysics Data System (ADS)

    Dudde, Ralf; Staudt-Fischbach, Peter; Kraemer, Benedict

    1997-08-01

    The need for cost reduction and flexibility in semiconductor production will result in a wider application of computer-based automation systems. With the setup of a new and advanced CMOS semiconductor line at the Fraunhofer Institute for Silicon Technology [ISIT, Itzehoe (D)], a new line information system (LIS) was introduced based on an advanced model for the underlying data structure. This data model was implemented in an ORACLE RDBMS. A cellworks-based system (JOSIS) was used for the integration of the production equipment, communication, and automated database bookings and information retrievals. During the ramp-up of the production line, this new system is used for fab control. The data model and the cellworks-based system integration are explained. This system enables an on-line overview of the work in progress in the fab, lot order history, and equipment status and history. Based on these figures, improved production and cost monitoring and optimization are possible. First examples of the information gained by this system are presented. The modular set-up of the LIS will allow easy data exchange with additional software tools like schedulers, different fab control systems like PROMIS, and accounting systems like SAP. Modifications necessary for the integration of PROMIS are described.

  5. 49 CFR 384.229 - Skills test examiner auditing and monitoring.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... must be performed at least once every year; (c) Establish and maintain a database to track pass/fail... maintain a database of all third party testers and examiners, which at a minimum tracks the dates and... and maintain a database of all State CDL skills examiners, which at a minimum tracks the dates and...

  6. 49 CFR 384.229 - Skills test examiner auditing and monitoring.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... must be performed at least once every year; (c) Establish and maintain a database to track pass/fail... maintain a database of all third party testers and examiners, which at a minimum tracks the dates and... and maintain a database of all State CDL skills examiners, which at a minimum tracks the dates and...

  7. The integration of quantitative information with an intelligent decision support system for residential energy retrofits

    NASA Astrophysics Data System (ADS)

    Mo, Yunjeong

    The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.

  8. Sodium content of foods contributing to sodium intake: A comparison between selected foods from the CDC Packaged Food Database and the USDA National Nutrient Database for Standard Reference

    USDA-ARS?s Scientific Manuscript database

    The sodium concentration (mg/100g) for 23 of 125 Sentinel Foods were identified in the 2009 CDC Packaged Food Database (PFD) and compared with data in the USDA’s 2013 Standard Reference 26 (SR 26) database. Sentinel Foods are foods and beverages identified by USDA to be monitored as primary indicat...

  9. Development of SRS.php, a Simple Object Access Protocol-based library for data acquisition from integrated biological databases.

    PubMed

    Barbosa-Silva, A; Pafilis, E; Ortega, J M; Schneider, R

    2007-12-11

    Data integration has become an important task for biological database providers. The current model for data exchange among different sources simplifies the manner in which distinct information is accessed by users. The evolution of data representation from HTML to XML enabled programs, instead of humans, to interact with biological databases. We present here SRS.php, a PHP library that can interact with the data integration Sequence Retrieval System (SRS). The library has been written using SOAP definitions and permits programmatic communication with the SRS through web services. Interactions are performed by invoking the methods described in the WSDL and exchanging XML messages. The functions currently available in the library have been built to access specific data stored in any of 90 different databases (such as UNIPROT, KEGG and GO) using the same query syntax. Including the described functions in PHP scripts enables them to act as web-service clients to the SRS server. The functions permit one to query the whole content of any SRS database, to list specific records in these databases, to get specific fields from the records, and to link any record between any pair of linked databases. The case study presented exemplifies the use of the library to retrieve information from registries of a Plant Defense Mechanisms database. The Plant Defense Mechanisms database is currently under development, and SRS.php is proposed to enable data acquisition for the warehousing tasks related to its setup and maintenance.
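
    A rough Python analogue of what SRS.php does on the PHP side: a SOAP client built from a WSDL, calling the methods the service describes. The WSDL URL and method name below are placeholders, not the real SRS endpoint; the zeep package (pip install zeep) generates the client from the WSDL at runtime.

```python
from zeep import Client

# Hypothetical WSDL location; a real deployment would publish its own.
client = Client("https://example.org/srs/service?wsdl")

# Method names come from the WSDL; 'getEntry' here is illustrative only.
result = client.service.getEntry(database="UNIPROT", accession="P01308")
print(result)
```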

  10. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.

    2004-05-12

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. Main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored (except raw X-ray data that are stored in a central data server) in a MySQL relational database. The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystal and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using the secure SSL connection using secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments.
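
    A minimal sketch of the socket-based control idea described above: a client sends a command to a server over TCP using a simple length-prefixed JSON framing. The framing, host name, port, and command vocabulary are assumptions for illustration; the actual system uses its own predefined behaviors over secure UNIX sockets.

```python
import json
import socket
import struct

def send_command(host: str, port: int, command: dict) -> dict:
    """Send one framed JSON command and read one framed JSON reply."""
    payload = json.dumps(command).encode()
    with socket.create_connection((host, port)) as sock:
        # 4-byte big-endian length prefix, then the JSON payload.
        sock.sendall(struct.pack("!I", len(payload)) + payload)
        (length,) = struct.unpack("!I", sock.recv(4))
        data = b""
        while len(data) < length:
            data += sock.recv(length - len(data))
    return json.loads(data)

# Example call (requires a matching server; names are hypothetical):
# reply = send_command("beamline-host", 9000, {"action": "start_acquisition"})
```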

  11. Integration Of An MR Image Network Into A Clinical PACS

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Mankovich, Nicholas J.; Taira, Ricky K.; Cho, Paul S.; Huang, H. K.

    1988-06-01

    A direct link between a clinical pediatric PACS module and a FONAR MRI image network was implemented. The original MR network combines the MR scanner, a remote viewing station and a central archiving station. The pediatric PACS directly connects to the archiving unit through an Ethernet TCP-IP network adhering to FONAR's protocol. The PACS communication software developed supports the transfer of patient studies and patient information directly from the MR archive database to the pediatric PACS. In the first phase of our project we developed a package to transfer data between a VAX-11/750 and the IBM PC/AT-based MR archive database through the Ethernet network. This system served as a model for PACS-to-modality network communication. Once testing was complete on this research network, the software and network hardware were moved to the clinical pediatric VAX for full PACS integration. In parallel to the direct transmission of digital images to the pediatric PACS, a broadband communication system in video format was developed for real-time broadcasting of images originating from the MR console to 8 remote viewing stations distributed in the radiology department. These analog viewing stations allow the radiologists to directly monitor patient positioning and to select the scan levels during a patient examination from remote locations in the radiology department. This paper reports (1) the technical details of this implementation, (2) the merits of this network development scheme, and (3) the performance statistics of the network-to-PACS interface.

  12. Influenza Research Database: an integrated bioinformatics resource for influenza research and surveillance

    PubMed Central

    Squires, R. Burke; Noronha, Jyothi; Hunt, Victoria; García‐Sastre, Adolfo; Macken, Catherine; Baumgarth, Nicole; Suarez, David; Pickett, Brett E.; Zhang, Yun; Larsen, Christopher N.; Ramsey, Alvin; Zhou, Liwei; Zaremba, Sam; Kumar, Sanjeev; Deitrich, Jon; Klem, Edward; Scheuermann, Richard H.

    2012-01-01

    Background  The recent emergence of the 2009 pandemic influenza A/H1N1 virus has highlighted the value of free and open access to influenza virus genome sequence data integrated with information about other important virus characteristics. Design  The Influenza Research Database (IRD, http://www.fludb.org) is a free, open, publicly accessible resource funded by the U.S. National Institute of Allergy and Infectious Diseases through the Bioinformatics Resource Centers program. IRD provides a comprehensive, integrated database and analysis resource for influenza sequence, surveillance, and research data, including user-friendly interfaces for data retrieval, visualization and comparative genomics analysis, together with personal login-protected 'workbench' spaces for saving data sets and analysis results. IRD integrates genomic, proteomic, immune epitope, and surveillance data from a variety of sources, including public databases, computational algorithms, external research groups, and the scientific literature. Results  To demonstrate the utility of the data and analysis tools available in IRD, two scientific use cases are presented. A comparison of hemagglutinin sequence conservation and epitope coverage information revealed highly conserved protein regions that can be recognized by the human adaptive immune system as possible targets for inducing cross-protective immunity. Phylogenetic and geospatial analysis of sequences from wild bird surveillance samples revealed a possible evolutionary connection between influenza virus from Delaware Bay shorebirds and Alberta ducks. Conclusions  The IRD provides a wealth of integrated data and information about influenza virus to support research of the genetic determinants dictating virus pathogenicity, host range restriction and transmission, and to facilitate development of vaccines, diagnostics, and therapeutics. PMID:22260278

  13. Use of a Relational Database to Support Clinical Research: Application in a Diabetes Program

    PubMed Central

    Lomatch, Diane; Truax, Terry; Savage, Peter

    1981-01-01

    A database has been established to support conduct of clinical research and monitor delivery of medical care for 1200 diabetic patients as part of the Michigan Diabetes Research and Training Center (MDRTC). Use of an intelligent microcomputer to enter and retrieve the data and use of a relational database management system (DBMS) to store and manage data have provided a flexible, efficient method of achieving both support of small projects and monitoring overall activity of the Diabetes Center Unit (DCU). Simplicity of access to data, efficiency in providing data for unanticipated requests, ease of manipulation of relations, security and “logical data independence” were important factors in choosing a relational DBMS. The ability to interface with an interactive statistical program and a graphics program is a major advantage of this system. Our database currently provides support for the operation and analysis of several ongoing research projects.

  14. Enabling heterogenous multi-scale database for emergency service functions through geoinformation technologies

    NASA Astrophysics Data System (ADS)

    Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet-based systems. Internet GIS is emerging as one of the most promising technologies for addressing Emergency Management. Web services with different privileges play an important role in disseminating emergency services to decision makers. A spatial database is one of the most important components in the successful implementation of Emergency Management. It contains spatial data in raster and vector form, linked with non-spatial information. Comprehensive data are required to handle emergency situations in their different phases. These database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core datasets are the minimum required data, including base, thematic and infrastructure layers, needed to handle disasters. Disaster-specific information is required to handle particular situations such as floods, cyclones, forest fires, earthquakes, landslides and droughts. In addition, Emergency Management requires many types of data with spatial and temporal attributes that must be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of the required resolution for visualisation and analysis in disaster management. The database must therefore be interconnected and comprehensive to meet the requirements of Emergency Management. This kind of integrated, comprehensive and structured database is required to deliver the right information at the right time to the right people. However, building a spatial database for Emergency Management is a challenging task because of key issues such as data availability, sharing policies, compatible geospatial standards and data interoperability. Therefore, to facilitate using, sharing and integrating the spatial data, there is a need to define standards for building emergency database systems. These cover aspects such as (i) data integration procedures, namely a standard coding scheme, schema, metadata format and spatial format; (ii) database organisation mechanisms covering data management, catalogues and data models; and (iii) database dissemination through a suitable environment, as a standard service for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising multi-scale, multi-source data and for enabling effective emergency response through customized user interfaces for NDEM. It presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.

  15. Integrative medicine for managing the symptoms of lupus nephritis

    PubMed Central

    Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo

    2018-01-01

    Abstract Background: Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have been performed on the application of integrative medicine for lupus nephritis in patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. Methods and analyses: The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (KoreaMed, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KMbase], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. Dissemination: This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. Trial registration number: PROSPERO 2018 CRD42018085205 PMID:29595669

  16. Implementation of the Geological Hazard Monitoring and Early Warning System Based on Multi-source Data - A Case Study of Deqin Tibetan County, Yunnan Province

    NASA Astrophysics Data System (ADS)

    Zhao, Junsan; Chen, Guoping; Yuan, Lei

    2017-04-01

    New technologies such as 3D laser scanning, InSAR, GNSS, unmanned aerial vehicles and the Internet of Things provide many more data resources for surveying and monitoring, as well as for the development of Early Warning Systems (EWS). This paper presents solutions for the design and implementation of a geological disaster monitoring and early warning system (GDMEWS) covering landslide and debris-flow hazards, based on multi-source data acquired with the technologies mentioned above. The complex and changeable characteristics of the GDMEWS are described. The architecture of the system, the composition of the multi-source database, the development mode and service logic, and the methods and key technologies of system development are also analyzed. To illustrate the implementation process of the GDMEWS, Deqin Tibetan County, which has unique terrain and diverse types of typical landslides and debris flows, is selected as the case study area. Firstly, the functional requirements and the monitoring and forecasting models of the system are discussed. Secondly, the logical relationships of the whole disaster process, including pre-disaster preparation, disaster rescue and post-disaster reconstruction, are studied, and support tools for disaster prevention, disaster reduction and geological disaster management are developed. Thirdly, methods for integrating the multi-source monitoring data and for generating and simulating mechanism models of geological hazards are described. Finally, the construction of the GDMEWS is presented; the system will be applied to the real-time, dynamic management, monitoring and forecasting of the whole disaster process in Deqin Tibetan County. Keywords: multi-source spatial data; geological disaster; monitoring and warning system; Deqin Tibetan County

  17. EUCANEXT: an integrated database for the exploration of genomic and transcriptomic data from Eucalyptus species

    PubMed Central

    Nascimento, Leandro Costa; Salazar, Marcela Mendes; Lepikson-Neto, Jorge; Camargo, Eduardo Leal Oliveira; Parreiras, Lucas Salera; Carazzolle, Marcelo Falsarella

    2017-01-01

    Abstract Tree species of the genus Eucalyptus are the most valuable and widely planted hardwoods in the world. Given the economic importance of Eucalyptus trees, much effort has been made towards the generation of specimens with superior forestry properties that can deliver high-quality feedstocks, customized to the industry's needs for both cellulosic (paper) and lignocellulosic biomass production. In line with these efforts, large sets of molecular data have been generated by several scientific groups, providing invaluable information that can be applied in the development of improved specimens. In order to fully explore the potential of available datasets, the development of a public database that provides integrated access to genomic and transcriptomic data from Eucalyptus is needed. EUCANEXT is a database that analyses and integrates publicly available Eucalyptus molecular data, such as the E. grandis genome assembly and predicted genes, ESTs from several species and digital gene expression from 26 RNA-Seq libraries. The database has been implemented on a Fedora Linux machine running MySQL and Apache, while Perl CGI was used for the web interfaces. EUCANEXT provides a user-friendly web interface for easy access and analysis of publicly available molecular data from Eucalyptus species. This integrated database allows for complex searches by gene name, keyword or sequence similarity and is publicly accessible at http://www.lge.ibi.unicamp.br/eucalyptusdb. Through EUCANEXT, users can perform complex analyses to identify genes related to traits of interest using RNA-Seq libraries and tools for differential expression analysis. Moreover, the entire bioinformatics pipeline described here, including the database schema and Perl scripts, is readily available and can be applied to any genomic and transcriptomic project, regardless of the organism. Database URL: http://www.lge.ibi.unicamp.br/eucalyptusdb PMID:29220468

  18. The Mouse Heart Attack Research Tool (mHART) 1.0 Database.

    PubMed

    DeLeon-Pennell, Kristine Y; Iyer, Rugmani Padmanabhan; Ma, Yonggang; Yabluchanskiy, Andriy; Zamilpa, Rogelio; Chiao, Ying Ann; Cannon, Presley; Cates, Courtney; Flynn, Elizabeth R; Halade, Ganesh V; de Castro Bras, Lisandra E; Lindsey, Merry L

    2018-05-18

    The generation of Big Data has enabled systems-level dissections into the mechanisms of cardiovascular pathology. Integration of genetic, proteomic, and pathophysiological variables across platforms and laboratories fosters discoveries through multidisciplinary investigations and minimizes unnecessary redundancy in research efforts. The Mouse Heart Attack Research Tool (mHART) consolidates a large dataset of over 10 years of experiments from a single laboratory for cardiovascular investigators to generate novel hypotheses and identify new predictive markers of progressive left ventricular remodeling following myocardial infarction (MI) in mice. We designed the mHART REDCap database using our own data to integrate cardiovascular community participation. We generated physiological, biochemical, cellular, and proteomic outputs from plasma and left ventricles obtained from post-MI and no MI (naïve) control groups. We included both male and female mice ranging in age from 3 to 36 months old. After variable collection, data underwent quality assessment for data curation (e.g. eliminate technical errors, check for completeness, remove duplicates, and define terms). Currently, mHART 1.0 contains >888,000 data points and includes results from >2,100 unique mice. Database performance was tested and an example provided to illustrate database utility. This report explains how the first version of the mHART database was established and provides researchers with a standard framework to aid in the integration of their data into our database or in the development of a similar database.
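
    Since mHART is described as a REDCap database, a sketch of how records might be pulled programmatically through REDCap's standard API; the URL, token, and field names below are placeholders that depend on the specific project's data dictionary, not mHART's actual configuration.

```python
import requests

REDCAP_URL = "https://redcap.example.edu/api/"  # hypothetical endpoint
API_TOKEN = "YOUR_PROJECT_TOKEN"                # issued per project/user

payload = {
    "token": API_TOKEN,
    "content": "record",       # export records
    "format": "json",
    "type": "flat",
    "fields[0]": "record_id",  # restrict export to selected fields
    "fields[1]": "sex",
    "fields[2]": "age_months",
}
response = requests.post(REDCAP_URL, data=payload, timeout=30)
response.raise_for_status()
records = response.json()
print(f"exported {len(records)} records")
```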

  19. Columba: an integrated database of proteins, structures, and annotations.

    PubMed

    Trissl, Silke; Rother, Kristian; Müller, Heiko; Steinke, Thomas; Koch, Ina; Preissner, Robert; Frömmel, Cornelius; Leser, Ulf

    2005-03-31

    Structural and functional research often requires the computation of sets of protein structures based on certain properties of the proteins, such as sequence features, fold classification, or functional annotation. Compiling such sets using current web resources is tedious because the necessary data are spread over many different databases. To facilitate this task, we have created COLUMBA, an integrated database of annotations of protein structures. COLUMBA currently integrates twelve different databases, including PDB, KEGG, Swiss-Prot, CATH, SCOP, the Gene Ontology, and ENZYME. The database can be searched using either keyword search or data source-specific web forms. Users can thus quickly select and download PDB entries that, for instance, participate in a particular pathway, are classified as containing a certain CATH architecture, are annotated as having a certain molecular function in the Gene Ontology, and whose structures have a resolution under a defined threshold. The results of queries are provided in both machine-readable extensible markup language and human-readable format. The structures themselves can be viewed interactively on the web. The COLUMBA database facilitates the creation of protein structure data sets for many structure-based studies. It allows to combine queries on a number of structure-related databases not covered by other projects at present. Thus, information on both many and few protein structures can be used efficiently. The web interface for COLUMBA is available at http://www.columba-db.de.

  20. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security... Integrity Act of 1982, Public Law 97-255, provided authority for the system. The ATF database has been...

  1. A User's Applications of Imaging Techniques: The University of Maryland Historic Textile Database.

    ERIC Educational Resources Information Center

    Anderson, Clarita S.

    1991-01-01

    Describes the incorporation of textile images into the University of Maryland Historic Textile Database by a computer user rather than a computer expert. Selection of a database management system is discussed, and PICTUREPOWER, a system that integrates photographic quality images with text and numeric information in databases, is described. (three…

  2. Integrating Databases with Maps: The Delivery of Cultural Data through TimeMap.

    ERIC Educational Resources Information Center

    Johnson, Ian

    TimeMap is a unique integration of database management, metadata and interactive maps, designed to contextualise and deliver cultural data through maps. TimeMap extends conventional maps with the time dimension, creating and animating maps "on-the-fly"; delivers them as a kiosk application or embedded in Web pages; links flexibly to…

  3. The NMDB collaboration

    NASA Astrophysics Data System (ADS)

    Steigies, C. T.

    2015-12-01

    Since the International Geophysical Year (IGY) in 1957-58, cosmic rays are routinely measured by many ground-based Neutron Monitors (NM) around the world. The World Data Center for Cosmic Rays (WDCCR) was established as a part of this activity and is providing a database of cosmic-ray neutron observations in unified formats. However, that standard data comprises only one-hour averages, whereas most NM stations were enhanced at the end of the 20th century to provide data in one-minute resolution or even better. This data was only available on the web sites of the institutes operating the stations, and every station invented its own data format for the high-resolution measurements. There were some efforts to collect data from several stations and to make this data available on FTP servers; however, none of these efforts could provide real-time data for all stations. The EU FP7 project NMDB (real-time database for high-resolution Neutron Monitor measurements, http://nmdb.eu) was funded by the European Commission, and a new database was set up by several Neutron Monitor stations in Europe and Asia to store high-resolution data and to provide access to the data in real time (i.e., with less than five minutes delay). By storing the measurements in a database, a standard format for the high-resolution measurements is enforced. This database is complementary to the WDCCR, as it does not (yet) provide all historical data, but the creation of this effort has spurred a new collaboration between Neutron Monitor scientists worldwide: (new) stations have gone online (again), new projects are building on the results of NMDB, and new users outside of the cosmic-ray community are starting to use NM data for new applications such as soil moisture measurements using cosmic rays. These applications are facilitated by the easy access to the data through the http://nest.nmdb.eu interface, which offers access to all NMDB data for all users.

  4. Information integration for a sky survey by data warehousing

    NASA Astrophysics Data System (ADS)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Now relational database systems such as Oracle support the warehouse capability, which includes extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The goal of the information integration of LAMOST by applying data warehousing is to effectively provide data and knowledge on-line.

  5. Charting a Path to Location Intelligence for STD Control.

    PubMed

    Gerber, Todd M; Du, Ping; Armstrong-Brown, Janelle; McNutt, Louise-Anne; Coles, F Bruce

    2009-01-01

    This article describes the New York State Department of Health's GeoDatabase project, which developed new methods and techniques for designing and building a geocoding and mapping data repository for sexually transmitted disease (STD) control. The GeoDatabase development was supported through the Centers for Disease Control and Prevention's Outcome Assessment through Systems of Integrated Surveillance workgroup. The design and operation of the GeoDatabase relied upon commercial-off-the-shelf tools that other public health programs may also use for disease-control systems. This article provides a blueprint of the structure and software used to build the GeoDatabase and integrate location data from multiple data sources into the everyday activities of STD control programs.

  6. Documentation for Preservation: Methodology and a GIS Database of Three World Heritage Cities in Uzbekistan

    NASA Astrophysics Data System (ADS)

    Vileikis, O.; Escalante Carrillo, E.; Allayarov, S.; Feyzulayev, A.

    2017-08-01

    The historic cities of Uzbekistan are an irreplaceable legacy of the Silk Roads. Currently, Uzbekistan has four UNESCO World Heritage properties, with hundreds of historic monuments and traditional historic houses. However, the lack of documentation, systematic monitoring, and a digital database of the historic buildings and dwellings within the historic centers is threatening the World Heritage properties and delaying the development of a proper management mechanism for the preservation of the heritage and an interwoven urban development. Unlike the monuments, the traditional historic houses are being demolished without any enforced legal protection, leaving no documentation for understanding the city's history and urban fabric, as well as the way of life, traditions, and customs of the past centuries. To fill this gap, from 2008 to 2015, the Principal Department for Preservation and Utilization of Cultural Objects of the Ministry of Culture and Sports of Uzbekistan, with support from the UNESCO Office in Tashkent and in collaboration with several international and local universities and institutions, carried out a survey of the Historic Centre of Bukhara, Itchan Kala, and Samarkand - Crossroad of Cultures. The collaborative work over these years has helped to consolidate a methodology and to build an integrated GIS database that is currently contributing to the understanding of the outstanding heritage values of these cities as well as to the development of preservation and management strategies on a solid base of heritage documentation.

  7. Measuring the evolution and output of cross-disciplinary collaborations within the NCI Physical Sciences-Oncology Centers Network.

    PubMed

    Basner, Jodi E; Theisz, Katrina I; Jensen, Unni S; Jones, C David; Ponomarev, Ilya; Sulima, Pawel; Jo, Karen; Eljanne, Mariam; Espey, Michael G; Franca-Koh, Jonathan; Hanlon, Sean E; Kuhn, Nastaran Z; Nagahara, Larry A; Schnell, Joshua D; Moore, Nicole M

    2013-12-01

    Development of effective quantitative indicators and methodologies to assess the outcomes of cross-disciplinary collaborative initiatives has the potential to improve scientific program management and scientific output. This article highlights an example of a prospective evaluation that has been developed to monitor and improve progress of the National Cancer Institute Physical Sciences-Oncology Centers (PS-OC) program. Study data, including collaboration information, was captured through progress reports and compiled using the web-based analytic database: Interdisciplinary Team Reporting, Analysis, and Query Resource. Analysis of collaborations was further supported by data from the Thomson Reuters Web of Science database, MEDLINE database, and a web-based survey. Integration of novel and standard data sources was augmented by the development of automated methods to mine investigator pre-award publications, assign investigator disciplines, and distinguish cross-disciplinary publication content. The results highlight increases in cross-disciplinary authorship collaborations from pre- to post-award years among the primary investigators and confirm that a majority of cross-disciplinary collaborations have resulted in publications with cross-disciplinary content that rank in the top third of their field. With these evaluation data, PS-OC Program officials have provided ongoing feedback to participating investigators to improve center productivity and thereby facilitate a more successful initiative. Future analysis will continue to expand these methods and metrics to adapt to new advances in research evaluation and changes in the program.

  8. RNAcentral: an international database of ncRNA sequences

    DOE PAGES

    Williams, Kelly Porter

    2014-10-28

    The field of non-coding RNA biology has been hampered by the lack of availability of a comprehensive, up-to-date collection of accessioned RNA sequences. Here we present the first release of RNAcentral, a database that collates and integrates information from an international consortium of established RNA sequence databases. The initial release contains over 8.1 million sequences, including representatives of all major functional classes. A web portal (http://rnacentral.org) provides free access to data, search functionality, cross-references, source code and an integrated genome browser for selected species.

  9. PR-EDB: Power Reactor Embrittlement Database - Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy-An John; Subramani, Ranjit

    2008-03-01

    The aging and degradation of light-water reactor pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel materials depends on many factors, such as neutron fluence, flux, and energy spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Large amounts of data from surveillance capsules are needed to develop a generally applicable damage prediction model that can be used for industry standards and regulatory guides. Furthermore, the investigations of regulatory issues such as vessel integrity over plant life, vessel failure, and sufficiency of current codes, Standard Review Plans (SRPs), and Guides for license renewal can be greatly expedited by the use of a well-designed computerized database. The Power Reactor Embrittlement Database (PR-EDB) is such a comprehensive collection of data for U.S.-designed commercial nuclear reactors. The current version of the PR-EDB lists the test results of 104 heat-affected-zone (HAZ) materials, 115 weld materials, and 141 base materials, including 103 plates, 35 forgings, and 3 correlation monitor materials that were irradiated in 321 capsules from 106 commercial power reactors. The data files are given in dBASE format and can be accessed with any personal computer using the Windows operating system. "User-friendly" utility programs have been written to investigate radiation embrittlement using this database. Utility programs allow the user to retrieve, select and manipulate specific data, display data to the screen or printer, and fit and plot Charpy impact data. PR-EDB Version 3.0 upgrades Version 2.0. The package was developed based on the Microsoft .NET framework technology and uses Microsoft Access for back-end data storage and Microsoft Excel for plotting graphs. This software package is compatible with Windows (98 or higher) and has been built with a highly versatile user interface. PR-EDB Version 3.0 also contains an "Evaluated Residual File" utility for generating the evaluated processed files used for radiation embrittlement study.
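
    The abstract notes that the utilities "fit and plot Charpy impact data". A common model for such data in embrittlement studies is a hyperbolic-tangent transition curve; the sketch below fits one with SciPy. The data points are fabricated for illustration, and this is not the PR-EDB fitting code itself.

    ```python
    # Hyperbolic-tangent fit of Charpy impact energy vs. temperature, a common
    # model in embrittlement analysis. Data points are fabricated.
    import numpy as np
    from scipy.optimize import curve_fit

    def charpy_tanh(T, A, B, T0, C):
        """Charpy energy = A + B * tanh((T - T0) / C)."""
        return A + B * np.tanh((T - T0) / C)

    T = np.array([-100, -60, -20, 0, 20, 60, 100, 150], dtype=float)  # deg C
    E = np.array([5, 12, 35, 60, 90, 120, 132, 135], dtype=float)     # joules

    popt, _ = curve_fit(charpy_tanh, T, E, p0=[70, 65, 0, 50])
    A, B, T0, C = popt
    # Upper-shelf energy is A + B (tanh -> 1); T0 is the transition midpoint.
    print(f"upper shelf ~ {A + B:.0f} J, transition midpoint ~ {T0:.0f} C")
    ```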

  10. Information support of monitoring of technical condition of buildings in construction risk area

    NASA Astrophysics Data System (ADS)

    Skachkova, M. E.; Lepihina, O. Y.; Ignatova, V. V.

    2018-05-01

    The paper presents the results of research devoted to the development of a model of information support for monitoring the technical condition of buildings located in a construction risk area. As a result of a visual and instrumental survey, as well as an analysis of existing approaches and techniques, attributive and cartographic databases have been created. These databases allow monitoring of defects and damages in buildings located within a 30-meter risk area around the object under construction. A classification of the structures and defects of the buildings under survey is presented. The functional capabilities of the developed model and the field of its practical applications are determined.

  11. Towards Monitoring-as-a-service for Scientific Computing Cloud applications using the ElasticSearch ecosystem

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at the LHC, an interactive analysis facility for the same experiment, and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated; this feature requires detailed monitoring and accounting of resource usage. We set up a monitoring framework to inspect the site activities both in terms of IaaS and of the applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure the flexibility to choose a different monitoring solution if needed. The heterogeneous accounting information is transferred from the database to the ElasticSearch engine via a custom Logstash plugin. Each use-case is indexed separately in ElasticSearch, and we set up Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS-level information gathered through the API is sent to the MySQL database through an ad hoc RESTful web service. Moreover, we have developed a billing system for our private Cloud, which relies on the RabbitMQ message queue for asynchronous communication with the database and on the ELK stack for its graphical interface. The Italian Grid accounting framework is also migrating to a similar set-up. Concerning the application level, we used the ROOT plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BESIII virtual instances used to be monitored with Zabbix; as a proof of concept, we also retrieve the information contained in the Zabbix database. In this way we have achieved a uniform monitoring interface for both the IaaS and the scientific applications, mostly leveraging off-the-shelf tools. At present, we are working to define a model for monitoring-as-a-service, based on the tools described above, which the Cloud tenants can easily configure to suit their specific needs.
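
    As a sketch of the transfer step described above (accounting rows moved from the MySQL back-end into per-use-case ElasticSearch indices), the same flow can be approximated directly in Python. The connection details, table name, and index name are hypothetical, the snippet assumes elasticsearch-py 8.x and mysql-connector-python, and the site's actual implementation is a custom Logstash plugin rather than this script.

    ```python
    # Hypothetical database-to-ElasticSearch transfer, standing in for the
    # custom Logstash plugin described in the abstract.
    import mysql.connector
    from elasticsearch import Elasticsearch

    db = mysql.connector.connect(host="localhost", user="monitor",
                                 password="secret", database="accounting")
    es = Elasticsearch("http://localhost:9200")

    cur = db.cursor(dictionary=True)
    cur.execute("SELECT vm_id, tenant, cpu_hours, recorded_at FROM iaas_usage")

    # One index per use-case, as in the Torino setup: here, IaaS metering.
    for row in cur:
        row["recorded_at"] = row["recorded_at"].isoformat()  # assumes DATETIME column
        es.index(index="iaas-accounting", document=row)

    cur.close()
    db.close()
    ```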

  12. National Vulnerability Database (NVD)

    National Institute of Standards and Technology Data Gateway

    National Vulnerability Database (NVD) (Web, free access)   NVD is a comprehensive cyber security vulnerability database that integrates all publicly available U.S. Government vulnerability resources and provides references to industry resources. It is based on and synchronized with the CVE vulnerability naming standard.

  13. Computer Databases as an Educational Tool in the Basic Sciences.

    ERIC Educational Resources Information Center

    Friedman, Charles P.; And Others

    1990-01-01

    The University of North Carolina School of Medicine developed a computer database, INQUIRER, containing scientific information in bacteriology, and then integrated the database into routine educational activities for first-year medical students in their microbiology course. (Author/MLW)

  14. Monitoring tools of COMPASS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Bodlak, M.; Frolov, V.; Huber, S.; Jary, V.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Tomsa, J.; Virius, M.

    2015-12-01

    This paper briefly introduces the data acquisition system of the COMPASS experiment and focuses mainly on the part responsible for monitoring the nodes in the whole newly developed data acquisition system of this experiment. COMPASS is a high-energy particle physics experiment with a fixed target, located at the SPS of the CERN laboratory in Geneva, Switzerland. The hardware of the data acquisition system has been upgraded to use FPGA cards that are responsible for data multiplexing and event building. The software counterpart of the system includes several processes deployed in a heterogeneous network environment. Two processes, namely the Message Logger and the Message Browser, take care of monitoring. These tools handle messages generated by nodes in the system. While the Message Logger collects and saves messages to the database, the Message Browser serves as a graphical interface over the database containing these messages. For better performance, certain database optimizations have been used. Lastly, results of performance tests are presented.

  15. Remote online monitoring and measuring system for civil engineering structures

    NASA Astrophysics Data System (ADS)

    Kujawińska, Malgorzata; Sitnik, Robert; Dymny, Grzegorz; Karaszewski, Maciej; Michoński, Kuba; Krzesłowski, Jakub; Mularczyk, Krzysztof; Bolewicki, Paweł

    2009-06-01

    In this paper a distributed intelligent system for on-line measurement, remote monitoring, and data archiving of civil engineering structures is presented. The system consists of a set of optical, full-field displacement sensors connected to a controlling server. The server conducts measurements according to a list of scheduled tasks and stores the primary data or initial results in a remote centralized database. Simultaneously, the server performs checks ordered by the operator, which may in turn result in an alert or a specific action. The structure of the whole system is analyzed, along with a discussion of possible fields of application and the ways to provide adequate security during data transport. Finally, a working implementation consisting of fringe projection, geometrical moiré, digital image correlation, and grating interferometry sensors and an Oracle XE database is presented. Results from the database, used for on-line monitoring of a threshold value of strain for an exemplary area of interest on the engineering structure, are presented and discussed.
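
    A minimal sketch of such an operator-ordered check (here, a strain threshold over an area of interest) might look as follows; the threshold value, the stubbed data source, and the alert action are all hypothetical, and the real system queries an Oracle XE database.

    ```python
    # Minimal threshold-check sketch: compare the latest strain values for an
    # area of interest against a threshold and raise an alert. Data access is
    # stubbed; all names and values are hypothetical.

    STRAIN_THRESHOLD = 1.5e-3  # hypothetical threshold for the area of interest

    def latest_strain_readings(area_id):
        """Stub standing in for a query against the central measurement database."""
        return [("2009-06-01T10:00", 1.2e-3), ("2009-06-01T10:05", 1.6e-3)]

    def check_area(area_id):
        for timestamp, strain in latest_strain_readings(area_id):
            if strain > STRAIN_THRESHOLD:
                # In the real system this would notify the operator or trigger
                # a pre-configured action.
                print(f"ALERT area={area_id} t={timestamp} strain={strain:.2e}")

    check_area("AOI-7")
    ```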

  16. Smartphone home monitoring of ECG

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Moon, Gyu; Landa, Joseph; Nakajima, Hiroshi; Hata, Yutaka

    2012-06-01

    A system for ambulatory Holter electrocardiography (ECG) monitoring has already been commercially available for recording and transmitting heartbeat data over the Internet. However, it enjoys only qualified confidence and thus limited market penetration. Our system instead targets aging global villagers with growing biomedical wellness (BMW) homecare needs, as opposed to hospital-related biomedical illness (BMI). It was designed within SWaP-C (Size, Weight, Power, and Cost) constraints using three innovative modules: (i) a Smart Electrode (low-power mixed-signal hardware embedded with modern compressive sensing and nanotechnology to improve the electrodes' contact impedance); (ii) a Learnable Database (adaptive wavelet-transform QRST feature extraction with a Structured Query Language (SQL) relational database allowing retrievable Aided Target Recognition for home-care monitoring); (iii) a Smartphone (touch-screen interface, powerful computation capability, and caretaker reporting with GPS, ID, and a patient panic button for a programmable emergency procedure). It can provide a supplementary home screening system for post- or pre-diagnosis care at home, with a built-in database searchable by the time, the place, and the degree of urgency of events.

  17. Establishment of an international database for genetic variants in esophageal cancer.

    PubMed

    Vihinen, Mauno

    2016-10-01

    The establishment of a database has been suggested in order to collect, organize, and distribute genetic information about esophageal cancer. The World Organization for Specialized Studies on Diseases of the Esophagus and the Human Variome Project will be in charge of a central database of information about esophageal cancer-related variations from publications, databases, and laboratories; in addition to genetic details, clinical parameters will also be included. The aim will be to get all the central players in research, clinical, and commercial laboratories to contribute. The database will follow established recommendations and guidelines. The database will require a team of dedicated curators with different backgrounds. Numerous layers of systematics will be applied to facilitate computational analyses. The data items will be extensively integrated with other information sources. The database will be distributed as open access to ensure exchange of the data with other databases. Variations will be reported in relation to reference sequences on three levels (DNA, RNA, and protein) whenever applicable. In the first phase, the database will concentrate on genetic variations, including both somatic and germline variations for susceptibility genes. Additional types of information can be integrated at a later stage. © 2016 New York Academy of Sciences.

  18. 77 FR 42744 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... information about the request is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the Agency to monitor progress on the activities attendant to...

  19. Real-time Geographic Information System (GIS) for Monitoring the Area of Potential Water Level Using Rule Based System

    NASA Astrophysics Data System (ADS)

    Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro

    2018-02-01

    Management of water resources based on a Geographic Information System can provide substantial benefits for water availability settings. Monitoring of potential water levels is needed in the development sector, agriculture, energy, and other fields. In this research, a water resource information system is developed using a real-time Geographic Information System concept for web-based monitoring of the potential water level of an area by applying a rule-based system method. The GIS consists of hardware, software, and a database. Following the web-based GIS architecture, this study uses a set of computers connected to a network, running on the Apache web server with the PHP programming language and a MySQL database. An ultrasound wireless sensor system is used as the water level data input; each reading also includes time and geographic location information. The GIS maps the five sensor locations. Sensor data are processed through a rule-based system to determine the potential water level of the area. Water level monitoring results can be displayed on thematic maps by overlaying more than one layer, as well as in tables generated from the database and in graphs based on the timing of events and the water level values.
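
    The rule-based step can be sketched as a small table of conditions mapped to categories. The thresholds and category names below are illustrative assumptions, not values taken from the paper.

    ```python
    # Hypothetical rule base mapping a sensor's water level reading (cm) to a
    # potential category, in the spirit of the rule-based system above.

    RULES = [
        (lambda level: level < 50,        "low potential"),
        (lambda level: 50 <= level < 150, "medium potential"),
        (lambda level: level >= 150,      "high potential"),
    ]

    def classify(level_cm):
        for condition, category in RULES:
            if condition(level_cm):
                return category

    # Each reading carries time and location, as in the paper's sensor records.
    reading = {"sensor": "S3", "time": "2018-02-01T07:00", "lat": -7.05,
               "lon": 110.44, "level_cm": 162.0}
    print(reading["sensor"], "->", classify(reading["level_cm"]))  # high potential
    ```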

  20. Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT

    NASA Astrophysics Data System (ADS)

    Livschitz, Yakov; Munro, Rosemary; Lang, Rüdiger; Fiedler, Lars; Dyer, Richard; Eisinger, Michael

    2010-05-01

    The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms such as EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system which is flexible and robust enough to identify and react to anomalies which may be previously unknown to the system, as well as to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for GOME-2 and IASI level 1 data allows near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is effectively used for day-to-day monitoring, long-term reporting, instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms, or numerical weather prediction models within the same database, allows effective cross-comparison and searching for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, for data losses, and for other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.
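
    One simple form of such a deviation check flags a parameter value that strays several standard deviations from its recent baseline. The window size and the 3-sigma rule below are illustrative choices, not EUMETSAT's actual alarm logic.

    ```python
    # Sketch of an automatic deviation alarm: flag a parameter value that
    # departs too far from its recent baseline. Values are fabricated.
    from statistics import mean, stdev

    def deviates(history, value, n_sigma=3.0):
        """True if `value` is more than n_sigma standard deviations away
        from the mean of the recent `history` of that parameter."""
        mu, sigma = mean(history), stdev(history)
        return sigma > 0 and abs(value - mu) > n_sigma * sigma

    baseline = [100.2, 100.4, 99.9, 100.1, 100.3, 100.0]  # recent orbit means
    print(deviates(baseline, 100.2))   # False: within normal scatter
    print(deviates(baseline, 103.7))   # True: raise an alarm for analysts
    ```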

  1. Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT

    NASA Astrophysics Data System (ADS)

    Livschitz, Y.; Munro, R.; Lang, R.; Fiedler, L.; Dyer, R.; Eisinger, M.

    2009-12-01

    The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms such as EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system which is flexible and robust enough to identify and react to anomalies which may be previously unknown to the system, as well as to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for GOME-2 and IASI level 1 data allows near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is effectively used for day-to-day monitoring, long-term reporting, instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms, or numerical weather prediction models within the same database, allows effective cross-comparison and searching for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, for data losses, and for other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.

  2. Dynamic Integration of Mobile JXTA with Cloud Computing for Emergency Rural Public Health Care

    PubMed Central

    Rajkumar, Rajasekaran; Sriman Narayana Iyengar, Nallani Chackravatula

    2013-01-01

    Objectives: The existing processes of health care systems, where data collection requires a great deal of labor with high-end tasks to retrieve and analyze information, are usually slow, tedious, and error prone, which restrains their clinical diagnostic and monitoring capabilities. Research is now focused on integrating cloud services with P2P JXTA to identify a systematic dynamic process for emergency health care systems. The proposal is based on the concept of a community cloud for preventative medicine, to help promote a healthy rural community. We investigate the approaches of patient health monitoring, emergency care, and an ambulance alert alarm (AAA) under mobile cloud-based telecare or community cloud controller systems. Methods: Considering permanent mobile users, an efficient health promotion method is proposed. Experiments were conducted to verify the effectiveness of the method. The performance was evaluated from September 2011 to July 2012. A total of 1,856,454 cases were transported and referred to hospital, identified with health problems, and monitored. We selected all the peer groups and the control server N0, which controls the N1, N2, and N3 proxied peer groups. The hospital cloud controller maintains the database of the patients through a JXTA network. Results: Among the 1,856,454 transported cases, with 1,712,877 beneficiaries, there were 1,662,834 lives saved; 8,500 cases were transported per day, and 104,530 transported cases were found to be registered in the JXTA network. Conclusion: The registered case histories were referred from the hospital community cloud (HCC). SMS messages were sent from node N0 to the relay peers connected to the N1, N2, and N3 nodes, controlled by the cloud controller through a JXTA network. PMID:24298441

  3. Dynamic Integration of Mobile JXTA with Cloud Computing for Emergency Rural Public Health Care.

    PubMed

    Rajkumar, Rajasekaran; Sriman Narayana Iyengar, Nallani Chackravatula

    2013-10-01

    The existing processes of health care systems, where data collection requires a great deal of labor with high-end tasks to retrieve and analyze information, are usually slow, tedious, and error prone, which restrains their clinical diagnostic and monitoring capabilities. Research is now focused on integrating cloud services with P2P JXTA to identify a systematic dynamic process for emergency health care systems. The proposal is based on the concept of a community cloud for preventative medicine, to help promote a healthy rural community. We investigate the approaches of patient health monitoring, emergency care, and an ambulance alert alarm (AAA) under mobile cloud-based telecare or community cloud controller systems. Considering permanent mobile users, an efficient health promotion method is proposed, and experiments were conducted to verify its effectiveness. The performance was evaluated from September 2011 to July 2012. A total of 1,856,454 cases were transported and referred to hospital, identified with health problems, and monitored. We selected all the peer groups and the control server N0, which controls the N1, N2, and N3 proxied peer groups. The hospital cloud controller maintains the database of the patients through a JXTA network. Among the 1,856,454 transported cases, with 1,712,877 beneficiaries, there were 1,662,834 lives saved; 8,500 cases were transported per day, and 104,530 transported cases were found to be registered in the JXTA network. The registered case histories were referred from the hospital community cloud (HCC). SMS messages were sent from node N0 to the relay peers connected to the N1, N2, and N3 nodes, controlled by the cloud controller through a JXTA network.

  4. [LONI & Co: about the epistemic specificity of digital spaces of knowledge in cognitive neuroscience].

    PubMed

    Huber, Lara

    2011-06-01

    In the neurosciences, digital databases are increasingly becoming important tools for rendering and distributing data. This development is due to the growing impact of imaging-based trial design in cognitive neuroscience, including morphological as well as functional imaging technologies. As the case of the 'Laboratory of Neuro Imaging' (LONI) shows, databases are attributed a specific epistemological power: since the 1990s, databasing has been seen as fostering the integration of neuroscientific data, although local regimes of data production, manipulation, and interpretation are also challenging this development. Databasing in the neurosciences goes along with the introduction of new structures for integrating local data, hence establishing digital spaces of knowledge (epistemic spaces): at this stage, inherent norms of digital databases are affecting regimes of imaging-based trial design, for example clinical research into Alzheimer's disease.

  5. A Citizen Science Program for Monitoring Lake Stages in Northern Wisconsin

    NASA Astrophysics Data System (ADS)

    Kretschmann, A.; Drum, A.; Rubsam, J.; Watras, C. J.; Cellar-Rossler, A.

    2011-12-01

    Historical data indicate that surface water levels in northern Wisconsin are fluctuating more now than they did in the recent past. In the northern highland lake district of Vilas County, Wisconsin, concern about record low lake levels in 2008 spurred local citizens and lake associations to form a lake level monitoring network comprising citizen scientists. The network is administered by the North Lakeland Discovery Center (NLDC, a local NGO) and is supported by a grant from the Citizen Science Monitoring Program of the Wisconsin Department of Natural Resources (WDNR). With technical guidance from limnologists at neighboring UW-Madison Trout Lake Research Station, citizen scientists have installed geographic benchmarks and staff gauges on 26 area lakes. The project engages citizen and student science participants including homeowners, non-profit organization member-participants, and local schools. Each spring, staff gauges are installed and referenced to fixed benchmarks after ice off by NLDC and dedicated volunteers. Volunteers read and record staff gauges on a weekly basis during the ice-free season; and maintain log books recording lake levels to the nearest 0.5 cm. At the end of the season, before ice on, gauges are removed and log books are collected by the NLDC coordinator. Data is compiled and submitted to a database management system, coordinated within the Wisconsin Surface Water Integrated Monitoring System (SWIMS), a statewide information system managed by the WDNR in Madison. Furthermore, NLDC is collaborating with the SWIMS database manager to develop data entry screens based on records collected by citizen scientists. This program is the first of its kind in Wisconsin to utilize citizen scientists to collect lake level data. The retention rate for volunteers has been 100% over the three years since inception, and the program has expanded from four lakes in 2008 to twenty-six lakes in 2011. NLDC stresses the importance of long-term monitoring and the commitment that such monitoring takes. The volunteers recognize this importance and have fulfilled their monitoring commitments on an annual basis. All participating volunteers receive a summary report at the end of the year, and, if requested, a graph that is updated monthly. Recruitment has been through lake associations, town boards, word of mouth, newspaper articles, community events, and the NLDC citizen science webpage. Local interest and participation are high, perhaps due to the value that citizens place on lakes and the concern that they have about declining water levels.

  6. How Intrusion Detection Can Improve Software Decoy Applications

    DTIC Science & Technology

    2003-03-01

    V. DISCUSSION Military history suggests it is best to employ a layered, defense-in... database: alert, postgresql, user=snort dbname=snort # output database: log, unixodbc, user=snort dbname=snort # output database: log, mssql, dbname... Threat Monitoring and Surveillance, James P. Anderson Co., Fort Washington, PA, April 1980. URL http://csrc.nist.gov/publications/history/ande80

  7. IDAAPM: integrated database of ADMET and adverse effects of predictive modeling based on FDA approved drug data.

    PubMed

    Legehar, Ashenafi; Xhaard, Henri; Ghemtio, Leo

    2016-01-01

    The disposition of a pharmaceutical compound within an organism, i.e. its Absorption, Distribution, Metabolism, Excretion, Toxicity (ADMET) properties and adverse effects, critically affects late stage failure of drug candidates and has led to the withdrawal of approved drugs. Computational methods are effective approaches to reduce the number of safety issues by analyzing possible links between chemical structures and ADMET or adverse effects, but this is limited by the size, quality, and heterogeneity of the data available from individual sources. Thus, large, clean and integrated databases of approved drug data, associated with fast and efficient predictive tools are desirable early in the drug discovery process. We have built a relational database (IDAAPM) to integrate available approved drug data such as drug approval information, ADMET and adverse effects, chemical structures and molecular descriptors, targets, bioactivity and related references. The database has been coupled with a searchable web interface and modern data analytics platform (KNIME) to allow data access, data transformation, initial analysis and further predictive modeling. Data were extracted from FDA resources and supplemented from other publicly available databases. Currently, the database contains information regarding about 19,226 FDA approval applications for 31,815 products (small molecules and biologics) with their approval history, 2505 active ingredients, together with as many ADMET properties, 1629 molecular structures, 2.5 million adverse effects and 36,963 experimental drug-target bioactivity data. IDAAPM is a unique resource that, in a single relational database, provides detailed information on FDA approved drugs including their ADMET properties and adverse effects, the corresponding targets with bioactivity data, coupled with a data analytics platform. It can be used to perform basic to complex drug-target ADMET or adverse effects analysis and predictive modeling. IDAAPM is freely accessible at http://idaapm.helsinki.fi and can be exploited through a KNIME workflow connected to the database. Graphical abstract: FDA approved drug data integration for predictive modeling.

  8. Development of the Lymphoma Enterprise Architecture Database: A caBIG™ Silver level compliant System

    PubMed Central

    Huang, Taoying; Shenoy, Pareen J.; Sinha, Rajni; Graiser, Michael; Bumpers, Kevin W.; Flowers, Christopher R.

    2009-01-01

    Lymphomas are the fifth most common cancer in United States with numerous histological subtypes. Integrating existing clinical information on lymphoma patients provides a platform for understanding biological variability in presentation and treatment response and aids development of novel therapies. We developed a cancer Biomedical Informatics Grid™ (caBIG™) Silver level compliant lymphoma database, called the Lymphoma Enterprise Architecture Data-system™ (LEAD™), which integrates the pathology, pharmacy, laboratory, cancer registry, clinical trials, and clinical data from institutional databases. We utilized the Cancer Common Ontological Representation Environment Software Development Kit (caCORE SDK) provided by National Cancer Institute’s Center for Bioinformatics to establish the LEAD™ platform for data management. The caCORE SDK generated system utilizes an n-tier architecture with open Application Programming Interfaces, controlled vocabularies, and registered metadata to achieve semantic integration across multiple cancer databases. We demonstrated that the data elements and structures within LEAD™ could be used to manage clinical research data from phase 1 clinical trials, cohort studies, and registry data from the Surveillance Epidemiology and End Results database. This work provides a clear example of how semantic technologies from caBIG™ can be applied to support a wide range of clinical and research tasks, and integrate data from disparate systems into a single architecture. This illustrates the central importance of caBIG™ to the management of clinical and biological data. PMID:19492074

  9. Development of the Lymphoma Enterprise Architecture Database: a caBIG Silver level compliant system.

    PubMed

    Huang, Taoying; Shenoy, Pareen J; Sinha, Rajni; Graiser, Michael; Bumpers, Kevin W; Flowers, Christopher R

    2009-04-03

    Lymphomas are the fifth most common cancer in United States with numerous histological subtypes. Integrating existing clinical information on lymphoma patients provides a platform for understanding biological variability in presentation and treatment response and aids development of novel therapies. We developed a cancer Biomedical Informatics Grid (caBIG) Silver level compliant lymphoma database, called the Lymphoma Enterprise Architecture Data-system (LEAD), which integrates the pathology, pharmacy, laboratory, cancer registry, clinical trials, and clinical data from institutional databases. We utilized the Cancer Common Ontological Representation Environment Software Development Kit (caCORE SDK) provided by National Cancer Institute's Center for Bioinformatics to establish the LEAD platform for data management. The caCORE SDK generated system utilizes an n-tier architecture with open Application Programming Interfaces, controlled vocabularies, and registered metadata to achieve semantic integration across multiple cancer databases. We demonstrated that the data elements and structures within LEAD could be used to manage clinical research data from phase 1 clinical trials, cohort studies, and registry data from the Surveillance Epidemiology and End Results database. This work provides a clear example of how semantic technologies from caBIG can be applied to support a wide range of clinical and research tasks, and integrate data from disparate systems into a single architecture. This illustrates the central importance of caBIG to the management of clinical and biological data.

  10. A National Virtual Specimen Database for Early Cancer Detection

    NASA Technical Reports Server (NTRS)

    Crichton, Daniel; Kincaid, Heather; Kelly, Sean; Thornquist, Mark; Johnsey, Donald; Winget, Marcy

    2003-01-01

    Access to biospecimens is essential for enabling cancer biomarker discovery. The National Cancer Institute's (NCI) Early Detection Research Network (EDRN) comprises and integrates a large number of laboratories into a network in order to establish a collaborative scientific environment to discover and validate disease markers. The diversity of both the institutions and the collaborative focus has created the need for establishing cross-disciplinary teams that integrate expertise in biomedical research, computation and biostatistics, and computer science. Given the collaborative design of the network, the EDRN needed an informatics infrastructure. The Fred Hutchinson Cancer Research Center, the National Cancer Institute, and NASA's Jet Propulsion Laboratory (JPL) teamed up to build an informatics infrastructure creating a collaborative, science-driven research environment despite the geographic separation and structural differences of the information systems that existed within the diverse network. EDRN investigators identified the need to share biospecimen data captured across the country and managed in disparate databases. As a result, the informatics team initiated an effort to create a virtual tissue database whereby scientists could search and locate details about specimens located at collaborating laboratories. Each database, however, was locally implemented and integrated into collection processes and methods unique to each institution. This meant that efforts to integrate the databases needed to proceed in a manner that did not require redesign or re-implementation of the existing systems.

  11. 50 CFR 660.17 - Catch monitors and catch monitor providers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...

  12. 50 CFR 660.17 - Catch monitors and catch monitor service providers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...

  13. 50 CFR 660.17 - Catch monitors and catch monitor service providers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...

  14. 50 CFR 660.17 - Catch monitors and catch monitor service providers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...

  15. Distributed On-line Monitoring System Based on Modem and Public Phone Net

    NASA Astrophysics Data System (ADS)

    Chen, Dandan; Zhang, Qiushi; Li, Guiru

    In order to solve the monitoring problem of urban sewage disposal, a distributed on-line monitoring system is proposed. By introducing modem-based dial-up communication technology, the serial communication program can rationally solve the information transmission problem between the master station and the slave stations. The realization of the serial communication program is based on the MSComm control of C++ Builder 6.0. The software includes a real-time data operation part and a history data handling part, using Microsoft SQL Server 2000 for the database and C++ Builder 6.0 for the user interface. The monitoring center displays a user interface with alarm information for over-standard data and a real-time curve. Practical application shows that the system has successfully accomplished real-time data acquisition from the data gathering stations and stored the data in the terminal database.
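
    The original implementation uses the MSComm control in C++ Builder; an equivalent minimal polling loop can be sketched in Python with the pyserial package. The port name, baud rate, and line-oriented message format are all assumptions for illustration.

    ```python
    # Minimal serial polling sketch (pyserial), standing in for the MSComm-based
    # master-station program. Port, baud rate, and message framing are assumed.
    import serial  # pip install pyserial

    with serial.Serial("COM1", 9600, timeout=5) as port:
        for _ in range(10):                  # poll ten frames from a slave station
            line = port.readline().decode("ascii", errors="replace").strip()
            if line:
                # Real system: store to SQL Server and check alarm thresholds.
                print("received:", line)
    ```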

  16. An integrated chronostratigraphic data system for the twenty-first century

    USGS Publications Warehouse

    Sikora, P.J.; Ogg, James G.; Gary, A.; Cervato, C.; Gradstein, Felix; Huber, B.T.; Marshall, C.; Stein, J.A.; Wardlaw, B.

    2006-01-01

    Research in stratigraphy is increasingly multidisciplinary and conducted by diverse research teams whose members can be widely separated. This developing distributed-research process, facilitated by the availability of the Internet, promises tremendous future benefits to researchers. However, its full potential is hindered by the absence of a development strategy for the necessary infrastructure. At a National Science Foundation workshop convened in November 2001, thirty quantitative stratigraphers and database specialists from both academia and industry met to discuss how best to integrate their respective chronostratigraphic databases. The main goal was to develop a strategy that would allow efficient distribution and integration of existing data relevant to the study of geologic time. Discussions concentrated on three major themes: database standards and compatibility, strategies and tools for information retrieval and analysis of all types of global and regional stratigraphic data, and future directions for database integration and centralization of currently distributed depositories. The result was a recommendation to establish an integrated chronostratigraphic database, to be called Chronos, which would facilitate greater efficiency in stratigraphic studies (http://www.chronos.org/). The Chronos system will both provide greater ease of data gathering and allow for multidisciplinary synergies, functions of fundamental importance in a variety of research, including time scale construction, paleoenvironmental analysis, paleoclimatology and paleoceanography. Beyond scientific research, Chronos will also provide educational and societal benefits by providing an accessible source of information of general interest (e.g., mass extinctions) and concern (e.g., climatic change). The National Science Foundation has currently funded a three-year program for implementing Chronos.

  17. Enhanced systems for measuring and monitoring REDD+: Opportunities to improve the accuracy of emission factor and activity data in Indonesia

    NASA Astrophysics Data System (ADS)

    Solichin

    The importance of accurate measurement of forest biomass in Indonesia has been growing ever since climate change mitigation schemes, particularly the scheme for reducing emissions from deforestation and forest degradation (known as REDD+), were constitutionally accepted by the government of Indonesia. The need for an accurate system of historical and actual forest monitoring has also become more pronounced, as such a system would afford a better understanding of the role of forests in climate change and allow for the quantification of the impact of activities implemented to reduce greenhouse gas emissions. The aim of this study was to enhance the accuracy of estimations of carbon stocks and to monitor emissions in tropical forests. The research encompassed various scales (from individual trees and stands to whole landscapes) and a wide range of aspects, from the evaluation and development of allometric equations to the exploration of the potential of existing forest inventory databases and the evaluation of cutting-edge technology for non-destructive sampling and accurate forest biomass mapping over large areas. In this study, I explored whether the accuracy of forest aboveground biomass (AGB) estimates in Indonesia, especially regarding the identification and reduction of bias, could be improved through (1) development and refinement of allometric equations for major forest types, (2) integration of existing large forest inventory datasets, (3) assessment of non-destructive sampling techniques for tree AGB measurement, and (4) landscape-scale mapping of AGB and forest cover using lidar. This thesis provides essential foundations for improving the estimation of forest AGB at tree scale through the development of new AGB equations for several major forest types in Indonesia. I successfully developed new allometric equations using large datasets from various forest types that enable us to estimate tree aboveground biomass with both forest-type-specific and generic equations. My models outperformed the existing local equations, with lower bias and higher precision of the AGB estimates. This study also highlights the potential advantages and challenges of using terrestrial lidar and the acoustic velocity tool for non-destructive sampling of tree biomass, enabling more sample collection without the felling of trees. Further, I explored whether existing forest inventories and permanent sample plot datasets can be integrated into Indonesia's existing carbon accounting system. My investigation found that, subject to quality assurance tests, these datasets should be integrated into national and provincial forest monitoring and carbon accounting systems. Integration of this information would eventually improve the accuracy of the estimates of forest carbon stocks, biomass growth, mortality, and emission factors for deforestation and forest degradation. At landscape scale, this study demonstrates the capability of airborne lidar for forest monitoring and forest cover classification in tropical peat swamp ecosystems. The mapping application using airborne lidar showed a more accurate and precise classification of land and forest cover when compared with mapping using other optical and active sensors. To reduce the cost of lidar acquisition, this study assessed the optimum lidar return density for forest monitoring and found that the density of lidar returns could be reduced to as little as 1 return per 4 m².
Overall, this study provides essential scientific background to improve the accuracy of forest AGB estimates. Therefore, the described results and techniques should be integrated into the existing monitoring systems to assess emission reduction targets and the impact of REDD+ implementation.

  18. MAINTAINING DATA QUALITY IN THE PERFORMANCE OF A LARGE SCALE INTEGRATED MONITORING EFFORT

    EPA Science Inventory

    Macauley, John M. and Linda C. Harwell. In press. Maintaining Data Quality in the Performance of a Large Scale Integrated Monitoring Effort (Abstract). To be presented at EMAP Symposium 2004: Integrated Monitoring and Assessment for Effective Water Quality Management, 3-7 May 200...

  19. Integrating Social Networks and Remote Patient Monitoring Systems to Disseminate Notifications.

    PubMed

    Ribeiro, Hugo A; Germano, Eliseu; Carvalho, Sergio T; Albuquerque, Eduardo S

    2017-01-01

    Healthcare workforce shortages can be partially compensated for by using information and communication technologies. Remote patient monitoring systems allow us to identify and communicate complications and anomalies. Integrating social networking services into remote patient monitoring systems enables users to manage their relationships, and these user-defined relationships may be used to disseminate healthcare-related notifications. This integration therefore leads to quicker interventions and may reduce hospital readmission rates. As a proof of concept, such a module was integrated into a remote patient monitoring platform, and a mobile application to manage relationships and receive notifications was also developed.
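
    Dissemination along user-defined relationships reduces, at its core, to a fan-out over a contact graph. A minimal Python sketch, with a hypothetical relationship graph and a notification stub in place of the platform's mobile push, is:

    ```python
    # Minimal sketch of disseminating a monitoring notification along
    # user-defined relationships. Graph contents and notify() are hypothetical.

    relationships = {
        "patient42": ["daughter", "family_doctor", "neighbor"],
    }

    def notify(recipient, message):
        print(f"push to {recipient}: {message}")  # mobile push in the real app

    def disseminate(patient, anomaly):
        message = f"Anomaly detected for {patient}: {anomaly}"
        for contact in relationships.get(patient, []):
            notify(contact, message)

    disseminate("patient42", "irregular heart rate")
    ```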

  20. Quality assessment of clinical practice guidelines for integrative medicine in China: A systematic review.

    PubMed

    Yao, Sha; Wei, Dang; Chen, Yao-Long; Wang, Qi; Wang, Xiao-Qin; Zeng, Zhao; Li, Hui

    2017-05-01

    To assess the quality of integrative medicine clinical practice guidelines (CPGs) published before 2014, a systematic search of the scientific literature published before 2014 was conducted to select integrative medicine CPGs. Four major Chinese integrated databases and one guideline database were searched: the Chinese Biomedical Literature Database (CBM), the China National Knowledge Infrastructure (CNKI), the China Science and Technology Journal Database (VIP), Wanfang Data, and the China Guideline Clearinghouse (CGC). Four reviewers independently assessed the quality of the included guidelines using the Appraisal of Guidelines for Research and Evaluation (AGREE) II instrument. Overall consensus among the reviewers was assessed using the intra-class correlation coefficient (ICC). A total of 41 guidelines published from 2003 to 2014 were included. The overall consensus among the reviewers was good [ICC: 0.928; 95% confidence interval (CI): 0.920 to 0.935]. The scores on the 6 AGREE domains were: 17% for scope and purpose (range: 6% to 32%), 11% for stakeholder involvement (range: 0 to 24%), 10% for rigor of development (range: 3% to 22%), 39% for clarity and presentation (range: 25% to 64%), 11% for applicability (range: 4% to 24%), and 1% for editorial independence (range: 0 to 15%). The quality of integrative medicine CPGs was low; their development should be guided by systematic methodology, with more emphasis on multi-disciplinary guideline development groups, quality of evidence, management of funding and conflicts of interest, and guideline updates in the process of developing integrative medicine CPGs in China.
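
    For reference, AGREE II standardizes each domain score as (obtained score minus minimum possible score) divided by (maximum possible minus minimum possible), with each item rated 1 to 7 by each appraiser. A small Python sketch with fabricated ratings illustrates how percentages like those above are produced:

    ```python
    # AGREE II standardized domain score:
    # (obtained - minimum possible) / (maximum possible - minimum possible).
    # Ratings below are fabricated for illustration.

    def domain_score(ratings_per_appraiser):
        """ratings_per_appraiser: list of per-appraiser lists of 1-7 item ratings."""
        n_appraisers = len(ratings_per_appraiser)
        n_items = len(ratings_per_appraiser[0])
        obtained = sum(sum(r) for r in ratings_per_appraiser)
        minimum = 1 * n_items * n_appraisers
        maximum = 7 * n_items * n_appraisers
        return (obtained - minimum) / (maximum - minimum)

    # Four reviewers (as in the study) rating a hypothetical 3-item domain:
    ratings = [[2, 3, 1], [1, 2, 2], [3, 2, 1], [2, 1, 1]]
    print(f"{domain_score(ratings):.0%}")  # prints 12%
    ```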
