NASA Astrophysics Data System (ADS)
Steigies, Christian
2012-07-01
The Neutron Monitor Database project, www.nmdb.eu, was funded in 2008 and 2009 by the European Commission's 7th Framework Programme (FP7). Neutron monitors (NMs) have been in use worldwide since the International Geophysical Year (IGY) in 1957, and cosmic ray data from the IGY and the improved NM64 monitors have been distributed since that time, but a common data format existed only for data with one-hour resolution. These data were first distributed in printed books and later via the World Data Center FTP server. In the 1990s the first NM stations started to record data at higher resolutions (typically 1 minute) and publish it on their web pages. However, every NM station chose its own format, making it cumbersome to work with this distributed data. In NMDB, all European and some neighboring NM stations came together to agree on a common format for high-resolution data and made it available via a centralized database. The goal of NMDB is to make all data from all NM stations available in real time. The original NMDB network has recently been joined by the Bartol Research Institute (Newark, DE, USA), the National Autonomous University of Mexico and the North-West University (Potchefstroom, South Africa). The data is accessible to everyone via an easy-to-use web interface, but expert users can also access the database directly to build applications such as real-time space weather alerts. Even though SQL databases are used today by most web services (blogs, wikis, social media, e-commerce), the power of an SQL database has not yet been fully exploited by the scientific community. In training courses, we teach how to make use of NMDB, how to join NMDB, and how to ensure data quality. The present status of the extended NMDB will be presented. The consortium welcomes further data providers to help increase the scientific contributions of the worldwide neutron monitor network to heliospheric physics and space weather.
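For readers who want to try the direct SQL access mentioned above, a minimal sketch of a query against an NMDB-style MySQL server is shown below. The host, credentials, table and column names are placeholders rather than NMDB's actual schema; the real access details are provided to registered users by the consortium.

```python
# Hypothetical sketch of querying an NMDB-style MySQL database for 1-minute
# count rates. Table and column names are illustrative, not NMDB's real schema.
import pymysql

conn = pymysql.connect(host="db.example.org", user="reader",
                       password="secret", database="nmdb")
try:
    with conn.cursor() as cur:
        cur.execute(
            "SELECT start_date_time, corr_for_pressure "
            "FROM kiel_1min "                     # hypothetical per-station table
            "WHERE start_date_time BETWEEN %s AND %s "
            "ORDER BY start_date_time",
            ("2012-05-17 01:00:00", "2012-05-17 03:00:00"))
        for timestamp, counts in cur.fetchall():
            print(timestamp, counts)
finally:
    conn.close()
```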
NASA Astrophysics Data System (ADS)
Steigies, C. T.
2015-12-01
Since the International Geophysical Year (IGY) in 1957-58, cosmic rays have been routinely measured by many ground-based Neutron Monitors (NM) around the world. The World Data Center for Cosmic Rays (WDCCR) was established as a part of this activity and provides a database of cosmic-ray neutron observations in unified formats. However, that standard data comprises only one-hour averages, whereas most NM stations were upgraded at the end of the 20th century to provide data at one-minute resolution or even better. This data was only available on the websites of the institutes operating the stations, and every station invented its own data format for the high-resolution measurements. There were some efforts to collect data from several stations and make it available on FTP servers, but none of these efforts could provide real-time data for all stations. The EU FP7 project NMDB (real-time database for high-resolution Neutron Monitor measurements, http://nmdb.eu) was funded by the European Commission, and a new database was set up by several Neutron Monitor stations in Europe and Asia to store high-resolution data and to provide access to the data in real time (i.e. with less than five minutes delay). By storing the measurements in a database, a standard format for the high-resolution measurements is enforced. This database is complementary to the WDCCR, as it does not (yet) provide all historical data, but its creation has spurred a new collaboration between Neutron Monitor scientists worldwide: (new) stations have gone online (again), new projects are building on the results of NMDB, and new users outside of the Cosmic Ray community are starting to use NM data for new applications like soil moisture measurements using cosmic rays. These applications are facilitated by the easy access to the data through the http://nest.nmdb.eu interface, which offers access to all NMDB data for all users.
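As a sketch of how a user might post-process an ASCII file downloaded from the NEST interface, the snippet below parses timestamp and count-rate pairs. The semicolon-separated layout and the file name are assumptions made for illustration, not NEST's documented export format.

```python
# Minimal sketch: parse a data file previously downloaded from the NEST
# interface (nest.nmdb.eu). The assumed layout (comment lines starting with
# '#', then "YYYY-MM-DD HH:MM:SS;count_rate" records) is illustrative only.
from datetime import datetime

def read_nest_export(path):
    series = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            stamp, value = line.split(";")[:2]
            series.append((datetime.strptime(stamp.strip(), "%Y-%m-%d %H:%M:%S"),
                           float(value)))
    return series

# data = read_nest_export("oulu_1min.txt")   # hypothetical downloaded file
```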
WWW.NMDB.EU: The real-time Neutron Monitor database
NASA Astrophysics Data System (ADS)
Klein, Karl-Ludwig; Steigies, Christian T.; NMDB Consortium
2010-05-01
The Real-time database for high-resolution neutron monitor measurements (NMDB), which was supported by the 7th Framework Programme of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. It offers a variety of applications, ranging from the representation and retrieval of cosmic ray data to solar energetic particle alerts and the calculation of ionisation doses in the atmosphere and radiation dose rates at aircraft altitudes. Furthermore, the website comprises public outreach pages in several languages and offers training material on cosmic rays for university students, researchers and engineers who want to become familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the provided services and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.
Near-realtime Cosmic Ray measurements for space weather applications
NASA Astrophysics Data System (ADS)
Steigies, C. T.
2013-12-01
In its FP7 programme the European Commission funded the creation of scientific databases. One successful project is the Neutron Monitor database NMDB, which provides near-real-time access to ground-based Neutron Monitor measurements. At its beginning NMDB hosted only data from European and Asian participants, but it has recently grown to also include data from North American stations. We are currently working on also providing data from Australian stations. With the increased coverage of stations, the accuracy of the NMDB applications that issue alerts for a ground level enhancement (GLE) or predict the arrival of a coronal mass ejection (CME) is constantly improving. Besides the cosmic ray community and airlines, which want to calculate radiation doses on flight routes, NMDB has also attracted users from outside the core field, for example hydrologists who compare local neutron measurements with data from NMDB to determine soil humidity. By providing access to data from 50 stations, NMDB already includes data from the majority of the currently operating stations. In the future, however, we want to include data from the few remaining stations, as well as historical data from stations that have been shut down.
WWW.NMDB.EU: The real-time Neutron Monitor database
NASA Astrophysics Data System (ADS)
Klein, Karl-Ludwig; Steigies, Christian T.; Wimmer-Schweingruber, Robert F.; Kudela, Karel; Strharsky, Igor; Langer, Ronald; Usoskin, Ilya; Ibragimov, Askar; Flückiger, Erwin O.; Bütikofer, Rolf; Eroshenko, Eugenia; Belov, Anatoly; Yanke, Victor; Fuller, Nicolas; Mavromichalaki, Helen; Papaioannou, Athanasios; Sarlanis, Christos; Souvatzoglou, George; Plainaki, Christina; Gerontidou, Maria; Papailiou, Maria-Christina; Mariatos, George; Chilingaryan, Ashot; Hovsepyan, G.; Reymers, Artur; Parisi, Mario; Kryakunova, Olga; Tsepakina, Irina; Nikolayevskiy, Nikolay; Dorman, Lev; Pustil'nik, Lev; García-Población, Oscar
The Real-time database for high-resolution neutron monitor measurements (NMDB), which was supported by the 7th Framework Programme of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. Besides real-time data and historical data over several decades in a unified format, it offers data products such as galactic cosmic ray spectra and applications including solar energetic particle alerts and the calculation of ionisation rates in the atmosphere and effective radiation dose rates at aircraft altitudes. Furthermore, the website comprises public outreach pages in several languages and offers training material on cosmic rays for university students, researchers and engineers who want to become familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the provided services and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.
Implementation of the ground level enhancement alert software at NMDB database
NASA Astrophysics Data System (ADS)
Mavromichalaki, Helen; Souvatzoglou, George; Sarlanis, Christos; Mariatos, George; Papaioannou, Athanasios; Belov, Anatoly; Eroshenko, Eugenia; Yanke, Victor; NMDB Team
2010-11-01
The European Commission is supporting the real-time database for high-resolution neutron monitor measurements (NMDB) as an e-Infrastructures project in the Capacities section of the Seventh Framework Programme. The realization of the NMDB provides the opportunity for several applications, most of which will be implemented in real time. An important application is the establishment of an alert signal when dangerous solar particle events are heading towards the Earth, resulting in a ground level enhancement (GLE) registered by neutron monitors (NMs). The cosmic ray community has been occupied with the question of establishing such an alert for many years, and recently several groups succeeded in creating a proper algorithm capable of detecting space weather threats in an off-line mode. A lot of original work has been done in this direction, and every group working in this field performed routine runs for all GLE cases, resulting in statistical analyses of GLE events. The next step was to make this algorithm as accurate as possible and, most importantly, to make it work in real time. This was achieved when, during the last GLE observed so far, a real-time GLE alert signal was produced. In this work, the steps of this procedure as well as the functionality of this algorithm for both the scientific community and users are discussed. The transition of the alert algorithm to the NMDB is also discussed.
NASA Astrophysics Data System (ADS)
Steigies, C. T.
2016-12-01
Cosmic rays are routinely measured by standardized ground-based Neutron Monitors (NM) around the world. Stations provide measurements as 1-hour averages to the World Data Center for Cosmic Rays, but most stations can also provide high-resolution measurements at 1-minute cadence. The measurements of one station provide information about the cosmic ray intensity over time at that location. By correcting the measurement for changes in atmospheric pressure, the intensity of the incoming radiation at the top of the atmosphere can be determined. Studying this time series gives information about long-term changes in the heliospheric environment (11- and 22-year solar cycles), as well as about shorter (Forbush decrease, Fd) and impulsive (Ground Level Enhancement, GLE) events. Since the measurement of a NM is a cumulative measurement, a single station can provide only limited information on the spectrum of the incoming radiation. The whole network of Neutron Monitors, however, can act as a large spectrometer. By combining the measurements of many NM stations, the direction and the spectrum of the incoming radiation can be modeled. With this method, high-energy solar particle events (which lead to GLEs) and the precursors of Coronal Mass Ejections (CME, manifesting as a Fd) can be detected by the ground-based instruments before the lower-energy particles can harm satellites or astronauts. These ALERT systems require the availability of NM data in real time, which was one of the goals of the NMDB project. The easy-to-use NEST interface (nest.nmdb.eu) to NMDB data allows everyone to plot and download data for all participating stations. Since the project started, not only space agencies and ALERT systems make use of the data; NMDB has also attracted several users outside the cosmic ray community. The data are now also used, for example, as a reference value for soil humidity measurements with cosmic rays, by the DHS for radiation monitors at border crossings, and by computer companies testing the susceptibility of their ICs to cosmic rays. These new uses have only become possible because the individual stations have agreed to share their data freely. We encourage all NM stations that are not yet part of NMDB to join the network, and the space and funding agencies to continue to support these important measurements.
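The pressure correction mentioned above is conventionally an exponential scaling with the deviation from a reference pressure. The sketch below illustrates it with a barometric coefficient typical of an NM64 monitor; the exact coefficient is station-dependent, so the value here is only indicative.

```python
import math

def pressure_corrected_rate(measured_rate, pressure_hpa,
                            reference_pressure_hpa, beta_per_hpa=0.0072):
    """Standard exponential barometric correction for neutron monitor data.

    N_corr = N_meas * exp(beta * (P - P_ref)); the barometric coefficient
    beta (about 0.7 %/hPa here) is station-dependent and illustrative only.
    """
    return measured_rate * math.exp(beta_per_hpa *
                                    (pressure_hpa - reference_pressure_hpa))

# Example: a pressure-induced dip of about 2 % is removed by the correction.
print(pressure_corrected_rate(98.0, 1013.0, 1010.0))
```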
NASA Astrophysics Data System (ADS)
Souvatzoglou, G.; Papaioannou, A.; Mavromichalaki, H.; Dimitroulakos, J.; Sarlanis, C.
2014-11-01
Whenever a significant intensity increase is recorded by at least three neutron monitor stations in real-time mode, a ground level enhancement (GLE) event is marked and an automated alert is issued. Although the physical concept of the algorithm is solid and has worked efficiently in a number of cases, the availability of real-time data is still an open issue and makes timely GLE alerts quite challenging. In this work we present the optimization of the GLE alert that has been in operation since 2006 at the Athens Neutron Monitor Station. This upgrade has led to GLE Alert Plus, which is currently based upon the Neutron Monitor Database (NMDB). We have determined the critical values per station, allowing us to issue reliable GLE alerts close to the initiation of the event while keeping the false alert rate at low levels. Furthermore, we have managed to treat the problem of data availability by introducing the Go-Back-N algorithm. A total of 13 GLE events were marked from January 2000 to December 2012. GLE Alert Plus issued an alert for 12 events. These alert times are compared to the alert times of the GOES Space Weather Prediction Center and the Solar Energetic Particle forecaster of the University of Málaga (UMASEP). In all cases GLE Alert Plus precedes the GOES alert by ≈8-52 min. The comparison with UMASEP demonstrated remarkably good agreement. Real-time GLE alerts by GLE Alert Plus may be retrieved from http://cosray.phys.uoa.gr/gle_alert_plus.html, http://www.nmdb.eu, and http://swe.ssa.esa.int/web/guest/space-radiation. An automated GLE alert email notification system is also available to interested users.
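A much-simplified illustration of the coincidence criterion described above (an alert once at least three stations show a significant increase over their running baseline) is sketched below. The baseline window and threshold are invented and do not reproduce the actual GLE Alert Plus parameters or its Go-Back-N data handling.

```python
from statistics import mean

def station_in_alert(rates, baseline_minutes=60, threshold_percent=4.0):
    """True if the latest 1-minute rate exceeds the running baseline by more
    than threshold_percent. Window and threshold are illustrative only."""
    baseline = mean(rates[-baseline_minutes - 1:-1])
    increase = 100.0 * (rates[-1] - baseline) / baseline
    return increase > threshold_percent

def gle_alert(station_rates, min_stations=3):
    """Issue an alert when at least min_stations are simultaneously in alert,
    mimicking the coincidence requirement described above."""
    in_alert = [name for name, rates in station_rates.items()
                if station_in_alert(rates)]
    return len(in_alert) >= min_stations, in_alert

# Toy usage with synthetic data: three stations jump by about 6 % in the last minute.
quiet = [100.0] * 61
spike = [100.0] * 60 + [106.0]
alert, stations = gle_alert({"OULU": spike, "ATHN": spike,
                             "KIEL": spike, "ROME": quiet})
print(alert, stations)
```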
NASA Astrophysics Data System (ADS)
Grigoryev, V. G.; Starodubtsev, S. A.; Potapova, V. D.
2013-02-01
In our previous work we developed a method for determining the parameters of the cosmic ray daily anisotropy in the interplanetary medium based on the data of only a single station, the cosmic ray spectrograph named after A. I. Kuzmin. This method allows prediction of the Earth's entry into large-scale solar wind disturbances with a probability of more than 70% and with a lead time of several hours up to 2 days. It has now become possible to use the data of the neutron monitor network, which are available in the neutron monitor database (NMDB) in real time. In this case the well-known global survey method is applied to determine the cosmic ray anisotropy. Using the data of the cosmic ray station network allows the parameters of the daily cosmic ray anisotropy to be determined with greater accuracy.
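As a single-station illustration of the diurnal-anisotropy idea, the sketch below fits the first harmonic to 24 hourly count rates. The full global survey method additionally combines many stations through their coupling coefficients and asymptotic directions, which is not attempted here; the synthetic numbers are invented.

```python
import numpy as np

def first_harmonic(hourly_rates):
    """Fit r(t) = a0 + A*cos(2*pi*(t - t_max)/24) to 24 hourly values and
    return the amplitude (percent of mean) and local time of maximum (hours)."""
    rates = np.asarray(hourly_rates, dtype=float)
    hours = np.arange(rates.size)
    omega = 2.0 * np.pi / 24.0
    # Linear model: a0 + a1*cos(w t) + b1*sin(w t)
    design = np.column_stack([np.ones_like(hours, dtype=float),
                              np.cos(omega * hours), np.sin(omega * hours)])
    a0, a1, b1 = np.linalg.lstsq(design, rates, rcond=None)[0]
    amplitude = 100.0 * np.hypot(a1, b1) / a0
    t_max = (np.arctan2(b1, a1) / omega) % 24.0
    return amplitude, t_max

# Synthetic example: a 0.4 % diurnal wave peaking at 15 h local time.
t = np.arange(24)
counts = 100.0 * (1.0 + 0.004 * np.cos(2 * np.pi * (t - 15) / 24))
print(first_harmonic(counts))
```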
Implementing the European Neutron Monitor Service for the ESA SSA Program
NASA Astrophysics Data System (ADS)
Mavromichalaki, H.; Papaioannou, A.; Souvatzoglou, G.; Dimitroulakos, J.; Paschalis, P.; Gerontidou, M.; Sarlanis, Ch.
2013-09-01
Ground level enhancements (GLEs) are observed as significant intensity increases in neutron monitor measurements, following an intense solar flare and/or a very energetic coronal mass ejection. Due to their space weather impact it is crucial to establish a real-time operational system that is in place to issue reliable and timely GLE alerts. Such a Neutron Monitor Service, to be made available via the Space Weather Portal operated by the European Space Agency (ESA) under the Space Situational Awareness (SSA) Program, is currently under development. The ESA Neutron Monitor Service will provide two products: a web interface providing data from multiple neutron monitor stations and an upgraded GLE alert. Both services are now under testing and validation and will probably enter an operational phase next year. The core of this Neutron Monitor Service is the GLE alert software, and therefore the main goal of this research effort is to upgrade the existing GLE alert software and to minimize the probability of false alarms. The ESA Neutron Monitor Service builds upon the infrastructure made available with the implementation of the high-resolution Neutron Monitor Database (NMDB). In this work the structure of the ESA Neutron Monitor Service, the core of the novel GLE Alert Service and its validation results will be presented and further discussed.
The First Ground-Level Enhancement of Solar Cycle 24 on 17 May 2012 and Its Real-Time Detection
NASA Astrophysics Data System (ADS)
Papaioannou, A.; Souvatzoglou, G.; Paschalis, P.; Gerontidou, M.; Mavromichalaki, H.
2014-01-01
Ground-level enhancements (GLEs) are defined as sudden increases in the recorded intensity of cosmic-ray particles, usually by neutron monitors (NMs). In this work we present a time-shifting analysis (TSA) for the first arriving particles that were detected at Earth by NMs. We also present an automated real-time GLE alert that has been developed and is operating via the Neutron Monitor Database (NMDB), which successfully identified the 17 May 2012 event, designated as GLE71. We discuss the time evolution of the real-time GLE alert that was issued for GLE71 and present the event onset time for the NMs that contributed to this GLE alert based on their archived data. A comparison with their real-time time-stamps was made to illustrate the necessity for high-resolution data (e.g. 1-min time resolution) made available every minute. The first results on the propagation of relativistic protons that have been recorded by NMs, as inferred from the TSA, imply that they are most probably accelerated by the coronal-mass-ejection-driven shock. Furthermore, the successful usage of NM data and the corresponding achievement of issuing a timely GLE alert are discussed.
NASA Astrophysics Data System (ADS)
Papaioannou, Athanasios; Mavromichalaki, Helen; Souvatzoglou, George; Paschalis, Pavlos; Sarlanis, Christos; Dimitroulakos, John; Gerontidou, Maria
2013-04-01
High-energy particles released at the Sun during a solar flare or a very energetic coronal mass ejection result in a significant intensity increase in neutron monitor measurements, known as a Ground Level Enhancement (GLE). Due to their space weather impact (i.e. risks and failures of communication and navigation systems, spacecraft electronics and operations, space power systems, manned space missions, and commercial aircraft operations) it is crucial to establish a real-time operational system that is in place to issue reliable and timely GLE alerts. Currently, the Cosmic Ray group of the National and Kapodistrian University of Athens is working towards the establishment of a Neutron Monitor Service that will be made available via the Space Weather Portal operated by the European Space Agency (ESA), under the Space Situational Awareness (SSA) Program. To this end, a web interface providing data from multiple neutron monitor stations as well as an upgraded GLE alert will be provided. Both services are now under testing and validation and will probably enter an operational phase next year. The core of this Neutron Monitor Service is the GLE alert software, and therefore the main goal of this research effort is to upgrade the existing GLE alert software, to minimize the probability of a false alarm and to enhance the usability of the corresponding results. The ESA Neutron Monitor Service builds upon the infrastructure made available with the implementation of the high-resolution Neutron Monitor Database (NMDB). In this work the structure of the Neutron Monitor Service for the ESA SSA Program and the impact of the novel GLE Alert Service that will be made available to future users via the ESA SSA web portal will be presented and further discussed.
NASA Astrophysics Data System (ADS)
Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma
2017-04-01
High-energy solar energetic particles (SEPs) emitted from the Sun are a major space weather hazard motivating the development of predictive capabilities. In this work, the current state of knowledge on the origin and forecasting of SEP events will be reviewed. Subsequently, we will present the EU HORIZON2020 HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis) project, its structure, its main scientific objectives and forecasting operational tools, as well as the added value to SEP research both from the observational as well as the SEP modelling perspective. The project addresses through multi-frequency observations and simulations the chain of processes from particle acceleration in the corona, particle transport in the magnetically complex corona and interplanetary space to the detection near 1 AU. Furthermore, publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters that can be compared with space-borne measurements at lower energies is provided for the first time by HESPERIA. In order to achieve these goals, HESPERIA is exploiting already available large datasets stored in databases such as the neutron monitor database (NMDB) and SEPServer that were developed under EU FP7 projects from 2008 to 2013. Forecasting results of the two novel SEP operational forecasting tools published via the consortium server of 'HESPERIA' will be presented, as well as some scientific key results on the acceleration, transport and impact on Earth of high-energy particles. Acknowledgement: This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637324.
Mini Neutron Monitors at Concordia Research Station, Central Antarctica
NASA Astrophysics Data System (ADS)
Poluianov, Stepan; Usoskin, Ilya; Mishev, Alexander; Moraal, Harm; Kruger, Helena; Casasanta, Giampietro; Traversi, Rita; Udisti, Roberto
2015-12-01
Two mini neutron monitors are installed at Concordia research station (Dome C, Central Antarctica, 75° 06' S, 123° 23' E, 3,233 m.a.s.l.). The site has unique properties ideal for cosmic ray measurements, especially for the detection of solar energetic particles: very low cutoff rigidity < 0.01 GV, high elevation and poleward asymptotic acceptance cones pointing to geographical latitudes > 75° S. The instruments consist of a standard neutron monitor and a "bare" (lead-free) neutron monitor. The instrument operation started in mid-January 2015. The barometric correction coefficients were computed for the period from 1 February to 31 July 2015. Several interesting events, including two notable Forbush decreases on 17 March 2015 and 22 June 2015, and a solar particle event of 29 October 2015 were registered. The data sets are available at cosmicrays.oulu.fi and nmdb.eu.
Galactic cosmic ray spectral index: the case of Forbush decreases of March 2012
NASA Astrophysics Data System (ADS)
Livada, M.; Mavromichalaki, H.; Plainaki, C.
2018-01-01
During the burst of solar activity in March 2012, close to the maximum of solar cycle 24, a number of X-class and M-class flares and halo CMEs with velocities up to 2684 km/s were recorded. During a relatively short period (7-21 March 2012) two Forbush decreases were registered in the ground-level neutron monitor data. In this work, after a short description of the solar and geomagnetic background of these Forbush decreases, we deduce the cosmic ray density and anisotropy variations based on the daily cosmic ray data of the neutron monitor network (http://www.nmdb.eu; http://cosray.phys.uoa.gr). Applying two different coupling function methods to our data, the spectral index of these Forbush decreases was calculated following the technique of Wawrzynczak and Alania (Adv. Space Res. 45:622-631, 2010). We point out that the estimated values of the spectral index γ of these events are similar for both methods, following the fluctuation of the Forbush decrease. The study and calculation of the cosmic ray spectrum during such cosmic ray events are very important for space weather applications.
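The rigidity dependence behind such a spectral-index estimate can be illustrated with a toy two-station example: if the relative decrease scales as a power law of each station's effective rigidity, the index follows from the ratio of the decreases. The numbers and effective rigidities below are invented, and the published analysis uses coupling functions over the whole network rather than this shortcut.

```python
import math

def spectral_index(decrease_1, rigidity_1, decrease_2, rigidity_2):
    """Toy estimate of the power-law index gamma, assuming the relative
    Forbush decrease scales as R_eff**(-gamma) at each station."""
    return math.log(decrease_1 / decrease_2) / math.log(rigidity_2 / rigidity_1)

# Invented numbers: a polar station (median response rigidity ~10 GV) sees a
# 6 % decrease, a low-latitude station (~30 GV) sees a 2.5 % decrease.
print(spectral_index(6.0, 10.0, 2.5, 30.0))   # roughly 0.8
```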
Mutation Update for GNE Gene Variants Associated with GNE Myopathy
Celeste, Frank V.; Vilboux, Thierry; Ciccone, Carla; de Dios, John Karl; Malicdan, May Christine V.; Leoyklang, Petcharat; McKew, John C.; Gahl, William A.; Carrillo-Carrasco, Nuria; Huizing, Marjan
2014-01-01
The GNE gene encodes the rate-limiting, bifunctional enzyme of sialic acid biosynthesis, UDP-N-acetylglucosamine 2-epimerase/N-acetylmannosamine kinase (GNE). Biallelic GNE mutations underlie GNE myopathy, an adult-onset progressive myopathy. GNE myopathy-associated GNE mutations are predominantly missense, resulting in reduced, but not absent, GNE enzyme activities. The exact pathomechanism of GNE myopathy remains unknown, but likely involves aberrant (muscle) sialylation. Here we summarize 154 reported and novel GNE variants associated with GNE myopathy, including 122 missense, 11 nonsense, 14 insertion/deletions and 7 intronic variants. All variants were deposited in the online GNE variation database (http://www.dmd.nl/nmdb2/home.php?select_db=GNE). We report the predicted effects on protein function of all variants as well as the predicted effects on epimerase and/or kinase enzymatic activities of selected variants. By analyzing exome sequence databases, we identified three frequently occurring, unreported GNE missense variants/polymorphisms, important for future sequence interpretations. Based on allele frequencies, we estimate the world-wide prevalence of GNE myopathy to be ~ 4–21/1,000,000. This previously unrecognized high prevalence confirms suspicions that many patients may escape diagnosis. Awareness among physicians for GNE myopathy is essential for the identification of new patients, which is required for better understanding of the disorder’s pathomechanism and for the success of ongoing treatment trials. PMID:24796702
DIMA.Tools: An R package for working with the database for inventory, monitoring, and assessment
USDA-ARS's Scientific Manuscript database
The Database for Inventory, Monitoring, and Assessment (DIMA) is a Microsoft Access database used to collect, store and summarize monitoring data. This database is used by both local and national monitoring efforts within the National Park Service, the Forest Service, the Bureau of Land Management, ...
Report: EPA Needs to Strengthen Financial Database Security Oversight and Monitor Compliance
Report #2007-P-00017, March 29, 2007. Weaknesses in how EPA offices monitor databases for known security vulnerabilities, communicate the status of critical system patches, and monitor the access to database administrator accounts and privileges.
A database application for wilderness character monitoring
Ashley Adams; Peter Landres; Simon Kingston
2012-01-01
The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...
DIMA quick start, database for inventory, monitoring and assessment
USDA-ARS's Scientific Manuscript database
The Database for Inventory, Monitoring and Assessment (DIMA) is a highly-customized Microsoft Access database for collecting data electronically in the field and for organizing, storing and reporting those data for monitoring and assessment. While DIMA can be used for any number of different monito...
49 CFR 384.229 - Skills test examiner auditing and monitoring.
Code of Federal Regulations, 2014 CFR
2014-10-01
... overt monitoring must be performed at least once every year; (c) Establish and maintain a database to...; (d) Establish and maintain a database of all third party testers and examiners, which at a minimum... examiner; (e) Establish and maintain a database of all State CDL skills examiners, which at a minimum...
49 CFR 384.229 - Skills test examiner auditing and monitoring.
Code of Federal Regulations, 2013 CFR
2013-10-01
... overt monitoring must be performed at least once every year; (c) Establish and maintain a database to...; (d) Establish and maintain a database of all third party testers and examiners, which at a minimum... examiner; (e) Establish and maintain a database of all State CDL skills examiners, which at a minimum...
Customizable tool for ecological data entry, assessment, monitoring, and interpretation
USDA-ARS's Scientific Manuscript database
The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...
The Application of Lidar to Synthetic Vision System Integrity
NASA Technical Reports Server (NTRS)
Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve
2003-01-01
One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
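A minimal sketch of the consistency-check idea is given below: a sensor-derived terrain profile is compared with the stored database profile through a simple disparity statistic and a threshold. Both the statistic and the threshold are illustrative and are not the test statistic developed in the paper.

```python
import numpy as np

def terrain_disparity(sensed_profile, database_profile):
    """Mean absolute disparity (metres) between a sensor-derived terrain
    profile and the stored terrain database along the same ground track."""
    sensed = np.asarray(sensed_profile, dtype=float)
    stored = np.asarray(database_profile, dtype=float)
    return float(np.mean(np.abs(sensed - stored)))

def integrity_alert(sensed_profile, database_profile, threshold_m=15.0):
    """Flag the terrain database as suspect if the disparity statistic
    exceeds an (illustrative) threshold."""
    return terrain_disparity(sensed_profile, database_profile) > threshold_m

# Toy example: a 40 m bias in the stored elevations triggers the alert.
rng = np.random.default_rng(0)
truth = 1300 + 50 * np.sin(np.linspace(0, 3, 200))
sensed = truth + rng.normal(0, 2, truth.size)       # LiDAR-derived profile
print(integrity_alert(sensed, truth))                # False
print(integrity_alert(sensed, truth - 40.0))         # True
```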
Monitoring SLAC High Performance UNIX Computing Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC
2005-12-15
Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process taken in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
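A hedged sketch of the script-driven approach described above is given below: host metric samples of the kind Ganglia reports are written into a MySQL table instead of round-robin database files. The schema and credentials are invented and do not reflect the SLAC implementation.

```python
# Hypothetical sketch: store host metric samples (as Ganglia would report them)
# in a MySQL table rather than RRD files. Table layout is illustrative only.
import time
import pymysql

conn = pymysql.connect(host="localhost", user="ganglia",
                       password="secret", database="metrics")
with conn.cursor() as cur:
    cur.execute("""CREATE TABLE IF NOT EXISTS samples (
                       host VARCHAR(64), metric VARCHAR(64),
                       ts DOUBLE, value DOUBLE)""")
    cur.execute("INSERT INTO samples VALUES (%s, %s, %s, %s)",
                ("node001", "load_one", time.time(), 3.42))
conn.commit()
conn.close()
```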
NASA Technical Reports Server (NTRS)
Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.
2002-01-01
Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data is continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records including over 120 gigabytes of data and is growing. A customized archiving system was used to store continuously up to four waveforms and 30 different parameters from ICU patient monitors. An integrated user-friendly relational database was developed for browsing of patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge-discovery.
The Monitoring Erosion of Agricultural Land and spatial database of erosion events
NASA Astrophysics Data System (ADS)
Kapicka, Jiri; Zizala, Daniel
2013-04-01
In 2011 the Monitoring Erosion of Agricultural Land was started in the Czech Republic as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of information about erosion events on agricultural land and to evaluate them. The main idea is the creation of a spatial database that will be a source of data and information for evaluating and modeling erosion processes, for proposing preventive measures and for measures to reduce the negative impacts of erosion events. The subjects of monitoring are the manifestations of water erosion, wind erosion and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as a tool for keeping and browsing information about monitored events. SLO employees carry out the record keeping. RISWC is the specialist institute in the Monitoring Erosion of Agricultural Land that maintains the spatial database, runs the website, manages the record keeping of events, analyses the causes of events and performs statistical evaluations of the recorded events and proposed measures. Records are inserted into the database using the user interface of the website, which includes a map server. The website is based on the PostgreSQL database with the PostGIS extension and UMN MapServer. Each record is spatially localized in the database by a drawing and contains descriptive information about the character of the event (date, description of the situation, etc.), together with information about land cover and the crops grown. Part of the database is photo documentation taken during field reconnaissance, which is performed within two days of notification of an event. Another part of the database is information about precipitation from accessible rain gauges. The website allows simple spatial analyses such as area calculation, slope calculation, percentage representation of GAEC, etc. The database structure was designed on the basis of an analysis of the input needs of mathematical models. Mathematical models are used for detailed analysis of chosen erosion events, which includes soil analysis. By the end of 2012 the database contained 135 events. The content of the database continues to grow, providing an extensive source of data usable for testing mathematical models.
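The PostgreSQL/PostGIS design described above can be illustrated with a small sketch that stores one spatially localized erosion event and performs an area calculation. The table layout, credentials and coordinates are invented for illustration and are not the project's actual schema.

```python
# Illustrative only: store an erosion event with its polygon geometry in a
# PostgreSQL database with the PostGIS extension enabled.
import psycopg2

conn = psycopg2.connect(dbname="erosion", user="monitor", password="secret")
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS erosion_event (
                   id SERIAL PRIMARY KEY,
                   event_date DATE,
                   land_cover TEXT,
                   crop TEXT,
                   geom GEOMETRY(POLYGON, 4326))""")
cur.execute("""INSERT INTO erosion_event (event_date, land_cover, crop, geom)
               VALUES (%s, %s, %s, ST_GeomFromText(%s, 4326))""",
            ("2012-06-15", "arable land", "maize",
             "POLYGON((14.40 50.08, 14.41 50.08, 14.41 50.09, 14.40 50.08))"))
# Simple spatial analysis, like the area calculation mentioned above (m^2):
cur.execute("SELECT id, ST_Area(geom::geography) FROM erosion_event")
print(cur.fetchall())
conn.commit()
conn.close()
```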
Implementation of medical monitor system based on networks
NASA Astrophysics Data System (ADS)
Yu, Hui; Cao, Yuzhen; Zhang, Lixin; Ding, Mingshi
2006-11-01
In this paper, the development trends of medical monitoring systems are analyzed; portability and network functionality are becoming more and more popular among all kinds of medical monitoring devices. The architecture of a networked medical monitoring system is presented, and the design and implementation details of the monitoring terminal, the monitoring-center software, the distributed medical database and two kinds of medical information terminals are discussed in particular. A Rabbit3000 system is used in the monitoring terminal to implement secure data transfer over the network, the human-machine interface, power management and the DSP interface, while a TMS5402 DSP chip is used for signal analysis and data compression. The distributed medical database is designed for the hospital center according to the DICOM information model and the HL7 standard. A pocket medical information terminal based on an ARM9 embedded platform is also developed to interact with the center database over the network. Two kernels based on Windows CE are customized and the corresponding terminal software is developed for nurses' routine care and doctors' auxiliary diagnosis. An invention patent for the monitoring terminal has been approved, and manufacturing and clinical test plans are scheduled. Patent applications have also been filed for the two medical information terminals.
A Case Study in Software Adaptation
2002-01-01
1 A Case Study in Software Adaptation Giuseppe Valetto Telecom Italia Lab Via Reiss Romoli 274 10148, Turin, Italy +39 011 2288788...configuration of the service; monitoring of database connectivity from within the service; monitoring of crashes and shutdowns of IM servers; monitoring of...of the IM server all share a relational database and a common runtime state repository, which make up the backend tier, and allow replicas to
Lee, Jong Woo; LaRoche, Suzette; Choi, Hyunmi; Rodriguez Ruiz, Andres A; Fertig, Evan; Politsky, Jeffrey M; Herman, Susan T; Loddenkemper, Tobias; Sansevere, Arnold J; Korb, Pearce J; Abend, Nicholas S; Goldstein, Joshua L; Sinha, Saurabh R; Dombrowski, Keith E; Ritzl, Eva K; Westover, Michael B; Gavvala, Jay R; Gerard, Elizabeth E; Schmitt, Sarah E; Szaflarski, Jerzy P; Ding, Kan; Haas, Kevin F; Buchsbaum, Richard; Hirsch, Lawrence J; Wusthoff, Courtney J; Hopp, Jennifer L; Hahn, Cecil D
2016-04-01
The rapid expansion of the use of continuous critical care electroencephalogram (cEEG) monitoring and resulting multicenter research studies through the Critical Care EEG Monitoring Research Consortium has created the need for a collaborative data sharing mechanism and repository. The authors describe the development of a research database incorporating the American Clinical Neurophysiology Society standardized terminology for critical care EEG monitoring. The database includes flexible report generation tools that allow for daily clinical use. Key clinical and research variables were incorporated into a Microsoft Access database. To assess its utility for multicenter research data collection, the authors performed a 21-center feasibility study in which each center entered data from 12 consecutive intensive care unit monitoring patients. To assess its utility as a clinical report generating tool, three large volume centers used it to generate daily clinical critical care EEG reports. A total of 280 subjects were enrolled in the multicenter feasibility study. The duration of recording (median, 25.5 hours) varied significantly between the centers. The incidence of seizure (17.6%), periodic/rhythmic discharges (35.7%), and interictal epileptiform discharges (11.8%) was similar to previous studies. The database was used as a clinical reporting tool by 3 centers that entered a total of 3,144 unique patients covering 6,665 recording days. The Critical Care EEG Monitoring Research Consortium database has been successfully developed and implemented with a dual role as a collaborative research platform and a clinical reporting tool. It is now available for public download to be used as a clinical data repository and report generating tool.
Monitoring of services with non-relational databases and map-reduce framework
NASA Astrophysics Data System (ADS)
Babik, M.; Souto, F.
2012-12-01
Service Availability Monitoring (SAM) is a well-established monitoring framework that performs regular measurements of the core site services and reports the corresponding availability and reliability of the Worldwide LHC Computing Grid (WLCG) infrastructure. One of the existing extensions of SAM is Site Wide Area Testing (SWAT), which gathers monitoring information from the worker nodes via instrumented jobs. This generates quite a lot of monitoring data to process, as there are several data points for every job and several million jobs are executed every day. The recent uptake of non-relational databases opens a new paradigm in the large-scale storage and distributed processing of systems with heavy read-write workloads. For SAM this brings new possibilities to improve its model, from performing aggregation of measurements to storing raw data and subsequent re-processing. Both SAM and SWAT are currently tuned to run at top performance, reaching some of the limits in storage and processing power of their existing Oracle relational database. We investigated the usability and performance of non-relational storage together with its distributed data processing capabilities. For this, several popular systems have been compared. In this contribution we describe our investigation of the existing non-relational databases suited for monitoring systems covering Cassandra, HBase and MongoDB. Further, we present our experiences in data modeling and prototyping map-reduce algorithms focusing on the extension of the already existing availability and reliability computations. Finally, possible future directions in this area are discussed, analyzing the current deficiencies of the existing Grid monitoring systems and proposing solutions to leverage the benefits of the non-relational databases to get more scalable and flexible frameworks.
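A minimal, pure-Python illustration of the map-reduce style aggregation discussed above is sketched below: per-test results are mapped to (site, pass/total) pairs and reduced to an availability per site. It is only a toy version of the computation, not the SAM/SWAT implementation or a distributed Hadoop job.

```python
from itertools import groupby

# Raw test results as (site, status) pairs: what a mapper would receive.
results = [("CERN-PROD", "OK"), ("CERN-PROD", "OK"), ("CERN-PROD", "CRITICAL"),
           ("GRIF", "OK"), ("GRIF", "OK")]

def mapper(record):
    site, status = record
    return site, (1 if status == "OK" else 0, 1)    # (passed, total)

def reducer(site, values):
    passed = sum(v[0] for v in values)
    total = sum(v[1] for v in values)
    return site, passed / total                     # availability fraction

mapped = sorted(map(mapper, results), key=lambda kv: kv[0])
availability = dict(reducer(site, [v for _, v in group])
                    for site, group in groupby(mapped, key=lambda kv: kv[0]))
print(availability)    # {'CERN-PROD': 0.666..., 'GRIF': 1.0}
```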
A Recommender System in the Cyber Defense Domain
2014-03-27
monitoring software is a java based program sending updates to the database on the sensor machine. The host monitoring program gathers information about...3.2.2 Database. A MySQL database located on the sensor machine acts as the storage for the sensors on the network. Snort, Nmap, vulnerability scores, and...machine with the IDS and the recommender is labeled “sensor”. The recommender system code is written in java and compiled using java version 1.6.024
Environment/Health/Safety (EHS): Databases
Hazard Documents Database Biosafety Authorization System CATS (Corrective Action Tracking System) (for findings 12/2005 to present) Chemical Management System Electrical Safety Ergonomics Database (for new Learned / Best Practices REMS - Radiation Exposure Monitoring System SJHA Database - Subcontractor Job
Database usage and performance for the Fermilab Run II experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonham, D.; Box, D.; Gallas, E.
2004-12-01
The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has represented major challenges to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described.
Evaluation of NoSQL databases for DIRAC monitoring and beyond
NASA Astrophysics Data System (ADS)
Mathe, Z.; Casajus Ramo, A.; Stagni, F.; Tomassetti, L.
2015-12-01
Nowadays, many database systems are available but they may not be optimized for storing time series data. Monitoring DIRAC jobs would be better done using a database optimised for storing time series data. So far it was done using a MySQL database, which is not well suited for such an application. Therefore alternatives have been investigated. Choosing an appropriate database for storing huge amounts of time series data is not trivial as one must take into account different aspects such as manageability, scalability and extensibility. We compared the performance of Elasticsearch, OpenTSDB (based on HBase) and InfluxDB NoSQL databases, using the same set of machines and the same data. We also evaluated the effort required for maintaining them. Using the LHCb Workload Management System (WMS), based on DIRAC as a use case we set up a new monitoring system, in parallel with the current MySQL system, and we stored the same data into the databases under test. We evaluated Grafana (for OpenTSDB) and Kibana (for ElasticSearch) metrics and graph editors for creating dashboards, in order to have a clear picture on the usability of each candidate. In this paper we present the results of this study and the performance of the selected technology. We also give an outlook of other potential applications of NoSQL databases within the DIRAC project.
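A neutral sketch of the kind of comparison described above is shown below: synthetic job-monitoring points are generated and a backend-specific insert callable is timed. The actual client calls for Elasticsearch, OpenTSDB and InfluxDB differ between their Python libraries, so the backend is left as a stub here; the point layout is invented.

```python
import random
import time

def synthetic_points(n, start=1_420_070_400):
    """Generate n synthetic WMS-job-style time-series points (illustrative)."""
    sites = ["LCG.CERN.ch", "LCG.CNAF.it", "LCG.GRIDKA.de"]
    for i in range(n):
        yield {"timestamp": start + 60 * i,
               "site": random.choice(sites),
               "status": random.choice(["Running", "Done", "Failed"]),
               "jobs": random.randint(0, 500)}

def benchmark(insert_one, n=10_000):
    """Time a backend-specific insert callable over n points."""
    points = list(synthetic_points(n))
    t0 = time.perf_counter()
    for p in points:
        insert_one(p)
    return time.perf_counter() - t0

# Stub backend for demonstration; a real test would pass the insert call of
# the Elasticsearch, OpenTSDB or InfluxDB client library here instead.
store = []
print(f"{benchmark(store.append):.3f} s for 10k synthetic points")
```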
Data Auditor: Analyzing Data Quality Using Pattern Tableaux
NASA Astrophysics Data System (ADS)
Srivastava, Divesh
Monitoring databases maintain configuration and measurement tables about computer systems, such as networks and computing clusters, and serve important business functions, such as troubleshooting customer problems, analyzing equipment failures, planning system upgrades, etc. These databases are prone to many data quality issues: configuration tables may be incorrect due to data entry errors, while measurement tables may be affected by incorrect, missing, duplicate and delayed polls. We describe Data Auditor, a tool for analyzing data quality and exploring data semantics of monitoring databases. Given a user-supplied constraint, such as a boolean predicate expected to be satisfied by every tuple, a functional dependency, or an inclusion dependency, Data Auditor computes "pattern tableaux", which are concise summaries of subsets of the data that satisfy or fail the constraint. We discuss the architecture of Data Auditor, including the supported types of constraints and the tableau generation mechanism. We also show the utility of our approach on an operational network monitoring database.
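A much-simplified sketch of a pattern tableau for a boolean constraint is given below: rows are grouped by a candidate pattern attribute, and the fraction satisfying the constraint is reported for patterns with enough support. Data Auditor's tableau generation handles richer constraint types (functional and inclusion dependencies) and optimizes the tableaux, which this toy version does not.

```python
from collections import defaultdict

# Toy monitoring table: each row is one poll record.
rows = [
    {"router": "nyc-1", "vendor": "acme", "latency_ms": 12},
    {"router": "nyc-2", "vendor": "acme", "latency_ms": 15},
    {"router": "chi-1", "vendor": "zeta", "latency_ms": 250},
    {"router": "chi-2", "vendor": "zeta", "latency_ms": 11},
]

constraint = lambda r: r["latency_ms"] < 100    # predicate every row should satisfy

def pattern_tableau(rows, pattern_attr, constraint, min_support=2):
    """For each value of pattern_attr with enough support, report how many
    rows satisfy the constraint (a much-simplified tableau summary)."""
    groups = defaultdict(list)
    for r in rows:
        groups[r[pattern_attr]].append(constraint(r))
    return {value: (sum(flags), len(flags))
            for value, flags in groups.items() if len(flags) >= min_support}

print(pattern_tableau(rows, "vendor", constraint))
# {'acme': (2, 2), 'zeta': (1, 2)} -> the 'zeta' pattern flags a quality issue
```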
Ahmed, Zeeshan; Zeeshan, Saman; Fleischmann, Pauline; Rössler, Wolfgang; Dandekar, Thomas
2014-01-01
Field studies on arthropod ecology and behaviour require simple and robust monitoring tools, preferably with direct access to an integrated database. We have developed and here present a database tool allowing smart-phone based monitoring of arthropods. This smart phone application provides an easy solution to collect, manage and process the data in the field which has been a very difficult task for field biologists using traditional methods. To monitor our example species, the desert ant Cataglyphis fortis, we considered behavior, nest search runs, feeding habits and path segmentations including detailed information on solar position and azimuth calculation, ant orientation and time of day. For this we established a user friendly database system integrating the Ant-App-DB with a smart phone and tablet application, combining experimental data manipulation with data management and providing solar position and timing estimations without any GPS or GIS system. Moreover, the new desktop application Dataplus allows efficient data extraction and conversion from smart phone application to personal computers, for further ecological data analysis and sharing. All features, software code and database as well as Dataplus application are made available completely free of charge and sufficiently generic to be easily adapted to other field monitoring studies on arthropods or other migratory organisms. The software applications Ant-App-DB and Dataplus described here are developed using the Android SDK, Java, XML, C# and SQLite Database.
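The paper describes Android (Java) and C# applications built on SQLite. Purely for illustration, the sketch below shows an equivalent minimal observation table and insert using Python's sqlite3 module; the schema is invented and is not the Ant-App-DB layout.

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect("ant_observations.db")
conn.execute("""CREATE TABLE IF NOT EXISTS observation (
                    id INTEGER PRIMARY KEY,
                    ant_id TEXT,
                    recorded_at TEXT,
                    behaviour TEXT,          -- e.g. 'nest search run'
                    heading_deg REAL,        -- ant orientation
                    solar_azimuth_deg REAL)""")
conn.execute("INSERT INTO observation (ant_id, recorded_at, behaviour, "
             "heading_deg, solar_azimuth_deg) VALUES (?, ?, ?, ?, ?)",
             ("CF-017", datetime.now().isoformat(timespec="seconds"),
              "nest search run", 132.5, 141.2))
conn.commit()
for row in conn.execute("SELECT ant_id, behaviour, heading_deg FROM observation"):
    print(row)
conn.close()
```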
Shantakumar, Sumitra; Nordstrom, Beth L; Hall, Susan A; Djousse, Luc; van Herk-Sukel, Myrthe P P; Fraeman, Kathy H; Gagnon, David R; Chagin, Karen; Nelson, Jeanenne J
2017-04-20
Pazopanib received US Food and Drug Administration approval in 2009 for advanced renal cell carcinoma. During clinical development, liver chemistry abnormalities and adverse hepatic events were observed, leading to a boxed warning for hepatotoxicity and detailed label prescriber guidelines for liver monitoring. As part of postapproval regulatory commitments, a cohort study was conducted to assess prescriber compliance with liver monitoring guidelines. Over a 4-year period, a distributed network approach was used across 3 databases: US Veterans Affairs Healthcare System, a US outpatient oncology community practice database, and the Dutch PHARMO Database Network. Measures of prescriber compliance were designed using the original pazopanib label guidelines for liver monitoring. Results from the VA (n = 288) and oncology databases (n = 283) indicate that prescriber liver chemistry monitoring was less than 100%: 73% to 74% compliance with baseline testing and 37% to 39% compliance with testing every 4 weeks. Compliance was highest near drug initiation and decreased over time. Among patients who should have had weekly testing, the compliance was 56% in both databases. The more serious elevations examined, including combinations of liver enzyme elevations meeting the laboratory definition of Hy's law, were infrequent but always led to appropriate discontinuation of pazopanib. Only 4 patients were identified for analysis in the Dutch database; none had recorded baseline testing. In this population-based study, prescriber compliance was reasonable near pazopanib initiation but low during subsequent weeks of treatment. This study provides information from real-world community practice settings and offers feedback to regulators on the effectiveness of label monitoring guidelines.
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Building a structured monitoring and evaluating system of postmarketing drug use in Shanghai.
Du, Wenmin; Levine, Mitchell; Wang, Longxing; Zhang, Yaohua; Yi, Chengdong; Wang, Hongmin; Wang, Xiaoyu; Xie, Hongjuan; Xu, Jianglong; Jin, Huilin; Wang, Tongchun; Huang, Gan; Wu, Ye
2007-01-01
In order to understand a drug's full profile in the post-marketing environment, information is needed regarding utilization patterns, beneficial effects, ADRs and economic value. China, the most populated country in the world, has the largest number of people who are taking medications. To begin to appreciate the impact of these medications, a multifunctional evaluation and surveillance system was developed, the Shanghai Drug Monitoring and Evaluative System (SDMES). Set up by the Shanghai Center for Adverse Drug Reaction Monitoring in 2001, the SDMES contains three databases: a population health database of middle-aged and elderly persons; hospital patient medical records; and a spontaneous ADR reporting database. Each person has a unique identification and Medicare number, which permits record-linkage within and between these three databases. After more than three years in development, the population health database has comprehensive data for more than 320,000 residents. The hospital database has two years of inpatient medical records from five major hospitals, and will be increased to 10 hospitals in 2007. The spontaneous ADR reporting database has collected 20,205 cases since 2001 from approximately 295 sources, including hospitals, pharmaceutical companies, drug wholesalers and pharmacies. The SDMES has the potential to become an important national and international pharmacoepidemiology resource for drug evaluation.
Hydroacoustic propagation grids for the CTBT knowledge database, BBN technical memorandum W1303
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Angell
1998-05-01
The Hydroacoustic Coverage Assessment Model (HydroCAM) has been used to develop components of the hydroacoustic knowledge database required by operational monitoring systems, particularly the US National Data Center (NDC). The database, which consists of travel time, amplitude correction and travel time standard deviation grids, is planned to support source location, discrimination and estimation functions of the monitoring network. The grids will also be used under the current BBN subcontract to support an analysis of the performance of the International Monitoring System (IMS) and national sensor systems. This report describes the format and contents of the hydroacoustic knowledgebase grids, and the procedures and model parameters used to generate these grids. Comparisons between the knowledge grids, measured data and other modeled results are presented to illustrate the strengths and weaknesses of the current approach. A recommended approach for augmenting the knowledge database with a database of expected spectral/waveform characteristics is provided in the final section of the report.
Database Performance Monitoring for the Photovoltaic Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Katherine A.
The Database Performance Monitoring (DPM) software (copyright in processes) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
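A minimal sketch of the workflow described above (load a time-indexed CSV, run user-defined quality-control tests, summarize the results) is given below using pandas. The column names, expected sampling interval and valid range are assumptions for illustration; this is not the DPM code itself.

```python
import pandas as pd

def quality_report(csv_path, expected_freq="1min", valid_range=(0.0, 1500.0)):
    """Load a time-indexed CSV and run two simple quality-control tests:
    missing timestamps and out-of-range values. Column names and limits are
    illustrative (e.g. plane-of-array irradiance in W/m^2)."""
    df = pd.read_csv(csv_path, parse_dates=["timestamp"], index_col="timestamp")
    expected = pd.date_range(df.index.min(), df.index.max(), freq=expected_freq)
    missing = expected.difference(df.index)
    lo, hi = valid_range
    out_of_range = df[(df < lo) | (df > hi)].count()
    return {"rows": len(df),
            "missing_timestamps": len(missing),
            "out_of_range_per_column": out_of_range.to_dict()}

# print(quality_report("pv_system_A.csv"))   # hypothetical monitoring file
```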
[Implementation of Oncomelania hupensis monitoring system based on Baidu Map].
Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang
2017-10-25
To construct an Oncomelania hupensis snail monitoring system based on the Baidu Map. Basic environmental information about historical and existing snail environments, etc. was collected together with the monitoring data on different kinds of O. hupensis snails, and the O. hupensis snail monitoring system was then built. Geographic Information System (GIS) technology, electronic fence technology and the Application Program Interface (API) were applied to set up electronic fences for the snail surveillance environments, and the electronic fences were connected to the snail surveillance database. The O. hupensis snail monitoring system based on the Baidu Map was built, including three modules: the O. hupensis Snail Monitoring Environmental Database, the Dynamic Monitoring Platform and the Electronic Map. Information about O. hupensis snail monitoring can be obtained through computers and smartphones simultaneously. The O. hupensis snail monitoring system based on the Baidu Map is a visual platform for following the process of snail searching and molluscaciding.
Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vivek Agarwal; Nancy J. Lybeck; Randall Bickford
Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
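The abstract does not give the AFS Database schema, but the idea of a structured fault signature can be sketched as a simple record type. Field names, fault types, and symptom text below are hypothetical.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical, simplified shape of a fault signature record; the actual
    # AFS Database schema is not described in the abstract.
    @dataclass
    class FaultSignature:
        asset_type: str                       # e.g. "generator step-up transformer"
        fault_type: str                       # e.g. "overheating"
        symptoms: List[str] = field(default_factory=list)            # observable indicators
        confirmatory_tests: List[str] = field(default_factory=list)  # how to verify the fault

    sig = FaultSignature(
        asset_type="generator step-up transformer",
        fault_type="overheating",
        symptoms=["rising top-oil temperature", "dissolved-gas trend"],
        confirmatory_tests=["dissolved gas analysis"],
    )
    print(sig.fault_type, "->", sig.symptoms)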
Bianca N. I. Eskelson; Hailemariam Temesgen; Valerie Lemay; Tara M. Barrett; Nicholas L. Crookston; Andrew T. Hudak
2009-01-01
Almost universally, forest inventory and monitoring databases are incomplete, ranging from missing data for only a few records and a few variables, common for small land areas, to missing data for many observations and many variables, common for large land areas. For a wide variety of applications, nearest neighbor (NN) imputation methods have been developed to fill in...
Zampieri, Fernando Godinho; Soares, Márcio; Borges, Lunna Perdigão; Salluh, Jorge Ibrain Figueira; Ranzani, Otávio Tavares
2017-01-01
To describe the Epimed Monitor Database®, a Brazilian intensive care unit quality improvement database. We described the Epimed Monitor® Database, including its structure and core data. We presented aggregated informative data from intensive care unit admissions from 2010 to 2016 using descriptive statistics. We also described the expansion and growth of the database along with the geographical distribution of participating units in Brazil. The core data from the database includes demographic, administrative and physiological parameters, as well as specific report forms used to gather detailed data regarding the use of intensive care unit resources, infectious episodes, adverse events and checklists for adherence to best clinical practices. As of the end of 2016, 598 adult intensive care units in 318 hospitals totaling 8,160 intensive care unit beds were participating in the database. Most units were located at private hospitals in the southeastern region of the country. The number of yearly admissions rose during this period and included a predominance of medical admissions. The proportion of admissions due to cardiovascular disease declined, while admissions due to sepsis or infections became more common. Illness severity (Simplified Acute Physiology Score 3 [SAPS 3] = 62 points), patient age (mean = 62 years) and hospital mortality (approximately 17%) remained reasonably stable during this time period. A large private database of critically ill patients is feasible and may provide relevant nationwide epidemiological data for quality improvement and benchmarking purposes among the participating intensive care units. This database is useful not only for administrative reasons but also for the improvement of daily care by facilitating the adoption of best practices and use for clinical research.
Monitoring product safety in the postmarketing environment.
Sharrar, Robert G; Dieck, Gretchen S
2013-10-01
The safety profile of a medicinal product may change in the postmarketing environment. Safety issues not identified in clinical development may be seen and need to be evaluated. Methods of evaluating spontaneous adverse experience reports and identifying new safety risks include a review of individual reports, a review of a frequency distribution of a list of the adverse experiences, the development and analysis of a case series, and various ways of examining the database for signals of disproportionality, which may suggest a possible association. Regulatory agencies monitor product safety through a variety of mechanisms including signal detection of the adverse experience safety reports in databases and by requiring and monitoring risk management plans, periodic safety update reports and postauthorization safety studies. The United States Food and Drug Administration is working with public, academic and private entities to develop methods for using large electronic databases to actively monitor product safety. Important identified risks will have to be evaluated through observational studies and registries.
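The abstract mentions disproportionality analysis only generically; one widely used measure is the proportional reporting ratio (PRR), computed from a 2x2 table of report counts. A minimal sketch follows; the counts are hypothetical.

    # Proportional reporting ratio (PRR), one common disproportionality measure.
    # Report counts below are hypothetical illustration data.
    a = 20     # reports with drug of interest AND event of interest
    b = 980    # reports with drug of interest, other events
    c = 200    # reports with other drugs AND event of interest
    d = 98800  # reports with other drugs, other events

    prr = (a / (a + b)) / (c / (c + d))
    print(f"PRR = {prr:.2f}")   # values well above 1 suggest a possible signal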
Participation in a national nursing outcomes database: monitoring outcomes over time.
Loan, Lori A; Patrician, Patricia A; McCarthy, Mary
2011-01-01
The current and future climates in health care require increased accountability of health care organizations for the quality of the care they provide. Never before in the history of health care in America has this focus on quality been so critical. The imperative to measure nursing's impact without fully developed and tested monitoring systems is a critical issue for nurse executives and managers alike. This article describes a project to measure nursing structure, process, and outcomes in the military health system, the Military Nursing Outcomes Database project. Here we review the effectiveness of this project in monitoring changes over time, in satisfying expectations of nurse leaders in participating hospitals, and in evaluating the potential budgetary impacts of such a system. We conclude that the Military Nursing Outcomes Database did meet the needs of a monitoring system that is sensitive to changes over time in outcomes, provides interpretable data for nurse leaders, and could result in cost benefits and patient care improvements in organizations.
Charlton, R A; Bettoli, V; Bos, H J; Engeland, A; Garne, E; Gini, R; Hansen, A V; de Jong-van den Berg, L T W; Jordan, S; Klungsøyr, K; Neville, A J; Pierini, A; Puccini, A; Sinclair, M; Thayer, D; Dolk, H
2018-04-01
Pregnancy prevention programmes (PPPs) exist for some medicines known to be highly teratogenic. It is increasingly recognised that the impact of these risk minimisation measures requires periodic evaluation. This study aimed to assess the extent to which some of the data needed to monitor the effectiveness of PPPs may be present in European healthcare databases. An inventory was completed for databases contributing to EUROmediCAT capturing pregnancy and prescription data in Denmark, Norway, the Netherlands, Italy (Tuscany/Emilia Romagna), Wales and the rest of the UK, to determine the extent of data collected that could be used to evaluate the impact of PPPs. Data availability varied between databases. All databases could be used to identify the frequency and duration of prescriptions to women of childbearing age from primary care, but there were specific issues with availability of data from secondary care and private care. To estimate the frequency of exposed pregnancies, all databases could be linked to pregnancy data, but the accuracy of timing of the start of pregnancy was variable, and data on pregnancies ending in induced abortions were often not available. Data availability on contraception to estimate compliance with contraception requirements was variable and no data were available on pregnancy tests. Current electronic healthcare databases do not contain all the data necessary to fully monitor the effectiveness of PPP implementation, and thus, special data collection measures need to be instituted.
Remote monitoring of patients with implanted devices: data exchange and integration.
Van der Velde, Enno T; Atsma, Douwe E; Foeken, Hylke; Witteman, Tom A; Hoekstra, Wybo H G J
2013-06-01
Remote follow-up of implantable cardioverter-defibrillators (ICDs) may offer a solution to the problem of overcrowded outpatient clinics, and may also be effective in detecting clinical events early. Data obtained from remote follow-up systems, as developed by all major device companies, are stored in a central database system, operated and owned by the device company. A problem now arises that the patient's clinical information is partly stored in the local electronic health record (EHR) system in the hospital, and partly in the remote monitoring database, which may potentially result in patient safety issues. To address the requirement of integrating remote monitoring data in the local EHR, the Integrating the Healthcare Enterprise (IHE) Implantable Device Cardiac Observation (IDCO) profile has been developed. This IHE IDCO profile has been adopted by all major device companies. In our hospital, we have implemented the IHE IDCO profile to import data from the remote databases from two device vendors into the departmental Cardiology Information System (EPD-Vision). Data is exchanged via an HL7/XML communication protocol, as defined in the IHE IDCO profile. By implementing the IHE IDCO profile, we have been able to integrate the data from the remote monitoring databases in our local EHRs. It can be expected that remote monitoring systems will develop into dedicated monitoring and therapy platforms. Data retrieved from these systems should form an integral part of the electronic patient record as more and more out-patient clinic care will shift to personalized care provided at a distance, in other words at the patient's home.
BBN technical memorandum W1291 infrasound model feasibility study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, T., BBN Systems and Technologies
1998-05-01
The purpose of this study is to determine the need and level of effort required to add existing atmospheric databases and infrasound propagation models to the DOE's Hydroacoustic Coverage Assessment Model (HydroCAM) [1,2]. The rationale for the study is that the performance of the infrasound monitoring network will be an important factor for both the International Monitoring System (IMS) and US national monitoring capability. Many of the technical issues affecting the design and performance of the infrasound network are directly related to the variability of the atmosphere and the corresponding uncertainties in infrasound propagation. It is clear that the study of these issues will be enhanced by the availability of software tools for easy manipulation and interfacing of various atmospheric databases and infrasound propagation models. In addition, since there are many similarities between propagation in the oceans and in the atmosphere, it is anticipated that much of the software infrastructure developed for hydroacoustic database manipulation and propagation modeling in HydroCAM will be directly extendible to an infrasound capability. The study approach was to talk to the acknowledged domain experts in the infrasound monitoring area to determine: 1. The major technical issues affecting infrasound monitoring network performance. 2. The need for an atmospheric database/infrasound propagation modeling capability similar to HydroCAM. 3. The state of existing infrasound propagation codes and atmospheric databases. 4. A recommended approach for developing the required capabilities. A list of the people who contributed information to this study is provided in Table 1. We also relied on our knowledge of oceanographic and meteorological data sources to determine the availability of atmospheric databases and the feasibility of incorporating this information into the existing HydroCAM geographic database software. This report presents a summary of the need for an integrated infrasound modeling capability in Section 2.0. Section 3.0 provides a recommended approach for developing this capability in two stages: a basic capability and an extended capability. This section includes a discussion of the available static and dynamic databases, and the various modeling tools which are available or could be developed under such a task. The conclusions and recommendations of the study are provided in Section 4.0.
Nørgaard, M; Johnsen, S P
2016-02-01
In Denmark, the need for monitoring of clinical quality and patient safety with feedback to the clinical, administrative and political systems has resulted in the establishment of a network of more than 60 publicly financed nationwide clinical quality databases. Although primarily devoted to monitoring and improving quality of care, the potential of these databases as data sources in clinical research is increasingly being recognized. In this review, we describe these databases focusing on their use as data sources for clinical research, including their strengths and weaknesses as well as future concerns and opportunities. The research potential of the clinical quality databases is substantial but has so far only been explored to a limited extent. Efforts related to technical, legal and financial challenges are needed in order to take full advantage of this potential. © 2016 The Association for the Publication of the Journal of Internal Medicine.
A plan for the North American Bat Monitoring Program (NABat)
Loeb, Susan C.; Rodhouse, Thomas J.; Ellison, Laura E.; Lausen, Cori L.; Reichard, Jonathan D.; Irvine, Kathryn M.; Ingersoll, Thomas E.; Coleman, Jeremy; Thogmartin, Wayne E.; Sauer, John R.; Francis, Charles M.; Bayless, Mylea L.; Stanley, Thomas R.; Johnson, Douglas H.
2015-01-01
The purpose of the North American Bat Monitoring Program (NABat) is to create a continent-wide program to monitor bats at local to rangewide scales that will provide reliable data to promote effective conservation decisionmaking and the long-term viability of bat populations across the continent. This is an international, multiagency program. Four approaches will be used to gather monitoring data to assess changes in bat distributions and abundances: winter hibernaculum counts, maternity colony counts, mobile acoustic surveys along road transects, and acoustic surveys at stationary points. These monitoring approaches are described along with methods for identifying species recorded by acoustic detectors. Other chapters describe the sampling design, the database management system (Bat Population Database), and statistical approaches that can be used to analyze data collected through this program.
Chen, Huan-Sheng; Cheng, Chun-Ting; Hou, Chun-Cheng; Liou, Hung-Hsiang; Chang, Cheng-Tsung; Lin, Chun-Ju; Wu, Tsai-Kun; Chen, Chang-Hsu; Lim, Paik-Seong
2017-07-01
Rapid screening and monitoring of nutritional status is mandatory in the hemodialysis population because of the increasingly encountered nutritional problems. Considering the limitations of previous composite nutrition scores applied in this population, we tried to develop a standardized composite nutrition score (SCNS) using low lean tissue index as a marker of protein wasting to facilitate clinical screening and monitoring and to predict outcome. This retrospective cohort study used two databases of dialysis populations from Taiwan between 2011 and 2014. The first database, consisting of data from 629 maintenance hemodialysis patients, was used to develop the SCNS, and the second database, containing data from 297 maintenance hemodialysis patients, was used to validate the developed score. An SCNS containing albumin, creatinine, potassium, and body mass index was developed from the first database using low lean tissue index as a marker of protein wasting. When applying this score in the original database, a significantly higher risk of developing protein wasting was found for patients with lower SCNS (odds ratio 1.38 [middle tertile vs highest tertile, P < .0001] and 2.40 [lowest tertile vs middle tertile, P < .0001]). The risk of death was also shown to be higher for patients with lower SCNS (hazard ratio 4.45 [below median level vs above median level, P < .0001]). These results were validated in the second database. We developed an SCNS consisting of 4 easily available biochemical parameters. This kind of scoring system can be easily applied in different dialysis facilities for screening and monitoring of protein wasting. The wide application of body composition monitors in the dialysis population will also facilitate the development of facility-specific nutrition scoring models. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
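The abstract does not give the actual SCNS weighting, so the following is only an illustrative composite built from standardized (z-scored) components. The reference means, standard deviations, signs, and patient values are all assumptions.

    # Illustrative composite score from z-scored components; NOT the published
    # SCNS formula. All reference values and the direction of scoring are
    # assumptions for illustration only.
    def z(x, mean, sd):
        return (x - mean) / sd

    patient = {"albumin": 3.6, "creatinine": 9.0, "potassium": 4.5, "bmi": 22.0}
    reference = {  # hypothetical cohort means and standard deviations
        "albumin": (3.8, 0.4), "creatinine": (9.5, 2.5),
        "potassium": (4.6, 0.7), "bmi": (23.0, 3.5),
    }
    score = sum(z(patient[k], *reference[k]) for k in patient)
    print(f"composite score = {score:.2f}")  # lower values would flag higher risk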
Monitoring of IaaS and scientific applications on the Cloud using the Elasticsearch ecosystem
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.
2015-05-01
The private Cloud at the Torino INFN computing centre offers IaaS services to different scientific computing applications. The infrastructure is managed with the OpenNebula cloud controller. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BES-III collaboration, plus an increasing number of other small tenants. Besides keeping track of the usage, the automation of dynamic allocation of resources to tenants requires detailed monitoring and accounting of the resource usage. As a first investigation towards this, we set up a monitoring system to inspect the site activities both in terms of IaaS and applications running on the hosted virtual instances. For this purpose we used the Elasticsearch, Logstash and Kibana stack. In the current implementation, the heterogeneous accounting information is fed to different MySQL databases and sent to Elasticsearch via a custom Logstash plugin. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS level information gathered through the API is sent to the MySQL database through an ad-hoc developed RESTful web service, which is also used for other accounting purposes. Concerning the application level, we used the Root plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BES-III virtual instances used to be monitored with Zabbix; as a proof of concept, we also retrieve the information contained in the Zabbix database. Each of these three cases is indexed separately in Elasticsearch. We are now starting to consider dropping the intermediate layer provided by the SQL database and evaluating a NoSQL option as a unique central database for all the monitoring information. We set up a set of Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case. In this way we have achieved a uniform monitoring interface for both the IaaS and the scientific applications, mostly leveraging off-the-shelf tools.
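The indexing step described above can be illustrated by pushing one accounting record into Elasticsearch over its REST API. This is not the site's custom Logstash plugin; the host, index name, document path, and fields are assumptions.

    import json
    import urllib.request

    # Minimal sketch of indexing one accounting record in Elasticsearch via its
    # HTTP API; host, index name, and fields are hypothetical.
    doc = {"tenant": "alice-tier2", "vm_hours": 12.5, "timestamp": "2015-05-01T00:00:00Z"}
    req = urllib.request.Request(
        "http://localhost:9200/iaas-accounting/_doc",
        data=json.dumps(doc).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)  # 201 on success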
Use of electronic medical record data for quality improvement in schizophrenia treatment.
Owen, Richard R; Thrush, Carol R; Cannon, Dale; Sloan, Kevin L; Curran, Geoff; Hudson, Teresa; Austen, Mark; Ritchie, Mona
2004-01-01
An understanding of the strengths and limitations of automated data is valuable when using administrative or clinical databases to monitor and improve the quality of health care. This study discusses the feasibility and validity of using data electronically extracted from the Veterans Health Administration (VHA) computer database (VistA) to monitor guideline performance for inpatient and outpatient treatment of schizophrenia. The authors also discuss preliminary results and their experience in applying these methods to monitor antipsychotic prescribing using the South Central VA Healthcare Network (SCVAHCN) Data Warehouse as a tool for quality improvement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevins, N; Vanderhoek, M; Lang, S
2014-06-15
Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
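GSDF conformity checks like those mentioned above are based on the Grayscale Standard Display Function defined in DICOM PS3.14, which maps a just-noticeable-difference (JND) index to luminance. The sketch below is illustrative, not the pacsDisplay code; the coefficient values are transcribed from the standard and should be verified against it before any clinical use.

    import math

    # GSDF luminance for a given JND index (DICOM PS3.14). Coefficients should
    # be checked against the standard; this is an illustrative sketch only.
    A, B, C, D, E = -1.3011877, -2.5840191e-2, 8.0242636e-2, -1.0320229e-1, 1.3646699e-1
    F, G, H, K, M = 2.8745620e-2, -2.5468404e-2, -3.1978977e-3, 1.2992634e-4, 1.3635334e-3

    def gsdf_luminance(j):
        """Luminance in cd/m^2 for JND index j (1 <= j <= 1023)."""
        x = math.log(j)
        num = A + C * x + E * x**2 + G * x**3 + M * x**4
        den = 1 + B * x + D * x**2 + F * x**3 + H * x**4 + K * x**5
        return 10 ** (num / den)

    print(gsdf_luminance(1), gsdf_luminance(1023))  # roughly 0.05 and 4000 cd/m^2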
NASA Technical Reports Server (NTRS)
Iverson, David L. (Inventor)
2008-01-01
The present invention relates to an Inductive Monitoring System (IMS), its software implementations, hardware embodiments and applications. Training data is received, typically nominal system data acquired from sensors in normally operating systems or from detailed system simulations. The training data is formed into vectors that are used to generate a knowledge database having clusters of nominal operating regions therein. IMS monitors a system's performance or health by comparing cluster parameters in the knowledge database with incoming sensor data from a monitored system formed into vectors. Nominal performance is concluded when a monitored-system vector is determined to lie within a nominal operating region cluster or lies sufficiently close to such a cluster, as determined by a threshold value and a distance metric. Some embodiments of IMS include cluster indexing and retrieval methods that increase the execution speed of IMS.
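A toy analog of the approach described in the patent abstract is sketched below: learn clusters from nominal training vectors, then flag monitored vectors that fall too far from every cluster. The cluster count, distance threshold, and synthetic data are illustrative, not taken from the patent.

    import numpy as np

    # Toy inductive-monitoring analog: nominal clusters + distance threshold.
    rng = np.random.default_rng(0)
    training = rng.normal(loc=[1.0, 5.0], scale=0.1, size=(500, 2))  # nominal vectors

    k = 4
    centers = training[rng.choice(len(training), k, replace=False)]
    for _ in range(20):  # plain k-means iterations
        labels = np.argmin(np.linalg.norm(training[:, None] - centers, axis=2), axis=1)
        centers = np.array([training[labels == i].mean(axis=0) for i in range(k)])

    threshold = 0.5  # assumed distance threshold for "sufficiently close"
    def is_nominal(vector):
        return np.min(np.linalg.norm(centers - vector, axis=1)) < threshold

    print(is_nominal(np.array([1.0, 5.0])))   # True  (within nominal region)
    print(is_nominal(np.array([3.0, 2.0])))   # False (anomalous)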
Using Statistics for Database Management in an Academic Library.
ERIC Educational Resources Information Center
Hyland, Peter; Wright, Lynne
1996-01-01
Collecting statistical data about database usage by library patrons aids in the management of CD-ROM and database offerings, collection development, and evaluation of training programs. Two approaches to data collection are presented which should be used together: an automated or nonintrusive method which monitors search sessions while the…
Long-term ecosystem monitoring and change detection: the Sonoran initiative
Robert Lozar; Charles Ehlschlaeger
2005-01-01
Ecoregional Systems Heritage and Encroachment Monitoring (ESHEM) examines issues of land management at an ecosystem level using remote sensing. Engineer Research and Development Center (ERDC), in partnership with Western Illinois University, has developed an ecoregional database and monitoring capability covering the Sonoran region. The monitoring time horizon will...
ERIC Educational Resources Information Center
Chavez-Gibson, Sarah
2013-01-01
The purpose of this study is to examine in depth the Comprehensive, Powerful, Academic Database (CPAD), a data decision-making tool that determines and identifies students at risk of dropping out of school, and how the CPAD assists administrators and teachers at an elementary campus to monitor progress, curriculum, and performance to improve student…
Dietary Exposure Potential Model
Existing food consumption and contaminant residue databases, typically products of nutrition and regulatory monitoring, contain useful information to characterize dietary intake of environmental chemicals. A PC-based model with resident database system, termed the Die...
Rooijakkers, Michiel; Rabotti, Chiara; Bennebroek, Martijn; van Meerbergen, Jef; Mischi, Massimo
2011-01-01
Non-invasive fetal health monitoring during pregnancy has become increasingly important. Recent advances in signal processing technology have enabled fetal monitoring during pregnancy using abdominal ECG recordings. Ubiquitous ambulatory monitoring for continuous fetal health measurement is, however, still unfeasible due to the computational complexity of noise-robust solutions. In this paper an ECG R-peak detection algorithm for ambulatory use is proposed, as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity, while increasing the R-peak detection quality compared to existing R-peak detection schemes. Validation of the algorithm is performed on two manually annotated datasets, the MIT/BIH Arrhythmia database and an in-house abdominal database. Both R-peak detection quality and computational complexity are compared to state-of-the-art algorithms as described in the literature. With a detection error rate of 0.22% and 0.12% on the MIT/BIH Arrhythmia and in-house databases, respectively, the quality of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
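For orientation, a generic R-peak detector of the classic band-pass-plus-threshold style is sketched below. This is not the low-complexity algorithm of the paper; the sampling rate matches the MIT/BIH database, but the synthetic signal, filter band, and threshold are assumptions.

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    # Generic R-peak detector: band-pass filter, squared energy, peak picking.
    fs = 360                                    # Hz, as used by the MIT/BIH database
    t = np.arange(0, 10, 1 / fs)
    ecg = np.sin(2 * np.pi * 1.2 * t) ** 63     # crude synthetic signal with sharp peaks

    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    energy = filtfilt(b, a, ecg) ** 2

    peaks, _ = find_peaks(energy,
                          height=0.3 * energy.max(),   # amplitude threshold (assumed)
                          distance=int(0.25 * fs))     # refractory period ~250 ms
    print(f"detected {len(peaks)} peaks in 10 s")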
Processing and Quality Monitoring for the ATLAS Tile Hadronic Calorimeter Data
NASA Astrophysics Data System (ADS)
Burghgrave, Blake; ATLAS Collaboration
2017-10-01
An overview is presented of Data Processing and Data Quality (DQ) Monitoring for the ATLAS Tile Hadronic Calorimeter. Calibration runs are monitored from a data quality perspective and used as a cross-check for physics runs. Data quality in physics runs is monitored extensively and continuously. Any problems are reported and immediately investigated. The DQ efficiency achieved was 99.6% in 2012 and 100% in 2015, after the detector maintenance in 2013-2014. Changes to detector status or calibrations are entered into the conditions database (DB) during a brief calibration loop between the end of a run and the beginning of bulk processing of data collected in it. Bulk processed data are reviewed and certified for the ATLAS Good Run List if no problem is detected. Experts maintain the tools used by DQ shifters and the calibration teams during normal operation, and prepare new conditions for data reprocessing and Monte Carlo (MC) production campaigns. Conditions data are stored in 3 databases: Online DB, Offline DB for data and a special DB for Monte Carlo. Database updates can be performed through a custom-made web interface.
Christopher W. Woodall; Barbara L. Conkling; Michael C. Amacher; John W. Coulston; Sarah Jovan; Charles H. Perry; Beth Schulz; Gretchen C. Smith; Susan Will Wolf
2010-01-01
Describes the structure of the Forest Inventory and Analysis Database (FIADB) 4.0 for phase 3 indicators. The FIADB structure provides a consistent framework for storing forest health monitoring data across all ownerships for the entire United States. These data are available to the public.
EPA’s RadNet data are available for viewing in a searchable database or as PDF reports. Historical and current RadNet monitoring data are used to estimate long-term trends in environmental radiation levels.
NASA Astrophysics Data System (ADS)
Cavallari, Francesca; de Gruttola, Michele; Di Guida, Salvatore; Govi, Giacomo; Innocente, Vincenzo; Pfeiffer, Andreas; Pierro, Antonio
2011-12-01
Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation the database monitor is used to provide simple timing information and the history of all transactions for all database accounts, and in the case of faults it is used to return simple error messages and more complete debugging information.
Multi-parameter vital sign database to assist in alarm optimization for general care units.
Welch, James; Kanter, Benjamin; Skora, Brooke; McCombie, Scott; Henry, Isaac; McCombie, Devin; Kennedy, Rosemary; Soller, Babs
2016-12-01
Continual vital sign assessment on the general care, medical-surgical floor is expected to provide early indication of patient deterioration and increase the effectiveness of rapid response teams. However, there is concern that continual, multi-parameter vital sign monitoring will produce alarm fatigue. The objective of this study was the development of a methodology to help care teams optimize alarm settings. An on-body wireless monitoring system was used to continually assess heart rate, respiratory rate, SpO2 and noninvasive blood pressure in the general ward of ten hospitals between April 1, 2014 and January 19, 2015. These data, 94,575 h for 3,430 patients, are contained in a large database, accessible with cloud computing tools. Simulation scenarios assessed the total alarm rate as a function of threshold and annunciation delay (s). The total alarm rate of ten alarms/patient/day predicted from the cloud-hosted database was the same as the total alarm rate for a 10-day evaluation (1550 h for 36 patients) in an independent hospital. Plots of vital sign distributions in the cloud-hosted database were similar to other large databases published by different authors. The cloud-hosted database can be used to run simulations for various alarm thresholds and annunciation delays to predict the total alarm burden experienced by nursing staff. This methodology might, in the future, be used to help reduce alarm fatigue without sacrificing the ability to continually monitor all vital signs.
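The kind of alarm-rate simulation described can be sketched in a few lines: count alarms that fire only when a threshold violation persists past an annunciation delay. The synthetic SpO2 stream, sampling interval, threshold, and delay below are assumptions, not values from the study.

    import numpy as np

    # Count alarms for one threshold and annunciation delay over a day of data.
    rng = np.random.default_rng(1)
    spo2 = 96 + rng.normal(0, 1.5, size=24 * 60 * 4)   # one sample every 15 s for 24 h

    threshold = 90           # alarm if SpO2 < 90 %
    delay_samples = 8        # annunciation delay of 120 s at 15 s per sample

    below = spo2 < threshold
    alarms, run = 0, 0
    for flag in below:
        run = run + 1 if flag else 0
        if run == delay_samples:   # alarm only once per sustained violation
            alarms += 1
    print(f"{alarms} alarms per patient-day at threshold {threshold} %")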
Integrated cluster management at Manchester
NASA Astrophysics Data System (ADS)
McNab, Andrew; Forti, Alessandra
2012-12-01
We describe an integrated management system using third-party, open source components used in operating a large Tier-2 site for particle physics. This system tracks individual assets and records their attributes such as MAC and IP addresses; derives DNS and DHCP configurations from this database; creates each host's installation and re-configuration scripts; monitors the services on each host according to the records of what should be running; and cross references tickets with asset records and per-asset monitoring pages. In addition, scripts which detect problems and automatically remove hosts record these new states in the database which are available to operators immediately through the same interface as tickets and monitoring.
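The step of deriving network configuration from the asset database can be illustrated with a small sketch. The schema and the dnsmasq-style output format are assumptions, not the site's actual tooling.

    import sqlite3

    # Illustrative derivation of DHCP host entries from an asset database.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE assets (hostname TEXT, mac TEXT, ip TEXT)")
    db.execute("INSERT INTO assets VALUES ('wn001', 'aa:bb:cc:dd:ee:01', '10.0.0.11')")
    db.execute("INSERT INTO assets VALUES ('wn002', 'aa:bb:cc:dd:ee:02', '10.0.0.12')")

    for hostname, mac, ip in db.execute("SELECT hostname, mac, ip FROM assets"):
        print(f"dhcp-host={mac},{hostname},{ip}")   # one dnsmasq-style line per asset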
OVERSEER: An Expert System Monitor for the Psychiatric Hospital
Bronzino, Joseph D.; Morelli, Ralph A.; Goethe, John W.
1988-01-01
In order to improve patient care, comply with regulatory guidelines and decrease potential liability, psychiatric hospitals and clinics have been searching for computer systems to monitor the management and treatment of patients. This paper describes OVERSEER: a knowledge based system that monitors the treatment of psychiatric patients in real time. Based on procedures and protocols developed in the psychiatric setting, OVERSEER monitors the clinical database and issues alerts when standard clinical practices are not followed or when laboratory results or other clinical indicators are abnormal. Written in PROLOG, OVERSEER is designed to interface directly with the hospital's database, and, thereby utilizes all available pharmacy and laboratory data. Moreover, unlike the interactive expert systems developed for the psychiatric clinic, OVERSEER does not require extensive data entry by the clinician. Consequently, the chief benefit of OVERSEER's monitoring approach is the unobtrusive manner in which it evaluates treatment and patient responses and provides information regarding patient management.
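OVERSEER itself is written in PROLOG; purely as an analog, the flavour of such a rule can be sketched in Python. The drug, laboratory field, threshold, and patient records below are hypothetical.

    # Plain-Python analog of an OVERSEER-style rule: alert when a lab value for
    # a monitored drug is out of range. All values are hypothetical.
    patients = [
        {"id": 1, "drug": "lithium", "lithium_level": 1.6},
        {"id": 2, "drug": "lithium", "lithium_level": 0.8},
    ]

    def check_lithium(p, upper_limit=1.2):
        if p["drug"] == "lithium" and p["lithium_level"] > upper_limit:
            return f"ALERT patient {p['id']}: lithium level {p['lithium_level']} above {upper_limit}"
        return None

    for p in patients:
        alert = check_lithium(p)
        if alert:
            print(alert)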
49 CFR 384.229 - Skills test examiner auditing and monitoring.
Code of Federal Regulations, 2011 CFR
2011-10-01
... must be performed at least once every year; (c) Establish and maintain a database to track pass/fail... maintain a database of all third party testers and examiners, which at a minimum tracks the dates and... and maintain a database of all State CDL skills examiners, which at a minimum tracks the dates and...
49 CFR 384.229 - Skills test examiner auditing and monitoring.
Code of Federal Regulations, 2012 CFR
2012-10-01
... must be performed at least once every year; (c) Establish and maintain a database to track pass/fail... maintain a database of all third party testers and examiners, which at a minimum tracks the dates and... and maintain a database of all State CDL skills examiners, which at a minimum tracks the dates and...
USDA-ARS?s Scientific Manuscript database
The sodium concentration (mg/100g) for 23 of 125 Sentinel Foods were identified in the 2009 CDC Packaged Food Database (PFD) and compared with data in the USDA’s 2013 Standard Reference 26 (SR 26) database. Sentinel Foods are foods and beverages identified by USDA to be monitored as primary indicat...
Use of a Relational Database to Support Clinical Research: Application in a Diabetes Program
Lomatch, Diane; Truax, Terry; Savage, Peter
1981-01-01
A database has been established to support conduct of clinical research and monitor delivery of medical care for 1200 diabetic patients as part of the Michigan Diabetes Research and Training Center (MDRTC). Use of an intelligent microcomputer to enter and retrieve the data and use of a relational database management system (DBMS) to store and manage data have provided a flexible, efficient method of achieving both support of small projects and monitoring overall activity of the Diabetes Center Unit (DCU). Simplicity of access to data, efficiency in providing data for unanticipated requests, ease of manipulations of relations, security and “logical data independence” were important factors in choosing a relational DBMS. The ability to interface with an interactive statistical program and a graphics program is a major advantage of this system. Our database currently provides support for the operation and analysis of several ongoing research projects.
Regular monitoring of breast-feeding rates: feasible and sustainable. The Emilia-Romagna experience.
Di Mario, Simona; Borsari, Silvana; Verdini, Eleonora; Battaglia, Sergio; Cisbani, Luca; Sforza, Stefano; Cuoghi, Chiara; Basevi, Vittorio
2017-08-01
An efficient breast-feeding monitoring system should be in place in every country to assist policy makers and health professionals in planning activities to reach optimal breast-feeding rates. Design/Setting/Subjects: From March to June 2015, breast-feeding rates at 3 and 5 months of age were monitored in Emilia-Romagna, an Italian region, using four questions added to a newly developed paediatric immunization database with single records for each individual. Data were collected at primary-care centres. Breast-feeding definitions and 24 h recall as recommended by the WHO were used. Direct age standardization was applied to breast-feeding rates. Record linkage with the medical birth database was attempted to identify maternal, pregnancy and delivery factors associated with full breast-feeding rates at 3 and 5 months of age. Data on breast-feeding were collected for 14,044 infants. The mean regional full breast-feeding rate at 3 months was 52 %; differences between local health authorities ranged from 42 to 62 %. At 5 months of age, the mean regional full breast-feeding rate dropped to 33 % (range between local health authorities: 26 to 46 %). Record linkage with the birth certificate database was successful for 93 % of records. Total observations more than doubled with respect to the previous regional survey. The new monitoring system implemented in 2015 in the Emilia-Romagna region, totally integrated with the immunization database, has proved to be feasible, sustainable and more efficient than the previous one. This system can be a model for other regions and countries where the vast majority of mothers obtain vaccinations from public health facilities and that already have an immunization database in place.
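Direct age standardization, mentioned above, is simply a weighted average of group-specific rates using a reference population's weights. A minimal sketch follows; the age groups, crude rates, and standard weights are hypothetical.

    # Direct standardization of rates: weight group-specific rates by a
    # reference population. All numbers below are hypothetical.
    crude_rates = {"<25": 0.45, "25-34": 0.55, ">=35": 0.50}       # full breast-feeding
    standard_weights = {"<25": 0.20, "25-34": 0.55, ">=35": 0.25}  # reference maternal ages

    standardized = sum(crude_rates[g] * standard_weights[g] for g in crude_rates)
    print(f"directly standardized rate = {standardized:.3f}")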
Information support of monitoring of technical condition of buildings in construction risk area
NASA Astrophysics Data System (ADS)
Skachkova, M. E.; Lepihina, O. Y.; Ignatova, V. V.
2018-05-01
The paper presents the results of research devoted to the development of a model of information support for monitoring the technical condition of buildings located in a construction risk area. As a result of the visual and instrumental survey, as well as the analysis of existing approaches and techniques, attributive and cartographic databases have been created. These databases allow monitoring of defects and damage to buildings located in a 30-meter risk area around the object under construction. The classification of the structures and defects of the buildings under survey is presented. The functional capabilities of the developed model and the field of its practical applications are determined.
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.
2015-12-01
The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated. This feature requires detailed monitoring and accounting of the resource usage. We set up a monitoring framework to inspect the site activities both in terms of IaaS and applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure flexibility to choose a different monitoring solution if needed. The heterogeneous accounting information is transferred from the database to the ElasticSearch engine via a custom Logstash plugin. Each use-case is indexed separately in ElasticSearch and we set up a set of Kibana dashboards with pre-defined queries in order to monitor the relevant information in each case. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS level information gathered through the API is sent to the MySQL database through an ad-hoc developed RESTful web service. Moreover, we have developed a billing system for our private Cloud, which relies on the RabbitMQ message queue for asynchronous communication to the database and on the ELK stack for its graphical interface. The Italian Grid accounting framework is also migrating to a similar set-up. Concerning the application level, we used the Root plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. The BESIII virtual instances used to be monitored with Zabbix; as a proof of concept, we also retrieve the information contained in the Zabbix database. In this way we have achieved a uniform monitoring interface for both the IaaS and the scientific applications, mostly leveraging off-the-shelf tools. At present, we are working to define a model for monitoring-as-a-service, based on the tools described above, which the Cloud tenants can easily configure to suit their specific needs.
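The asynchronous communication via RabbitMQ mentioned above can be illustrated with a minimal publisher using the pika client. This is not the site's billing code; the queue name, host, and record fields are assumptions.

    import json
    import pika  # RabbitMQ client

    # Minimal sketch of sending one accounting record to RabbitMQ for
    # asynchronous processing; all names and values are hypothetical.
    record = {"tenant": "bes3-tier2", "cpu_hours": 480.0, "period": "2015-11"}

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="billing", durable=True)
    channel.basic_publish(exchange="", routing_key="billing",
                          body=json.dumps(record).encode("utf-8"))
    connection.close()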
Monitoring tools of COMPASS experiment at CERN
NASA Astrophysics Data System (ADS)
Bodlak, M.; Frolov, V.; Huber, S.; Jary, V.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Tomsa, J.; Virius, M.
2015-12-01
This paper briefly introduces the data acquisition system of the COMPASS experiment and focuses mainly on the part responsible for monitoring the nodes in the whole newly developed data acquisition system of this experiment. COMPASS is a high-energy, fixed-target particle physics experiment located at the SPS of the CERN laboratory in Geneva, Switzerland. The hardware of the data acquisition system has been upgraded to use FPGA cards that are responsible for data multiplexing and event building. The software counterpart of the system includes several processes deployed in a heterogeneous network environment. Two processes, namely the Message Logger and the Message Browser, take care of monitoring. These tools handle messages generated by nodes in the system. While the Message Logger collects and saves messages to the database, the Message Browser serves as a graphical interface over the database containing these messages. For better performance, certain database optimizations have been used. Lastly, results of performance tests are presented.
Remote online monitoring and measuring system for civil engineering structures
NASA Astrophysics Data System (ADS)
Kujawińska, Malgorzata; Sitnik, Robert; Dymny, Grzegorz; Karaszewski, Maciej; Michoński, Kuba; Krzesłowski, Jakub; Mularczyk, Krzysztof; Bolewicki, Paweł
2009-06-01
In this paper, a distributed intelligent system for on-line measurement, remote monitoring, and data archiving of civil engineering structures is presented. The system consists of a set of optical, full-field displacement sensors connected to a controlling server. The server conducts measurements according to a list of scheduled tasks and stores the primary data or initial results in a remote centralized database. Simultaneously, the server performs checks, ordered by the operator, which may in turn result in an alert or a specific action. The structure of the whole system is analyzed, along with a discussion of possible fields of application and the ways to provide adequate security during data transport. Finally, a working implementation consisting of fringe projection, geometrical moiré, digital image correlation and grating interferometry sensors and an Oracle XE database is presented. Results from the database, used for on-line monitoring of a threshold strain value for an exemplary area of interest on the engineering structure, are presented and discussed.
Smartphone home monitoring of ECG
NASA Astrophysics Data System (ADS)
Szu, Harold; Hsu, Charles; Moon, Gyu; Landa, Joseph; Nakajima, Hiroshi; Hata, Yutaka
2012-06-01
Ambulatory Holter electrocardiography (ECG) monitoring systems that record and transmit heartbeat data over the Internet are already commercially available. However, such systems enjoy only qualified confidence and thus limited market penetration. Our system targets aging global villagers with increasing biomedical wellness (BMW) home-care needs, rather than hospital-related biomedical illness (BMI). It was designed within SWaP-C (Size, Weight, Power, and Cost) constraints using three innovative modules: (i) a Smart Electrode (low-power mixed-signal electronics embedded with modern compressive sensing and nanotechnology to improve the electrodes' contact impedance); (ii) a Learnable Database (adaptive wavelet-transform QRST feature extraction and an SQL relational database allowing home-care monitoring with retrievable aided target recognition); (iii) a Smartphone (touch-screen interface, powerful computation capability, caretaker reporting with GPI, ID, and a patient panic button for a programmable emergency procedure). It can provide a supplementary home screening system for post- or pre-diagnosis care at home, with a built-in database searchable by the time, place, and degree of urgency of the events, using in-situ screening.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-20
... information about the request is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the Agency to monitor progress on the activities attendant to...
NASA Astrophysics Data System (ADS)
Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro
2018-02-01
Management of water resources based on a Geographic Information System (GIS) can provide substantial benefits for water availability planning. Monitoring the potential water level is needed in the development sector, agriculture, energy and other fields. In this research, a web-based water resource information system is developed using a real-time Geographic Information System concept to monitor the potential water level of an area by applying a rule-based system method. The GIS consists of hardware, software, and a database. Based on the web-based GIS architecture, this study uses a set of networked computers running the Apache web server and the PHP programming language with a MySQL database. An ultrasonic wireless sensor system is used for water level data input; it also provides time and geographic location information. The GIS maps the five sensor locations. The sensor data are processed through a rule-based system to determine the potential water level of the area. Water level monitoring results can be displayed on thematic maps by overlaying more than one layer, and information can also be generated as tables from the database, as well as graphs based on the timing of events and the water level values.
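The rule-based classification step can be sketched as a simple threshold function. The thresholds and category names below are assumptions for illustration and are not the rules used in the paper.

    # Illustrative rule-based classification of a measured water level.
    def classify_water_level(level_cm):
        if level_cm < 50:
            return "low potential"
        elif level_cm < 150:
            return "medium potential"
        else:
            return "high potential"

    for reading in (35, 120, 210):   # sample ultrasonic sensor readings in cm
        print(reading, "cm ->", classify_water_level(reading))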
Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT
NASA Astrophysics Data System (ADS)
Livschitz, Yakov; Munro, Rosemary; Lang, Rüdiger; Fiedler, Lars; Dyer, Richard; Eisinger, Michael
2010-05-01
The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms like EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system which is flexible and robust enough to identify and react to anomalies which may be previously unknown to the system, as well as to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for monitoring of GOME-2 and IASI level 1 data allows near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is effectively used for day-to-day monitoring, long-term reporting, and instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms or numerical weather prediction within the same database, allows effective cross-comparison and searching for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, data losses and other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.
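The kind of automatic deviation alarm described above can be sketched as a comparison against a recent baseline. The window size, the 3-sigma rule, and the sample values below are assumptions, not the EUMETSAT configuration.

    import statistics

    # Flag a parameter value that deviates strongly from its recent baseline.
    history = [100.2, 100.1, 100.3, 100.0, 100.2, 100.1, 100.3, 100.2]
    new_value = 104.9

    mean = statistics.mean(history)
    sigma = statistics.stdev(history)
    if abs(new_value - mean) > 3 * sigma:    # assumed 3-sigma alarm rule
        print(f"ALARM: value {new_value} deviates from baseline {mean:.1f} +/- {sigma:.2f}")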
Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT
NASA Astrophysics Data System (ADS)
Livschitz, Y.; Munro, R.; Lang, R.; Fiedler, L.; Dyer, R.; Eisinger, M.
2009-12-01
The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms like EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system which is flexible and robust enough to identify and react to anomalies which may be previously unknown to the system, as well as to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for monitoring of GOME-2 and IASI level 1 data allows near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is effectively used for day-to-day monitoring, long-term reporting, and instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms or numerical weather prediction within the same database, allows effective cross-comparison and searching for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, data losses and other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.
How Intrusion Detection Can Improve Software Decoy Applications
2003-03-01
[Fragmentary text extract from the thesis: it discusses a layered, defense-in-depth approach, shows Snort output-database configuration examples (e.g. "output database: alert, postgresql, user=snort dbname=snort"), and cites James P. Anderson, Computer Security Threat Monitoring and Surveillance, James P. Anderson Co., Fort Washington, PA, April 1980, http://csrc.nist.gov/publications/history/ande80.]
50 CFR 660.17 - Catch monitors and catch monitor providers.
Code of Federal Regulations, 2011 CFR
2011-10-01
... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...
50 CFR 660.17 - Catch monitors and catch monitor service providers.
Code of Federal Regulations, 2013 CFR
2013-10-01
... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...
50 CFR 660.17 - Catch monitors and catch monitor service providers.
Code of Federal Regulations, 2012 CFR
2012-10-01
... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...
50 CFR 660.17 - Catch monitors and catch monitor service providers.
Code of Federal Regulations, 2014 CFR
2014-10-01
... work competently with standard database software and computer hardware. (v) Have a current and valid... candidate's academic transcripts and resume; (4) A statement signed by the candidate under penalty of...
Distributed On-line Monitoring System Based on Modem and Public Phone Net
NASA Astrophysics Data System (ADS)
Chen, Dandan; Zhang, Qiushi; Li, Guiru
In order to solve the monitoring problem of urban sewage disposal, a distributed on-line monitoring system is proposed. By introducing dial-up communication technology based on modems, the serial communication program can rationally solve the information transmission problem between the master station and the slave stations. The serial communication program is realized with the MSComm control of C++ Builder 6.0. The software includes a real-time data operation part and a history data handling part, using Microsoft SQL Server 2000 as the database and C++ Builder 6.0 for the user interface. The monitoring center displays a user interface with alarm information for out-of-limit data and real-time curves. Practical application shows that the system successfully accomplishes real-time data acquisition from the data gathering stations and stores the data in the terminal database.
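The original acquisition step uses the MSComm control in C++ Builder; purely as a modern analog, the serial read can be sketched with pyserial. The port name, baud rate, and line format are assumptions.

    import serial  # pyserial

    # Analog of the serial data-acquisition step; all settings are assumptions.
    with serial.Serial("COM3", 9600, timeout=2) as port:
        line = port.readline().decode("ascii", errors="replace").strip()
        print("received from slave station:", line)
        # e.g. parse a comma-separated record before inserting it into the database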
Eric T. Linder; David A. Buehler
2005-01-01
In 1996, Region 8 of the U. S. Forest Service implemented a program to monitor landbirds on southeastern U.S. national forests. The goal was to develop a monitoring system that could document population trends and bird-habitat relationships. Using power analysis, we examined the ability of the monitoring program to detect population trends (3 percent annual change) at...
NASA Astrophysics Data System (ADS)
Marzocchi, W.
2011-12-01
Eruption forecasting estimates the probability of eruption in a specific time-space-magnitude window. The use of probabilities to track the evolution of a phase of unrest is unavoidable for two main reasons: first, eruptions are intrinsically unpredictable in a deterministic sense, and, second, probabilities represent a quantitative tool that can be rationally used by decision-makers (as is usually done in many other fields). The primary information for the probability assessment during a phase of unrest comes from monitoring data of different quantities, such as seismic activity, ground deformation, geochemical signatures, and so on. Nevertheless, probabilistic forecasts based on monitoring data present two main difficulties. First, many high-risk volcanoes do not have databases of pre-eruptive and unrest monitoring, making a probabilistic assessment based on the frequency of past observations impossible. The ongoing project WOVOdat (led by Christopher Newhall) is trying to tackle this limitation by creating a sort of worldwide epidemiological database that may compensate for the lack of pre-eruptive and unrest monitoring databases for a specific volcano using observations of 'analog' volcanoes. Second, the quantity and quality of monitoring data are rapidly increasing at many volcanoes, creating strongly inhomogeneous datasets. In these cases, classical statistical analysis can be performed on high-quality monitoring observations only for (usually too) short periods of time, or alternatively using only a few specific monitoring data streams that are available for longer times (such as the number of earthquakes), therefore neglecting much of the information carried by the most recent kinds of monitoring. Here, we explore a possible strategy to cope with these limitations. In particular, we present a Bayesian strategy that merges different kinds of information. In this approach, all relevant monitoring observations are embedded into a probabilistic scheme through expert opinion, conceptual models, and, possibly, real past data. After discussing the scientific and philosophical aspects of such an approach, we present some applications for Campi Flegrei and Vesuvius.
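The core of such a Bayesian strategy is the update of an eruption probability by a monitoring observation. The minimal sketch below is illustrative only; the prior and the likelihoods (which in practice would come from expert opinion, conceptual models, or analog volcanoes) are hypothetical.

    # Minimal Bayesian update of an eruption probability given one observation.
    prior = 0.05                      # P(eruption within the window), assumed
    p_anomaly_given_eruption = 0.80   # P(seismic anomaly | eruption follows), assumed
    p_anomaly_given_no_eruption = 0.10

    posterior = (p_anomaly_given_eruption * prior) / (
        p_anomaly_given_eruption * prior
        + p_anomaly_given_no_eruption * (1 - prior)
    )
    print(f"posterior eruption probability = {posterior:.2f}")  # ~0.30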
The Danish Inguinal Hernia database.
Friis-Andersen, Hans; Bisgaard, Thue
2016-01-01
To monitor and improve nation-wide surgical outcome after groin hernia repair based on scientific evidence-based surgical strategies for the national and international surgical community. Patients ≥18 years operated for groin hernia. Type and size of hernia, primary or recurrent, type of surgical repair procedure, mesh and mesh fixation methods. According to the Danish National Health Act, surgeons are obliged to register all hernia repairs immediately after surgery (3 minute registration time). All institutions have continuous access to their own data stratified on individual surgeons. Registrations are based on a closed, protected Internet system requiring personal codes also identifying the operating institution. A national steering committee consisting of 13 voluntary and dedicated surgeons, 11 of whom are unpaid, handles the medical management of the database. The Danish Inguinal Hernia Database comprises intraoperative data from >130,000 repairs (May 2015). A total of 49 peer-reviewed national and international publications have been published from the database (June 2015). The Danish Inguinal Hernia Database is fully active monitoring surgical quality and contributes to the national and international surgical society to improve outcome after groin hernia repair.
Experience with ATLAS MySQL PanDA database service
NASA Astrophysics Data System (ADS)
Smirnov, Y.; Wlodek, T.; De, K.; Hover, J.; Ozturk, N.; Smith, J.; Wenaus, T.; Yu, D.
2010-04-01
The PanDA distributed production and analysis system has been in production use for ATLAS data processing and analysis since late 2005 in the US, and globally throughout ATLAS since early 2008. Its core architecture is based on a set of stateless web services served by Apache and backed by a suite of MySQL databases that are the repository for all PanDA information: active and archival job queues, dataset and file catalogs, site configuration information, monitoring information, system control parameters, and so on. This database system is one of the most critical components of PanDA, and has successfully delivered the functional and scaling performance required by PanDA, currently operating at a scale of half a million jobs per week, with much growth still to come. In this paper we describe the design and implementation of the PanDA database system, its architecture of MySQL servers deployed at BNL and CERN, backup strategy and monitoring tools. The system has been developed, thoroughly tested, and brought to production to provide highly reliable, scalable, flexible and available database services for ATLAS Monte Carlo production, reconstruction and physics analysis.
Survey of Machine Learning Methods for Database Security
NASA Astrophysics Data System (ADS)
Kamra, Ashish; Ber, Elisa
Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.
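As a toy illustration of the anomaly-detection idea surveyed above (not any specific product's or prototype's algorithm), the sketch below builds a simple per-role profile of which commands and tables each database role normally uses, from hypothetical audit records, and flags queries that fall outside that profile.

```python
from collections import defaultdict

# Hypothetical audit-log records: (role, SQL command, table accessed).
training_log = [
    ("clerk", "SELECT", "orders"),
    ("clerk", "SELECT", "customers"),
    ("clerk", "UPDATE", "orders"),
    ("analyst", "SELECT", "sales_summary"),
    ("analyst", "SELECT", "orders"),
]

# Learn a profile: the set of (command, table) pairs seen for each role.
profile = defaultdict(set)
for role, command, table in training_log:
    profile[role].add((command, table))

def is_anomalous(role, command, table):
    """Flag a query whose (command, table) pair was never seen for this role."""
    return (command, table) not in profile[role]

# New activity to screen; the DELETE on customers by a clerk is flagged.
for event in [("clerk", "SELECT", "orders"), ("clerk", "DELETE", "customers")]:
    print(event, "->", "ANOMALY" if is_anomalous(*event) else "ok")
```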
The Eruption Forecasting Information System: Volcanic Eruption Forecasting Using Databases
NASA Astrophysics Data System (ADS)
Ogburn, S. E.; Harpel, C. J.; Pesicek, J. D.; Wellik, J.
2016-12-01
Forecasting eruptions, including their onset, size, duration, location, and impacts, is vital for hazard assessment and risk mitigation. The Eruption Forecasting Information System (EFIS) project is a new initiative of the US Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) and will advance VDAP's ability to forecast the outcome of volcanic unrest. The project supports probability estimation for eruption forecasting by creating databases useful for pattern recognition, identifying monitoring-data thresholds beyond which eruptive probabilities increase, and answering common forecasting questions. A major component of the project is a global relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest. This module allows us to query eruption chronologies, monitoring data, descriptive information, operational data, and eruptive phases alongside other global databases, such as WOVOdat and the Global Volcanism Program. The EFIS database is in the early stages of development and population; thus, this contribution also is a request for feedback from the community. Preliminary data are already benefitting several research areas. For example, VDAP provided a forecast of the likely remaining eruption duration for Sinabung volcano, Indonesia, using global data taken from similar volcanoes in the DomeHaz database module, in combination with local monitoring time-series data. In addition, EFIS seismologists used a beta-statistic test and empirically derived thresholds to identify distal volcano-tectonic earthquake anomalies preceding Alaska volcanic eruptions during 1990-2015 to retrospectively evaluate Alaska Volcano Observatory eruption precursors. This has identified important considerations for selecting analog volcanoes for global data analysis, such as differences between closed- and open-system volcanoes.
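To make the analog-duration idea concrete, here is a small hedged sketch (invented schema and values, not the actual EFIS or DomeHaz tables) showing how eruption durations of analog volcanoes might be stored in a relational module and queried to support a remaining-duration forecast.

```python
import sqlite3

# In-memory stand-in for one EFIS-like module; schema and rows are hypothetical.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""CREATE TABLE dome_eruptions (
                   volcano TEXT, start_year INTEGER, duration_days INTEGER)""")
cur.executemany("INSERT INTO dome_eruptions VALUES (?, ?, ?)", [
    ("Analog A", 1991, 540),
    ("Analog B", 2004, 1200),
    ("Analog C", 2010, 300),
    ("Analog D", 1980, 2000),
])

# Fraction of analog dome-building eruptions that lasted longer than the
# current eruption has so far (a crude empirical survival probability).
elapsed_days = 400
cur.execute("SELECT AVG(duration_days > ?) FROM dome_eruptions", (elapsed_days,))
print("P(duration > elapsed | analogs):", cur.fetchone()[0])
con.close()
```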
Village Green Project: Web-accessible Database
The purpose of this web-accessible database is for the public to be able to view instantaneous readings from a solar-powered air monitoring station located in a public location (prototype pilot test is outside of a library in Durham County, NC). The data are wirelessly transmitte...
Deng, Chen-Hui; Zhang, Guan-Min; Bi, Shan-Shan; Zhou, Tian-Yan; Lu, Wei
2011-07-01
This study aims to develop a therapeutic drug monitoring (TDM) network server of tacrolimus for Chinese renal transplant patients, which can help doctors manage patients' information and provide three levels of prediction. The database management system MySQL was employed to build and manage the database of patients' and doctors' information, and hypertext mark-up language (HTML) and Java Server Pages (JSP) technology were employed to construct the network server for database management. Based on the population pharmacokinetic model of tacrolimus for Chinese renal transplant patients, the above programming languages were used to construct the population prediction and subpopulation prediction modules. Based on the Bayesian principle and maximization of the posterior probability function, an objective function was established and minimized by an optimization algorithm to estimate each patient's individual pharmacokinetic parameters. The network server is shown to have the basic functions for database management and three levels of prediction to aid doctors in optimizing the tacrolimus regimen for Chinese renal transplant patients.
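The Bayesian individual-parameter estimation described above can be sketched as follows. This is a deliberately simplified one-compartment model with invented population values and observations, not the published tacrolimus population model, intended only to show how a posterior objective function is minimized to obtain individual estimates.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical population priors (log-normal): clearance CL (L/h) and volume V (L).
POP_CL, POP_V = 20.0, 200.0        # population typical values (invented)
OMEGA_CL, OMEGA_V = 0.30, 0.25     # between-subject SD of log-parameters (invented)
SIGMA = 1.0                        # residual SD of concentration (invented)

dose = 5000.0                      # single IV bolus dose (units chosen so C matches c_obs)
t_obs = np.array([2.0, 6.0, 12.0])               # sampling times (h)
c_obs = np.array([20.5, 14.8, 8.9])              # hypothetical TDM concentrations

def predict(cl, v, t):
    """One-compartment IV bolus: C(t) = Dose/V * exp(-CL/V * t)."""
    return dose / v * np.exp(-cl / v * t)

def neg_log_posterior(log_params):
    log_cl, log_v = log_params
    cl, v = np.exp(log_cl), np.exp(log_v)
    # Residual (likelihood) term plus prior penalties on the log-parameters.
    resid = np.sum((c_obs - predict(cl, v, t_obs)) ** 2) / (2 * SIGMA ** 2)
    prior = ((log_cl - np.log(POP_CL)) ** 2) / (2 * OMEGA_CL ** 2) \
          + ((log_v - np.log(POP_V)) ** 2) / (2 * OMEGA_V ** 2)
    return resid + prior

res = minimize(neg_log_posterior, x0=[np.log(POP_CL), np.log(POP_V)],
               method="Nelder-Mead")
cl_hat, v_hat = np.exp(res.x)
print(f"individual MAP estimates: CL = {cl_hat:.1f} L/h, V = {v_hat:.0f} L")
```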
Pesticides in Drinking Water – The Brazilian Monitoring Program
Barbosa, Auria M. C.; Solano, Marize de L. M.; Umbuzeiro, Gisela de A.
2015-01-01
Brazil is the world's largest pesticide consumer; therefore, it is important to monitor the levels of these chemicals in the water used by the population. The Ministry of Health coordinates the National Drinking Water Quality Surveillance Program (Vigiagua) with the objective of monitoring water quality. Water quality data are introduced into the program by state and municipal health secretariats using a database called Sisagua (Information System of Water Quality Monitoring). The Brazilian drinking water norm (Ordinance 2914/2011 from the Ministry of Health) includes 27 pesticide active ingredients that need to be monitored every 6 months. This number represents <10% of current active ingredients approved for use in the country. In this work, we analyzed data compiled in the Sisagua database in a qualitative and quantitative way. From 2007 to 2010, approximately 169,000 pesticide analytical results were prepared and evaluated, although approximately 980,000 would be expected if all municipalities registered their analyses. This shows that only 9–17% of municipalities registered their data in Sisagua. In this dataset, we observed non-compliance with the minimum sampling number required by the norm, lack of information about detection and quantification limits, insufficient standardization in the expression of results, and several inconsistencies, leading to low credibility of the pesticide data provided by the system. Therefore, it is not possible to evaluate exposure of the total Brazilian population to pesticides via drinking water using the current national database system Sisagua. Lessons learned from this study could provide insights into the monitoring and reporting of pesticide residues in drinking water worldwide. PMID:26581345
Distributed cyberinfrastructure tools for automated data processing of structural monitoring data
NASA Astrophysics Data System (ADS)
Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul
2012-04-01
The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data that can be created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed on a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental conditions (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed on the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties); sample data interrogation clients include those for the detection of faulty sensors and automated modal parameter extraction.
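As an illustration of one of the data interrogation clients mentioned above (a hypothetical simplification, not the project's actual client code), the sketch below screens acceleration channels, as they might be pulled from the monitoring database through a client API, and flags sensors whose recent output has flatlined or railed.

```python
import numpy as np

# Hypothetical one-minute acceleration windows keyed by wireless sensor ID,
# as they might be returned from the monitoring database through a client API.
windows = {
    "S01": np.random.normal(0.0, 0.02, 6000),   # healthy channel
    "S02": np.zeros(6000),                       # flatlined channel
    "S03": np.full(6000, 9.81),                  # channel stuck near full scale
}

def faulty(x, min_std=1e-4, rail=9.0):
    """Flag channels with near-zero variance or values stuck near full scale."""
    return x.std() < min_std or np.abs(x).mean() > rail

for sensor_id, x in windows.items():
    status = "FAULTY" if faulty(x) else "ok"
    print(f"{sensor_id}: std={x.std():.4f}  {status}")
```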
Navigation integrity monitoring and obstacle detection for enhanced-vision systems
NASA Astrophysics Data System (ADS)
Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter
2001-08-01
Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g. provided by differential GPS (DGPS). The reliability of the synthetic vision highly depends on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm, as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g. other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and, consequently, in a validation of the integrity of the database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrently with this investigation, a radar-image-based navigation is performed, using neither precision navigation nor detailed database information, to determine the aircraft's position relative to the runway. The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.
Security Controls in the Stockpoint Logistics Integrated Communications Environment (SPLICE).
1985-03-01
call programs as authorized after checks by the Terminal Management Subsystem on SAS databases. SAS overlays the TANDEM GUARDIAN operating system to... Security Access Profile database (SAP) and a query capability generating various security reports. SAS operates with the System Monitor (SMON) subsystem... system to DDN and other components. The first SAS component to be reviewed is the SAP database. SAP is organized into two types of files. Relational
Using Landsat imagery to detect, monitor, and project net landscape change
Reker, Ryan R.; Sohl, Terry L.; Gallant, Alisa L.
2015-01-01
Detailed landscape information is a necessary component of bird habitat conservation planning. The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center has been providing information on the Earth’s surface for over 40 years via the continuous series of Landsat satellites. In addition to operating, processing, and disseminating satellite images, EROS is the home to nationwide and global landscape mapping, monitoring, and projection products, including: National Land Cover Database (NLCD) – the definitive land cover dataset for the U.S., with updates occurring at five-year intervals; Global Land Cover Monitoring – producing 30 m resolution global land cover; LANDFIRE (Landscape Fire and Resource Management Planning Tools) – EROS is a partner in this joint program between the U.S. Department of Agriculture and Department of the Interior that produces consistent, comprehensive, geospatial data and databases that describe vegetation, wildland fuel, and fire regimes across the U.S.; Land Cover Trends – a landscape monitoring and assessment effort to understand the rates, trends, causes, and consequences of contemporary U.S. land use and land cover change; and Land Use and Land Cover (LULC) Modeling – a project extending contemporary databases of landscape change forward and backward in time through moderate-resolution land cover projections.
An "EAR" on environmental surveillance and monitoring: A ...
Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on chemical concentration alone, it can be difficult to identify which compounds may be of toxicological concern for prioritization for further monitoring or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high throughput screening data like the ToxCast™ database, which contains data for over 9,000 compounds screened through up to 1,100 assays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin along with the ToxCast™ effects database were used to calculate exposure-activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast™ database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Biological pathways were then linked to adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts. Anthropogenic contaminants are frequently reported in environm
Web application for detailed real-time database transaction monitoring for CMS condition data
NASA Astrophysics Data System (ADS)
de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio
2012-12-01
In the upcoming LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the large amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases, allocated on several servers, both inside and outside the CERN network. In this scenario, the task of monitoring different databases is a crucial database administration issue, since different information may be required depending on different users' tasks such as data transfer, inspection, planning and security issues. We present here a web application based on a Python web framework and Python modules for data mining purposes. To customize the GUI we record traces of user interactions that are used to build use-case models. In addition, the application detects errors in database transactions (for example, it identifies mistakes made by users, application failures, unexpected network shutdowns or Structured Query Language (SQL) statement errors) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community, and to keep up with new developments in Web client tools, our application was further developed and new features were deployed.
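A minimal sketch of the transaction error-detection idea (using SQLite as a local stand-in for the ORACLE back end, and invented messages rather than the CMS application's real warnings) might wrap each statement and classify the exception before reporting it to the monitoring layer.

```python
import sqlite3

def run_monitored(con, sql, params=()):
    """Execute one statement and return a (status, message) pair for the monitor."""
    try:
        con.execute(sql, params)
        con.commit()
        return "OK", ""
    except sqlite3.OperationalError as exc:   # e.g. malformed SQL, missing table
        return "SQL_ERROR", str(exc)
    except sqlite3.IntegrityError as exc:     # e.g. constraint violation by a user
        return "USER_MISTAKE", str(exc)
    except sqlite3.DatabaseError as exc:      # anything else (I/O, corruption, ...)
        return "DB_FAILURE", str(exc)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE conditions (tag TEXT PRIMARY KEY, payload BLOB)")

print(run_monitored(con, "INSERT INTO conditions VALUES (?, ?)", ("run1", b"...")))
print(run_monitored(con, "INSERT INTO conditions VALUES (?, ?)", ("run1", b"...")))  # duplicate key
print(run_monitored(con, "SELEC * FROM conditions"))                                 # SQL typo
```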
NASA Technical Reports Server (NTRS)
Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt
2002-01-01
Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g. terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that has limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e. radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.
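As a hedged sketch of the feature-similarity test described above (not the paper's actual algorithm or threshold), one can compare a terrain elevation profile sensed along track with the profile predicted from the stored database and declare an integrity fault when the normalized correlation drops below a preset threshold chosen for an acceptable false-alarm rate.

```python
import numpy as np

def normalized_correlation(a, b):
    """Zero-mean normalized cross-correlation between two equal-length profiles."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)                      # along-track distance (km)
stored = 300.0 + 40.0 * np.sin(0.8 * x)              # database terrain profile (m)

sensed_ok  = stored + rng.normal(0.0, 3.0, x.size)        # consistent measurement
sensed_bad = stored[::-1] + rng.normal(0.0, 3.0, x.size)  # gross database/navigation fault

THRESHOLD = 0.9   # hypothetical value; in practice set from a false-alarm requirement
for name, sensed in [("consistent", sensed_ok), ("faulted", sensed_bad)]:
    rho = normalized_correlation(sensed, stored)
    verdict = "integrity OK" if rho >= THRESHOLD else "integrity ALERT"
    print(f"{name}: rho = {rho:.3f} -> {verdict}")
```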
NASA Astrophysics Data System (ADS)
Wiacek, Daniel; Kudla, Ignacy M.; Pozniak, Krzysztof T.; Bunkowski, Karol
2005-02-01
The main task of the RPC (Resistive Plate Chamber) Muon Trigger monitoring system designed for the CMS (Compact Muon Solenoid) experiment (at the LHC at CERN, Geneva) is the visualization of data describing the structure of the electronic trigger system (e.g. geometry and imagery) and the way it processes data, as well as the automatic generation of files with VHDL source code used for programming the FPGA matrices. In the near future, the system will enable the analysis of the condition, operation and efficiency of individual Muon Trigger elements, the registration of information about some Muon Trigger devices, and the presentation of previously obtained results in an interactive presentation layer. A broad variety of different database and programming concepts for the design of the Muon Trigger monitoring system is presented in this article. The structure and architecture of the system and its principle of operation are described. One of the ideas behind this system is the use of object-oriented programming and design techniques to describe real electronics systems through abstract object models stored in a database and to implement these models in the Java language.
Plan for Developing a Materials Performance Database for the Texas Department of Transportation
DOT National Transportation Integrated Search
1999-09-01
The materials used within the Texas Department of Transportation (TxDOT) are undergoing a period of change. The purpose of this report is to develop the information necessary to develop (for TxDOT) a method or a database for monitoring the performanc...
Online Sources of Competitive Intelligence.
ERIC Educational Resources Information Center
Wagers, Robert
1986-01-01
Presents an approach to using online sources of information for competitor intelligence (i.e., monitoring industry and tracking activities of competitors); identifies principal sources; and suggests some ways of making use of online databases. Types and sources of information and sources and database charts are appended. Eight references are…
Online Islamic Organizations and Measuring Web Effectiveness
2004-12-01
Internet Research 13 (2003): 17-26. Retrieved from ProQuest online database on 15 May 2004. Lee, Jae-Kwan. “A model for monitoring public sector...Web site strategy.” Internet Research: Electronic Networking Applications and Policy 13 (2003): 259-266. Retrieved from Emerald online database on
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
The aim was to implement an online statistical analysis function in the information system for air pollution and health impact monitoring and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software using SQL and visual tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online with interactive connections to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the statistical analysis function online and can provide real-time analysis results to its users.
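A small hedged sketch of this kind of online analysis (invented data, table and variable names, not the actual system's schema) could pull exposure and health records with SQL and fit a multivariate regression whose summary is returned to the user.

```python
import numpy as np
import sqlite3

# Hypothetical daily records: PM2.5, NO2 (exposure) and clinic visits (health impact).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE daily (pm25 REAL, no2 REAL, visits REAL)")
rng = np.random.default_rng(1)
pm25 = rng.uniform(20, 150, 60)
no2 = rng.uniform(10, 80, 60)
visits = 5 + 0.04 * pm25 + 0.02 * no2 + rng.normal(0, 0.5, 60)
con.executemany("INSERT INTO daily VALUES (?, ?, ?)",
                zip(pm25.tolist(), no2.tolist(), visits.tolist()))

# "Online" step 1: descriptive summary table straight from SQL.
for row in con.execute("SELECT COUNT(*), AVG(pm25), AVG(no2), AVG(visits) FROM daily"):
    print("n, mean PM2.5, mean NO2, mean visits:", [round(v, 2) for v in row])

# "Online" step 2: multivariate linear regression visits ~ pm25 + no2.
data = np.array(con.execute("SELECT pm25, no2, visits FROM daily").fetchall())
X = np.column_stack([np.ones(len(data)), data[:, 0], data[:, 1]])
beta, *_ = np.linalg.lstsq(X, data[:, 2], rcond=None)
print("intercept, PM2.5 coeff, NO2 coeff:", np.round(beta, 3))
con.close()
```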
Experience with Multi-Tier Grid MySQL Database Service Resiliency at BNL
NASA Astrophysics Data System (ADS)
Wlodek, Tomasz; Ernst, Michael; Hover, John; Katramatos, Dimitrios; Packard, Jay; Smirnov, Yuri; Yu, Dantong
2011-12-01
We describe the use of F5's BIG-IP smart switch technology (3600 Series and Local Traffic Manager v9.0) to provide load balancing and automatic fail-over to multiple Grid services (GUMS, VOMS) and their associated back-end MySQL databases. This resiliency is introduced in front of the external application servers and also for the back-end database systems, which is what makes it "multi-tier". The combination of solutions chosen to ensure high availability of the services, in particular the database replication and fail-over mechanism, are discussed in detail. The paper explains the design and configuration of the overall system, including virtual servers, machine pools, and health monitors (which govern routing), as well as the master-slave database scheme and fail-over policies and procedures. Pre-deployment planning and stress testing will be outlined. Integration of the systems with our Nagios-based facility monitoring and alerting is also described. And application characteristics of GUMS and VOMS which enable effective clustering will be explained. We then summarize our practical experiences and real-world scenarios resulting from operating a major US Grid center, and assess the applicability of our approach to other Grid services in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the Agency to monitor progress on the activities attendant to scheduling and holding a... collection of information on respondents, including through the use of automated collection techniques, when...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-18
... request is entered into the appropriate tracking databases. Use of the information in the Agency's tracking databases enables the appropriate Agency official to monitor progress on the evaluation of the... collection of information on respondents, including through the use of automated collection techniques, when...
Data, Data Everywhere but Not a Byte to Read: Managing Monitoring Information.
ERIC Educational Resources Information Center
Stafford, Susan G.
1993-01-01
Describes the Forest Science Data Bank that contains 2,400 data sets from over 350 existing ecological studies. Database features described include involvement of the scientific community; database documentation; data quality assurance; security; data access and retrieval; and data import/export flexibility. Appendices present the Quantitative…
CLOUDCLOUD : general-purpose instrument monitoring and data managing software
NASA Astrophysics Data System (ADS)
Dias, António; Amorim, António; Tomé, António
2016-04-01
An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and meta data is stored and accessed via a specially designed database architecture and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data taking periods or a single database can be used if for instance an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running python-based parsing agent that communicates with a main server application guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
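A heavily simplified sketch of the locally running parsing-agent idea (generic file format, database and table names invented here, not CLOUD's actual protocol) is shown below: it tails an instrument output file, parses each new line, and stores timestamped readings in a database without touching the instrument's own DAQ.

```python
import sqlite3
import time

DB_PATH = "monitoring.db"           # hypothetical local buffer database
INSTRUMENT_FILE = "instrument.log"  # hypothetical instrument output, one reading per line

def parse_line(line):
    """Expected (invented) format: '<unix_time> <channel> <value>'."""
    ts, channel, value = line.split()
    return float(ts), channel, float(value)

def agent(poll_seconds=0.001, max_polls=3):
    con = sqlite3.connect(DB_PATH)
    con.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, channel TEXT, value REAL)")
    position = 0
    for _ in range(max_polls):                  # a real agent would loop indefinitely
        with open(INSTRUMENT_FILE) as f:
            f.seek(position)
            new_lines = f.readlines()
            position = f.tell()
        rows = [parse_line(l) for l in new_lines if l.strip()]
        if rows:
            con.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
            con.commit()
        time.sleep(poll_seconds)
    con.close()

if __name__ == "__main__":
    with open(INSTRUMENT_FILE, "w") as f:       # fake an instrument output for the demo
        f.write("1700000000.0 temperature 21.4\n1700000001.0 temperature 21.5\n")
    agent()
```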
DOT National Transportation Integrated Search
2006-05-01
This research has provided NCDOT with (1) scientific observations to validate the pollutant removal : performance of selected structural BMPs, (2) a database management option for BMP monitoring and : non-monitoring sites, (3) pollution prevention pl...
Overview of national bird population monitoring programs and databases
Gregory S. Butcher; Bruce Peterjohn; C. John Ralph
1993-01-01
A number of programs have been set up to monitor populations of nongame migratory birds. We review these programs and their purposes and provide information on obtaining data or results from these programs. In addition, we review recommendations for improving these programs.
An intelligent remote monitoring system for artificial heart.
Choi, Jaesoon; Park, Jun W; Chung, Jinhan; Min, Byoung G
2005-12-01
A web-based database system for intelligent remote monitoring of an artificial heart has been developed. It is important for patients with an artificial heart implant to be discharged from the hospital after an appropriate stabilization period for better recovery and quality of life. Reliable continuous remote monitoring systems for these patients with life support devices are gaining practical meaning. The authors have developed a remote monitoring system for this purpose that consists of a portable/desktop monitoring terminal, a database for continuous recording of patient and device status, a web-based data access system with which clinicians can access real-time patient and device status data and past history data, and an intelligent diagnosis algorithm module that noninvasively estimates blood pump output and makes automatic classification of the device status. The system has been tested with data generation emulators installed on remote sites for simulation study, and in two cases of animal experiments conducted at remote facilities. The system showed acceptable functionality and reliability. The intelligence algorithm also showed acceptable practicality in an application to animal experiment data.
The inland water macro-invertebrate occurrences in Flanders, Belgium.
Vannevel, Rudy; Brosens, Dimitri; Cooman, Ward De; Gabriels, Wim; Lavens, Frank; Mertens, Joost; Vervaeke, Bart
2018-01-01
The Flanders Environment Agency (VMM) has been performing biological water quality assessments on inland waters in Flanders (Belgium) since 1989 and sediment quality assessments since 2000. The water quality monitoring network is a combined physico-chemical and biological network, the biological component focusing on macro-invertebrates. The sediment monitoring programme produces biological data to assess the sediment quality. Both monitoring programmes aim to provide index values, applying a similar conceptual methodology based on the presence of macro-invertebrates. The biological data obtained from both monitoring networks are consolidated in the VMM macro-invertebrates database and include identifications at family and genus level of the freshwater phyla Coelenterata, Platyhelminthes, Annelida, Mollusca, and Arthropoda. This paper discusses the content of this database, and the dataset published thereof: 282,309 records of 210 observed taxa from 4,140 monitoring sites located on 657 different water bodies, collected during 22,663 events. This paper provides some background information on the methodology, temporal and spatial coverage, and taxonomy, and describes the content of the dataset. The data are distributed as open data under the Creative Commons CC-BY license.
Hürlimann, Eveline; Schur, Nadine; Boutsika, Konstantina; Stensgaard, Anna-Sofie; Laserna de Himpsl, Maiti; Ziegelbauer, Kathrin; Laizer, Nassor; Camenzind, Lukas; Di Pasquale, Aurelio; Ekpo, Uwem F; Simoonga, Christopher; Mushinge, Gabriel; Saarnak, Christopher F L; Utzinger, Jürg; Kristensen, Thomas K; Vounatsou, Penelope
2011-12-01
After many years of general neglect, interest has grown and efforts came under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key feature to target control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open-access to the available survey data that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and 'grey literature'), contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries available under http://www.gntd.org. Currently, the database is expanded to a global repository, including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time. With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment and running of a global NTD database is feasible and should be expanded without delay.
[Benefits of large healthcare databases for drug risk research].
Garbe, Edeltraut; Pigeot, Iris
2015-08-01
Large electronic healthcare databases have become an important worldwide data resource for drug safety research after approval. Signal generation methods and drug safety studies based on these data facilitate the prospective monitoring of drug safety after approval, as has recently been required by EU law and the German Medicines Act. Despite its large size, a single healthcare database may still include too few patients when the number of drug-exposed patients is very small or when very rare drug risks are investigated. For that reason, in the United States, efforts have been made to develop models that link data from different electronic healthcare databases for monitoring the safety of medicines after authorization, in (i) the Sentinel Initiative and (ii) the Observational Medical Outcomes Partnership (OMOP). In July 2014, the pilot project Mini-Sentinel included a total of 178 million people from 18 different US databases. The merging of the data is based on a distributed data network with a common data model. In the European Network of Centres for Pharmacoepidemiology and Pharmacovigilance (ENCePP) there has been no comparable merging of data from different databases; however, first experiences have been gained in various EU drug safety projects. In Germany, the data of the statutory health insurance providers constitute the most important resource for establishing a large healthcare database. Their use for this purpose has so far been severely restricted by the Code of Social Law (Section 75, Book 10). Therefore, a reform of this section is absolutely necessary.
Digital Image Support in the ROADNet Real-time Monitoring Platform
NASA Astrophysics Data System (ADS)
Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.
2004-12-01
The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.
Volcano-Monitoring Instrumentation in the United States, 2008
Guffanti, Marianne; Diefenbach, Angela K.; Ewert, John W.; Ramsey, David W.; Cervelli, Peter F.; Schilling, Steven P.
2010-01-01
The United States is one of the most volcanically active countries in the world. According to the global volcanism database of the Smithsonian Institution, the United States (including its Commonwealth of the Northern Mariana Islands) is home to about 170 volcanoes that are in an eruptive phase, have erupted in historical time, or have not erupted recently but are young enough (eruptions within the past 10,000 years) to be capable of reawakening. From 1980 through 2008, 30 of these volcanoes erupted, several repeatedly. Volcano monitoring in the United States is carried out by the U.S. Geological Survey (USGS) Volcano Hazards Program, which operates a system of five volcano observatories-Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Hawaiian Volcano Observatory (HVO), Long Valley Observatory (LVO), and Yellowstone Volcano Observatory (YVO). The observatories issue public alerts about conditions and hazards at U.S. volcanoes in support of the USGS mandate under P.L. 93-288 (Stafford Act) to provide timely warnings of potential volcanic disasters to the affected populace and civil authorities. To make efficient use of the Nation's scientific resources, the volcano observatories operate in partnership with universities and other governmental agencies through various formal agreements. The Consortium of U.S. Volcano Observatories (CUSVO) was established in 2001 to promote scientific cooperation among the Federal, academic, and State agencies involved in observatory operations. Other groups also contribute to volcano monitoring by sponsoring long-term installation of geophysical instruments at some volcanoes for specific research projects. This report describes a database of information about permanently installed ground-based instruments used by the U.S. volcano observatories to monitor volcanic activity (unrest and eruptions). The purposes of this Volcano-Monitoring Instrumentation Database (VMID) are to (1) document the Nation's existing, ground-based, volcano-monitoring capabilities, (2) answer queries within a geospatial framework about the nature of the instrumentation, and (3) provide a benchmark for planning future monitoring improvements. The VMID is not an archive of the data collected by monitoring instruments, nor is it intended to keep track of whether a station is temporarily unavailable due to telemetry or equipment problems. Instead, it is a compilation of basic information about each instrument such as location, type, and sponsoring agency. Typically, instruments installed expressly for volcano monitoring are emplaced within about 20 kilometers (km) of a volcanic center; however, some more distant instruments (as far away as 100 km) can be used under certain circumstances and therefore are included in the database. Not included is information about satellite-based and airborne sensors and temporarily deployed instrument arrays, which also are used for volcano monitoring but do not lend themselves to inclusion in a geospatially organized compilation of sensor networks. This Open-File Report is provided in two parts: (1) an Excel spreadsheet (http://pubs.usgs.gov/of/2009/1165/) containing the version of the Volcano-Monitoring Instrumentation Database current through 31 December 2008 and (2) this text (in Adobe PDF format), which serves as metadata for the VMID. The disclaimer for the VMID is in appendix 1 of the text. Updated versions of the VMID will be posted on the Web sites of the Consortium of U.S. 
Volcano Observatories (http://www.cusvo.org/) and the USGS Volcano Hazards Program http://volcanoes.usgs.gov/activity/data/index.php.
Ukrainian Database and Atlas of Light Curves of Artificial Space Objects
NASA Astrophysics Data System (ADS)
Koshkin, N.; Savanevich, V.; Pohorelov, A.; Shakun, L.; Zhukov, V.; Korobeynikova, E.; Strakhova, S.; Moskalenko, S.; Kashuba, V.; Krasnoshchokov, A.
This paper describes the Ukrainian database of long-term photometric observations of resident space objects (RSOs). For the purpose of using this database for outer space monitoring and space situational awareness (SSA), an open internet resource has been developed. The paper shows examples of using the Atlas of light curves of RSOs for analyzing the state of rotation around the center of mass of several active and non-functioning satellites in orbit.
2017-01-01
ARL-TR-7921, US Army Research Laboratory, 2800 Powder Mill Road, Adelphi, MD 20783-1138. Subject terms: server database; structured query language; information objects; instructions; maintenance; cursor on target events; unattended ground sensors. Approved for public release; distribution is unlimited. Report sections include Introduction; Computer and Software Development Tools Requirements; Database Maintenance.
Colangelo, Christopher M.; Shifman, Mark; Cheung, Kei-Hoi; Stone, Kathryn L.; Carriero, Nicholas J.; Gulcicek, Erol E.; Lam, TuKiet T.; Wu, Terence; Bjornson, Robert D.; Bruce, Can; Nairn, Angus C.; Rinehart, Jesse; Miller, Perry L.; Williams, Kenneth R.
2015-01-01
We report a significantly-enhanced bioinformatics suite and database for proteomics research called Yale Protein Expression Database (YPED) that is used by investigators at more than 300 institutions worldwide. YPED meets the data management, archival, and analysis needs of a high-throughput mass spectrometry-based proteomics research ranging from a single laboratory, group of laboratories within and beyond an institution, to the entire proteomics community. The current version is a significant improvement over the first version in that it contains new modules for liquid chromatography–tandem mass spectrometry (LC–MS/MS) database search results, label and label-free quantitative proteomic analysis, and several scoring outputs for phosphopeptide site localization. In addition, we have added both peptide and protein comparative analysis tools to enable pairwise analysis of distinct peptides/proteins in each sample and of overlapping peptides/proteins between all samples in multiple datasets. We have also implemented a targeted proteomics module for automated multiple reaction monitoring (MRM)/selective reaction monitoring (SRM) assay development. We have linked YPED’s database search results and both label-based and label-free fold-change analysis to the Skyline Panorama repository for online spectra visualization. In addition, we have built enhanced functionality to curate peptide identifications into an MS/MS peptide spectral library for all of our protein database search identification results. PMID:25712262
Saokaew, Surasak; Suwankesawong, Wimon; Permsuwan, Unchalee; Chaiyakunapruk, Nathorn
2011-04-01
The use of herbal products continues to expand rapidly across the world and concerns regarding the safety of these products have been raised. In Thailand, Thai Vigibase, developed by the Health Product Vigilance Center (HPVC) under the Thai Food and Drug Administration, is the national database that collates reports from health product surveillance systems and programmes. Thai Vigibase can be used to identify signals of adverse events in patients receiving herbal products. The purpose of the study was to describe the characteristics of reported adverse events in patients receiving herbal products in Thailand. Thai Vigibase data from February 2000 to December 2008 involving adverse events reported in association with herbal products were used. This database includes case reports submitted through the spontaneous reporting system and intensive monitoring programmes. Under the spontaneous reporting system, adverse event reports are collected nationwide via a national network of 22 regional centres covering more than 800 public and private hospitals, and health service centres. An intensive monitoring programme was also conducted to monitor the five single herbal products listed in the Thai National List of Essential Medicines (NLEM), while another intensive monitoring programme was developed to monitor the four single herbal products that were under consideration for inclusion in the NLEM. The database contained patient demographics, adverse events associated with herbal products, and details on seriousness, causality and quality of reports. Descriptive statistics were used for data analyses. A total of 593 reports with 1868 adverse events involving 24 different products were made during the study period. The age range of individuals was 1-86 years (mean 47 years). Most case reports were obtained from the intensive monitoring programme. Of the reports, 72% involved females. The herbal products for which adverse events were frequently reported were products containing turmeric (44%), followed by andrographis (10%), veld grape (10%), pennywort (7%), plai (6%), jewel vine (6%), bitter melon (5%) and snake plant (5%). Gastrointestinal problems were the most common adverse effect reported. Serious adverse events included Stevens-Johnson syndrome, anaphylactic shock and exfoliative dermatitis. Adverse event reports on herbals products were diverse, with most of them being reported through intensive monitoring programmes. Thai Vigibase is a potentially effective data source for signal detection of adverse events associated with herbal products.
Problem Solving with Guided Repeated Oral Reading Instruction
ERIC Educational Resources Information Center
Conderman, Greg; Strobel, Debra
2006-01-01
Many students with disabilities require specialized instructional interventions and frequent progress monitoring in reading. The guided repeated oral reading technique promotes oral reading fluency while providing a reliable data-based monitoring system. This article emphasizes the importance of problem-solving when using this reading approach.
Wang, Hui; Zhang, Xiao-Bo; Huang, Lu-Qi; Guo, Lan-Ping; Wang, Ling; Zhao, Yu-Ping; Yang, Guang
2017-11-01
The supply of Chinese patent medicine is influenced by the price of raw materials (Chinese herbal medicines) and the stock of resources. On the one hand, raw material prices show cyclical volatility or even irreversible surges, making the price of Chinese patent medicine unstable and sometimes driving production costs above selling prices. On the other hand, due to a lack of resources, some Chinese patent medicines have been discontinued or forced out of production. Based on a micro-service architecture and a Redis cluster deployment, the supply security monitoring and analysis system for Chinese patent medicines in the national essential medicines list realizes dynamic monitoring and intelligent early warning for herbal materials and Chinese patent medicines by connecting and integrating the database of Chinese medicine resources, the dynamic monitoring system of traditional Chinese medicine resources, and the basic medicine database of Chinese patent medicines. Copyright© by the Chinese Pharmaceutical Association.
Blackwell, Brett R.; Ankley, Gerald T.; Corsi, Steven; DeCicco, Laura; Houck, Kieth A.; Judson, Richard S.; Li, Shibin; Martin, Matthew T.; Murphy, Elizabeth; Schroeder, Anthony L.; Smith, Edwin R.; Swintek, Joe; Villeneuve, Daniel L.
2017-01-01
Current environmental monitoring approaches focus primarily on chemical occurrence. However, based on concentration alone, it can be difficult to identify which compounds may be of toxicological concern and should be prioritized for further monitoring, in-depth testing, or management. This can be problematic because toxicological characterization is lacking for many emerging contaminants. New sources of high-throughput screening (HTS) data, such as the ToxCast database, which contains information for over 9000 compounds screened through up to 1100 bioassays, are now available. Integrated analysis of chemical occurrence data with HTS data offers new opportunities to prioritize chemicals, sites, or biological effects for further investigation based on concentrations detected in the environment linked to relative potencies in pathway-based bioassays. As a case study, chemical occurrence data from a 2012 study in the Great Lakes Basin along with the ToxCast effects database were used to calculate exposure–activity ratios (EARs) as a prioritization tool. Technical considerations of data processing and use of the ToxCast database are presented and discussed. EAR prioritization identified multiple sites, biological pathways, and chemicals that warrant further investigation. Prioritized bioactivities from the EAR analysis were linked to discrete adverse outcome pathways to identify potential adverse outcomes and biomarkers for use in subsequent monitoring efforts.
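For readers unfamiliar with the exposure-activity ratio, the sketch below (invented concentrations and activity values, not the Great Lakes dataset or actual ToxCast results) shows the core arithmetic: the measured environmental concentration is divided by the concentration that elicits activity in a bioassay, and ratios are summed per site to prioritize it.

```python
# Exposure-activity ratio (EAR) sketch: EAR = measured concentration / bioassay
# activity concentration, both in the same molar units. All values are invented.

# Lowest activity concentration (uM) per chemical from a ToxCast-like screen.
activity_conc_uM = {"bisphenol A": 0.5, "atrazine": 5.0, "metformin": 50.0}

# Measured surface-water concentrations (uM) at two hypothetical sites.
site_conc_uM = {
    "Site 1": {"bisphenol A": 0.010, "atrazine": 0.50},
    "Site 2": {"atrazine": 0.05, "metformin": 0.10},
}

for site, chems in site_conc_uM.items():
    ears = {c: conc / activity_conc_uM[c] for c, conc in chems.items()}
    print(site, "EARs:", {c: round(v, 4) for c, v in ears.items()},
          " summed EAR:", round(sum(ears.values()), 4))
```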
Space Object Radiometric Modeling for Hardbody Optical Signature Database Generation
2009-09-01
Introduction: This presentation summarizes recent activity in monitoring spacecraft health status using passive remote optical nonimaging ... Approved for public release; distribution is unlimited. ... It is beneficial to the observer/analyst to understand the fundamental optical signature variability associated with these detection and
A Web-Based Tool to Support Data-Based Early Intervention Decision Making
ERIC Educational Resources Information Center
Buzhardt, Jay; Greenwood, Charles; Walker, Dale; Carta, Judith; Terry, Barbara; Garrett, Matthew
2010-01-01
Progress monitoring and data-based intervention decision making have become key components of providing evidence-based early childhood special education services. Unfortunately, there is a lack of tools to support early childhood service providers' decision-making efforts. The authors describe a Web-based system that guides service providers…
Database for landscape-scale carbon monitoring sites
Jason A. Cole; Kristopher D. Johnson; Richard A. Birdsey; Yude Pan; Craig A. Wayson; Kevin McCullough; Coeli M. Hoover; David Y. Hollinger; John B. Bradford; Michael G. Ryan; Randall K. Kolka; Peter Wieshampel; Kenneth L. Clark; Nicholas S. Skowronski; John Hom; Scott V. Ollinger; Steven G. McNulty; Michael J. Gavazzi
2013-01-01
This report describes the database used to compile, store, and manage intensive ground-based biometric data collected at research sites in Colorado, Minnesota, New Hampshire, New Jersey, North Carolina, and Wyoming, supporting research activities of the U.S. North American Carbon Program (NACP). This report also provides details of each site, the sampling design and...
CALINVASIVES: a revolutionary tool to monitor invasive threats
M. Garbelotto; S. Drill; C. Powell; J. Malpas
2017-01-01
CALinvasives is a web-based relational database and content management system (CMS) cataloging the statewide distribution of invasive pathogens and pests and the plant hosts they impact. The database has been developed as a collaboration between the Forest Pathology and Mycology Laboratory at UC Berkeley and Calflora. CALinvasives will combine information on the...
Creating Smarter Classrooms: Data-Based Decision Making for Effective Classroom Management
ERIC Educational Resources Information Center
Gage, Nicholas A.; McDaniel, Sara
2012-01-01
The term "data-based decision making" (DBDM) has become pervasive in education and typically refers to the use of data to make decisions in schools, from assessment of an individual student's academic progress to whole-school reform efforts. Research suggests that special education teachers who use progress monitoring data (a DBDM…
Service Management Database for DSN Equipment
NASA Technical Reports Server (NTRS)
Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed
2009-01-01
This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
Electrochemical Impedance Sensors for Monitoring Trace Amounts of NO3 in Selected Growing Media.
Ghaffari, Seyed Alireza; Caron, William-O; Loubier, Mathilde; Normandeau, Charles-O; Viens, Jeff; Lamhamedi, Mohammed S; Gosselin, Benoit; Messaddeq, Younes
2015-07-21
With the advent of smart cities and big data, precision agriculture allows the feeding of sensor data into online databases for continuous crop monitoring, production optimization, and data storage. This paper describes a low-cost, compact, and scalable nitrate sensor based on electrochemical impedance spectroscopy for monitoring trace amounts of NO3- in selected growing media. The nitrate sensor can be integrated to conventional microelectronics to perform online nitrate sensing continuously over a wide concentration range from 0.1 ppm to 100 ppm, with a response time of about 1 min, and feed data into a database for storage and analysis. The paper describes the structural design, the Nyquist impedance response, the measurement sensitivity and accuracy, and the field testing of the nitrate sensor performed within tree nursery settings under ISO/IEC 17025 certifications.
Monitoring the Sodium Content of Restaurant Foods: Public Health Challenges and Opportunities
Cogswell, Mary E.; Gunn, Janelle P.; Curtis, Christine J.; Rhodes, Donna; Hoy, Kathy; Pehrsson, Pamela; Nickle, Melissa; Merritt, Robert
2013-01-01
We reviewed methods of studies assessing restaurant foods’ sodium content and nutrition databases. We systematically searched the 1964–2012 literature and manually examined references in selected articles and studies. Twenty-six (5.2%) of the 499 articles we found met the inclusion criteria and were abstracted. Five were conducted nationally. Sodium content determination methods included laboratory analysis (n = 15), point-of-purchase nutrition information or restaurants’ Web sites (n = 8), and menu analysis with a nutrient database (n = 3). There is no comprehensive data system that provides all information needed to monitor changes in sodium or other nutrients among restaurant foods. Combining information from different sources and methods may help inform a comprehensive system to monitor sodium content reduction efforts in the US food supply and to develop future strategies. PMID:23865701
The Genomes On Line Database (GOLD) v.2: a monitor of genome projects worldwide
Liolios, Konstantinos; Tavernarakis, Nektarios; Hugenholtz, Philip; Kyrpides, Nikos C.
2006-01-01
The Genomes On Line Database (GOLD) is a web resource for comprehensive access to information regarding complete and ongoing genome sequencing projects worldwide. The database currently incorporates information on over 1500 sequencing projects, of which 294 have been completed and the data deposited in the public databases. GOLD v.2 has been expanded to provide information related to organism properties such as phenotype, ecotype and disease. Furthermore, project relevance and availability information is now included. GOLD is available online and is mirrored at the Institute of Molecular Biology and Biotechnology, Crete, Greece. PMID:16381880
Implementation of a WAP-based telemedicine system for patient monitoring.
Hung, Kevin; Zhang, Yuan-Ting
2003-06-01
Many parties have already demonstrated telemedicine applications that use cellular phones and the Internet. A current trend in telecommunication is the convergence of wireless communication and computer network technologies, and the emergence of wireless application protocol (WAP) devices is an example. Since WAP will also be a common feature found in future mobile communication devices, it is worthwhile to investigate its use in telemedicine. This paper describes the implementation and experiences with a WAP-based telemedicine system for patient-monitoring that has been developed in our laboratory. It utilizes WAP devices as mobile access terminals for general inquiry and patient-monitoring services. Authorized users can browse the patients' general data, monitored blood pressure (BP), and electrocardiogram (ECG) on WAP devices in store-and-forward mode. The applications, written in wireless markup language (WML), WMLScript, and Perl, resided in a content server. A MySQL relational database system was set up to store the BP readings, ECG data, patient records, clinic and hospital information, and doctors' appointments with patients. A wireless ECG subsystem was built for recording ambulatory ECG in an indoor environment and for storing ECG data into the database. For testing, a WAP phone compliant with WAP 1.1 was used at GSM 1800 MHz by circuit-switched data (CSD) to connect to the content server through a WAP gateway, which was provided by a mobile phone service provider in Hong Kong. Data were successfully retrieved from the database and displayed on the WAP phone. The system shows how WAP can be feasible in remote patient-monitoring and patient data retrieval.
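As a rough illustration of the relational storage described above, the following sketch stores and retrieves blood-pressure readings per patient. The table layout is an assumption (the actual system used MySQL queried from Perl scripts and rendered in WML), and SQLite stands in for MySQL here.

```python
import sqlite3

# Illustrative sketch of the kind of relational schema described (patients,
# blood-pressure readings); table and column names are assumptions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE patient (patient_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE bp_reading (
        reading_id INTEGER PRIMARY KEY AUTOINCREMENT,
        patient_id INTEGER REFERENCES patient(patient_id),
        taken_at   TEXT,     -- timestamp of measurement
        systolic   INTEGER,  -- mmHg
        diastolic  INTEGER   -- mmHg
    );
""")
db.execute("INSERT INTO patient VALUES (1, 'Test Patient')")
db.execute("INSERT INTO bp_reading (patient_id, taken_at, systolic, diastolic) "
           "VALUES (1, '2003-05-01T09:30', 128, 82)")

# A WAP page handler would run a query like this and render the rows as WML.
rows = db.execute("""
    SELECT taken_at, systolic, diastolic
    FROM bp_reading WHERE patient_id = ? ORDER BY taken_at DESC
""", (1,)).fetchall()
print(rows)
```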
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
R2 Water Quality Portal Monitoring Stations
The Water Quality Portal (WQP) provides an easy way to access data stored in various large water quality databases. The WQP form provides various input parameters, including location, site, sampling, and date parameters, to filter and customize the returned results. The WQP is a cooperative service sponsored by the United States Geological Survey (USGS), the Environmental Protection Agency (EPA) and the National Water Quality Monitoring Council (NWQMC) that integrates publicly available water quality data from the USGS National Water Information System (NWIS), the EPA STOrage and RETrieval (STORET) Data Warehouse, and the USDA ARS Sustaining The Earth's Watersheds - Agricultural Research Database System (STEWARDS).
YUCSA: A CLIPS expert database system to monitor academic performance
NASA Technical Reports Server (NTRS)
Toptsis, Anestis A.; Ho, Frankie; Leindekar, Milton; Foon, Debra Low; Carbonaro, Mike
1991-01-01
The York University CLIPS Student Administrator (YUCSA), an expert database system implemented in C Language Integrated Processing System (CLIPS), for monitoring the academic performance of undergraduate students at York University, is discussed. The expert system component in the system has already been implemented for two major departments, and it is under testing and enhancement for more departments. Also, more elaborate user interfaces are under development. We describe the design and implementation of the system, problems encountered, and immediate future plans. The system has excellent maintainability and it is very efficient, taking less than one minute to complete an assessment of one student.
Philippe, Anne S; Plumejeaud-Perreau, Christine; Jourde, Jérôme; Pineau, Philippe; Lachaussée, Nicolas; Joyeux, Emmanuel; Corre, Frédéric; Delaporte, Philippe; Bocher, Pierrick
2017-01-01
Long-term benthic monitoring is rewarding in terms of science, but labour-intensive, whether in the field, the laboratory, or behind the computer. Building and managing databases require multiple skills, including consistency over time as well as organisation via a systematic approach. Here, we introduce and share our spatially explicit benthic database, comprising 11 years of benthic data. It is the result of intensive benthic sampling that has been conducted on a regular grid (259 stations) covering the intertidal mudflats of the Pertuis-Charentais (Marennes-Oléron Bay and Aiguillon Bay). Samples were taken on foot or by boat during winter, depending on tidal height, from December 2003 to February 2014. The present dataset includes abundances and biomass densities of all mollusc species of the study regions and principal polychaetes, as well as their length, accessibility to shorebirds, energy content and shell mass when appropriate and available. This database has supported many studies dealing with the spatial distribution of benthic invertebrates and temporal variations in food resources for shorebird species as well as latitudinal comparisons with other databases. In this paper, we introduce our benthos monitoring, share our data, and present a "guide of good practices" for building, cleaning and using it efficiently, providing examples of results with associated R code. The dataset has been formatted into a geo-referenced relational database, using the PostgreSQL open-source DBMS. We provide density information, measurements, energy content and accessibility of thirteen bivalve, nine gastropod and two polychaete taxa (a total of 66,620 individuals) for 11 consecutive winters. Figures and maps are provided to describe how the dataset was built, cleaned, and how it can be used. This dataset can again support studies concerning spatial and temporal variations in species abundance, interspecific interactions as well as evaluations of the availability of food resources for small- and medium-sized shorebirds and, potentially, conservation and impact assessment studies.
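The following hedged sketch shows how such a geo-referenced relational database might be queried from Python for mean densities of one taxon by station and winter. It assumes a locally accessible PostgreSQL instance; the table and column names (and the taxon) are hypothetical, not the published schema or its accompanying R code.

```python
import psycopg2  # assumes a local PostgreSQL server holding a (hypothetical) benthos schema

# Hypothetical query illustrating how a geo-referenced relational database of
# benthic samples might be interrogated: mean density of one taxon per station
# and winter. Table and column names are assumptions, not the published schema.
conn = psycopg2.connect("dbname=benthos user=postgres")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT station_id, winter, AVG(density_ind_m2) AS mean_density
        FROM sample
        WHERE taxon = %s
        GROUP BY station_id, winter
        ORDER BY station_id, winter
    """, ("Cerastoderma edule",))
    for station_id, winter, mean_density in cur.fetchall():
        print(station_id, winter, round(mean_density, 1))
```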
Development of a database for Louisiana highway bridge scour data : a program and manual.
DOT National Transportation Integrated Search
1999-10-01
A tremendous amount of scour data already exists for the highway bridges monitored by the Louisiana Department of Transportation and Development (DOTD). More than one hundred and twenty bridges are being monitored at a frequency of one to several tim...
Live load monitoring for the I-10 twin span bridge : research project capsule.
DOT National Transportation Integrated Search
2014-10-01
To establish a site-specific database for bridge evaluation and future bridge design, : DOTD established a long-term health monitoring system at the I-10 Twin Span Bridge. : The bridge is instrumented from deck to piles to capture bridge response (bo...
Monitoring aquatic resources for regional assessments requires an accurate and comprehensive inventory of the resource and useful classification of ecosystem similarities. Our research effort to create an electronic database and work with various ways to classify coastal wetlands...
Evaluation of low wind modeling approaches for two tall-stack databases.
Paine, Robert; Samani, Olga; Kaplan, Mary; Knipping, Eladio; Kumar, Naresh
2015-11-01
The performance of the AERMOD air dispersion model under low wind speed conditions, especially for applications with only one level of meteorological data and no direct turbulence measurements or vertical temperature gradient observations, is the focus of this study. The analysis documented in this paper addresses evaluations for low wind conditions involving tall stack releases for which multiple years of concurrent emissions, meteorological data, and monitoring data are available. AERMOD was tested on two field-study databases involving several SO2 monitors and hourly emissions data that had sub-hourly meteorological data (e.g., 10-min averages) available using several technical options: default mode, with various low wind speed beta options, and using the available sub-hourly meteorological data. These field study databases included (1) Mercer County, a North Dakota database featuring five SO2 monitors within 10 km of the Dakota Gasification Company's plant and the Antelope Valley Station power plant in an area of both flat and elevated terrain, and (2) a flat-terrain setting database with four SO2 monitors within 6 km of the Gibson Generating Station in southwest Indiana. Both sites featured regionally representative 10-m meteorological databases, with no significant terrain obstacles between the meteorological site and the emission sources. The low wind beta options show improvement in model performance helping to reduce some of the over-prediction biases currently present in AERMOD when run with regulatory default options. The overall findings with the low wind speed testing on these tall stack field-study databases indicate that AERMOD low wind speed options have a minor effect for flat terrain locations, but can have a significant effect for elevated terrain locations. The performance of AERMOD using low wind speed options leads to improved consistency of meteorological conditions associated with the highest observed and predicted concentration events. The available sub-hourly modeling results using the Sub-Hourly AERMOD Run Procedure (SHARP) are relatively unbiased and show that this alternative approach should be seriously considered to address situations dominated by low-wind meander conditions. AERMOD was evaluated with two tall stack databases (in North Dakota and Indiana) in areas of both flat and elevated terrain. AERMOD cases included the regulatory default mode, low wind speed beta options, and use of the Sub-Hourly AERMOD Run Procedure (SHARP). The low wind beta options show improvement in model performance (especially in higher terrain areas), helping to reduce some of the over-prediction biases currently present in regulatory default AERMOD. The SHARP results are relatively unbiased and show that this approach should be seriously considered to address situations dominated by low-wind meander conditions.
United states national land cover data base development? 1992-2001 and beyond
Yang, L.
2008-01-01
An accurate, up-to-date and spatially explicit national land cover database is required for monitoring the status and trends of the nation's terrestrial ecosystem, and for managing and conserving land resources at the national scale. Given the challenges and resources required to develop such a database, innovative and scientifically sound planning must be in place and a partnership formed among users from government agencies, research institutes and the private sector. In this paper, we summarize major scientific and technical issues regarding the development of the NLCD 1992 and 2001. Experiences and lessons learned from the project are documented with regard to project design, technical approaches, accuracy assessment strategy, and project implementation. Future improvements in developing the next-generation NLCD beyond 2001 are suggested, including: 1) enhanced satellite data preprocessing for correction of atmospheric and adjacency effects and for topographic normalization; 2) improved classification accuracy through comprehensive and consistent training data and new algorithm development; 3) multi-resolution and multi-temporal databases targeting major land cover changes and land cover database updates; 4) enriched database contents, including additional biophysical parameters and/or more detailed land cover classes obtained by synergizing multi-sensor, multi-temporal, and multi-spectral satellite data and ancillary data; and 5) transformation of the NLCD project into a national land cover monitoring program. © 2008 IEEE.
Avram, Robert; Marquis-Gravel, Guillaume; Simard, François; Pacheco, Christine; Couture, Étienne; Tremblay-Gravel, Maxime; Desplantie, Olivier; Malhamé, Isabelle; Bibas, Lior; Mansour, Samer; Parent, Marie-Claude; Farand, Paul; Harvey, Luc; Lessard, Marie-Gabrielle; Ly, Hung; Liu, Geoffrey; Hay, Annette E; Marc Jolicoeur, E
2018-07-01
Use of health administrative databases is proposed for screening and monitoring of participants in randomized registry trials. However, access to these databases raises privacy concerns. We assessed patients' preferences regarding use of personal information to link their research records with national health databases, as part of a hypothetical randomized registry trial. Cardiology patients were invited to complete an anonymous self-reported survey that ascertained preferences related to the concept of accessing government health databases for research, the type of personal identifiers to be shared and the type of follow-up preferred as participants in a hypothetical trial. A total of 590 responders completed the survey (90% response rate), the majority of whom were Caucasian (90.4%) and male (70.0%), with a median age of 65 years (interquartile range, 8). The majority of responders (80.3%) would grant researchers access to health administrative databases for screening and follow-up. To this end, responders endorsed the recording of their personal identifiers by researchers for future record linkage, including their name (90%) and health insurance number (83.9%), but fewer responders agreed with the recording of their social security number (61.4%, p<0.05 with date of birth as reference). Prior participation in a trial predicted agreement for granting researchers access to the administrative databases (OR: 1.69, 95% confidence interval: 1.03-2.90; p=0.04). The majority of Cardiology patients surveyed were supportive of the use of their personal identifiers to access administrative health databases and conduct long-term monitoring in the context of a randomized registry trial. Copyright © 2018 Elsevier Ireland Ltd. All rights reserved.
Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vivek Agarwal; Nancy J. Lybeck; Randall Bickford
2014-06-01
Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
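A minimal sketch of the fault-signature concept described above, purely illustrative: an asset type, a fault type and a set of fault features, with a naive matching check. The field names and example symptoms are assumptions, not the FW-PHM AFS Database schema.

```python
from dataclasses import dataclass, field

# Minimal sketch of a fault signature (asset type, fault type, fault features);
# names and example symptoms are illustrative, not the actual AFS Database schema.
@dataclass
class FaultSignature:
    asset_type: str
    fault_type: str
    features: set = field(default_factory=set)  # symptoms indicative of the fault

    def matches(self, observed: set) -> bool:
        """Naive check: all of the signature's features were observed."""
        return self.features <= observed

edg_overheat = FaultSignature(
    asset_type="emergency diesel generator",
    fault_type="cooling system degradation",
    features={"coolant temperature high", "coolant flow low"})

observed_symptoms = {"coolant temperature high", "coolant flow low", "vibration normal"}
print(edg_overheat.matches(observed_symptoms))  # True -> candidate diagnosis
```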
Do "Digital Certificates" Hold the Key to Colleges' On-Line Activities?
ERIC Educational Resources Information Center
Olsen, Florence
1999-01-01
Examines the increasing use of "digital certificates" to validate computer user identity in various applications on college and university campuses, including letting students register for courses, monitoring access to Internet2, and monitoring access to databases and electronic journals. The methodology has been developed by the…
42 CFR 488.68 - State Agency responsibilities for OASIS collection and data base requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... operating the OASIS system: (a) Establish and maintain an OASIS database. The State agency or other entity designated by CMS must— (1) Use a standard system developed or approved by CMS to collect, store, and analyze..., system back-up, and monitoring the status of the database; and (3) Obtain CMS approval before modifying...
Matched Comparison of PRAMS and the First Steps Database.
ERIC Educational Resources Information Center
Schubert, Stacey; Cawthon, Laurie
This study compared some of the survey data collected by the Pregnancy Risk Assessment Monitoring System (PRAMS) to information from vital records and administrative records in the First Steps Database (FSDB) for a group of women who gave birth in 1993. PRAMS is an ongoing survey of Washington women who have given birth. The FSDB contains birth…
Monitoring Colonias along the United States-Mexico border
Norman, Laura M.; Parcher, Jean W.; Lam, Alven H.
2004-01-01
The Colonias Monitoring Program provides a publicly accessible, binational, GIS database to enable civic leaders and citizens to inventory, analyze, and monitor growth, housing, and infrastructure in border communities. High-technology tools are provided to support planning efforts and development along the border, using a sustainable and comprehensive approach. The collective information can be used by nongovernmental organizations in preparing grant and loan applications for community-improvement projects.
A sensor monitoring system for telemedicine, safety and security applications
NASA Astrophysics Data System (ADS)
Vlissidis, Nikolaos; Leonidas, Filippos; Giovanis, Christos; Marinos, Dimitrios; Aidinis, Konstantinos; Vassilopoulos, Christos; Pagiatakis, Gerasimos; Schmitt, Nikolaus; Pistner, Thomas; Klaue, Jirka
2017-02-01
A sensor system capable of medical, safety and security monitoring in avionic and other environments (e.g. homes) is examined. For application inside an aircraft cabin, the system relies on an optical cellular network that connects each seat to a server and uses a set of database applications to process data related to passengers' health, safety and security status. Health monitoring typically encompasses electrocardiogram, pulse oximetry and blood pressure, body temperature and respiration rate, while safety and security monitoring is related to the standard flight attendant duties, such as cabin preparation for take-off, landing, flight in regions of turbulence, etc. In contrast to previous related works, this article focuses on the system's modules (medical and safety sensors and associated hardware), the database applications used for the overall control of the monitoring function and the potential use of the system for security applications. Further tests involving medical, safety and security sensing performed in a real A340 mock-up set-up are also described and reference is made to the possible use of the sensing system in alternative environments and applications, such as health monitoring within other means of transport (e.g. trains or small passenger sea vessels) as well as for remotely located home users, over a wired Ethernet network or the Internet.
Prescription drug monitoring programs in the United States of America
Félix, Sausan El Burai; Mack, Karin
2015-01-01
SYNOPSIS Since the late 1990s, the number of opioid analgesic overdose deaths has quadrupled in the United States of America (from 4 030 deaths in 1999 to 16 651 in 2010). The objectives of this article are to provide an overview of the problem of prescription drug overdose in the United States and to discuss actions that could help reduce the problem, with particular attention to the characteristics of prescription drug monitoring programs (PDMPs). These programs consist of state-level databases that monitor controlled substances. The information compiled in the databases is at the disposal of authorized persons (e.g., physicians, pharmacists, and other health-care providers) and may be used only for professional purposes. Suppliers can use such information to prevent interaction with other drugs or therapeutic duplication, or to identify drug-search behavior. Law enforcement agencies can use these programs to identify improper drug prescription or dispensing patterns, or drug diversion. PMID:25563153
Development of a replicated database of DHCP data for evaluation of drug use.
Graber, S E; Seneker, J A; Stahl, A A; Franklin, K O; Neel, T E; Miller, R A
1996-01-01
This case report describes development and testing of a method to extract clinical information stored in the Veterans Affairs (VA) Decentralized Hospital Computer System (DHCP) for the purpose of analyzing data about groups of patients. The authors used a microcomputer-based, structured query language (SQL)-compatible, relational database system to replicate a subset of the Nashville VA Hospital's DHCP patient database. This replicated database contained the complete current Nashville DHCP prescription, provider, patient, and drug data sets, and a subset of the laboratory data. A pilot project employed this replicated database to answer questions that might arise in drug-use evaluation, such as identification of cases of polypharmacy, suboptimal drug regimens, and inadequate laboratory monitoring of drug therapy. These database queries included as candidates for review all prescriptions for all outpatients. The queries demonstrated that specific drug-use events could be identified for any time interval represented in the replicated database. PMID:8653451
Development of a replicated database of DHCP data for evaluation of drug use.
Graber, S E; Seneker, J A; Stahl, A A; Franklin, K O; Neel, T E; Miller, R A
1996-01-01
This case report describes development and testing of a method to extract clinical information stored in the Veterans Affairs (VA) Decentralized Hospital Computer System (DHCP) for the purpose of analyzing data about groups of patients. The authors used a microcomputer-based, structured query language (SQL)-compatible, relational database system to replicate a subset of the Nashville VA Hospital's DHCP patient database. This replicated database contained the complete current Nashville DHCP prescription, provider, patient, and drug data sets, and a subset of the laboratory data. A pilot project employed this replicated database to answer questions that might arise in drug-use evaluation, such as identification of cases of polypharmacy, suboptimal drug regimens, and inadequate laboratory monitoring of drug therapy. These database queries included as candidates for review all prescriptions for all outpatients. The queries demonstrated that specific drug-use events could be identified for any time interval represented in the replicated database.
WOVOdat as a worldwide resource to improve eruption forecasts
NASA Astrophysics Data System (ADS)
Widiwijayanti, Christina; Costa, Fidel; Zar Win Nang, Thin; Tan, Karine; Newhall, Chris; Ratdomopurbo, Antonius
2015-04-01
During periods of volcanic unrest, volcanologists need to interpret signs of unrest to be able to forecast whether an eruption is likely to occur. Some volcanic eruptions display signs of impending eruption such as seismic activity, surface deformation, or gas emissions; but not all will give signs and not all signs are necessarily followed by an eruption. Volcanoes behave differently. Precursory signs of an eruption are sometimes very short, less than an hour, but can also be weeks, months, or even years. Some volcanoes are regularly active and closely monitored, while others aren't. Often, the record of precursors to historical eruptions of a volcano isn't enough to allow a forecast of its future activity. Therefore, volcanologists must refer to monitoring data of unrest and eruptions at similar volcanoes. WOVOdat is the World Organization of Volcano Observatories' Database of volcanic unrest - an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely web-accessible for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology as an epidemiological database is to medicine. We have up to now incorporated about 15% of worldwide unrest data into WOVOdat, covering more than 100 eruption episodes, which includes: volcanic background data, eruptive histories, monitoring data (seismic, deformation, gas, hydrology, thermal, fields, and meteorology), monitoring metadata, and supporting data such as reports, images, maps and videos. Nearly all data in WOVOdat are time-stamped and geo-referenced. Along with creating a database on volcanic unrest, WOVOdat is also developing web tools to help users query, visualize, and compare data, which can further be used for probabilistic eruption forecasting. Reference to WOVOdat will be especially helpful at volcanoes that have not erupted in historical or 'instrumental' time and thus for which no previous data exist. The more data in WOVOdat, the more useful it will be. We actively solicit relevant data contributions from volcano observatories, other institutions, and individual researchers. Detailed information and documentation about the database and how to use it can be found at www.wovodat.org.
A data management infrastructure for bridge monitoring
NASA Astrophysics Data System (ADS)
Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.
2015-04-01
This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue not only for storing the sensor data but also for integrating with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective data management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state-of-the-art of bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies that have been shown useful in handling high volume, time-series data and to flexibly deal with unstructured data schema. Specifically, Apache Cassandra and Mongo DB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
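To make the document-store idea concrete, here is a small hedged sketch of storing and querying time-series sensor readings with MongoDB via pymongo. It assumes a local MongoDB instance, and the database, collection and field names are illustrative rather than the authors' actual design (which also covers Apache Cassandra, BrIM and SensorML).

```python
from datetime import datetime
from pymongo import MongoClient, ASCENDING  # assumes a local MongoDB instance

# Illustrative sketch of storing and querying time-series sensor readings in a
# document store; database, collection and field names are assumptions.
client = MongoClient("mongodb://localhost:27017")
readings = client["bridge_monitoring"]["sensor_readings"]
readings.create_index([("sensor_id", ASCENDING), ("timestamp", ASCENDING)])

readings.insert_one({
    "sensor_id": "strain-gauge-017",          # hypothetical sensor identifier
    "timestamp": datetime(2014, 3, 1, 12, 0, 0),
    "value": 412.6,                            # e.g. microstrain
    "bridge_element": "girder-G3",             # link back to a BrIM model element
})

# Retrieve one sensor's readings for a time window, ordered in time.
window = readings.find(
    {"sensor_id": "strain-gauge-017",
     "timestamp": {"$gte": datetime(2014, 3, 1), "$lt": datetime(2014, 3, 2)}}
).sort("timestamp", ASCENDING)
for doc in window:
    print(doc["timestamp"], doc["value"])
```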
EasyKSORD: A Platform of Keyword Search Over Relational Databases
NASA Astrophysics Data System (ADS)
Peng, Zhaohui; Li, Jing; Wang, Shan
Keyword Search Over Relational Databases (KSORD) enables casual users to use keyword queries (a set of keywords) to search relational databases just like searching the Web, without any knowledge of the database schema or any need of writing SQL queries. Based on our previous work, we design and implement a novel KSORD platform named EasyKSORD for users and system administrators to use and manage different KSORD systems in a novel and simple manner. EasyKSORD supports advanced queries, efficient data-graph-based search engines, multiform result presentations, and system logging and analysis. Through EasyKSORD, users can search relational databases easily and read search results conveniently, and system administrators can easily monitor and analyze the operations of KSORD and manage KSORD systems much better.
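As a point of contrast with the data-graph-based engines managed by EasyKSORD, the sketch below shows the most naive possible keyword search over relational data: scan the text columns of each table for every keyword. The tiny schema and data are assumptions; it is meant only to illustrate the problem that KSORD systems solve far more efficiently.

```python
import sqlite3

# Deliberately naive keyword search over relational data: scan the text columns
# of each table for every keyword. Schema and rows are illustrative assumptions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE paper  (id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER);
    INSERT INTO author VALUES (1, 'Peng Zhaohui');
    INSERT INTO paper  VALUES (1, 'Keyword search over relational databases', 1);
""")

def keyword_search(keywords):
    tables = {"author": ["name"], "paper": ["title"]}
    hits = []
    for table, columns in tables.items():
        # Require every keyword to match at least one text column of the row.
        clause = " AND ".join(
            "(" + " OR ".join(f"{c} LIKE ?" for c in columns) + ")" for _ in keywords)
        params = [f"%{kw}%" for kw in keywords for _ in columns]
        hits += [(table, row) for row in
                 db.execute(f"SELECT * FROM {table} WHERE {clause}", params)]
    return hits

print(keyword_search(["keyword", "relational"]))
```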
Development of a land-cover characteristics database for the conterminous U.S.
Loveland, Thomas R.; Merchant, J.W.; Ohlen, D.O.; Brown, Jesslyn F.
1991-01-01
Information regarding the characteristics and spatial distribution of the Earth's land cover is critical to global environmental research. A prototype land-cover database for the conterminous United States designed for use in a variety of global modelling, monitoring, mapping, and analytical endeavors has been created. The resultant database contains multiple layers, including the source AVHRR data, the ancillary data layers, the land-cover regions defined by the research, and translation tables linking the regions to other land classification schema (for example, UNESCO, USGS Anderson System). The land-cover characteristics database can be analyzed, transformed, or aggregated by users to meet a broad spectrum of requirements. -from Authors
Johansson, Saga; Wallander, Mari-Ann; de Abajo, Francisco J; García Rodríguez, Luis Alberto
2010-03-01
Post-launch drug safety monitoring is essential for the detection of adverse drug signals that may be missed during preclinical trials. Traditional methods of postmarketing surveillance such as spontaneous reporting have intrinsic limitations, many of which can be overcome by the additional application of structured pharmacoepidemiological approaches. However, further improvement in drug safety monitoring requires a shift towards more proactive pharmacoepidemiological methods that can detect adverse drug signals as they occur in the population. To assess the feasibility of using proactive monitoring of an electronic medical record system, in combination with an independent endpoint adjudication committee, to detect adverse events among users of selected drugs. UK General Practice Research Database (GPRD) information was used to detect acute liver disorder associated with the use of amoxicillin/clavulanic acid (hepatotoxic) or low-dose aspirin (acetylsalicylic acid [non-hepatotoxic]). Individuals newly prescribed these drugs between 1 October 2005 and 31 March 2006 were identified. Acute liver disorder cases were assessed using GPRD computer records in combination with case validation by an independent endpoint adjudication committee. Signal generation thresholds were based on the background rate of acute liver disorder in the general population. Over a 6-month period, 8148 patients newly prescribed amoxicillin/clavulanic acid and 5577 patients newly prescribed low-dose aspirin were identified. Within this cohort, searches identified 11 potential liver disorder cases from computerized records: six for amoxicillin/clavulanic acid and five for low-dose aspirin. The independent endpoint adjudication committee refined this to four potential acute liver disorder cases for whom paper-based information was requested for final case assessment. Final case assessments confirmed no cases of acute liver disorder. The time taken for this study was 18 months (6 months for recruitment and 12 months for data management and case validation). To reach the estimated target exposure necessary to raise or rule out a signal of concern to public health, we determined that a recruitment period 2-3 times longer than that used in this study would be required. Based on the real market uptake of six commonly used medicinal products launched between 2001 and 2006 in the UK (budesonide/eformoterol [fixed-dose combination], duloxetine, ezetimibe, metformin/rosiglitazone [fixed-dose combination], tiotropium bromide and tadalafil) the target exposure would not have been reached until the fifth year of marketing using a single database. It is feasible to set up a system that actively monitors drug safety using a healthcare database and an independent endpoint adjudication committee. However, future successful implementation will require multiple databases to be queried so that larger study populations are included. This requires further development and harmonization of international healthcare databases.
Wide-area continuous offender monitoring
NASA Astrophysics Data System (ADS)
Hoshen, Joseph; Drake, George; Spencer, Debra D.
1997-02-01
The corrections system in the U.S. is supervising over five million offenders. This number is rising fast and so are the direct and indirect costs to society. To improve supervision and reduce the cost of parole and probation, first generation home arrest systems were introduced in 1987. While these systems proved to be helpful to the corrections system, their scope is rather limited because they only cover an offender at a single location and provide only a partial time coverage. To correct the limitations of first- generation systems, second-generation wide area continuous electronic offender monitoring systems, designed to monitor the offender at all times and locations, are now on the drawing board. These systems use radio frequency location technology to track the position of offenders. The challenge for this technology is the development of reliable personal locator devices that are small, lightweight, with long operational battery life, and indoors/outdoors accuracy of 100 meters or less. At the center of a second-generation system is a database that specifies the offender's home, workplace, commute, and time the offender should be found in each. The database could also define areas from which the offender is excluded. To test compliance, the system would compare the observed coordinates of the offender with the stored location for a given time interval. Database logfiles will also enable law enforcement to determine if a monitored offender was present at a crime scene and thus include or exclude the offender as a potential suspect.
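A toy sketch of the compliance test described above: given a schedule of where the offender should be during each time interval, check whether an observed position lies within an allowed radius of the scheduled location. The schedule format, coordinates and radii are illustrative assumptions.

```python
from datetime import datetime, time
from math import asin, cos, radians, sin, sqrt

# Sketch of a compliance test: compare an observed position against the zone
# where the offender is scheduled to be at that time. Schedule, coordinates and
# radii are illustrative assumptions.
SCHEDULE = [
    # (start, end, label, latitude, longitude, allowed radius in metres)
    (time(22, 0), time(23, 59, 59), "home", 35.0841, -106.6511, 150.0),
    (time(9, 0),  time(17, 0),      "work", 35.1107, -106.6100, 300.0),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    rlat1, rlon1, rlat2, rlon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((rlat2 - rlat1) / 2) ** 2 + cos(rlat1) * cos(rlat2) * sin((rlon2 - rlon1) / 2) ** 2
    return 2 * 6371000.0 * asin(sqrt(a))

def is_compliant(when: datetime, lat: float, lon: float) -> bool:
    for start, end, _label, zlat, zlon, radius in SCHEDULE:
        if start <= when.time() <= end:
            return haversine_m(lat, lon, zlat, zlon) <= radius
    return True  # no scheduled zone for this time interval -> nothing to enforce

print(is_compliant(datetime(1997, 2, 1, 22, 30), 35.0843, -106.6512))  # near "home" -> True
```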
Poon, Art F Y; Gustafson, Réka; Daly, Patricia; Zerr, Laura; Demlow, S Ellen; Wong, Jason; Woods, Conan K; Hogg, Robert S; Krajden, Mel; Moore, David; Kendall, Perry; Montaner, Julio S G; Harrigan, P Richard
2016-05-01
HIV evolves rapidly and therefore infections with similar genetic sequences are likely linked by recent transmission events. Clusters of related infections can represent subpopulations with high rates of transmission. We describe the implementation of an automated near real-time system to monitor and characterise HIV transmission hotspots in British Columbia, Canada. In this implementation case study, we applied a monitoring system to the British Columbia drug treatment database, which holds more than 32 000 anonymised HIV genotypes for nearly 9000 residents of British Columbia living with HIV. On average, five to six new HIV genotypes are deposited in the database every day, which triggers an automated reanalysis of the entire database. We extracted clusters of five or more individuals with short phylogenetic distances between their respective HIV sequences. The system generated monthly reports of the growth and characteristics of clusters that were distributed to public health officers. In June, 2014, the monitoring system detected the expansion of a cluster by 11 new cases during 3 months, including eight cases with transmitted drug resistance. This cluster generally comprised young men who have sex with men. The subsequent report precipitated an enhanced public health follow-up to ensure linkage to care and treatment initiation in the affected subpopulation. Of the nine cases associated with this follow-up, all had already been linked to care and five cases had started treatment. Subsequent to the follow-up, three additional cases started treatment and most cases achieved suppressed viral loads. During the next 12 months, we detected 12 new cases in this cluster with reduction in the onward transmission of drug resistance. Our findings show the first application of an automated phylogenetic system monitoring a clinical database to detect a recent HIV outbreak and support the ensuing public health response. By making secondary use of routinely collected HIV genotypes, this approach is cost-effective, attains near real-time monitoring of new cases, and can be implemented in all settings in which HIV genotyping is the standard of care. BC Centre for Excellence in HIV/AIDS, the Canadian Institutes for Health Research, the Genome Canada-CIHR Partnership in Genomics and Personalized Health, and the US National Institute on Drug Abuse. Copyright © 2016 Elsevier Ltd. All rights reserved.
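The clustering step described above can be illustrated with a toy single-linkage grouping of sequences whose pairwise distances fall below a threshold, keeping clusters of five or more members. The distances, the 0.02 cutoff and the minimum size are assumptions for illustration; the actual system derives distances from phylogenies built over the full genotype database.

```python
from itertools import combinations

# Toy single-linkage clustering: group sequences whose pairwise distances fall
# below a threshold, then keep clusters of at least min_size members.
def cluster_by_distance(distances, n, threshold=0.02, min_size=5):
    parent = list(range(n))                      # union-find over sequence indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for (i, j), d in distances.items():
        if d < threshold:
            parent[find(i)] = find(j)            # merge the two clusters

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return [members for members in groups.values() if len(members) >= min_size]

# Hypothetical pairwise distances among 6 anonymised genotypes (indices 0..5).
dists = {(i, j): 0.01 for i, j in combinations(range(5), 2)}  # 0-4 tightly linked
dists[(4, 5)] = 0.08                                          # 5 is unrelated
print(cluster_by_distance(dists, 6))  # -> [[0, 1, 2, 3, 4]]
```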
Hürlimann, Eveline; Schur, Nadine; Boutsika, Konstantina; Stensgaard, Anna-Sofie; Laserna de Himpsl, Maiti; Ziegelbauer, Kathrin; Laizer, Nassor; Camenzind, Lukas; Di Pasquale, Aurelio; Ekpo, Uwem F.; Simoonga, Christopher; Mushinge, Gabriel; Saarnak, Christopher F. L.; Utzinger, Jürg; Kristensen, Thomas K.; Vounatsou, Penelope
2011-01-01
Background After many years of general neglect, interest has grown and efforts came under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key feature to target control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open-access to the available survey data that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. Methodology With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and ‘grey literature’), contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. Principal Findings At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries available under http://www.gntd.org. Currently, the database is expanded to a global repository, including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. Conclusions An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time. With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment and running of a global NTD database is feasible and should be expanded without delay. PMID:22180793
The Deep Impact Network Experiment Operations Center Monitor and Control System
NASA Technical Reports Server (NTRS)
Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan
2009-01-01
The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
Chesapeake Bay Program Water Quality Database
The Chesapeake Information Management System (CIMS), designed in 1996, is an integrated, accessible information management system for the Chesapeake Bay Region. CIMS is an organized, distributed library of information and software tools designed to increase basin-wide public access to Chesapeake Bay information. The information delivered by CIMS includes technical and public information, educational material, environmental indicators, policy documents, and scientific data. Through the use of relational databases, web-based programming, and web-based GIS a large number of Internet resources have been established. These resources include multiple distributed on-line databases, on-demand graphing and mapping of environmental data, and geographic searching tools for environmental information. Baseline monitoring data, summarized data and environmental indicators that document ecosystem status and trends, confirm linkages between water quality, habitat quality and abundance, and the distribution and integrity of biological populations are also available. One of the major features of the CIMS network is the Chesapeake Bay Program's Data Hub, providing users access to a suite of long- term water quality and living resources databases. Chesapeake Bay mainstem and tidal tributary water quality, benthic macroinvertebrates, toxics, plankton, and fluorescence data can be obtained for a network of over 800 monitoring stations.
NASA Astrophysics Data System (ADS)
Bulan, Orhan; Bernal, Edgar A.; Loce, Robert P.; Wu, Wencheng
2013-03-01
Video cameras are widely deployed along city streets, interstate highways, traffic lights, stop signs and toll booths by entities that perform traffic monitoring and law enforcement. The videos captured by these cameras are typically compressed and stored in large databases. Performing a rapid search for a specific vehicle within a large database of compressed videos is often required and can be time-critical, even a matter of life or death. In this paper, we propose video compression and decompression algorithms that enable fast and efficient vehicle or, more generally, event searches in large video databases. While compressing a video sequence, the proposed algorithm selects reference frames (i.e., I-frames) based on a vehicle having been detected at a specified position within the monitored scene. A search for a specific vehicle in the compressed video stream is performed across the reference frames only, which does not require decompression of the full video sequence as in traditional search algorithms. Our experimental results on videos captured on a local road show that the proposed algorithm significantly reduces the search space (thus reducing time and computational resources) in vehicle search tasks within compressed video streams, particularly those captured in light traffic volume conditions.
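A hedged sketch of the frame-type selection idea: while encoding, force an I-frame whenever a vehicle is detected at the trigger position (with an upper bound on the gap between I-frames), so a later search decodes only the I-frames. The detector callback and the GOP limit are placeholders, not a specific codec API.

```python
# Sketch of detection-driven reference-frame selection; the detector callback
# and the GOP limit are placeholders (assumptions), not a specific codec API.
def select_frame_types(frames, vehicle_at_trigger, gop_max=250):
    """Return a list of 'I' / 'P' decisions, one per frame."""
    decisions, since_last_i = [], gop_max  # force an I-frame at the start
    for frame in frames:
        if vehicle_at_trigger(frame) or since_last_i >= gop_max:
            decisions.append("I")
            since_last_i = 0
        else:
            decisions.append("P")
            since_last_i += 1
    return decisions

# A vehicle search then decodes and inspects only the frames marked 'I',
# instead of decompressing the full sequence.
fake_frames = list(range(10))
detections = {3, 7}   # frames where a vehicle crosses the trigger position
print(select_frame_types(fake_frames, lambda f: f in detections, gop_max=4))
```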
Monitoring by Use of Clusters of Sensor-Data Vectors
NASA Technical Reports Server (NTRS)
Iverson, David L.
2007-01-01
The inductive monitoring system (IMS) is a system of computer hardware and software for automated monitoring of the performance, operational condition, physical integrity, and other aspects of the health of a complex engineering system (e.g., an industrial process line or a spacecraft). The input to the IMS consists of streams of digitized readings from sensors in the monitored system. The IMS determines the type and amount of any deviation of the monitored system from a nominal or normal ("healthy") condition on the basis of a comparison between (1) vectors constructed from the incoming sensor data and (2) corresponding vectors in a database of nominal or normal behavior. The term "inductive" reflects the use of a process reminiscent of traditional mathematical induction to learn about normal operation and build the nominal-condition database. The IMS offers two major advantages over prior computational monitoring systems: The computational burden of the IMS is significantly smaller, and there is no need for abnormal-condition sensor data for training the IMS to recognize abnormal conditions. The figure schematically depicts the relationships among the computational processes effected by the IMS. Training sensor data are gathered during normal operation of the monitored system, detailed computational simulation of operation of the monitored system, or both. The training data are formed into vectors that are used to generate the database. The vectors in the database are clustered into regions that represent normal or nominal operation. Once the database has been generated, the IMS compares the vectors of incoming sensor data with vectors representative of the clusters. The monitored system is deemed to be operating normally or abnormally, depending on whether the vector of incoming sensor data is or is not, respectively, sufficiently close to one of the clusters. For this purpose, a distance between two vectors is calculated by a suitable metric (e.g., Euclidean distance) and "sufficiently close" signifies lying at a distance less than a specified threshold value. It must be emphasized that although the IMS is intended to detect off-nominal or abnormal performance or health, it is not necessarily capable of performing a thorough or detailed diagnosis. Limited diagnostic information may be available under some circumstances. For example, the distance of a vector of incoming sensor data from the nearest cluster could serve as an indication of the severity of a malfunction. The identity of the nearest cluster may be a clue as to the identity of the malfunctioning component or subsystem. It is possible to decrease the IMS computation time by use of a combination of cluster-indexing and -retrieval methods. For example, in one method, the distances between each cluster and two or more reference vectors can be used for the purpose of indexing and retrieval. The clusters are sorted into a list according to these distance values, typically in ascending order of distance. When a set of input data arrives and is to be tested, the data are first arranged as an ordered set (that is, a vector). The distances from the input vector to the reference points are computed. The search of clusters from the list can then be limited to those clusters lying within a certain distance range from the input vector; the computation time is reduced by not searching the clusters at a greater distance.
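The clustering-and-distance logic can be sketched in a few lines. The greedy clustering, the radius and the deviation threshold below are simplifications chosen for illustration; they are not the actual IMS algorithm or its indexing scheme.

```python
import numpy as np

# Simplified sketch of inductive monitoring: build clusters from nominal training
# vectors, then flag an incoming vector as off-nominal if it lies farther than a
# threshold from every cluster centre. Greedy clustering and thresholds are
# illustrative simplifications, not the actual IMS algorithm.
def build_clusters(training_vectors, radius):
    clusters, counts = [], []        # each cluster: running mean of its members
    for v in training_vectors:
        if clusters:
            d = [np.linalg.norm(v - c) for c in clusters]
            i = int(np.argmin(d))
            if d[i] <= radius:                      # absorb into nearest cluster
                counts[i] += 1
                clusters[i] += (v - clusters[i]) / counts[i]
                continue
        clusters.append(v.astype(float).copy())     # start a new cluster
        counts.append(1)
    return clusters

def deviation(v, clusters):
    """Distance from vector v to the nearest nominal cluster."""
    return min(np.linalg.norm(v - c) for c in clusters)

rng = np.random.default_rng(0)
nominal = rng.normal(loc=[10.0, 50.0, 1.0], scale=0.2, size=(500, 3))  # "healthy" sensor data
clusters = build_clusters(nominal, radius=1.0)

print(deviation(np.array([10.1, 50.2, 1.0]), clusters))   # small -> nominal
print(deviation(np.array([10.0, 65.0, 1.0]), clusters))   # large -> off-nominal
```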
Information access in a dual-task context: testing a model of optimal strategy selection.
Wickens, C D; Seidler, K S
1997-09-01
Pilots were required to access information from a hierarchical aviation database by navigating under single-task conditions (Experiment 1) and when this task was time-shared with an altitude-monitoring task of varying bandwidth and priority (Experiment 2). In dual-task conditions, pilots had 2 viewports available, 1 always used for the information task and the other to be allocated to either task. Dual-task strategy, inferred from the decision of which task to allocate to the 2nd viewport, revealed that allocation was generally biased in favor of the monitoring task and was only partly sensitive to the difficulty of the 2 tasks and their relative priorities. Some dominant sources of navigational difficulties failed to adaptively influence selection strategy. The implications of the results are to provide tools for jumping to the top of the database, to provide 2 viewports into the common database, and to provide training as to the optimum viewport management strategy in a multitask environment.
Preliminary Results on Design and Implementation of a Solar Radiation Monitoring System
Balan, Mugur C.; Damian, Mihai; Jäntschi, Lorentz
2008-01-01
The paper presents a solar radiation monitoring system, using two scientific pyranometers and an on-line, home-made computer data acquisition system. The first pyranometer measures the global solar radiation and the other one, which is shaded, measures the diffuse radiation. The values of total and diffuse solar radiation are continuously stored in a database on a server. Original software was created for data acquisition and for interrogation of the resulting system. The server application acquires the data from the pyranometers and stores it in the database at a rate of one record every 50 seconds. The client-server application queries the database and provides descriptive statistics. A web interface allows any user to define the inclusion criteria and to obtain the results. In terms of results, the system is able to provide direct, diffuse and total radiation intensities as time series. The client-server application also computes derived heats. The ability of the system to evaluate the local solar energy potential is highlighted. PMID:27879746
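A minimal sketch of such an acquisition loop, assuming placeholder functions for the two pyranometer read-outs and a simple table layout; the real system used its own instrument drivers and server-side software.

```python
import sqlite3
import time

# Sketch of the acquisition loop described above: read both pyranometers and
# store one record every 50 seconds. read_global()/read_diffuse() stand in for
# the real instrument drivers and are assumptions, as is the table layout.
def read_global():   # placeholder for the unshaded pyranometer (W/m^2)
    return 645.2

def read_diffuse():  # placeholder for the shaded pyranometer (W/m^2)
    return 112.7

db = sqlite3.connect("solar_radiation.db")
db.execute("""CREATE TABLE IF NOT EXISTS radiation (
                  acquired_at TEXT, global_wm2 REAL, diffuse_wm2 REAL)""")

for _ in range(3):                      # a real logger would loop indefinitely
    db.execute("INSERT INTO radiation VALUES (datetime('now'), ?, ?)",
               (read_global(), read_diffuse()))
    db.commit()
    time.sleep(50)                      # one record every 50 seconds
```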
Seismic databases of The Caucasus
NASA Astrophysics Data System (ADS)
Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.
2012-12-01
The Caucasus is one of the active segments of the Alpine-Himalayan collision belt. The region needs continuous seismic monitoring systems for a better understanding of the tectonic processes at work in the region. The Seismic Monitoring Center of Georgia (Ilia State University) operates the digital seismic network of the country and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily reachable and convenient for scientists to use. The seismological database includes information about more than 100,000 earthquakes from the whole Caucasus, combining data from analog and digital seismic networks. The first analog seismic station in Georgia was installed in 1899 in Tbilisi. The number of analog seismic stations increased during the following decades, and in the 1980s about 100 analog stations operated all over the region. From 1992, due to the political and economic situation, the number of stations decreased, and by 2002 only two analog instruments remained in operation. A new digital seismic network has been developed in Georgia since 2003; the number of digital stations has grown, and today more than 25 digital stations operate in the country. The database includes detailed information about all equipment installed at the seismic stations. The database is available online, which provides a convenient interface for seismic data exchange between the neighboring Caucasus countries. It also simplifies both seismic data processing and transfer of data to the database, and reduces operator mistakes during routine work. The database was created using the following: PHP, MySQL, JavaScript, Ajax, GMT, Gmap and Hypoinverse.
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Fontaine, Alain
2016-04-01
IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at providing long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Database Portal (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). The new IAGOS Database Portal was released in December 2015. The main improvement is the implementation of interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the CAMS data center in Jülich (http://join.iek.fz-juelich.de). The CAMS (Copernicus Atmospheric Monitoring Service) project is a prominent user of the IGAS data network. The new portal provides improved and new services such as download in NetCDF or NASA Ames formats, plotting tools (maps, time series, vertical profiles, etc.) and user management. Added-value products are available on the portal: back trajectories, origin of air masses, co-location with satellite data, etc. The link with the CAMS data center, through JOIN (Jülich OWS Interface), makes it possible to combine model outputs with IAGOS data for inter-comparison. Finally, IAGOS metadata have been standardized (ISO 19115) and now provide complete information about data traceability and quality.
Prieto, Claudia I; Palau, María J; Martina, Pablo; Achiary, Carlos; Achiary, Andrés; Bettiol, Marisa; Montanaro, Patricia; Cazzola, María L; Leguizamón, Mariana; Massillo, Cintia; Figoli, Cecilia; Valeiras, Brenda; Perez, Silvia; Rentería, Fernando; Diez, Graciela; Yantorno, Osvaldo M; Bosch, Alejandra
2016-01-01
The epidemiological and clinical management of cystic fibrosis (CF) patients suffering from acute pulmonary exacerbations or chronic lung infections demands continuous updating of the medical and microbiological processes associated with the constant evolution of pathogens during host colonization. In order to monitor the dynamics of these processes, it is essential to have expert systems capable of storing and subsequently extracting the information generated from different studies of the patients and of the microorganisms isolated from them. In this work we have designed and developed an on-line database based on an information system that makes it possible to store, manage and visualize data from clinical studies and microbiological analyses of bacteria obtained from the respiratory tract of patients suffering from cystic fibrosis. The information system, named Cystic Fibrosis Cloud database, is available at http://servoy.infocomsa.com/cfc_database and is composed of a main database and a web-based interface, which uses Servoy's product architecture based on Java technology. Although the CFC database system can be implemented as a local program for private use in CF centers, it can also be used, updated and shared by different users who can access the stored information in a systematic, practical and safe manner. The implementation of the CFC database could have a significant impact on the monitoring of respiratory infections, the prevention of exacerbations, the detection of emerging organisms, and the adequacy of control strategies for lung infections in CF patients. Copyright © 2015 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.
The use of a computerized database to monitor vaccine safety in Viet Nam.
Ali, Mohammad; Canh, Gia Do; Clemens, John D.; Park, Jin-Kyung; von Seidlein, Lorenz; Minh, Tan Truong; Thiem, Dinh Vu; Tho, Huu Le; Trach, Duc Dang
2005-01-01
Health information systems to monitor vaccine safety are used in industrialized countries to detect adverse medical events related to vaccinations or to prove the safety of vaccines. There are no such information systems in the developing world, but they are urgently needed. A large linked database for the monitoring of vaccine-related adverse events has been established in Khanh Hoa province, Viet Nam. Data collected during the first 2 years of surveillance, a period which included a mass measles vaccination campaign, were used to evaluate the system. For this purpose the discharge diagnoses of individuals admitted to polyclinics and hospitals were coded according to the International Classification of Diseases (ICD)-10 guidelines and linked in a dynamic population database with vaccination histories. A case-series analysis was applied to the cohort of children vaccinated during the mass measles vaccination campaign. The study recorded 107,022 immunizations in a catchment area with a population of 357,458 and confirmed vaccine coverage of 87% or higher for completed routine childhood vaccinations. The measles vaccination campaign immunized at least 86% of the targeted children aged 9 months to 10 years. No medical event was detected significantly more frequently during the 14 days after measles vaccination than before it. The experience in Viet Nam confirmed the safety of a measles vaccination campaign and shows that it is feasible to establish health information systems such as a large linked database which can provide reliable data in a developing country for a modest increase in use of resources. PMID:16193545
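A minimal sketch of the case-series idea described above: count adverse medical events in the 14 days after vaccination against an equal-length window before it. The dates and linked events below are hypothetical, not data from the Khanh Hoa system.

    # Minimal sketch: events in the 14-day post-vaccination risk window versus
    # an equal-length pre-vaccination window (hypothetical linked records).
    from datetime import date

    RISK_DAYS = 14  # post-vaccination risk window used in the abstract

    def window_counts(vaccination_day, event_days):
        """Return (events in the 14 days before, events in the 14 days after)."""
        after = sum(0 < (d - vaccination_day).days <= RISK_DAYS for d in event_days)
        before = sum(0 < (vaccination_day - d).days <= RISK_DAYS for d in event_days)
        return before, after

    # One child's linked records: vaccination date and clinic/hospital visits.
    vacc = date(2001, 10, 5)
    events = [date(2001, 9, 28), date(2001, 10, 12), date(2001, 12, 1)]
    print(window_counts(vacc, events))   # -> (1, 1): no excess after vaccination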
Why are we prolonging QT interval monitoring?
Barrett, Trina
2015-01-01
At present, monitoring of the QT interval (QTI) is not a standard practice in the medical intensive care unit setting, where many drugs that prolong the QTI are administered. This literature review looked at the current research for evidence-based standards to support QTI monitoring of patients with risk factors for QTI prolongation, which can result in life-threatening arrhythmias such as torsade de pointes. The objective of this article is to establish the existence of evidence-based standards for monitoring of the QTI and to raise awareness in the nursing profession of the need for such monitoring among patients who are at high risk for prolonged QTI. To determine whether published standards for QTI monitoring exist, a search was conducted of the bibliographic databases CINAHL, EBSCOhost, Medline, PubMed, Google Scholar, and the Cochrane Library for the years 2013 and 2014. Also, a survey was conducted to determine whether practice standards for QTI monitoring are being implemented at 4 major hospitals in the Memphis area, including a level 1 trauma center. The database search established the existence of published guidelines that support the need for QTI monitoring. Results of the hospital survey indicated that direct care nurses were not aware of the need to identify high-risk patients, drugs with the potential to prolong QTI that were being administered to their patients, or evidence-based standards for QTI monitoring. Review of the research literature underscored the need for QTI monitoring among high-risk patients, that is, those with genetic conditions that predispose them to QTI prolongation, those with existing cardiac conditions being treated with antiarrhythmic medications, or those who are prescribed any new medication classified as high risk on the basis of clinical research. This need is especially crucial in intensive care unit settings, where many antiarrhythmic medications are administered.
NASA Technical Reports Server (NTRS)
Stutte, G. W.; Mackowiak, C. L.; Markwell, G. A.; Wheeler, R. M.; Sager, J. C.
1993-01-01
This KSC database is being made available to the scientific research community to facilitate the development of crop development models, to test monitoring and control strategies, and to identify environmental limitations in crop production systems. The KSC validated dataset consists of 17 parameters necessary to maintain bioregenerative life support functions: water purification, CO2 removal, O2 production, and biomass production. The data are available on disk as either a DATABASE SUBSET (one week of 5-minute data) or DATABASE SUMMARY (daily averages of parameters). Online access to the VALIDATED DATABASE will be made available to institutions with specific programmatic requirements. Availability and access to the KSC validated database are subject to approval and limitations implicit in KSC computer security policies.
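The daily averages of the DATABASE SUMMARY product could, in principle, be derived from the 5-minute DATABASE SUBSET records along the following lines; the file and column names here are hypothetical, not the actual KSC distribution format.

    # Minimal sketch (hypothetical file/column names): reduce 5-minute
    # environmental records to daily averages of each monitored parameter.
    import pandas as pd

    five_min = pd.read_csv("ksc_subset_week.csv", parse_dates=["timestamp"])
    daily = (
        five_min
        .set_index("timestamp")
        .resample("1D")
        .mean(numeric_only=True)      # daily average per parameter
    )
    daily.to_csv("ksc_daily_summary.csv")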
Network control processor for a TDMA system
NASA Astrophysics Data System (ADS)
Suryadevara, Omkarmurthy; Debettencourt, Thomas J.; Shulman, R. B.
Two unique aspects of designing a network control processor (NCP) to monitor and control a demand-assigned, time-division multiple-access (TDMA) network are described. The first involves the implementation of redundancy by synchronizing the databases of two geographically remote NCPs. The two sets of databases are kept in synchronization by collecting data on both systems, transferring databases, sending incremental updates, and the parallel updating of databases. A periodic audit compares the checksums of the databases to ensure synchronization. The second aspect involves the use of a tracking algorithm to dynamically reallocate TDMA frame space. This algorithm detects and tracks current and long-term load changes in the network. When some portions of the network are overloaded while others have excess capacity, the algorithm automatically calculates and implements a new burst time plan.
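A minimal sketch of the periodic checksum audit described above, assuming a hypothetical table name and using an order-independent digest so that row order in the two NCP databases does not matter.

    # Minimal sketch: periodic audit comparing database checksums between the
    # primary and the remote NCP (table and file names are hypothetical).
    import hashlib
    import sqlite3

    def table_checksum(db_path, table="burst_time_plan"):
        """Order-independent digest of every row in a table."""
        conn = sqlite3.connect(db_path)
        digest = 0
        for row in conn.execute(f"SELECT * FROM {table}"):
            h = hashlib.sha256(repr(row).encode()).digest()
            digest ^= int.from_bytes(h[:8], "big")   # XOR makes row order irrelevant
        return digest

    # If the digests differ, resynchronize the remote NCP.
    if table_checksum("ncp_primary.db") != table_checksum("ncp_remote.db"):
        print("databases out of sync -> trigger database transfer and updates")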
Internationally coordinated glacier monitoring: strategy and datasets
NASA Astrophysics Data System (ADS)
Hoelzle, Martin; Armstrong, Richard; Fetterer, Florence; Gärtner-Roer, Isabelle; Haeberli, Wilfried; Kääb, Andreas; Kargel, Jeff; Nussbaumer, Samuel; Paul, Frank; Raup, Bruce; Zemp, Michael
2014-05-01
Internationally coordinated monitoring of long-term glacier changes provides key indicator data about global climate change and began in the year 1894 as an internationally coordinated effort to establish standardized observations. Today, world-wide monitoring of glaciers and ice caps is embedded within the Global Climate Observing System (GCOS) in support of the United Nations Framework Convention on Climate Change (UNFCCC) as an important Essential Climate Variable (ECV). The Global Terrestrial Network for Glaciers (GTN-G) was established in 1999 with the task of coordinating measurements and to ensure the continuous development and adaptation of the international strategies to the long-term needs of users in science and policy. The basic monitoring principles must be relevant, feasible, comprehensive and understandable to a wider scientific community as well as to policy makers and the general public. Data access has to be free and unrestricted, the quality of the standardized and calibrated data must be high and a combination of detailed process studies at selected field sites with global coverage by satellite remote sensing is envisaged. Recently a GTN-G Steering Committee was established to guide and advise the operational bodies responsible for the international glacier monitoring, which are the World Glacier Monitoring Service (WGMS), the US National Snow and Ice Data Center (NSIDC), and the Global Land Ice Measurements from Space (GLIMS) initiative. Several online databases containing a wealth of diverse data types having different levels of detail and global coverage provide fast access to continuously updated information on glacier fluctuation and inventory data. For world-wide inventories, data are now available through (a) the World Glacier Inventory containing tabular information of about 130,000 glaciers covering an area of around 240,000 km2, (b) the GLIMS-database containing digital outlines of around 118,000 glaciers with different time stamps and (c) the Randolph Glacier Inventory (RGI), a new and globally complete digital dataset of outlines from about 180,000 glaciers with some meta-information, which has been used for many applications relating to the IPCC AR5 report. Concerning glacier changes, a database (Fluctuations of Glaciers) exists containing information about mass balance, front variations including past reconstructed time series, geodetic changes and special events. Annual mass balance reporting contains information for about 125 glaciers with a subset of 37 glaciers with continuous observational series since 1980 or earlier. Front variation observations of around 1800 glaciers are available from most of the mountain ranges world-wide. This database was recently updated with 26 glaciers having an unprecedented dataset of length changes from reconstructions of well-dated historical evidence going back as far as the 16th century. Geodetic observations of about 430 glaciers are available. The database is completed by a dataset containing information on special events including glacier surges, glacier lake outbursts, ice avalanches, eruptions of ice-clad volcanoes, etc. related to about 200 glaciers. A special database of glacier photographs contains 13,000 pictures from around 500 glaciers, some of them dating back to the 19th century. A key challenge is to combine and extend the traditional observations with fast evolving datasets from new technologies.
The Long Valley Caldera GIS database
Battaglia, Maurizio; Williams, M.J.; Venezky, D.Y.; Hill, D.P.; Langbein, J.O.; Farrar, C.D.; Howle, J.F.; Sneed, M.; Segall, P.
2003-01-01
This database provides an overview of the studies being conducted by the Long Valley Observatory in eastern California from 1975 to 2001. The database includes geologic, monitoring, and topographic datasets related to Long Valley caldera. The CD-ROM contains a scan of the original geologic map of the Long Valley region by R. Bailey. Real-time data of the current activity of the caldera (including earthquakes, ground deformation and the release of volcanic gas), information about volcanic hazards and the USGS response plan are available online at the Long Valley observatory web page (http://lvo.wr.usgs.gov). If you have any comments or questions about this database, please contact the Scientist in Charge of the Long Valley observatory.
Parra, Lorena; García, Laura
2018-01-01
The monitoring of farming processes can optimize the use of resources and improve their sustainability and profitability. In fish farms, the water quality, tank environment, and fish behavior must be monitored. Wireless sensor networks (WSNs) are a promising option to perform this monitoring. Nevertheless, their high cost is slowing the expansion of their use. In this paper, we propose a set of sensors for monitoring the water quality and fish behavior in aquaculture tanks during the feeding process. The WSN is based on physical sensors, composed of simple electronic components. The proposed system can monitor water quality parameters, tank status, the feed falling and the fish swimming depth and velocity. In addition, the system includes a smart algorithm to reduce the energy waste when sending the information from the node to the database. The system is composed of three nodes in each tank that send the information through the local area network to a database on the Internet and a smart algorithm that detects abnormal values and sends alarms when they happen. All the sensors are designed, calibrated, and deployed to ensure their suitability. The greatest efforts have been accomplished with the fish presence sensor. The total cost of the sensors and nodes for the proposed system is less than 90 €. PMID:29494560
Parra, Lorena; Sendra, Sandra; García, Laura; Lloret, Jaime
2018-03-01
The monitoring of farming processes can optimize the use of resources and improve their sustainability and profitability. In fish farms, the water quality, tank environment, and fish behavior must be monitored. Wireless sensor networks (WSNs) are a promising option to perform this monitoring. Nevertheless, their high cost is slowing the expansion of their use. In this paper, we propose a set of sensors for monitoring the water quality and fish behavior in aquaculture tanks during the feeding process. The WSN is based on physical sensors, composed of simple electronic components. The proposed system can monitor water quality parameters, tank status, the feed falling and the fish swimming depth and velocity. In addition, the system includes a smart algorithm to reduce the energy waste when sending the information from the node to the database. The system is composed of three nodes in each tank that send the information through the local area network to a database on the Internet and a smart algorithm that detects abnormal values and sends alarms when they happen. All the sensors are designed, calibrated, and deployed to ensure their suitability. The greatest efforts have been accomplished with the fish presence sensor. The total cost of the sensors and nodes for the proposed system is less than 90 €.
ERIC Educational Resources Information Center
Hughes, Norm
The Distance Education Center (DEC) of the University of Southern Queensland (Australia) has developed a unique materials database system which is used to monitor pre-production, design and development, production and post-production planning, scheduling, and distribution of all types of materials including courses offered only on the Internet. In…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-24
... recorded in EPA's Air Quality System (AQS) database. To account for missing data, the procedures found in... three-year period and then adjusts for missing data. In short, if the three-year average expected... ambient air quality monitoring data for the 2001-2003 monitoring period showing that the area had an...
A remote condition monitoring system for wind-turbine based DG systems
NASA Astrophysics Data System (ADS)
Ma, X.; Wang, G.; Cross, P.; Zhang, X.
2012-05-01
In this paper, a remote condition monitoring system is proposed, which fundamentally consists of real-time monitoring modules on the plant side, a remote support centre and the communications between them. The paper addresses some of the key issues related to the monitoring system, including i) the implementation and configuration of a VPN connection, ii) an effective database system able to handle huge amounts of monitoring data, and iii) efficient data mining techniques to convert raw data into useful information for plant assessment. The preliminary results have demonstrated that the proposed system is practically feasible and can be deployed to monitor the emerging new energy generation systems.
Database for the degradation risk assessment of groundwater resources (Southern Italy)
NASA Astrophysics Data System (ADS)
Polemio, M.; Dragone, V.; Mitolo, D.
2003-04-01
The risk characterisation of quality degradation and availability lowering of groundwater resources has been pursued for a wide coastal plain (Basilicata region, Southern Italy), an area covering 40 km along the Ionian Sea and 10 km inland. The quality degradation is due to two phenomena: pollution caused by the discharge of waste water coming from urban areas, and salt pollution, related to seawater intrusion but not only. The availability lowering is due to overexploitation but also to drought effects. To this purpose the historical data of 1,130 wells have been collected. The wells, homogeneously distributed in the area, were the source of geological, stratigraphic, hydrogeological and geochemical data. In order to manage space-related information via a GIS, a database system has been devised to encompass all the surveyed wells and the body of information available per well. Geo-databases were designed to comprise the four types of data collected: a database including geometrical, geological and hydrogeological data on wells (WDB), a database devoted to chemical and physical data on groundwater (CDB), a database including the geotechnical parameters (GDB), and a database concerning piezometric and hydrological (rainfall, air temperature, river discharge) data (HDB). The record pertaining to each well is identified in these databases by the progressive number of the well itself. Each database is designed as follows: a) the WDB contains 1,158 records with 28 and 31 fields, mainly describing the geometry of the well and its stratigraphy; b) the CDB encompasses data for the 157 wells on which the chemical and physical analyses of groundwater have been carried out; more than one record has been associated with these 157 wells, due to periodic monitoring and analysis; c) the GDB covers 61 wells for which the geotechnical parameters were obtained from soil samples taken at various depths; d) the HDB is designed to permit the analysis of long time series (from 1918) of piezometric data, monitored at more than 60 wells, together with temperature, rainfall and river discharge data. Based on these geo-databases, the geostatistical processing of data has made it possible to characterise the degradation risk of the groundwater resources of a wide coastal aquifer.
Online Monitoring of Induction Motors
DOE Office of Scientific and Technical Information (OSTI.GOV)
McJunkin, Timothy R.; Agarwal, Vivek; Lybeck, Nancy Jean
2016-01-01
The online monitoring of active components project, under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program, researched diagnostic and prognostic models for alternating current induction motors (IM). Idaho National Laboratory (INL) worked with the Electric Power Research Institute (EPRI) to augment and revise the fault signatures previously implemented in the Asset Fault Signature Database of EPRI's Fleet Wide Prognostic and Health Management (FW PHM) Suite software. Induction Motor diagnostic models were researched using the experimental data collected by Idaho State University. Prognostic models were explored in the set of literature and through a limited experiment with 40HP to seek the Remaining Useful Life Database of the FW PHM Suite.
Scale out databases for CERN use cases
NASA Astrophysics Data System (ADS)
Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper
2015-12-01
Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.
Heetderks-Cox, M J; Alford, B B; Bednar, C M; Heiss, C J; Tauai, L A; Edgren, K K
2001-09-01
This study observed the effect of using a computerized vs manual method of self-monitoring among Air Force personnel receiving nutrition counseling for weight loss. Subjects who enrolled during the first 2 weeks of the 4-week recruitment period completed food records for 6 weeks using a CD-ROM nutrient database (intervention group), whereas those who enrolled during the last 2 weeks used a food record booklet (comparison group). Of the 42 subjects (n = 23 intervention group and n = 19 comparison group), only 13 intervention and 11 comparison group subjects (57% of study enrollees) submitted at least 1 food record during the study and were included in the analysis, which included review of pre- and poststudy questionnaires, food records, and focus group data. There were no significant differences between the groups in the number of days per week documented or the average number of items recorded daily. All 9 intervention group subjects who completed a poststudy questionnaire, as compared with 2 comparison group subjects, searched for lower-energy and lower-fat items and reported changing their dietary intake as a result. All intervention group subjects who participated in a focus group (n=6) had favorable comments about using the CD-ROM for monitoring and changing eating habits, indicating that it is a beneficial self-monitoring tool. Participants enjoyed the immediate dietary feedback, and computerized food records may be easier for nutrition counselors to interpret. A number of computerized nutrient databases are available to assist patients and consumers in managing nutritional concerns.
Zilaout, Hicham; Vlaanderen, Jelle; Houba, Remko; Kromhout, Hans
2017-07-01
In 2000, a prospective Dust Monitoring Program (DMP) was started in which measurements of workers' exposure to respirable dust and quartz are collected in member companies of the European Industrial Minerals Association (IMA-Europe). After 15 years, the resulting IMA-DMP database allows a detailed overview of exposure levels of respirable dust and quartz over time within this industrial sector. Our aim is to describe the IMA-DMP and the current state of the corresponding database which, due to continuation of the IMA-DMP, is still growing. The future use of the database will also be highlighted, including its utility for the industrial minerals producing sector. Exposure data are being obtained following a common protocol including a standardized sampling strategy, standardized sampling and analytical methods and a data management system. Following strict quality control procedures, exposure data are consequently added to a central database. The data comprise personal exposure measurements including auxiliary information on work and other conditions during sampling. Currently, the IMA-DMP database consists of almost 28,000 personal measurements which have been performed from 2000 until 2015, representing 29 half-yearly sampling campaigns. The exposure data have been collected from 160 different worksites owned by 35 industrial mineral companies and come from 23 European countries and approximately 5000 workers. The IMA-DMP database provides the European minerals sector with reliable data regarding worker personal exposures to respirable dust and quartz. The database can be used as a powerful tool to address outstanding scientific issues on long-term exposure trends and exposure variability, and importantly, as a surveillance tool to evaluate exposure control measures. The database will be valuable for future epidemiological studies on respiratory health effects and will allow for estimation of quantitative exposure response relationships. Copyright © 2017 The Authors. Published by Elsevier GmbH. All rights reserved.
Sustainable Seas Student Intertidal Monitoring Project at Duxbury Reef in Bolinas, CA
NASA Astrophysics Data System (ADS)
Soave, K.; Dean, A.; Weigel, S.; Redman, K.; Darakananda, D.; Fuller, C.; Gusman, V.; Hirschfeld, Z.; Kornfeld, H.; Picchi, K.
2006-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of the project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species and requirements for maintaining a healthy, diverse intertidal ecosystem; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal ecology, interpretation and monitoring techniques, and history of the sanctuary. Students conduct two baseline-monitoring surveys three times per year (fall, winter, and late spring) to identify and count key invertebrate and algae species. During six seasons of monitoring (2000-2006), the density of black turban snails, Tegula funebralis, showed seasonal abundance variation with respect to tidal zonation. Most algae species had consistently lower densities in the more accessible northern (A) transects than the southern (B) transects. To test the reliability of the student counts, replicate counts of all species are always performed. Replicate counts for invertebrate and algae species within the same quadrat along the permanent transects revealed a very small amount of variability, giving us confidence that our monitoring program is providing reliable data.
Estuarine monitoring programs in the Albemarle Sound study area, North Carolina
Moorman, Michelle; Kolb, Katharine R.; Supak, Stacy
2014-01-01
The purpose of this report is to identify major natural resource management issues for the region, provide information on current monitoring activities occurring within the Albemarle Sound study area, determine how the current monitoring network fits into the design of the NMN, and determine what additional monitoring data are needed to address these issues. In order to address these questions, a shapefile and data table were created to document monitoring and research programs in the Albemarle Sound study area with an emphasis on current monitoring programs within the region. This database was queried to determine monitoring gaps that existed in the Albemarle Sound by comparing current monitoring programs with the design indicated by the NMN. The report uses this information to provide recommendations on how monitoring could be improved in the Albemarle Sound study area.
[Design and application of user managing system of cardiac remote monitoring network].
Chen, Shouqiang; Zhang, Jianmin; Yuan, Feng; Gao, Haiqing
2007-12-01
According to the inpatient records and the data management demands of the cardiac remote monitoring network and its computers, this software was designed with the relational database Microsoft Access. Its interface, operation buttons and menus were designed with the assistance of the VBA language. Its design emphasized overall integration, user friendliness, practicality and compatibility. Its functions consist of registration, querying, statistics and printing, et al. It can be used to manage users effectively and helps the cardiac remote monitoring network further play its important role in preventing cardiovascular emergencies.
Let your fingers do the walking: The projects most invaluable tool
NASA Technical Reports Server (NTRS)
Zirk, Deborah A.
1993-01-01
The barrage of information pertaining to the software being developed for a project can be overwhelming. Current status information, as well as the statistics and history of software releases, should be 'at the fingertips' of project management and key technical personnel. This paper discusses the development, configuration, capabilities, and operation of a relational database, the System Engineering Database (SEDB) which was designed to assist management in monitoring of the tasks performed by the Network Control Center (NCC) Project. This database has proven to be an invaluable project tool and is utilized daily to support all project personnel.
Utilizing Non-Contact Stress Measurement System (NSMS) as a Health Monitor
NASA Technical Reports Server (NTRS)
Hayes, Terry; Hayes, Bryan; Bynum, Ken
2011-01-01
Continuously monitor all 156 blades throughout the entire operating envelope without adversely affecting tunnel conditions or compromising compressor shell integrity; calculate dynamic response and identify the frequency/mode to determine individual blade deflection amplitudes, natural frequencies, phase, and damping (Q); log static deflection to build a database of deflection values at certain compressor conditions to use as the basis for a real-time online Blade Stack monitor; monitor for stall, surge, flutter, and blade damage; operate with limited user input, low maintenance cost, safe illumination of probes, and easy probe replacement, and require little or no access to the compressor.
Database Development for Ocean Impacts: Imaging, Outreach, and Rapid Response
2012-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. ... hoses (Applied Ocean Physics & Engineering department, WHOI) to evaluate wear and locate in mooring optical cables used in the Right Whale monitoring ...
The Battle Command Sustainment Support System: Initial Analysis Report
2016-09-01
... diagnostic monitoring, asynchronous commits, and others. The other components of the NEDP include a main forwarding gateway/web server and one or more ... NATIONAL ENTERPRISE DATA PORTAL ANALYSIS: The NEDP is comprised of an Oracle Database 10g, referred to as the National Data Server, and several other ... data forwarding gateways (DFG). Together with the Oracle Database 10g, these components provide a heterogeneous data source that aligns various data ...
Monitoring bird migration in the Caribbean basin: multi-national cooperation can close the loop
Paul B. Hamel; Cecilia M. Riley; W. C. Hunter; Mark S. Woodrey
2005-01-01
The Gulf Coast Bird Observatory (GCBO) and the Southeastern Working Group of Partners in Flight have developed a protocol to monitor landbirds with volunteer observers performing avian censuses in the field. Field observations are compiled within a powerful internet database, and recording and summary capability is maintained by the GCBO. More than 100 observers have...
Herasevich, Vitaly; Pickering, Brian W; Dong, Yue; Peters, Steve G; Gajic, Ognjen
2010-03-01
To develop and validate an informatics infrastructure for syndrome surveillance, decision support, reporting, and modeling of critical illness. Using open-schema data feeds imported from electronic medical records (EMRs), we developed a near-real-time relational database (Multidisciplinary Epidemiology and Translational Research in Intensive Care Data Mart). Imported data domains included physiologic monitoring, medication orders, laboratory and radiologic investigations, and physician and nursing notes. Open database connectivity supported the use of Boolean combinations of data that allowed authorized users to develop syndrome surveillance, decision support, and reporting (data "sniffers") routines. Random samples of database entries in each category were validated against corresponding independent manual reviews. The Multidisciplinary Epidemiology and Translational Research in Intensive Care Data Mart accommodates, on average, 15,000 admissions to the intensive care unit (ICU) per year and 200,000 vital records per day. Agreement between database entries and manual EMR audits was high for sex, mortality, and use of mechanical ventilation (kappa, 1.0 for all) and for age and laboratory and monitored data (Bland-Altman mean difference +/- SD, 1(0) for all). Agreement was lower for interpreted or calculated variables, such as specific syndrome diagnoses (kappa, 0.5 for acute lung injury), duration of ICU stay (mean difference +/- SD, 0.43+/-0.2), or duration of mechanical ventilation (mean difference +/- SD, 0.2+/-0.9). Extraction of essential ICU data from a hospital EMR into an open, integrative database facilitates process control, reporting, syndrome surveillance, decision support, and outcome research in the ICU.
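A minimal sketch of what one Boolean "sniffer" rule over such a data mart could look like; the schema, column names and the hypoxemia threshold below are assumptions for illustration, not the authors' actual implementation.

    # Minimal sketch of a Boolean "sniffer" rule (hypothetical schema): flag ICU
    # stays where a ventilated patient has a PaO2/FiO2 ratio below 300
    # (FiO2 assumed to be stored as a fraction, e.g. 0.40).
    import sqlite3

    SNIFFER_SQL = """
    SELECT v.patient_id, v.charted_at, v.pao2 / v.fio2 AS pf_ratio
    FROM vitals AS v
    JOIN ventilation AS m ON m.patient_id = v.patient_id
    WHERE v.fio2 > 0
      AND v.pao2 / v.fio2 < 300
      AND v.charted_at BETWEEN m.start_time AND m.stop_time
    """

    conn = sqlite3.connect("icu_datamart.db")
    for patient_id, charted_at, pf in conn.execute(SNIFFER_SQL):
        print(f"alert: patient {patient_id} P/F={pf:.0f} at {charted_at}")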
A web-based relational database for monitoring and analyzing mosquito population dynamics.
Sucaet, Yves; Van Hemert, John; Tucker, Brad; Bartholomay, Lyric
2008-07-01
Mosquito population dynamics have been monitored on an annual basis in the state of Iowa since 1969. The primary goal of this project was to integrate light trap data from these efforts into a centralized back-end database and interactive website that is available through the internet at http://iowa-mosquito.ent.iastate.edu. For comparative purposes, all data were categorized according to the week of the year and normalized according to the number of traps running. Users can readily view current, weekly mosquito abundance compared with data from previous years. Additional interactive capabilities facilitate analyses of the data based on mosquito species, distribution, or a time frame of interest. All data can be viewed in graphical and tabular format and can be downloaded to a comma separated value (CSV) file for import into a spreadsheet or more specialized statistical software package. Having this long-term dataset in a centralized database/website is useful for informing mosquito and mosquito-borne disease control and for exploring the ecology of the species represented therein. In addition to mosquito population dynamics, this database is available as a standardized platform that could be modified and applied to a multitude of projects that involve repeated collection of observational data. The development and implementation of this tool provides capacity for the user to mine data from standard spreadsheets into a relational database and then view and query the data in an interactive website.
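A minimal sketch (hypothetical schema) of the normalization described above: abundance grouped by week of the year and divided by the number of traps running.

    # Minimal sketch: weekly abundance per running trap for one species and year.
    import sqlite3

    SQL = """
    SELECT strftime('%W', collected_on) AS week,
           SUM(mosquito_count) * 1.0 / COUNT(DISTINCT trap_id) AS per_trap
    FROM light_trap_collections
    WHERE species = ? AND strftime('%Y', collected_on) = ?
    GROUP BY week
    ORDER BY week
    """

    conn = sqlite3.connect("mosquito.db")
    for week, per_trap in conn.execute(SQL, ("Culex pipiens", "2007")):
        print(week, round(per_trap, 2))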
Ubiquitous-health (U-Health) monitoring systems for elders and caregivers
NASA Astrophysics Data System (ADS)
Moon, Gyu; Lim, Kyung-won; Yoo, Young-min; An, Hye-min; Lee, Ki Seop; Szu, Harold
2011-06-01
This paper presents two affordable low-tack systems for household biomedical wellness monitoring. The first system, JIKIMI (meaning caregiver in Korean), is a remote monitoring system that analyzes the behavior patterns of elders who live alone. JIKIMI is composed of an in-house sensing system, a set of wireless sensor nodes containing a pyroelectric infrared sensor to detect the motion of elders, an emergency button, and a magnetic sensor that detects the opening and closing of doors. The system is also equipped with a server system, which comprises a database and web server. The server provides the mechanism for web-based monitoring by caregivers. The second system, Reader of Bottle Information (ROBI), is an assistant system that advises elders of the contents of bottles. ROBI is composed of bottles with attached RFID tags and an advice system, which consists of a wireless RFID reader, a gateway and a remote database server. The RFID tags attached to the caps of the bottles are used in conjunction with the advice system. These systems have been in use for three years and have proven useful for caregivers in providing more efficient and effective care services.
NASA Technical Reports Server (NTRS)
Young, Steve; UijtdeHaag, Maarten; Sayre, Jonathon
2003-01-01
Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data representing terrain, obstacles, and cultural features. As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. Further, updates to the databases may not be provided as changes occur. These issues limit the certification level and constrain the operational context of SVS for civil aviation. Previous work demonstrated the feasibility of using a realtime monitor to bound the integrity of Digital Elevation Models (DEMs) by using radar altimeter measurements during flight. This paper describes an extension of this concept to include X-band Weather Radar (WxR) measurements. This enables the monitor to detect additional classes of DEM errors and to reduce the exposure time associated with integrity threats. Feature extraction techniques are used along with a statistical assessment of similarity measures between the sensed and stored features that are detected. Recent flight-testing in the area around the Juneau, Alaska Airport (JNU) has resulted in a comprehensive set of sensor data that is being used to assess the feasibility of the proposed monitor technology. Initial results of this assessment are presented.
Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization
NASA Astrophysics Data System (ADS)
Daglis, I. A.; Bourdarie, S.; Khotyaintsev, Y.; Santolik, O.; Horne, R.; Mann, I.; Turner, D.; Anastasiadis, A.; Angelopoulos, V.; Balasis, G.; Chatzichristou, E.; Cully, C.; Georgiou, M.; Glauert, S.; Grison, B.; Kolmasova, I.; Lazaro, D.; Macusova, E.; Maget, V.; Papadimitriou, C.; Ropokis, G.; Sandberg, I.; Usanova, M.
2012-09-01
We present the concept, objectives and expected impact of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project, which is being implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employs multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energization and loss. Particular attention is paid to the role of ULF/VLF waves. A database containing properties of the waves is being created and will be made available to the scientific community. Based on the wave database, a statistical model of the wave activity dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region will be developed. Furthermore, we will incorporate multi-spacecraft particle measurements into data assimilation tools, aiming at a new understanding of the causal relationships between ULF/VLF waves and radiation belt dynamics. Data assimilation techniques have been proven to be a valuable tool in the field of radiation belts, able to guide 'the best' estimate of the state of a complex system.
Improving Remote Health Monitoring: A Low-Complexity ECG Compression Approach
Al-Ali, Abdulla; Mohamed, Amr; Ward, Rabab
2018-01-01
Recent advances in mobile technology have created a shift towards using battery-driven devices in remote monitoring settings and smart homes. Clinicians are carrying out diagnostic and screening procedures based on the electrocardiogram (ECG) signals collected remotely for outpatients who need continuous monitoring. High-speed transmission and analysis of large recorded ECG signals are essential, especially with the increased use of battery-powered devices. Exploring low-power alternative compression methodologies that have high efficiency and that enable ECG signal collection, transmission, and analysis in a smart home or remote location is required. Compression algorithms based on adaptive linear predictors and decimation by a factor B/K are evaluated based on compression ratio (CR), percentage root-mean-square difference (PRD), and heartbeat detection accuracy of the reconstructed ECG signal. With two databases (153 subjects), the new algorithm demonstrates the highest compression performance (CR=6 and PRD=1.88) and overall detection accuracy (99.90% sensitivity, 99.56% positive predictivity) over both databases. The proposed algorithm presents an advantage for the real-time transmission of ECG signals using a faster and more efficient method, which meets the growing demand for more efficient remote health monitoring. PMID:29337892
Improving Remote Health Monitoring: A Low-Complexity ECG Compression Approach.
Elgendi, Mohamed; Al-Ali, Abdulla; Mohamed, Amr; Ward, Rabab
2018-01-16
Recent advances in mobile technology have created a shift towards using battery-driven devices in remote monitoring settings and smart homes. Clinicians are carrying out diagnostic and screening procedures based on the electrocardiogram (ECG) signals collected remotely for outpatients who need continuous monitoring. High-speed transmission and analysis of large recorded ECG signals are essential, especially with the increased use of battery-powered devices. Exploring low-power alternative compression methodologies that have high efficiency and that enable ECG signal collection, transmission, and analysis in a smart home or remote location is required. Compression algorithms based on adaptive linear predictors and decimation by a factor B / K are evaluated based on compression ratio (CR), percentage root-mean-square difference (PRD), and heartbeat detection accuracy of the reconstructed ECG signal. With two databases (153 subjects), the new algorithm demonstrates the highest compression performance ( CR = 6 and PRD = 1.88 ) and overall detection accuracy (99.90% sensitivity, 99.56% positive predictivity) over both databases. The proposed algorithm presents an advantage for the real-time transmission of ECG signals using a faster and more efficient method, which meets the growing demand for more efficient remote health monitoring.
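The two figures of merit quoted above can be illustrated with a short NumPy sketch: compression ratio is original size over compressed size, and PRD is computed here with one common (non-mean-subtracted) definition. The signals below are stand-ins, not ECG data from the cited databases.

    # Minimal sketch: compression ratio (CR) and percentage root-mean-square
    # difference (PRD) between an original and a reconstructed signal.
    import numpy as np

    def compression_ratio(original_bits, compressed_bits):
        return original_bits / compressed_bits

    def prd(x, x_rec):
        """Percentage root-mean-square difference of a reconstructed signal."""
        x, x_rec = np.asarray(x, float), np.asarray(x_rec, float)
        return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

    x = np.sin(np.linspace(0, 8 * np.pi, 2000))         # stand-in for an ECG trace
    x_rec = x + np.random.normal(0, 0.01, x.size)       # stand-in reconstruction
    print(compression_ratio(2000 * 11, 2000 * 11 / 6))  # -> 6.0
    print(round(prd(x, x_rec), 2))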
A new system for measuring three-dimensional back shape in scoliosis
Pynsent, Paul; Fairbank, Jeremy; Disney, Simon
2008-01-01
The aim of this work was to develop a low-cost automated system to measure the three-dimensional shape of the back in patients with scoliosis. The resulting system uses structured light to illuminate a patient’s back from an angle while a digital photograph is taken. The height of the surface is calculated using Fourier transform profilometry with an accuracy of ±1 mm. The surface is related to body axes using bony landmarks on the back that have been palpated and marked with small coloured stickers prior to photographing. Clinical parameters are calculated automatically and presented to the user on a monitor and as a printed report. All data are stored in a database. The database can be interrogated and successive measurements plotted for monitoring the deformity changes. The system developed uses inexpensive hardware and open source software. Accurate surface topography can help the clinician to measure spinal deformity at baseline and monitor changes over time. It can help the patients and their families to assess deformity. Above all it reduces the dependence on serial radiography and reduces radiation exposure when monitoring spinal deformity. PMID:18247064
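A minimal sketch of the core Fourier transform profilometry step on one image row, under the usual assumptions of that method (isolate the fringe carrier band, recover the wrapped phase, unwrap it); the carrier bin, bandwidth and scale factor below are hypothetical, not the parameters of the system described.

    # Minimal sketch of the Fourier transform profilometry step (one image row):
    # isolate the fringe carrier band, recover the wrapped phase, and unwrap it.
    import numpy as np

    def row_phase(row, carrier_bin, half_width):
        """Unwrapped phase of the fringe carrier in one image row."""
        spectrum = np.fft.fft(row)
        band = np.zeros_like(spectrum)
        band[carrier_bin - half_width:carrier_bin + half_width] = \
            spectrum[carrier_bin - half_width:carrier_bin + half_width]
        analytic = np.fft.ifft(band)     # complex signal carrying the fringe phase
        return np.unwrap(np.angle(analytic))

    # Height is proportional to the phase difference between the measured row and
    # the same row of a flat reference surface; the scale is set by the geometry.
    # height = k * (row_phase(measured_row, 40, 10) - row_phase(reference_row, 40, 10))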
Monitoring of small laboratory animal experiments by a designated web-based database.
Frenzel, T; Grohmann, C; Schumacher, U; Krüll, A
2015-10-01
Multiple-parametric small animal experiments require, by their very nature, a sufficient number of animals which may need to be large to obtain statistically significant results.(1) For this reason database-related systems are required to collect the experimental data as well as to support the later (re-) analysis of the information gained during the experiments. In particular, the monitoring of animal welfare is simplified by the inclusion of warning signals (for instance, loss in body weight >20%). Digital patient charts have been developed for human patients but are usually not able to fulfill the specific needs of animal experimentation. To address this problem a unique web-based monitoring system using standard MySQL, PHP, and nginx has been created. PHP was used to create the HTML-based user interface and outputs in a variety of proprietary file formats, namely portable document format (PDF) or spreadsheet files. This article demonstrates its fundamental features and the easy and secure access it offers to the data from any place using a web browser. This information will help other researchers create their own individual databases in a similar way. The use of QR-codes plays an important role for stress-free use of the database. We demonstrate a way to easily identify all animals and samples and data collected during the experiments. Specific ways to record animal irradiations and chemotherapy applications are shown. This new analysis tool allows the effective and detailed analysis of huge amounts of data collected through small animal experiments. It supports proper statistical evaluation of the data and provides excellent retrievable data storage. © The Author(s) 2015.
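A minimal sketch (hypothetical schema) of the body-weight warning signal mentioned above, flagging animals whose current weight is more than 20% below baseline.

    # Minimal sketch: welfare warning for >20% loss in body weight.
    import sqlite3

    SQL = """
    SELECT a.animal_id, a.baseline_weight_g, w.weight_g, w.measured_on
    FROM animals AS a
    JOIN weights AS w ON w.animal_id = a.animal_id
    WHERE w.weight_g < 0.8 * a.baseline_weight_g
    """

    conn = sqlite3.connect("experiments.db")
    for animal_id, baseline, current, day in conn.execute(SQL):
        loss = 100 * (1 - current / baseline)
        print(f"WARNING {animal_id}: {loss:.1f}% below baseline on {day}")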
The National Nonindigenous Aquatic Species Database
Neilson, Matthew E.; Fuller, Pamela L.
2012-01-01
The U.S. Geological Survey (USGS) Nonindigenous Aquatic Species (NAS) Program maintains a database that monitors, records, and analyzes sightings of nonindigenous aquatic plant and animal species throughout the United States. The program is based at the USGS Wetland and Aquatic Research Center in Gainesville, Florida.The initiative to maintain scientific information on nationwide occurrences of nonindigenous aquatic species began with the Aquatic Nuisance Species Task Force, created by Congress in 1990 to provide timely information to natural resource managers. Since then, the NAS database has been a clearinghouse of information for confirmed sightings of nonindigenous, also known as nonnative, aquatic species throughout the Nation. The database is used to produce email alerts, maps, summary graphs, publications, and other information products to support natural resource managers.
Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M
2014-04-01
Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining high quality patient care promoted by professional societies in radiology and accreditation organizations such as the American College of Radiology (ACR) and Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
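A minimal sketch of the per-operator report logic described above, with a hypothetical procedure-log schema; the real system auto-populates its table from the RIS and structured report templates rather than from a hand-built database.

    # Minimal sketch (hypothetical schema): per-operator complication rate and
    # diagnostic yield from the procedure log.
    import sqlite3

    SQL = """
    SELECT operator,
           COUNT(*) AS procedures,
           AVG(CASE WHEN complication THEN 1.0 ELSE 0.0 END) AS complication_rate,
           AVG(CASE WHEN diagnostic   THEN 1.0 ELSE 0.0 END) AS diagnostic_yield
    FROM procedure_log
    WHERE modality IN ('US', 'CT')
    GROUP BY operator
    """

    conn = sqlite3.connect("procedure_log.db")
    for operator, n, comp_rate, dx_yield in conn.execute(SQL):
        print(operator, n, f"{comp_rate:.1%}", f"{dx_yield:.1%}")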
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie
2015-04-01
IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data since January 2015; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats; graphical tools (maps, scatter plots, etc.); standardized metadata (ISO 19115); and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will make it possible to combine model outputs with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented thanks to many web services, will improve the functionality of the web interfaces of each data centre.
Scofield, Patricia A.; Smith, Linda Lenell; Johnson, David N.
2017-07-01
The U.S. Environmental Protection Agency promulgated national emission standards for emissions of radionuclides other than radon from US Department of Energy facilities in Chapter 40 of the Code of Federal Regulations (CFR) 61, Subpart H. This regulatory standard limits the annual effective dose that any member of the public can receive from Department of Energy facilities to 0.1 mSv. As defined in the preamble of the final rule, all of the facilities on the Oak Ridge Reservation, i.e., the Y–12 National Security Complex, Oak Ridge National Laboratory, East Tennessee Technology Park, and any other U.S. Department of Energy operations on Oak Ridge Reservation, combined, must meet the annual dose limit of 0.1 mSv. At Oak Ridge National Laboratory, there are monitored sources and numerous unmonitored sources. To maintain radiological source and inventory information for these unmonitored sources, e.g., laboratory hoods, equipment exhausts, and room exhausts not currently venting to monitored stacks on the Oak Ridge National Laboratory campus, the Environmental Protection Rad NESHAPs Inventory Web Database was developed. This database is updated annually and is used to compile emissions data for the annual Radionuclide National Emission Standards for Hazardous Air Pollutants (Rad NESHAPs) report required by 40 CFR 61.94. It also provides supporting documentation for facility compliance audits. In addition, a Rad NESHAPs source and dose database was developed to import the source and dose summary data from Clean Air Act Assessment Package—1988 computer model files. As a result, this database provides Oak Ridge Reservation and facility-specific source inventory; doses associated with each source and facility; and total doses for the Oak Ridge Reservation dose.
[Relational database for urinary stone ambulatory consultation. Assessment of initial outcomes].
Sáenz Medina, J; Páez Borda, A; Crespo Martinez, L; Gómez Dos Santos, V; Barrado, C; Durán Poveda, M
2010-05-01
To create a relational database for monitoring lithiasic patients. We describe the architectural details and the initial results of the statistical analysis. Microsoft Access 2002 was used as the template. Four different tables were constructed to gather demographic data (table 1), clinical and laboratory findings (table 2), stone features (table 3) and therapeutic approach (table 4). For a reliability analysis of the database, the number of correctly stored data items was gathered. To evaluate the performance of the database, a prospective analysis was conducted, from May 2004 to August 2009, on 171 stone-free patients after treatment (ESWL, surgery or medical) from a total of 511 patients stored in the database. Lithiasic status (stone free or stone relapse) was used as the primary end point, while demographic factors (age, gender), lithiasic history, upper urinary tract alterations and characteristics of the stone (side, location, composition and size) were considered as predictive factors. A univariate analysis was conducted initially by chi-square test and supplemented by Kaplan-Meier estimates of time to stone recurrence. A multiple Cox proportional hazards regression model was generated to jointly assess the prognostic value of the demographic factors and the predictive value of stone characteristics. For the reliability analysis, 22,084 data items were available, corresponding to 702 consultations on 511 patients. Analysis of the data showed a recurrence rate of 85.4% (146/171, median time to recurrence 608 days, range 70-1758). In the univariate and multivariate analyses, none of the factors under consideration had a significant effect on the recurrence rate (p=ns). The relational database is useful for monitoring patients with urolithiasis. It allows easy control and update, as well as data storage for later use. The analysis conducted for its evaluation showed no influence of demographic factors or stone features on stone recurrence.
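A minimal sketch of the survival analysis described above using the lifelines package; the input file and column names are hypothetical stand-ins for data exported from the Access database.

    # Minimal sketch: Kaplan-Meier estimate of time to stone recurrence and a
    # Cox model for demographic and stone factors (hypothetical column names).
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    # Follow-up table: days_to_event, recurred (0/1), plus candidate predictors.
    df = pd.read_csv("stone_followup.csv")

    km = KaplanMeierFitter()
    km.fit(df["days_to_event"], event_observed=df["recurred"])
    print(km.median_survival_time_)      # e.g. median time to stone recurrence

    cox = CoxPHFitter()
    cox.fit(df[["days_to_event", "recurred", "age", "stone_size_mm"]],
            duration_col="days_to_event", event_col="recurred")
    cox.print_summary()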
WOVOdat, A Worldwide Volcano Unrest Database, to Improve Eruption Forecasts
NASA Astrophysics Data System (ADS)
Widiwijayanti, C.; Costa, F.; Win, N. T. Z.; Tan, K.; Newhall, C. G.; Ratdomopurbo, A.
2015-12-01
WOVOdat is the World Organization of Volcano Observatories' Database of Volcanic Unrest, an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely accessible on the web for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology what an epidemiological database is to medicine. Despite the large spectrum of monitoring techniques, interpreting monitoring data throughout the evolution of unrest and making timely forecasts remain the most challenging tasks for volcanologists. The field of eruption forecasting is becoming more quantitative, based on an understanding of pre-eruptive magmatic processes and the dynamic interaction between the variables at play in a volcanic system. Such forecasts must also acknowledge and express uncertainties; therefore, most current research in this field focuses on the application of event-tree analysis to reflect multiple possible scenarios and the probability of each scenario. Such forecasts are critically dependent on comprehensive and authoritative global volcano unrest data sets - the very information currently collected in WOVOdat. As the database becomes more complete, Boolean searches, side-by-side digital (and thus scalable) comparisons of unrest, and pattern recognition will generate reliable results. Statistical distributions obtained from WOVOdat can then be used to estimate the probability of each scenario after specific patterns of unrest. We have established the main web interface for data submission and visualization, and have now incorporated ~20% of worldwide unrest data into the database, covering more than 100 eruptive episodes. In the upcoming years we will concentrate on acquiring data from volcano observatories, developing a robust data query interface, optimizing data mining, and creating tools by which WOVOdat can be used for probabilistic eruption forecasting. The more data in WOVOdat, the more useful it will be.
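The event-tree analysis mentioned above multiplies conditional probabilities along branches to obtain the probability of each scenario. A minimal sketch of that calculation, with made-up branch probabilities rather than WOVOdat statistics:

```python
# Illustrative event-tree calculation: the probability of each eruption
# scenario is the product of conditional branch probabilities.
# All numbers are invented for the example, not WOVOdat data.
branches = {
    "magmatic_unrest":        0.8,  # P(unrest is magmatic)
    "magma_reaches_surface":  0.5,  # P(eruption | magmatic unrest)
    "explosive":              0.3,  # P(explosive | eruption)
}

p_eruption = branches["magmatic_unrest"] * branches["magma_reaches_surface"]
p_explosive = p_eruption * branches["explosive"]
p_effusive = p_eruption * (1.0 - branches["explosive"])

print(f"P(eruption)           = {p_eruption:.2f}")
print(f"P(explosive eruption) = {p_explosive:.2f}")
print(f"P(effusive eruption)  = {p_effusive:.2f}")
```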
Scofield, Patricia A; Smith, Linda L; Johnson, David N
2017-07-01
The U.S. Environmental Protection Agency promulgated national emission standards for emissions of radionuclides other than radon from US Department of Energy facilities in Chapter 40 of the Code of Federal Regulations (CFR) 61, Subpart H. This regulatory standard limits the annual effective dose that any member of the public can receive from Department of Energy facilities to 0.1 mSv. As defined in the preamble of the final rule, all of the facilities on the Oak Ridge Reservation, i.e., the Y-12 National Security Complex, Oak Ridge National Laboratory, East Tennessee Technology Park, and any other U.S. Department of Energy operations on Oak Ridge Reservation, combined, must meet the annual dose limit of 0.1 mSv. At Oak Ridge National Laboratory, there are monitored sources and numerous unmonitored sources. To maintain radiological source and inventory information for these unmonitored sources, e.g., laboratory hoods, equipment exhausts, and room exhausts not currently venting to monitored stacks on the Oak Ridge National Laboratory campus, the Environmental Protection Rad NESHAPs Inventory Web Database was developed. This database is updated annually and is used to compile emissions data for the annual Radionuclide National Emission Standards for Hazardous Air Pollutants (Rad NESHAPs) report required by 40 CFR 61.94. It also provides supporting documentation for facility compliance audits. In addition, a Rad NESHAPs source and dose database was developed to import the source and dose summary data from Clean Air Act Assessment Package-1988 computer model files. This database provides Oak Ridge Reservation and facility-specific source inventory; doses associated with each source and facility; and total doses for the Oak Ridge Reservation dose.
Working Group 1: Software System Design and Implementation for Environmental Modeling
ISCMEM Working Group One Presentation, presentation with the purpose of fostering the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases.
SIMAC: development and implementation of a coral reef monitoring network in Colombia.
Garzón-Ferreira, Jaime; Rodríguez-Ramírez, Alberto
2010-05-01
Significant coral reef decline has been observed in Colombia during the last three decades. However, due to the lack of monitoring activities, most of the information about reef health and changes was fragmentary or inadequate. To develop an expanded nation-wide reef-monitoring program, in 1998 INVEMAR (Instituto de Investigaciones Marinas y Costeras: "Colombian Institute of Marine and Coastal Research") designed and implemented SIMAC (Sistema Nacional de Monitoreo de Arrecifes Coralinos en Colombia: "National Monitoring System of Coral Reefs in Colombia") with the participation of other institutions. By the end of 2003 the SIMAC network had reached more than twice its initial size, covering ten reef areas (seven in the Caribbean and three in the Pacific), 63 reef sites and 263 permanent transects. SIMAC monitoring continued without interruption until 2008 and should persist in the long term. SIMAC maintains a large database and consists basically of water-quality measurements (temperature, salinity, turbidity) and yearly estimates of benthic reef cover, coral disease prevalence, gorgonian density, abundance of important mobile invertebrates, fish diversity and abundance of important fish species. A methods manual is available on the Internet. Data and results of SIMAC have been widely circulated through a summary report published annually since 2000 for the Colombian environmental agencies and the general public, as well as numerous national and international scientific papers and presentations at meetings. SIMAC information has contributed to regional and global reef monitoring networks and databases (e.g., CARICOMP, GCRMN, ReefBase).
Exploring the Potential of TanDEM-X Data in Rice Monitoring
NASA Astrophysics Data System (ADS)
Erten, E.
2015-12-01
In this work, phenological parameters of rice fields such as growth stage, crop calendar, crop density and yield are estimated using TanDEM-X data. Currently, crop monitoring is country-dependent. Most countries have databases based on cadastral information and annual farmer inputs. Inaccuracies come from wrong or missing farmer declarations and/or coarsely updated cadastral boundary definitions. This leads to inefficient regulation of the market, fraud, and ecological risks. An accurate crop calendar is also missing, since farmers provide estimates in advance and there is no efficient way to know the growth status over large plantations. SAR data are of particular interest for these purposes. The proposed method is a two-step approach comprising field detection and phenological state estimation. In the context of precision farming it is essential to define field borders, which usually change every cultivation period. By linking the SAR-inherent properties to transplanting practices such as irrigation, the spatial database of rice-planted agricultural crops can be updated. Boundaries of agricultural fields are defined in the database, and assignments of crops and sowing dates are continuously updated by the monitoring system, given that sowing practice varies with the field owner's decisions. To define and segment rice crops, the system makes use of the fact that rice fields are characterized as flooded parcels separated by path networks composed of soil or sparse grass. This natural segmentation is well detectable by inspecting low amplitude and coherence values of bistatic acquisitions. Once the field borders are defined, estimating the phenology of the monitored crops at any time is the key point of monitoring. In this respect the wavelength and polarization options of TanDEM-X are sufficient to characterize small phenological changes. The combination of bistatic interferometry and Radiative Transfer Theory (RTT) with different polarizations provides a realistic description of the plants including their full morphology (stalks, tillers, leaves and panicles).
Evaluating Land-Atmosphere Interactions with the North American Soil Moisture Database
NASA Astrophysics Data System (ADS)
Giles, S. M.; Quiring, S. M.; Ford, T.; Chavez, N.; Galvan, J.
2015-12-01
The North American Soil Moisture Database (NASMD) is a high-quality observational soil moisture database that was developed to study land-atmosphere interactions. It includes over 1,800 monitoring stations in the United States, Canada and Mexico. Soil moisture data are collected from multiple sources, quality controlled and integrated into an online database (soilmoisture.tamu.edu). The period of record varies substantially, and only a few of these stations have an observation record extending back into the 1990s. Daily soil moisture observations have been quality controlled using the North American Soil Moisture Database QAQC algorithm. The database is designed to facilitate observationally-driven investigations of land-atmosphere interactions, validation of the accuracy of soil moisture simulations in global land surface models, satellite calibration/validation for SMOS and SMAP, and an improved understanding of how soil moisture influences climate on seasonal to interannual timescales. This paper provides some examples of how the NASMD has been utilized to enhance understanding of land-atmosphere interactions in the U.S. Great Plains.
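The NASMD QAQC algorithm itself is not reproduced in the abstract; the sketch below only illustrates the general kind of automated check (range and spike tests) that a daily soil moisture series might be run through. The thresholds and function name are assumptions.

```python
import numpy as np

def flag_soil_moisture(series, lower=0.0, upper=0.6, max_step=0.15):
    """Toy quality-control check in the spirit of the QAQC step described
    above (not the actual NASMD algorithm): flag volumetric soil moisture
    values outside a plausible range, or with abrupt day-to-day jumps."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=int)        # 0 = pass
    flags[(series < lower) | (series > upper)] = 1  # 1 = out of range
    step = np.abs(np.diff(series, prepend=series[0]))
    flags[(flags == 0) & (step > max_step)] = 2     # 2 = spike
    return flags

print(flag_soil_moisture([0.21, 0.22, 0.70, 0.23, 0.05, 0.45]))
```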
NASA Astrophysics Data System (ADS)
Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang
2009-09-01
Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available to classify the criticality of the process is represented by the historic signal database as well as the current measurement. This paper presents an approach to detect and predict critical events, based on pattern recognition and discriminant analysis.
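A hedged sketch of the classification step described above, using scikit-learn's linear discriminant analysis on synthetic feature vectors in place of the historic signal database; the feature construction and labels are assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Feature vectors derived from historic signal windows, labelled as
# "normal" (0) or "preceding a critical event" (1), train a discriminant
# classifier that scores new measurements. Data here are synthetic.
rng = np.random.default_rng(0)
X_normal = rng.normal(loc=0.0, scale=1.0, size=(200, 3))
X_critical = rng.normal(loc=2.0, scale=1.0, size=(40, 3))
X = np.vstack([X_normal, X_critical])
y = np.array([0] * 200 + [1] * 40)

clf = LinearDiscriminantAnalysis().fit(X, y)
new_window = np.array([[1.8, 2.1, 1.7]])  # hypothetical current measurement
print("P(critical) =", clf.predict_proba(new_window)[0, 1])
```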
1986 Year End Report for Road Following at Carnegie-Mellon
1987-05-01
how to make them work efficiently. We designed a hierarchical structure and a monitor module which manages all parts of the hierarchy (see figure 1...database, called the Local Map, is managed by a program known as the Local Map Builder (LMB). Each module stores and retrieves information in the...knowledge-intensive modules, and a database manager that synchronizes the modules-is characteristic of a traditional blackboard system. Such a system is
Monitoring and tracing of critical software systems: State of the work and project definition
2008-12-01
analysis, troubleshooting and debugging. Some of these subsystems already come with ad hoc tracers for events like wireless connections or SCSI disk... SQLite). Additional synthetic events (e.g. states) are added to the database. The database thus consists in contexts (process, CPU, state), event...capability on a [operating] system-by-system basis. Additionally, the mechanics of querying the data in an ad hoc manner outside the boundaries of the
Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian
2014-06-01
The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years before these drugs were withdrawn from the market. Copyright © 2014 John Wiley & Sons, Ltd.
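The exact sequential alerting algorithm is not given in the abstract; the sketch below shows one simplified way such a pipeline could pool per-database effect estimates and raise an alert across sequential monitoring looks. All numbers, thresholds, and function names are illustrative assumptions.

```python
import math

def pooled_log_rr(estimates):
    """Fixed-effect inverse-variance pooling of per-database log rate ratios.
    `estimates` is a list of (log_rr, standard_error) tuples."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * lrr for (lrr, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

def sequential_alert(looks, critical_z=2.8):
    """Simplified stand-in for a sequential alerting rule (not the published
    algorithm): at each monitoring look, alert if the pooled z-statistic
    exceeds a conservative critical value."""
    for i, estimates in enumerate(looks, start=1):
        pooled, se = pooled_log_rr(estimates)
        if pooled / se > critical_z:
            return i, pooled  # look number at which the alert fired
    return None, None

# Hypothetical data: two looks, each pooling three databases.
looks = [
    [(0.10, 0.20), (0.05, 0.25), (0.15, 0.30)],
    [(0.45, 0.12), (0.50, 0.15), (0.40, 0.18)],
]
print(sequential_alert(looks))
```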
Dietary assessment and self-monitoring with nutrition applications for mobile devices.
Lieffers, Jessica R L; Hanning, Rhona M
2012-01-01
Nutrition applications for mobile devices (e.g., personal digital assistants, smartphones) are becoming increasingly accessible and can assist with the difficult task of intake recording for dietary assessment and self-monitoring. This review is a compilation and discussion of research on this tool for dietary intake documentation in healthy populations and those trying to lose weight. The purpose is to compare this tool with conventional methods (e.g., 24-hour recall interviews, paper-based food records). Research databases were searched from January 2000 to April 2011, with the following criteria: healthy or weight loss populations, use of a mobile device nutrition application, and inclusion of at least one of three measures, which were the ability to capture dietary intake in comparison with conventional methods, dietary self-monitoring adherence, and changes in anthropometrics and/or dietary intake. Eighteen studies are discussed. Two application categories were identified: those with which users select food and portion size from databases and those with which users photograph their food. Overall, positive feedback was reported with applications. Both application types had moderate to good correlations for assessing energy and nutrient intakes in comparison with conventional methods. For self-monitoring, applications versus conventional techniques (often paper records) frequently resulted in better self-monitoring adherence, and changes in dietary intake and/or anthropometrics. Nutrition applications for mobile devices have an exciting potential for use in dietetic practice.
Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization
NASA Astrophysics Data System (ADS)
Daglis, I.; Balasis, G.; Bourdarie, S.; Horne, R.; Khotyaintsev, Y.; Mann, I.; Santolik, O.; Turner, D.; Anastasiadis, A.; Georgiou, M.; Giannakis, O.; Papadimitriou, C.; Ropokis, G.; Sandberg, I.; Angelopoulos, V.; Glauert, S.; Grison, B.; Kersten, T.; Kolmasova, I.; Lazaro, D.; Mella, M.; Ozeke, L.; Usanova, M.
2013-09-01
We present the concept, objectives and expected impact of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project, which is being implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employs multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energization and loss. Particular attention is paid to the role of ULF/VLF waves. A database containing properties of the waves is being created and will be made available to the scientific community. Based on the wave database, a statistical model of the wave activity, dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region, will be developed. Multi-spacecraft particle measurements will be incorporated into data assimilation tools, leading to new understanding of the causal relationships between ULF/VLF waves and radiation belt dynamics. Data assimilation techniques have proven to be a valuable tool in the field of radiation belts, able to guide 'the best' estimate of the state of a complex system. The MAARBLE collaborative research project has received funding from the European Union's Seventh Framework Programme (FP7-SPACE-2011-1) under grant agreement no. 284520.
Features and application of wearable biosensors in medical care
Ajami, Sima; Teimouri, Fotooheh
2015-01-01
One of the new technologies in the field of health is the wearable biosensor, which provides vital-sign monitoring of patients, athletes, premature infants, children, psychiatric patients, people who need long-term care, the elderly, and people in impassable regions far from health and medical services. The aim of this study was to explain the features and applications of wearable biosensors in medical services. This was a narrative review study conducted in 2015. The search was carried out with the help of libraries, books, and conference proceedings, and through the databases Science Direct, PubMed, ProQuest, Springer, and SID (Scientific Information Database). In our searches, we employed the following keywords and their combinations: vital sign monitoring, medical smart shirt, smart clothing, wearable biosensors, physiological monitoring system, remote detection systems, remote control health, and bio-monitoring system. The preliminary search resulted in 54 articles published between 2002 and 2015. After a careful analysis of the content of each paper, 41 sources were selected based on their relevance. Although the use of wearables in healthcare is still at an early stage, it could have a profound effect on healthcare, and smart wearables are expected to be a large and profitable market. Wearable biosensors capable of continuous vital-sign monitoring and feedback to the user will be significantly effective in timely prevention, diagnosis, treatment, and control of diseases. PMID:26958058
Peter S. Murdoch; John L. Hom; Yude Pan; Jeffrey M. Fischer
2008-01-01
As part of the collaborative monitoring study of forested landscapes within the DRB, this work provides a regional perspective on the cumulative effect of different disturbances on overall ecosystem health. This section describes two modeling activities used as integrating tools for the CEMRI database and a validation system that used nested river monitoring stations.
Indigenous species barcode database improves the identification of zooplankton
Yang, Jianghua; Zhang, Wanwan; Sun, Jingying; Xie, Yuwei; Zhang, Yimin; Burton, G. Allen; Yu, Hongxia
2017-01-01
Incompleteness and inaccuracy of DNA barcode databases are considered an important hindrance to the use of metabarcoding in species-level biodiversity analysis of zooplankton. Species barcoding by Sanger sequencing is inefficient for organisms with small body sizes, such as zooplankton. Here, mitochondrial cytochrome c oxidase I (COI) fragment barcodes from 910 freshwater zooplankton specimens (87 morphospecies) were recovered on a high-throughput sequencing platform, Ion Torrent PGM. Intraspecific divergence of most zooplankton species was < 5%, except Brachionus leydigii (Rotifera, 14.3%), Trichocerca elongata (Rotifera, 11.5%), Lecane bulla (Rotifera, 15.9%), Synchaeta oblonga (Rotifera, 5.95%) and Schmackeria forbesi (Copepoda, 6.5%). Metabarcoding data of 28 environmental samples from Lake Tai were annotated with both an indigenous database and the NCBI GenBank database. The indigenous database improved the taxonomic assignment of zooplankton metabarcoding. Most zooplankton (81%) with barcode sequences in the indigenous database were identified by metabarcoding monitoring. Furthermore, the frequency and distribution of zooplankton were also consistent between metabarcoding and morphological identification. Overall, the indigenous database improved the taxonomic assignment of zooplankton. PMID:28977035
Sustainable Seas Intertidal Monitoring Project at Duxbury Reef
NASA Astrophysics Data System (ADS)
Soave, K. S.; Dean, A.; Gusman, V.; McCracken, K.; Solli, S.; Storm, E.; Placeholder, P.
2007-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of the Farallones National Marine Sanctuary. Goals of the project include: 1) to monitor the rocky intertidal habitat and develop a baseline database of invertebrate and algal density and abundance; 2) to contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species and the requirements for maintaining a healthy, diverse intertidal ecosystem; 3) to increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) to contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal ecology, interpretation and monitoring techniques, and the history of the sanctuary. Students conduct two baseline-monitoring surveys three times per year (fall, winter, and late spring) to identify and count key invertebrate and algae species. Seasonal abundance of the algae Mastocarpus and Fucus revealed lower populations during the spring monitoring events. Turban snails, Tegula funebralis, also showed dramatic population variation with respect to tidal zone. One of our project goals is to monitor this area long enough to obtain trends and to begin to connect these patterns to contributing factors (specific weather events, anthropogenic impacts, etc.). Replicate counts of all species are regularly performed. Replicate counts for invertebrate and algae species within the same quadrat along the permanent transects revealed very little variability, giving us confidence that our monitoring program is providing reliable data.
Wang, Hai-Nan; Chen, Wen; Fu, Zheng; Du, Wen-min; He, Jia
2008-03-01
Traditional Chinese medicine (TCM) injections have become one of the hotspots in new TCM research and development. The serious adverse drug reactions that have occurred in clinical practice have attracted wide attention throughout society, so it is urgent to monitor the post-marketing safety of TCM injections. This paper explains the necessity of pharmacovigilance in the post-marketing safety monitoring of TCM injections, based on the causes of the safety problems of TCM injections and the future development trend of adverse drug reaction monitoring. The paper also introduces rapid signal detection methods for spontaneous reporting system databases using data mining technology.
Job monitoring on DIRAC for Belle II distributed computing
NASA Astrophysics Data System (ADS)
Kato, Yuji; Hayasaka, Kiyoshi; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo
2015-12-01
We developed a monitoring system for Belle II distributed computing, which consists of active and passive methods. In this paper we describe the passive monitoring system, where information stored in the DIRAC database is processed and visualized. We divide the DIRAC workload management flow into steps and store characteristic variables which indicate issues. These variables are chosen carefully based on our experiences, then visualized. As a result, we are able to effectively detect issues. Finally, we discuss the future development for automating log analysis, notification of issues, and disabling problematic sites.
An Automated Web Diary System for TeleHomeCare Patient Monitoring
Ganzinger, Matthias; Demiris, George; Finkelstein, Stanley M.; Speedie, Stuart; Lundgren, Jan Marie
2001-01-01
The TeleHomeCare project monitors home care patients via the Internet. Each patient has a personalized homepage with an electronic diary for collecting the monitoring data with HTML forms. The web pages are generated dynamically using PHP. All data are stored in a MySQL database. Data are checked immediately by the system; if a value exceeds a predefined limit an alarm message is generated and sent automatically to the patient's case manager. Weekly graphical reports (PDF format) are also generated and sent by email to the same destination.
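As a minimal illustration of the limit-check-and-alarm idea described above, the following sketch flags diary values that fall outside predefined limits; the field names and thresholds are invented for the example and are not those of the TeleHomeCare system.

```python
# Hypothetical per-field limits; in a real deployment these would be set
# per patient by the case manager.
LIMITS = {"weight_kg": (40, 120), "spo2_pct": (90, 100), "heart_rate": (40, 120)}

def check_diary_entry(entry):
    """Return alarm messages for any diary value outside its predefined limits."""
    alarms = []
    for field, value in entry.items():
        if field in LIMITS:
            low, high = LIMITS[field]
            if not (low <= value <= high):
                alarms.append(f"{field}={value} outside [{low}, {high}]")
    return alarms

print(check_diary_entry({"weight_kg": 83, "spo2_pct": 87, "heart_rate": 72}))
```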
Optical Network Virtualisation Using Multitechnology Monitoring and SDN-Enabled Optical Transceiver
NASA Astrophysics Data System (ADS)
Ou, Yanni; Davis, Matthew; Aguado, Alejandro; Meng, Fanchao; Nejabati, Reza; Simeonidou, Dimitra
2018-05-01
We introduce the real-time multi-technology transport layer monitoring to facilitate the coordinated virtualisation of optical and Ethernet networks supported by optical virtualise-able transceivers (V-BVT). A monitoring and network resource configuration scheme is proposed to include the hardware monitoring in both Ethernet and Optical layers. The scheme depicts the data and control interactions among multiple network layers under the software defined network (SDN) background, as well as the application that analyses the monitored data obtained from the database. We also present a re-configuration algorithm to adaptively modify the composition of virtual optical networks based on two criteria. The proposed monitoring scheme is experimentally demonstrated with OpenFlow (OF) extensions for a holistic (re-)configuration across both layers in Ethernet switches and V-BVTs.
Cronin, Edmond M; Varma, Niraj
2012-07-01
Traditional follow-up of cardiac implantable electronic devices involves the intermittent download of largely nonactionable data. Remote monitoring represents a paradigm shift from episodic office-based follow-up to continuous monitoring of device performance and patient and disease state. This lessens device clinical burden and may also lead to cost savings, although data on economic impact are only beginning to emerge. Remote monitoring technology has the potential to improve the outcomes through earlier detection of arrhythmias and compromised device integrity, and possibly predict heart failure hospitalizations through integration of heart failure diagnostics and hemodynamic monitors. Remote monitoring platforms are also huge databases of patients and devices, offering unprecedented opportunities to investigate real-world outcomes. Here, the current status of the field is described and future directions are predicted.
The European general thoracic surgery database project.
Falcoz, Pierre Emmanuel; Brunelli, Alessandro
2014-05-01
The European Society of Thoracic Surgeons (ESTS) Database is a free registry created by ESTS in 2001. The current online version was launched in 2007. It runs currently on a Dendrite platform with extensive data security and frequent backups. The main features are a specialty-specific, procedure-specific, prospectively maintained, periodically audited and web-based electronic database, designed for quality control and performance monitoring, which allows for the collection of all general thoracic procedures. Data collection is the "backbone" of the ESTS database. It includes many risk factors, processes of care and outcomes, which are specially designed for quality control and performance audit. The user can download and export their own data and use them for internal analyses and quality control audits. The ESTS database represents the gold standard of clinical data collection for European General Thoracic Surgery. Over the past years, the ESTS database has achieved many accomplishments. In particular, the database hit two major milestones: it now includes more than 235 participating centers and 70,000 surgical procedures. The ESTS database is a snapshot of surgical practice that aims at improving patient care. In other words, data capture should become integral to routine patient care, with the final objective of improving quality of care within Europe.
NASA Astrophysics Data System (ADS)
Qi, Weiran; Miao, Hongxia; Miao, Xuejiao; Xiao, Xuanxuan; Yan, Kuo
2016-10-01
In order to ensure the safe and stable operation of prefabricated substations, a temperature sensing subsystem, a remote temperature monitoring and management subsystem, and a forecast subsystem are designed in this paper. The wireless temperature sensing subsystem, which consists of temperature sensors and an MCU, sends the electrical equipment temperature to the remote monitoring center over a wireless sensor network. The remote monitoring center realizes remote monitoring and prediction through the monitoring and management subsystem and the forecast subsystem. Real-time monitoring of power equipment temperature, a history inquiry database, user management, password settings, etc., are provided by the monitoring and management subsystem. In the temperature forecast subsystem, the chaotic nature of the temperature data is first verified and the phase space is reconstructed. Then Support Vector Machine - Particle Swarm Optimization (SVM-PSO) is used to predict the temperature of the power equipment in prefabricated substations. The simulation results show that, compared with traditional methods, SVM-PSO has higher prediction accuracy.
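A rough sketch of the forecasting chain described above: a time-delay (phase-space) reconstruction of a temperature series followed by support vector regression. In the paper the SVM hyperparameters are tuned by PSO; here they are simply fixed, and the data are synthetic.

```python
import numpy as np
from sklearn.svm import SVR

def delay_embed(x, dim=3, tau=2):
    """Time-delay (phase-space) reconstruction: each row holds `dim` delayed
    samples of the series, and the target is the next value of the series."""
    n = len(x) - (dim - 1) * tau - 1
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    y = x[(dim - 1) * tau + 1 : (dim - 1) * tau + 1 + n]
    return X, y

# Synthetic "equipment temperature" series stands in for the real data.
t = np.arange(500)
temp = 40 + 5 * np.sin(0.05 * t) + np.random.default_rng(1).normal(0, 0.2, 500)

X, y = delay_embed(temp)
model = SVR(C=10.0, epsilon=0.05).fit(X[:-50], y[:-50])  # hyperparameters fixed here
print("mean absolute prediction error (last 50 steps):",
      np.mean(np.abs(model.predict(X[-50:]) - y[-50:])))
```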
Global Health Observatory (GHO)
Pedersen, Sidsel Arnspang; Schmidt, Sigrun Alba Johannesdottir; Klausen, Siri; Pottegård, Anton; Friis, Søren; Hölmich, Lisbet Rosenkrantz; Gaist, David
2018-05-01
The nationwide Danish Cancer Registry and the Danish Melanoma Database both record data on melanoma for purposes of monitoring, quality assurance, and research. However, the data quality of the Cancer Registry and the Melanoma Database has not been formally evaluated. We estimated the positive predictive value (PPV) of melanoma diagnosis for random samples of 200 patients from the Cancer Registry (n = 200) and the Melanoma Database (n = 200) during 2004-2014, using the Danish Pathology Registry as "gold standard" reference. We further validated tumor characteristics in the Cancer Registry and the Melanoma Database. Additionally, we estimated the PPV of in situ melanoma diagnoses in the Melanoma Database, and the sensitivity of melanoma diagnoses in 2004-2014. The PPVs of melanoma in the Cancer Registry and the Melanoma Database were 97% (95% CI = 94, 99) and 100%. The sensitivity was 90% in the Cancer Registry and 77% in the Melanoma Database. The PPV of in situ melanomas in the Melanoma Database was 97% and the sensitivity was 56%. In the Melanoma Database, we observed PPVs of ulceration of 75% and Breslow thickness of 96%. The PPV of histologic subtypes varied between 87% and 100% in the Cancer Registry and 93% and 100% in the Melanoma Database. The PPVs for anatomical localization were 83%-95% in the Cancer Registry and 93%-100% in the Melanoma Database. The data quality in both the Cancer Registry and the Melanoma Database is high, supporting their use in epidemiologic studies.
Ahuja, Jaspreet K C; Moshfegh, Alanna J; Holden, Joanne M; Harris, Ellen
2013-02-01
The USDA food and nutrient databases provide the basic infrastructure for food and nutrition research, nutrition monitoring, policy, and dietary practice. They have had a long history that goes back to 1892 and are unique, as they are the only databases available in the public domain that perform these functions. There are 4 major food and nutrient databases released by the Beltsville Human Nutrition Research Center (BHNRC), part of the USDA's Agricultural Research Service. These include the USDA National Nutrient Database for Standard Reference, the Dietary Supplement Ingredient Database, the Food and Nutrient Database for Dietary Studies, and the USDA Food Patterns Equivalents Database. The users of the databases are diverse and include federal agencies, the food industry, health professionals, restaurants, software application developers, academia and research organizations, international organizations, and foreign governments, among others. Many of these users have partnered with BHNRC to leverage funds and/or scientific expertise to work toward common goals. The use of the databases has increased tremendously in the past few years, especially the breadth of uses. These new uses of the data are bound to increase with the increased availability of technology and public health emphasis on diet-related measures such as sodium and energy reduction. Hence, continued improvement of the databases is important, so that they can better address these challenges and provide reliable and accurate data.
Schopohl, D; Bidlingmaier, C; Herzig, D; Klamroth, R; Kurnik, K; Rublee, D; Schramm, W; Schwarzkopf, L; Berger, K
2018-02-28
Open questions in haemophilia, such as effectiveness of innovative therapies, clinical and patient-reported outcomes (PROs), epidemiology and cost, await answers. The aim was to identify data attributes required and investigate the availability, appropriateness and accessibility of real-world data (RWD) from German registries and secondary databases to answer the aforementioned questions. Systematic searches were conducted in BIOSIS, EMBASE and MEDLINE to identify non-commercial secondary healthcare databases and registries of patients with haemophilia (PWH). Inclusion of German patients, type of patients, data elements-stratified by use in epidemiology, safety, outcomes and health economics research-and accessibility were investigated by desk research. Screening of 676 hits, identification of four registries [national PWH (DHR), national/international paediatric (GEPARD, PEDNET), international safety monitoring (EUHASS)] and seven national secondary databases. Access was limited to participants in three registries and to employees in one secondary database. One registry asks for PROs. Limitations of secondary databases originate from the ICD-coding system (missing: severity of haemophilia, presence of inhibitory antibodies), data protection laws and need to monitor reliability. Rigorous observational analysis of German haemophilia RWD shows that there is potential to supplement current knowledge and begin to address selected policy goals. To improve the value of existing RWD, the following efforts are proposed: ethical, legal and methodological discussions on data linkage across different sources, formulation of transparent governance rules for data access, redefinition of the ICD-coding, standardized collection of outcome data and implementation of incentives for treatment centres to improve data collection. © 2018 John Wiley & Sons Ltd.
A new comprehensive database of global volcanic gas analyses
NASA Astrophysics Data System (ADS)
Clor, L. E.; Fischer, T. P.; Lehnert, K. A.; McCormick, B.; Hauri, E. H.
2013-12-01
Volcanic volatiles are the driving force behind eruptions, powerful indicators of magma provenance, present localized hazards, and have implications for climate. Studies of volcanic emissions are necessary for understanding volatile cycling from the mantle to the atmosphere. Gas compositions vary with volcanic activity, making it important to track their chemical variability over time. As studies become increasingly interdisciplinary, it is critical to have a mechanism to integrate decades of gas studies across disciplines. Despite the value of this research to a variety of fields, there is currently no integrated network to house all volcanic and hydrothermal gas data, making spatial, temporal, and interdisciplinary comparison studies time-consuming. To remedy this, we are working to establish a comprehensive database of volcanic gas emissions and compositions worldwide, as part of the Deep Carbon Observatory's DECADE (Deep Carbon Degassing) initiative. Volcanic gas data have been divided into two broad categories: 1) chemical analyses from samples collected directly at the volcanic source, and 2) measurements of gas concentrations and fluxes, such as remotely by mini-DOAS or satellite, or in-plume such as by multiGAS. The gas flux database effort is realized by the Global Volcanism Program of the Smithsonian Institution (abstract by Brendan McCormick, this meeting). The direct-sampling data is the subject of this presentation. Data from direct techniques include samples of gases collected at the volcanic source from fumaroles and springs, tephras analyzed for gas contents, filter pack samples of gases collected in a plume, and any other data types that involve collection of a sample. Data are incorporated into the existing framework of the Petrological Database, PetDB. Association with PetDB is advantageous as it will allow volcanic gas data to be linked to chemical data from lava or tephra samples, forming more complete ties between the eruptive products and the source magma. Eventually our goal is to have a seamless gas database that allows the user to easily access all gas data ever collected at volcanoes. This database will be useful in a variety of science applications: 1) correlating volcanic gas composition to volcanic activity; 2) establishing a characteristic gas composition or total volatile budget for a volcano or region in studies of global chemical cycles; 3) better quantifying the flux and source of volcanic carbon to the atmosphere. The World Organization of Volcano Observatories is populating a volcano monitoring database, WOVOdat, which centers on data collected during times of volcanic unrest for monitoring and hazard purposes. The focus of our database is to gain insight into volcanic degassing specifically, during both eruptive and quiescent times. Coordination of the new database with WOVOdat will allow comparison studies of gas compositions with seismic and other monitoring data during times of unrest, as well as promote comprehensive and cross-disciplinary questions about volcanic degassing.
NASA Astrophysics Data System (ADS)
Montalto, F. A.; Yu, Z.; Soldner, K.; Israel, A.; Fritch, M.; Kim, Y.; White, S.
2017-12-01
Urban stormwater utilities are increasingly using decentralized "green" stormwater infrastructure (GSI) systems to capture stormwater and achieve compliance with regulations. Because environmental conditions and design vary by GSI facility, monitoring of GSI systems under a range of conditions is essential. Conventional monitoring efforts can be costly because in-field data logging requires high data transmission rates. The Internet of Things (IoT) can be used to collect, store, and publish GSI monitoring data more cost-effectively. Using 3G mobile networks, a cloud-based database was built on an Amazon Web Services (AWS) EC2 virtual machine to store and publish data collected with environmental sensors deployed in the field. This database can store multi-dimensional time series data, as well as photos and other observations logged by citizen scientists through a public-engagement mobile app, via a new Application Programming Interface (API). Also on the AWS EC2 virtual machine, a real-time QAQC flagging algorithm was developed to validate the sensor data streams.
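A minimal sketch of a real-time QAQC flag of the sort mentioned above, assuming a pandas series of sensor readings; the column name, depth limit, and flatline window are illustrative assumptions, not the project's actual algorithm.

```python
import pandas as pd

def flag_depth_series(df, col="water_depth_cm", max_depth=60.0, flat_window=12):
    """Toy QAQC flag: mark negative or implausibly deep readings as
    out-of-range, and mark flat-lined (stuck) sensor stretches."""
    out = df.copy()
    out["flag"] = "ok"
    out.loc[(out[col] < 0) | (out[col] > max_depth), "flag"] = "out_of_range"
    stuck = out[col].rolling(flat_window).std() == 0  # no variation in window
    out.loc[stuck & (out["flag"] == "ok"), "flag"] = "flatline"
    return out

df = pd.DataFrame({"water_depth_cm": [2.0, 2.1, 65.0] + [3.0] * 15})
print(flag_depth_series(df)["flag"].value_counts())
```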
Integrated technologies for solid waste bin monitoring system.
Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda
2011-06-01
Communication technologies such as radio frequency identification (RFID), the global positioning system (GPS), the general packet radio service (GPRS), and a geographic information system (GIS) are integrated with a camera to construct a solid waste monitoring system. The aim is to improve the response to customer inquiries and emergency cases and to estimate the solid waste amount without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on the bin, an RFID reader in the truck, a GPRS/GSM link to the web server, and GIS as the map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. Users are able to view the current location of each truck in the collection stage via a web-based application and thereby manage the fleet. The truck positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, the solid waste in the bins and the trucks are monitored using the developed system.
Application research for 4D technology in flood forecasting and evaluation
NASA Astrophysics Data System (ADS)
Li, Ziwei; Liu, Yutong; Cao, Hongjie
1998-08-01
In order to monitor regions of China where disastrous floods happen frequently, satisfy the great need of provincial governments for high-accuracy disaster monitoring and evaluation data, and improve the efficiency of disaster response, a method was researched under the Ninth Five-Year National Key Technologies Programme for flood forecasting and evaluation using satellite and aerial remotely sensed imagery and ground monitoring data. An effective and practicable flood forecasting and evaluation system was established, and DongTing Lake was selected as the test site. Modern digital photogrammetry, remote sensing and GIS technology were used in this system; disastrous floods can be forecasted and losses evaluated based on a '4D' (DEM - Digital Elevation Model, DOQ - Digital Orthophoto Quads, DRG - Digital Raster Graph, DTI - Digital Thematic Information) disaster background database. The technology for gathering and establishing the '4D' disaster environment background database, the application technology for flood forecasting and evaluation based on '4D' background data, and experimental results for the DongTing Lake test site are introduced in detail in this paper.
Active surveillance of postmarket medical product safety in the Federal Partners' Collaboration.
Robb, Melissa A; Racoosin, Judith A; Worrall, Chris; Chapman, Summer; Coster, Trinka; Cunningham, Francesca E
2012-11-01
After half a century of monitoring voluntary reports of medical product adverse events, the Food and Drug Administration (FDA) has launched a long-term project to build an adverse events monitoring system, the Sentinel System, which can access and evaluate electronic health care data to help monitor the safety of regulated medical products once they are marketed. On the basis of experience gathered through a number of collaborative efforts, the Federal Partners' Collaboration pilot project, involving FDA, the Centers for Medicare & Medicaid Services, the Department of Veteran Affairs, and the Department of Defense, is already enabling FDA to leverage the power of large public health care databases to assess, in near real time, the utility of analytical tools and methodologies that are being developed for use in the Sentinel System. Active medical product safety surveillance is enhanced by use of these large public health databases because specific populations of exposed patients can be identified and analyzed, and can be further stratified by key variables such as age, sex, race, socioeconomic status, and basis for eligibility to examine important subgroups.
The HST/WFC3 Quicklook Project: A User Interface to Hubble Space Telescope Wide Field Camera 3 Data
NASA Astrophysics Data System (ADS)
Bourque, Matthew; Bajaj, Varun; Bowers, Ariel; Dulude, Michael; Durbin, Meredith; Gosmeyer, Catherine; Gunning, Heather; Khandrika, Harish; Martlin, Catherine; Sunnquist, Ben; Viana, Alex
2017-06-01
The Hubble Space Telescope's Wide Field Camera 3 (WFC3) instrument, comprised of two detectors, UVIS (Ultraviolet-Visible) and IR (Infrared), has been acquiring ~ 50-100 images daily since its installation in 2009. The WFC3 Quicklook project provides a means for instrument analysts to store, calibrate, monitor, and interact with these data through the various Quicklook systems: (1) a ~ 175 TB filesystem, which stores the entire WFC3 archive on disk, (2) a MySQL database, which stores image header data, (3) a Python-based automation platform, which currently executes 22 unique calibration/monitoring scripts, (4) a Python-based code library, which provides system functionality such as logging, downloading tools, database connection objects, and filesystem management, and (5) a Python/Flask-based web interface to the Quicklook system. The Quicklook project has enabled large-scale WFC3 analyses and calibrations, such as the monitoring of the health and stability of the WFC3 instrument, the measurement of ~ 20 million WFC3/UVIS Point Spread Functions (PSFs), the creation of WFC3/IR persistence calibration products, and many others.
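The header-ingestion idea behind the Quicklook database can be sketched as below, using astropy and SQLite in place of the project's MySQL setup; the keywords, table layout, and example file name are illustrative assumptions.

```python
import sqlite3
from astropy.io import fits

# Illustrative header keywords to harvest from each WFC3 exposure.
KEYWORDS = ["ROOTNAME", "DETECTOR", "FILTER", "EXPTIME", "DATE-OBS"]

def ingest_header(conn, fits_path):
    """Read the primary FITS header and store selected keywords in the table."""
    header = fits.getheader(fits_path, 0)
    values = [str(header.get(key)) for key in KEYWORDS]
    conn.execute(
        "INSERT INTO wfc3_headers (rootname, detector, filter_name, exptime, date_obs) "
        "VALUES (?, ?, ?, ?, ?)", values)

conn = sqlite3.connect("quicklook_demo.db")
conn.execute("CREATE TABLE IF NOT EXISTS wfc3_headers "
             "(rootname TEXT, detector TEXT, filter_name TEXT, exptime TEXT, date_obs TEXT)")
# ingest_header(conn, "ibxl01x0q_flt.fits")  # hypothetical WFC3 file name
conn.commit()
```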
Post-marketing studies: the work of the Drug Safety Research Unit.
Mackay, F J
1998-11-01
The Drug Safety Research Unit (DSRU) is the centre for prescription-event monitoring (PEM) in England. PEM studies are noninterventional observational cohort studies which monitor the safety of newly marketed drugs. The need for post-marketing surveillance is well recognised in the UK and general practice is an ideal source of data. PEM studies are general practitioner (community)-based and exposure is based on dispensed prescription data in England. To date, 65 PEM studies have been completed with a mean cohort size of 10 979 patients and the DSRU database has clinical information on over 700000 patients prescribed new drugs. Unlike spontaneous reporting schemes, PEM produces incidence rates for events reported during treatment. Comparative studies can be conducted for drugs in the same class. The DSRU aggregates outcome data for pregnancies exposed to new drugs. Data for children and the elderly can also be specifically examined. PEM data have a number of advantages over data from computerised general practice databases in the UK. PEM is the only technique within the UK capable of monitoring newly marketed drugs in such a comprehensive and systematic way.
49 CFR 237.155 - Documents and records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the information required by this part; (3) The track owner monitors its electronic records database...; (4) The track owner shall train its employees who use the system on the proper use of the electronic...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-19
..., tribal entities, environmental groups, academic institutions, industrial groups) use the ambient air... System (AQS) database. Quality assurance/quality control records and monitoring network documentation are...
Safe Drinking Water Information System Federal Version (SDWIS/FED)
SDWIS/FED is EPA’s national database that manages and collects public water system information from states, including reports of drinking water standard violations, reporting and monitoring violations, and other basic information.
Astaras, Alexander; Arvanitidou, Marina; Chouvarda, Ioanna; Kilintzis, Vassilis; Koutkias, Vassilis; Sanchez, Eduardo Monton; Stalidis, George; Triantafyllidis, Andreas; Maglaveras, Nicos
2008-01-01
A flexible, scaleable and cost-effective medical telemetry system is described for monitoring sleep-related disorders in the home environment. The system was designed and built for real-time data acquisition and processing, allowing for additional use in intensive care unit scenarios where rapid medical response is required in case of emergency. It comprises a wearable body area network of Zigbee-compatible wireless sensors worn by the subject, a central database repository residing in the medical centre and thin client workstations located at the subject's home and in the clinician's office. The system supports heterogeneous setup configurations, involving a variety of data acquisition sensors to suit several medical applications. All telemetry data is securely transferred and stored in the central database under the clinicians' ownership and control.
NASA Technical Reports Server (NTRS)
Burns, Lee; Decker, Ryan
2005-01-01
Lightning strike location and peak current are monitored operationally in the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) area by the Cloud to Ground Lightning Surveillance System (CGLSS). The present study compiles ten years' worth of CGLSS data into a database of near strikes. Using shuttle launch platform LP39A as a convenient central point, all strikes recorded within a 20-mile radius for the period of record (POR) from January 1, 1993 to December 31, 2002 were included in the subset database. Histograms and cumulative probability curves are produced for both strike intensity (peak current, in kA) and the corresponding magnetic induction fields (in A/m). Results for the full POR have application to launch operations lightning monitoring and post-strike test procedures.
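A sketch of how such a subset database and its summary statistics could be assembled, assuming strike records with latitude, longitude, and peak current; the LP39A coordinates are approximate and all strike values are invented for the example.

```python
import numpy as np

def miles_from_site(lat, lon, site_lat=28.608, site_lon=-80.604):
    """Great-circle distance in statute miles from a reference point
    (approximate LP39A coordinates used here for illustration)."""
    r_miles = 3958.8
    p1, p2 = np.radians(site_lat), np.radians(lat)
    dlat, dlon = np.radians(lat - site_lat), np.radians(lon - site_lon)
    a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
    return 2 * r_miles * np.arcsin(np.sqrt(a))

# Hypothetical strike records: latitude, longitude, peak current (kA).
strikes = np.array([[28.62, -80.60, -23.0],
                    [28.90, -80.30, -41.5],
                    [28.58, -80.65, -12.2]])
d = miles_from_site(strikes[:, 0], strikes[:, 1])
near = strikes[d <= 20.0]                              # the 20-mile subset

peak = np.abs(near[:, 2])
counts, edges = np.histogram(peak, bins=[0, 10, 20, 30, 40, 50])
cumprob = np.cumsum(counts) / counts.sum()             # cumulative probability curve
print(counts, cumprob)
```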
Tritium environmental transport studies at TFTR
NASA Astrophysics Data System (ADS)
Ritter, P. D.; Dolan, T. J.; Longhurst, G. R.
1993-06-01
Environmental tritium concentrations will be measured near the Tokamak Fusion Test Reactor (TFTR) to help validate dynamic models of tritium transport in the environment. For model validation the database must contain sequential measurements of tritium concentrations in key environmental compartments. Since complete containment of tritium is an operational goal, the supplementary monitoring program should be able to glean useful data from an unscheduled acute release. Portable air samplers will be used to take samples automatically every 4 hours for a week after an acute release, thus obtaining the time resolution needed for code validation. Samples of soil, vegetation, and foodstuffs will be gathered daily at the same locations as the active air monitors. The database may help validate the plant/soil/air part of tritium transport models and enhance environmental tritium transport understanding for the International Thermonuclear Experimental Reactor (ITER).
Guidelines for the collection of continuous stream water-temperature data in Alaska
Toohey, Ryan C.; Neal, Edward G.; Solin, Gary L.
2014-01-01
Objectives of stream monitoring programs differ considerably among many of the academic, Federal, state, tribal, and non-profit organizations in the state of Alaska. Broad inclusion of stream-temperature monitoring can provide an opportunity for collaboration in the development of a statewide stream-temperature database. Statewide and regional coordination could reduce overall monitoring cost, while providing better analyses at multiple spatial and temporal scales to improve resource decision-making. Increased adoption of standardized protocols and data-quality standards may allow for validation of historical modeling efforts with better projection calibration. For records of stream water temperature to be generally consistent, unbiased, and reproducible, data must be collected and analyzed according to documented protocols. Collection of water-temperature data requires definition of data-quality objectives, good site selection, proper selection of instrumentation, proper installation of sensors, periodic site visits to maintain sensors and download data, pre- and post-deployment verification against an NIST-certified thermometer, potential data corrections, and proper documentation, review, and approval. A study created to develop a quality-assurance project plan, data-quality objectives, and a database management plan that includes procedures for data archiving and dissemination could provide a means to standardize a statewide stream-temperature database in Alaska. Protocols can be modified depending on desired accuracy or specific needs of data collected. This document is intended to guide users in collecting time series water-temperature data in Alaskan streams and draws extensively on the broader protocols already published by the U.S. Geological Survey.
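One correction alluded to above is removal of sensor drift identified by the pre- and post-deployment checks against a NIST-certified thermometer. The sketch below applies a simple linear-in-time bias correction; it is a generic illustration, not the USGS protocol's prescribed formula.

```python
import numpy as np

def apply_drift_correction(times, temps, pre_offset, post_offset):
    """Illustrative linear drift correction: the sensor bias measured at the
    pre- and post-deployment checks is interpolated in time and subtracted."""
    times = np.asarray(times, dtype=float)
    frac = (times - times[0]) / (times[-1] - times[0])
    bias = pre_offset + frac * (post_offset - pre_offset)
    return np.asarray(temps, dtype=float) - bias

# Example: the sensor read 0.05 degC high at deployment and 0.20 degC high at retrieval.
corrected = apply_drift_correction(
    times=[0, 30, 60, 90], temps=[4.2, 6.8, 9.1, 7.4],
    pre_offset=0.05, post_offset=0.20)
print(corrected)
```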
A global organism detection and monitoring system for non-native species
Graham, J.; Newman, G.; Jarnevich, C.; Shory, R.; Stohlgren, T.J.
2007-01-01
Harmful invasive non-native species are a significant threat to native species and ecosystems, and the costs associated with non-native species in the United States are estimated at over $120 billion/year. While some local or regional databases exist for some taxonomic groups, there are no effective geographic databases designed to detect and monitor all species of non-native plants, animals, and pathogens. We developed a web-based solution called the Global Organism Detection and Monitoring (GODM) system to provide real-time data from a broad spectrum of users on the distribution and abundance of non-native species, including attributes of their habitats for predictive spatial modeling of current and potential distributions. The four major subsystems of GODM provide dynamic links between the organism data, web pages, spatial data, and modeling capabilities. The core survey database tables for recording invasive species survey data are organized into three categories: "Where, Who & When, and What." Organisms are identified with Taxonomic Serial Numbers from the Integrated Taxonomic Information System. To allow users to immediately see a map of their data combined with other users' data, a custom geographic information system (GIS) Internet solution was required. The GIS solution provides an unprecedented level of flexibility in database access, allowing users to display maps of invasive species distributions or abundances based on various criteria including taxonomic classification (i.e., phylum or division, order, class, family, genus, species, subspecies, and variety), a specific project, a range of dates, and a range of attributes (percent cover, age, height, sex, weight). This is a significant paradigm shift from "map servers" to true Internet-based GIS solutions. The remainder of the system was created with a mix of commercial products, open source software, and custom software. Custom GIS libraries were created where required for processing large datasets, accessing the operating system, and using existing libraries in C++, R, and other languages to develop the tools to track harmful species in space and time. The GODM database and system are crucial for early detection and rapid containment of invasive species. © 2007 Elsevier B.V. All rights reserved.
Design of wideband solar ultraviolet radiation intensity monitoring and control system
NASA Astrophysics Data System (ADS)
Ye, Linmao; Wu, Zhigang; Li, Yusheng; Yu, Guohe; Jin, Qi
2009-08-01
Based on the principles of SCM (Single Chip Microcomputer) design and computer communication techniques, the system is composed of chips such as the ATML89C51 and ADL0809, supporting integrated circuits, and UV radiation sensors, and is designed to monitor and control the UV index. The system can automatically collect UV index data, analyze and check the historical database, and study the pattern of UV radiation in the region.
ERIC Educational Resources Information Center
Hickman, Matthew; Griffin, Maria; Mott, Joy; Corkery, John; Madden, Peter; Sondhi, Arun; Stimson, Gerry
2004-01-01
Aims: We discuss the Addicts Index (AI) and examine whether the epidemiological trends of the AI can be continued by the regional drug misuse databases (DMDs), now known as the National Drug Treatment Monitoring System (NDTMS). Methods: (i) Matching individuals recorded as addicted to opiates and/or cocaine in the AI with those reported to the North…
System and methods to determine and monitor changes in microstructural properties
Turner, Joseph Alan [Lincoln, NE]
2011-05-17
A system and methods with which changes in microstructure properties such as grain size, grain elongation, texture, and porosity of materials can be determined and monitored over time to assess conditions such as stress and defects. The present invention includes a database of data, wherein a first set of data is used for comparison with a second set of data to determine the conditions of the material microstructure.
Family Expense Manager Application in Android
NASA Astrophysics Data System (ADS)
Rajaprabha, M. N.
2017-11-01
FAMILY EXPENSE MANAGER is an Android application that tracks personal expenses, family expenses, and incidental expenses, serving as a modern expense day book on a mobile device. The application helps users monitor daily expenses, settlement details, general summaries, detailed reports, and periodic expense details. All of the information is stored in a database and can be retrieved by the user and their family members.
Informatics in radiology: Efficiency metrics for imaging device productivity.
Hu, Mengqi; Pavlicek, William; Liu, Patrick T; Zhang, Muhong; Langer, Steve G; Wang, Shanshan; Place, Vicki; Miranda, Rafael; Wu, Teresa Tong
2011-01-01
Acute awareness of the costs associated with medical imaging equipment is an ever-present aspect of the current healthcare debate. However, the monitoring of productivity associated with expensive imaging devices is likely to be labor intensive, to rely on summary statistics, and to lack accepted and standardized benchmarks of efficiency. In the context of the general Six Sigma DMAIC (define, measure, analyze, improve, and control) process, a World Wide Web-based productivity tool called the Imaging Exam Time Monitor was developed to accurately and remotely monitor imaging efficiency with use of Digital Imaging and Communications in Medicine (DICOM) data combined with a picture archiving and communication system. Five device efficiency metrics (examination duration, table utilization, interpatient time, appointment interval time, and interseries time) were derived from DICOM values. These metrics allow the standardized measurement of productivity, to facilitate the comparative evaluation of imaging equipment use and ongoing efforts to improve efficiency. A relational database was constructed to store patient imaging data, along with device- and examination-related data. The database provides full access to ad hoc queries and can automatically generate detailed reports for administrative and business use, thereby allowing staff to monitor data for trends and to better identify possible changes that could lead to improved productivity and reduced costs in association with imaging services. © RSNA, 2011.
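As a rough illustration of how such metrics can be derived from DICOM time stamps, the sketch below computes an examination duration and the interpatient gap from series-level date/time values in the standard DICOM DA (YYYYMMDD) and TM (HHMMSS) formats. The record structure and the sample values are assumptions for illustration, not the Imaging Exam Time Monitor's actual implementation.

```python
from datetime import datetime

def parse_dicom_time(date_str: str, time_str: str) -> datetime:
    """Combine DICOM DA (YYYYMMDD) and TM (HHMMSS[.ffffff]) values."""
    time_str = time_str.split(".")[0].ljust(6, "0")
    return datetime.strptime(date_str + time_str, "%Y%m%d%H%M%S")

# Illustrative series times for two consecutive patients on one scanner
# (in practice these would come from DICOM headers via a PACS query).
exam_a = [("20110103", "081502"), ("20110103", "082310"), ("20110103", "083155")]
exam_b = [("20110103", "084420"), ("20110103", "085010")]

def exam_interval(series):
    times = sorted(parse_dicom_time(d, t) for d, t in series)
    return times[0], times[-1]

a_start, a_end = exam_interval(exam_a)
b_start, b_end = exam_interval(exam_b)

examination_duration = a_end - a_start   # first to last series of one exam
interpatient_time = b_start - a_end      # gap between consecutive exams
print("Examination duration:", examination_duration)
print("Interpatient time:", interpatient_time)
```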
Monitoring in the nearshore: A process for making reasoned decisions
Bodkin, James L.; Dean, T.A.
2003-01-01
Over the past several years, a conceptual framework for the GEM nearshore monitoring program has been developed through a series of workshops. However, details of the proposed monitoring program, e.g. what to sample, where to sample, when to sample and at how many sites, have yet to be determined. In FY 03 we were funded under Project 03687 to outline a process whereby specific alternatives to monitoring are developed and presented to the EVOS Trustee Council for consideration. As part of this process, two key elements are required before reasoned decisions can be made. These are: 1) a comprehensive historical perspective of locations and types of past studies conducted in the nearshore marine communities within Gulf of Alaska, and 2) estimates of costs for each element of a proposed monitoring program. We have developed a GIS database that details available information from past studies of selected nearshore habitats and species in the Gulf of Alaska and provide a visual means of selecting sites based (in part) on the locations for which historical data of interest are available. We also provide cost estimates for specific monitoring plan alternatives and outline several alternative plans that can be accomplished within reasonable budgetary constraints. The products that we will provide are: 1) A GIS database and maps showing the location and types of information available from the nearshore in the Gulf of Alaska; 2) A list of several specific monitoring alternatives that can be conducted within reasonable budgetary constraints; and 3) Cost estimates for proposed tasks to be conducted as part of the nearshore program. Because data compilation and management will not be completed until late in FY03 we are requesting support for close-out of this project in FY 04.
Dosing and Monitoring of Methadone in Pregnancy: Literature Review
Shiu, Jennifer R; Ensom, Mary H H
2012-01-01
Background: The pharmacokinetics of methadone is altered during pregnancy, but the most appropriate dosing and monitoring regimen has yet to be identified. Objective: To review dosing and monitoring of methadone therapy in pregnancy. Methods: A literature search was performed in several databases (PubMed, MEDLINE, Embase, International Pharmaceutical Abstracts, and the Cochrane Database of Systematic Reviews) from inception to May 2012. The search terms were “methadone”, “pregnancy”, “pharmacokinetic”, “clearance”, “metabolism”, “therapeutic drug monitoring”, and “methadone dosing”. Additional papers were identified by searching the bibliographies of primary and review articles. All English-language primary articles related to methadone pharmacokinetics in pregnancy were included. Articles not related to maternal outcomes were excluded. Results: The literature search yielded 1 case report and 10 studies discussing use of methadone by pregnant women. Methadone pharmacokinetics in pregnancy has been studied in 3 pharmacokinetic trials, and split dosing of methadone in pregnant women has been described in 1 case report and 3 dosing trials. Only 4 trials evaluated monitoring of methadone concentration in pregnancy. The studies included in this review confirm that methadone pharmacokinetics is altered in pregnancy and is potentially correlated with increases in maternal withdrawal symptoms. Insufficient evidence is available to warrant routine monitoring of serum methadone concentrations in pregnant women with opioid dependence. Conclusions: Few studies of methadone pharmacokinetics and therapeutic drug monitoring are available for pregnant women with opioid dependence. Although it is known that methadone pharmacokinetics is altered in pregnancy, there is insufficient evidence to guide dosage adjustments and serum concentration monitoring. Until further studies are available, regular follow-up of maternal withdrawal symptoms and empiric dosage adjustments throughout pregnancy are still recommended. PMID:23129867
Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA
NASA Astrophysics Data System (ADS)
Rainsford, A.; Soave, K.; Costolo, R.; Kudler, J.; Emunah, M.; Hatfield, J.; Kiyasu, J.
2015-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of this student-run project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Each fall student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal monitoring techniques, and history of the sanctuary. Students identify and count key invertebrate and algae species along two permanent transects and, using randomly determined points, within two permanent 200 m2 areas, in fall, winter, and late spring. Using data from the previous years, we will compare population densities, seasonal abundance and long-term population trends of key algal and invertebrate species, including Tegula funebralis, Anthopleura elegantissima, Cladophora sp. and Fucus sp. Future analyses and investigations will include intertidal abiotic factors (including water temperature, pH and human foot-traffic) to enhance insights into the Duxbury Reef ecosystem, in particular, the high and mid-intertidal zones experiencing the greatest amount of human impacts.
Sustainable Seas Student Intertidal Monitoring Project at Duxbury Reef in Bolinas, CA
NASA Astrophysics Data System (ADS)
Soave, K.; Dean, A.; Yang, G.; Solli, E.; Dattels, C.; Wallace, K.; Boesel, A.; Steiger, C.; Buie, A.
2010-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of the project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species and the requirements for maintaining a healthy, diverse intertidal ecosystem; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal monitoring techniques, and history of the sanctuary. Students identify and count key invertebrate and algae species along two permanent transects (A and B) and using randomly determined points within a permanent 100 m2 area, three times per year (fall, winter, and late spring). Using the data collected since 2004, we will compare population densities, seasonal abundance and long-term population trends of key algal and invertebrate species, including Tegula funebralis. Future analyses and investigations will include intertidal abiotic factors (including water temperature and human foot-traffic) to enhance insights into the workings of the Duxbury Reef ecosystem, in particular, the high intertidal zone which experiences the greatest amount of human impacts.
Sustainable Seas Student Monitoring Project
NASA Astrophysics Data System (ADS)
Soave, K.; Emunah, M.; Hatfield, J.; Kiyasu, J.; Packard, E.; Ching, L.; Zhao, K.; Sanderson, L.; Turmon, M.
2016-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 2000, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of this student-run project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Each fall student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal monitoring techniques, and history of the sanctuary. Students identify and count key invertebrate and algae species along two permanent transects and, using randomly determined points, within two permanent 200 m2 areas, in fall, winter, and late spring. Using data from the previous years, we will compare population densities, seasonal abundance and long-term population trends of key algal and invertebrate species, including Tegula funebralis, three separate Anthopleura sea anemone species, and two rockweed species. Future analyses and investigations will include intertidal abiotic factors (including water temperature, pH and human foot-traffic) to enhance insights into the Duxbury Reef ecosystem, in particular, the high and mid-intertidal zones experiencing the greatest amount of human impacts.
Sustainable Seas Student Intertidal Monitoring Project at Duxbury Reef in Bolinas, CA
NASA Astrophysics Data System (ADS)
Rainsford, A.; Soave, K.; Gerraty, F.; Jung, G.; Quirke-Shattuck, M.; Kudler, J.; Hatfield, J.; Emunah, M.; Dean, A. F.
2014-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of this student-run project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Each fall student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal monitoring techniques, and history of the sanctuary. Students identify and count key invertebrate and algae species along two permanent transects and, using randomly determined points, within two permanent 200 m2 areas, in fall, winter, and late spring. Using data from the previous years, we will compare population densities, seasonal abundance and long-term population trends of key algal and invertebrate species, including Tegula funebralis, Anthopleura elegantissima, Cladophora sp. and Fucus sp. Future analyses and investigations will include intertidal abiotic factors (including water temperature, pH and human foot-traffic) to enhance insights into the Duxbury Reef ecosystem, in particular, the high and mid-intertidal zones experiencing the greatest amount of human impacts.
NASA Astrophysics Data System (ADS)
Weltzin, J. F.; Browning, D. M.
2014-12-01
The USA National Phenology Network (USA-NPN; www.usanpn.org) is a national-scale science and monitoring initiative focused on phenology - the study of seasonal life-cycle events such as leafing, flowering, reproduction, and migration - as a tool to understand the response of biodiversity to environmental variation and change. USA-NPN provides a hierarchical, national monitoring framework that enables other organizations to leverage the capacity of the Network for their own applications - minimizing investment and duplication of effort - while promoting interoperability. Network participants can leverage: (1) Standardized monitoring protocols that have been broadly vetted, tested and published; (2) A centralized National Phenology Database (NPDb) for maintaining, archiving and replicating data, with standard metadata, terms-of-use, web-services, and documentation of QA/QC, plus tools for discovery, visualization and download of raw data and derived data products; and/or (3) A national in-situ, multi-taxa phenological monitoring system, Nature's Notebook, which enables participants to observe and record phenology of plants and animals - based on the protocols and information management system (IMS) described above - via either web or mobile applications. The protocols, NPDb and IMS, and Nature's Notebook represent a hierarchy of opportunities for involvement by a broad range of interested stakeholders, from individuals to agencies. For example, some organizations have adopted (e.g., the National Ecological Observatory Network or NEON) -- or are considering adopting (e.g., the Long-Term Agroecosystems Network or LTAR) -- the USA-NPN standardized protocols, but will develop their own database and IMS with web services to promote sharing of data with the NPDb. Other organizations (e.g., the Inventory and Monitoring Programs of the National Wildlife Refuge System and the National Park Service) have elected to use Nature's Notebook to support their phenological monitoring programs. We highlight the challenges and benefits of integrating phenology monitoring within existing and emerging national monitoring networks, and showcase opportunities that exist when standardized protocols are adopted and implemented to promote data interoperability and sharing.
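The centralized National Phenology Database described above exposes its observations through web services for discovery and download. The sketch below shows how a client might pull records over HTTP; the endpoint URL and query parameters are placeholders invented for illustration, not the documented USA-NPN API, so consult www.usanpn.org for the real interface.

```python
import csv
import io
import urllib.error
import urllib.parse
import urllib.request

# Placeholder endpoint and parameters -- not the actual USA-NPN web service.
BASE_URL = "https://example.org/npn/observations"
params = {"species_id": "35", "start_date": "2014-01-01", "end_date": "2014-12-31"}

url = BASE_URL + "?" + urllib.parse.urlencode(params)
try:
    with urllib.request.urlopen(url) as resp:   # this sketch assumes a CSV response
        reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
        n_records = sum(1 for _ in reader)
    print(f"Downloaded {n_records} phenology records")
except urllib.error.URLError as err:
    print("Placeholder endpoint not reachable:", err)
```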
Poon, Art F. Y.; Gustafson, Réka; Daly, Patricia; Zerr, Laura; Demlow, S. Ellen; Wong, Jason; Woods, Conan K; Hogg, Robert S.; Krajden, Mel; Moore, David; Kendall, Perry; Montaner, Julio S. G.; Harrigan, P. Richard
2016-01-01
Background Due to the rapid evolution of HIV, infections with similar genetic sequences are likely to be related by recent transmission events. Clusters of related infections can represent subpopulations with high rates of HIV transmission. Here we describe the implementation of an automated “near real-time” system using clustering analysis of routinely collected HIV resistance genotypes to monitor and characterize HIV transmission hotspots in British Columbia (BC). Methods A monitoring system was implemented on the BC Drug Treatment Database, which currently holds over 32000 anonymized HIV genotypes for nearly 9000 residents of BC living with HIV. On average, five to six new HIV genotypes are deposited in the database every day, which triggers an automated re-analysis of the entire database. Clusters of five or more individuals were extracted on the basis of short phylogenetic distances between their respective HIV sequences. Monthly reports on the growth and characteristics of clusters were generated by the system and distributed to public health officers. Findings In June 2014, the monitoring system detected the expansion of a cluster by 11 new cases over three months, including eight cases with transmitted drug resistance. This cluster generally comprised young men who have sex with men. The subsequent report precipitated an enhanced public health follow-up to ensure linkage to care and treatment initiation in the affected subpopulation. Of the nine cases associated with this follow-up, all had already been linked to care and five cases had started treatment. Subsequent to the follow-up, three additional cases started treatment and the majority of cases achieved suppressed viral loads. Over the following 12 months, 12 new cases were detected in this cluster with a marked reduction in the onward transmission of drug resistance. Interpretation Our findings demonstrate the first application of an automated phylogenetic system monitoring a clinical database to detect a recent HIV outbreak and support the ensuing public health response. By making secondary use of routinely collected HIV genotypes, this approach is cost-effective, attains near realtime monitoring of new cases, and can be implemented in all settings where HIV genotyping is the standard of care. Funding This work was supported by the BC Centre for Excellence in HIV/AIDS and by grants from the Canadian Institutes for Health Research (CIHR HOP-111406, HOP-107544), the Genome BC, Genome Canada and CIHR Partnership in Genomics and Personalized Health (Large-Scale Applied Research Project HIV142 contract to PRH, JSGM, and AFYP), and by the US National Institute on Drug Abuse (1-R01-DA036307-01, 5-R01-031055-02, R01-DA021525-06, and R01-DA011591). PMID:27126490
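The cluster-extraction step described above, grouping sequences whose pairwise phylogenetic distances fall below a threshold and keeping clusters of five or more members, can be approximated with simple single-linkage grouping, as in the hedged sketch below. The distance cutoff and the toy distance matrix are illustrative values, not those used by the BC monitoring system.

```python
# Hedged sketch: single-linkage grouping of sequences whose pairwise genetic
# distance falls below a cutoff, reporting clusters with >= 5 members.
# The distances and cutoff below are toy values, not those of the BC system.
from collections import defaultdict
from itertools import combinations

def clusters_from_distances(ids, dist, cutoff=0.02, min_size=5):
    parent = {i: i for i in ids}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b in combinations(ids, 2):
        if dist[frozenset((a, b))] <= cutoff:
            union(a, b)

    groups = defaultdict(list)
    for i in ids:
        groups[find(i)].append(i)
    return [g for g in groups.values() if len(g) >= min_size]

ids = [f"seq{i}" for i in range(7)]
dist = {frozenset(p): 0.01 for p in combinations(ids[:5], 2)}   # tight cluster of 5
dist.update({frozenset((a, b)): 0.10 for a in ids[:5] for b in ids[5:]})
dist[frozenset(("seq5", "seq6"))] = 0.10

print(clusters_from_distances(ids, dist))   # -> one cluster of five sequences
```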
Databases as policy instruments. About extending networks as evidence-based policy.
de Bont, Antoinette; Stoevelaar, Herman; Bal, Roland
2007-12-07
This article seeks to identify the role of databases in health policy. Access to information and communication technologies has changed traditional relationships between the state and professionals, creating new systems of surveillance and control. As a result, databases may have a profound effect on controlling clinical practice. We conducted three case studies to reconstruct the development and use of databases as policy instruments. Each database was intended to be employed to control the use of one particular pharmaceutical in the Netherlands (growth hormone, antiretroviral drugs for HIV and Taxol, respectively). We studied the archives of the Dutch Health Insurance Board, conducted in-depth interviews with key informants and organized two focus groups, all focused on the use of databases both in policy circles and in clinical practice. Our results demonstrate that policy makers hardly used the databases, either for cost control or for quality assurance. Further analysis revealed that these databases facilitated self-regulation and quality assurance by (national) bodies of professionals, resulting in restrictive prescription behavior amongst physicians. The databases fulfill control functions that were formerly located within the policy realm. The databases facilitate collaboration between policy makers and physicians, since they enable quality assurance by professionals. Delegating regulatory authority downwards into a network of physicians who control the use of pharmaceuticals seems to be a good alternative to centralized control on the basis of monitoring data.
NASA Astrophysics Data System (ADS)
Lanckman, Jean-Pierre; Elger, Kirsten; Karlsson, Ævar Karl; Johannsson, Halldór; Lantuit, Hugues
2013-04-01
Permafrost is a direct indicator of climate change and has been identified as an Essential Climate Variable (ECV) by the global observing community. The monitoring of permafrost temperatures, active-layer thicknesses and other parameters has been performed for several decades, but it was brought together within the Global Terrestrial Network for Permafrost (GTN-P) only in the 1990s, including the development of measurement protocols to provide standardized data. GTN-P is the primary international observing network for permafrost, sponsored by the Global Climate Observing System (GCOS) and the Global Terrestrial Observing System (GTOS) and managed by the International Permafrost Association (IPA). All GTN-P data are subject to an open data policy with free access via the World Wide Web. The existing data, however, are far from homogeneous: they are not yet optimized for databases, there is no framework for data reporting or archival, and data documentation is incomplete. As a result, and despite the utmost relevance of permafrost in the Earth's climate system, the data have not been used by as many researchers as intended by the initiators of the programs. While the monitoring of many other ECVs has been tackled by organized international networks (e.g. FLUXNET), there is still no central database for all permafrost-related parameters. The European Union project PAGE21 created the opportunity to develop this central database for the permafrost monitoring parameters of GTN-P during the project and beyond. The database aims to be the one location where researchers can find data, metadata, and information on all relevant parameters for a specific site. Each component of the Data Management System (DMS), including parameters, data levels and metadata formats, was developed in cooperation with the GTN-P and the IPA. The general framework of the GTN-P DMS is based on an object-oriented model (OOM), open to as many parameters as possible, and implemented in a spatial database. To ensure interoperability and enable potential inter-database searches, field names follow international metadata standards and are based on a controlled vocabulary registry. Tools are being developed to provide data processing, analysis capability, and quality control. Our system aims to be a reference model, improvable and reusable. It allows a maximum of top-down and bottom-up data flow, giving scientists one globally searchable data and metadata repository, the public full access to scientific data, and policy makers a powerful cartographic and statistical tool. To engage the international community in GTN-P, it was essential to develop an online interface for data upload that is easy to use and allows data input with minimal technical and personnel effort. In addition, substantial effort will be required to query, visualize and retrieve information across many platforms and types of measurements. Ultimately, it is not the individual information layers themselves that matter, but the relationships that these layers maintain with each other.
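A minimal sketch of what an object-oriented model for such a multi-parameter monitoring database might look like is given below; the class and field names are assumptions made for illustration and are not the actual GTN-P DMS schema.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative object model for a permafrost monitoring site; class and field
# names are assumptions, not the actual GTN-P Data Management System schema.
@dataclass
class Measurement:
    parameter: str          # e.g. "ground_temperature", "active_layer_thickness"
    depth_m: float
    value: float
    unit: str
    timestamp: str          # ISO 8601

@dataclass
class Borehole:
    name: str
    latitude: float
    longitude: float
    measurements: List[Measurement] = field(default_factory=list)

    def series(self, parameter: str) -> List[Measurement]:
        """Return all measurements of one parameter, ordered by time."""
        return sorted((m for m in self.measurements if m.parameter == parameter),
                      key=lambda m: m.timestamp)

site = Borehole("Demo-01", 78.9, 11.9)
site.measurements.append(
    Measurement("ground_temperature", 10.0, -5.6, "degC", "2012-08-01T00:00:00Z"))
site.measurements.append(
    Measurement("ground_temperature", 10.0, -5.2, "degC", "2013-08-01T00:00:00Z"))
print([m.value for m in site.series("ground_temperature")])
```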
Ambient Monitoring Guidelines for Prevention of Significant Deterioration
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
ERIC Educational Resources Information Center
Bock, H. Darrell
The hardware and software system used to create the National Opinion Research Center/Center for Research on Evaluation, Standards, and Student Testing (NORC/CRESST) item databases and test booklets for the 12th-grade science assessment are described. A general description of the capabilities of the system is given, with some specific information…
Effect of Changing Stack Heights on PSD Modeling and Monitoring
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Definition of Postapproval Monitoring
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
A Multimodal Database for a Home Remote Medical Care Application
NASA Astrophysics Data System (ADS)
Medjahed, Hamid; Istrate, Dan; Boudy, Jerome; Steenkeste, François; Baldinger, Jean-Louis; Dorizzi, Bernadette
Home remote monitoring systems aim to make a protective contribution to the well-being of individuals (patients, elderly persons) who require moderate support for independent living, and to improve their everyday life. Existing research on these systems suffers from a lack of experimental data and of a standard medical database for their validation and improvement. This paper presents a multi-sensor environment for acquiring and recording a multimodal medical database, which includes physiological data (cardiac frequency, activity or agitation, posture, falls), environmental sounds, and localization data. It provides graphical interface functions to manage, process, and index these data. The paper focuses on the system implementation and usage, and points out possibilities for future work.
Consulting report on the NASA technology utilization network system
NASA Technical Reports Server (NTRS)
Hlava, Marjorie M. K.
1992-01-01
The purposes of this consulting effort are: (1) to evaluate the existing management and production procedures and workflow as they each relate to the successful development, utilization, and implementation of the NASA Technology Utilization Network System (TUNS) database; (2) to identify, as requested by the NASA Project Monitor, the strengths, weaknesses, areas of bottlenecking, and previously unaddressed problem areas affecting TUNS; (3) to recommend changes or modifications of existing procedures as necessary in order to effect corrections for the overall benefit of NASA TUNS database production, implementation, and utilization; and (4) to recommend the addition of alternative procedures, routines, and activities that will consolidate and facilitate the production, implementation, and utilization of the NASA TUNS database.
Monitoring service for the Gran Telescopio Canarias control system
NASA Astrophysics Data System (ADS)
Huertas, Manuel; Molgo, Jordi; Macías, Rosa; Ramos, Francisco
2016-07-01
The Monitoring Service collects, persists and propagates the Telescope and Instrument telemetry for the Gran Telescopio CANARIAS (GTC), an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). A new version of the Monitoring Service has been developed in order to improve performance, provide high availability, and guarantee fault tolerance and scalability to cope with a high volume of data. The architecture is based on a distributed in-memory data store with a Producer/Consumer pattern design. The producer generates the data samples. The consumers either persist the samples to a database for further analysis or propagate them to the consoles in the control room to monitor the state of the whole system.
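The Producer/Consumer arrangement described above can be sketched with a simple in-memory queue feeding two kinds of handling: persisting samples to a database and forwarding them to displays. The sketch below is a schematic illustration only, not the GTC Monitoring Service implementation.

```python
import queue
import sqlite3
import threading
import time

# Schematic producer/consumer sketch: one producer publishes telemetry samples,
# one consumer persists them and "propagates" them (here, just prints).
# Illustrative only, not the GTC Monitoring Service implementation.
samples = queue.Queue()
db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE telemetry (monitor TEXT, value REAL, t REAL)")

def producer(n=5):
    for i in range(n):
        samples.put(("mirror.temperature", 12.0 + 0.1 * i, time.time()))
    samples.put(None)                      # sentinel: no more samples

def consumer():
    while True:
        sample = samples.get()
        if sample is None:
            break
        db.execute("INSERT INTO telemetry VALUES (?, ?, ?)", sample)
        print("console update:", sample[0], sample[1])   # stand-in for propagation

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print("persisted rows:", db.execute("SELECT COUNT(*) FROM telemetry").fetchone()[0])
```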
Scholl, Joep H G; van Puijenbroek, Eugene P
2012-08-01
The Netherlands Pharmacovigilance Centre Lareb received reports of six cases of hearing impairment in association with oral terbinafine use. This study describes these cases and provides support for this association from the Lareb database for spontaneous adverse drug reaction (ADR) reporting and from Vigibase™, the ADR database of the WHO Collaborating Centre for International Drug Monitoring, the Uppsala Monitoring Centre. The objective of the current study was to identify whether the observed association between oral terbinafine use and hearing impairment, based on cases received by Lareb, constitutes a safety signal. Cases of hearing impairment in oral terbinafine users are described. In a case/non-case analysis, the strength of the association in Vigibase™ and the Lareb database was determined (date of analysis August 2011) by calculating the reporting odds ratios (RORs), adjusted for possible confounding by age, sex and ototoxic concomitant medication. For the purpose of this study, RORs were calculated for deafness, hypoacusis and the combination of both, defined as hearing impairment. In the Lareb database, six reports concerning individuals aged 31-82 years, who developed hearing impairment after starting oral terbinafine, were present. The use of oral terbinafine was disproportionally associated with hypoacusis in both the Lareb database (adjusted ROR 3.9; 95% CI 1.7, 9.0) and in Vigibase™ (adjusted ROR 1.7; 95% CI 1.0, 2.8). Deafness was not disproportionally present in either of the databases. Based on the described cases and the statistical analyses from both databases, a causal relationship between the use of oral terbinafine and hearing impairment is possible. The mechanism by which terbinafine could cause hearing impairment has not been elucidated yet. The pharmacological action of terbinafine is based on the inhibition of squalene epoxidase, an enzyme present in both fungal and human cells. This inhibition might result in a decrease in cholesterol levels in human cells, among which are the outer hair cells of the cochlea. It may be possible that the reduction in cochlear cholesterol levels leads to impaired cochlear function and possibly hearing impairment. In this study we describe hearing impairment as a possible ADR of oral terbinafine, based on six case reports and statistical support from Vigibase™ and the Lareb database. To our knowledge this association has not been described before.
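The reporting odds ratio used in the case/non-case analysis above compares the odds of the suspect drug among reports of the ADR of interest with its odds among all other reports. A standard unadjusted calculation with a 95% confidence interval is sketched below using made-up counts; the published figures are adjusted for age, sex and ototoxic co-medication, which this sketch does not reproduce.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Unadjusted ROR and 95% CI from a 2x2 table:
    a: reports with drug and ADR      b: reports with drug, other ADRs
    c: reports with other drugs, ADR  d: reports with other drugs, other ADRs
    """
    ror = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se_log)
    hi = math.exp(math.log(ror) + 1.96 * se_log)
    return ror, lo, hi

# Made-up counts for illustration only (not the Lareb or Vigibase figures).
ror, lo, hi = reporting_odds_ratio(a=6, b=350, c=1200, d=300000)
print(f"ROR = {ror:.1f} (95% CI {lo:.1f}, {hi:.1f})")
```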
Evolution of grid-wide access to database resident information in ATLAS using Frontier
NASA Astrophysics Data System (ADS)
Barberis, D.; Bujor, F.; de Stefano, J.; Dewhurst, A. L.; Dykstra, D.; Front, D.; Gallas, E.; Gamboa, C. F.; Luehring, F.; Walker, R.
2012-12-01
The ATLAS experiment deployed Frontier technology worldwide during the initial year of LHC collision data taking to enable user analysis jobs running on the Worldwide LHC Computing Grid to access database resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on the specific changes in the deployment and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of underlying Frontier related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has allowed us a deeper understanding of problematic queries and understanding of use cases. Use of the system has grown beyond user analysis and subsystem specific tasks such as calibration and alignment, extending into production processing areas, such as initial reconstruction and trigger reprocessing. With a more robust and tuned system, we are better equipped to satisfy the still growing number of diverse clients and the demands of increasingly sophisticated processing and analysis.
Romeis, Jörg; Meissle, Michael; Alvarez-Alfageme, Fernando; Bigler, Franz; Bohan, David A; Devos, Yann; Malone, Louise A; Pons, Xavier; Rauschen, Stefan
2014-12-01
Worldwide, plants obtained through genetic modification are subject to a risk analysis and regulatory approval before they can enter the market. An area of concern addressed in environmental risk assessments is the potential of genetically modified (GM) plants to adversely affect non-target arthropods and the valued ecosystem services they provide. Environmental risk assessments are conducted case-by-case for each GM plant taking into account the plant species, its trait(s), the receiving environments into which the GM plant is to be released and its intended uses, and the combination of these characteristics. To facilitate the non-target risk assessment of GM plants, information on arthropods found in relevant agro-ecosystems in Europe has been compiled in a publicly available database of bio-ecological information during a project commissioned by the European Food Safety Authority (EFSA). Using different hypothetical GM maize case studies, we demonstrate how the information contained in the database can assist in identifying valued species that may be at risk and in selecting suitable species for laboratory testing, higher-tier studies, as well as post-market environmental monitoring.
A web-based quantitative signal detection system on adverse drug reaction in China.
Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan
2009-07-01
To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. Information component (IC), the internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations including known signals were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system can be used for the detection of significant associations from the Guangdong drug-monitoring database and could be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs for the first time in China.
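The Information Component used here is a Bayesian disproportionality measure; in its simplest observed-versus-expected form it is the base-2 logarithm of the ratio between the observed and expected counts of a drug-ADR pair. A hedged sketch of that basic form follows; the shrinkage and credibility-interval terms used in operational implementations are omitted, and the counts are illustrative rather than Guangdong data.

```python
import math

def information_component(n_drug_adr, n_drug, n_adr, n_total):
    """Basic Information Component: log2(observed / expected) for a drug-ADR pair.
    Shrinkage and credibility-interval terms used in operational systems are omitted.
    """
    expected = n_drug * n_adr / n_total
    return math.log2(n_drug_adr / expected)

# Illustrative counts from a spontaneous-reporting database (not Guangdong data).
ic = information_component(n_drug_adr=30, n_drug=1200, n_adr=500, n_total=100000)
print(f"IC = {ic:.2f}")   # IC > 0 suggests the pair is reported more often than expected
```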
Pan European Phenological database (PEP725): a single point of access for European data.
Templ, Barbara; Koch, Elisabeth; Bolmgren, Kjell; Ungersböck, Markus; Paul, Anita; Scheifinger, Helfried; Rutishauser, This; Busto, Montserrat; Chmielewski, Frank-M; Hájková, Lenka; Hodzić, Sabina; Kaspar, Frank; Pietragalla, Barbara; Romero-Fresneda, Ramiro; Tolvanen, Anne; Vučetič, Višnja; Zimmermann, Kirsten; Zust, Ana
2018-06-01
The Pan European Phenology (PEP) project is a European infrastructure to promote and facilitate phenological research, education, and environmental monitoring. The main objective is to maintain and develop a Pan European Phenological database (PEP725) with an open, unrestricted data access for science and education. PEP725 is the successor of the database developed through the COST action 725 "Establishing a European phenological data platform for climatological applications" working as a single access point for European-wide plant phenological data. So far, 32 European meteorological services and project partners from across Europe have joined and supplied data collected by volunteers from 1868 to the present for the PEP725 database. Most of the partners actively provide data on a regular basis. The database presently holds almost 12 million records, about 46 growing stages and 265 plant species (including cultivars), and can be accessed via http://www.pep725.eu/. Users of the PEP725 database have studied a diversity of topics ranging from climate change impacts, plant physiological questions, phenological modeling, and remote sensing of vegetation to ecosystem productivity.
Pan European Phenological database (PEP725): a single point of access for European data
NASA Astrophysics Data System (ADS)
Templ, Barbara; Koch, Elisabeth; Bolmgren, Kjell; Ungersböck, Markus; Paul, Anita; Scheifinger, Helfried; Rutishauser, This; Busto, Montserrat; Chmielewski, Frank-M.; Hájková, Lenka; Hodzić, Sabina; Kaspar, Frank; Pietragalla, Barbara; Romero-Fresneda, Ramiro; Tolvanen, Anne; Vučetič, Višnja; Zimmermann, Kirsten; Zust, Ana
2018-02-01
The Pan European Phenology (PEP) project is a European infrastructure to promote and facilitate phenological research, education, and environmental monitoring. The main objective is to maintain and develop a Pan European Phenological database (PEP725) with an open, unrestricted data access for science and education. PEP725 is the successor of the database developed through the COST action 725 "Establishing a European phenological data platform for climatological applications" working as a single access point for European-wide plant phenological data. So far, 32 European meteorological services and project partners from across Europe have joined and supplied data collected by volunteers from 1868 to the present for the PEP725 database. Most of the partners actively provide data on a regular basis. The database presently holds almost 12 million records, about 46 growing stages and 265 plant species (including cultivars), and can be accessed via http://www.pep725.eu/. Users of the PEP725 database have studied a diversity of topics ranging from climate change impacts, plant physiological questions, phenological modeling, and remote sensing of vegetation to ecosystem productivity.
Devices for Self-Monitoring Sedentary Time or Physical Activity: A Scoping Review.
Sanders, James P; Loveday, Adam; Pearson, Natalie; Edwardson, Charlotte; Yates, Thomas; Biddle, Stuart J H; Esliger, Dale W
2016-05-04
It is well documented that meeting the guideline levels (150 minutes per week) of moderate-to-vigorous physical activity (PA) is protective against chronic disease. Conversely, emerging evidence indicates the deleterious effects of prolonged sitting. Therefore, there is a need to change both behaviors. Self-monitoring of behavior is one of the most robust behavior-change techniques available. The growing number of technologies in the consumer electronics sector provides a unique opportunity for individuals to self-monitor their behavior. The aim of this study is to review the characteristics and measurement properties of currently available self-monitoring devices for sedentary time and/or PA. To identify technologies, four scientific databases were systematically searched using key terms related to behavior, measurement, and population. Articles published through October 2015 were identified. To identify technologies from the consumer electronic sector, systematic searches of three Internet search engines were also performed through to October 1, 2015. The initial database searches identified 46 devices and the Internet search engines identified 100 devices yielding a total of 146 technologies. Of these, 64 were further removed because they were currently unavailable for purchase or there was no evidence that they were designed for, had been used in, or could readily be modified for self-monitoring purposes. The remaining 82 technologies were included in this review (73 devices self-monitored PA, 9 devices self-monitored sedentary time). Of the 82 devices included, this review identified no published articles in which these devices were used for the purpose of self-monitoring PA and/or sedentary behavior; however, a number of technologies were found via Internet searches that matched the criteria for self-monitoring and provided immediate feedback on PA (ActiGraph Link, Microsoft Band, and Garmin Vivofit) and sedentary time (activPAL VT, the Lumo Back, and Darma). There are a large number of devices that self-monitor PA; however, there is a greater need for the development of tools to self-monitor sedentary time. The novelty of these devices means they have yet to be used in behavior change interventions, although the growing field of wearable technology may facilitate this to change.
Devices for Self-Monitoring Sedentary Time or Physical Activity: A Scoping Review
Loveday, Adam; Pearson, Natalie; Edwardson, Charlotte; Yates, Thomas; Biddle, Stuart JH; Esliger, Dale W
2016-01-01
Background It is well documented that meeting the guideline levels (150 minutes per week) of moderate-to-vigorous physical activity (PA) is protective against chronic disease. Conversely, emerging evidence indicates the deleterious effects of prolonged sitting. Therefore, there is a need to change both behaviors. Self-monitoring of behavior is one of the most robust behavior-change techniques available. The growing number of technologies in the consumer electronics sector provides a unique opportunity for individuals to self-monitor their behavior. Objective The aim of this study is to review the characteristics and measurement properties of currently available self-monitoring devices for sedentary time and/or PA. Methods To identify technologies, four scientific databases were systematically searched using key terms related to behavior, measurement, and population. Articles published through October 2015 were identified. To identify technologies from the consumer electronic sector, systematic searches of three Internet search engines were also performed through to October 1, 2015. Results The initial database searches identified 46 devices and the Internet search engines identified 100 devices yielding a total of 146 technologies. Of these, 64 were further removed because they were currently unavailable for purchase or there was no evidence that they were designed for, had been used in, or could readily be modified for self-monitoring purposes. The remaining 82 technologies were included in this review (73 devices self-monitored PA, 9 devices self-monitored sedentary time). Of the 82 devices included, this review identified no published articles in which these devices were used for the purpose of self-monitoring PA and/or sedentary behavior; however, a number of technologies were found via Internet searches that matched the criteria for self-monitoring and provided immediate feedback on PA (ActiGraph Link, Microsoft Band, and Garmin Vivofit) and sedentary time (activPAL VT, the Lumo Back, and Darma). Conclusions There are a large number of devices that self-monitor PA; however, there is a greater need for the development of tools to self-monitor sedentary time. The novelty of these devices means they have yet to be used in behavior change interventions, although the growing field of wearable technology may facilitate this to change. PMID:27145905
LiMPETS: Scientists Contributions to Coastal Protection Program for Youth
NASA Astrophysics Data System (ADS)
Saltzman, J.; Osborn, D. A.
2004-12-01
In the West Coast National Marine Sanctuaries' LiMPETS (Long-term Monitoring Experiential Training for Students), scientists have partnered with local sanctuaries to develop an educational and scientifically based monitoring program. With different levels of commitment and interest, scientists have contributed to developing protocols that youth can successfully use to monitor coastal habitats. LiMPETS was developed to address the gap in marine science education for high school students. The team of sanctuary educators together with local scientists collaborate and compromise to develop scientifically accurate and meaningful monitoring projects. By crossing the border between scientists and educators, LiMPETS has become a rich program which provides teachers with professional development, monitoring equipment, an online database, and field support. In the Sandy Beach Monitoring Project, we called on an expert on the sand crab Emerita analoga to help us modify the protocols that she uses to monitor crabs regularly. This scientist brings inspiration to teachers at teacher workshops by explaining how the student monitoring complements her research. The Rocky Intertidal Monitoring Project was developed by scientists at the University of California, Santa Cruz with the intention of passing on this project to an informal learning center. After receiving California Sea Grant funding, the protocols used for over 30 years with undergraduates were modified for middle and high school students. With the help of teachers, classroom activities were developed to train students for fieldwork. The online database was envisioned by the scientists to house the historical data from undergraduate students while growing with new data collected by middle and high school students. The support of scientists in this program has been crucial in developing a meaningful program for both youth and resource managers. The hours that a scientist contributes to this program may be minimal, a weeklong workshop or even a part-time job. The framework of resource protection agencies partnering with scientists can be replicated to monitor other natural habitats. Through LiMPETS, scientists are helping to develop scientifically literate youth who are engaged in environmental monitoring.
2015-10-01
already been well characterized. Finally, we have also been recruiting migraine patients since they commonly report light sensitivity between headaches...and have been recruiting migraine subjects in the immediate 25-mile radius using email announcements and also the UIHC database by diagnostic...expression of pain from feigned pain as might occur when monitoring photosensitivity for migraine treatment. 3D also proves its value for expressions of
2002-09-01
...initiatives. The federal government has 55 databases that deal with security threats, but inter-agency access depends on establishing agreements through...which that information can be shared. True cooperation also will require government-wide commitment to enterprise architecture, integrated
An object-oriented, knowledge-based system for cardiovascular rehabilitation--phase II.
Ryder, R. M.; Inamdar, B.
1995-01-01
The Heart Monitor is an object-oriented, knowledge-based system designed to support the clinical activities of cardiovascular (CV) rehabilitation. The original concept was developed as part of graduate research completed in 1992. This paper describes the second generation system which is being implemented in collaboration with a local heart rehabilitation program. The PC UNIX-based system supports an extensive patient database organized by clinical areas. In addition, a knowledge base is employed to monitor patient status. Rule-based automated reasoning is employed to assess risk factors contraindicative to exercise therapy and to monitor administrative and statutory requirements. PMID:8563285
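Rule-based screening of the kind described above can be illustrated with a few declarative rules evaluated against a patient record. The specific rules and thresholds below are invented for illustration and are not the Heart Monitor's clinical knowledge base.

```python
# Illustrative rule-based screening sketch; the rules and thresholds below are
# invented examples, not the Heart Monitor's clinical knowledge base.
RULES = [
    ("resting heart rate above 120 bpm",
     lambda p: p.get("resting_hr", 0) > 120),
    ("systolic blood pressure above 200 mmHg",
     lambda p: p.get("systolic_bp", 0) > 200),
    ("unresolved chest pain reported",
     lambda p: p.get("chest_pain", False)),
]

def contraindications(patient):
    """Return the descriptions of all rules that fire for this patient."""
    return [desc for desc, rule in RULES if rule(patient)]

patient = {"resting_hr": 96, "systolic_bp": 210, "chest_pain": False}
flags = contraindications(patient)
if flags:
    print("Exercise therapy flagged:", "; ".join(flags))
else:
    print("No contraindications found by these rules")
```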
Deploying Server-side File System Monitoring at NERSC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uselton, Andrew
2009-05-01
The Franklin Cray XT4 at the NERSC center was equipped with the server-side I/O monitoring infrastructure Cerebro/LMT, which is described here in detail. Insights gained from the data produced include a better understanding of instantaneous data rates during file system testing, file system behavior during regular production time, and long-term average behaviors. Information and insights gleaned from this monitoring support efforts to proactively manage the I/O infrastructure on Franklin. A simple model for I/O transactions is introduced and compared with the 250 million observations sent to the LMT database from August 2008 to February 2009.
Doering, Che; Bollhöfer, Andreas
2016-10-01
This paper presents a database of radionuclide activity and metal concentrations for the Alligator Rivers Region (ARR) uranium province in the Australian wet-dry tropics. The database contains 5060 sample records and 57,473 concentration values. The data are for animal, plant, soil, sediment and water samples collected by the Environmental Research Institute of the Supervising Scientist (ERISS) as part of its statutory role to undertake research and monitoring into the impacts of uranium mining on the environment of the ARR. Concentration values are provided in the database for 11 radionuclides (²²⁷Ac, ⁴⁰K, ²¹⁰Pb, ²¹⁰Po, ²²⁶Ra, ²²⁸Ra, ²²⁸Th, ²³⁰Th, ²³²Th, ²³⁴U, ²³⁸U) and 26 metals (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Fe, Hg, K, Mg, Mn, Na, Ni, P, Pb, Rb, S, Sb, Se, Sr, Th, U, V, Zn). Potential uses of the database are discussed. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Schurr, K.M.; Cox, S.E.
1994-01-01
The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
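Subdividing a Public Land Survey System section into its sixteen idealized quarter-quarter sections amounts to a 4 x 4 partition of the section. The sketch below labels the cells with conventional quarter names (e.g. NW 1/4 of NE 1/4) given a section's bounding coordinates; the square-section assumption and the labeling convention are simplifications for illustration, not the ARC/INFO programs used by the system.

```python
# Subdivide an idealized (square) PLSS section into 16 quarter-quarter cells.
# Real sections are not perfectly square; this is a simplification for display grids.
def quarter_quarters(west, south, east, north):
    """Yield (label, (w, s, e, n)) bounds for the 16 quarter-quarter cells."""
    dx, dy = (east - west) / 4.0, (north - south) / 4.0
    for row in range(4):
        for col in range(4):
            quarter = ("N" if row >= 2 else "S") + ("E" if col >= 2 else "W")
            sub = ("N" if row % 2 == 1 else "S") + ("E" if col % 2 == 1 else "W")
            label = f"{sub} 1/4 of {quarter} 1/4"
            bounds = (west + col * dx, south + row * dy,
                      west + (col + 1) * dx, south + (row + 1) * dy)
            yield label, bounds

for label, bounds in list(quarter_quarters(-122.04, 47.60, -122.03, 47.61))[:4]:
    print(label, bounds)
```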
Reddy, T.B.K.; Thomas, Alex D.; Stamatis, Dimitri; Bertsch, Jon; Isbandi, Michelle; Jansson, Jakob; Mallajosyula, Jyothi; Pagani, Ioanna; Lobos, Elizabeth A.; Kyrpides, Nikos C.
2015-01-01
The Genomes OnLine Database (GOLD; http://www.genomesonline.org) is a comprehensive online resource to catalog and monitor genetic studies worldwide. GOLD provides up-to-date status on complete and ongoing sequencing projects along with a broad array of curated metadata. Here we report version 5 (v.5) of the database. The newly designed database schema and web user interface supports several new features including the implementation of a four level (meta)genome project classification system and a simplified intuitive web interface to access reports and launch search tools. The database currently hosts information for about 19 200 studies, 56 000 Biosamples, 56 000 sequencing projects and 39 400 analysis projects. More than just a catalog of worldwide genome projects, GOLD is a manually curated, quality-controlled metadata warehouse. The problems encountered in integrating disparate and varying quality data into GOLD are briefly highlighted. GOLD fully supports and follows the Genomic Standards Consortium (GSC) Minimum Information standards. PMID:25348402
Guhn, Martin; Janus, Magdalena; Enns, Jennifer; Brownell, Marni; Forer, Barry; Duku, Eric; Muhajarine, Nazeem; Raos, Rob
2016-01-01
Introduction Early childhood is a key period to establish policies and practices that optimise children's health and development, but Canada lacks nationally representative data on social indicators of children's well-being. To address this gap, the Early Development Instrument (EDI), a teacher-administered questionnaire completed for kindergarten-age children, has been implemented across most Canadian provinces over the past 10 years. The purpose of this protocol is to describe the Canadian Neighbourhoods and Early Child Development (CanNECD) Study, the aims of which are to create a pan-Canadian EDI database to monitor trends over time in children's developmental health and to advance research examining the social determinants of health. Methods and analysis Canada-wide EDI records from 2004 to 2014 (representing over 700 000 children) will be linked to Canada Census and Income Taxfiler data. Variables of socioeconomic status derived from these databases will be used to predict neighbourhood-level EDI vulnerability rates by conducting a series of regression analyses and latent variable models at provincial/territorial and national levels. Where data are available, we will measure the neighbourhood-level change in developmental vulnerability rates over time and model the socioeconomic factors associated with those trends. Ethics and dissemination Ethics approval for this study was granted by the Behavioural Research Ethics Board at the University of British Columbia. Study findings will be disseminated to key partners, including provincial and federal ministries, schools and school districts, collaborative community groups and the early childhood development research community. The database created as part of this longitudinal population-level monitoring system will allow researchers to associate practices, programmes and policies at school and community levels with trends in developmental health outcomes. The CanNECD Study will guide future early childhood development action and policies, using the database as a tool for formative programme and policy evaluation. PMID:27130168
Research on public participant urban infrastructure safety monitoring system using smartphone
NASA Astrophysics Data System (ADS)
Zhao, Xuefeng; Wang, Niannian; Ou, Jinping; Yu, Yan; Li, Mingchu
2017-04-01
Currently, more and more people are concerned about the safety of public infrastructure. Public participation in urban infrastructure safety monitoring and investigation has become a trend in the era of big data. In this paper, a public-participant urban infrastructure safety protection system based on smartphones is proposed. The system enables public participation in disaster data collection, monitoring, and emergency evaluation in the field of disaster prevention and mitigation. The system monitors structural acceleration, angle, and other vibration information, extracts structural deformation, and supports disaster emergency communication via smartphone even without a network connection. The monitoring data are uploaded to a website to create an urban safety information database. The system then supports big-data analysis and processing, structural safety assessment, and city-wide safety early warning.
Long-Term Pavement Performance Program
DOT National Transportation Integrated Search
2015-12-01
The LTPP program will yield additional benefits as data are added to the database and as data analysis efforts, some currently planned and some yet to be identified, are completed. Continued monitoring of the test sections that remain in service is...
49 CFR 237.155 - Documents and records.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... inspection and reproduction by the Federal Railroad Administration. (a) Electronic recordkeeping; general... the information required by this part; (3) The track owner monitors its electronic records database...
49 CFR 237.155 - Documents and records.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... inspection and reproduction by the Federal Railroad Administration. (a) Electronic recordkeeping; general... the information required by this part; (3) The track owner monitors its electronic records database...
49 CFR 237.155 - Documents and records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... inspection and reproduction by the Federal Railroad Administration. (a) Electronic recordkeeping; general... the information required by this part; (3) The track owner monitors its electronic records database...
The ASAS-SN Catalog of Variable Stars I: The Serendipitous Survey
NASA Astrophysics Data System (ADS)
Jayasinghe, T.; Kochanek, C. S.; Stanek, K. Z.; Shappee, B. J.; Holoien, T. W.-S.; Thompson, Todd A.; Prieto, J. L.; Dong, Subo; Pawlak, M.; Shields, J. V.; Pojmanski, G.; Otero, S.; Britt, C. A.; Will, D.
2018-04-01
The All-Sky Automated Survey for Supernovae (ASAS-SN) is the first optical survey to routinely monitor the whole sky with a cadence of ~2-3 days down to V ≲ 17 mag. ASAS-SN has monitored the whole sky since 2014, collecting ~100-500 epochs of observations per field. The V-band light curves for candidate variables identified during the search for supernovae are classified using a random forest classifier and visually verified. We present a catalog of 66,533 bright, new variable stars discovered during our search for supernovae, including 27,753 periodic variables and 38,780 irregular variables. V-band light curves for the ASAS-SN variables are available through the ASAS-SN variable stars database (https://asas-sn.osu.edu/variables). The database will begin to include the light curves of known variable stars in the near future along with the results for a systematic, all-sky variability survey.
[Computerised monitoring of integrated cervical screening. Indicators of diagnostic performance].
Bucchi, L; Pierri, C; Amadori, A; Folicaldi, S; Ghidoni, D; Nannini, R; Bondi, A
2003-12-01
In a previous issue of this journal, we presented the background, rationale, general methods, and indicators of participation of a computerised system for the monitoring of integrated cervical screening, i.e. the integration of spontaneous Pap smear practice into organised screening. We also reported the results of the application of those indicators in the general database of the Pathology Department of Imola Health District in northern Italy. In the current paper, we present the rationale and definitions of indicators of diagnostic performance (total Pap smears and rate of unsatisfactory Pap smears, distribution by cytology class reported, rate of patients without timely follow-up, detection rate, positive predictive value, distribution of cytology classes reported by histology diagnosis, and distribution of cases of CIN and carcinoma registered by detection modality) as well as the results of their application in the same database as above.
XRootD popularity on hadoop clusters
NASA Astrophysics Data System (ADS)
Meoni, Marco; Boccali, Tommaso; Magini, Nicolò; Menichetti, Luca; Giordano, Domenico;
2017-10-01
Performance data and metadata of the computing operations at the CMS experiment are collected through a distributed monitoring infrastructure, currently relying on a traditional Oracle database system. This paper shows how to harness Big Data architectures in order to improve the throughput and the efficiency of such monitoring. A large set of operational data - user activities, job submissions, resources, file transfers, site efficiencies, software releases, network traffic, machine logs - is being injected into a readily available Hadoop cluster via several data streamers. The collected metadata are then organized so that fast arbitrary queries can be run; this offers the ability to test several MapReduce-based frameworks and measure the system speed-up compared to the original database infrastructure. By leveraging a robust Hadoop data store and enabling an analytics framework on top, it is possible to design a mining platform to predict dataset popularity and discover patterns and correlations.
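As a rough illustration of the kind of arbitrary aggregation such a monitoring store supports, the sketch below sums transferred bytes per destination site in a map/reduce style. It is a minimal, self-contained example in plain Python; the record fields ("src", "dst", "bytes") and site names are assumptions for illustration, not the actual CMS dashboard schema.

```python
# Illustrative map/reduce-style aggregation: total transferred bytes per destination site.
from collections import defaultdict

transfers = [
    {"src": "T1_IT_CNAF", "dst": "T2_CH_CERN", "bytes": 4_200_000_000},
    {"src": "T1_US_FNAL", "dst": "T2_CH_CERN", "bytes": 1_100_000_000},
    {"src": "T1_IT_CNAF", "dst": "T2_DE_DESY", "bytes": 900_000_000},
]

def map_phase(records):
    # Emit (key, value) pairs, as a mapper would on the cluster.
    for r in records:
        yield r["dst"], r["bytes"]

def reduce_phase(pairs):
    # Sum values per key, as the reducer would.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

print(reduce_phase(map_phase(transfers)))
# {'T2_CH_CERN': 5300000000, 'T2_DE_DESY': 900000000}
```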
Dialynas, Emmanuel; Topalis, Pantelis; Vontas, John; Louis, Christos
2009-01-01
Background Monitoring of insect vector populations with respect to their susceptibility to one or more insecticides is a crucial element of the strategies used for the control of arthropod-borne diseases. This management task can nowadays be achieved more efficiently when assisted by IT (Information Technology) tools, ranging from modern integrated databases to GIS (Geographic Information System). Here we describe an application ontology that we developed de novo, and a specially designed database that, based on this ontology, can be used for the purpose of controlling mosquitoes and, thus, the diseases that they transmit. Methodology/Principal Findings The ontology, named MIRO for Mosquito Insecticide Resistance Ontology, developed using the OBO-Edit software, describes all pertinent aspects of insecticide resistance, including specific methodology and mode of action. MIRO, then, forms the basis for the design and development of a dedicated database, IRbase, constructed using open source software, which can be used to retrieve data on mosquito populations in a temporally and spatially separate way, as well as to map the output using a Google Earth interface. The dependency of the database on the MIRO allows for a rational and efficient hierarchical search possibility. Conclusions/Significance The fact that the MIRO complies with the rules set forward by the OBO (Open Biomedical Ontologies) Foundry introduces cross-referencing with other biomedical ontologies and, thus, both MIRO and IRbase are suitable as parts of future comprehensive surveillance tools and decision support systems that will be used for the control of vector-borne diseases. MIRO is downloadable from and IRbase is accessible at VectorBase, the NIAID-sponsored open access database for arthropod vectors of disease. PMID:19547750
NASA Astrophysics Data System (ADS)
Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong
2013-05-01
Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users to choose either the comparator method, the k0-method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal to epithermal neutron flux ratio (f). Calculations involved in determining the concentration were net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal to epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRMs and SRMs were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing the sample concentrations between the code-system and the experiment. The concentration values obtained with the ECC-UKM database code-system agreed with the experimental values with good accuracy.
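The comparator (relative) method mentioned above reduces, in its simplest form, to scaling a standard's known concentration by the ratio of specific count rates. The sketch below is a minimal illustration of that ratio calculation, assuming identical irradiation, decay and counting conditions for sample and standard; the actual ECC-UKM code-systems apply the full set of corrections (decay, efficiency, flux parameters) listed in the abstract, and the numbers here are made up.

```python
# Minimal comparator-method sketch: equal irradiation/decay/counting conditions assumed.
def specific_count_rate(net_peak_area, count_time_s, mass_g):
    """Counts per second per gram for a given gamma peak (Np / (tm * m))."""
    return net_peak_area / (count_time_s * mass_g)

def comparator_concentration(np_sample, tm_sample, m_sample,
                             np_standard, tm_standard, m_standard,
                             c_standard_mg_per_kg):
    """Element concentration in the sample relative to a co-irradiated standard."""
    asp_sample = specific_count_rate(np_sample, tm_sample, m_sample)
    asp_standard = specific_count_rate(np_standard, tm_standard, m_standard)
    return c_standard_mg_per_kg * asp_sample / asp_standard

# Illustrative numbers only (not from the study)
print(comparator_concentration(15200, 3600, 0.10, 48100, 3600, 0.10, 31.5))
```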
Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A.
2013-01-01
Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. Source and issue reports for MOBBED are maintained at http://vislab.github.com/MobbedMatlab/ PMID:24124417
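To illustrate the kind of cross-dataset event query that a relational back end makes possible without reading raw time series, the sketch below issues one SQL query through the standard psycopg2 PostgreSQL driver. The table and column names ("events", "dataset_id", "event_type", "onset_s") are hypothetical, not MOBBED's actual schema, and the connection parameters are placeholders.

```python
# Sketch: count co-occurrences of two event types within 500 ms, per dataset,
# without loading any raw EEG data. Schema and credentials are illustrative only.
import psycopg2

conn = psycopg2.connect("dbname=mobbed user=researcher host=localhost")
cur = conn.cursor()
cur.execute("""
    SELECT a.dataset_id, COUNT(*)
    FROM events a
    JOIN events b
      ON a.dataset_id = b.dataset_id
     AND b.event_type = 'button_press'
     AND b.onset_s BETWEEN a.onset_s AND a.onset_s + 0.5
    WHERE a.event_type = 'visual_target'
    GROUP BY a.dataset_id;
""")
for dataset_id, n in cur.fetchall():
    print(dataset_id, n)
cur.close()
conn.close()
```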
NASA Astrophysics Data System (ADS)
Qiu, Xin; Cheng, Irene; Yang, Fuquan; Horb, Erin; Zhang, Leiming; Harner, Tom
2018-03-01
Two speciated and spatially resolved emissions databases for polycyclic aromatic compounds (PACs) in the Athabasca oil sands region (AOSR) were developed. The first database was derived from volatile organic compound (VOC) emissions data provided by the Cumulative Environmental Management Association (CEMA) and the second database was derived from additional data collected within the Joint Canada-Alberta Oil Sands Monitoring (JOSM) program. CALPUFF modelling results for atmospheric polycyclic aromatic hydrocarbons (PAHs), alkylated PAHs, and dibenzothiophenes (DBTs), obtained using each of the emissions databases, are presented and compared with measurements from a passive air monitoring network. The JOSM-derived emissions resulted in better model-measurement agreement in the total PAH concentrations and for most PAH species concentrations compared to results using CEMA-derived emissions. At local sites near oil sands mines, the percent error of the model compared to observations decreased from 30 % using the CEMA-derived emissions to 17 % using the JOSM-derived emissions. The improvement at local sites was likely attributed to the inclusion of updated tailings pond emissions estimated from JOSM activities. In either the CEMA-derived or JOSM-derived emissions scenario, the model underestimated PAH concentrations by a factor of 3 at remote locations. Potential reasons for the disagreement include forest fire emissions, re-emissions of previously deposited PAHs, and long-range transport not considered in the model. Alkylated PAH and DBT concentrations were also significantly underestimated. The CALPUFF model is expected to predict higher concentrations because of the limited chemistry and deposition modelling. Thus the model underestimation of PACs is likely due to gaps in the emissions database for these compounds and uncertainties in the methodology for estimating the emissions. Future work is required that focuses on improving the PAC emissions estimation and speciation methodologies and reducing the uncertainties in VOC emissions which are subsequently used in PAC emissions estimation.
Control of cost in prospective memory: evidence for spontaneous retrieval processes.
Scullin, Michael K; McDaniel, Mark A; Einstein, Gilles O
2010-01-01
To examine the processes that support prospective remembering, previous research has often examined whether the presence of a prospective memory task slows overall responding on an ongoing task. Although slowed task performance suggests that monitoring is present, this method does not clearly establish whether monitoring is functionally related to prospective memory performance. According to the multiprocess theory (McDaniel & Einstein, 2000), monitoring should be necessary to prospective memory performance with nonfocal cues but not with focal cues. To test this hypothesis, we varied monitoring by presenting items that were related (or unrelated) to the prospective memory task proximal to target events. Notably, whereas monitoring proximal to target events led to a large increase in nonfocal prospective memory performance, focal prospective remembering was high in the absence of monitoring, and monitoring in this condition provided no additional benefits. These results suggest that when monitoring is absent, spontaneous retrieval processes can support focal prospective remembering. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
The evolution of monitoring system: the INFN-CNAF case study
NASA Astrophysics Data System (ADS)
Bovina, Stefano; Michelotto, Diego
2017-10-01
Over the past two years, the operations at CNAF, the ICT center of the Italian Institute for Nuclear Physics, have undergone significant changes. The adoption of configuration management tools, such as Puppet, and the constant increase of dynamic and cloud infrastructures have led us to investigate a new monitoring approach. The present work deals with the centralization of the monitoring service at CNAF through a scalable and highly configurable monitoring infrastructure. The selection of tools has been made taking into account the following requirements given by users: (I) adaptability to dynamic infrastructures, (II) ease of configuration and maintenance, with the capability to provide more flexibility, (III) compatibility with the existing monitoring system, (IV) re-usability and ease of access to information and data. In the paper, the CNAF monitoring infrastructure and its related components are described: Sensu as monitoring router, InfluxDB as time series database to store data gathered from sensors, Uchiwa as monitoring dashboard, and Grafana as a tool to create dashboards and to visualize time series metrics.
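As a small illustration of how a sensor check result can land in a time-series database of this kind, the sketch below writes one point to InfluxDB 1.x using its HTTP line protocol, roughly what a monitoring handler might do. The host, database name, measurement and tags are assumptions for the example, not the actual CNAF configuration.

```python
# Sketch: push one monitoring sample into InfluxDB 1.x via the HTTP /write endpoint.
import time
import requests

def write_point(measurement, tags, value, host="http://localhost:8086", db="monitoring"):
    # InfluxDB line protocol: measurement,tag=... field=value timestamp(ns)
    tag_str = ",".join(f"{k}={v}" for k, v in tags.items())
    line = f"{measurement},{tag_str} value={value} {time.time_ns()}"
    resp = requests.post(f"{host}/write", params={"db": db, "precision": "ns"},
                         data=line, timeout=5)
    resp.raise_for_status()

write_point("cpu_load", {"host": "worker01", "cluster": "tier1"}, 0.42)
```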
Real-time indoor monitoring system based on wireless sensor networks
NASA Astrophysics Data System (ADS)
Wu, Zhengzhong; Liu, Zilin; Huang, Xiaowei; Liu, Jun
2008-10-01
Wireless sensor networks (WSN) greatly extend our ability to monitor and control the physical world. They can collect and aggregate a huge amount of sensed data to provide continuous and spatially dense observation of the environment. The control and monitoring of indoor atmospheric conditions is an important task with the aim of ensuring suitable working and living spaces for people. However, comprehensive air quality, which includes humidity, temperature, gas concentrations, etc., is not easy to monitor and control. In this paper an indoor WSN monitoring system was developed. In the system, several sensors, such as a temperature sensor, a humidity sensor, and gas sensors, were built into an RF transceiver board for monitoring indoor environmental conditions. The indoor environmental monitoring parameters are transmitted wirelessly to a database server and can then be viewed by administrators through a PC or PDA connected to the local area network. The system was also field-tested and showed reliable and robust performance.
NASA Astrophysics Data System (ADS)
Hortos, William S.
2010-04-01
Broadband wireless access standards, together with advances in the development of commercial sensing and actuator devices, enable the feasibility of a consumer service for a multi-sensor system that monitors the conditions within a residence or office: the environment/infrastructure, patient-occupant health, and physical security. The proposed service is a broadband reimplementation and combination of existing services to allow on-demand reports on and management of the conditions by remote subscribers. The flow of on-demand reports to subscribers and to specialists contracted to mitigate out-of-tolerance conditions is the foreground process. Service subscribers for an over-the-horizon connected home/office (OCHO) monitoring system are the occupant of the premises and agencies, contracted by the service provider, to mitigate or resolve any observed out-of-tolerance condition(s) at the premises. Collectively, these parties are the foreground users of the OCHO system; the implemented wireless standards allow the foreground users to be mobile as they request situation reports on demand from the subsystems on remote conditions that comprise OCHO via wireless devices. An OCHO subscriber, i.e., a foreground user, may select the level of detail found in on-demand reports, i.e., the amount of information displayed in the report of monitored conditions at the premises. This is one context of system operations. While foreground reports are sent only periodically to subscribers, the information generated by the monitored conditions at the premises is continuous and is transferred to a background configuration of servers on which databases reside. These databases are each used, generally, in non-real time, for the assessment and management of situations defined by attributes like those being monitored in the foreground by OCHO. This is the second context of system operations. Context awareness and management of conditions at the premises by a second group of analysts and decision makers who extract information from the OCHO data in the databases form the foundation of the situation management problem.
A Database Approach for Predicting and Monitoring Baked Anode Properties
NASA Astrophysics Data System (ADS)
Lauzon-Gauthier, Julien; Duchesne, Carl; Tessier, Jayson
2012-11-01
The baked anode quality control strategy currently used by most carbon plants based on testing anode core samples in the laboratory is inadequate for facing increased raw material variability. The low core sampling rate limited by lab capacity and the common practice of reporting averaged properties based on some anode population mask a significant amount of individual anode variability. In addition, lab results are typically available a few weeks after production and the anodes are often already set in the reduction cells preventing early remedial actions when necessary. A database approach is proposed in this work to develop a soft-sensor for predicting individual baked anode properties at the end of baking cycle. A large historical database including raw material properties, process operating parameters and anode core data was collected from a modern Alcoa plant. A multivariate latent variable PLS regression method was used for analyzing the large database and building the soft-sensor model. It is shown that the general low frequency trends in most anode physical and mechanical properties driven by raw material changes are very well captured by the model. Improvements in the data infrastructure (instrumentation, sampling frequency and location) will be necessary for predicting higher frequency variations in individual baked anode properties. This paper also demonstrates how multivariate latent variable models can be interpreted against process knowledge and used for real-time process monitoring of carbon plants, and detection of faults and abnormal operation.
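A minimal sketch of a latent-variable PLS soft-sensor of the general kind described, using scikit-learn: fit a PLS model on historical inputs (raw-material properties, process parameters) and predict anode core properties for new anodes. The data here are synthetic and the variable dimensions are assumptions, not the plant's actual database.

```python
# Sketch of a PLS latent-variable soft-sensor on synthetic data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                    # e.g., coke/pitch properties, process settings
Y = X[:, :3] @ rng.normal(size=(3, 2)) + 0.1 * rng.normal(size=(500, 2))  # e.g., density, strength

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=3)               # number of latent variables
pls.fit(X_tr, Y_tr)
print("R^2 on held-out anodes:", pls.score(X_te, Y_te))
Y_hat = pls.predict(X_te)                         # soft-sensor predictions for new anodes
```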
Airport take-off noise assessment aimed at identify responsible aircraft classes.
Sanchez-Perez, Luis A; Sanchez-Fernandez, Luis P; Shaout, Adnan; Suarez-Guerra, Sergio
2016-01-15
Assessment of aircraft noise is an important task for today's airports in order to fight environmental noise pollution, given recent findings on the negative effects of exposure on human health. Noise monitoring and estimation around airports mostly use aircraft noise signals only for computing statistical indicators and depend on additional data sources to determine required inputs such as the aircraft class responsible for noise pollution. In this sense, efforts have been made to improve noise monitoring and estimation systems by creating methods for obtaining more information from aircraft noise signals, especially real-time aircraft class recognition. Consequently, this paper proposes a multilayer neural-fuzzy model for aircraft class recognition based on take-off noise signal segmentation. It uses a fuzzy inference system to build a final response for each class p based on the aggregation of K parallel neural network outputs Op(k) with respect to Linear Predictive Coding (LPC) features extracted from K adjacent signal segments. Based on extensive experiments over two databases with real-time take-off noise measurements, the proposed model performs better than other methods in the literature, particularly when aircraft classes are strongly correlated to each other. A new strictly cross-checked database is introduced, including more complex classes and real-time take-off noise measurements from modern aircraft. The new model is at least 5% more accurate with respect to the previous database and successfully classifies 87% of measurements in the new database. Copyright © 2015 Elsevier B.V. All rights reserved.
Earthquake forecasting studies using radon time series data in Taiwan
NASA Astrophysics Data System (ADS)
Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong
2017-04-01
For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP), creating a website that can effectively display and help us manage the real-time database.
Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.
Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas
2016-01-01
More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
An expanded mammal mitogenome dataset from Southeast Asia.
Mohd Salleh, Faezah; Ramos-Madrigal, Jazmín; Peñaloza, Fernando; Liu, Shanlin; Mikkel-Holger, S Sinding; Riddhi, P Patel; Martins, Renata; Lenz, Dorina; Fickel, Jörns; Roos, Christian; Shamsir, Mohd Shahir; Azman, Mohammad Shahfiz; Burton, K Lim; Stephen, J Rossiter; Wilting, Andreas; Gilbert, M Thomas P
2017-08-01
Southeast (SE) Asia is 1 of the most biodiverse regions in the world, and it holds approximately 20% of all mammal species. Despite this, the majority of SE Asia's genetic diversity is still poorly characterized. The growing interest in using environmental DNA to assess and monitor SE Asian species, in particular threatened mammals, has created the urgent need to expand the available reference database of mitochondrial barcode and complete mitogenome sequences. We have partially addressed this need by generating 72 new mitogenome sequences reconstructed from DNA isolated from a range of historical and modern tissue samples. Approximately 55 gigabases of raw sequence were generated. From this data, we assembled 72 complete mitogenome sequences, with an average depth of coverage of ×102.9 and ×55.2 for modern samples and historical samples, respectively. This dataset represents 52 species, of which 30 species had no previous mitogenome data available. The mitogenomes were geotagged to their sampling location, where known, to display a detailed geographical distribution of the species. Our new database of 52 taxa will strongly enhance the utility of environmental DNA approaches for monitoring mammals in SE Asia as it greatly increases the likelihoods that identification of metabarcoding sequencing reads can be assigned to reference sequences. This magnifies the confidence in species detections and thus allows more robust surveys and monitoring programmes of SE Asia's threatened mammal biodiversity. The extensive collections of historical samples from SE Asia in western and SE Asian museums should serve as additional valuable material to further enrich this reference database. © The Author 2017. Published by Oxford University Press.
Carrara, Marta; Carozzi, Luca; Moss, Travis J; de Pasquale, Marco; Cerutti, Sergio; Lake, Douglas E; Moorman, J Randall; Ferrario, Manuela
2015-01-01
Identification of atrial fibrillation (AF) is a clinical imperative. Heartbeat interval time series are increasingly available from personal monitors, allowing new opportunity for AF diagnosis. Previously, we devised numerical algorithms for identification of normal sinus rhythm (NSR), AF, and SR with frequent ectopy using dynamical measures of heart rate. Here, we wished to validate them in the canonical MIT-BIH ECG databases. We tested algorithms on the NSR, AF and arrhythmia databases. When the databases were combined, the positive predictive value of the new algorithms exceeded 95% for NSR and AF, and was 40% for SR with ectopy. Further, dynamical measures did not distinguish atrial from ventricular ectopy. Inspection of individual 24-hour records showed good correlation of observed and predicted rhythms. Heart rate dynamical measures are effective ingredients in numerical algorithms to classify cardiac rhythm from the heartbeat intervals time series alone. Copyright © 2015 Elsevier Inc. All rights reserved.
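To make the idea of heart-rate dynamical measures concrete, the sketch below computes a few simple statistics from an RR-interval series (coefficient of variation, fraction of successive differences above 50 ms, RMSSD) and contrasts a regular with an irregular rhythm. These particular measures and the synthetic series are illustrative only, not the authors' validated algorithm.

```python
# Sketch: simple dynamical measures of an RR-interval series (seconds).
import numpy as np

def rr_dynamics(rr_s):
    rr = np.asarray(rr_s, dtype=float)
    drr = np.diff(rr)
    cv = rr.std() / rr.mean()               # coefficient of variation
    prr50 = np.mean(np.abs(drr) > 0.05)     # fraction of successive changes > 50 ms
    rmssd = np.sqrt(np.mean(drr ** 2))      # root mean square of successive differences
    return cv, prr50, rmssd

regular = 0.80 + 0.01 * np.random.default_rng(1).standard_normal(300)   # NSR-like
irregular = np.random.default_rng(2).uniform(0.4, 1.2, 300)             # AF-like
print("NSR-like :", rr_dynamics(regular))
print("AF-like  :", rr_dynamics(irregular))
```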
Database of episode-integrated solar energetic proton fluences
NASA Astrophysics Data System (ADS)
Robinson, Zachary D.; Adams, James H.; Xapsos, Michael A.; Stauffer, Craig A.
2018-04-01
A new database of proton episode-integrated fluences is described. This database contains data from two different instruments on multiple satellites. The data are from instruments on the Interplanetary Monitoring Platform-8 (IMP8) and the Geostationary Operational Environmental Satellites (GOES) series. A method to normalize one set of data to the other is presented to create a seamless database spanning 1973 to 2016. A discussion of some of the characteristics that episodes exhibit is presented, including episode duration and number of peaks. As an example of what can be understood about episodes, the July 4, 2012 episode is examined in detail. The coronal mass ejections and solar flares that caused many of the fluctuations of the proton flux seen at Earth are associated with peaks in the proton flux during this episode. The reasoning for each choice is laid out to provide a reference for how CME and solar flare associations are made.
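Episode-integrated fluence is, in essence, the time integral of the proton flux over the episode window. The sketch below applies the trapezoid rule to a synthetic flux series; the episode boundaries and numbers are made up for illustration and are not GOES or IMP8 data.

```python
# Sketch: episode-integrated fluence from a synthetic >10 MeV proton flux series.
import numpy as np

t = np.arange(0.0, 48 * 3600, 300.0)                              # 48 h of 5-min samples, seconds
flux = 1.0 + 500.0 * np.exp(-0.5 * ((t - 6.0e4) / 1.5e4) ** 2)    # quiet background plus one peak

episode = (t > 3.0e4) & (t < 1.2e5)                               # assumed episode boundaries
tf, ff = t[episode], flux[episode]
fluence = np.sum(0.5 * (ff[1:] + ff[:-1]) * np.diff(tf))          # trapezoid rule
print(f"Episode-integrated fluence: {fluence:.3e} protons cm^-2 sr^-1")
```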
NASA Astrophysics Data System (ADS)
Scanu, Sergio; Peviani, Maximo; Carli, Filippo Maria; Paladini de Mendoza, Francesco; Piermattei, Viviana; Bonamano, Simone; Marcelli, Marco
2015-04-01
This work proposes a multidisciplinary approach in which wave power potential maps are used as a baseline for the application of environmental monitoring techniques identified through the use of a Database for Environmental Monitoring Techniques and Equipment (DEMTE), derived in the frame of the project "Marine Renewables Infrastructure Network for Emerging Energy Technologies" (Marinet - FP7). This approach aims to standardize the monitoring of the marine environment during the installation, operation and decommissioning of Marine Energy Conversion Systems. The database has been obtained through the collection of techniques and instrumentation available among the partners of the consortium, in relation to all marine environmental components potentially affected by any impacts. Furthermore, in order to plan marine energy conversion schemes, the wave potential was assessed at regional and local scales using a numerical modelling downscaling methodology. The regional scale led to the elaboration of the Italian Wave Power Atlas, while the local scale led to the definition of nearshore hot spots useful for planning device installation along the Latium coast. The present work focuses on the application of environmental monitoring techniques identified in the DEMTE at the hotspots derived from the wave potential maps, with particular reference to the biological interaction of the devices and the management of the marine space. The obtained results are the basis for the development of standardized procedures aimed at an effective application of marine environmental monitoring techniques during the installation, operation and decommissioning of Marine Energy Conversion Systems. The present work gives a consistent contribution to overcoming non-technological barriers in the concession procedures, as far as the protection of the marine environment is concerned.
Global Scale Remote Sensing Monitoring of Endorheic Lake Systems
NASA Astrophysics Data System (ADS)
Scuderi, L. A.
2010-12-01
Semi-arid regions of the world contain thousands of endorheic lakes in large shallow basins. Due to their generally remote locations few are continuously monitored. Documentation of recent variability is essential to assessing how endorheic lakes respond to short-term meteorological conditions and longer-term decadal-scale climatic variability and is critical in determining future disturbance of hydrological regimes with respect to predicted warming and drying in the mid-latitudes. Short- and long-term departures from climatic averages, rapid environmental shifts and increased population pressures may result in significant fluctuations in the hydrologic budgets of these lakes and adversely impact endorheic lake/basin ecosystems. Information on flooding variability is also critical in estimating changes in P/E balances and on the production of exposed and easily deflated surfaces that may impact dust loading locally and regionally. In order to provide information on how these lakes respond we need to understand how entire systems respond hydrologically to different climatic inputs. This requires monitoring and analysis of regional to continental-scale systems. To date, this level of monitoring has not been achieved in an operational system. In order to assess the possibility of creating a global-scale lake inundation database we analyzed two contrasting lake systems in western North America (Mexico and New Mexico, USA) and China (Inner Mongolia). We asked two major questions: 1) is it possible to quickly and accurately quantify current lake inundation events in near real time using remote sensing? and, 2) is it possible to differentiate variable meteorological sources and resultant lake inundation responses using this type of database? With respect to these results we outline an automated lake monitoring approach using MODIS data and real-time processing systems that may provide future global monitoring capabilities.
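One common way to flag inundated pixels automatically from optical imagery is a water index threshold; the sketch below applies the McFeeters NDWI (green minus NIR over their sum) with a zero threshold to toy reflectance arrays. This is a generic illustration of the idea, not the authors' operational MODIS algorithm.

```python
# Sketch: NDWI-based water mask from green and NIR reflectance.
import numpy as np

def water_mask(green, nir, threshold=0.0):
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    ndwi = (green - nir) / (green + nir + 1e-12)   # avoid division by zero
    return ndwi > threshold

green = np.array([[0.12, 0.30], [0.10, 0.28]])     # toy 2x2 reflectance tiles
nir = np.array([[0.25, 0.05], [0.22, 0.06]])
mask = water_mask(green, nir)
print(mask)                                        # True where water is indicated
print("Inundated fraction:", mask.mean())
```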
Farnham, David J; Gibson, Rebecca A; Hsueh, Diana Y; McGillis, Wade R; Culligan, Patricia J; Zain, Nina; Buchanan, Rob
2017-02-15
To protect recreational water users from waterborne pathogen exposure, it is crucial that waterways are monitored for the presence of harmful bacteria. In NYC, a citizen science campaign is monitoring waterways impacted by inputs of storm water and untreated sewage during periods of rainfall. However, the spatial and temporal scales over which the monitoring program can sample are constrained by cost and time, thus hindering the construction of databases that benefit both scientists and citizens. In this study, we first illustrate the scientific value of a citizen scientist monitoring campaign by using the data collected through the campaign to characterize the seasonal variability of sampled bacterial concentration as well as its response to antecedent rainfall. Second, we examine the efficacy of the HyServe Compact Dry ETC method, a lower cost and time-efficient alternative to the EPA-approved IDEXX Enterolert method for fecal indicator monitoring, through a paired sample comparison of IDEXX and HyServe (total of 424 paired samples). The HyServe and IDEXX methods return the same result for over 80% of the samples with regard to whether a water sample is above or below the EPA's recreational water quality criteria for a single sample of 110 enterococci per 100mL. The HyServe method classified as unsafe 90% of the 119 water samples that were classified as having unsafe enterococci concentrations by the more established IDEXX method. This study seeks to encourage other scientists to engage with citizen scientist communities and to also pursue the development of cost- and time-efficient methodologies to sample environmental variables that are not easily collected or analyzed in an automated manner. Copyright © 2016 Elsevier B.V. All rights reserved.
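The paired-method comparison reduces to classifying each paired measurement against the 110 per 100 mL single-sample criterion and tallying agreement; the sketch below shows that calculation on a handful of made-up pairs (the study used 424 real ones).

```python
# Sketch: same-side agreement of two methods against a single-sample criterion.
import numpy as np

THRESHOLD = 110  # enterococci per 100 mL

idexx = np.array([20, 150, 980, 45, 300])      # reference method (illustrative values)
hyserve = np.array([15, 130, 1200, 120, 250])  # candidate method (illustrative values)

idexx_unsafe = idexx > THRESHOLD
hyserve_unsafe = hyserve > THRESHOLD

agreement = np.mean(idexx_unsafe == hyserve_unsafe)
sensitivity = np.mean(hyserve_unsafe[idexx_unsafe])   # unsafe samples also flagged by HyServe
print(f"Same-side agreement: {agreement:.0%}")
print(f"Unsafe samples detected by HyServe: {sensitivity:.0%}")
```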
A new efficient method to monitor precocious puberty nationwide in France.
Rigou, Annabel; Le Moal, Joëlle; Léger, Juliane; Le Tertre, Alain; Carel, Jean-Claude
2018-02-01
Clinical precocious puberty (PP) is a disease reputed to be on the increase and suspected to be linked to endocrine disrupting chemical (EDC) exposure. Population-based epidemiological data are lacking in France and scarce elsewhere. We assessed the feasibility of monitoring PP nationwide in France in this context, using an existing nationwide database, the French National Health Insurance Information System. Here, we present the method we used, with a step-by-step approach, to build and select the most suitable indicator. We built three indicators reflecting the incidence of idiopathic central precocious puberty (ICPP), the most frequent form of PP, and we compared these indicators according to their strengths and weaknesses with respect to surveillance purposes. Monitoring ICPP in France proved feasible using a drug reimbursement indicator. Our method is cost efficient and highly relevant in public health surveillance. Our step-by-step approach proved helpful to achieve this project and could be proposed for assessing the feasibility of monitoring health outcomes of interest using existing databases. What is known: • Precocious puberty (PP) is suspected to be related to EDC exposure and it is believed to be on the increase in France and in other countries. • Very few epidemiologic data on PP are currently available in the world at the national scale. What is new: • This is the first study describing a method to monitor the most frequent form of PP, idiopathic central PP (ICPP), nationwide in a cost-efficient way, using health insurance databases. • This cost-effective method will allow us to estimate and monitor the incidence of ICPP in France and to analyze spatial variations at a very precise scale, which will be very useful to examine the role of environmental exposures, especially to EDCs.
Nußbeck, Gunnar; Gök, Murat
2013-01-01
This review gives a comprehensive overview on the technical perspective of personal health monitoring. It is designed to build a mutual basis for the project partners of the PHM-Ethics project. A literature search was conducted to screen pertinent literature databases for relevant publications. All review papers that were retrieved were analyzed. The increasing number of publications that are published per year shows that the field of personal health monitoring is of growing interest in the research community. Most publications deal with telemonitoring, thus forming the core technology of personal health monitoring. Measured parameters, fields of application, participants and stakeholders are described. Moreover an outlook on information and communication technology that foster the integration possibilities of personal health monitoring into decision making and remote monitoring of individual people's health is provided. The removal of the technological barriers opens new perspectives in health and health care delivery using home monitoring applications.
Processing of the WLCG monitoring data using NoSQL
NASA Astrophysics Data System (ADS)
Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.
2014-06-01
The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.
Toward Phase IV, Populating the WOVOdat Database
NASA Astrophysics Data System (ADS)
Ratdomopurbo, A.; Newhall, C. G.; Schwandner, F. M.; Selva, J.; Ueda, H.
2009-12-01
One of the challenges for volcanologists is the fact that more and more people are likely to live on volcanic slopes. Information about volcanic activity during unrest should be accurate and rapidly distributed. As unrest may lead to eruption, evacuation may be necessary to minimize damage and casualties. The decision to evacuate people is usually based on the interpretation of monitoring data. Over the past several decades, volcano monitoring has used increasingly sophisticated instruments. A huge volume of data is collected in order to understand the state of activity and behaviour of a volcano. WOVOdat, the World Organization of Volcano Observatories (WOVO) Database of Volcanic Unrest, will provide context within which scientists can interpret the state of their own volcano, during and between crises. After a decision during the 2000 IAVCEI General Assembly to create WOVOdat, development has passed through several phases, from Concept Development (Phase-I in 2000-2002) to Database Design (Phase-II, 2003-2006) and Pilot Testing (Phase-III in 2007-2008). For WOVOdat to be operational, there are still two steps to complete: Database Population (Phase-IV) and Enhancement and Maintenance (Phase-V). Since January 2009, the WOVOdat project is hosted by the Earth Observatory of Singapore for at least a 5-year period. According to the original planning in 2002, this 5-year period will be used for completing Phase-IV. As the WOVOdat design is not yet tested for all types of data, 2009 is still reserved for building the back-end relational database management system (RDBMS) of WOVOdat and testing it with more complex data. Fine-tuning of the WOVOdat RDBMS design is being done with each new upload of observatory data. The next and main phase of WOVOdat development will be data population, managing data transfer from multiple observatory formats to the WOVOdat format. Data population will depend on two important things: the availability of SQL databases in volcano observatories and their data-sharing policies. Hence, a strong collaboration with every WOVO observatory is important. For some volcanoes where the data are not in an SQL system, the WOVOdat project will help scientists working on the volcano to start building an SQL database.
Monitoring-induced disruption in skilled typewriting.
Snyder, Kristy M; Logan, Gordon D
2013-10-01
It is often disruptive to attend to the details of one's expert performance. The current work presents four experiments that utilized a monitor-to-report protocol to evaluate the sufficiency of three accounts of monitoring-induced disruption. The inhibition hypothesis states that disruption results from costs associated with preparing to withhold inappropriate responses. The dual-task hypothesis states that disruption results from maintaining monitored information in working memory. The implicit-explicit hypothesis states that disruption results from explicitly monitoring details of performance that are normally implicit. The findings suggest that all three hypotheses are sufficient to produce disruption, but inhibition and dual-task costs are not necessary. Experiment 1 showed that monitoring to report was disruptive even when there was no requirement to inhibit. Experiment 2 showed that maintaining information in working memory caused some disruption but much less than monitoring to report. Experiment 4 showed that monitoring to inhibit was more disruptive than monitoring to report, suggesting that monitoring is more disruptive when it is combined with other task requirements, such as inhibition. PsycINFO Database Record (c) 2013 APA, all rights reserved.
77 FR 45965 - Determination of Attainment for the Paul Spur/Douglas PM10
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-02
... plans, and based on the findings of our technical system audit report, ADEQ's monitoring network meets... to EPA's Air Quality System (AQS) database as quality-assured. Next, we reviewed the ambient PM 10...
Experimental Evaluation of Fuzzy Logic Control of a Flexible Arm Manipulator
1993-12-09
...temperature into a fuzzy context), and humidity is musty, then air conditioner power is high. The database and knowledge base combine to form the... In this case, the output, perhaps air conditioner power, would be medium to a degree of 50%. However, as shown in Table 3.2, there are more possible...
Lower Granite Dam Smolt Monitoring Program, 1998 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verhey, Peter; Ross, Doug; Morrill, Charles
1998-12-01
The 1998 fish collection season at Lower Granite was characterized by relatively moderate spring flows and spill, moderate levels of debris, cool spring, warm summer and fall water temperatures, and increased chinook numbers, particularly wild subyearling chinook collected and transported. The Fish Passage Center's Smolt Monitoring Program is designed to provide a consistent, real-time database on fish passage and document the migrational characteristics of the many stocks of salmon and steelhead in the Columbia Basin.
Daniel J. Isaak; Seth J. Wenger; Erin E. Peterson; Jay M. Ver Hoef; David E. Nagel; Charles H. Luce; Steven W. Hostetler; Jason B. Dunham; Brett B. Roper; Sherry P. Wollrab; Gwynne L. Chandler; Dona L. Horan; Sharon Parkes-Payne
2017-01-01
Thermal regimes are fundamental determinants of aquatic ecosystems, which makes description and prediction of temperatures critical during a period of rapid global change. The advent of inexpensive temperature sensors dramatically increased monitoring in recent decades, and although most monitoring is done by individuals for agency-specific purposes, collectively these...
Remily-Wood, Elizabeth R.; Liu, Richard Z.; Xiang, Yun; Chen, Yi; Thomas, C. Eric; Rajyaguru, Neal; Kaufman, Laura M.; Ochoa, Joana E.; Hazlehurst, Lori; Pinilla-Ibarz, Javier; Lancet, Jeffrey; Zhang, Guolin; Haura, Eric; Shibata, David; Yeatman, Timothy; Smalley, Keiran S.M.; Dalton, William S.; Huang, Emina; Scott, Ed; Bloom, Gregory C.; Eschrich, Steven A.; Koomen, John M.
2012-01-01
Purpose The Quantitative Assay Database (QuAD), http://proteome.moffitt.org/QUAD/, facilitates widespread implementation of quantitative mass spectrometry in cancer biology and clinical research through sharing of methods and reagents for monitoring protein expression and modification. Experimental Design Liquid chromatography coupled to multiple reaction monitoring mass spectrometry (LC-MRM) assays are developed using SDS-PAGE fractionated lysates from cancer cell lines. Pathway maps created using GeneGO Metacore provide the biological relationships between proteins and illustrate concepts for multiplexed analysis; each protein can be selected to examine assay development at the protein and peptide level. Results The coupling of SDS-PAGE and LC-MRM screening has been used to detect 876 peptides from 218 cancer-related proteins in model systems including colon, lung, melanoma, leukemias, and myeloma, which has led to the development of 95 quantitative assays including stable-isotope labeled peptide standards. Methods are published online and peptide standards are made available to the research community. Protein expression measurements for heat shock proteins, including a comparison with ELISA and monitoring response to the HSP90 inhibitor, 17-DMAG, are used to illustrate the components of the QuAD and its potential utility. Conclusions and Clinical Relevance This resource enables quantitative assessment of protein components of signaling pathways and biological processes and holds promise for systematic investigation of treatment responses in cancer. PMID:21656910
Overview of four prescription monitoring/review programs in Canada.
Furlan, Andrea D; MacDougall, Peter; Pellerin, Denise; Shaw, Karen; Spitzig, Doug; Wilson, Galt; Wright, Janet
2014-01-01
Prescription monitoring or review programs collect information about prescription and dispensing of controlled substances for the purposes of monitoring, analysis and education. In Canada, it is the responsibility of the provincial institutions to organize, maintain and run such programs. To describe the characteristics of four provincial programs that have been in place for >6 years. The managers of the prescription monitoring/review programs of four provinces (British Columbia, Alberta, Saskatchewan and Nova Scotia) were invited to present at a symposium at the Canadian Pain Society in May 2012. In preparation for the symposium, one author collected and summarized the information. Three provinces have a mix of review and monitoring programs; the program in British Columbia is purely for review and education. All programs include controlled substances (narcotics, barbiturates and psychostimulants); however, other substances are differentially included among the programs: anabolic steroids are included in Saskatchewan and Nova Scotia; and cannabinoids are included in British Columbia and Nova Scotia. Access to the database is available to pharmacists in all provinces. Physicians need consent from patients in British Columbia, and only professionals registered with the program can access the database in Alberta. The definition of inappropriate prescribing and dispensing is not uniform. Double doctoring, double pharmacy and high-volume dispensing are considered to be red flags in all programs. There is variability among Canadian provinces in managing prescription monitoring/review programs.
NASA Astrophysics Data System (ADS)
Lu, Anxin; Wang, Lihong; Chen, Xianzhang
2003-07-01
A major monitoring area within the middle reaches of the Heihe basin was selected. Landsat TM data from the summers of 1990 and 2000 were interpreted on screen, classified, and used to build an environmental investigation database (1:100,000), combined with DEM, land cover/land use, and land type data according to the environmental classification system. Spatial statistical analysis and dynamic comparisons addressing the main environmental problems were then carried out using the database. The dynamic monitoring results for 1990 and 2000 show the following changes in the area of six ground-object classes: land use and agricultural land use increased by 34.17% and 19.47%, respectively; wetland and water bodies also increased, by 6.29% and 8.03%, respectively; unused land increased by 1.73%; and the largest change was in natural/semi-natural vegetation, which decreased by 42.78%. These results met the accuracy requirements, as confirmed by precision examination and spot checks. By combining TM remote sensing data with abundant non-remote-sensing data, investigations of ecology and environment and dynamic monitoring can be carried out efficiently in arid areas. A continuous and marked reduction in the area of natural/semi-natural vegetation is a warning signal of large-scale desertification.
Evolution of the LBT Telemetry System
NASA Astrophysics Data System (ADS)
Summers, K.; Biddick, C.; De La Peña, M. D.; Summers, D.
2014-05-01
The Large Binocular Telescope (LBT) Telescope Control System (TCS) records about 10GB of telemetry data per night. Additionally, the vibration monitoring system records about 9GB of telemetry data per night. Through 2013, we have amassed over 6TB of Hierarchical Data Format (HDF5) files and almost 9TB in a MySQL database of TCS and vibration data. The LBT telemetry system, in its third major revision since 2004, provides the mechanism to capture and store this data. The telemetry system has evolved from a simple HDF file system with MySQL stream definitions within the TCS, to a separate system using a MySQL database system for the definitions and data, and finally to no database use at all, using HDF5 files.
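As an illustration of the database-free approach described above, a telemetry stream can be appended directly to an extendable HDF5 dataset. This is a minimal sketch using h5py; the stream name, record fields, and file layout are hypothetical and not the actual LBT telemetry schema.

    import h5py
    import numpy as np

    # Hypothetical telemetry record for an "enclosure" stream.
    dtype = np.dtype([("timestamp", "f8"), ("azimuth", "f8"), ("elevation", "f8")])

    with h5py.File("telemetry_2014-05-01.h5", "a") as f:
        # Create an extendable dataset the first time the stream is seen.
        if "enclosure" not in f:
            f.create_dataset("enclosure", shape=(0,), maxshape=(None,),
                             dtype=dtype, chunks=True)
        ds = f["enclosure"]
        sample = np.array([(1399000000.0, 180.25, 45.10)], dtype=dtype)
        ds.resize(ds.shape[0] + 1, axis=0)   # append one record
        ds[-1] = sample[0]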
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Letter to Bay Area on Periodic Monitoring
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Andre M.; Johnson, Gary E.; Borde, Amy B.
Pacific Northwest National Laboratory (PNNL) conducted this project for the U.S. Army Corps of Engineers, Portland District (Corps). The purpose of the project is to develop a geospatial, web-accessible database (called “Oncor”) for action effectiveness and related data from monitoring and research efforts for the Columbia Estuary Ecosystem Restoration Program (CEERP). The intent is for the Oncor database to enable synthesis and evaluation, the results of which can then be applied in subsequent CEERP decision-making. This is the first annual report in what is expected to be a 3- to 4-year project, which commenced on February 14, 2012.
Region 7 Policy on Periodic Monitoring for Opacity
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Age 60 study, part III : consolidated database experiments final report.
DOT National Transportation Integrated Search
1994-10-01
This document is one of four products completed as a part of the Age 60 Rule research contract monitored by Pam Della Rocco, Civil Aerospace Medical Institute, Contracting Officer's Technical Representative. This work was performed. This report was a...
The AirData site provides access to yearly summaries of United States air pollution data, taken from EPA's air pollution databases. AirData has information about where air pollution comes from (emissions) and how much pollution is in the air outside our homes and work places (monitoring).
76 FR 63288 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
... ED to monitor CSP grant performance and analyze data related to accountability for academic performance, financial integrity, and program effectiveness. Copies of the proposed information collection... database of current CSP-funded charter schools and award amounts; ED merges performance information...
Urban roadway congestion : annual report
DOT National Transportation Integrated Search
1998-01-01
The annual traffic congestion study is an effort to monitor roadway congestion in major urban areas in the United States. The comparisons to other areas and to previous experiences in each area are facilitated by a database that begins in 1982 and in...
Six steps to an effective denials management program.
Robertson, Brian; Doré, Alexander
2005-09-01
The following six steps can help you manage denials management issues in your organization: Create standard definitions of denial types. Establish a denial hierarchy. Establish a centralized denial database. Develop key performance indicators. Build responsibility matrices. Measure, monitor, and take action.
Real time monitoring of slope stability in eastern Oklahoma.
DOT National Transportation Integrated Search
2014-01-01
There were three primary objectives of the proposed research. The first was to establish a : comprehensive landslide database, the second was to create a first- cut regional landslide map and : the third was to relate safe and stable constructed slop...
Pan European Phenological database (PEP725): a single point of access for European data
NASA Astrophysics Data System (ADS)
Templ, Barbara; Koch, Elisabeth; Bolmgren, Kjell; Ungersböck, Markus; Paul, Anita; Scheifinger, Helfried; Rutishauser, This; Busto, Montserrat; Chmielewski, Frank-M.; Hájková, Lenka; Hodzić, Sabina; Kaspar, Frank; Pietragalla, Barbara; Romero-Fresneda, Ramiro; Tolvanen, Anne; Vučetič, Višnja; Zimmermann, Kirsten; Zust, Ana
2018-06-01
The Pan European Phenology (PEP) project is a European infrastructure to promote and facilitate phenological research, education, and environmental monitoring. The main objective is to maintain and develop a Pan European Phenological database (PEP725) with an open, unrestricted data access for science and education. PEP725 is the successor of the database developed through the COST action 725 "Establishing a European phenological data platform for climatological applications" working as a single access point for European-wide plant phenological data. So far, 32 European meteorological services and project partners from across Europe have joined and supplied data collected by volunteers from 1868 to the present for the PEP725 database. Most of the partners actively provide data on a regular basis. The database presently holds almost 12 million records, about 46 growing stages and 265 plant species (including cultivars), and can be accessed via the project website.
Menditto, Enrica; Bolufer De Gea, Angela; Cahir, Caitriona; Marengoni, Alessandra; Riegler, Salvatore; Fico, Giuseppe; Costa, Elisio; Monaco, Alessandro; Pecorelli, Sergio; Pani, Luca; Prados-Torres, Alexandra
2016-01-01
Computerized health care databases have been widely described as an excellent opportunity for research. The availability of "big data" has brought about a wave of innovation in projects when conducting health services research. Most of the available secondary data sources are restricted to the geographical scope of a given country and present heterogeneous structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on "adherence to prescription and medical plans" identified the use of observational and large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners with the aim of improving data sharing on a European level. A total of six databases belonging to three different European countries (Spain, Republic of Ireland, and Italy) were included in the analysis. Preliminary results suggest that there are some similarities. However, these results should be applied in different contexts and European countries, supporting the idea that large European studies should be designed in order to get the most of already available databases.
Gilligan, Tony; Alamgir, Hasanat
2008-01-01
Healthcare workers are exposed to a variety of work-related hazards including biological, chemical, physical, ergonomic, psychological hazards; and workplace violence. The Occupational Health and Safety Agency for Healthcare in British Columbia (OHSAH), in conjunction with British Columbia (BC) health regions, developed and implemented a comprehensive surveillance system that tracks occupational exposures and stressors as well as injuries and illnesses among a defined population of healthcare workers. Workplace Health Indicator Tracking and Evaluation (WHITE) is a secure operational database, used for data entry and transaction reporting. It has five modules: Incident Investigation, Case Management, Employee Health, Health and Safety, and Early Intervention/Return to Work. Since the WHITE database was first introduced into BC in 2004, it has tracked the health of 84,318 healthcare workers (120,244 jobs), representing 35,927 recorded incidents, resulting in 18,322 workers' compensation claims. Currently, four of BC's six healthcare regions are tracking and analyzing incidents and the health of healthcare workers using WHITE, providing OHSAH and healthcare stakeholders with comparative performance indicators on workplace health and safety. A number of scientific manuscripts have also been published in peer-reviewed journals. The WHITE database has been very useful for descriptive epidemiological studies, monitoring health risk factors, benchmarking, and evaluating interventions.
Tran, Le-Thuy T.; Brewster, Philip J.; Chidambaram, Valliammai; Hurdle, John F.
2017-01-01
This study presents a method laying the groundwork for systematically monitoring food quality and the healthfulness of consumers’ point-of-sale grocery purchases. The method automates the process of identifying United States Department of Agriculture (USDA) Food Patterns Equivalent Database (FPED) components of grocery food items. The input to the process is the compact abbreviated descriptions of food items that are similar to those appearing on the point-of-sale sales receipts of most food retailers. The FPED components of grocery food items are identified using Natural Language Processing techniques combined with a collection of food concept maps and relationships that are manually built using the USDA Food and Nutrient Database for Dietary Studies, the USDA National Nutrient Database for Standard Reference, the What We Eat In America food categories, and the hierarchical organization of food items used by many grocery stores. We have established the construct validity of the method using data from the National Health and Nutrition Examination Survey, but further evaluation of validity and reliability will require a large-scale reference standard with known grocery food quality measures. Here we evaluate the method’s utility in identifying the FPED components of grocery food items available in a large sample of retail grocery sales data (~190 million transaction records). PMID:28475153
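A minimal sketch of the kind of lookup the method automates: normalize an abbreviated receipt description and map it to FPED components through a concept map. The token list and component names below are invented for illustration and are not the authors' actual concept maps or NLP pipeline.

    import re

    # Hypothetical concept map: normalized receipt tokens -> FPED component(s).
    CONCEPT_MAP = {
        "whl wht bread": ["Whole grains"],
        "2% milk": ["Dairy"],
        "grnd beef": ["Meats"],
        "froz broccoli": ["Dark-green vegetables"],
    }

    def normalize(description: str) -> str:
        """Lower-case, strip punctuation, and collapse whitespace in a receipt line."""
        text = re.sub(r"[^a-z0-9% ]", " ", description.lower())
        return re.sub(r"\s+", " ", text).strip()

    def fped_components(description: str) -> list[str]:
        """Return FPED components for an abbreviated receipt description, if known."""
        return CONCEPT_MAP.get(normalize(description), [])

    print(fped_components("WHL WHT BREAD"))   # ['Whole grains']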
BIO-Plex Information System Concept
NASA Technical Reports Server (NTRS)
Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)
1999-01-01
This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking, architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data sharing applications for everything, from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real time subsystems, between these subsystems and central controller, between the central controller and system level planning and analysis application software, and between the system level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.
Importance of Data Management in a Long-term Biological Monitoring Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, Sigurd W; Brandt, Craig C; McCracken, Kitty
2011-01-01
The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to meeting this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program. An existing relational database was adapted and extended to handle biological data. Data modeling enabled the program's database to process, store, and retrieve its data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishing of standards for sampling site names, taxonomic identification, flagging, and other components. There are limitations. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
Importance of Data Management in a Long-Term Biological Monitoring Program
NASA Astrophysics Data System (ADS)
Christensen, Sigurd W.; Brandt, Craig C.; McCracken, Mary K.
2011-06-01
The long-term Biological Monitoring and Abatement Program (BMAP) has always needed to collect and retain high-quality data on which to base its assessments of ecological status of streams and their recovery after remediation. Its formal quality assurance, data processing, and data management components all contribute to meeting this need. The Quality Assurance Program comprehensively addresses requirements from various institutions, funders, and regulators, and includes a data management component. Centralized data management began a few years into the program when an existing relational database was adapted and extended to handle biological data. The database's main data tables and several key reference tables are described. One of the most important related activities supporting long-term analyses was the establishing of standards for sampling site names, taxonomic identification, flagging, and other components. The implemented relational database supports the transmittal of data to the Oak Ridge Environmental Information System (OREIS) as the permanent repository. We also discuss some limitations to our implementation. Some types of program data were not easily accommodated in the central systems, and many possible data-sharing and integration options are not easily accessible to investigators. From our experience we offer data management advice to other biologically oriented long-term environmental sampling and analysis programs.
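A minimal sketch of the relational layout these abstracts describe, with reference tables enforcing standard sampling-site names and taxonomic identifications and a main data table carrying quality flags. Table and column names are invented for illustration and are not the actual BMAP schema.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE site (            -- reference table: standard sampling-site names
        site_code TEXT PRIMARY KEY,
        stream    TEXT NOT NULL
    );
    CREATE TABLE taxon (           -- reference table: standard taxonomic identifications
        taxon_id  INTEGER PRIMARY KEY,
        name      TEXT UNIQUE NOT NULL
    );
    CREATE TABLE sample (          -- main data table: one row per field observation
        sample_id INTEGER PRIMARY KEY,
        site_code TEXT NOT NULL REFERENCES site(site_code),
        taxon_id  INTEGER NOT NULL REFERENCES taxon(taxon_id),
        sampled   TEXT NOT NULL,   -- ISO date
        count     INTEGER,
        flag      TEXT             -- data-quality flag
    );
    """)
    con.execute("INSERT INTO site VALUES ('EFK23', 'East Fork Poplar Creek')")
    con.execute("INSERT INTO taxon (name) VALUES ('Etheostoma rufilineatum')")
    con.execute("INSERT INTO sample (site_code, taxon_id, sampled, count) "
                "VALUES ('EFK23', 1, '2010-06-15', 12)")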
Sustainable Seas Student Intertidal Monitoring Project at Duxbury Reef in Bolinas, CA
NASA Astrophysics Data System (ADS)
Broad, C.; Soave, K.; Ericson, W.; Raabe, B.; Glazer, R.; Ahuatzi, A.; Pereira, M.; Rainsford, A.
2013-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of this student-run project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species and the requirements for maintaining a healthy, diverse intertidal ecosystem; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal monitoring techniques, and history of the sanctuary. Students identify and count key invertebrate and algae species along two permanent transects and, using randomly determined points, within two permanent 100 m2 areas, three times per year (fall, winter, and late spring). Using the data collected since 2004, we will once again compare population densities, seasonal abundance and long-term population trends of key algal and invertebrate species, including Tegula funebralis, Anthopluera elegantissima and Fucus spp. We will continue to closely monitor algal population densities in within our site in light of the November 2007 San Francisco Bay oil spill that leaked heavy bunker fuel into intertidal habitats around the SF Bay. Future analyses and investigations will include intertidal abiotic factors (including water temperature and human foot-traffic) to enhance insights into the workings of the Duxbury Reef ecosystem, in particular, the high and mid-intertidal zones experiencing the greatest amount of human impacts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, Tatiparthi B. K.; Thomas, Alex D.; Stamatis, Dimitri
The Genomes OnLine Database (GOLD; http://www.genomesonline.org) is a comprehensive online resource to catalog and monitor genetic studies worldwide. GOLD provides up-to-date status on complete and ongoing sequencing projects along with a broad array of curated metadata. Within this paper, we report version 5 (v.5) of the database. The newly designed database schema and web user interface support several new features including the implementation of a four-level (meta)genome project classification system and a simplified intuitive web interface to access reports and launch search tools. The database currently hosts information for about 19 200 studies, 56 000 Biosamples, 56 000 sequencing projects and 39 400 analysis projects. More than just a catalog of worldwide genome projects, GOLD is a manually curated, quality-controlled metadata warehouse. The problems encountered in integrating disparate and varying quality data into GOLD are briefly highlighted. Lastly, GOLD fully supports and follows the Genomic Standards Consortium (GSC) Minimum Information standards.
A user friendly database for use in ALARA job dose assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zodiates, A.M.; Willcock, A.
1995-03-01
The pressurized water reactor (PWR) design chosen for adoption by Nuclear Electric plc was based on the Westinghouse Standard Nuclear Unit Power Plant (SNUPPS). This design was developed to meet the United Kingdom requirements and these improvements are embodied in the Sizewell B plant which will start commercial operation in 1994. A user-friendly database was developed to assist the station in the dose and ALARP assessments of the work expected to be carried out during station operation and outage. The database stores the information in an easily accessible form and enables updating, editing, retrieval, and searches of the information. The database contains job-related information such as job locations, number of workers required, job times, and the expected plant doserates. It also contains the means to flag job requirements such as requirements for temporary shielding, flushing, scaffolding, etc. Typical uses of the database are envisaged to be in the prediction of occupational doses, the identification of high collective and individual dose jobs, use in ALARP assessments, setting of dose targets, monitoring of dose control performance, and others.
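A small sketch of the kind of dose prediction such a database enables, taking collective dose as workers multiplied by job time and plant dose rate, and flagging high-dose jobs for ALARP review. Job names, values, and the review threshold are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        location: str
        workers: int
        hours: float             # expected job time per worker
        dose_rate_msv_h: float   # expected plant dose rate at the work location
        temporary_shielding: bool = False

        def collective_dose_msv(self) -> float:
            """Predicted collective dose = workers x hours x dose rate."""
            return self.workers * self.hours * self.dose_rate_msv_h

    jobs = [
        Job("Steam generator inspection", "SG bay", workers=4, hours=6.0, dose_rate_msv_h=0.08),
        Job("Valve maintenance", "Aux building", workers=2, hours=3.0, dose_rate_msv_h=0.02),
    ]

    # Flag high collective-dose jobs for ALARP review (threshold is illustrative).
    for job in sorted(jobs, key=Job.collective_dose_msv, reverse=True):
        flag = "REVIEW" if job.collective_dose_msv() > 1.0 else "ok"
        print(f"{job.name:32s} {job.collective_dose_msv():5.2f} mSv  {flag}")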
Pivot/Remote: a distributed database for remote data entry in multi-center clinical trials.
Higgins, S B; Jiang, K; Plummer, W D; Edens, T R; Stroud, M J; Swindell, B B; Wheeler, A P; Bernard, G R
1995-01-01
1. INTRODUCTION. Data collection is a critical component of multi-center clinical trials. Clinical trials conducted in intensive care units (ICU) are even more difficult because the acute nature of illnesses in ICU settings requires that masses of data be collected in a short time. More than a thousand data points are routinely collected for each study patient. The majority of clinical trials are still "paper-based," even if a remote data entry (RDE) system is utilized. The typical RDE system consists of a computer housed in the CC office and connected by modem to a centralized data coordinating center (DCC). Study data must first be recorded on a paper case report form (CRF), transcribed into the RDE system, and transmitted to the DCC. This approach requires additional monitoring since both the paper CRF and study database must be verified. The paper-based RDE system cannot take full advantage of automatic data checking routines. Much of the effort (and expense) of a clinical trial is ensuring that study data matches the original patient data. 2. METHODS. We have developed an RDE system, Pivot/Remote, that eliminates the need for paper-based CRFs. It creates an innovative, distributed database. The database resides partially at the study clinical centers (CC) and at the DCC. Pivot/Remote is descended from technology introduced with Pivot [1]. Study data is collected at the bedside with laptop computers. A graphical user interface (GUI) allows the display of electronic CRFs that closely mimic the normal paper-based forms. Data entry time is the same as for paper CRFs. Pull-down menus, displaying the possible responses, simplify the process of entering data. Edit checks are performed on most data items. For example, entered dates must conform to some temporal logic imposed by the study. Data must conform to some acceptable range of values. Calculations, such as computing the subject's age or the APACHE II score, are automatically made as the data is entered. Data that is collected serially (BP, HR, etc.) can be displayed graphically in a trend form along with other related variables. An audit trail is created that automatically tracks all changes to the original data, making it possible to reconstruct the CRF to any point in time. On-line help provides information on the study protocol as well as assistance with the use of the system. Electronic security makes it possible to lock certain parts of the CRF once it has been monitored. Completed CRFs are transmitted to the DCC via electronic mail where it is reviewed and merged into the study database. Questions about subject data are transmitted back to the CC via electronic mail. This approach to maintaining the study database is unique in that the study data files are distributed among the CC and DCC. Until a subject's CRF is monitored (verified against the original patient data residing in the hospital record), it logically resides at the CC where it was collected. Copies are transmitted to the DCC and are only read there. Any pre-monitoring changes must be made to the data at the CC. Once the subject's CRF is monitored, it logically moves to the DCC, and any subsequent changes are made at the DCC with copies of the CRF flowing back to the CC. 3. DISCUSSION. Pivot/Remote eliminates the need for paper forms by utilizing portable computers that can be used at the patient bedside. A GUI makes it possible to quickly enter data. 
Because the user gets instant feedback on possible error conditions, time is saved because the original data is close at hand. The ability to display trended data or variables in the context of other data allows detection of erroneous conditions beyond simple range checks. The logical construction of the database minimizes the problem of managing dual databases (at the CC and DCC) and keeps CC personnel in the loop until all changes are made.
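A minimal sketch of the bedside edit checks the abstract describes: temporal logic on dates, acceptable-range checks, and a derived value computed as data are entered. The field names and limits are illustrative, not the study's actual CRF definitions.

    from datetime import date

    def check_dates(enrollment: date, onset: date) -> list[str]:
        """Temporal logic: symptom onset cannot follow enrollment."""
        return [] if onset <= enrollment else ["Onset date is after enrollment date"]

    def check_range(name: str, value: float, low: float, high: float) -> list[str]:
        """Simple acceptable-range check on a numeric CRF item."""
        return [] if low <= value <= high else [f"{name}={value} outside {low}-{high}"]

    def derived_age(birth: date, enrollment: date) -> int:
        """Derived value computed automatically as the form is filled in."""
        return enrollment.year - birth.year - (
            (enrollment.month, enrollment.day) < (birth.month, birth.day))

    errors = (check_dates(date(1995, 3, 10), date(1995, 3, 8))
              + check_range("heart_rate", 52, 30, 250))
    print(errors or "no edit-check errors",
          "age:", derived_age(date(1940, 7, 2), date(1995, 3, 10)))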
MERCURY IN MARINE LIFE DATABASE
The purpose of the Mercury in Marine Life Project is to organize information on estuarine and marine species so that EPA can better understand both the extent of monitoring for mercury and level of mercury contamination in the biota of coastal environments. This report follows a ...
MRLC-LAND COVER MAPPING, ACCURACY ASSESSMENT AND APPLICATION RESEARCH
The National Land Cover Database (NLCD), produced by the Multi-Resolution Land Characteristics (MRLC) provides consistently classified land-cover and ancillary data for the United States. These data support many of the modeling and monitoring efforts related to GPRA goals of Cle...
ERIC Educational Resources Information Center
Scheideman, Dale; Dufresne, Ray
2001-01-01
Nevada's Clark County, the fastest growing school district in the nation, uses a life-cycle facilities management approach that monitors the individual components of each building on a database. The district's 10-year building program is addressing facilities infrastructure renewal, deferred maintenance, replacement, and new school construction.…
DOT National Transportation Integrated Search
2015-11-01
One of the most efficient ways to solve the damage detection problem using the statistical pattern recognition : approach is that of exploiting the methods of outlier analysis. Cast within the pattern recognition framework, : damage detection assesse...
Enabling Scientists: Serving Sci-Tech Library Users with Disabilities.
ERIC Educational Resources Information Center
Coonin, Bryna
2001-01-01
Discusses how librarians in scientific and technical libraries can contribute to an accessible electronic library environment for users with disabilities to ensure independent access to information. Topics include relevant assistive technologies; creating accessible Web pages; monitoring accessibility of electronic databases; preparing accessible…
An examination of the operational error database for air route traffic control centers.
DOT National Transportation Integrated Search
1993-12-01
Monitoring the frequency and determining the causes of operational errors - defined as the loss of prescribed separation between aircraft - is one approach to assessing the operational safety of the air traffic control system. The Federal Aviation Ad...
@Caribbean_LCC | CARIBBEAN LANDSCAPE CONSERVATION COOPERATIVE (A2)
The Caribbean Agriculture, Forestry and Climate Governance Database is a compendium of NGOs and coalition groups working on conservation and ecosystem governance in the Caribbean.
ERIC Educational Resources Information Center
Nitecki, Danuta A.
1985-01-01
Observations on impact of WILSONLINE (online access to H. W. Wilson's indexes) on the industry, library services, and end users are drawn from interviews with reference librarians, H. W. Wilson administrative staff, and monitors of database industry. Issues addressed include quality, thoroughness, ease of use, and marketing (demand, pricing,…
Model-Based, Noninvasive Monitoring of Intracranial Pressure
2012-10-01
...(nICP) estimate requires simultaneous measurement of the waveforms of arterial blood pressure (ABP), obtained via radial artery catheter or finger... initial database comprises subarachnoid hemorrhage patients in neuro-intensive care at our partner hospital, for whom ICP, ABP and CBFV are currently...
Draft secure medical database standard.
Pangalos, George
2002-01-01
Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support: high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is also examined in detail. The current national and international efforts in the area are studied, and an overview of research work in the area is given. The document also presents what is, to our knowledge, the most complete set of security guidelines for the development and operation of medical database systems.
The Eruption Forecasting Information System (EFIS) database project
NASA Astrophysics Data System (ADS)
Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather
2016-04-01
The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) move away from relying on collective memory and toward probability estimation using databases; (2) create databases useful for pattern recognition and for answering common VDAP questions, e.g., how commonly does unrest lead to eruption? how commonly do phreatic eruptions portend magmatic eruptions, and what is the range of antecedence times?; (3) create generic probabilistic event trees using global data for different volcano 'types'; (4) create background, volcano-specific, probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis; and (5) quantify and communicate uncertainty in probabilities. A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc., and will be initially populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.
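A toy sketch of the kind of query the unrest-chronology module is meant to support, namely how often documented unrest culminates in eruption. The table layout and the example episodes are invented for illustration and are not the EFIS schema or its data.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE unrest_episode (
            volcano TEXT,
            start_date TEXT,
            led_to_eruption INTEGER   -- 1 if the episode culminated in an eruption
        )""")
    con.executemany(
        "INSERT INTO unrest_episode VALUES (?, ?, ?)",
        [("Volcano A", "2008-09-01", 1),
         ("Volcano B", "2004-07-01", 0),
         ("Volcano C", "2011-07-01", 1)],
    )
    total, erupted = con.execute(
        "SELECT COUNT(*), SUM(led_to_eruption) FROM unrest_episode").fetchone()
    print(f"Unrest episodes: {total}, leading to eruption: {erupted} "
          f"({erupted / total:.0%})")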
Real Time Monitor of Grid job executions
NASA Astrophysics Data System (ADS)
Colling, D. J.; Martyniak, J.; McGough, A. S.; Křenek, A.; Sitera, J.; Mulač, M.; Dvořák, F.
2010-04-01
In this paper we describe the architecture and operation of the Real Time Monitor (RTM), developed by the Grid team in the HEP group at Imperial College London. This is arguably the most popular dissemination tool within the EGEE [1] Grid, having been used on many occasions, including the GridFest and LHC inauguration events held at CERN in October 2008. The RTM gathers information from EGEE sites hosting Logging and Bookkeeping (LB) services. Information is cached locally at a dedicated server at Imperial College London and made available for clients to use in near real time. The system consists of three main components: the RTM server, the enquirer and an Apache web server which is queried by clients. The RTM server queries the LB servers at fixed time intervals, collecting job related information and storing this in a local database. Job related data includes not only job state (i.e. Scheduled, Waiting, Running or Done) along with timing information but also other attributes such as Virtual Organization and Computing Element (CE) queue, if known. The job data stored in the RTM database is read by the enquirer every minute and converted to an XML format which is stored on a web server. This decouples the RTM server database from the clients, removing the bottleneck caused by many clients simultaneously accessing the database. This information can be visualized through either a 2D or 3D Java based client, with live job data either overlaid on a two-dimensional map of the world or rendered in three dimensions over a globe map using OpenGL.
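A simplified sketch of the enquirer step described above: read current job states from a local database and write an XML snapshot for the web server to serve to clients. The table columns, element names, and job identifiers are guesses, not the actual RTM schema or XML format.

    import sqlite3
    import xml.etree.ElementTree as ET

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE job (job_id TEXT, state TEXT, vo TEXT, ce TEXT, updated TEXT)")
    con.executemany("INSERT INTO job VALUES (?, ?, ?, ?, ?)", [
        ("https://lb01.example.org/abc123", "Running", "cms", "ce01.example.org",
         "2010-04-01T12:00:00Z"),
        ("https://lb01.example.org/def456", "Scheduled", "atlas", "ce02.example.org",
         "2010-04-01T12:00:05Z"),
    ])

    # Build the XML snapshot that a client map display could poll.
    root = ET.Element("jobs")
    for job_id, state, vo, ce, updated in con.execute("SELECT * FROM job"):
        ET.SubElement(root, "job", id=job_id, state=state, vo=vo, ce=ce, updated=updated)

    ET.ElementTree(root).write("rtm_snapshot.xml", xml_declaration=True, encoding="utf-8")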
Poprach, Alexandr; Bortlíček, Zbyněk; Büchler, Tomáš; Melichar, Bohuslav; Lakomý, Radek; Vyzula, Rostislav; Brabec, Petr; Svoboda, Marek; Dušek, Ladislav; Gregor, Jakub
2012-12-01
The incidence and mortality of renal cell carcinoma (RCC) in the Czech Republic are among the highest in the world. Several targeted agents have been recently approved for the treatment of advanced/metastatic RCC. Presentation of a national clinical database for monitoring and assessment of patients with advanced/metastatic RCC treated with targeted therapy. The RenIS (RENal Information System, http://renis.registry.cz ) registry is a non-interventional post-registration database of epidemiological and clinical data of patients with RCC treated with targeted therapies in the Czech Republic. Twenty cancer centres eligible for targeted therapy administration participate in the project. As of November 2011, six agents were approved and reimbursed from public health insurance, including bevacizumab, everolimus, pazopanib, sorafenib, sunitinib, and temsirolimus. As of 10 October 2011, 1,541 patients with valid records were entered into the database. Comparison with population-based data from the Czech National Cancer Registry revealed that RCC patients treated with targeted therapy are significantly younger (median age at diagnosis 59 vs. 66 years). Most RenIS registry patients were treated with sorafenib and sunitinib, many patients sequentially with both agents. Over 10 % of patients were also treated with everolimus in the second or third line. Progression-free survival times achieved were comparable to phase III clinical trials. The RenIS registry has become an important tool and source of information for the management of cancer care and clinical practice, providing comprehensive data on monitoring and assessment of RCC targeted therapy on a national level.
The Global Survey Method Applied to Ground-level Cosmic Ray Measurements
NASA Astrophysics Data System (ADS)
Belov, A.; Eroshenko, E.; Yanke, V.; Oleneva, V.; Abunin, A.; Abunina, M.; Papaioannou, A.; Mavromichalaki, H.
2018-04-01
The global survey method (GSM) technique unites simultaneous ground-level observations of cosmic rays at different locations and allows us to obtain the main characteristics of cosmic-ray variations outside of the atmosphere and magnetosphere of Earth. This technique has been developed and applied in numerous studies over many years by the Institute of Terrestrial Magnetism, Ionosphere and Radiowave Propagation (IZMIRAN). We here describe the IZMIRAN version of the GSM in detail. With this technique, the hourly data of the world-wide neutron-monitor network from July 1957 until December 2016 were processed, and further processing is enabled upon the receipt of new data. The result is a database of homogeneous and continuous hourly characteristics of the density variations (the isotropic part of the intensity) and the 3D vector of the cosmic-ray anisotropy. It includes all of the effects that could be identified in galactic cosmic-ray variations caused by large-scale disturbances of the interplanetary medium over more than 50 years. These results in turn became the basis for a database on Forbush effects and interplanetary disturbances. This database allows correlating various space-environment parameters (the characteristics of the Sun, the solar wind, et cetera) with cosmic-ray parameters and studying their interrelations. We also present features of the coupling coefficients for different neutron monitors that enable us to make a connection from ground-level measurements to primary cosmic-ray variations outside the atmosphere and the magnetosphere. We discuss the strengths and weaknesses of the current version of the GSM as well as further possible developments and improvements. The method developed allows us to minimize the problems of the neutron-monitor network, which are typical for experimental physics, and to considerably enhance its advantages.
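As a schematic illustration of the fitting step in such a survey, station count-rate variations can be modeled as an isotropic density variation plus a 3D anisotropy vector projected through station-specific coupling coefficients, and solved by least squares. The coefficient matrix and observed variations below are made-up numbers, not real neutron-monitor data or the actual IZMIRAN coupling coefficients.

    import numpy as np

    # Rows: stations; columns: coupling coefficients for the isotropic part (c0)
    # and the three anisotropy components (cx, cy, cz). Values are illustrative.
    C = np.array([
        [0.95, 0.60, 0.10, 0.05],
        [0.90, 0.20, 0.55, 0.10],
        [0.85, 0.05, 0.15, 0.60],
        [0.80, 0.40, 0.30, 0.20],
        [0.75, 0.10, 0.50, 0.35],
    ])
    v = np.array([-0.031, -0.028, -0.022, -0.027, -0.024])  # observed relative variations

    # Solve v ~= C @ [A0, Ax, Ay, Az] for the density variation A0 and anisotropy vector A.
    params, *_ = np.linalg.lstsq(C, v, rcond=None)
    A0, A = params[0], params[1:]
    print(f"density variation A0 = {A0:+.4f}, anisotropy amplitude = {np.linalg.norm(A):.4f}")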
NASA Astrophysics Data System (ADS)
Campbell, T. L.; Geller, J. B.; Heller, P.; Ruiz, G.; Chang, A.; McCann, L.; Ceballos, L.; Marraffini, M.; Ashton, G.; Larson, K.; Havard, S.; Meagher, K.; Wheelock, M.; Drake, C.; Rhett, G.
2016-02-01
The Ballast Water Management Act, the Marine Invasive Species Act, and the Coastal Ecosystem Protection Act require the California Department of Fish and Wildlife to monitor and evaluate the extent of biological invasions in the state's marine and estuarine waters. This has been performed statewide, using a variety of methodologies. Conventional sample collection and processing is laborious, slow and costly, and may require considerable taxonomic expertise and detailed, time-consuming microscopic study of multiple specimens. These factors limit the volume of biomass that can be searched for introduced species. New technologies continue to reduce the cost and increase the throughput of genetic analyses, making them efficient alternatives to traditional morphological analysis for identification, monitoring and surveillance of marine invasive species. Using next-generation sequencing of mitochondrial Cytochrome c oxidase subunit I (COI) and nuclear large subunit ribosomal RNA (LSU), we analyzed over 15,000 individual marine invertebrates collected in Californian waters. We have created sequence databases of California native and non-native species to assist in molecular identification and surveillance in North American waters. Metagenetics, the next-generation sequencing of environmental samples with comparison to DNA sequence databases, is a faster and more cost-effective alternative to individual sample analysis. We have sequenced DNA from biomass collected from whole settlement plates and plankton in California harbors, and used our introduced-species database to create species lists. We can combine these species lists for individual marinas with collected environmental data, such as temperature, salinity, and dissolved oxygen, to understand the ecology of marine invasions. Here we discuss high-throughput sampling and sequencing, and COASTLINE, our data-analysis answer to the challenges of working with hundreds of millions of sequencing reads from tens of thousands of specimens.
Using LiCSAR as a fast-response system for the detection and the monitoring of volcanic unrest
NASA Astrophysics Data System (ADS)
Albino, F.; Biggs, J.; Hatton, E. L.; Spaans, K.; Gaddes, M.; McDougall, A.
2017-12-01
Based on the Smithsonian Institution volcano database, a total of 13,256 volcanoes exist on Earth, with 1,273 having evidence of eruptive or unrest activity during the Holocene. InSAR techniques have proven their ability to detect and to quantify volcanic ground deformation on a case-by-case basis. However, the use of InSAR for the daily monitoring of every active volcano requires the development of automatic processing that can provide information within a couple of hours after a new radar acquisition. The LiCSAR system (http://comet.nerc.ac.uk/COMET-LiCS-portal/) answers this requirement by processing the vast amounts of data generated daily by the EU's Sentinel-1 satellite constellation. It now provides high-resolution deformation data for the entire Alpine-Himalayan seismic belt. The aim of our study is to extend the LiCSAR system to volcano monitoring. For each active volcano, the latest Sentinel-1 products calculated (phase, coherence and amplitude) will be available online in the COMET Volcano Deformation Database. To analyse this large volume of InSAR products, we develop an algorithm to automatically detect ground deformation signals, as well as changes in coherence and amplitude, in the time series. This toolbox could be a powerful fast-response system for helping volcanological observatories to manage new or ongoing volcanic crises. Important information regarding the spatial and temporal evolution of each ground deformation signal will also be added to the COMET database. This will help us to better understand the conditions under which volcanic unrest leads to an eruption. Such a worldwide survey enables us to establish a large catalogue of InSAR products, which will also be suitable for further studies (mapping of new lava flows, modelling of magmatic sources, evaluation of stress interactions).
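A minimal sketch of one way to flag possible deformation automatically in an interferometric time series, by comparing the most recent displacements against a rolling baseline. The threshold, window length, and synthetic data are invented, and this is not the detection algorithm used by LiCSAR.

    import numpy as np

    def flag_unrest(displacement_mm: np.ndarray, baseline_len: int = 12,
                    threshold_sigma: float = 3.0) -> bool:
        """Flag a pixel/volcano if recent displacement departs from its baseline scatter."""
        baseline = displacement_mm[:baseline_len]
        recent = displacement_mm[baseline_len:]
        mu, sigma = baseline.mean(), baseline.std(ddof=1)
        return bool(np.any(np.abs(recent - mu) > threshold_sigma * max(sigma, 1e-6)))

    # Synthetic series: flat baseline followed by ~20 mm of inflation.
    series = np.concatenate([np.random.normal(0, 2, 12), np.linspace(5, 20, 6)])
    print("possible unrest:", flag_unrest(series))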
A georeferenced Landsat digital database for forest insect-damage assessment
NASA Technical Reports Server (NTRS)
Williams, D. L.; Nelson, R. F.; Dottavio, C. L.
1985-01-01
In 1869, the gypsy moth caterpillar was introduced in the U.S. in connection with the experiments of a French scientist. Throughout the insect's period of establishment, gypsy moth populations have periodically increased to epidemic proportions. For programs concerned with preventing the insect's spread, it would be highly desirable to be able to employ a survey technique which could provide timely, accurate, and standardized assessments at a reasonable cost. A project was, therefore, initiated with the aim to demonstrate the usefulness of satellite remotely sensed data for monitoring the insect defoliation of hardwood forests in Pennsylvania. A major effort within this project involved the development of a map-registered Landsat digital database. A complete description of the database developed is provided along with information regarding the employed data management system.
Christensen, Arne; Osterberg, Lars G; Hansen, Ebba Holme
2009-08-01
Poor patient adherence is often the reason for suboptimal blood pressure control. Electronic monitoring is one method of assessing adherence. The aim was to systematically review the literature on electronic monitoring of patient adherence to self-administered oral antihypertensive medications. We searched the Pubmed, Embase, Cinahl and Psychinfo databases and websites of suppliers of electronic monitoring devices. The quality of the studies was assessed according to the quality criteria proposed by Haynes et al. Sixty-two articles were included; three met the criteria proposed by Haynes et al. and nine reported the use of electronic adherence monitoring for feedback interventions. Adherence rates were generally high, whereas average study quality was low with a recent tendency towards improved quality. One study detected investigator fraud based on electronic monitoring data. Use of electronic monitoring of patient adherence according to the quality criteria proposed by Haynes et al. has been rather limited during the past two decades. Electronic monitoring has mainly been used as a measurement tool, but it seems to have the potential to significantly improve blood pressure control as well and should be used more widely.
Pandey, Gunjan; Pandey, Janmejay; Jain, Rakesh K
2006-05-01
Monitoring of micro-organisms released deliberately into the environment is essential to assess their movement during the bio-remediation process. During the last few years, DNA-based genetic methods have emerged as the preferred method for such monitoring; however, their use is restricted in cases where organisms used for bio-remediation are not well characterized or where the public domain databases do not provide sufficient information regarding their sequence. For monitoring of such micro-organisms, alternate approaches have to be undertaken. In this study, we have specifically monitored a p-nitrophenol (PNP)-degrading organism, Arthrobacter protophormiae RKJ100, using molecular methods during PNP degradation in soil microcosm. Cells were tagged with a transposon-based foreign DNA sequence prior to their introduction into PNP-contaminated microcosms. Later, this artificially introduced DNA sequence was PCR-amplified to distinguish the bio-augmented organism from the indigenous microflora during PNP bio-remediation.
NASA Astrophysics Data System (ADS)
Sipos, Roland; Govi, Giacomo; Franzoni, Giovanni; Di Guida, Salvatore; Pfeiffer, Andreas
2017-10-01
The CMS experiment at the CERN LHC has a dedicated infrastructure to handle the alignment and calibration data. This infrastructure is composed of several services, which take on various data management tasks required for the consumption of the non-event data (also called condition data) in the experiment's activities. The criticality of these tasks imposes tight requirements on the availability and the reliability of the services executing them. In this scope, a comprehensive monitoring and alarm-generating system has been developed. The system has been implemented based on Nagios, the open-source industry standard for monitoring and alerting services, and monitors the database back-end, the hosting nodes and key heart-beat functionalities for all the services involved. This paper describes the design, implementation and operational experience with the monitoring system developed and deployed at CMS in 2016.
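A sketch of the kind of heart-beat probe such a system layers on top of Nagios: a small plugin-style check following the conventional Nagios exit codes (0 OK, 1 WARNING, 2 CRITICAL). The endpoint URL and latency thresholds are placeholders, not the CMS configuration.

    import sys
    import time
    import urllib.request

    URL = "https://conddb.example.cern.ch/heartbeat"   # placeholder endpoint
    WARN_S, CRIT_S = 2.0, 5.0                          # latency thresholds (illustrative)

    try:
        start = time.monotonic()
        with urllib.request.urlopen(URL, timeout=CRIT_S) as resp:
            ok = resp.status == 200
        elapsed = time.monotonic() - start
    except Exception as exc:
        print(f"CRITICAL - heartbeat failed: {exc}")
        sys.exit(2)

    if not ok or elapsed > CRIT_S:
        print(f"CRITICAL - heartbeat {elapsed:.1f}s")
        sys.exit(2)
    if elapsed > WARN_S:
        print(f"WARNING - heartbeat {elapsed:.1f}s")
        sys.exit(1)
    print(f"OK - heartbeat {elapsed:.1f}s")
    sys.exit(0)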
Georgia's Surface-Water Resources and Streamflow Monitoring Network, 2006
Nobles, Patricia L.; ,
2006-01-01
The U.S. Geological Survey (USGS) network of 223 real-time monitoring stations, the 'Georgia HydroWatch,' provides real-time water-stage data, with streamflow computed at 198 locations, and rainfall recorded at 187 stations. These sites continuously record data on 15-minute intervals and transmit the data via satellite to be incorporated into the USGS National Water Information System database. These data are automatically posted to the USGS Web site for public dissemination (http://waterdata.usgs.gov/ga/nwis/nwis). The real-time capability of this network provides information to help emergency-management officials protect human life and property during floods, and mitigate the effects of prolonged drought. The map at right shows the USGS streamflow monitoring network for Georgia and major watersheds. Streamflow is monitored at 198 sites statewide, more than 80 percent of which include precipitation gages. Various Federal, State, and local agencies fund these streamflow monitoring stations.
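A short sketch of pulling recent readings programmatically. It assumes the USGS NWIS instantaneous-values web service and its JSON (WaterML) response layout; the site number and parameter code shown are examples, so the URL and response path should be checked against the current NWIS documentation before relying on them.

    import json
    import urllib.request

    # Example Georgia gage and discharge parameter (00060); values are illustrative.
    url = ("https://waterservices.usgs.gov/nwis/iv/"
           "?format=json&sites=02336000&parameterCd=00060&period=P1D")

    with urllib.request.urlopen(url, timeout=30) as resp:
        data = json.load(resp)

    series = data["value"]["timeSeries"][0]["values"][0]["value"]
    latest = series[-1]
    print("latest discharge:", latest["value"], "cfs at", latest["dateTime"])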
Agroclimate.Org: Tools and Information for a Climate Resilient Agriculture in the Southeast USA
NASA Astrophysics Data System (ADS)
Fraisse, C.
2014-12-01
AgroClimate (http://agroclimate.org) is a web-based system developed to help the agricultural industry in the southeastern USA reduce risks associated with climate variability and change. It includes climate-related information and dynamic application tools that interact with a climate and crop database system. Information available includes climate monitoring and forecasts combined with information about crop management practices that help increase the resiliency of the agricultural industry in the region. Recently we have included smartphone apps in the AgroClimate suite of tools, including irrigation management and crop disease alert systems. Decision support tools available in AgroClimate include: (a) Climate risk: expected (probabilistic) and historical climate information and freeze risk; (b) Crop yield risk: expected yield based on soil type, planting date, and basic management practices for selected commodities and historical county yield databases; (c) Crop diseases: disease risk monitoring and forecasting for strawberry and citrus; (d) Crop development: monitoring and forecasting of growing degree-days and chill accumulation; (e) Drought: monitoring and forecasting of selected drought indices; (f) Footprints: carbon and water footprint calculators. The system also provides background information about the main drivers of climate variability and basic information about climate change in the Southeast USA. AgroClimate has been widely used as an educational tool by the Cooperative Extension Services in the region and also by producers. It is now being replicated internationally, with versions implemented in Mozambique and Paraguay.
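As an example of the growing degree-day monitoring mentioned under item (d), one common simple formulation accumulates max(0, (Tmax + Tmin)/2 - Tbase) per day. The base temperature and daily values below are illustrative, and AgroClimate's own tools may use different, crop-specific methods.

    def daily_gdd(t_max_c: float, t_min_c: float, t_base_c: float = 10.0) -> float:
        """Simple growing degree-days for one day: mean temperature above a crop base."""
        return max(0.0, (t_max_c + t_min_c) / 2.0 - t_base_c)

    # Illustrative week of daily max/min temperatures (deg C).
    week = [(31, 21), (33, 22), (29, 20), (27, 18), (30, 19), (32, 23), (28, 17)]
    accumulated = sum(daily_gdd(tmax, tmin) for tmax, tmin in week)
    print(f"GDD accumulated this week: {accumulated:.1f}")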
ERIC Educational Resources Information Center
Journal of Chemical Education, 1988
1988-01-01
Reviews two computer programs: "Molecular Graphics," which allows molecule manipulation in three-dimensional space (requiring IBM PC with 512K, EGA monitor, and math coprocessor); and "Periodic Law," a database which contains up to 20 items of information on each of the first 103 elements (Apple II or IBM PC). (MVL)
Sodium content in US packaged foods 2009
USDA-ARS?s Scientific Manuscript database
In 2010, the Institute of Medicine recommended food manufacturers reduce the amount of sodium in their products. Monitoring sodium in packaged foods is necessary to evaluate the impact of these efforts. Using commercially available data from Nielsen and Gladson, we created a database with sales and...
The purpose of the Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) is to foster the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases that are all in the public domain. It is compos...
Age 60 rule research, part I : bibliographic database.
DOT National Transportation Integrated Search
1994-10-01
This document is one of four products completed as a part of the Age 60 Rule research contract monitored by Pam Della Rocco, Civil Aerospace Medical Institute. As part of their research contract with the FAA to study issues related to the "Age 60 Rul...
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2011 CFR
2011-07-01
... monitoring by a human, that human time may be also assessed as computer search. The terms “programmer/operator” shall not be limited to the traditional programmers or operators. Rather, the terms shall be.... technician, administrative support, operator, programmer, database administrator, or action officer). (2...
Novel health monitoring method using an RGB camera.
Hassan, M A; Malik, A S; Fofi, D; Saad, N; Meriaudeau, F
2017-11-01
In this paper we present a novel health monitoring method that estimates the heart rate and respiratory rate using an RGB camera. The heart rate and the respiratory rate are estimated from the photoplethysmography (PPG) signal and the respiratory motion. The method mainly operates on the green spectrum of the RGB camera, generating a multivariate PPG signal and performing multivariate de-noising on the video signal to extract the resultant PPG signal. A periodicity-based voting scheme (PVS) was used to measure the heart rate and respiratory rate from the estimated PPG signal. We evaluated our proposed method against a state-of-the-art heart rate measurement method in two scenarios, using the MAHNOB-HCI database and a self-collected naturalistic-environment database. The methods were furthermore evaluated for various scenarios in naturalistic environments, such as a motion-variance session and a skin-tone-variance session. Our proposed method operated robustly during the experiments and outperformed the state-of-the-art heart rate measurement methods by compensating for the effects of the naturalistic environment.
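A stripped-down sketch of the green-channel idea: average the green pixel values per frame, then take the dominant frequency in the physiological heart-rate band as beats per minute. It omits the multivariate de-noising and the periodicity-based voting scheme described above, and a synthetic signal stands in for real video.

    import numpy as np

    FPS = 30.0

    def heart_rate_bpm(green_means: np.ndarray, fps: float = FPS) -> float:
        """Estimate heart rate from the per-frame mean green intensity via FFT."""
        signal = green_means - green_means.mean()
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
        power = np.abs(np.fft.rfft(signal)) ** 2
        band = (freqs >= 0.7) & (freqs <= 4.0)       # roughly 42-240 bpm
        return float(freqs[band][np.argmax(power[band])] * 60.0)

    # Synthetic 10 s recording with a 1.2 Hz (72 bpm) pulse plus noise.
    t = np.arange(0, 10, 1.0 / FPS)
    green = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.2, t.size)
    print(f"estimated heart rate: {heart_rate_bpm(green):.0f} bpm")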
Seamless personal health information system in cloud computing.
Chung, Wan-Young; Fong, Ee May
2014-01-01
Noncontact ECG measurement has gained popularity owing to its noninvasiveness and the convenience of applying it in daily life. This approach does not require any direct contact between the patient's skin and a sensor for physiological signal measurement. The noncontact ECG measurement is integrated with a mobile healthcare system for health status monitoring. The mobile phone acts as the personal health information system, displaying health status and body mass index (BMI) tracking. Besides that, it plays an important role as a source of medical guidance, providing a medical knowledge database including a symptom checker and health fitness guidance. At the same time, the system also features some unique medical functions that cater to the daily needs of patients or users, including regular medication reminders, alert alarms, medical guidance, and appointment scheduling. Lastly, we demonstrate the mobile healthcare system with a web application for extended uses: health data are uploaded to a cloud-based web server system and web database storage. This allows remote health status monitoring and thus promotes a cost-effective personal healthcare system.
Raw Cow Milk Bacterial Population Shifts Attributable to Refrigeration
Lafarge, Véronique; Ogier, Jean-Claude; Girard, Victoria; Maladen, Véronique; Leveau, Jean-Yves; Gruss, Alexandra; Delacroix-Buchet, Agnès
2004-01-01
We monitored the dynamic changes in the bacterial population in milk associated with refrigeration. Direct analyses of DNA by using temporal temperature gel electrophoresis (TTGE) and denaturing gradient gel electrophoresis (DGGE) allowed us to make accurate species assignments for bacteria with low-GC-content (low-GC%) (<55%) and medium- or high-GC% (>55%) genomes, respectively. We examined raw milk samples before and after 24-h conservation at 4°C. Bacterial identification was facilitated by comparison with an extensive bacterial reference database (∼150 species) that we established with DNA fragments of pure bacterial strains. Cloning and sequencing of fragments missing from the database were used to achieve complete species identification. Considerable evolution of bacterial populations occurred during conservation at 4°C. TTGE and DGGE are shown to be a powerful tool for identifying the main bacterial species of the raw milk samples and for monitoring changes in bacterial populations during conservation at 4°C. The emergence of psychrotrophic bacteria such as Listeria spp. or Aeromonas hydrophila is demonstrated. PMID:15345453
Aircraft Operations Classification System
NASA Technical Reports Server (NTRS)
Harlow, Charles; Zhu, Weihong
2001-01-01
Accurate data is important in the aviation planning process. In this project we consider systems for measuring aircraft activity at airports. This would include determining the type of aircraft such as jet, helicopter, single engine, and multiengine propeller. Some of the issues involved in deploying technologies for monitoring aircraft operations are cost, reliability, and accuracy. In addition, the system must be field portable and acceptable at airports. A comparison of technologies was conducted and it was decided that an aircraft monitoring system should be based upon acoustic technology. A multimedia relational database was established for the study. The information contained in the database consists of airport information, runway information, acoustic records, photographic records, a description of the event (takeoff, landing), aircraft type, and environmental information. We extracted features from the time signal and the frequency content of the signal. A multi-layer feed-forward neural network was chosen as the classifier. Training and testing results were obtained. We were able to obtain classification results of over 90 percent for training and testing for takeoff events.
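As a rough illustration of the approach described above (spectral features of an acoustic event fed to a multi-layer feed-forward network), the sketch below builds band-energy features and trains a scikit-learn MLP. The feature set, layer sizes, and synthetic training data are placeholders, not the study's actual configuration or recordings.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def band_energy_features(signal, n_bands=16):
    """Summarize a mono acoustic event as log energies in n_bands frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.log1p(np.array([b.sum() for b in bands]))

# X: one feature vector per recorded event, y: aircraft class labels.
# Random noise is used here only so the example runs end to end.
rng = np.random.default_rng(0)
X = np.vstack([band_energy_features(rng.normal(size=8000)) for _ in range(40)])
y = ["jet", "helicopter", "single engine", "multiengine propeller"] * 10

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(X, y)                      # multi-layer feed-forward classifier
print(clf.predict(X[:4]))          # predicted aircraft classes for four events
```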
Schneeweiss, S; Eichler, H-G; Garcia-Altes, A; Chinn, C; Eggimann, A-V; Garner, S; Goettsch, W; Lim, R; Löbker, W; Martin, D; Müller, T; Park, B J; Platt, R; Priddy, S; Ruhl, M; Spooner, A; Vannieuwenhuyse, B; Willke, R J
2016-12-01
Analyses of healthcare databases (claims, electronic health records [EHRs]) are useful supplements to clinical trials for generating evidence on the effectiveness, harm, use, and value of medical products in routine care. A constant stream of data from the routine operation of modern healthcare systems, which can be analyzed in rapid cycles, enables incremental evidence development to support accelerated and appropriate access to innovative medicines. Evidentiary needs by regulators, Health Technology Assessment, payers, clinicians, and patients after marketing authorization comprise (1) monitoring of medication performance in routine care, including the materialized effectiveness, harm, and value; (2) identifying new patient strata with added value or unacceptable harms; and (3) monitoring targeted utilization. Adaptive biomedical innovation (ABI) with rapid cycle database analytics is successfully enabled if evidence is meaningful, valid, expedited, and transparent. These principles will bring rigor and credibility to current efforts to increase research efficiency while upholding evidentiary standards required for effective decision-making in healthcare. © 2016 American Society for Clinical Pharmacology and Therapeutics.
A mobile system for skin cancer diagnosis and monitoring
NASA Astrophysics Data System (ADS)
Gu, Yanliang; Tang, Jinshan
2014-05-01
In this paper, we propose a mobile system for aiding doctors in skin cancer diagnosis and other persons in skin cancer monitoring. The basic idea is to use image retrieval techniques to help users find similar skin cancer cases stored in a database by using smart phones. The query image can be taken by a smart phone from a patient or can be uploaded from other sources. The shapes of the skin lesions are used for matching two skin lesions, which are segmented from skin images using the skin lesion extraction method developed in [1]. The features used in the proposed system are obtained by Fourier descriptors. A prototype application has been developed and can be installed on an iPhone. In this application, iPhone users can use the phone as a diagnosis tool to find potential skin lesions on a person's skin and compare the skin lesions detected by the iPhone with the skin lesions stored in a database on a remote server.
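The matching step described above (comparing lesion shapes via Fourier descriptors) can be sketched as follows. The segmentation method from the cited reference is not reproduced; the sketch assumes each lesion boundary is already available as an (N, 2) array of (x, y) points with at least 17 points.

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs=16):
    """Translation/rotation/scale-invariant Fourier descriptors of a closed contour."""
    z = contour[:, 0] + 1j * contour[:, 1]        # boundary as a complex signal
    coeffs = np.fft.fft(z)
    coeffs[0] = 0                                  # drop DC term -> translation invariance
    mags = np.abs(coeffs)[1:n_coeffs + 1]          # magnitudes only -> rotation invariance
    return mags / (mags[0] + 1e-12)                # normalize -> scale invariance

def shape_distance(contour_a, contour_b):
    """Euclidean distance between descriptor vectors; smaller means more similar shapes."""
    return float(np.linalg.norm(fourier_descriptors(contour_a) -
                                fourier_descriptors(contour_b)))
```

A retrieval step would then rank the database lesions by `shape_distance` to the query lesion and return the closest cases.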
Monitoring Workload in Throwing-Dominant Sports: A Systematic Review.
Black, Georgia M; Gabbett, Tim J; Cole, Michael H; Naughton, Geraldine
2016-10-01
The ability to monitor training load accurately in professional sports is proving vital for athlete preparedness and injury prevention. While numerous monitoring techniques have been developed to assess the running demands of many team sports, these methods are not well suited to throwing-dominant sports that are infrequently linked to high running volumes. Therefore, other techniques are required to monitor the differing demands of these sports to ensure athletes are adequately prepared for competition. To investigate the different methodologies used to quantitatively monitor training load in throwing-dominant sports. A systematic review of the methods used to monitor training load in throwing-dominant sports was conducted using variations of terms that described different load-monitoring techniques and different sports. Studies included in this review were published prior to June 2015 and were identified through a systematic search of four electronic databases including Academic Search Complete, CINAHL, Medline and SPORTDiscus. Only full-length peer-reviewed articles investigating workload monitoring in throwing-dominant sports were selected for review. A total of 8098 studies were initially retrieved from the four databases and 7334 results were removed as they were either duplicates, review articles, non-peer-reviewed articles, conference abstracts or articles written in languages other than English. After screening the titles and abstracts of the remaining papers, 28 full-text papers were reviewed, resulting in the identification of 20 articles meeting the inclusion criteria for monitoring workloads in throwing-dominant sports. Reference lists of selected articles were then scanned to identify other potential articles, which yielded one additional article. Ten articles investigated workload monitoring in cricket, while baseball provided eight results, and handball, softball and water polo each contributed one article. Results demonstrated varying techniques used to monitor workload and purposes for monitoring workload, encompassing the relationship between workload and injury, individual responses to workloads, the effect of workload on subsequent performance and the future directions of workload-monitoring techniques. This systematic review highlighted a number of simple and effective workload-monitoring techniques implemented across a variety of throwing-dominant sports. The current literature placed an emphasis on the relationship between workload and injury. However, due to differences in chronological and training age, inconsistent injury definitions and time frames used for monitoring, injury thresholds remain unclear in throwing-dominant sports. Furthermore, although research has examined total workload, the intensity of workload is often neglected. Additional research on the reliability of self-reported workload data is also required to validate existing relationships between workload and injury. Considering the existing disparity within the literature, it is likely that throwing-dominant sports would benefit from the development of an automated monitoring tool to objectively assess throwing-related workloads in conjunction with well-established internal measures of load in athletes.
A seabird monitoring program for the North Pacific
Hatcher, S.A.; Kaiser, G.W.; Kondratyev, Alexander V.; Byrd, G.V.
1994-01-01
Seabird monitoring is the accumulation of time series data on any aspect of seabird distribution, abundance, demography, or behavior. Typical studies include annual or less frequent measures of numbers or productivity; less commonly, the focus is on marine habitat use, phenology, food habits, or survival. The key requirement is that observations are replicated over time and made with sufficient precision and accuracy to permit the meaningful analysis of variability and trends. Along the Pacific coast of North America, seabird monitoring has consumed substantial amounts of public funding since the early 1970s. The effort has been largely uncoordinated among the many entities involved, including provincial, state, and federal agencies, some private organizations, university faculty, and students. We reaffirm the rationale for monitoring seabirds, review briefly the nature and accomplishments of the existing effort, and suggest actions needed to improve the effectiveness of seabird monitoring in the Pacific. In particular, we propose and describe a comprehensive Seabird Monitoring Database designed specifically to work with observations on seabird population parameters that are replicated over time.
Seli, Paul; Smilek, Daniel; Ralph, Brandon C W; Schacter, Daniel L
2018-03-01
Across 2 independent samples, we examined the relation between individual differences in rates of self-caught mind wandering and individual differences in temporal monitoring of an unrelated response goal. Rates of self-caught mind wandering were assessed during a commonly used sustained-attention task, and temporal goal monitoring was indexed during a well-established prospective-memory task. The results from both samples showed a positive relation between rates of self-caught mind wandering during the sustained-attention task and rates of checking a clock to monitor the amount of time remaining before a response was required in the prospective-memory task. This relation held even when controlling for overall propensity to mind-wander (indexed by intermittent thought probes) and levels of motivation (indexed by subjective reports). These results suggest the possibility that there is a common monitoring system that monitors the contents of consciousness and the progress of ongoing goals and tasks. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Aoki, K.; Ohuchi, N.; Zong, Z.; Arimoto, Y.; Wang, X.; Yamaoka, H.; Kawai, M.; Kondou, Y.; Makida, Y.; Hirose, M.; Endou, T.; Iwasaki, M.; Nakamura, T.
2017-12-01
A remote monitoring system was developed based on the software infrastructure of the Experimental Physics and Industrial Control System (EPICS) for the cryogenic system of superconducting magnets in the interaction region of the SuperKEKB accelerator. SuperKEKB has been constructed to conduct high-energy physics experiments at KEK. The superconducting magnets comprise three apparatuses: the Belle II detector solenoid and the QCSL and QCSR accelerator magnets. They are contained in three cryostats cooled by dedicated helium cryogenic systems. The monitoring system was developed to read data from the EX-8000, an integrated instrumentation system that controls all cryogenic components. The monitoring system uses the I/O control tools of the EPICS software for TCP/IP, archiving techniques using a relational database, and an easy human-computer interface. Using this monitoring system, it is possible to remotely monitor all real-time data of the superconducting magnets and cryogenic systems. It is also convenient for sharing data among multiple groups.
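A minimal illustration of the pattern described above (reading process variables through EPICS channel access and archiving them into a relational database) is given below using the pyepics library and SQLite. The PV names are hypothetical placeholders, not the actual SuperKEKB channel names, and the polling loop is a simplified stand-in for the production archiver.

```python
import sqlite3
import time
import epics  # pyepics channel-access client

# Hypothetical PV names standing in for the real cryogenic channels.
PVS = ["CRYO:QCSL:HE_TEMP", "CRYO:QCSR:HE_TEMP", "CRYO:BELLE2:SOL_CURRENT"]

conn = sqlite3.connect("cryo_archive.db")
conn.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, pv TEXT, value REAL)")

def archive_once():
    """Read each PV once over channel access and store the value with a timestamp."""
    now = time.time()
    for pv in PVS:
        value = epics.caget(pv, timeout=2.0)
        if value is not None:          # None means the channel did not connect
            conn.execute("INSERT INTO readings VALUES (?, ?, ?)", (now, pv, float(value)))
    conn.commit()

while True:
    archive_once()
    time.sleep(10)                     # poll every 10 s
```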
Guhn, Martin; Janus, Magdalena; Enns, Jennifer; Brownell, Marni; Forer, Barry; Duku, Eric; Muhajarine, Nazeem; Raos, Rob
2016-04-29
Early childhood is a key period to establish policies and practices that optimise children's health and development, but Canada lacks nationally representative data on social indicators of children's well-being. To address this gap, the Early Development Instrument (EDI), a teacher-administered questionnaire completed for kindergarten-age children, has been implemented across most Canadian provinces over the past 10 years. The purpose of this protocol is to describe the Canadian Neighbourhoods and Early Child Development (CanNECD) Study, the aims of which are to create a pan-Canadian EDI database to monitor trends over time in children's developmental health and to advance research examining the social determinants of health. Canada-wide EDI records from 2004 to 2014 (representing over 700,000 children) will be linked to Canada Census and Income Taxfiler data. Variables of socioeconomic status derived from these databases will be used to predict neighbourhood-level EDI vulnerability rates by conducting a series of regression analyses and latent variable models at provincial/territorial and national levels. Where data are available, we will measure the neighbourhood-level change in developmental vulnerability rates over time and model the socioeconomic factors associated with those trends. Ethics approval for this study was granted by the Behavioural Research Ethics Board at the University of British Columbia. Study findings will be disseminated to key partners, including provincial and federal ministries, schools and school districts, collaborative community groups and the early childhood development research community. The database created as part of this longitudinal population-level monitoring system will allow researchers to associate practices, programmes and policies at school and community levels with trends in developmental health outcomes. The CanNECD Study will guide future early childhood development action and policies, using the database as a tool for formative programme and policy evaluation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
NASA Astrophysics Data System (ADS)
Boulanger, D.; Thouret, V.
2016-12-01
IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core and IAGOS-CARIBIC data. The IAGOS Data Portal (http://www.iagos.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles). New added-value products are available through the portal: back trajectories, origin of air masses, and co-location with satellite data. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, allows users to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the next year IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network) and the ENVRI+ European project. Measurement traceability and quality metadata will be made available, and DOIs will be implemented.
NASA Astrophysics Data System (ADS)
Diaconescu, V. D.; Scripcariu, L.; Mătăsaru, P. D.; Diaconescu, M. R.; Ignat, C. A.
2018-06-01
Exhibited artefacts based on textile materials can be affected by environmental conditions. A smart monitoring system that commands an adaptive automatic environment control system is proposed for indoor exhibition spaces containing various textile artefacts. All exhibited objects are monitored by multiple multi-sensor nodes containing temperature, relative humidity and light sensors. Data collected periodically from the entire sensor network are stored in a database and statistically processed in order to identify and classify the environmental risk. Risk consequences are analyzed depending on the risk class, and the smart system commands different control measures in order to stabilize the indoor environmental conditions at the recommended values and prevent material degradation.
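The risk-classification step can be illustrated with a simple threshold-based sketch. The conservation target ranges and the three risk classes below are assumptions made for the example, not the values or classification scheme used by the authors.

```python
from statistics import mean

# Assumed conservation targets for textile artefacts (illustrative only).
TARGET_TEMP = (18.0, 22.0)   # degrees Celsius
TARGET_RH = (45.0, 55.0)     # % relative humidity
TARGET_LUX = (0.0, 50.0)     # illuminance

def classify_risk(samples):
    """Classify environmental risk from a list of (temp, rh, lux) node readings.

    Returns 'low', 'medium' or 'high' depending on how many monitored
    quantities drift outside their recommended range on average.
    """
    avg_t = mean(s[0] for s in samples)
    avg_rh = mean(s[1] for s in samples)
    avg_lux = mean(s[2] for s in samples)
    violations = sum(not (lo <= v <= hi)
                     for v, (lo, hi) in [(avg_t, TARGET_TEMP),
                                         (avg_rh, TARGET_RH),
                                         (avg_lux, TARGET_LUX)])
    return ["low", "medium", "high", "high"][violations]

print(classify_risk([(23.5, 60.0, 30.0), (24.0, 62.0, 35.0)]))  # -> 'high'
```

In a deployment, the returned class would be what triggers the adaptive control commands (e.g. adjusting HVAC setpoints or dimming lighting).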
NASA Astrophysics Data System (ADS)
Mumladze, Tea; Wang, Haijun; Graham, Gerhard
2017-04-01
The seismic network that forms the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) will ultimately consist of 170 seismic stations (50 primary and 120 auxiliary) in 76 countries around the world. The network is still under development, but currently more than 80% of it is in operation. The objective of seismic monitoring is to detect and locate underground nuclear explosions. However, the data from the IMS can also be widely used for scientific and civil purposes. In this study we present the results of data analysis of the 2016 seismic sequence in Central Italy. Several hundred earthquakes in this sequence were recorded by the seismic stations of the IMS. All events were accurately located by the analysts of the International Data Centre (IDC) of the CTBTO. In this study we will present the epicentral and magnitude distribution, station recordings and teleseismic phases as obtained from the Reviewed Event Bulletin (REB). We will also present a comparison of the IDC database with the databases of the European-Mediterranean Seismological Centre (EMSC) and the U.S. Geological Survey (USGS). The present work shows that IMS data can be used for earthquake sequence analyses and can play an important role in seismological research.
The ATLAS PanDA Monitoring System and its Evolution
NASA Astrophysics Data System (ADS)
Klimentov, A.; Nevski, P.; Potekhin, M.; Wenaus, T.
2011-12-01
The PanDA (Production and Distributed Analysis) Workload Management System is used for ATLAS distributed production and analysis worldwide. The needs of ATLAS global computing imposed challenging requirements on the design of PanDA in areas such as scalability, robustness, automation, diagnostics, and usability for both production shifters and analysis users. Through a system-wide job database, the PanDA monitor provides a comprehensive and coherent view of the system and job execution, from high level summaries to detailed drill-down job diagnostics. It is (like the rest of PanDA) an Apache-based Python application backed by Oracle. The presentation layer is HTML code generated on the fly in the Python application which is also responsible for managing database queries. However, this approach is lacking in user interface flexibility, simplicity of communication with external systems, and ease of maintenance. A decision was therefore made to migrate the PanDA monitor server to Django Web Application Framework and apply JSON/AJAX technology in the browser front end. This allows us to greatly reduce the amount of application code, separate data preparation from presentation, leverage open source for tools such as authentication and authorization mechanisms, and provide a richer and more dynamic user experience. We describe our approach, design and initial experience with the migration process.
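The migration pattern described above, with data preparation kept in Python and JSON delivered to an AJAX front end, can be sketched with a minimal Django view. This is only an illustrative fragment of a views.py file; the `Job` model and its `status` field are hypothetical stand-ins, not the actual PanDA schema or code.

```python
# views.py -- minimal sketch of the "data as JSON, presentation in the browser" split.
# The Job model and its fields are hypothetical stand-ins for the PanDA job tables.
from django.http import JsonResponse
from .models import Job

def job_summary(request):
    """Return per-status job counts as JSON; the browser-side JavaScript
    (the AJAX front end) is responsible for rendering the table or chart."""
    status = request.GET.get("status")        # optional filter, e.g. ?status=failed
    jobs = Job.objects.all()
    if status:
        jobs = jobs.filter(status=status)
    counts = {}
    for row in jobs.values("status"):
        counts[row["status"]] = counts.get(row["status"], 0) + 1
    return JsonResponse({"job_counts": counts})
```

Keeping the view free of HTML is what separates data preparation from presentation: the same endpoint can feed a drill-down table, a chart, or an external system without server-side template changes.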
Sala-Comorera, Laura; Blanch, Anicet R; Vilaró, Carles; Galofré, Belén; García-Aljaro, Cristina
2017-10-01
The aim of this work was to assess the suitability of matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) for routine heterotrophic monitoring in a drinking water treatment plant. Water samples were collected from raw surface water and after different treatments during two campaigns over a 1-year period. Heterotrophic bacteria were studied and isolates were identified by MALDI-TOF MS. Moreover, the diversity index and the coefficient of population similarity were also calculated using biochemical fingerprinting of the populations studied. MALDI-TOF MS enabled us to characterize and detect changes in the bacterial community composition throughout the water treatment plant. Raw water showed a large and diverse population which was slightly modified after initial treatment steps (sand filtration and ultrafiltration). Reverse osmosis had a significant impact on the microbial diversity, while the final chlorination step produced a shift in the composition of the bacterial community. Although MALDI-TOF MS could not identify all the isolates since the available MALDI-TOF MS database does not cover all the bacterial diversity in water, this technique could be used to monitor bacterial changes in drinking water treatment plants by creating a specific protein profile database for tracking purposes.
Monitoring Wildlife Interactions with Their Environment: An Interdisciplinary Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charles-Smith, Lauren E.; Domínguez, Ignacio X.; Fornaro, Robert J.
In a rapidly changing world, wildlife ecologists strive to correctly model and predict complex relationships between animals and their environment, which facilitates management decisions impacting public policy to conserve and protect delicate ecosystems. Recent advances in monitoring systems span scientific domains, including animal and weather monitoring devices and landscape classification mapping techniques. The current challenge is how to combine and use detailed output from various sources to address questions spanning multiple disciplines. The WolfScout wildlife and weather tracking system is a software tool capable of filling this niche. WolfScout automates integration of the latest technological advances in wildlife GPS collars, weather stations, drought conditions, severe weather reports, and animal demographic information. The WolfScout database stores a variety of classified landscape maps including natural and manmade features. Additionally, WolfScout's spatial database management system allows users to calculate distances between animals' locations and landscape characteristics, which are linked to the best approximation of environmental conditions at the animal's location during the interaction. Through a secure website, data are exported in formats compatible with multiple software programs including R and ArcGIS. The WolfScout design promotes interoperability of data between researchers and software applications while standardizing analyses of animal interactions with their environment.
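The spatial query described above (distance from an animal's GPS fix to landscape features) can be sketched with shapely as a lightweight stand-in for WolfScout's spatial database functions. The coordinates and feature geometries are invented for the example and assume a projected coordinate system in metres.

```python
from shapely.geometry import Point, LineString, Polygon

# Illustrative landscape features in a projected coordinate system (metres).
features = {
    "stream": LineString([(0, 0), (500, 200), (1000, 250)]),
    "forest_patch": Polygon([(300, 300), (800, 300), (800, 700), (300, 700)]),
    "road": LineString([(0, 900), (1000, 900)]),
}

def distances_to_features(x, y):
    """Return the distance (in map units) from one GPS fix to each landscape feature."""
    fix = Point(x, y)
    return {name: round(fix.distance(geom), 1) for name, geom in features.items()}

print(distances_to_features(450.0, 520.0))
# forest_patch distance is 0.0 because this fix lies inside the patch
```

In WolfScout-style analyses these per-fix distances would be joined to the animal's demographic record and the nearest-in-time weather observation before export to R or ArcGIS.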
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated by functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the datasets is made possible by four technologies: cloud computing, relational database processing, support from NoSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework makes it possible to compute reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
A Web-based geographic information system for monitoring animal welfare during long journeys.
Ippoliti, Carla; Di Pasquale, Adriano; Fiore, Gianluca; Savini, Lara; Conte, Annamaria; Di Gianvito, Federica; Di Francesco, Cesare
2007-01-01
Animal welfare protection during long journeys is mandatory according to European Union regulations designed to ensure that animals are transported in accordance with animal welfare requirements and to provide control bodies with a regulatory tool to react promptly in cases of non-compliance and to ensure a safe network between products, animals and farms. Regulation 1/2005/EC foresees recourse to a system of traceability within European Union member states. The Joint Research Centre of the European Commission (JRC) has developed a prototype system fulfilling the requirements of the Regulation which is able to monitor compliance with animal welfare requirements during transportation, register electronic identification of transported animals and store data in a central database shared with the other member states through a Web-based application. Test equipment has recently been installed on a vehicle that records data on vehicle position (geographic coordinates, date/time) and animal welfare conditions (measurements of internal temperature of the vehicle, etc.). The information is recorded at fixed intervals and transmitted to the central database. The authors describe the Web-based geographic information system, through which authorised users can visualise instantly the real-time position of the vehicle, monitor the sensor-recorded data and follow the time-space path of the truck during journeys.
Bogialli, Sara; Bortolini, Claudio; Di Gangi, Iole Maria; Di Gregorio, Federica Nigro; Lucentini, Luca; Favaro, Gabriella; Pastore, Paolo
2017-08-01
A comprehensive risk management on human exposure to cyanotoxins, whose production is actually unpredictable, is limited by reliable analytical tools for monitoring as many toxic algal metabolites as possible. Two analytical approaches based on a LC-QTOF system for target analysis and suspect screening of cyanotoxins in freshwater were presented. A database with 369 compounds belonging to cyanobacterial metabolites was developed and used for a retrospective data analysis based on high resolution mass spectrometry (HRMS). HRMS fragmentation of the suspect cyanotoxin precursor ions was subsequently performed for correctly identifying the specific variants. Alternatively, an automatic tandem HRMS analysis tailored for cyanotoxins was performed in a single chromatographic run, using the developed database as a preferred precursor ions list. Twenty-five extracts of surface and drinking waters contaminated by cyanobacteria were processed. The identification of seven uncommon microcystins (M(O)R, MC-FR, MSer 7 -YR, D-Asp 3 MSer 7 -LR, MSer 7 -LR, dmAdda-LR and dmAdda-YR) and 6 anabaenopeptins (A, B, F, MM850, MM864, oscyllamide Y) was reported. Several isobaric variants, fully separated by chromatography, were pointed out. The developed methods are proposed to be used by environmental and health agencies for strengthening the surveillance monitoring of cyanotoxins in water. Copyright © 2017 Elsevier B.V. All rights reserved.
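The suspect-screening step described above (matching measured precursor masses against a compound database within a mass tolerance) can be illustrated with the sketch below. The three entries, their approximate [M+H]+ values, and the 5 ppm tolerance are examples only, not the authors' 369-compound database or method settings.

```python
# Minimal sketch of suspect screening: match measured m/z values against a
# compound list within a ppm tolerance. The [M+H]+ values below are approximate
# literature values used only for illustration.
SUSPECTS = {
    "Microcystin-LR": 995.5560,
    "Microcystin-RR": 1038.5730,
    "Anabaenopeptin A": 844.4614,
}

def screen(measured_mz, tolerance_ppm=5.0):
    """Return suspect compounds whose database m/z lies within tolerance_ppm of measured_mz."""
    hits = []
    for name, mz in SUSPECTS.items():
        error_ppm = abs(measured_mz - mz) / mz * 1e6
        if error_ppm <= tolerance_ppm:
            hits.append((name, round(error_ppm, 2)))
    return hits

print(screen(995.5557))   # -> [('Microcystin-LR', 0.3)]
```

In the workflow described in the abstract, each hit would then trigger targeted HRMS fragmentation of the candidate precursor ion to confirm the specific variant.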
Issuance of the Clean Air Act Stationary Source Compliance Monitoring Strategy
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Periodic Monitoring in Title V Permits for Turbines Subject to NSPS Subpart GG
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Periodic Monitoring Guidance for Title V Operating Permits Programs
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
2014-10-01
number/communicate to site coordinator N/A Task V.5 (mo 6-47): implement methods to educate/monitor participants on aspects of vit D3 and calcium...potential side effects of vit D3 supplementation CRFs for adverse event reporting have been developed and included in the protocols submitted for IRB...Task VI.3 (mo 6-47): document acceptance of storage sample in the CGRP database and vit D3 study database The process for storage of sample in the
Quebec Trophoblastic Disease Registry: how to make an easy-to-use dynamic database.
Sauthier, Philippe; Breguet, Magali; Rozenholc, Alexandre; Sauthier, Michaël
2015-05-01
To create an easy-to-use dynamic database designed specifically for the Quebec Trophoblastic Disease Registry (RMTQ). It is now well established that much of the success in managing trophoblastic diseases comes from the development of national and regional reference centers. Computerized databases allow the optimal use of data stored in these centers. We have created an electronic data registration system by producing a database using FileMaker Pro 12. It uses 11 external tables associated with a unique identification number for each patient. Each table allows specific data to be recorded, incorporating demographics, diagnosis, automated staging, laboratory values, pathological diagnosis, and imaging parameters. From January 1, 2009, to December 31, 2013, we used our database to register 311 patients with 380 diseases and have seen a 39.2% increase in registrations each year between 2009 and 2012. This database allows the automatic generation of semilogarithmic curves, which plot β-hCG values as a function of time, complete with graphic markers for applied treatments (chemotherapy, radiotherapy, or surgery). It generates a summary sheet providing a synthetic overview in real time. We have created, at low cost, an easy-to-use database specific to trophoblastic diseases that dynamically integrates staging and monitoring. We propose a 10-step procedure for a successful trophoblastic database. It improves patient care, research, and education on trophoblastic diseases in Quebec and leads to an opportunity for collaboration on a national Canadian registry.
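The automatically generated semilogarithmic follow-up curve can be illustrated with a short matplotlib sketch; the days, β-hCG values, and treatment markers below are invented for the example, not registry data.

```python
import matplotlib.pyplot as plt

# Illustrative follow-up data: days since diagnosis and serum beta-hCG (IU/L).
days = [0, 7, 14, 21, 28, 35, 42]
bhcg = [120000, 45000, 9000, 2100, 400, 60, 8]
treatments = {7: "chemotherapy start", 28: "chemotherapy cycle 2"}   # invented markers

fig, ax = plt.subplots()
ax.semilogy(days, bhcg, marker="o")                 # semilogarithmic beta-hCG curve
for day, label in treatments.items():
    ax.axvline(day, linestyle="--")                 # graphic marker for an applied treatment
    ax.annotate(label, (day, max(bhcg)), rotation=90, va="top", fontsize=8)
ax.set_xlabel("Days since diagnosis")
ax.set_ylabel("Serum beta-hCG (IU/L)")
ax.set_title("Semilogarithmic beta-hCG follow-up with treatment markers")
fig.savefig("bhcg_followup.png", dpi=150)
```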
Kaewpitoon, Soraya J; Rujirakul, Ratana; Joosiri, Apinya; Jantakate, Sirinun; Sangkudloa, Amnat; Kaewthani, Sarochinee; Chimplee, Kanokporn; Khemplila, Kritsakorn; Kaewpitoon, Natthawut
2016-01-01
Cholangiocarcinoma (CCA) is a serious problem in Thailand, particularly in the northeastern and northern regions. A database of the population at risk is required for monitoring, surveillance, home health care, and home visits. Therefore, this study aimed to develop a geographic information system (GIS) database and Google map of the population at risk of CCA in Mueang Yang district, Nakhon Ratchasima province, northeastern Thailand, during June to October 2015. Populations at risk were screened using the Korat CCA verbal screening test (KCVST). Software included Microsoft Excel, ArcGIS, and Google Maps. The secondary data included village points, sub-district boundaries, district boundaries, and hospital points in Mueang Yang district, which were used to create the spatial database. The populations at risk for CCA and opisthorchiasis were used to create an attribute database. Data were transformed to WGS 84 UTM Zone 48. After the conversion, all of the data were imported into Google Earth using the online web page www.earthpoint.us. Of the 4,800 people at risk for CCA who were screened, 222 constituted a high-risk group. The geo-visual display is available at www.google.com/maps/d/u/0/edit?mid=zPxtcHv_iDLo.kvPpxl5mAs90 and hl=th. The geo-visual display comprises 5 layers: layer 1, village locations and the number of people at risk for CCA; layer 2, sub-district health promotion hospitals in Mueang Yang district and the number of opisthorchiasis cases; layer 3, sub-districts and the number of people at risk for CCA; layer 4, district hospitals, the number of people at risk for CCA and the number of opisthorchiasis cases; and layer 5, the district, the number of people at risk for CCA and the number of opisthorchiasis cases. This GIS database and Google map production process is suitable for further monitoring, surveillance, and home health care for CCA sufferers.
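The coordinate transformation step described above (geographic coordinates to WGS 84 / UTM Zone 48 before building the spatial layers) can be sketched with pyproj. The village coordinate below is a made-up point near the study area, not actual study data, and EPSG:32648 (UTM zone 48N) is assumed as the target system.

```python
from pyproj import Transformer

# Transform geographic WGS 84 coordinates (EPSG:4326) to WGS 84 / UTM zone 48N
# (EPSG:32648), the projected system used for the study area.
to_utm48 = Transformer.from_crs("EPSG:4326", "EPSG:32648", always_xy=True)

lon, lat = 102.90, 15.43          # made-up village location near Mueang Yang district
easting, northing = to_utm48.transform(lon, lat)
print(f"UTM 48N easting={easting:.1f} m, northing={northing:.1f} m")
```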
Frost, Rachael; Levati, Sara; McClurg, Doreen; Brady, Marian; Williams, Brian
2017-06-01
To systematically review methods for measuring adherence used in home-based rehabilitation trials and to evaluate their validity, reliability, and acceptability. In phase 1 we searched the CENTRAL database, NHS Economic Evaluation Database, and Health Technology Assessment Database (January 2000 to April 2013) to identify adherence measures used in randomized controlled trials of allied health professional home-based rehabilitation interventions. In phase 2 we searched the databases of MEDLINE, Embase, CINAHL, Allied and Complementary Medicine Database, PsycINFO, CENTRAL, ProQuest Nursing and Allied Health, and Web of Science (inception to April 2015) for measurement property assessments for each measure. Studies assessing the validity, reliability, or acceptability of adherence measures. Two reviewers independently extracted data on participant and measure characteristics, measurement properties evaluated, evaluation methods, and outcome statistics and assessed study quality using the COnsensus-based Standards for the selection of health Measurement INstruments checklist. In phase 1 we included 8 adherence measures (56 trials). In phase 2, from the 222 measurement property assessments identified in 109 studies, 22 high-quality measurement property assessments were narratively synthesized. Low-quality studies were used as supporting data. StepWatch Activity Monitor validly and acceptably measured short-term step count adherence. The Problematic Experiences of Therapy Scale validly and reliably assessed adherence to vestibular rehabilitation exercises. Adherence diaries had moderately high validity and acceptability across limited populations. The Borg 6 to 20 scale, Bassett and Prapavessis scale, and Yamax CW series had insufficient validity. Low-quality evidence supported use of the Joint Protection Behaviour Assessment. Polar A1 series heart monitors were considered acceptable by 1 study. Current rehabilitation adherence measures are limited. Some possess promising validity and acceptability for certain parameters of adherence, situations, and populations and should be used in these situations. Rigorous evaluation of adherence measures in a broader range of populations is needed. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
PEP725 Pan European Phenological Database
NASA Astrophysics Data System (ADS)
Koch, Elisabeth; Adler, Silke; Ungersböck, Markus; Zach-Hermann, Susanne
2010-05-01
Europe is in the fortunate situation that it has a long tradition in phenological networking: the history of collecting phenological data and using them in climatology has its starting point in 1751, when Carl von Linné outlined in his work Philosophia Botanica methods for compiling annual plant calendars of leaf opening, flowering, fruiting and leaf fall together with climatological observations "so as to show how areas differ". The Societas Meteorologicae Palatinae at Mannheim, well known for its first Europe-wide meteorological network, also established a phenological network which was active from 1781 to 1792. More recently, in most European countries, phenological observations have been carried out routinely for more than 50 years by different governmental and non-governmental organisations following different observation guidelines, with the data stored in different places and formats. This has hampered pan-European studies, as one has to address many National Observation Programs (NOP) to get access to the data before one can start to bring them into a uniform style. From 2004 to 2005 the COST Action 725 was running, with the main objective to establish a European reference data set of phenological observations that can be used for climatological purposes, especially climate monitoring and detection of changes. So far the common database/reference data set of COST 725 comprises 7,687,248 records from 7,285 observation sites in 15 countries and the International Phenological Gardens (IPG), spanning the timeframe from 1951 to 2000. ZAMG is hosting the database. In January 2010 PEP725 started; it will not only maintain and update the database, but also bring in phenological data from before 1951, develop better quality-checking procedures and ensure open access to the database. An attractive webpage will make phenology and climate impacts on vegetation more visible to the public, enabling monitoring of vegetation development.
NASA Astrophysics Data System (ADS)
Truckenbrodt, Sina C.; Schmullius, Christiane C.
2018-03-01
Ground reference data are a prerequisite for the calibration, update, and validation of retrieval models facilitating the monitoring of land parameters based on Earth Observation data. Here, we describe the acquisition of a comprehensive ground reference database which was created to test and validate the recently developed Earth Observation Land Data Assimilation System (EO-LDAS) and products derived from remote sensing observations in the visible and infrared range. In situ data were collected for seven crop types (winter barley, winter wheat, spring wheat, durum, winter rape, potato, and sugar beet) cultivated on the agricultural Gebesee test site, central Germany, in 2013 and 2014. The database contains information on hyperspectral surface reflectance factors, the evolution of biophysical and biochemical plant parameters, phenology, surface conditions, atmospheric states, and a set of ground control points. Ground reference data were gathered at an approximately weekly resolution and on different spatial scales to investigate variations within and between acreages. In situ data collected less than 1 day apart from satellite acquisitions (RapidEye, SPOT 5, Landsat-7 and -8) with a cloud coverage ≤ 25 % are available for 10 and 15 days in 2013 and 2014, respectively. The measurements show that the investigated growing seasons were characterized by distinct meteorological conditions causing interannual variations in the parameter evolution. Here, the experimental design of the field campaigns, and methods employed in the determination of all parameters, are described in detail. Insights into the database are provided and potential fields of application are discussed. The data will contribute to a further development of crop monitoring methods based on remote sensing techniques. The database is freely available at PANGAEA (https://doi.org/10.1594/PANGAEA.874251).
Privacy-preserving search for chemical compound databases.
Shimizu, Kana; Nuida, Koji; Arai, Hiromi; Mitsunari, Shigeo; Attrapadung, Nuttapong; Hamada, Michiaki; Tsuda, Koji; Hirokawa, Takatsugu; Sakuma, Jun; Hanaoka, Goichiro; Asai, Kiyoshi
2015-01-01
Searching for similar compounds in a database is the most important process for in-silico drug screening. Since a query compound is an important starting point for the new drug, a query holder, who is afraid of the query being monitored by the database server, usually downloads all the records in the database and uses them in a closed network. However, a serious dilemma arises when the database holder also wants to output no information except for the search results, and such a dilemma prevents the use of many important data resources. In order to overcome this dilemma, we developed a novel cryptographic protocol that enables database searching while keeping both the query holder's privacy and database holder's privacy. Generally, the application of cryptographic techniques to practical problems is difficult because versatile techniques are computationally expensive while computationally inexpensive techniques can perform only trivial computation tasks. In this study, our protocol is successfully built only from an additive-homomorphic cryptosystem, which allows only addition performed on encrypted values but is computationally efficient compared with versatile techniques such as general purpose multi-party computation. In an experiment searching ChEMBL, which consists of more than 1,200,000 compounds, the proposed method was 36,900 times faster in CPU time and 12,000 times as efficient in communication size compared with general purpose multi-party computation. We proposed a novel privacy-preserving protocol for searching chemical compound databases. The proposed method, easily scaling for large-scale databases, may help to accelerate drug discovery research by making full use of unused but valuable data that includes sensitive information.
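The core idea above, computing a similarity-related quantity on an encrypted fingerprint with an additive-homomorphic cryptosystem, can be illustrated with the python-paillier (`phe`) library. This is only a stand-in for the authors' protocol: it shows how a server can aggregate encrypted query bits against its own fingerprint and return an encrypted count of common bits without ever seeing the query in plaintext. The fingerprints are toy 8-bit examples.

```python
from phe import paillier   # python-paillier: additive-homomorphic Paillier cryptosystem

# Client side: generate a key pair and encrypt each bit of the query fingerprint.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)
query_bits = [1, 0, 1, 1, 0, 1, 0, 0]                      # toy 8-bit fingerprint
encrypted_query = [public_key.encrypt(b) for b in query_bits]

# Server side: sees only the public key and the ciphertexts. Summing the encrypted
# query bits at the positions where the server's own fingerprint has a 1 yields an
# encryption of the number of common "on" bits (additive homomorphism).
db_bits = [1, 1, 1, 0, 0, 1, 1, 0]
encrypted_common = sum(eb for eb, b in zip(encrypted_query, db_bits) if b)

# Client side: decrypt the aggregate. The server never learns the query bits,
# and the client learns only this count.
print(private_key.decrypt(encrypted_common))                # -> 3
```

The actual protocol in the paper builds a full similarity search on top of this kind of encrypted aggregation; the sketch only shows the homomorphic building block.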
The Pacific Coast Ecosystem Information System (PCEIS) is a database that provides biological, ecological and geospatial information for over 8100 species from Alaska to Baja. PCEIS goes beyond capturing species’ taxonomic information by integrating monitoring information from Co...
Computer Ethics Topics and Teaching Strategies.
ERIC Educational Resources Information Center
DeLay, Jeanine A.
An overview of six major issues in computer ethics is provided in this paper: (1) unauthorized and illegal database entry, surveillance and monitoring, and privacy issues; (2) piracy and intellectual property theft; (3) equity and equal access; (4) philosophical implications of artificial intelligence and computer rights; (5) social consequences…
DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)
NASA Technical Reports Server (NTRS)
Keith, B.
1994-01-01
Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components; the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer. All of DAMT's components are independent, asynchronously executing processes. The independent processes communicate with each other via UNIX sockets through a Virtual Path router, or Switcher. The Switcher maintains a routing table showing the host of each component process of the tool, eliminating the need for each process to do so. The Central Monitor Complex provides the single application program interface (API) to the user and coordinates the activities of DAMT. The Central Monitor Complex is itself divided into independent objects that perform its functions. The component objects are the Central Monitor, the Process Locator, the Circuit Locator, and the Traffic Reporter. Each of these objects is an independent, asynchronously executing process. User requests to the tool are interpreted by the Central Monitor. The Process Locator identifies whether a named process is running on a monitored host and which host that is. The circuit between any two processes in the distributed application is identified using the Circuit Locator. The Traffic Reporter handles communication with the LAN Analyzer and accumulates traffic updates until it must send a traffic report to the user. The Remote Monitor process is replicated on each monitored host. It serves the Central Monitor Complex processes with application process information. The Remote Monitor process provides access to operating systems information about currently executing processes. It allows the Process Locator to find processes and the Circuit Locator to identify circuits between processes. 
It also provides lifetime information about currently monitored processes. The LAN Analyzer consists of two processes. Low-level monitoring is handled by the Sniffer. The Sniffer analyzes the raw data on a single, physical LAN. It responds to commands from the Analyzer process, which maintains the interface to the Traffic Reporter and keeps track of which circuits to monitor. DAMT is written in C-language for HP-9000 series computers running HP-UX and Sun 3 and 4 series computers running SunOS. DAMT requires 1Mb of disk space and 4Mb of RAM for execution. This package requires MIT's X Window System, Version 11 Revision 4, with OSF/Motif 1.1. The HP-9000 version (GSC-13589) includes sample HP-9000/375 and HP-9000/730 executables which were compiled under HP-UX, and the Sun version (GSC-13559) includes sample Sun3 and Sun4 executables compiled under SunOS. The standard distribution medium for the HP version of DAMT is a .25 inch HP pre-formatted streaming magnetic tape cartridge in UNIX tar format. It is also available on a 4mm magnetic tape in UNIX tar format. The standard distribution medium for the Sun version of DAMT is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. DAMT was developed in 1992.
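A very rough modern analogue of the Process Locator idea (finding a named process on a host through the operating system's process information, without the target's cooperation) can be sketched with the psutil library. This is not DAMT's implementation, which queried the UNIX process database directly in C; it is only an illustration of the same concept.

```python
import psutil

def locate_processes(name_fragment):
    """Return (pid, name) pairs of running processes whose name contains the fragment,
    using only the operating system's process table (no cooperation required)."""
    matches = []
    for proc in psutil.process_iter(attrs=["pid", "name"]):
        try:
            if name_fragment.lower() in (proc.info["name"] or "").lower():
                matches.append((proc.info["pid"], proc.info["name"]))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue    # process ended or is not readable; skip it
    return matches

print(locate_processes("python"))
```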
Menditto, Enrica; Bolufer De Gea, Angela; Cahir, Caitriona; Marengoni, Alessandra; Riegler, Salvatore; Fico, Giuseppe; Costa, Elisio; Monaco, Alessandro; Pecorelli, Sergio; Pani, Luca; Prados-Torres, Alexandra
2016-01-01
Computerized health care databases have been widely described as an excellent opportunity for research. The availability of "big data" has brought about a wave of innovation in projects conducting health services research. Most of the available secondary data sources are restricted to the geographical scope of a given country and are heterogeneous in structure and content. Under the umbrella of the European Innovation Partnership on Active and Healthy Ageing, collaborative work conducted by the partners of the group on "adherence to prescription and medical plans" identified the use of observational and large-population databases to monitor medication-taking behavior in the elderly. This article describes the methodology used to gather the information from available databases among the Adherence Action Group partners with the aim of improving data sharing on a European level. A total of six databases belonging to three different European countries (Spain, Republic of Ireland, and Italy) were included in the analysis. Preliminary results suggest that there are some similarities. However, these results should be applied in different contexts and European countries, supporting the idea that large European studies should be designed in order to get the most out of already available databases. PMID:27358570
NASA Astrophysics Data System (ADS)
Ferré, Hélène; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Cloché, Sophie; Descloitres, Jacques; Fleury, Laurence; Focsa, Loredana; Henriot, Nicolas; Mière, Arnaud; Ramage, Karim; Vermeulen, Anne; Boulanger, Damien
2015-04-01
The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long-term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. Therefore ChArMEx scientists produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, the distribution system and services, such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the ICARE, IPSL and OMP data centers and has been set up in the framework of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. ChArMEx data, either produced or used by the project, are documented and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. The website offers the usual but user-friendly functionalities: data catalog, user registration procedure, search tool to select and access data, etc. The metadata (data descriptions) are standardized and comply with international standards (ISO 19115-19139; INSPIRE European Directive; Global Change Master Directory Thesaurus). A Digital Object Identifier (DOI) assignment procedure makes it possible to automatically register the datasets, in order to make them easier to access, cite, reuse and verify. At present, the ChArMEx database contains about 120 datasets, including more than 80 in situ datasets (2012, 2013 and 2014 summer campaigns, the background monitoring station of Ersa, etc.), 25 model output sets (dust model intercomparison, MEDCORDEX scenarios, etc.), and a high-resolution emission inventory over the Mediterranean. Many in situ datasets have been inserted into a relational database, in order to enable more accurate selection and download of different datasets in a shared format. Many dedicated satellite products (SEVIRI, TRMM, PARASOL, etc.) are processed and will soon be accessible through the database website. In order to meet the operational needs of the airborne and ground-based observational teams during the ChArMEx campaigns, a day-to-day chart display website has been developed and operated: http://choc.sedoo.org. It offers a convenient way to browse weather conditions and chemical composition during the campaign periods. Every scientist is invited to visit the ChArMEx websites, to register and to request data. Feel free to contact charmex-database@sedoo.fr for any question.
NASA Astrophysics Data System (ADS)
Ferré, Hélène; Descloitres, Jacques; Fleury, Laurence; Boichard, Jean-Luc; Brissebrat, Guillaume; Focsa, Loredana; Henriot, Nicolas; Mastrorillo, Laurence; Mière, Arnaud; Vermeulen, Anne
2013-04-01
The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long-term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. ChArMEx scientists therefore produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, distribution system and services such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the OMP and ICARE data centres and falls within the scope of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. All the data produced by or of interest for the ChArMEx community will be documented in the data catalogue and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. The database website offers different tools: - A registration procedure which enables any scientist to accept the data policy and apply for a user database account. - Forms to document observations or products that will be provided to the database in compliance with international metadata standards (ISO 19115-19139; INSPIRE; Global Change Master Directory Thesaurus). - A search tool to browse the catalogue using thematic, geographic and/or temporal criteria. - Sorted lists of the datasets by thematic keywords, by measured parameters, by instruments or by platform type. - A shopping-cart web interface to order in situ data files. At present, datasets from the background monitoring station of Ersa, Cape Corsica and from the 2012 ChArMEx pre-campaign are available. - User-friendly access to satellite products (SEVIRI, TRMM, PARASOL...) stored in the ICARE data archive using the OPeNDAP protocol. The website will soon propose new facilities. In particular, many in situ datasets will be homogenized and inserted in a relational database, in order to enable more accurate data selection and download of different datasets in a shared format. In order to meet the operational needs of the airborne and ground-based observational teams during the ChArMEx 2012 pre-campaign and the 2013 experiment, a day-to-day quick look and report display website has been developed too: http://choc.sedoo.org. It offers a convenient way to browse weather conditions and chemical composition during the campaign periods.
NASA Astrophysics Data System (ADS)
Ilavajhala, S.; Davies, D.; Schmaltz, J. E.; Wong, M.; Murphy, K. J.
2013-12-01
The NASA Fire Information for Resource Management System (FIRMS) is at the forefront of providing global near real-time (NRT) MODIS thermal anomalies / hotspot location data to end-users. FIRMS serves the data via an interactive Web GIS named Web Fire Mapper, downloads of NRT active fire data, archived MODIS hotspot data downloads dating back to 1999, and a hotspot email alert system. The FIRMS Email Alerts system has been successfully alerting users of fires in their area of interest in near real-time and/or via daily and weekly email summaries, with an option to receive MODIS hotspot data as a text file (CSV) attachment. Currently, there are more than 7000 email alert subscriptions from more than 100 countries. Specifically, the email alerts system is designed to generate and send an email alert for any region or area on the globe, with a special focus on providing alerts for protected areas worldwide. For many protected areas, email alerts are particularly useful for early fire detection, monitoring ongoing fires, as well as allocating resources to protect wildlife and natural resources of particular value. For protected areas, FIRMS uses the World Database on Protected Areas (WDPA) supplied by the United Nations Environment Programme - World Conservation Monitoring Centre (UNEP-WCMC). Maintaining the most up-to-date, accurate boundary geometry for the protected areas for the email alerts is a challenge, as the WDPA is continuously updated due to changing boundaries and the merging or delisting of certain protected areas. Because of this dynamic nature of the protected areas database, the FIRMS protected areas database is frequently out-of-date with respect to the most current version of the WDPA database. To maintain the most up-to-date boundary information for protected areas and to be in compliance with the WDPA terms and conditions, FIRMS needs to constantly update its database of protected areas. Currently, FIRMS strives to keep its database up to date by downloading the most recent WDPA database at regular intervals, processing it, and ingesting it into the FIRMS spatial database. However, due to the large size of the database, the process to download, process and ingest it is quite time-consuming. The FIRMS team is currently working on developing a method to update the protected areas database via the web at regular intervals or on demand. Using such a solution, FIRMS will be able to access the most up-to-date extents of any protected area and the corresponding spatial geometries in real time. As such, FIRMS can use this service to access the protected areas and their associated geometries, keep users' protected area boundaries in sync with those of the most recent WDPA database, and thus serve more accurate email alerts to users. Furthermore, any client accessing the WDPA protected areas database could potentially use this real-time access solution. This talk primarily focuses on the challenges for FIRMS in sending accurate email alerts for protected areas, along with the solution the FIRMS team is developing. This talk also introduces the FIRMS fire information system and its components, with a special emphasis on the FIRMS email alerts system.
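As an illustration of the area-of-interest check at the heart of such an alerting service, the sketch below flags hotspots that fall inside a subscriber's bounding box. It is a minimal example, not the FIRMS implementation: the CSV file name, column names and coordinates are hypothetical, and a real service would test against full WDPA polygon geometries rather than a rectangle.

```python
# Minimal sketch (not the FIRMS implementation): flag near-real-time hotspots
# that fall inside a subscriber's rectangular area of interest (AOI).
# File name and column names are hypothetical placeholders.
import csv

def hotspots_in_aoi(csv_path, lat_min, lat_max, lon_min, lon_max):
    """Return hotspot rows whose coordinates fall inside the AOI bounding box."""
    alerts = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            lat, lon = float(row["latitude"]), float(row["longitude"])
            if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
                alerts.append(row)
    return alerts

if __name__ == "__main__":
    # Example: a hypothetical protected-area bounding box
    hits = hotspots_in_aoi("modis_hotspots_nrt.csv", -3.5, -2.0, 29.0, 30.5)
    print(f"{len(hits)} hotspots inside the area of interest")
```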
Watson, Aaron M; Foster Thompson, Lori; Rudolph, Jane V; Whelan, Thomas J; Behrend, Tara S; Gissel, Amanda L
2013-07-01
Web-based training is frequently used by organizations as a convenient and low-cost way to teach employees new knowledge and skills. As web-based training is typically unproctored, employees may be held accountable to the organization by computer software that monitors their behaviors. The current study examines how the introduction of electronic performance monitoring may provoke negative emotional reactions and decrease learning among certain types of e-learners. Through motivated action theory and trait activation theory, we examine the role of performance goal orientation when e-learners are exposed to asynchronous and synchronous monitoring. We show that some e-learners are more susceptible than others to evaluation apprehension when they perceive their activities are being monitored electronically. Specifically, e-learners higher in avoid performance goal orientation exhibited increased evaluation apprehension if they believed asynchronous monitoring was present, and they showed decreased skill attainment as a result. E-learners higher on prove performance goal orientation showed greater evaluation apprehension if they believed real-time monitoring was occurring, resulting in decreased skill attainment. PsycINFO Database Record (c) 2013 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Neidhardt, Alexander; Kirschbauer, Katharina; Plötz, Christian; Schönberger, Matthias; Böer, Armin; Wettzell VLBI Team
2016-12-01
A first test implementation of an auxiliary data archive is being evaluated at the Geodetic Observatory Wettzell. The software builds on the Wettzell SysMon, extending its database and data sensors with the functionality of a professional monitoring environment, Zabbix. Some extensions to the remote control server on the NASA Field System PC enable the inclusion of data from external antennas. The presentation demonstrates the implementation and discusses the current possibilities, in order to encourage other antennas to join the auxiliary archive.
Bucchi, L; Pierri, C; Caprara, L; Cortecchia, S; De Lillo, M; Bondi, A
2003-02-01
This paper presents a computerised system for the monitoring of integrated cervical screening, i.e. the integration of spontaneous Pap smear practice into organised screening. The general characteristics of the system are described, including background and rationale (integrated cervical screening in European countries, impact of integration on monitoring, decentralised organisation of screening and levels of monitoring), general methods (definitions, sections, software description, and setting of application), and indicators of participation (distribution by time interval since previous Pap smear, distribution by screening sector--organised screening centres vs. public and private clinical settings--, distribution by time interval between the last two Pap smears, and movement of women between the two screening sectors). Also, the paper reports the results of the application of these indicators in the general database of the Pathology Department of the Imola Health District in northern Italy.
Remote Real-Time Monitoring of Subsurface Landfill Gas Migration
Fay, Cormac; Doherty, Aiden R.; Beirne, Stephen; Collins, Fiachra; Foley, Colum; Healy, John; Kiernan, Breda M.; Lee, Hyowon; Maher, Damien; Orpen, Dylan; Phelan, Thomas; Qiu, Zhengwei; Zhang, Kirk; Gurrin, Cathal; Corcoran, Brian; O’Connor, Noel E.; Smeaton, Alan F.; Diamond, Dermot
2011-01-01
The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article we present a cost-effective and efficient system for remotely monitoring landfill subsurface migration of methane and carbon dioxide concentration levels. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months. PMID:22163975
Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F.; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P.; McCrae, John; McCorkindale, Sheila; Leather, David
2016-01-01
Background: The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. Objective: The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. Achievements: The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Conclusion: Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. PMID:27804174
Nonradioactive Ambient Air Monitoring at Los Alamos National Laboratory 2001--2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. Gladney; J. Dewart; C. Eberhart; J. Lochamy
2004-09-01
During the spring of 2000, the Cerro Grande forest fire reached Los Alamos National Laboratory (LANL) and ignited both above-ground vegetation and disposed materials in several landfills. During and after the fire, there was concern about the potential human health impacts from chemicals emitted by the combustion of these Laboratory materials. Consequently, short-term, intensive air-monitoring studies were performed during and shortly after the fire. Unlike the radiological data from many years of AIRNET sampling, LANL did not have an adequate database of nonradiological species under baseline conditions with which to compare data collected during the fire. Therefore, during 2001 the Meteorology and Air Quality Group designed and implemented a new air-monitoring program, entitled NonRadNET, to provide nonradiological background data under normal conditions. The objectives of NonRadNET were to: (1) develop the capability for collecting nonradiological air-monitoring data, (2) conduct monitoring to develop a database of typical background levels of selected nonradiological species in the communities nearest the Laboratory, and (3) determine LANL's potential contribution to nonradiological air pollution in the surrounding communities. NonRadNET ended in late December 2002 with five quarters of data. The purpose of this paper is to organize and describe the NonRadNET data collected over 2001-2002 to use as baseline data, either for monitoring during a fire, some other abnormal event, or routine use. To achieve that purpose, in this paper we will: (1) document the NonRadNET program procedures, methods, and quality management, (2) describe the usual origins and uses of the species measured, (3) compare the species measured to LANL and other area emissions, (4) present the five quarters of data, (5) compare the data to known typical environmental values, and (6) evaluate the data against exposure standards.
Collier, Sue; Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P; McCrae, John; McCorkindale, Sheila; Leather, David
2017-03-01
The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Burns, Lee; Decker, Ryan
2004-01-01
Lightning strike location and peak current are monitored operationally in the Kennedy Space Center (KSC)/Cape Canaveral Air Force Station (CCAFS) area by the Cloud to Ground Lightning Surveillance System (CGLSS). The present study compiles ten years of CGLSS data into a climatological database of all strikes recorded within a 20-mile radius of space shuttle launch platform LP39A, which serves as a convenient central point. The period of record (POR) for the database runs from January 1, 1993 to December 31, 2002. Histograms and cumulative probability curves are produced to determine the distribution of occurrence rates for the spectrum of strike intensities (given in kA). Further analysis of the database provides a description of both seasonal and interannual variations in the lightning distribution.
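The occurrence-rate analysis described above amounts to binning peak currents into intensity classes and accumulating an empirical cumulative probability. A minimal sketch of that computation is shown below; the peak-current array is synthetic, standing in for values that would be read from the CGLSS climatological database.

```python
# Minimal sketch of the occurrence-rate analysis described above:
# histogram and empirical cumulative probability of lightning peak currents (kA).
# The input array is synthetic; CGLSS records would be read from the database instead.
import numpy as np

peak_current_ka = np.abs(np.random.default_rng(0).normal(0.0, 20.0, size=10_000))

# Histogram of strike intensities in 5-kA bins
bins = np.arange(0, 105, 5)
counts, edges = np.histogram(peak_current_ka, bins=bins)

# Empirical cumulative probability curve
sorted_ka = np.sort(peak_current_ka)
cum_prob = np.arange(1, sorted_ka.size + 1) / sorted_ka.size

for lo, hi, c in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:5.0f}-{hi:3.0f} kA: {c:6d} strikes")
print(f"Median peak current: {np.median(sorted_ka):.1f} kA")
```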
Adibuzzaman, Mohammad; DeLaurentis, Poching; Hill, Jennifer; Benneyworth, Brian D
2017-01-01
Recent advances in data collection during routine health care in the form of Electronic Health Records (EHR), medical device data (e.g., infusion pump informatics and physiological monitoring data), and insurance claims data, among others, as well as biological and experimental data, have created tremendous opportunities for biological discoveries for clinical application. However, even with all the advancement in technologies and their promises for discoveries, very few research findings have been translated to clinical knowledge, or more importantly, to clinical practice. In this paper, we identify and present the initial work addressing the relevant challenges in three broad categories: data, accessibility, and translation. These issues are discussed in the context of a widely used, detailed database from an intensive care unit, the Medical Information Mart for Intensive Care (MIMIC III) database.
Open-access MIMIC-II database for intensive care research.
Lee, Joon; Scott, Daniel J; Villarroel, Mauricio; Clifford, Gari D; Saeed, Mohammed; Mark, Roger G
2011-01-01
The critical state of intensive care unit (ICU) patients demands close monitoring, and as a result a large volume of multi-parameter data is collected continuously. This represents a unique opportunity for researchers interested in clinical data mining. We sought to foster a more transparent and efficient intensive care research community by building a publicly available ICU database, namely Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II). The data harnessed in MIMIC-II were collected from the ICUs of Beth Israel Deaconess Medical Center from 2001 to 2008 and represent 26,870 adult hospital admissions (version 2.6). MIMIC-II consists of two major components: clinical data and physiological waveforms. The clinical data, which include patient demographics, intravenous medication drip rates, and laboratory test results, were organized into a relational database. The physiological waveforms, including 125 Hz signals recorded at bedside and corresponding vital signs, were stored in an open-source format. MIMIC-II data were also deidentified in order to remove protected health information. Any interested researcher can gain access to MIMIC-II free of charge after signing a data use agreement and completing human subjects training. MIMIC-II can support a wide variety of research studies, ranging from the development of clinical decision support algorithms to retrospective clinical studies. We anticipate that MIMIC-II will be an invaluable resource for intensive care research by stimulating fair comparisons among different studies.
Extending Glacier Monitoring into the Little Ice Age and Beyond
NASA Astrophysics Data System (ADS)
Nussbaumer, S. U.; Gärtner-Roer, I.; Zemp, M.; Zumbühl, H. J.; Masiokas, M. H.; Espizua, L. E.; Pitte, P.
2011-12-01
Glaciers are among the best natural proxies of climatic changes and, as such, a key variable within the international climate observing system. The worldwide monitoring of glacier distribution and fluctuations has been internationally coordinated for more than a century. Direct measurements of seasonal and annual glacier mass balance are available for the past six decades. Regular observations of glacier front variations have been carried out since the late 19th century. Information on glacier fluctuations before the onset of regular in situ measurements has to be reconstructed from moraines, historical evidence, and a wide range of dating methods. The majority of the corresponding data is not available to the scientific community, which challenges the reproducibility and direct comparison of the results. Here, we present a first approach towards the standardization of reconstructed Holocene glacier front variations as well as the integration of the corresponding data series into the database of the World Glacier Monitoring Service (www.wgms.ch), within the framework of the Global Terrestrial Network for Glaciers (www.gtn-g.org). The concept for the integration of these reconstructed front variations into the relational glacier database of the WGMS was jointly elaborated and tested by experts of both fields (natural and historical sciences), based on reconstruction series of 15 glaciers in Europe (western/central Alps and southern Norway) and 9 in southern South America. The reconstructed front variation series extend the direct measurements of the 20th century by two centuries in Norway and by four in the Alps and in South America. The storage of the records within the international glacier databases guarantees the long-term availability of the data series and increases the visibility of the scientific research which, in historical glaciology, is often the work of a lifetime. The standardized collection of reconstructed glacier front variations from southern Norway, the western Alps and the southern Andes allows a direct comparison between different glaciers. It is a first step towards a worldwide compilation and free dissemination of Holocene glacier fluctuation series within the internationally coordinated glacier monitoring.
Major results of the MAARBLE project
NASA Astrophysics Data System (ADS)
Daglis, Ioannis A.; Bourdarie, Sebastien; Horne, Richard B.; Khotyaintsev, Yuri; Mann, Ian R.; Santolik, Ondrej; Turner, Drew L.; Balasis, Georgios
2016-04-01
The goal of the MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) project was to shed light on the ways the dynamic evolution of the Van Allen belts is influenced by low-frequency electromagnetic waves. MAARBLE was implemented by a consortium of seven institutions (five European, one Canadian and one US) with support from the European Community's Seventh Framework Programme. The MAARBLE project employed multi-spacecraft monitoring of the geospace environment, complemented by ground-based monitoring, in order to analyze and assess the physical mechanisms leading to radiation belt particle energisation and loss. Particular attention was paid to the role of ULF/VLF waves. Within MAARBLE we created a database containing properties of ULF and VLF waves, based on measurements from the Cluster, THEMIS and CHAMP missions and from the CARISMA and IMAGE ground magnetometer networks. The database is now available to the scientific community through the Cluster Science Archive as auxiliary content. Based on the wave database, a statistical model of the wave activity dependent on the level of geomagnetic activity, solar wind forcing, and magnetospheric region has been developed. Multi-spacecraft particle measurements have been incorporated into data assimilation tools, leading to a more accurate estimate of the state of the radiation belts. The synergy of wave and particle observations is at the core of MAARBLE research studies of radiation belt dynamics. Results and conclusions from these studies will be presented in this paper. The MAARBLE (Monitoring, Analyzing and Assessing Radiation Belt Loss and Energization) collaborative research project has received funding from the European Union's Seventh Framework Programme (FP7-SPACE 2011-1) under grant agreement no. 284520. The complete MAARBLE Team: Ioannis A. Daglis, Sebastien Bourdarie, Richard B. Horne, Yuri Khotyaintsev, Ian R. Mann, Ondrej Santolik, Drew L. Turner, Georgios Balasis, Anastasios Anastasiadis, Vassilis Angelopoulos, David Barona, Eleni Chatzichristou, Stavros Dimitrakoudis, Marina Georgiou, Omiros Giannakis, Sarah Glauert, Benjamin Grison, Zuzana Hrbackova, Andy Kale, Christos Katsavrias, Tobias Kersten, Ivana Kolmasova, Didier Lazaro, Eva Macusova, Vincent Maget, Meghan Mella, Nigel Meredith, Fiori-Anastasia Metallinou, David Milling, Louis Ozeke, Constantinos Papadimitriou, George Ropokis, Ingmar Sandberg, Maria Usanova, Iannis Dandouras, David Sibeck, Eftyhia Zesta.
A new, high-resolution global mass coral bleaching database
Rickbeil, Gregory J. M.; Heron, Scott F.
2017-01-01
Episodes of mass coral bleaching have been reported in recent decades and have raised concerns about the future of coral reefs on a warming planet. Despite the efforts to enhance and coordinate coral reef monitoring within and across countries, our knowledge of the geographic extent of mass coral bleaching over the past few decades is incomplete. Existing databases, like ReefBase, are limited by the voluntary nature of contributions, geographical biases in data collection, and the variations in the spatial scale of bleaching reports. In this study, we have developed the first-ever gridded, global-scale historical coral bleaching database. First, we conducted a targeted search for bleaching reports not included in ReefBase by personally contacting scientists and divers conducting monitoring in under-reported locations and by extracting data from the literature. This search increased the number of observed bleaching reports by 79%, from 4146 to 7429. Second, we employed spatial interpolation techniques to develop annual 0.04° × 0.04° latitude-longitude global maps of the probability that bleaching occurred for 1985 through 2010. Initial results indicate that the area of coral reefs with a more likely than not (>50%) or likely (>66%) probability of bleaching was eight times higher in the second half of the assessed time period, after the 1997/1998 El Niño. The results also indicate that annual maximum Degree Heating Weeks, a measure of thermal stress, for coral reefs with a high probability of bleaching increased over time. The database will help the scientific community more accurately assess the change in the frequency of mass coral bleaching events, validate methods of predicting mass coral bleaching, and test whether coral reefs are adjusting to rising ocean temperatures. PMID:28445534
A new, high-resolution global mass coral bleaching database.
Donner, Simon D; Rickbeil, Gregory J M; Heron, Scott F
2017-01-01
Episodes of mass coral bleaching have been reported in recent decades and have raised concerns about the future of coral reefs on a warming planet. Despite the efforts to enhance and coordinate coral reef monitoring within and across countries, our knowledge of the geographic extent of mass coral bleaching over the past few decades is incomplete. Existing databases, like ReefBase, are limited by the voluntary nature of contributions, geographical biases in data collection, and the variations in the spatial scale of bleaching reports. In this study, we have developed the first-ever gridded, global-scale historical coral bleaching database. First, we conducted a targeted search for bleaching reports not included in ReefBase by personally contacting scientists and divers conducting monitoring in under-reported locations and by extracting data from the literature. This search increased the number of observed bleaching reports by 79%, from 4146 to 7429. Second, we employed spatial interpolation techniques to develop annual 0.04° × 0.04° latitude-longitude global maps of the probability that bleaching occurred for 1985 through 2010. Initial results indicate that the area of coral reefs with a more likely than not (>50%) or likely (>66%) probability of bleaching was eight times higher in the second half of the assessed time period, after the 1997/1998 El Niño. The results also indicate that annual maximum Degree Heating Weeks, a measure of thermal stress, for coral reefs with a high probability of bleaching increased over time. The database will help the scientific community more accurately assess the change in the frequency of mass coral bleaching events, validate methods of predicting mass coral bleaching, and test whether coral reefs are adjusting to rising ocean temperatures.
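A gridded bleaching-probability surface of the kind described above can be approximated by interpolating point reports onto a regular latitude-longitude grid. The sketch below uses simple inverse-distance weighting on a 0.04-degree grid; it is illustrative only, is not the interpolation method used by the authors, and the report coordinates are synthetic.

```python
# Minimal sketch, not the authors' method: inverse-distance weighting of point
# bleaching observations (1 = bleached, 0 = not bleached) onto a 0.04-degree grid,
# producing a rough "probability that bleaching occurred" surface for one year.
import numpy as np

def idw_grid(lats, lons, values, lat_range, lon_range, step=0.04, power=2.0):
    glat = np.arange(lat_range[0], lat_range[1], step)
    glon = np.arange(lon_range[0], lon_range[1], step)
    grid = np.zeros((glat.size, glon.size))
    for i, gl in enumerate(glat):
        for j, gn in enumerate(glon):
            d = np.hypot(lats - gl, lons - gn) + 1e-6   # avoid divide-by-zero
            w = 1.0 / d**power
            grid[i, j] = np.sum(w * values) / np.sum(w)
    return glat, glon, grid

# Synthetic example reports (latitude, longitude, bleached flag)
lats = np.array([17.5, 17.6, 17.8])
lons = np.array([-87.9, -87.7, -87.5])
obs = np.array([1.0, 1.0, 0.0])
_, _, prob = idw_grid(lats, lons, obs, (17.4, 17.9), (-88.0, -87.4))
print(prob.round(2))
```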
Methods for Estimating Annual Wastewater Nutrient Loads in the Southeastern United States
McMahon, Gerard; Tervelt, Larinda; Donehoo, William
2007-01-01
This report describes an approach for estimating annual total nitrogen and total phosphorus loads from point-source dischargers in the southeastern United States. Nutrient load estimates for 2002 were used in the calibration and application of a regional nutrient model, referred to as the SPARROW (SPAtially Referenced Regression On Watershed attributes) watershed model. Loads from dischargers permitted under the National Pollutant Discharge Elimination System were calculated using data from the U.S. Environmental Protection Agency Permit Compliance System database and individual state databases. Site information from both state and U.S. Environmental Protection Agency databases, including latitude and longitude and monitored effluent data, was compiled into a project database. For sites with a complete effluent-monitoring record, effluent-flow and nutrient-concentration data were used to develop estimates of annual point-source nitrogen and phosphorus loads. When flow data were available but nutrient-concentration data were missing or incomplete, typical pollutant-concentration values of total nitrogen and total phosphorus were used to estimate load. In developing typical pollutant-concentration values, the major factors assumed to influence wastewater nutrient-concentration variability were the size of the discharger (the amount of flow), the season during which discharge occurred, and the Standard Industrial Classification code of the discharger. One insight gained from this study is that in order to gain access to flow, concentration, and location data, close communication and collaboration are required with the agencies that collect and manage the data. In addition, the accuracy and usefulness of the load estimates depend on the willingness of the states and the U.S. Environmental Protection Agency to provide guidance and review for at least a subset of the load estimates that may be problematic.
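The core of the load estimation described above is a flow-times-concentration calculation, with a typical pollutant concentration substituted when monitored concentrations are missing. The sketch below illustrates that arithmetic; the flow and concentration values are illustrative, not taken from the report.

```python
# Minimal sketch of the load calculation described above: annual nutrient load
# as effluent flow times concentration, falling back to a "typical pollutant
# concentration" when monitored concentration data are missing. Numbers are illustrative.
LITERS_PER_MGAL = 3.785e6          # litres per million gallons
MG_PER_KG = 1e6

def annual_load_kg(mean_flow_mgd, mean_conc_mg_per_l, typical_conc_mg_per_l):
    """Annual load (kg/yr) from mean daily flow (million gallons/day) and
    mean concentration (mg/L); uses the typical concentration if none is monitored."""
    conc = mean_conc_mg_per_l if mean_conc_mg_per_l is not None else typical_conc_mg_per_l
    litres_per_year = mean_flow_mgd * LITERS_PER_MGAL * 365.0
    return litres_per_year * conc / MG_PER_KG

# Monitored site vs. a site with missing total-nitrogen concentrations
print(annual_load_kg(2.5, 12.0, typical_conc_mg_per_l=15.0))   # monitored concentration
print(annual_load_kg(0.8, None, typical_conc_mg_per_l=15.0))   # typical value used
```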
XTCE (XML Telemetric and Command Exchange) Standard Making It Work at NASA. Can It Work For You?
NASA Technical Reports Server (NTRS)
Munoz-Fernandez, Michela; Smith, Danford S.; Rice, James K.; Jones, Ronald A.
2017-01-01
The XML Telemetric and Command Exchange (XTCE) standard is intended as a way to describe telemetry and command databases to be exchanged across centers and space agencies. XTCE usage has the potential to lead to consolidation of the Mission Operations Center (MOC) Monitor and Control displays for mission cross-support, reducing equipment and configuration costs, as well as a decrease in the turnaround time for telemetry and command modifications during all the mission phases. The adoption of XTCE will reduce software maintenance costs by reducing the variation between our existing mission dictionaries. The main objective of this poster is to show how powerful XTCE is in terms of interoperability across centers and missions. We will provide results for a use case where two centers can use their local tools to process and display the same mission telemetry in their MOC independently of one another. In our use case we have first quantified the ability for XTCE to capture the telemetry definitions of the mission by use of our suite of support tools (Conversion, Validation, and Compliance measurement). The next step was to show processing and monitoring of the same telemetry in two mission centers. Once the database was converted to XTCE using our tool, the XTCE file became our primary database and was shared among the various tool chains through their XTCE importers and ultimately configured to ingest the telemetry stream and display or capture the telemetered information in similar ways. Summary results include the ability to take a real mission database and real mission telemetry and display them on various tools from two centers, as well as using freely available COTS tools.
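To give a concrete sense of what an XTCE telemetry definition looks like to a consuming tool, the sketch below parses parameter names and type references from a small, simplified XTCE-like document. The embedded XML and the namespace URI are illustrative placeholders, not an excerpt from a real mission database.

```python
# Minimal sketch (simplified, not a complete XTCE implementation): read parameter
# names and type references from a small XTCE-like telemetry definition.
# The embedded XML and namespace URI are illustrative only.
import xml.etree.ElementTree as ET

XTCE_NS = {"xtce": "http://www.omg.org/space/xtce"}

sample = """<xtce:SpaceSystem name="DemoSat" xmlns:xtce="http://www.omg.org/space/xtce">
  <xtce:TelemetryMetaData>
    <xtce:ParameterSet>
      <xtce:Parameter name="BatteryVoltage" parameterTypeRef="FloatType"/>
      <xtce:Parameter name="Mode" parameterTypeRef="EnumType"/>
    </xtce:ParameterSet>
  </xtce:TelemetryMetaData>
</xtce:SpaceSystem>"""

root = ET.fromstring(sample)
for p in root.findall(".//xtce:Parameter", XTCE_NS):
    print(p.get("name"), "->", p.get("parameterTypeRef"))
```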
Classification of time series patterns from complex dynamic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
Example of monitoring measurements in a virtual eye clinic using 'big data'.
Jones, Lee; Bryan, Susan R; Miranda, Marco A; Crabb, David P; Kotecha, Aachal
2017-10-26
To assess the equivalence of measurement outcomes between patients attending a standard glaucoma care service, where patients see an ophthalmologist in a face-to-face setting, and a glaucoma monitoring service (GMS). The average mean deviation (MD) measurement on the visual field (VF) test for 250 patients attending a GMS was compared with a 'big data' repository of patients attending a standard glaucoma care service (reference database). In addition, the speed of VF progression between GMS patients and reference database patients was compared. Reference database patients were used to create expected outcomes that GMS patients could be compared with. For GMS patients falling outside of the expected limits, further analysis was carried out on the clinical management decisions for these patients. The average MD of patients in the GMS ranged from +1.6 dB to -18.9 dB between two consecutive appointments at the clinic. In the first analysis, 12 (4.8%; 95% CI 2.5% to 8.2%) GMS patients scored outside the 90% expected values based on the reference database. In the second analysis, 1.9% (95% CI 0.4% to 5.4%) GMS patients had VF changes outside of the expected 90% limits. Using 'big data' collected in the standard glaucoma care service, we found that patients attending a GMS have equivalent outcomes on the VF test. Our findings provide support for the implementation of virtual healthcare delivery in the hospital eye service. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
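The comparison described above reduces to deriving expected limits from the reference database and flagging monitoring-service patients who fall outside them. The sketch below shows one way such 90% limits could be computed from a reference distribution of mean-deviation change; all values are synthetic and this is not the study's analysis code.

```python
# Minimal sketch, not the study's analysis code: derive 90% expected limits for
# visual-field mean-deviation (MD) change from a reference database and flag
# monitoring-service patients who fall outside them. Values are synthetic.
import numpy as np

rng = np.random.default_rng(1)
reference_md_change = rng.normal(-0.1, 1.0, size=5000)       # dB change between visits
lower, upper = np.percentile(reference_md_change, [5, 95])   # central 90% band

gms_md_change = np.array([0.3, -2.8, 0.1, 1.9, -0.4])
outside = (gms_md_change < lower) | (gms_md_change > upper)
print(f"90% limits: {lower:.2f} to {upper:.2f} dB")
print(f"{outside.sum()} of {gms_md_change.size} patients outside expected limits")
```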
NASA Technical Reports Server (NTRS)
Rochon, Gilbert L.
1989-01-01
Parameters were described for a spatial database to facilitate drought monitoring and famine early warning in the African Sahel. The proposed system, referred to as the African Drought and Famine Information System (ADFIS), is ultimately recommended for implementation with the NASA/FEMA Spatial Analysis and Modeling System (SAMS), a GIS/Dynamic Modeling software package currently under development. SAMS is derived from FEMA's Integrated Emergency Management Information System (IEMIS) and the Pacific Northwest Laboratory's/Engineering Topographic Laboratory's Airland Battlefield Environment (ALBE) GIS. SAMS is primarily intended for disaster planning and resource management applications within developing countries. Sources of data for the system would include the Developing Economics Branch of the U.S. Dept. of Agriculture, the World Bank, Tulane University School of Public Health and Tropical Medicine's Famine Early Warning Systems (FEWS) Project, the USAID's Foreign Disaster Assistance Section, the World Resources Institute, the World Meteorological Institute, the USGS, the UNFAO, UNICEF, and the United Nations Disaster Relief Organization (UNDRO). Satellite imagery would include decadal AVHRR imagery and Normalized Difference Vegetation Index (NDVI) values from 1981 to the present for the African continent and selected Landsat scenes for the Sudan pilot study. The system is initially conceived for the MicroVAX 2/GPX, running VMS. To facilitate comparative analysis, a global time-series database (1950 to 1987) is included for a basic set of 125 socio-economic variables per country per year. A more detailed database for the Sahelian countries includes soil type, water resources, agricultural production, agricultural import and export, food aid, and consumption. A pilot dataset for the Sudan with over 2,500 variables from the World Bank's ANDREX system also includes epidemiological data on incidence of kwashiorkor, marasmus, other nutritional deficiencies, and synergistically-related infectious diseases.
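The NDVI values mentioned above are derived from a simple band ratio, NDVI = (NIR - Red) / (NIR + Red). The sketch below shows that per-pixel computation; the reflectance arrays are synthetic placeholders for AVHRR channel data.

```python
# Minimal sketch of the NDVI index underlying the AVHRR vegetation dataset
# mentioned above: NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
# Reflectance arrays here are synthetic placeholders for AVHRR channels 2 and 1.
import numpy as np

nir = np.array([[0.45, 0.50], [0.30, 0.20]])   # near-infrared reflectance
red = np.array([[0.10, 0.12], [0.18, 0.19]])   # visible red reflectance

ndvi = (nir - red) / (nir + red + 1e-9)        # small epsilon avoids divide-by-zero
print(ndvi.round(2))                            # ~0.2-0.6 vegetated, near 0 bare soil
```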
A WebGIS system on the base of satellite data processing system for marine application
NASA Astrophysics Data System (ADS)
Gong, Fang; Wang, Difeng; Huang, Haiqing; Chen, Jianyu
2007-10-01
From 2002 to 2004, a satellite data processing system for marine applications was built up in the State Key Laboratory of Satellite Ocean Environment Dynamics (Second Institute of Oceanography, State Oceanic Administration). The system received satellite data from TERRA, AQUA, NOAA-12/15/16/17/18 and FY-1D, and automatically generated Level 3 and Level 4 products (single-orbit products and merged multi-orbit products) derived from Level 0 data, controlled by an operational control sub-system. Currently, the products created by this system play an important role in marine environment monitoring, disaster monitoring and research. A distribution platform has now been developed on this foundation, namely a WebGIS system for querying and browsing oceanic remote sensing data. This system is based on a large database system, Oracle, and in addition makes use of the ArcSDE spatial database engine and other middleware to perform database operations. The J2EE framework was adopted as the development model, with the Oracle 9.2 DBMS as the back-end database and server. Using standard browsers (such as IE 6.0), users can visit and browse the public service information provided by the system, including browsing of oceanic remote sensing data, zooming, panning, refreshing, roaming, further data queries, attribute searches, data download, etc. The system is still under test. Such a system will become an important distribution platform for Chinese satellite ocean environment products by topic and category (including sea surface temperature, chlorophyll concentration, and so on), improving the utilization of satellite products and promoting data sharing and oceanic remote sensing research.
HISTORY OF TROPOSPHERIC OZONE FOR THE SAN BERNARDINO MOUNTAINS OF SOUTHERN CALIFORNIA, 1963-1999
A historical database of hourly O3 concentrations for Crestline, California in 1963-1999 has been developed based on all available representative oxidant/ozone monitoring data taken since 1963. All data were obtained from the California Air Resources Board and the U.S. Departmen...
75 FR 45485 - Determination of Attainment for PM10
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-03
... reports for years 2007, 2008, and 2009. Furthermore, we concluded in our Technical System Audit Report...://www.regulations.gov or e-mail. http://www.regulations.gov is an ``anonymous access'' system, and EPA... System (AQS) database. Data from air monitors operated by State/local/tribal agencies in compliance with...
A GIS APPROACH TO IDENTIFY AND CLASSIFY HYDROGEOMORPHIC TYPES OF COASTAL WETLANDS OF THE GREAT LAKES
There is a need by Great Lakes managers to have a comprehensive inventory of the coastal wetland resources for monitoring and assessment. An electronic database and geographic information system (GIS) point coverage of coastal wetland locations along the U.S. shoreline have been ...
SPATIALLY-BALANCED SURVEY DESIGN FOR GROUNDWATER USING EXISTING WELLS
Many states have a monitoring program to evaluate the water quality of groundwater across the state. These programs rely on existing wells for access to the groundwater, due to the high cost of drilling new wells. Typically, a state maintains a database of all well locations, in...
ERIC Educational Resources Information Center
van den Bosch, Roxette M.; Espin, Christine A.; Chung, Siuman; Saab, Nadira
2017-01-01
Teachers have difficulty using data from Curriculum-based Measurement (CBM) progress graphs of students with learning difficulties for instructional decision-making. As a first step in unraveling those difficulties, we studied teachers' comprehension of CBM graphs. Using think-aloud methodology, we examined 23 teachers' ability to…
Code of Federal Regulations, 2013 CFR
2013-07-01
... continuous monitors must be submitted to the Administrator at the appropriate address as shown in 40 CFR 60.4. (b) The following information must be reported to the Administrator for each 30 operating day period...FIRE database by using the Compliance and Emissions Data Reporting Interface (CEDRI) that is accessed...
Code of Federal Regulations, 2014 CFR
2014-07-01
... continuous monitors must be submitted to the Administrator at the appropriate address as shown in 40 CFR 60.4. (b) The following information must be reported to the Administrator for each 30 operating day period...FIRE database by using the Compliance and Emissions Data Reporting Interface (CEDRI) that is accessed...
The Unexplored Impact of IPv6 on Intrusion Detection Systems
2012-03-01
of cross-NIDS, standardized, rule sets such as SNORT’s VRT [23]. • Continuously monitor vulnerability or exploit development sites. For example, the...and BRO policies should be written to enhance detection. The bolstering of built-in databases and repositories such as VRT [23] for specific IPv6 issues
NASA Astrophysics Data System (ADS)
Ferré, Helene; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Descloitres, Jacques; Fleury, Laurence; Focsa, Loredana; Henriot, Nicolas; Mastrorillo, Laurence; Mière, Arnaud; Vermeulen, Anne
2014-05-01
The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long-term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. ChArMEx scientists therefore produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, distribution system and services, such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the OMP and ICARE data centres and has been set up in the framework of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. All the data produced by or of interest for the ChArMEx community will be documented in the data catalogue and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. At present, the ChArMEx database contains about 75 datasets, including 50 in situ datasets (2012 and 2013 campaigns, Ersa background monitoring station), 25 model outputs (dust model intercomparison, MEDCORDEX scenarios), and a high resolution emission inventory over the Mediterranean. Many in situ datasets have been inserted in a relational database, in order to enable more accurate data selection and download of different datasets in a shared format. The database website offers different tools: - A registration procedure which enables any scientist to accept the data policy and apply for a user database account. - A data catalogue that complies with international metadata standards (ISO 19115-19139; INSPIRE European Directive; Global Change Master Directory Thesaurus). - Metadata forms to document observations or products that will be provided to the database. - A search tool to browse the catalogue using thematic, geographic and/or temporal criteria. - A shopping-cart web interface to order in situ data files. - A web interface to select and access homogenized datasets. Interoperability between the two data centres is being set up using the OPeNDAP protocol. The data portal will soon propose user-friendly access to satellite products managed by the ICARE data centre (SEVIRI, TRMM, PARASOL...). In order to meet the operational needs of the airborne and ground-based observational teams during the ChArMEx 2012 and 2013 campaigns, a day-to-day chart and report display website has been developed too: http://choc.sedoo.org. It offers a convenient way to browse weather conditions and chemical composition during the campaign periods.
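For remote access of the kind planned with OPeNDAP, a client can open a dataset by URL and subset it without downloading the full files. The sketch below assumes an xarray installation with an OPeNDAP-capable backend; the URL and variable name are hypothetical placeholders, not an actual ChArMEx/ICARE endpoint.

```python
# Minimal sketch of OPeNDAP-style remote access to a gridded dataset.
# The URL and variable name are hypothetical placeholders, not a real ChArMEx/ICARE
# endpoint; xarray with an OPeNDAP-capable backend (netCDF4 or pydap) is assumed.
import xarray as xr

url = "https://example.org/opendap/charmex/aerosol_optical_depth.nc"  # placeholder
ds = xr.open_dataset(url)                          # data are read lazily over the network
subset = ds["aod_550nm"].sel(time="2013-07-15")    # one day, without downloading the full file
print(subset.mean().values)
```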
Rice proteome database: a step toward functional analysis of the rice genome.
Komatsu, Setsuko
2005-09-01
The technique of proteome analysis using two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) has the power to monitor global changes that occur in the protein complement of tissues and subcellular compartments. In this study, the proteins of rice were cataloged, a rice proteome database was constructed, and a functional characterization of some of the identified proteins was undertaken. Proteins extracted from various tissues and subcellular compartments in rice were separated by 2D-PAGE and an image analyzer was used to construct a display of the proteins. The Rice Proteome Database contains 23 reference maps based on 2D-PAGE of proteins from various rice tissues and subcellular compartments. These reference maps comprise 13129 identified proteins, and the amino acid sequences of 5092 proteins are entered in the database. Major proteins involved in growth or stress responses were identified using the proteome approach. Some of these proteins, including a beta-tubulin, calreticulin, and ribulose-1,5-bisphosphate carboxylase/oxygenase activase in rice, have unexpected functions. The information obtained from the Rice Proteome Database will aid in cloning the genes for and predicting the function of unknown proteins.
Goldsmith, M-R; Grulke, C M; Brooks, R D; Transue, T R; Tan, Y M; Frame, A; Egeghy, P P; Edwards, R; Chang, D T; Tornero-Velez, R; Isaacs, K; Wang, A; Johnson, J; Holm, K; Reich, M; Mitchell, J; Vallero, D A; Phillips, L; Phillips, M; Wambaugh, J F; Judson, R S; Buckley, T J; Dary, C C
2014-03-01
Consumer products are a primary source of chemical exposures, yet little structured information is available on the chemical ingredients of these products and the concentrations at which ingredients are present. To address this data gap, we created a database of chemicals in consumer products using product Material Safety Data Sheets (MSDSs) publicly provided by a large retailer. The resulting database represents 1797 unique chemicals mapped to 8921 consumer products and a hierarchy of 353 consumer product "use categories" within a total of 15 top-level categories. We examine the utility of this database and discuss ways in which it will support (i) exposure screening and prioritization, (ii) generic or framework formulations for several indoor/consumer product exposure modeling initiatives, (iii) candidate chemical selection for monitoring near-field exposure from proximal sources, and (iv) identification of activity tracers or ubiquitous exposure sources using "chemical space" map analyses. Chemicals that are present at high concentrations across multiple consumer products and use categories, and therefore hold high exposure potential, are identified. Our database is publicly available to serve regulators, retailers, manufacturers, and the public for predictive screening of chemicals in new and existing consumer products on the basis of exposure and risk. Published by Elsevier Ltd.
Accessing the public MIMIC-II intensive care relational database for clinical research.
Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G
2013-01-10
The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
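For users running a local copy of the database (for example inside the downloadable VM), queries can also be issued directly from Python rather than through QueryBuilder. The sketch below is a minimal example using psycopg2; the connection settings and the table and column names are illustrative assumptions, so the actual MIMIC-II schema documentation should be consulted.

```python
# Minimal sketch of querying a local MIMIC-II PostgreSQL instance (e.g., inside the
# downloadable VM) with psycopg2. Connection settings and the table/column names
# shown are illustrative; consult the MIMIC-II schema documentation for real ones.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="mimic2", user="mimic", password="mimic")
with conn, conn.cursor() as cur:
    # Example aggregate: number of ICU stays per year of admission (illustrative columns)
    cur.execute(
        """
        SELECT EXTRACT(YEAR FROM intime) AS admit_year, COUNT(*) AS n_stays
        FROM icustay_detail
        GROUP BY admit_year
        ORDER BY admit_year;
        """
    )
    for admit_year, n_stays in cur.fetchall():
        print(int(admit_year), n_stays)
conn.close()
```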
NASA Astrophysics Data System (ADS)
Brown, K. M.; Fritzinger, C.; Wharton, E.
2004-12-01
The Grand Canyon Monitoring and Research Center measures the effects of Glen Canyon Dam operations on the resources along the Colorado River from Glen Canyon Dam to Lake Mead in support of the Grand Canyon Adaptive Management Program. Control points are integral for geo-referencing the myriad of data collected in the Grand Canyon including aerial photography, topographic and bathymetric data used for classification and change-detection analysis of physical, biologic and cultural resources. The survey department has compiled a list of 870 control points installed by various organizations needing to establish a consistent reference for data collected at field sites along the 240-mile stretch of Colorado River in the Grand Canyon. This list is the foundation for the Control Point Database established primarily for researchers, to locate control points and independently geo-reference collected field data. The database has the potential to be a valuable mapping tool for assisting researchers to easily locate a control point and reduce the occurrence of unknowingly installing new control points within close proximity of an existing control point. The database is missing photographs and accurate site description information. Current site descriptions do not accurately define the location of the point but refer to the project that used the point, or some other interesting fact associated with the point. The Grand Canyon Monitoring and Research Center (GCMRC) resolved this problem by turning the data collection effort into an educational exercise for the participants of the Grand Canyon Youth organization. Grand Canyon Youth is a non-profit organization providing experiential education for middle and high school aged youth. GCMRC and the Grand Canyon Youth formed a partnership where GCMRC provided the logistical support, equipment, and training to conduct the field work, and the Grand Canyon Youth provided the time and personnel to complete the field work. Two data collection efforts were conducted during the 2004 summer allowing 40 youth the opportunity to contribute valuable information to the Control Point Database. This information included: verification of point existence, photographs, and accurate site descriptions concisely describing the location of the point, how to reach the point, the specific point location and detailed bearings to visible and obvious landmarks. The youth learned to locate themselves and find the points using 1:1000 airphotos, write detailed site descriptions, take bearings with a compass, measure vertical and horizontal distances, and use a digital camera. The youth found information for 252 control points (29% of the total points).
Coulter, D M
2001-12-01
The purpose of this paper is to describe how the New Zealand (NZ) Intensive Medicines Monitoring Programme (IMMP) functions in relation to NZ privacy laws and to describe the attitudes of patients to drug safety monitoring and the privacy of their personal and health information. The IMMP undertakes prospective observational event monitoring cohort studies on new drugs. The cohorts are established from prescription data and the events are obtained using prescription event monitoring and spontaneous reporting. Personal details, prescribing history of the monitored drugs and adverse events data are stored in databases long term. The NZ Health Information Privacy Code is outlined and the monitoring of sumatriptan is used to illustrate how the IMMP functions in relation to the Code. Patient responses to the programme are described. Sumatriptan was monitored in 14,964 patients and 107,646 prescriptions were recorded. There were 2344 reports received describing 3987 adverse events. A majority of the patients were involved in the recording of events data either personally or by telephone interview. There were no objections to the monitoring process on privacy grounds. Given the fact that all reasonable precautions are taken to ensure privacy, patients perceive drug safety to have greater priority than any slight risk of breach of confidentiality concerning their personal details and health information.
Internationally coordinated glacier monitoring - a timeline since 1894
NASA Astrophysics Data System (ADS)
Nussbaumer, Samuel U.; Armstrong, Richard; Fetterer, Florence; Gärtner-Roer, Isabelle; Hoelzle, Martin; Machguth, Horst; Mölg, Nico; Paul, Frank; Raup, Bruce H.; Zemp, Michael
2016-04-01
Changes in glaciers and ice caps provide some of the clearest evidence of climate change, with impacts on sea-level variations, regional hydrological cycles, and natural hazard situations. Therefore, glaciers have been recognized as an Essential Climate Variable (ECV). Internationally coordinated collection and distribution of standardized information about the state and change of glaciers and ice caps was initiated in 1894 and is today organized within the Global Terrestrial Network for Glaciers (GTN-G). GTN-G ensures the continuous development and adaptation of the international strategies to the long-term needs of users in science and policy. A GTN-G Steering Committee coordinates, supports and advises the operational bodies responsible for the international glacier monitoring, which are the World Glacier Monitoring Service (WGMS), the US National Snow and Ice Data Center (NSIDC), and the Global Land Ice Measurements from Space (GLIMS) initiative. In this presentation, we trace the development of the internationally coordinated glacier monitoring since its beginning in the 19th century. Today, several online databases containing a wealth of diverse data types with different levels of detail and global coverage provide fast access to continuously updated information on glacier fluctuation and inventory data. All glacier datasets are made freely available through the respective operational bodies within GTN-G, and can be accessed through the GTN-G Global Glacier Browser (http://www.gtn-g.org/data_browser.html). Glacier inventory data (e.g., digital outlines) are available for about 180,000 glaciers (GLIMS database, RGI - Randolph Glacier Inventory, WGI - World Glacier Inventory). Glacier front variations with about 45,000 entries since the 17th century and about 6,200 glaciological and geodetic mass (volume) change observations dating back to the 19th century are available in the Fluctuations of Glaciers (FoG) database. These datasets reveal clear evidence that glacier retreat and mass loss is a global phenomenon. Glaciological and geodetic observations show that the rates of the 21st-century mass loss are unprecedented on a global scale, for the time period observed, and probably also for recorded history, as indicated in glacier reconstructions from written and illustrated documents. The databases are supplemented by specific index datasets (e.g., glacier thickness data) and a dataset containing information on special events including glacier surges, glacier lake outbursts, ice avalanches, eruptions of ice-clad volcanoes, etc. related to about 200 glaciers. A special database of glacier photographs (GPC - Glacier Photograph Collection) contains more than 15,000 pictures from around 500 glaciers, some of them dating back to the mid-19th century. Current efforts aim to close remaining observational gaps regarding data both from in-situ measurements and remote sensing, to establish a well-distributed baseline for sound estimates of climate-related glacier changes and their impacts. Within the framework of dedicated capacity building and twinning activities, disrupted long-term mass balance programmes in Central Asia have recently been resumed, and the continuation of mass balance measurements in the Tropical Andes has been supported. New data also emerge from several research projects using NASA and ESA sensors and are actively integrated into the GTN-G databases.
Key tasks for the future include the quantitative assessment of uncertainties of available measurements, and their representativeness for changes in the respective mountain ranges. For this, a well-considered integration of in-situ measurements, remotely sensed observations, and numerical modelling is required.
Study on an agricultural environment monitoring server system using Wireless Sensor Networks.
Hwang, Jeonghwan; Shin, Changsun; Yoe, Hyun
2010-01-01
This paper proposes an agricultural environment monitoring server system for monitoring information about an outdoor agricultural production environment using Wireless Sensor Network (WSN) technology. The proposed system collects outdoor environmental and soil information through WSN-based sensors, image information through CCTVs, and location information using GPS modules. The collected information is stored in a database by the agricultural environment monitoring server, which consists of a sensor manager that handles information collected from the WSN sensors, an image information manager that handles image information from the CCTVs, and a GPS manager that processes location information; the server then provides this information to producers. In addition, a solar cell-based power supply is implemented so that the server system can be used in agricultural environments with insufficient power infrastructure. The system can monitor outdoor environmental information remotely, and its use can be expected to contribute to increasing crop yields and improving quality in the agricultural field by supporting the decision making of crop producers through analysis of the collected information.
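As a rough illustration of the data path just described, the Python sketch below shows sensor, image and GPS "managers" writing their readings into one shared table. The table layout, field names and values are assumptions made for this sketch, not details taken from the cited system.

import sqlite3
import time

# Shared store for all managers; an in-memory database keeps the sketch self-contained
# (the real system would use a persistent server-side database).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE observations (ts REAL, source TEXT, field_id TEXT, payload TEXT)")

def store(source, field_id, payload):
    # Common storage routine used by the sensor, image and GPS managers.
    db.execute("INSERT INTO observations VALUES (?, ?, ?, ?)",
               (time.time(), source, field_id, payload))
    db.commit()

# Each manager tags its records with its own source identifier (values are made up).
store("wsn", "field-01", "air_temp=21.4C soil_moisture=33%")
store("cctv", "field-01", "image=/archive/field-01/img_001.jpg")
store("gps", "field-01", "lat=34.97 lon=127.49")

print(db.execute("SELECT COUNT(*) FROM observations").fetchone()[0], "records stored")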
NASA Astrophysics Data System (ADS)
Kulchitsky, A.; Maurits, S.; Watkins, B.
2006-12-01
With the widespread availability of the Internet today, many people can monitor various scientific research activities. It is important to accommodate this interest by providing on-line access to dynamic and illustrative Web resources, which could demonstrate different aspects of ongoing research. It is especially important to explain these research activities to high school and undergraduate students, thereby providing more information for making decisions concerning their future studies. Such Web resources are also important to clarify scientific research for the general public, in order to achieve better awareness of research progress in various fields. Particularly rewarding is dissemination of information about ongoing projects within universities and research centers to their local communities. The benefits of this type of scientific outreach are mutual, since development of Web-based automatic systems is a prerequisite for many research projects targeting real-time monitoring and/or modeling of natural conditions. Continuous operation of such systems provides ongoing research opportunities for the statistically massive validation of the models, as well. We have developed a Web-based system to run the University of Alaska Fairbanks Polar Ionospheric Model in real-time. This model makes use of networking and computational resources at the Arctic Region Supercomputing Center. This system was designed to be portable among various operating systems and computational resources. Its components can be installed across different computers, separating Web servers and computational engines. The core of the system is a Real-Time Management module (RMM) written in Python, which facilitates interactions of remote input data transfers, the ionospheric model runs, MySQL database filling, and PHP scripts for the Web-page preparations. The RMM downloads current geophysical inputs as soon as they become available at different on-line repositories. This information is processed to provide inputs for the next ionospheric model time step and then stored in a MySQL database as the first part of the time-specific record. The RMM then performs synchronization of the input times with the current model time, prepares a decision on initialization for the next model time step, and monitors its execution. Then, as soon as the model completes computations for the next time step, the RMM visualizes the current model output into various short-term (about 1-2 hours) forecasting products and compares prior results with available ionospheric measurements. The RMM places prepared images into the MySQL database, which can be located on a different computer node, and then proceeds to the next time interval, continuing the time-loop. The upper-level interface of this real-time system is a PHP-based Web site (http://www.arsc.edu/SpaceWeather/new). This site provides general information about the Earth's polar and adjacent mid-latitude ionosphere, allows for monitoring of the current developments and short-term forecasts, and facilitates access to the comparisons archive stored in the database.
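A minimal sketch of a real-time management loop of the kind the RMM is described as implementing: fetch the latest inputs, advance the model one step, store and visualize the result, then wait for the next interval. All function names and values below are hypothetical placeholders, not code from the ARSC system.

import time

def fetch_inputs(step):
    # Placeholder for downloading current geophysical inputs from on-line repositories.
    return {"step": step, "kp_index": 3.0}

def run_model_step(state, inputs):
    # Placeholder for one time step of the ionospheric model.
    return {"step": inputs["step"], "tec_map": "..."}

def store_and_visualize(result):
    # Placeholder for filling the database and producing short-term forecast images.
    print("stored and plotted step", result["step"])

state = {}
for step in range(3):          # a production loop would run indefinitely
    inputs = fetch_inputs(step)
    state = run_model_step(state, inputs)
    store_and_visualize(state)
    time.sleep(0.1)            # stand-in for waiting until the next input interval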
Monitoring of waste disposal in deep geological formations
NASA Astrophysics Data System (ADS)
German, V.; Mansurov, V.
2003-04-01
In this paper, a kinetic approach to describing the rock failure process and to microseismic monitoring of waste disposal is advanced. On the basis of a two-stage model of the failure process, the capability of forecasting rock fracture is demonstrated. The requirements for the monitoring system, such as real-time data registration and processing and its precision range, are formulated. A method for delineating failure nuclei in a rock mass is presented. This method is implemented in a software program for forecasting strong seismic events and is based on direct use of the fracture concentration criterion. The method is applied to the database of microseismic events of the North Ural Bauxite Mine. The results of this application, including its efficiency, stability, and the possibility of forecasting rockbursts, are discussed.
Monitoring groundwater and river interaction along the Hanford reach of the Columbia River
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, M.D.
1994-04-01
As an adjunct to efficient Hanford Site characterization and remediation of groundwater contamination, an automatic monitor network has been used to measure Columbia River and adjacent groundwater levels in several areas of the Hanford Site since 1991. Water levels, temperatures, and electrical conductivity measured by the automatic monitor network provided an initial database with which to calibrate models and from which to infer ground and river water interactions for site characterization and remediation activities. Measurements of the dynamic river/aquifer system have been made simultaneously at 1-hr intervals, with a quality suitable for hydrologic modeling and for computer model calibration and testing. This report describes the equipment, procedures, and results from measurements done in 1993.
NASA Technical Reports Server (NTRS)
Hielkema, J. U.; Howard, J. A.; Tucker, C. J.; Van Ingen Schenau, H. A.
1987-01-01
The African real time environmental monitoring using imaging satellites (Artemis) system, which should monitor precipitation and vegetation conditions on a continental scale, is presented. The hardware and software characteristics of the system are illustrated and the Artemis databases are outlined. Plans for the system include the use of hourly digital Meteosat data and daily NOAA/AVHRR data to study environmental conditions. Planned mapping activities include monthly rainfall anomaly maps, normalized difference vegetation index maps for ten day and monthly periods with a spatial resolution of 7.6 km, ten day crop/rangeland moisture availability maps, and desert locust potential breeding activity factor maps for a plague prevention program.
Development of the USGS national land-cover database over two decades
Xian, George Z.; Homer, Collin G.; Yang, Limin; Weng, Qihao
2011-01-01
Land-cover composition and change have profound impacts on terrestrial ecosystems. Land-cover and land-use (LCLU) conditions and their changes can affect social and physical environments by altering ecosystem conditions and services. Information about LCLU change is often used to produce landscape-based metrics and evaluate landscape conditions to monitor LCLU status and trends over a specific time interval (Loveland et al. 2002; Coppin et al. 2004; Lunetta et al. 2006). Continuous, accurate, and up-to-date land-cover data are important for natural resource and ecosystem management and are needed to support consistent monitoring of landscape attributes over time. Large-area land-cover information at regional, national, and global scales is critical for monitoring landscape variations over large areas.
Follstad, Shah J.J.; Dahm, Clifford N.; Gloss, S.P.; Bernhardt, E.S.
2007-01-01
Restoration activity has exponentially increased across the Southwest since 1990. Over 37,000 records were compiled into the National River Restoration Science Synthesis (NRRSS) database to summarize restoration trends and assess project effectiveness. We analyzed data from 576 restoration projects in the Southwest (NRRSS-SW). More than 50% of projects were less than or equal to 3 km in length. The most common restoration project intent categories were riparian management, water quality management, in-stream habitat improvement, and flow modification. Common project activities were well matched to goals. Conservative estimates of total restoration costs exceeded $500 million. Most restoration dollars have been allocated to flow modification and water quality management. Monitoring was linked to 28% of projects across the Southwest, as opposed to just 10% nationwide. Mean costs were statistically similar whether or not projects were monitored. Results from 48 telephone interviews provided validation of NRRSS-SW database analyses but showed that project costs are often underreported within existing datasets. The majority of interviewees considered their projects to be successful, most often based upon observed improvements to biota or positive public reaction rather than evaluation of field data. The efficacy of restoration is difficult to ascertain given the dearth of information contained within most datasets. There is a great need for regional entities that not only track information on project implementation but also maintain and analyze monitoring data associated with restoration. Agencies that fund or regulate restoration should reward projects that emphasize monitoring and evaluation as much as project implementation. © 2007 Society for Ecological Restoration International.
Quercia, Kelly; Tran, Phuong Lien; Jinoro, Jéromine; Herniainasolo, Joséa Lea; Viviano, Manuela; Vassilakos, Pierre; Benski, Caroline; Petignat, Patrick
2018-04-01
Barriers to efficient cervical cancer screening in low- and medium-income countries include the lack of systematic monitoring of the participants' data. The aim of this study was to assess the feasibility of a mobile health (m-Health) data collection system to facilitate monitoring of women participating in a cervical cancer screening campaign. Women aged 30-65 years, participating in a cervical cancer screening campaign in Ambanja, Madagascar, were invited to participate in the study. Cervical Cancer Prevention System, an m-Health application, allows the registration of clinical data while women are undergoing cervical cancer screening. All data registered in the smartphone were transmitted to a secure, Web-based platform through the use of an Internet connection. Healthcare providers had access to the central database and could use it for the follow-up visits. Quality of data was assessed by computing the percentage of key data missing. A total of 151 women were recruited into the study. Mean age of participants was 41.8 years. The percentage of missing data for the key variables was less than 0.02%, corresponding to one woman's medical history data, which was not sent to the central database. Technical problems, including transmission of photos, human papillomavirus test results, and pelvic examination data, were subsequently solved through a system update. The quality of the data was satisfactory and allowed monitoring of cervical cancer screening data of participants. Larger studies evaluating the efficacy of the system for the women's follow-up are needed in order to confirm its efficiency on a long-term scale.
A Community Network of 100 Black Carbon Sensors
NASA Astrophysics Data System (ADS)
Preble, C.; Kirchstetter, T.; Caubel, J.; Cados, T.; Keeling, C.; Chang, S.
2017-12-01
We developed a low-cost black carbon sensor, field-tested its performance, and then built and deployed a network of 100 sensors in West Oakland, California. We operated the network for 100 days beginning mid-May 2017 to measure spatially resolved black carbon concentrations throughout the community. West Oakland is a San Francisco Bay Area mixed residential and industrial community that is adjacent to regional port and rail yard facilities and surrounded by major freeways. As such, the community is affected by diesel particulate matter emissions from heavy-duty diesel trucks, locomotives, and ships associated with freight movement. In partnership with Environmental Defense Fund, the Bay Area Air Quality Management District, and the West Oakland Environmental Indicators Project, we deployed the black carbon monitoring network outside of residences and businesses, along truck routes and arterial streets, and at upwind locations. The sensor employs the filter-based light transmission method to measure black carbon and has good precision and correspondence with current commercial black carbon instruments. Throughout the 100-day period, each of the 100 sensors transmitted data via a cellular network. A MySQL database was built to receive and manage the data in real time. The database included diagnostic features to monitor each sensor's operational status and facilitate the maintenance of the network. Spatial and temporal patterns in black carbon concentrations will be presented, including patterns around industrial facilities, freeways, and truck routes, as well as the relationship between neighborhood concentrations and the BAAQMD's monitoring site. Lessons learned from this first-of-its-kind black carbon monitoring network will also be shared.
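One of the diagnostic features described above is monitoring each sensor's operational status. The Python sketch below shows one simple way such a check could work, flagging sensors whose last report is older than a threshold; the data layout and threshold are assumptions for illustration only.

import time

# Hypothetical map from sensor ID to the unix time of its most recent record.
last_report = {
    "BC-001": time.time() - 120,      # reported two minutes ago
    "BC-002": time.time() - 7200,     # silent for two hours
}

STALE_AFTER_S = 3600                  # assumed threshold: offline after one hour of silence

def offline_sensors(reports, now=None):
    now = now if now is not None else time.time()
    return [sid for sid, ts in reports.items() if now - ts > STALE_AFTER_S]

print("sensors needing maintenance:", offline_sensors(last_report))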
Sustainable Seas Student Intertidal Monitoring Project at Duxbury Reef in Bolinas, CA (Invited)
NASA Astrophysics Data System (ADS)
Soave, K.; Dean, A.; Darakananda, K.; Ball, O.; Butti, C.; Yang, G.; Vetter, M.; Grimaldi, Z.
2009-12-01
The Sustainable Seas Student Monitoring Project at the Branson School in Ross, CA has monitored Duxbury Reef in Bolinas, CA since 1999, in cooperation with the Farallones Marine Sanctuary Association and the Gulf of Farallones National Marine Sanctuary. Goals of the project include: 1) To monitor the rocky intertidal habitat and develop a baseline database of invertebrates and algal density and abundance; 2) To contribute to the conservation of the rocky intertidal habitat through education of students and visitors about intertidal species and the requirements for maintaining a healthy, diverse intertidal ecosystem; 3) To increase stewardship in the Gulf of the Farallones National Marine Sanctuary; and 4) To contribute abundance and population data on key algae and invertebrate species to the national database, LiMPETS (Long Term Monitoring Program & Experiential Training for Students). Student volunteers complete an intensive training course on the natural history of intertidal invertebrates and algae, identification of key species, rocky intertidal monitoring techniques, and history of the sanctuary. Students identify and count key invertebrate and algae species along two permanent transects (A and B) and using randomly determined points within a permanent 100 m2 area, three times per year (fall, winter, and late spring). Using the data collected since 2004, we will analyze the population densities, seasonal abundance and long-term population trends of key algal and invertebrate species. Future analyses and investigations will include intertidal abiotic factors (including water temperature and human foot-traffic) to enhance insights into the workings of the Duxbury Reef ecosystem, in particular, the high intertidal zone which experiences the greatest amount of human impacts.
NASA Astrophysics Data System (ADS)
You, Y.; Wang, S.; Yang, Q.; Shen, M.; Chen, G.
2017-12-01
The alpine river water environment on the Plateau (such as the Tibetan Plateau, China) is a key indicator of water security and environmental security in China. Due to the complex terrain and varied surface eco-environments, it is very difficult to monitor the water environment over the complex land surface of the plateau. The increasing availability of remote sensing techniques with appropriate spatiotemporal resolutions, broad coverage and low costs allows effective monitoring of the river water environment on the Plateau, particularly in remote and inaccessible areas where in situ observations are lacking. In this study, we propose a remote sensing-based monitoring model that uses multi-platform remote sensing data to monitor the alpine river environment. Parameterization methodologies based on satellite remote sensing data and field observations are proposed for monitoring water environmental parameters (including chlorophyll-a concentration (Chl-a), water turbidity (WT) or water clarity (SD), total nitrogen (TN), total phosphorus (TP), and total organic carbon (TOC)) over China's southwest highland rivers, such as the Brahmaputra. First, because most sensors do not collect multiple observations of a target in a single pass, data from multiple orbits or acquisition times may be used, and varying atmospheric and irradiance effects must be reconciled; we therefore developed multi-sensor data correction and atmospheric correction techniques for the various types of satellite data. Second, we built an inversion spectral database derived from long-term remote sensing data and field sampling data. We then developed a high-precision inversion model for the southwest highland rivers, backed by the inversion spectral database, using multi-sensor remote sensing information optimization and collaboration techniques. Third, taking the middle reaches of the Brahmaputra River as the study area, we validated the key water environmental parameters and further improved the inversion model. The results indicate that the proposed water environment inversion model can retrieve alpine water environmental parameters well and can improve monitoring and early-warning capability for the alpine river water environment in the future.
Wright, Stephen; Boyd, Mark A; Yunihastuti, Evy; Law, Matthew; Sirisanthana, Thira; Hoy, Jennifer; Pujari, Sanjay; Lee, Man Po; Petoumenos, Kathy
2013-01-01
In the Asia-Pacific region many countries have adopted the WHO's public health approach to HIV care and treatment. We performed exploratory analyses of the factors associated with first major modification to first-line combination antiretroviral therapy (ART) in resource-rich and resource-limited countries in the region. We selected treatment naive HIV-positive adults from the Australian HIV Observational Database (AHOD) and the TREAT Asia HIV Observational Database (TAHOD). We dichotomised each country's per capita income into high/upper-middle (T-H) and lower-middle/low (T-L). Survival methods stratified by income were used to explore time to first major modification of first-line ART and associated factors. We defined a treatment modification as either initiation of a new class of antiretroviral (ARV) or a substitution of two or more ARV agents from within the same ARV class. A total of 4250 patients had 961 major modifications to first-line ART in the first five years of therapy. The cumulative incidence (95% CI) of treatment modification was 0.48 (0.44-0.52), 0.33 (0.30-0.36) and 0.21 (0.18-0.23) for AHOD, T-H and T-L respectively. We found no strong associations between typical patient characteristic factors and rates of treatment modification. In AHOD, relative to sites that monitor twice-yearly (both CD4 and HIV RNA-VL), quarterly monitoring corresponded with a doubling of the rate of treatment modifications. In T-H, relative to sites that monitor once-yearly (both CD4 and HIV RNA-VL), monitoring twice-yearly corresponded to a 1.8 factor increase in treatment modifications. In T-L, no sites on average monitored both CD4 & HIV RNA-VL concurrently once-yearly. We found no differences in rates of modifications for once- or twice-yearly CD4 count monitoring. Low-income countries tended to have lower rates of major modifications made to first-line ART compared to higher-income countries. In higher-income countries, an increased rate of RNA-VL monitoring was associated with increased modifications to first-line ART.
NASA Astrophysics Data System (ADS)
Bartolini, S.; Becerril, L.; Martí, J.
2014-11-01
One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The key requirement is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers in regard to data organization are shown here through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.
Creating a High-Frequency Electronic Database in the PICU: The Perpetual Patient.
Brossier, David; El Taani, Redha; Sauthier, Michael; Roumeliotis, Nadia; Emeriaud, Guillaume; Jouvet, Philippe
2018-04-01
Our objective was to construct a prospective high-quality and high-frequency database combining patient therapeutics and clinical variables in real time, automatically fed by the information system and network architecture available through fully electronic charting in our PICU. The purpose of this article is to describe the data acquisition process from bedside to the research electronic database. Descriptive report and analysis of a prospective database. A 24-bed PICU, medical ICU, surgical ICU, and cardiac ICU in a tertiary care free-standing maternal child health center in Canada. All patients less than 18 years old were included at admission to the PICU. None. Between May 21, 2015, and December 31, 2016, 1,386 consecutive PICU stays from 1,194 patients were recorded in the database. Data were prospectively collected from admission to discharge, every 5 seconds from monitors and every 30 seconds from mechanical ventilators and infusion pumps. These data were linked to the patient's electronic medical record. The database total volume was 241 GB. The patients' median age was 2.0 years (interquartile range, 0.0-9.0). Data were available for all mechanically ventilated patients (n = 511; recorded duration, 77,678 hr), and respiratory failure was the most frequent reason for admission (n = 360). The complete pharmacologic profile was synched to database for all PICU stays. Following this implementation, a validation phase is in process and several research projects are ongoing using this high-fidelity database. Using the existing bedside information system and network architecture of our PICU, we implemented an ongoing high-fidelity prospectively collected electronic database, preventing the continuous loss of scientific information. This offers the opportunity to develop research on clinical decision support systems and computational models of cardiorespiratory physiology for example.
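The acquisition scheme described above mixes devices sampled at different rates into one patient-linked store. The sketch below illustrates that idea with a toy table keyed by PICU stay; the schema, device names and values are assumptions made for this illustration, not the actual database design.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE signals (stay_id INTEGER, device TEXT, variable TEXT, ts REAL, value REAL)")

def record(stay_id, device, variable, ts, value):
    db.execute("INSERT INTO signals VALUES (?, ?, ?, ?, ?)",
               (stay_id, device, variable, ts, value))

# Bedside monitor sampled every 5 seconds, ventilator every 30 seconds (toy values).
for t in range(0, 60, 5):
    record(1001, "monitor", "heart_rate", t, 118.0)
for t in range(0, 60, 30):
    record(1001, "ventilator", "peep", t, 6.0)

n, = db.execute("SELECT COUNT(*) FROM signals WHERE stay_id = 1001").fetchone()
print("rows stored for stay 1001:", n)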
Construction and validation of a population-based bone densitometry database.
Leslie, William D; Caetano, Patricia A; Macwilliam, Leonard R; Finlayson, Gregory S
2005-01-01
Utilization of dual-energy X-ray absorptiometry (DXA) for the initial diagnostic assessment of osteoporosis and in monitoring treatment has risen dramatically in recent years. Population-based studies of the impact of DXA and osteoporosis remain challenging because of incomplete and fragmented test data that exist in most regions. Our aim was to create and assess completeness of a database of all clinical DXA services and test results for the province of Manitoba, Canada and to present descriptive data resulting from testing. A regionally based bone density program for the province of Manitoba, Canada was established in 1997. Subsequent DXA services were prospectively captured in a program database. This database was retrospectively populated with earlier DXA results dating back to 1990 (the year that the first DXA scanner was installed) by integrating multiple data sources. A random chart audit was performed to assess completeness and accuracy of this dataset. For comparison, testing rates determined from the DXA database were compared with physician administrative claims data. There was a high level of completeness of this database (>99%) and accurate personal identifier information sufficient for linkage with other health care administrative data (>99%). This contrasted with physician billing data that were found to be markedly incomplete. Descriptive data provide a profile of individuals receiving DXA and their test results. In conclusion, the Manitoba bone density database has great potential as a resource for clinical and health policy research because it is population based with a high level of completeness and accuracy.
A Remote Patient Monitoring System for Congestive Heart Failure
Suh, Myung-kyung; Chen, Chien-An; Woodbridge, Jonathan; Tu, Michael Kai; Kim, Jung In; Nahapetian, Ani; Evangelista, Lorraine S.; Sarrafzadeh, Majid
2011-01-01
Congestive heart failure (CHF) is a leading cause of death in the United States affecting approximately 670,000 individuals. Due to the prevalence of CHF-related issues, it is prudent to seek out methodologies that would facilitate the prevention, monitoring, and treatment of heart disease on a daily basis. This paper describes WANDA (Weight and Activity with Blood Pressure Monitoring System), a study that leverages sensor technologies and wireless communications to monitor the health-related measurements of patients with CHF. The WANDA system is a three-tier architecture consisting of sensors, web servers, and back-end databases. The system was developed in conjunction with the UCLA School of Nursing and the UCLA Wireless Health Institute to enable early detection of key clinical symptoms indicative of CHF-related decompensation. This study shows that CHF patients monitored by WANDA are less likely to have readings fall outside a healthy range. In addition, WANDA provides a useful feedback system for regulating readings of CHF patients. PMID:21611788
Fault tree analysis for data-loss in long-term monitoring networks.
Dirksen, J; ten Veldhuis, J A E; Schilperoort, R P S
2009-01-01
Prevention of data-loss is an important aspect in the design as well as the operational phase of monitoring networks since data-loss can seriously limit intended information yield. In the literature limited attention has been paid to the origin of unreliable or doubtful data from monitoring networks. Better understanding of causes of data-loss points out effective solutions to increase data yield. This paper introduces FTA as a diagnostic tool to systematically deduce causes of data-loss in long-term monitoring networks in urban drainage systems. In order to illustrate the effectiveness of FTA, a fault tree is developed for a monitoring network and FTA is applied to analyze the data yield of a UV/VIS submersible spectrophotometer. Although some of the causes of data-loss cannot be recovered because the historical database of metadata has been updated infrequently, the example points out that FTA still is a powerful tool to analyze the causes of data-loss and provides useful information on effective data-loss prevention.
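To make the fault tree analysis idea concrete, the sketch below evaluates a tiny fault tree for data loss with OR and AND gates, assuming independent basic events. The gate structure and probabilities are invented for illustration and are not taken from the paper.

# Gate functions for a fault tree with independent basic events.
def p_or(*probs):
    # Probability that at least one input event occurs.
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*probs):
    # Probability that all input events occur.
    q = 1.0
    for p in probs:
        q *= p
    return q

# Illustrative basic-event probabilities (per monitoring period).
p_sensor_fouling = 0.05
p_power_failure = 0.01
p_logger_full = 0.02
p_comms_dropout = 0.03

# Data are lost if the sensor fails (fouling OR power failure), or if local storage
# fills up AND a communications dropout prevents remote backup.
p_data_loss = p_or(p_or(p_sensor_fouling, p_power_failure),
                   p_and(p_logger_full, p_comms_dropout))
print(f"estimated probability of data loss: {p_data_loss:.4f}")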
Laboratory Information Systems.
Henricks, Walter H
2015-06-01
Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Gao, Zhiqiang; Xu, Fuxiang; Song, Debin; Zheng, Xiangyu; Chen, Maosi
2017-09-01
This paper presents dynamic monitoring of the green tide (the large green alga Ulva prolifera) that occurred in the Yellow Sea from 2014 to 2016, using multi-source remote sensing data, including GF-1 WFV, HJ-1A/1B CCD, CBERS-04 WFI, Landsat-7 ETM+ and Landsat-8 OLI, and combining the VB-FAH index (Virtual-Baseline Floating macroAlgae Height) with manually assisted interpretation based on remote sensing and geographic information system technologies. The results show that unmanned aerial vehicle (UAV) and shipborne platforms can accurately monitor the distribution of Ulva prolifera over small areas and therefore provide validation data for the remote sensing monitoring of Ulva prolifera. The results of this research can provide effective information support for the prevention and control of Ulva prolifera.
Wearable Health Monitoring Systems
NASA Technical Reports Server (NTRS)
Bell, John
2015-01-01
The shrinking size and weight of electronic circuitry has given rise to a new generation of smart clothing that enables biological data to be measured and transmitted. As the variation in the number and type of deployable devices and sensors increases, technology must allow their seamless integration so they can be electrically powered, operated, and recharged over a digital pathway. Nyx Illuminated Clothing Company has developed a lightweight health monitoring system that integrates medical sensors, electrodes, electrical connections, circuits, and a power supply into a single wearable assembly. The system is comfortable, bendable in three dimensions, durable, waterproof, and washable. The innovation will allow astronaut health monitoring in a variety of real-time scenarios, with data stored in digital memory for later use in a medical database. Potential commercial uses are numerous, as the technology enables medical personnel to noninvasively monitor patient vital signs in a multitude of health care settings and applications.
Design and Development of a Web-Based Self-Monitoring System to Support Wellness Coaching.
Zarei, Reza; Kuo, Alex
2017-01-01
We analyzed, designed and deployed a web-based, self-monitoring system to support wellness coaching. A wellness coach can plan clients' exercise and diet through the system and is able to monitor the changes in body dimensions and body composition that the client reports. The system can also visualize the client's data in the form of graphs for both the client and the coach. Both parties can also communicate through the messaging feature embedded in the application. A reminder system is also incorporated into the system and sends reminder messages to the clients when their reporting is due. The web-based self-monitoring application uses Oracle 11g XE as the backend database and Application Express 4.2 as the user interface development tool. The system allows users to access, update and modify data through a web browser anytime, anywhere, and on any device.
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2014-01-01
We present a form of automaton, referred to as data automata, suited for monitoring sequences of data-carrying events, for example emitted by an executing software system. This form of automata allows states to be parameterized with data, forming named records, which are stored in an efficiently indexed data structure, a form of database. This very explicit approach differs from other automaton-based monitoring approaches. Data automata are also characterized by allowing transition conditions to refer to other parameterized states, and by allowing transitions sequences. The presented automaton concept is inspired by rule-based systems, especially the Rete algorithm, which is one of the well-established algorithms for executing rule-based systems. We present an optimized external DSL for data automata, as well as a comparable unoptimized internal DSL (API) in the Scala programming language, in order to compare the two solutions. An evaluation compares these two solutions to several other monitoring systems.
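The sketch below gestures at the core idea of monitoring with parameterized states: open "records" are kept in a structure indexed by their data parameters and are updated or closed by data-carrying events. This is a simplified Python illustration of the general concept, not the paper's data-automata DSL or its Scala API.

# Simplified monitor: each granted resource is a parameterized state stored in a
# dict (a small indexed "database"), and incoming events update or close it.
class ResourceMonitor:
    def __init__(self):
        self.granted = {}                  # resource id -> task currently holding it

    def event(self, name, task, resource):
        if name == "grant":
            if resource in self.granted:
                print(f"violation: {resource} granted while already held")
            self.granted[resource] = task
        elif name == "release":
            if self.granted.get(resource) != task:
                print(f"violation: {task} released {resource} it does not hold")
            else:
                del self.granted[resource]

m = ResourceMonitor()
trace = [("grant", "t1", "r1"), ("grant", "t2", "r1"), ("release", "t1", "r1")]
for e in trace:                            # events as emitted by an executing system
    m.event(*e)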
Development of noSQL data storage for the ATLAS PanDA Monitoring System
NASA Astrophysics Data System (ADS)
Potekhin, M.; ATLAS Collaboration
2012-06-01
For several years the PanDA Workload Management System has been the basis for distributed production and analysis for the ATLAS experiment at the LHC. Since the start of data taking PanDA usage has ramped up steadily, typically exceeding 500k completed jobs/day by June 2011. The associated monitoring data volume has been rising as well, to levels that present a new set of challenges in the areas of database scalability and monitoring system performance and efficiency. These challenges are being met with a R&D effort aimed at implementing a scalable and efficient monitoring data storage based on a noSQL solution (Cassandra). We present our motivations for using this technology, as well as data design and the techniques used for efficient indexing of the data. We also discuss the hardware requirements as they were determined by testing with actual data and realistic loads.
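A common pattern for the efficient indexing the abstract alludes to is partitioning monitoring records by a coarse time bucket so that time-range queries touch only a few partitions. The pure-Python sketch below illustrates that bucketing idea; the hourly bucket scheme and record fields are assumptions for illustration, not the PanDA/Cassandra data design.

from collections import defaultdict
from datetime import datetime, timezone

store = defaultdict(list)      # bucket (partition) key -> list of (timestamp, record)

def bucket_key(ts):
    # Hourly buckets, e.g. "2023112014"; a real schema might choose another granularity.
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y%m%d%H")

def insert(ts, record):
    store[bucket_key(ts)].append((ts, record))

def query_range(t0, t1):
    # Visit only the buckets that can overlap the requested interval.
    hits, t = [], t0 - (t0 % 3600)
    while t <= t1:
        hits.extend(r for ts, r in store.get(bucket_key(t), []) if t0 <= ts <= t1)
        t += 3600
    return hits

insert(1700000000, {"job": 1, "state": "finished"})
insert(1700000500, {"job": 2, "state": "failed"})
print(query_range(1700000000, 1700000600))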
A wireless blood pressure monitoring system for personal health management.
Li, Wun-Jin; Luo, Yuan-Long; Chang, Yao-Shun; Lin, Yuan-Hsiang
2010-01-01
In this paper, we developed a wireless blood pressure monitoring system that provides a useful tool for users to measure and manage their daily blood pressure values. This system includes an ARM-based blood pressure monitor with a ZigBee wireless transmission module and a PC-based management unit with a graphical user interface and database. The wireless blood pressure monitor can measure blood pressure and heart rate and then store and forward the measurement information to the management unit through the ZigBee wireless transmission. On the management unit, users can easily see their past blood pressure variation in a line chart. The accuracy of blood pressure measurement was verified with a commercial blood pressure simulator, which showed that the bias of systolic blood pressure is ≤ 1 mmHg and the bias of diastolic blood pressure is ≤ 1.4 mmHg.
Scientific Framework for Stormwater Monitoring by the Washington State Department of Transportation
Sheibley, R.W.; Kelly, V.J.; Wagner, R.J.
2009-01-01
The Washington State Department of Transportation municipal stormwater monitoring program, in operation for about 8 years, never has received an external, objective assessment. In addition, the Washington State Department of Transportation would like to identify the standard operating procedures and quality assurance protocols that must be adopted so that their monitoring program will meet the requirements of the new National Pollutant Discharge Elimination System municipal stormwater permit. As a result, in March 2009, the Washington State Department of Transportation asked the U.S. Geological Survey to assess their pre-2009 municipal stormwater monitoring program. This report presents guidelines developed for the Washington State Department of Transportation to meet new permit requirements and regional/national stormwater monitoring standards to ensure that adequate processes and procedures are identified to collect high-quality, scientifically defensible municipal stormwater monitoring data. These include: (1) development of coherent vision and cooperation among all elements of the program; (2) a comprehensive approach for site selection; (3) an effective quality assurance program for field, laboratory, and data management; and (4) an adequate database and data management system.
Hannan, M A; Arebey, Maher; Begum, R A; Basri, Hassan
2011-12-01
This paper deals with the integration of Radio Frequency Identification (RFID) and communication technologies for a solid waste bin and truck monitoring system. RFID, GPS, GPRS and GIS, along with camera technologies, have been integrated to develop the intelligent bin and truck monitoring system. A new kind of integrated theoretical framework, hardware architecture and interface algorithm has been introduced to link the technologies for the successful implementation of the proposed system. In this system, the bin and truck databases have been developed in such a way that information on bin and truck ID, date and time of waste collection, bin status, amount of waste, and bin and truck GPS coordinates is compiled and stored for monitoring and management activities. The results showed that real-time image processing, histogram analysis, waste estimation and other bin information were displayed in the GUI of the monitoring system. The real-time tests and experimental results showed that the performance of the developed system was stable and met the monitoring requirements with high practicability and validity. Copyright © 2011 Elsevier Ltd. All rights reserved.
Privacy-Preserving Electrocardiogram Monitoring for Intelligent Arrhythmia Detection.
Son, Junggab; Park, Juyoung; Oh, Heekuck; Bhuiyan, Md Zakirul Alam; Hur, Junbeom; Kang, Kyungtae
2017-06-12
Long-term electrocardiogram (ECG) monitoring, as a representative application of cyber-physical systems, facilitates the early detection of arrhythmia. A considerable number of previous studies has explored monitoring techniques and the automated analysis of sensing data. However, ensuring patient privacy or confidentiality has not been a primary concern in ECG monitoring. First, we propose an intelligent heart monitoring system, which involves a patient-worn ECG sensor (e.g., a smartphone) and a remote monitoring station, as well as a decision support server that interconnects these components. The decision support server analyzes the heart activity, using the Pan-Tompkins algorithm to detect heartbeats and a decision tree to classify them. Our system protects sensing data and user privacy, which is an essential attribute of dependability, by adopting signal scrambling and anonymous identity schemes. We also employ a public key cryptosystem to enable secure communication between the entities. Simulations using data from the MIT-BIH arrhythmia database demonstrate that our system achieves a 95.74% success rate in heartbeat detection and almost a 96.63% accuracy in heartbeat classification, while successfully preserving privacy and securing communications among the involved entities.
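For orientation, the sketch below detects heartbeats with a crude amplitude threshold and refractory period on a synthetic signal. It is only a stand-in for the detection stage mentioned above; the actual Pan-Tompkins algorithm adds band-pass filtering, differentiation, squaring and adaptive thresholds, and the threshold values here are arbitrary.

# Crude R-peak detection: a sample is a beat if it exceeds a fixed threshold and
# enough time has passed since the previous detection. Signal values are synthetic.
def detect_r_peaks(signal, fs, threshold=0.6, refractory_s=0.25):
    peaks, last_peak = [], -int(refractory_s * fs)
    for i, v in enumerate(signal):
        if v >= threshold and (i - last_peak) >= refractory_s * fs:
            peaks.append(i)
            last_peak = i
    return peaks

fs = 360                                  # MIT-BIH recordings are sampled at 360 Hz
ecg = [0.0] * fs                          # one second of synthetic signal
for beat_index in (90, 270):              # two artificial R-peaks
    ecg[beat_index] = 1.0

peaks = detect_r_peaks(ecg, fs)
print("beats detected:", len(peaks), "-> approx.", len(peaks) * 60, "bpm")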
A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring
NASA Astrophysics Data System (ADS)
Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.
2017-04-01
This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities that include, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and, thereby, enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as bridge engineering models and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The information stored can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 Corridor in the state of Michigan.
Monitoring outcomes with relational databases: does it improve quality of care?
Clemmer, Terry P
2004-12-01
There are 3 key ingredients in improving the quality of medical care: 1) using a scientific process of improvement, 2) executing the process at the lowest possible level in the organization, and 3) measuring the results of any change reliably. Relational databases, when used within these guidelines, are of great value in these efforts if they contain reliable information that is pertinent to the project and used in a scientific process of quality improvement by a front line team. Unfortunately, the data are frequently unreliable and/or not pertinent to the local process, and are used by persons at very high levels in the organization without a scientific process and without reliable measurement of the outcome. Under these circumstances the effectiveness of relational databases in improving care is marginal at best, frequently wasteful, and potentially harmful. This article explores examples of these concepts.
Database assessment of CMIP5 and hydrological models to determine flood risk areas
NASA Astrophysics Data System (ADS)
Limlahapun, Ponthip; Fukui, Hiromichi
2016-11-01
Water-related disasters may not be solved with a single scientific method. Based on this premise, we combined logical concepts, the sequential linking of results among models, and database applications to analyse historical and future scenarios in the context of flooding. The three main models used in this study are (1) the fifth phase of the Coupled Model Intercomparison Project (CMIP5) to derive precipitation; (2) the Integrated Flood Analysis System (IFAS) to extract the amount of discharge; and (3) the Hydrologic Engineering Center (HEC) model to generate inundated areas. This research notably focused on integrating data regardless of system-design complexity; database approaches are flexible, manageable, and well suited to transferring data between systems, which makes them suitable for flood monitoring. The resulting flood map, together with real-time stream data, can help local communities identify areas at risk of flooding in advance.
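The database-mediated chaining described above can be pictured as each stage reading the previous stage's stored output and writing its own. The sketch below shows that flow with a toy table and made-up coefficients; the stage names, scenario label and numbers are placeholders, not outputs of CMIP5, IFAS or HEC.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (stage TEXT, scenario TEXT, value REAL)")

def write(stage, scenario, value):
    db.execute("INSERT INTO results VALUES (?, ?, ?)", (stage, scenario, value))

def read(stage, scenario):
    return db.execute("SELECT value FROM results WHERE stage=? AND scenario=?",
                      (stage, scenario)).fetchone()[0]

# Stage 1: climate-model precipitation (toy value).
write("precipitation", "scenario_2050", 210.0)
# Stage 2: hydrological model turns stored precipitation into discharge (toy relation).
write("discharge", "scenario_2050", 0.8 * read("precipitation", "scenario_2050"))
# Stage 3: hydraulic model turns stored discharge into an inundated area (toy relation).
write("inundation_km2", "scenario_2050", 0.05 * read("discharge", "scenario_2050"))

print("inundated area (km2):", read("inundation_km2", "scenario_2050"))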
Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C
2002-04-01
During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.
Promoting health equity: WHO health inequality monitoring at global and national levels.
Hosseinpoor, Ahmad Reza; Bergen, Nicole; Schlotheuber, Anne
2015-01-01
Health equity is a priority in the post-2015 sustainable development agenda and other major health initiatives. The World Health Organization (WHO) has a history of promoting actions to achieve equity in health, including efforts to encourage the practice of health inequality monitoring. Health inequality monitoring systems use disaggregated data to identify disadvantaged subgroups within populations and inform equity-oriented health policies, programs, and practices. This paper provides an overview of a number of recent and current WHO initiatives related to health inequality monitoring at the global and/or national level. We outline the scope, content, and intended uses/application of the following: Health Equity Monitor database and theme page; State of inequality: reproductive, maternal, newborn, and child health report; Handbook on health inequality monitoring: with a focus on low- and middle-income countries; Health inequality monitoring eLearning module; Monitoring health inequality: an essential step for achieving health equity advocacy booklet and accompanying video series; and capacity building workshops conducted in WHO Member States and Regions. The paper concludes by considering how the work of the WHO can be expanded upon to promote the establishment of sustainable and robust inequality monitoring systems across a variety of health topics among Member States and at the global level.