Sample records for automatic data collection systems

  1. Design of a real-time tax-data monitoring intelligent card system

    NASA Astrophysics Data System (ADS)

    Gu, Yajun; Bi, Guotang; Chen, Liwei; Wang, Zhiyuan

    2009-07-01

    To address the low efficiency of domestic oil stations' information management, a real-time tax-data monitoring system has been developed that automatically accesses the tax data of oil pumping machines, realizing real-time automatic data collection, display and storage. The monitoring system uses contactless intelligent cards or a network to collect the data directly, so the data cannot be altered manually; this closes loopholes and raises the level of automation in tax collection. The system performs real-time collection and management of oil station information, detects problems promptly, and automates the entire process from oil sales accounting to reporting. It also supports remote queries of an oil station's operating data. The system has broad application prospects and economic value.

  2. Roadway system assessment using Bluetooth-based automatic vehicle identification travel time data.

    DOT National Transportation Integrated Search

    2012-12-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  3. Assessment of Automatically Exported Clinical Data from a Hospital Information System for Clinical Research in Multiple Myeloma.

    PubMed

    Torres, Viviana; Cerda, Mauricio; Knaup, Petra; Löpprich, Martin

    2016-01-01

    An important part of the electronic information available in a Hospital Information System (HIS) has the potential to be automatically exported to Electronic Data Capture (EDC) platforms to improve clinical research. This automation has the advantage of reducing manual data transcription, a time-consuming and error-prone process. However, quantitative evaluations of exporting data from a HIS to an EDC system, in particular in comparison with manual transcription, have not been reported extensively. This work presents an assessment of the quality of an automatic export process, focused on laboratory data from a HIS. Quality of the laboratory data was assessed for two types of processes: (1) manual data transcription, and (2) automatic data transfer, implemented as an Extract, Transform and Load (ETL) process. A comparison was then carried out between the manual and automatic data collection methods. The criteria used to measure data quality were correctness and completeness. The manual process had a general error rate of 2.6% to 7.1%, with the lowest error rate obtained when data fields without a clear definition were removed from the analysis (p < 10^-3). For the automatic process, the general error rate was 1.9% to 12.1%, with the lowest error rate obtained when excluding information that was missing in the HIS but transcribed to the EDC from other physical sources. The automatic ETL process can be used to collect laboratory data for clinical research if the data in the HIS, as well as physical documentation not included in the HIS, are identified beforehand and follow a standardized data collection protocol.
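
    A rough sketch of the kind of Extract, Transform and Load (ETL) transfer and quality scoring the abstract describes. This is not the paper's implementation: the HIS and EDC are stand-in dictionaries, and all field names and the error-rate definition are illustrative assumptions.

    ```python
    # Illustrative ETL sketch: export laboratory rows from a HIS store to an
    # EDC record set, then score the error rate against a reference standard.
    # All field names and data structures are hypothetical.

    def extract(his_rows):
        """Extract: pull raw laboratory rows from the HIS (a list of dicts)."""
        return [r for r in his_rows if r.get("category") == "lab"]

    def transform(rows):
        """Transform: map HIS fields onto the EDC schema, normalise types."""
        return [{"patient_id": r["pid"],
                 "analyte": r["code"].upper(),
                 "value": float(r["value"])} for r in rows]

    def load(records, edc):
        """Load: append the transformed records to the EDC store."""
        for rec in records:
            edc.setdefault(rec["patient_id"], []).append(rec)

    def error_rate(collected, reference):
        """Correctness (mismatches) plus completeness (missing entries)."""
        wrong = sum(c != ref for c, ref in zip(collected, reference))
        missing = max(0, len(reference) - len(collected))
        return (wrong + missing) / len(reference)
    ```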

  4. Global Positioning System for Personal Travel Surveys: Lexington Area Travel Data Collection Test, Final Report

    DOT National Transportation Integrated Search

    1997-09-15

    This report describes the development and field test of an automated data collection device that includes Global Positioning System (GPS) technology for the collection of personal travel data. This project configured an automatic data collectio...

  5. Hand held data collection and monitoring system for nuclear facilities

    DOEpatents

    Brayton, D.D.; Scharold, P.G.; Thornton, M.W.; Marquez, D.L.

    1999-01-26

    Apparatus and method are disclosed for a data collection and monitoring system that utilizes a pen-based hand-held computer unit containing interaction software that allows the user to review maintenance procedures, collect data, compare data with historical trends and safety limits, and input new information at various collection sites. The system provides automatic transfer of the collected data to a main computer database for further review, reporting, and distribution, and uploading of updated collection and maintenance procedures. The hand-held computer maintains a running to-do list so that sample collection and other general tasks, such as housekeeping, are automatically scheduled for timely completion. A done list helps users keep track of all completed tasks. A built-in checklist assures that work meets the applicable processes and procedures. Users can hand-write comments or drawings with an electronic pen that lets them interact with information directly on the screen. 15 figs.

  6. Hand held data collection and monitoring system for nuclear facilities

    DOEpatents

    Brayton, Darryl D.; Scharold, Paul G.; Thornton, Michael W.; Marquez, Diana L.

    1999-01-01

    Apparatus and method are disclosed for a data collection and monitoring system that utilizes a pen-based hand-held computer unit containing interaction software that allows the user to review maintenance procedures, collect data, compare data with historical trends and safety limits, and input new information at various collection sites. The system provides automatic transfer of the collected data to a main computer database for further review, reporting, and distribution, and uploading of updated collection and maintenance procedures. The hand-held computer maintains a running to-do list so that sample collection and other general tasks, such as housekeeping, are automatically scheduled for timely completion. A done list helps users keep track of all completed tasks. A built-in checklist assures that work meets the applicable processes and procedures. Users can hand-write comments or drawings with an electronic pen that lets them interact with information directly on the screen.

  7. Fully automatic characterization and data collection from crystals of biological macromolecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  8. Automatic safety belt usage in 1981 Toyotas

    DOT National Transportation Integrated Search

    1982-02-01

    The objectives of the study were to evaluate the effectiveness of the automatic restraint systems provided in Toyota Cressidas in increasing seat belt use, and to evaluate the attitudes of owners toward those systems. Data were collected through telephone...

  9. Driving photomask supplier quality through automation

    NASA Astrophysics Data System (ADS)

    Russell, Drew; Espenscheid, Andrew

    2007-10-01

    In 2005, Freescale Semiconductor's newly centralized mask data prep organization (MSO) initiated a project to develop an automated global quality validation system for photomasks delivered to Freescale Semiconductor fabs. The system handles Certificate of Conformance (CofC) quality metric collection, validation, reporting and an alert system for all photomasks shipped to Freescale fabs from all qualified global suppliers. The completed system automatically collects 30+ quality metrics for each photomask shipped. Other quality metrics are generated from the collected data and quality metric conformance is automatically validated to specifications or control limits with failure alerts emailed to fab photomask and mask data prep engineering. A quality data warehouse stores the data for future analysis, which is performed quarterly. The improved access to data provided by the system has improved Freescale engineers' ability to spot trends and opportunities for improvement with our suppliers' processes. This paper will review each phase of the project, current system capabilities and quality system benefits for both our photomask suppliers and Freescale.
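
    A compact sketch of the validation-and-alert step described above, assuming invented metric names, control limits, e-mail addresses and a local SMTP relay; the actual Freescale system and its 30+ metrics are not reproduced here.

    ```python
    # Sketch of automatic CofC metric validation against control limits with
    # an e-mail alert on failure. Metric names, limits and addresses are
    # hypothetical placeholders.
    import smtplib
    from email.message import EmailMessage

    CONTROL_LIMITS = {"registration_error_nm": (0.0, 12.0),
                      "cd_uniformity_nm": (0.0, 8.0)}   # invented limits

    def validate(cofc: dict) -> list:
        """Return the metrics that are missing or outside their limits."""
        failures = []
        for metric, (lo, hi) in CONTROL_LIMITS.items():
            value = cofc.get(metric)
            if value is None or not (lo <= value <= hi):
                failures.append(metric)
        return failures

    def alert(mask_id: str, failures: list) -> None:
        """E-mail engineering when a shipped mask fails validation."""
        msg = EmailMessage()
        msg["Subject"] = f"CofC failure for photomask {mask_id}: {failures}"
        msg["From"] = "mdp-quality@example.com"    # placeholder addresses
        msg["To"] = "fab-engineering@example.com"
        with smtplib.SMTP("localhost") as s:       # assumes a local relay
            s.send_message(msg)
    ```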

  10. 75 FR 43487 - Proposed Information Collection; Comment Request; Vessel Monitoring System Requirements in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Monitoring System Requirements in the Western Pacific Pelagic Longline Fishery), OMB Control No. 0648-0519... requirement from OMB Control No. 0648-0584 (Permitting, Vessel Identification and Vessel Monitoring System... one collection (OMB Control No. 0648-0441). II. Method of Collection Automatic. III. Data OMB Control...

  11. An evaluation of the ERTS data collection system as a potential operational tool. [automatic hydrologic data collection and processing system for geological surveys]

    NASA Technical Reports Server (NTRS)

    Paulson, R. W.

    1974-01-01

    The Earth Resources Technology Satellite Data Collection System has been shown to be, from the user's vantage point, a reliable and simple system for collecting data from U.S. Geological Survey operational field instrumentation. It is technically feasible to expand the ERTS system into an operational polar-orbiting data collection system to gather data from the Geological Survey's Hydrologic Data Network. This could permit more efficient internal management of the Network, and could enable the Geological Survey to make data available to cooperating agencies in near-real time. The Geological Survey is conducting an analysis of the costs and benefits of satellite data-relay systems.

  12. A low-cost, computer-controlled robotic flower system for behavioral experiments.

    PubMed

    Kuusela, Erno; Lämsä, Juho

    2016-04-01

    Human observations during behavioral studies are expensive, time-consuming, and error-prone. For this reason, automatization of experiments is highly desirable, as it reduces both the workload and the risk of human error. The robotic system we developed is simple and cheap to build and handles feeding and data collection automatically. The system was built mostly from off-the-shelf components and has a novel feeding mechanism that uses servos to perform refill operations. We used the robotic system in two separate behavioral studies with bumblebees (Bombus terrestris): the system was used both for training the bees and for collecting the experimental data. The robotic system was reliable, with no flight in our studies failing due to a technical malfunction. The recorded data were easy to use in further analysis. The software and the hardware design are open source. The development of cheap open-source prototyping platforms in recent years has opened up many possibilities in the design of experiments. Automatization not only reduces workload, but also potentially allows experimental designs never attempted before, such as dynamic experiments in which the system responds to, for example, the animal's learning. We present a complete system with hardware and software, and it can be used as such in various experiments requiring feeders and the collection of visitation data. Use of the system is not limited to any particular experimental setup or even species.

  13. Automatic, semi-automatic and manual validation of urban drainage data.

    PubMed

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferred way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark erroneous values, remove them and replace them with interpolated data. In general, this first step of detecting erroneous, anomalous data is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation score generation and score interpretation. This paper presents the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, score interpretation, needs to be investigated further on the developed system.
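
    A minimal sketch of the three-stage flow the abstract outlines (preparation, score generation, score combination), assuming invented plausibility thresholds rather than the methods actually applied to the Belgrade sewer data.

    ```python
    # Toy validation pipeline: prepare the series, generate per-method
    # scores in [0, 1], then combine them per sample. Thresholds invented.
    import numpy as np

    def prepare(series):
        """Data preparation: coerce to a float array (non-numeric -> NaN)."""
        return np.asarray(series, dtype=float)

    def score_range(x, lo=0.0, hi=10.0):
        """1.0 where the value is physically plausible, 0.0 otherwise."""
        return ((x >= lo) & (x <= hi)).astype(float)

    def score_rate_of_change(x, max_step=1.0):
        """Penalise jumps larger than max_step between consecutive samples."""
        step_ok = np.abs(np.diff(x, prepend=x[0])) <= max_step
        return step_ok.astype(float)

    def combine(scores, weights=None):
        """Weighted mean of the individual validation scores per sample."""
        return np.average(np.vstack(scores), axis=0, weights=weights)

    # Example: flow depth [m] with a spike caught by the rate-of-change test.
    depth = prepare([0.40, 0.42, 5.00, 0.44])
    print(combine([score_range(depth), score_rate_of_change(depth)]))
    ```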

  14. Creating an iPhone Application for Collecting Continuous ABC Data

    ERIC Educational Resources Information Center

    Whiting, Seth W.; Dixon, Mark R.

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…

  15. CREATING AN IPHONE APPLICATION FOR COLLECTING CONTINUOUS ABC DATA

    PubMed Central

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs. PMID:23060682

  16. Creating an iPhone application for collecting continuous ABC data.

    PubMed

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.

  17. Fracture Systems - Digital Field Data Capture

    NASA Astrophysics Data System (ADS)

    Haslam, Richard

    2017-04-01

    Fracture systems play a key role in subsurface resources and developments, including groundwater and nuclear waste repositories. There is increasing recognition of the need to record and quantify fracture systems to better understand the associated risks and opportunities. With the advent of smartphones and digital field geology, numerous systems have been designed for field data collection. Digital field data collection allows rapid data collection and interpretation. However, many current systems have been designed to cover the full range of field mapping needs, making them large and complex, and many do not offer the tools necessary for collecting fracture-specific data. A new multiplatform app, built on an open-source platform, has been developed for recording field data on faults and joint/fracture systems, together with a relational database designed for storage and retrieval. Data are captured in a form-based approach that includes validity checks to ensure data are collected systematically. In addition to typical structural data collection, the International Society for Rock Mechanics' (ISRM) "Suggested Methods for the Quantitative Description of Discontinuities in Rock Masses" is included, allowing industry standards to be followed and opening the tools to industry as well as research. All data are uploaded automatically to a secure server, and users can view their own data and open-access data as required. Users can decide whether the data they produce should remain private or be open access. A series of automatic reports can be produced and/or the data downloaded. The database will hold a national archive, and data retrieval will be made through a web interface.

  18. Fully automatic characterization and data collection from crystals of biological macromolecules.

    PubMed

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W

    2015-08-01

    Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  19. Collection, processing and dissemination of data for the national solar demonstration program

    NASA Technical Reports Server (NTRS)

    Day, R. E.; Murphy, L. J.; Smok, J. T.

    1978-01-01

    A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.

  20. Resources monitoring and automatic management system for multi-VO distributed computing system

    NASA Astrophysics Data System (ADS)

    Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.

    2017-10-01

    Multi-VO support based on DIRAC has been set up to provide workload and data management for several high-energy experiments at IHEP. To monitor and manage in a uniform way the heterogeneous resources that belong to different Virtual Organizations, this paper presents a resource monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC. The system is composed of three parts: information collection, status decision and automatic control, and information display. Information collection includes active and passive ways of gathering status from different sources, which is stored in databases. Status decision and automatic control evaluates resource status and takes control actions on resources automatically through pre-defined policies and actions. The monitoring information is displayed on a web portal, from which both real-time and historical information can be obtained. All the implementations are based on the DIRAC framework. The information and control, including sites, policies and web portals for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
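
    A toy sketch of the policy-driven status decision in the spirit of the RSS loop described above; the policy, thresholds and resource fields are invented, and DIRAC's actual RSS API is not reproduced.

    ```python
    # Policy-based status decision: each policy inspects collected resource
    # information and returns a verdict; the worst verdict wins.
    from dataclasses import dataclass

    @dataclass
    class Resource:
        name: str
        failed_jobs: int
        total_jobs: int

    def policy_job_efficiency(res: Resource) -> str:
        """Ban a site whose recent failure rate exceeds 50% (illustrative)."""
        if res.total_jobs == 0:
            return "Unknown"
        return "Banned" if res.failed_jobs / res.total_jobs > 0.5 else "Active"

    def decide(res: Resource, policies) -> str:
        """Combine policy verdicts: any 'Banned' result bans the resource."""
        verdicts = [p(res) for p in policies]
        return "Banned" if "Banned" in verdicts else "Active"

    site = Resource("GRID.Example.org", failed_jobs=60, total_jobs=100)
    print(decide(site, [policy_job_efficiency]))   # -> Banned
    ```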

  1. User Metrics in NASA Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2018-01-01

    This presentation describes the collection and use of user metrics in NASA's Earth Science data systems. A variety of collection methods is discussed, with particular emphasis given to the American Customer Satisfaction Index (ACSI). User sentiment on the potential use of cloud computing is presented, with generally positive responses. The presentation also discusses various forms of automatically collected metrics, including an example of the relative usage of different functions within the Giovanni analysis system.

  2. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    § 90.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  3. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    § 90.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  4. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    § 90.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  5. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    § 90.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  6. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    § 90.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  7. Efficient Method of Achieving Agreements between Individuals and Organizations about RFID Privacy

    NASA Astrophysics Data System (ADS)

    Cha, Shi-Cho

    This work presents novel technical and legal approaches that address privacy concerns for personal data in RFID systems. In recent years, to minimize the conflict between convenience and the privacy risk of RFID systems, organizations have been requested to disclose their policies regarding RFID activities, obtain customer consent, and adopt appropriate mechanisms to enforce these policies. However, current research on RFID typically focuses on enforcement mechanisms to protect personal data stored in RFID tags and to prevent organizations from tracking user activity through information emitted by specific RFID tags. A missing piece is how organizations can obtain customers' consent efficiently and flexibly. This study recommends that organizations obtain licenses automatically or semi-automatically before collecting personal data via RFID technologies, rather than dealing with written consent forms. Such digitalized, standard licenses can be checked automatically to ensure that the collection and use of personal data is based on user consent. While individuals can easily control who holds licenses and what the licenses contain, the proposed framework provides an efficient and flexible way to overcome the deficiencies of current privacy protection technologies for RFID systems.

  8. Automatic detection and notification of "wrong patient-wrong location" errors in the operating room.

    PubMed

    Sandberg, Warren S; Häkkinen, Matti; Egan, Marie; Curran, Paige K; Fairbrother, Pamela; Choquette, Ken; Daily, Bethany; Sarkka, Jukka-Pekka; Rattner, David

    2005-09-01

    When procedures and processes to assure patient location based on human performance do not work as expected, patients are brought incrementally closer to a possible "wrong patient-wrong procedure" error. We developed a system for automated patient location monitoring and management. Real-time data from an active infrared/radio frequency identification tracking system provides patient location data that are robust and can be compared with an "expected process" model to automatically flag wrong-location events as soon as they occur. The system also generates messages that are automatically sent to process managers via the hospital paging system, thus creating an active alerting function to annunciate errors. We deployed the system to detect and annunciate "patient-in-wrong-OR" events. The system detected all "wrong-operating room (OR)" events, and all "wrong-OR" locations were correctly assigned within 0.50 ± 0.28 minutes (mean ± SD). This corresponded to the measured latency of the tracking system. All wrong-OR events were correctly annunciated via the paging function. This experiment demonstrates that current technology can automatically collect sufficient data to remotely monitor patient flow through a hospital, provide decision support based on predefined rules, and automatically notify stakeholders of errors.
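
    A minimal sketch of the expected-process check the study describes: a location event is compared with the scheduled OR, and a page is raised on mismatch. The schedule table and paging hook are placeholders, not the hospital's actual interfaces.

    ```python
    # Compare a real-time location event against the expected-process model
    # (here: a static schedule) and page the process manager on mismatch.
    EXPECTED_OR = {"patient-0042": "OR-7"}       # hypothetical schedule

    def on_location_event(patient_id: str, observed_room: str, page) -> None:
        """Flag and annunciate a wrong-OR event as soon as it occurs."""
        expected = EXPECTED_OR.get(patient_id)
        if expected is not None and observed_room.startswith("OR-") \
                and observed_room != expected:
            page(f"WRONG-OR: {patient_id} seen in {observed_room}, "
                 f"scheduled for {expected}")

    # Stand-in for the hospital paging system: just print the alert.
    on_location_event("patient-0042", "OR-3", page=print)
    ```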

  9. Wireless Sensor Network-Based Greenhouse Environment Monitoring and Automatic Control System for Dew Condensation Prevention

    PubMed Central

    Park, Dae-Heon; Park, Jang-Woo

    2011-01-01

    Dew condensation on the leaf surface of greenhouse crops can promote diseases caused by fungi and bacteria, affecting the growth of the crops. In this paper, we present a WSN (Wireless Sensor Network)-based automatic monitoring system to prevent dew condensation in a greenhouse environment. The system is composed of sensor nodes for collecting data, base nodes for processing the collected data, relay nodes for driving devices that adjust the environment inside the greenhouse, and an environment server for data storage and processing. Using the Barenbrug formula for calculating the dew point on the leaves, the system prevents dew condensation on the crop’s surface, an important element in the prevention of disease infection. We also constructed a physical model resembling a typical greenhouse in order to verify the performance of our system with regard to dew condensation control. PMID:22163813

  10. Wireless sensor network-based greenhouse environment monitoring and automatic control system for dew condensation prevention.

    PubMed

    Park, Dae-Heon; Park, Jang-Woo

    2011-01-01

    Dew condensation on the leaf surface of greenhouse crops can promote diseases caused by fungi and bacteria, affecting the growth of the crops. In this paper, we present a WSN (Wireless Sensor Network)-based automatic monitoring system to prevent dew condensation in a greenhouse environment. The system is composed of sensor nodes for collecting data, base nodes for processing the collected data, relay nodes for driving devices that adjust the environment inside the greenhouse, and an environment server for data storage and processing. Using the Barenbrug formula for calculating the dew point on the leaves, the system prevents dew condensation on the crop's surface, an important element in the prevention of disease infection. We also constructed a physical model resembling a typical greenhouse in order to verify the performance of our system with regard to dew condensation control.
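
    A sketch of the dew-condensation check. The paper uses the Barenbrug formula; as a stand-in, this sketch computes the dew point with the common Magnus approximation, so the constants and the safety margin below are assumptions, not the paper's values.

    ```python
    # Dew-point check with the Magnus approximation (stand-in for the
    # Barenbrug formula used in the paper). Constants a, b are Magnus values.
    import math

    def dew_point_c(temp_c: float, rel_humidity: float) -> float:
        """Magnus approximation of the dew point; rel_humidity in percent."""
        a, b = 17.27, 237.7
        gamma = a * temp_c / (b + temp_c) + math.log(rel_humidity / 100.0)
        return b * gamma / (a - gamma)

    def condensation_risk(leaf_temp_c: float, air_temp_c: float,
                          rel_humidity: float, margin: float = 0.5) -> bool:
        """True when the leaf has cooled to within `margin` of the dew point."""
        return leaf_temp_c <= dew_point_c(air_temp_c, rel_humidity) + margin

    if condensation_risk(leaf_temp_c=12.0, air_temp_c=14.0, rel_humidity=95.0):
        print("activate heating/ventilation relay")  # relay node would act here
    ```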

  11. The study of data collection method for the plasma properties collection and evaluation system from web

    NASA Astrophysics Data System (ADS)

    Park, Jun-Hyoung; Song, Mi-Young; Plasma Fundamental Technology Research Team

    2015-09-01

    Plasma databases are essential for computing plasma parameters, and highly reliable databases are closely tied to improving the accuracy of simulations. Therefore, a major concern of the plasma properties collection and evaluation system is to create a sustainable and useful research environment for plasma data. The system is committed to providing not only numerical data but also bibliographic data (including DOI information). Originally, our data collection was done by manual search, which in some cases took a long time. We will find data more automatically and quickly than with these legacy methods by crawling or by using a search engine such as Lucene.

  12. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P. J.; Gustafsson, U. R. C.

    1976-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape recorded in flight for later computer processing on the ground.

  13. A fire danger rating system for Hawaii

    Treesearch

    Robert E. Burgan; Francis M. Fujioka; George H. Hirata

    1974-01-01

    Extremes in rainfall on the Hawaiian Islands make it difficult to judge forest fire danger conditions. The use of an automatic data collection and computer processing system helps to monitor the problem.

  14. Calibration of automatic performance measures - speed and volume data : volume 1, evaluation of the accuracy of traffic volume counts collected by microwave sensors.

    DOT National Transportation Integrated Search

    2015-09-01

    Over the past few years, the Utah Department of Transportation (UDOT) has developed a system called the Signal Performance Metrics System (SPMS) to evaluate the performance of signalized intersections. This system currently provides data summarie...

  15. Facilities Engineering Management System Study: Catalog of Automatic Data Processing Applications Developed by USACERL (U.S. Army Construction Engineering Research Laboratory) for Army Installation Directories of Engineering and Housing

    DTIC Science & Technology

    1989-08-01

    Programming Languages Used: AUTOCAD Command, AUTOLISP. Type of Commercial Program Used: CAD. Specific Commercial Program Used: AUTOCAD, Version 1.0. ...collection which the system can directly translate into printed reports. This eliminates the need for filling in data collection forms and manual compiling of...

  16. Design and realization of an automatic weather station at island

    NASA Astrophysics Data System (ADS)

    Chen, Yong-hua; Li, Si-ren

    2011-10-01

    In this paper, the design and development of an automatic weather station is described. The proposed system consists of a set of sensors for measuring meteorological parameters (temperature, wind speed and direction, rainfall, visibility, etc.). To increase the reliability of the system, wind speed and direction are measured redundantly with duplicate sensors. The sensor signals are collected by the CR1000 data logger at several analog and digital inputs. The CR1000 and the sensors form a completely autonomous system that works with the other systems installed in the container. Communication with the master PC is accomplished via Code Division Multiple Access (CDMA) using the Compact Caimore6550P CDMA DTU. The data are stored in tables on the CPU as well as on the CF card. The weather station was built as an efficient autonomous system that operates together with the other systems to provide the required data for a fully automatic measurement system.

  17. [Automated anesthesia record system].

    PubMed

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a client/server architecture, software for an automated anesthesia record system, running under the Windows operating system on a network, has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0 and SQL Server. The system handles patient information throughout anesthesia. It can collect and integrate data from several kinds of medical equipment, such as monitors, infusion pumps and anesthesia machines, automatically and in real time, and then generates the anesthesia sheets automatically. The record system makes the anesthesia record more accurate and complete and can raise the anesthesiologist's working efficiency.

  18. Karst show caves - how DTN technology as used in space assists automatic environmental monitoring and tourist protection - experiment in Postojna cave

    NASA Astrophysics Data System (ADS)

    Gabrovšek, F.; Grašič, B.; Božnar, M. Z.; Mlakar, P.; Udén, M.; Davies, E.

    2013-10-01

    The paper presents an experiment demonstrating a novel and successful application of Delay- and Disruption-Tolerant Networking (DTN) technology for automatic data transfer in a karst cave Early Warning and Measuring System. The experiment took place inside the Postojna Cave in Slovenia, which is open to tourists. Several automatic meteorological measuring stations are set up inside the cave, as an adjunct to the surveillance infrastructure; the regular data transfer provided by the DTN technology allows the surveillance system to take on the role of an Early Warning System (EWS). One of the stations is set up alongside the railway tracks on which tourists travel inside the cave by train. The experiment was carried out by placing a DTN "data mule" (a DTN-enabled computer with a WiFi connection) on the train and by upgrading the meteorological station with a DTN-enabled WiFi transmission system. When the data mule is in the wireless drive-by mode, it collects measurement data from the station over a period of several seconds as the train passes the stationary equipment, and delivers the data at the final train station by the cave entrance. This paper gives an overview of the experimental equipment and organisation allowing the use of a DTN system for data collection and an EWS inside karst caves where there is regular traffic of tourists and researchers.

  19. Karst show caves - how DTN technology as used in space assists automatic environmental monitoring and tourist protection - experiment in Postojna Cave

    NASA Astrophysics Data System (ADS)

    Gabrovšek, F.; Grašič, B.; Božnar, M. Z.; Mlakar, P.; Udén, M.; Davies, E.

    2014-02-01

    The paper presents an experiment demonstrating a novel and successful application of delay- and disruption-tolerant networking (DTN) technology for automatic data transfer in a karst cave early warning and measuring system. The experiment took place inside the Postojna Cave in Slovenia, which is open to tourists. Several automatic meteorological measuring stations are set up inside the cave, as an adjunct to the surveillance infrastructure; the regular data transfer provided by the DTN technology allows the surveillance system to take on the role of an early warning system (EWS). One of the stations is set up alongside the railway tracks on which tourists travel inside the cave by train. The experiment was carried out by placing a DTN "data mule" (a DTN-enabled computer with a WiFi connection) on the train and by upgrading the meteorological station with a DTN-enabled WiFi transmission system. When the data mule is in the wireless drive-by mode, it collects measurement data from the station over a period of several seconds as the train passes the stationary equipment without stopping, and delivers the data at the final train station by the cave entrance. This paper gives an overview of the experimental equipment and organization allowing the use of a DTN system for data collection and an EWS inside karst caves where there is regular traffic of tourists and researchers.
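
    A simplified sketch of the drive-by "data mule" pattern described in both versions of this record: bundles are pulled while a station is briefly in range and delivered at the gateway by the cave entrance. A real deployment would run a DTN bundle-protocol stack; this toy loop only mirrors the store-carry-forward idea.

    ```python
    # Store-carry-forward toy: the mule takes custody of readings during a
    # short contact window and hands them to the gateway later.
    import time

    class DataMule:
        def __init__(self):
            self.bundles = []                # custody of undelivered readings

        def drive_by(self, station_buffer, contact_seconds: float) -> None:
            """Collect for the few seconds the station stays in range."""
            deadline = time.monotonic() + contact_seconds
            while time.monotonic() < deadline and station_buffer:
                self.bundles.append(station_buffer.pop(0))

        def deliver(self, gateway: list) -> None:
            """Hand everything over at the cave-entrance gateway."""
            gateway.extend(self.bundles)
            self.bundles.clear()

    station_buffer = [{"t_air": 10.2}, {"rh": 98.1}]  # queued measurements
    mule, gateway = DataMule(), []
    mule.drive_by(station_buffer, contact_seconds=0.01)
    mule.deliver(gateway)
    print(gateway)
    ```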

  20. Research on Automatic Positioning System of Ultrasonic Testing of Wind Turbine Blade Flaws

    NASA Astrophysics Data System (ADS)

    Liu, Q. X.; Wang, Z. H.; Long, S. G.; Cai, M.; Cai, M.; Wang, X.; Chen, X. Y.; Bu, J. L.

    2017-11-01

    Ultrasonic testing technology is used extensively in the non-destructive testing of wind turbine blades. However, ultrasonic flaw detection has been employed inefficiently in recent years, because testing results can show small deviations due to human, environmental and technical factors. There is therefore an urgent technical demand from engineers to test the various flaws efficiently and quickly. An automatic positioning system has been designed in this paper to record the moving coordinates and the target distance in real time, while launching and acquiring the sonic wave automatically. The system uses the ADNS-3080 optoelectronic chip manufactured by Agilent Technologies Inc. In combination with the chip, the power conversion module and the USB transmission module, the collected data can be transmitted from the upper monitor to the hardware, which processes and controls the data through software programming. An experiment was designed to prove the reliability of the automatic positioning system. The result was validated by comparing the data collected from LabVIEW with actual plots on a Perspex plane; it is concluded that the system possesses high accuracy and significant value in practical engineering.

  1. Smartphone data as an electronic biomarker of illness activity in bipolar disorder.

    PubMed

    Faurholt-Jepsen, Maria; Vinberg, Maj; Frost, Mads; Christensen, Ellen Margrethe; Bardram, Jakob E; Kessing, Lars Vedel

    2015-11-01

    Objective methods are lacking for continuous monitoring of illness activity in bipolar disorder. Smartphones offer unique opportunities for continuous monitoring and automatic collection of real-time data. The objectives of the paper were to test the hypotheses that (i) daily electronic self-monitored data and (ii) automatically generated objective data collected using smartphones correlate with clinical ratings of depressive and manic symptoms in patients with bipolar disorder. Software for smartphones (the MONARCA I system) that collects automatically generated objective data and self-monitored data on illness activity in patients with bipolar disorder was developed by the authors. A total of 61 patients aged 18-60 years and with a diagnosis of bipolar disorder according to ICD-10 used the MONARCA I system for six months. Depressive and manic symptoms were assessed monthly using the Hamilton Depression Rating Scale 17-item (HDRS-17) and the Young Mania Rating Scale (YMRS), respectively. Data are representative of over 400 clinical ratings. Analyses were computed using linear mixed-effect regression models allowing for both between individual variation and within individual variation over time. Analyses showed significant positive correlations between the duration of incoming and outgoing calls/day and scores on the HDRS-17, and significant positive correlations between the number and duration of incoming calls/day and scores on the YMRS; the number of and duration of outgoing calls/day and scores on the YMRS; and the number of outgoing text messages/day and scores on the YMRS. Analyses showed significant negative correlations between self-monitored data (i.e., mood and activity) and scores on the HDRS-17, and significant positive correlations between self-monitored data (i.e., mood and activity) and scores on the YMRS. Finally, the automatically generated objective data were able to discriminate between affective states. Automatically generated objective data and self-monitored data collected using smartphones correlate with clinically rated depressive and manic symptoms and differ between affective states in patients with bipolar disorder. Smartphone apps represent an easy and objective way to monitor illness activity with real-time data in bipolar disorder and may serve as an electronic biomarker of illness activity. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
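
    A brief sketch of the type of linear mixed-effects model reported above, using statsmodels with a random intercept per patient; the column names and the miniature data set are invented for illustration, and real analyses would use the study's 400+ ratings.

    ```python
    # Mixed-effects regression: symptom score ~ phone-usage feature, with a
    # random intercept per patient (between/within variation, as described).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "patient":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
        "hdrs17":   [8, 12, 15, 5, 7, 9, 14, 17, 20],
        "call_min": [10, 18, 25, 4, 8, 11, 22, 30, 35],  # call minutes/day
    })

    # Toy-sized data may raise convergence warnings; it is for shape only.
    model = smf.mixedlm("hdrs17 ~ call_min", df, groups=df["patient"])
    result = model.fit()
    print(result.summary())   # fixed-effect slope for the usage feature
    ```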

  2. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
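
    A sketch of the analysis pipeline described above: PCA to extract dietary patterns, a linear model for BMI, and leave-one-out cross-validation of the "would-be obese" classification. The feature matrix is random stand-in data, not the AutoMealRecord set.

    ```python
    # PCA dietary patterns + linear regression for BMI, scored by LOO CV.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(0)
    meals = rng.random((60, 12))          # 60 employees x 12 menu categories
    bmi = 21 + meals[:, 0] * 4 + rng.normal(0, 1, 60)   # synthetic outcome

    patterns = PCA(n_components=5).fit_transform(meals)  # 5 dietary patterns
    pred = cross_val_predict(LinearRegression(), patterns, bmi,
                             cv=LeaveOneOut())
    accuracy = np.mean((pred >= 23) == (bmi >= 23))  # "would-be obese" hits
    print(f"LOO classification accuracy at BMI >= 23: {accuracy:.1%}")
    ```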

  3. Developing an Active Traffic Management System for I-70 in Colorado

    DOT National Transportation Integrated Search

    2012-09-01

    The Colorado DOT is at the forefront of developing an Active Traffic Management (ATM) system that not only considers operational aspects, but also integrates safety measures. In this research, data collected from Automatic Vehicle Identification (A...

  4. Standards for the Analysis and Processing of Surface-Water Data and Information Using Electronic Methods

    USGS Publications Warehouse

    Sauer, Vernon B.

    2002-01-01

    Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
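
    A small sketch of one core computation such standards cover: applying a shift to stage and interpolating a stage-discharge rating in log space. The rating points and shift value are invented, and real USGS ratings involve considerably more structure than this.

    ```python
    # Stage-discharge computation with a shift adjustment, log-interpolated.
    import numpy as np

    # Rating curve as (stage ft, discharge ft^3/s) pairs; values invented.
    rating_stage = np.array([1.0, 2.0, 4.0, 8.0])
    rating_q     = np.array([10., 60., 400., 3000.])

    def discharge(stage, shift: float = 0.0) -> np.ndarray:
        """Apply the shift to stage, then interpolate the rating in log space."""
        adj = np.asarray(stage, dtype=float) + shift
        return np.exp(np.interp(np.log(adj), np.log(rating_stage),
                                np.log(rating_q)))

    unit_stage = np.array([1.5, 2.3, 3.9])     # 15-minute unit values
    print(discharge(unit_stage, shift=-0.05))  # shifted rating applied
    ```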

  5. 3D exploitation of large urban photo archives

    NASA Astrophysics Data System (ADS)

    Cho, Peter; Snavely, Noah; Anderson, Ross

    2010-04-01

    Recent work in computer vision has demonstrated the potential to automatically recover camera and scene geometry from large collections of uncooperatively-collected photos. At the same time, aerial ladar and Geographic Information System (GIS) data are becoming more readily accessible. In this paper, we present a system for fusing these data sources in order to transfer 3D and GIS information into outdoor urban imagery. Applying this system to 1000+ pictures shot of the lower Manhattan skyline and the Statue of Liberty, we present two proof-of-concept examples of geometry-based photo enhancement which are difficult to perform via conventional image processing: feature annotation and image-based querying. In these examples, high-level knowledge projects from 3D world-space into georegistered 2D image planes and/or propagates between different photos. Such automatic capabilities lay the groundwork for future real-time labeling of imagery shot in complex city environments by mobile smart phones.

  6. Data and knowledge in medical distributed applications.

    PubMed

    Serban, Alexandru; Crişan-Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara

    2014-01-01

    Building a clinical decision support system (CDSS) capable of automatically collecting and processing patient data and supporting diagnosis, based on information, symptoms and investigations, is one of the current challenges for researchers and medical science. The purpose of the current study is to design a cloud-based CDSS to improve patient safety, quality of care and organizational efficiency. The paper presents the design of a cloud-based application built around a medical approach that covers the diagnosis of different diseases, differentiated by the most important pathologies. Using online questionnaires, traditional and new data will be collected from patients. After data input, the application will formulate a presumptive diagnosis and direct patients to the corresponding department. A questionnaire will dynamically ask about the interface and possible functionality improvements. Based on the answers, the functionality of the system and the user interface will be improved, considering the real needs expressed by the end users. The cloud-based CDSS is a useful tool for patients, physicians and healthcare providers, providing computer support in the diagnosis of different pathologies and an accurate automatic differential diagnosis system.

  7. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  8. Assessment of WMATA's Automatic Fare Collection Equipment Performance

    DOT National Transportation Integrated Search

    1981-01-01

    The Washington Metropolitan Area Transit Authority (WMATA) has had an Automatic Fare Collection (AFC) system in operation since June 1977. The AFC system, comprised of entry/exit gates, farecard vendors, and addfare machines, initially encountered ma...

  9. Recent advances in automatic alignment system for the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Wilhelmsen, Karl; Awwal, Abdul A. S.; Kalantar, Dan; Leach, Richard; Lowe-Webb, Roger; McGuigan, David; Miller Kamm, Vicki

    2011-03-01

    The automatic alignment system for the National Ignition Facility (NIF) is a large-scale parallel system that directs all 192 laser beams along the 300-m optical path to a 50-micron focus at the target chamber in less than 50 minutes. The system automatically commands 9,000 stepping motors to adjust mirrors and other optics based upon images acquired from high-resolution digital cameras viewing the beams at various locations. Forty-five control loops per beamline request image-processing services running on a Linux cluster to analyze these images of the beams and references, and automatically steer the beams toward the target. This paper discusses upgrades to the NIF automatic alignment system to handle new alignment needs and evolving requirements related to the various types of experiments performed. As NIF becomes a continuously operated system and more experiments are performed, performance monitoring is increasingly important for maintenance and commissioning work. Data collected during operations are analyzed for tuning of the laser and for targeting maintenance work. Handling evolving alignment and maintenance needs is expected over the planned 30-year operational life of NIF.

  10. Prototyping sensor network system for automatic vital signs collection. Evaluation of a location based automated assignment of measured vital signs to patients.

    PubMed

    Kuroda, T; Noma, H; Naito, C; Tada, M; Yamanaka, H; Takemura, T; Nin, K; Yoshihara, H

    2013-01-01

    Development of a clinical sensor network system that automatically collects vital signs and supplemental data, and evaluation of the effect of automatically assigning measured vital sign values to patients based on sensor locations. The sensor network estimates the data source, i.e. the target patient, from the position of a vital sign sensor obtained from a newly developed proximity sensing system. The proximity sensing system estimates device positions using a Bluetooth inquiry process. Using Bluetooth access points and the positioning system newly developed in this project, the sensor network collects vital signs and their 4W (who, where, what, and when) supplemental data from any Bluetooth-ready vital sign sensors, such as Continua-ready devices. The prototype was evaluated in a pseudo-clinical setting at Kyoto University Hospital using a cyclic paired comparison and statistical analysis. The result of the cyclic paired analysis shows that the subjects rated the proposed system as more effective and safer than POCS as well as paper-based operation. It halves the time needed for vital signs input and eliminates input errors. On the other hand, the prototype failed in its position estimation in 12.6% of all attempts, and the nurses overlooked half of the errors. A detailed investigation makes clear that an advanced interface showing the system's "confidence", i.e. the probability of estimation error, should be effective in reducing the oversights. This paper proposes a clinical sensor network system that relieves nurses of vital signs input tasks. The results clearly show that the proposed system increases the efficiency and safety of the nursing process both subjectively and objectively. It is a step toward a new generation of point-of-nursing-care systems in which sensors take over data input tasks from nurses.
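
    A toy sketch of the location-based assignment step, including the kind of "confidence" gate the investigation suggests: the measurement goes to the patient nearest the sensor, but a small margin between the best and second-best candidate defers to the nurse. Positions and the threshold are invented.

    ```python
    # Assign a sensor reading to the nearest bed position; refuse automatic
    # assignment when the distance margin to the runner-up is too small.
    import math

    BED_POSITIONS = {"patient-A": (0.0, 0.0), "patient-B": (4.0, 1.0)}

    def assign(sensor_xy, min_margin: float = 1.5):
        """Return (patient, confident?) for a sensor position estimate."""
        ranked = sorted(BED_POSITIONS.items(),
                        key=lambda kv: math.dist(sensor_xy, kv[1]))
        (best, best_xy), (runner_up, runner_xy) = ranked[0], ranked[1]
        margin = math.dist(sensor_xy, runner_xy) - math.dist(sensor_xy, best_xy)
        return best, margin >= min_margin   # low margin -> ask the nurse

    print(assign((0.5, 0.2)))   # ('patient-A', True)
    ```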

  11. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, all kinds of bedside monitoring equipment are connected into the system using information integration technology. After statistical analysis of the patient data by data mining technology, patient status can be evaluated automatically based on a risk prediction standard and a decision support system, enabling the anesthetist to perform reasonable and safe clinical processes. With clinical processes electronically recorded, standard record tables can be generated, and the clinical workflow is optimized as well. With the system, various kinds of patient data can be collected, stored, analyzed and archived, various anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk for patients and clinicians, and helping to provide clinical evidence.

  12. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    § 89.409 Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  13. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    § 89.409 Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  14. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  15. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  16. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  17. Rail Transit System Maintenance Practices for Automatic Fare Collection Equipment

    DOT National Transportation Integrated Search

    1984-05-01

    A review of rail transit system maintenance practices for automatic fare collection (AFC) equipment was performed. This study supports an UMTA sponsored program to improve the reliability of AFC equipment. The maintenance practices of the transit sys...

  18. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    PubMed

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed a Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment used were a patient monitor, the anaesthesia machine, and the bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, a regional oximeter, and a rapid infusion device were added as required. The automatic recording option was used exclusively and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) of cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.
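
    The time synchronisation this record describes can be approximated with an as-of join: each slower device stream is aligned to a master timeline by taking its most recent sample at or before each master timestamp. A minimal sketch with pandas; the column names and sampling rates are invented, not the Vital Recorder's actual format.

```python
import pandas as pd

# Two device streams sampled at different rates (timestamps in seconds).
monitor = pd.DataFrame({"t": [0.0, 0.5, 1.0, 1.5], "hr": [72, 73, 71, 70]})
bis = pd.DataFrame({"t": [0.2, 1.1], "bis": [45, 47]})

# Align the slower BIS stream onto the monitor timeline, taking the most
# recent BIS value at or before each monitor sample (tolerance 1 s).
merged = pd.merge_asof(monitor, bis, on="t", direction="backward", tolerance=1.0)
print(merged)
```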

  19. SmartPort: A Platform for Sensor Data Monitoring in a Seaport Based on FIWARE

    PubMed Central

    Fernández, Pablo; Santana, José Miguel; Ortega, Sebastián; Trujillo, Agustín; Suárez, José Pablo; Domínguez, Conrado; Santana, Jaisiel; Sánchez, Alejandro

    2016-01-01

    Seaport monitoring and management is a significant research area, in which infrastructure automatically collects big data sets that guide the organization in its multiple activities. Thus, this problem is closely related to the fields of data acquisition, transfer, storage, big data analysis and information visualization. Las Palmas de Gran Canaria port is a good example of how a seaport generates big data volumes through a network of sensors. They are placed on meteorological stations and maritime buoys, registering environmental parameters. Likewise, the Automatic Identification System (AIS) registers several dynamic parameters of the tracked vessels. However, such an amount of data is useless without a system that enables meaningful visualization and helps make decisions. In this work, we present SmartPort, a platform that offers a distributed architecture for the collection of the port sensors' data and a rich Internet application that allows the user to explore the geolocated data. The presented SmartPort tool is a representative, promising and inspiring approach to managing and developing a smart system. It covers a demanding need for big data analysis and visualization utilities for managing complex infrastructures, such as a seaport. PMID:27011192

  20. Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning.

    PubMed

    Norouzzadeh, Mohammad Sadegh; Nguyen, Anh; Kosmala, Margaret; Swanson, Alexandra; Palmer, Meredith S; Packer, Craig; Clune, Jeff

    2018-06-19

    Having accurate, detailed, and up-to-date information about the location and behavior of animals in the wild would improve our ability to study and conserve ecosystems. We investigate the ability to automatically, accurately, and inexpensively collect such data, which could help catalyze the transformation of many fields of ecology, wildlife biology, zoology, conservation biology, and animal behavior into "big data" sciences. Motion-sensor "camera traps" enable collecting wildlife pictures inexpensively, unobtrusively, and frequently. However, extracting information from these pictures remains an expensive, time-consuming, manual task. We demonstrate that such information can be automatically extracted by deep learning, a cutting-edge type of artificial intelligence. We train deep convolutional neural networks to identify, count, and describe the behaviors of 48 species in the 3.2 million-image Snapshot Serengeti dataset. Our deep neural networks automatically identify animals with >93.8% accuracy, and we expect that number to improve rapidly in years to come. More importantly, if our system classifies only images it is confident about, our system can automate animal identification for 99.3% of the data while still performing at the same 96.6% accuracy as that of crowdsourced teams of human volunteers, saving >8.4 y (i.e., >17,000 h at 40 h/wk) of human labeling effort on this 3.2 million-image dataset. Those efficiency gains highlight the importance of using deep neural networks to automate data extraction from camera-trap images, reducing a roadblock for this widely used technology. Our results suggest that deep learning could enable the inexpensive, unobtrusive, high-volume, and even real-time collection of a wealth of information about vast numbers of animals in the wild. Copyright © 2018 the Author(s). Published by PNAS.
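
    The confidence-based triage the abstract describes (auto-label only the images the network is sure about, route the rest to human volunteers) reduces to a threshold on the top softmax probability. A minimal sketch under that assumption; the threshold and array shapes are illustrative, not the paper's values.

```python
import numpy as np

def triage(probs, threshold=0.99):
    """Split predictions into auto-accepted and human-review sets.

    probs: (n_images, n_species) softmax outputs from the network.
    Returns (auto_labels, auto_idx, review_idx).
    """
    conf = probs.max(axis=1)
    auto = conf >= threshold
    return probs.argmax(axis=1)[auto], np.flatnonzero(auto), np.flatnonzero(~auto)

# Toy data: random scores over 48 species (a low threshold suits random data).
rng = np.random.default_rng(0)
logits = rng.normal(size=(1000, 48))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels, auto_idx, review_idx = triage(probs, threshold=0.2)
print(f"auto-labelled {len(auto_idx)}, sent to volunteers {len(review_idx)}")
```

    Raising the threshold trades automation rate for accuracy, which is exactly the 99.3%-coverage-at-96.6%-accuracy operating point the authors report.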

  1. Automatization of hydrodynamic modelling in a Floreon+ system

    NASA Astrophysics Data System (ADS)

    Ronovsky, Ales; Kuchar, Stepan; Podhoranyi, Michal; Vojtek, David

    2017-07-01

    The paper describes fully automatized hydrodynamic modelling as part of the Floreon+ system. The main purpose of hydrodynamic modelling in disaster management is to provide an accurate overview of the hydrological situation in a given river catchment. Automatization of the process as a web service can provide immediate data based on extreme weather conditions, such as heavy rainfall, without the intervention of an expert. Such a service can be used by non-scientific users such as fire-fighter operators or representatives of a military service organizing evacuation during floods or river dam breaks. The paper describes the whole process, beginning with the definition of the schematization necessary for the hydrodynamic model, the gathering of the necessary data and its processing for a simulation, the model itself, and the post-processing of the results and their visualization on a web service. The process is demonstrated on real data collected during the floods in the Moravian-Silesian region in 2010.

  2. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  3. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  4. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  5. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  6. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  7. 40 CFR 51.362 - Motorist compliance enforcement program oversight.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... collection through the use of automatic data capture systems such as bar-code scanners or optical character... determination of compliance through parking lot surveys, road-side pull-overs, or other in-use vehicle...

  8. 40 CFR 51.362 - Motorist compliance enforcement program oversight.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... collection through the use of automatic data capture systems such as bar-code scanners or optical character... determination of compliance through parking lot surveys, road-side pull-overs, or other in-use vehicle...

  9. Calibration of automatic performance measures - speed and volume data: volume 2, evaluation of the accuracy of approach volume counts and speeds collected by microwave sensors.

    DOT National Transportation Integrated Search

    2016-05-01

    This study evaluated the accuracy of approach volumes and free flow approach speeds collected by the Wavetronix : SmartSensor Advance sensor for the Signal Performance Metrics system of the Utah Department of Transportation (UDOT), : using the field ...

  10. A cloud-based system for automatic glaucoma screening.

    PubMed

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases, including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resulting medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling early intervention for more effective disease management.

  11. The Use of Automatic Indexing for Authority Control.

    ERIC Educational Resources Information Center

    Dillon, Martin; And Others

    1981-01-01

    Uses an experimental system for authority control on a collection of bibliographic records to demonstrate the resemblance between thesaurus-based automatic indexing and automatic authority control. Details of the automatic indexing system are given, results discussed, and the benefits of the resemblance examined. Included are a rules appendix and…

  12. Data visualization as a tool for improved decision making within transit agencies

    DOT National Transportation Integrated Search

    2007-02-01

    TriMet, the regional transit provider in the Portland, OR, area has been a leader in bus transit performance monitoring using data collected via automatic vehicle location and automatic passenger counter technologies. This information is collected an...

  13. Spacelab 4: Primate experiment support hardware

    NASA Astrophysics Data System (ADS)

    Fusco, P. R.; Peyran, R. J.

    1984-05-01

    A squirrel monkey feeder and automatic urine collection system were designed to fly on the Spacelab 4 Shuttle Mission presently scheduled for January 1986. Prototypes of the feeder and urine collection systems were fabricated and extensively tested on squirrel monkeys at the National Aeronautics and Space Administration's (NASA) Ames Research Center (ARC). The feeder design minimizes impact on the monkey's limited space in the cage and features improved reliability and biocompatibility over previous systems. The urine collection system is the first flight qualified, automatic urine collection device for squirrel monkeys. Flight systems are currently being fabricated.

  14. Spacelab 4: Primate experiment support hardware

    NASA Technical Reports Server (NTRS)

    Fusco, P. R.; Peyran, R. J.

    1984-01-01

    A squirrel monkey feeder and automatic urine collection system were designed to fly on the Spacelab 4 Shuttle Mission presently scheduled for January 1986. Prototypes of the feeder and urine collection systems were fabricated and extensively tested on squirrel monkeys at the National Aeronautics and Space Administration's (NASA) Ames Research Center (ARC). The feeder design minimizes impact on the monkey's limited space in the cage and features improved reliability and biocompatibility over previous systems. The urine collection system is the first flight qualified, automatic urine collection device for squirrel monkeys. Flight systems are currently being fabricated.

  15. AIRSAR Web-Based Data Processing

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne

    2007-01-01

    The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. It also provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system that automatically generates co-registered multi-frequency images from both polarimetric and interferometric data collection modes at 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use by the remote-sensing community. Features include Survey Automation Processing, in which the software can automatically generate a quick-look image from an entire 90-GB SAR raw-data 32-MB/s tape overnight without operator intervention. The software also allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates products according to each data processing request stored in the database via a queue management system. Users obtain automatically generated, co-registered multi-frequency images, as the software performs polarimetric and/or interferometric SAR data processing in ground and/or slant projection according to user processing requests for any of the 12 radar modes.

  16. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  17. Computer memory management system

    DOEpatents

    Kirk, III, Whitson John

    2002-01-01

    A computer memory management system utilizing a memory structure system of "intelligent" pointers in which information related to the use status of the memory structure is designed into the pointer. Through this pointer system, the invention provides essentially automatic memory management (often referred to as garbage collection) by allowing relationships between objects to have definite memory management behavior, using a coding protocol which describes when relationships should be maintained and when they should be broken. In one aspect, the system allows automatic breaking of strong links to facilitate object garbage collection, coupled with relationship adjectives which define deletion of associated objects. In another aspect, the invention includes simple-to-use infinite undo/redo functionality, in that it has the capability, through a simple function call, to undo all of the changes made to a data model since the previous `valid state` was noted.

  18. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P.; Gustafsson, U. R. C.

    1975-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of selected particulate and gaseous constituents of the atmosphere has been installed on a number of commercial airliners and is collecting data on commercial air routes covering the world. Measurements of constituents related to aircraft engine emissions and other pollutants are made in the upper troposphere and lower stratosphere (6 to 12 km) in support of the Global Air Sampling Program (GASP). Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This system includes specialized instrumentation for measuring carbon monoxide, ozone, water vapor, and particulates, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituents and related flight data are tape recorded in flight for later computer processing on the ground.

  19. Electronic field permeameter

    DOEpatents

    Chandler, Mark A.; Goggin, David J.; Horne, Patrick J.; Kocurek, Gary G.; Lake, Larry W.

    1989-01-01

    For making rapid, non-destructive permeability measurements in the field, a portable minipermeameter of the kind having a manually-operated gas injection tip is provided with a microcomputer system which operates a flow controller to precisely regulate the gas flow rate to a test sample, and reads a pressure sensor which senses the pressure across the test sample. The microcomputer system automatically turns on the gas supply at the start of each measurement, senses when a steady state is reached, collects and records pressure and flow rate data, and shuts off the gas supply immediately after the measurement is completed. Preferably, temperature is also sensed to correct for changes in gas viscosity. The microcomputer system may also provide automatic zero-point adjustment, sensor calibration, and over-range sensing, and may select controllers, sensors, and set-points for obtaining the most precise measurements. Electronic sensors may provide increased accuracy and precision. Preferably, one microcomputer is used for sensing, instrument control, and data collection, and a second microcomputer is used which is dedicated to recording and processing the data, selecting the sensors and set-points for obtaining the most precise measurements, and instructing the user how to set up and operate the minipermeameter. To provide mass data collection and user-friendly operation, the second microcomputer is preferably a lap-type portable microcomputer having a non-volatile or battery-backed CMOS memory.

  20. Visual perception system and method for a humanoid robot

    NASA Technical Reports Server (NTRS)

    Chelian, Suhas E. (Inventor); Linn, Douglas Martin (Inventor); Wampler, II, Charles W. (Inventor); Bridgwater, Lyndon (Inventor); Wells, James W. (Inventor); Mc Kay, Neil David (Inventor)

    2012-01-01

    A robotic system includes a humanoid robot with robotic joints each moveable using an actuator(s), and a distributed controller for controlling the movement of each of the robotic joints. The controller includes a visual perception module (VPM) for visually identifying and tracking an object in the field of view of the robot under threshold lighting conditions. The VPM includes optical devices for collecting an image of the object, a positional extraction device, and a host machine having an algorithm for processing the image and positional information. The algorithm visually identifies and tracks the object, and automatically adapts an exposure time of the optical devices to prevent feature data loss of the image under the threshold lighting conditions. A method of identifying and tracking the object includes collecting the image, extracting positional information of the object, and automatically adapting the exposure time to thereby prevent feature data loss of the image.

  1. Virtual Titrator: A Student-Oriented Instrument.

    ERIC Educational Resources Information Center

    Ritter, David; Johnson, Michael

    1997-01-01

    Describes a titrator system, constructed from a computer-interfaced pH-meter, that was designed to increase student involvement in the process. Combines automatic data collection with real-time graphical display and interactive controls to focus attention on the process rather than on bits of data. Improves understanding of concepts and…

  2. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    PubMed

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for the extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and the subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Of these, 6% of drug label changes were detected only by text mining. JSRE enabled the precise identification of four adverse drug events from MEDLINE that were otherwise undetectable. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
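
    The record does not name the exact statistic used for signal detection; one common disproportionality measure in pharmacovigilance is the proportional reporting ratio (PRR), sketched below as an assumption rather than as the authors' method. The counts are invented.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table.

    a: reports with the drug and the event
    b: reports with the drug, other events
    c: reports without the drug but with the event
    d: reports without the drug and without the event
    """
    drug_rate = a / (a + b)
    other_rate = c / (c + d)
    return drug_rate / other_rate

# A signal is often flagged when PRR > 2 with at least 3 cases.
value = prr(a=12, b=488, c=40, d=9460)
print(f"PRR = {value:.2f}, signal: {value > 2}")
```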

  3. Real World Experience With Ion Implant Fault Detection at Freescale Semiconductor

    NASA Astrophysics Data System (ADS)

    Sing, David C.; Breeden, Terry; Fakhreddine, Hassan; Gladwin, Steven; Locke, Jason; McHugh, Jim; Rendon, Michael

    2006-11-01

    The Freescale automatic fault detection and classification (FDC) system has logged data from over 3.5 million implants in the past two years. The Freescale FDC system is a low-cost system which collects summary implant statistics at the conclusion of each implant run. The data are collected either by downloading implant data log files from the implant tool workstation or by exporting summary implant statistics through the tool's automation interface. Compared to traditional FDC systems, which gather trace data from sensors on the tool as the implant proceeds, the Freescale FDC system cannot prevent scrap when a fault initially occurs, since the data are collected after the implant concludes. However, the system can prevent catastrophic scrap events due to faults which go undetected for days or weeks, leading to the loss of hundreds or thousands of wafers. At the Freescale ATMC facility, the practical applications of the FDC system fall into two categories: PM trigger rules, which monitor tool signals such as ion gauges and charge control signals, and scrap prevention rules, which are designed to detect specific failure modes that have been correlated to yield loss and scrap. PM trigger rules are designed to detect shifts in tool signals which indicate normal aging of tool systems. For example, charging parameters gradually shift as flood gun assemblies age, and when charge control rules start to fail, a flood gun PM is performed. Scrap prevention rules are deployed to detect events such as particle bursts and excessive beam noise, events which have been correlated to yield loss. The FDC system also has tool log-down capability, and scrap prevention rules often use this capability to automatically log the tool into a maintenance state while simultaneously paging the sustaining technician for data review and disposition of the affected product.
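
    A post-run rule engine of the kind described (summary statistics in, PM triggers and scrap-prevention actions out) can be sketched in a few lines. The rule names, signal names, and thresholds below are invented for illustration; they are not Freescale's actual rules.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    kind: str                      # "pm_trigger" or "scrap_prevention"
    check: Callable[[dict], bool]  # True -> rule fires

RULES = [
    Rule("flood_gun_aging", "pm_trigger",
         lambda s: s["charge_ctrl_fails"] > 3),
    Rule("particle_burst", "scrap_prevention",
         lambda s: s["particle_count"] > 50),
    Rule("beam_noise", "scrap_prevention",
         lambda s: s["beam_noise_rms"] > 0.05),
]

def evaluate(summary: dict) -> list[str]:
    """Evaluate post-implant summary statistics against all rules."""
    actions = []
    for rule in RULES:
        if rule.check(summary):
            if rule.kind == "scrap_prevention":
                actions.append(f"{rule.name}: log tool down, page technician")
            else:
                actions.append(f"{rule.name}: schedule PM")
    return actions

print(evaluate({"charge_ctrl_fails": 5, "particle_count": 12,
                "beam_noise_rms": 0.08}))
```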

  4. 77 FR 5058 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Automatic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ... for OMB Review; Comment Request; Automatic Fire Sensor and Warning Devices Systems; Examination and..., ``Automatic Fire Sensor and Warning Devices Systems,'' to the Office of Management and Budget (OMB) for review... and warning device systems are maintained and calibrated in order to function properly at all times...

  5. Tashkeela: Novel corpus of Arabic vocalized texts, data for auto-diacritization systems.

    PubMed

    Zerrouki, Taha; Balla, Amar

    2017-04-01

    Arabic diacritics are often omitted in Arabic scripts. This is a handicap for new learners reading Arabic, for text-to-speech conversion systems, and for the reading and semantic analysis of Arabic texts. Automatic diacritization systems are the best solution to handle this issue, but such automation needs resources, such as diacritized texts, to train and evaluate these systems. In this paper, we describe our corpus of Arabic diacritized texts, called Tashkeela. It can be used as a linguistic resource for natural language processing tasks such as automatic diacritization systems, disambiguation mechanisms, and feature and data extraction. The corpus is freely available; it contains 75 million fully vocalized words, mainly from 97 books of classical and modern Arabic. The corpus was collected from manually vocalized texts using a web crawling process.
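
    To turn a fully vocalized corpus such as Tashkeela into training material for a diacritization system, one typically strips the tashkeel marks to obtain (undiacritized input, vocalized target) pairs. A minimal sketch; the Unicode range covers the standard harakat, tanwin, shadda, and sukun, plus the superscript alef.

```python
import re

# Arabic tashkeel (diacritic) marks: fathatan through sukun, plus
# the superscript alef (U+0670).
TASHKEEL = re.compile(r"[\u064B-\u0652\u0670]")

def make_pair(vocalized: str) -> tuple[str, str]:
    """Turn one fully vocalized line into an (input, target) training pair."""
    return TASHKEEL.sub("", vocalized), vocalized

plain, target = make_pair("كَتَبَ")  # "he wrote", fully vocalized
print(plain, "->", target)
```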

  6. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.

  7. Automatic mission planning algorithms for aerial collection of imaging-specific tasks

    NASA Astrophysics Data System (ADS)

    Sponagle, Paul; Salvaggio, Carl

    2017-05-01

    The rapid advancement and availability of small unmanned aircraft systems (sUAS) has led to many novel exploitation tasks that utilize this unique aerial imagery data. Collection of these data requires novel flight planning to accomplish the task at hand. This work describes novel flight planning to better support structure-from-motion missions by minimizing occlusions, autonomous and periodic overflight of reflectance calibration panels to permit more efficient and accurate data collection under varying illumination conditions, and the collection of imagery data to study optical properties such as the bidirectional reflectance distribution function without disturbing the target in sensitive or remote areas of interest. These novel mission planning algorithms will provide scientists with additional tools to meet their future data collection needs.

  8. The effect of automated monitoring and real-time prompting on nurses' hand hygiene performance.

    PubMed

    Levchenko, Alexander I; Boscart, Veronique M; Fernie, Geoff R

    2013-10-01

    Adequate hand hygiene compliance by healthcare staff is considered an effective method to reduce hospital-acquired infections. The electronic system developed at Toronto Rehabilitation Institute automatically detects hand hygiene opportunities and records hand hygiene actions. It includes an optional visual hand hygiene status indication, generates real-time hand hygiene prompting signals, and enables automated monitoring of individual and aggregated hand hygiene performance. The system was installed on a complex continuous care unit at the entrance to 17 patient rooms and a utility room. A total of 93 alcohol gel and soap dispensers were instrumented and 14 nurses were provided with the personal wearable electronic monitors. The study included three phases with the system operating in three different modes: (1) an inactive mode during the first phase when hand hygiene opportunities and hand hygiene actions were recorded but prompting and visual indication functions were disabled, (2) only hand hygiene status indicators were enabled during the second phase, and (3) both hand hygiene status and real-time hand hygiene prompting signals were enabled during the third phase. Data collection was performed automatically during all of the three phases. The system indicated significantly higher hand hygiene activity rates and compliance during the third phase, with both hand hygiene indication and real-time prompting functions enabled. To increase the efficacy of the technology, its use was supplemented with individual performance reviews of the automatically collected data.

  9. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    PubMed

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step in presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open-source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After uploading the file, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open-source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application as long as the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6,000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.
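
    The validation-plus-descriptive-statistics step can be sketched as below, operating on the item values extracted from an ODM file. The function name and report fields are hypothetical simplifications of what ODM Data Analysis computes, not its actual API.

```python
import statistics

def validate_and_describe(values, expected_type=float):
    """Check data-type conformity, then compute basic descriptive statistics."""
    parsed, errors = [], 0
    for v in values:
        try:
            parsed.append(expected_type(v))
        except (TypeError, ValueError):
            errors += 1
    report = {
        "n": len(values),
        "type_errors": errors,
        "completeness": (len(values) - errors) / len(values) if values else 0.0,
    }
    if parsed:
        report |= {"mean": statistics.fmean(parsed),
                   "median": statistics.median(parsed),
                   "stdev": statistics.stdev(parsed) if len(parsed) > 1 else 0.0}
    return report

# One item's collected values, one of which fails type validation.
print(validate_and_describe(["5.1", "4.8", "n/a", "5.6"]))
```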

  10. Automatic mine detection based on multiple features

    NASA Astrophysics Data System (ADS)

    Yu, Ssu-Hsin; Gandhe, Avinash; Witten, Thomas R.; Mehra, Raman K.

    2000-08-01

    Recent research sponsored by the Army, Navy and DARPA has significantly advanced sensor technologies for mine detection. Several innovative sensor systems have been developed, and prototypes were built to investigate their performance in practice. Most of the research has focused on hardware design. However, in order for the systems to come into wide use, instead of limited use by a small group of well-trained experts, an automatic process for mine detection is needed to make the final mine-versus-no-mine decision easier and more straightforward. In this paper, we describe an automatic mine detection process consisting of three stages: (1) signal enhancement, (2) pixel-level mine detection, and (3) object-level mine detection. The final output of the system is a confidence measure that quantifies the presence of a mine. The resulting system was applied to real data collected using radar and acoustic technologies.

  11. ClinData Express – A Metadata Driven Clinical Research Data Management System for Secondary Use of Clinical Data

    PubMed Central

    Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei

    2012-01-01

    Aiming to ease the secondary use of clinical data in clinical research, we introduce a metadata-driven, web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, standalone software for metadata definition; 2) a web-based data warehouse system for data management. With ClinData Express, all the researchers need to do is define the metadata and data model in the m-designer. The web interface for data collection and the specific database for data storage are then generated automatically. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease data collections in Chinese and one form from dbGap. The flexibility of the system gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress. PMID:23304327

  12. [Automatic adjustment control system for DC glow discharge plasma source].

    PubMed

    Wan, Zhen-zhen; Wang, Yong-qing; Li, Xiao-jia; Wang, Hai-zhou; Shi, Ning

    2011-03-01

    There are three important parameters in the DC glow discharge process: the discharge current, the discharge voltage, and the argon pressure in the discharge source. These parameters influence each other during the glow discharge process. This paper presents an automatic control system for a DC glow discharge plasma source. The system collects and controls the discharge voltage automatically by adjusting the discharge source pressure while the discharge current is held constant during the glow discharge process. The design concept, circuit principle and control program of this automatic control system are described. Accuracy is improved by reducing complex operations and the associated manual control errors. The system enhances the control accuracy of the glow discharge voltage and reduces the time needed to reach a stable discharge voltage. Glow discharge voltage stability test results with the automatic control system are provided as well: the accuracy under automatic control is better than 1% FS, improved from 4% FS under manual control. The time to reach a stable discharge voltage has been shortened to within 30 s under automatic control, from more than 90 s under manual control. Standard samples such as middle-low alloy steel and tin bronze have been tested with the automatic control system, and the precision of the concentration analysis has been significantly improved: the RSDs of all test results are better than 3.5%. For the middle-low alloy steel standard sample, the RSD range of the concentration results for Ti, Co and Mn is reduced from 3.0%-4.3% under manual control to 1.7%-2.4% under automatic control, and that for S and Mo from 5.2%-5.9% to 3.3%-3.5%. For the tin bronze standard sample, the RSD range for Sn, Zn and Al is reduced from 2.6%-4.4% to 1.0%-2.4%, and that for Si, Ni and Fe from 6.6%-13.9% to 2.6%-3.5%. The test data are also shown in this paper.
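
    The control loop the record describes (hold the current constant, regulate the voltage by adjusting the source pressure) is essentially proportional feedback. A toy sketch under the assumption that raising argon pressure lowers the sustaining voltage at constant current; the gain, limits, and plant model are invented, and the voltage readout is modelled as a function of pressure purely so the loop closes in simulation.

```python
def control_voltage(read_voltage, set_pressure, target_v, gain=0.05,
                    p_min=100.0, p_max=1000.0, tol=2.0, max_steps=200):
    """Proportional feedback: adjust source pressure until voltage settles."""
    p = 400.0  # initial pressure, arbitrary units (illustrative)
    for _ in range(max_steps):
        v = read_voltage(p)
        err = target_v - v
        if abs(err) < tol:
            return p, v  # settled within tolerance
        # v above target -> raise pressure to lower v, and vice versa.
        p = min(p_max, max(p_min, p - gain * err))
        set_pressure(p)
    raise RuntimeError("voltage did not stabilize")

# Toy plant for demonstration: voltage drops linearly as pressure rises.
plant = lambda p: 1200.0 - 0.9 * p
p, v = control_voltage(plant, lambda p: None, target_v=800.0)
print(f"settled at p={p:.1f} (a.u.), v={v:.1f} V")
```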

  13. Video-processing-based system for automated pedestrian data collection and analysis when crossing the street

    NASA Astrophysics Data System (ADS)

    Mansouri, Nabila; Watelain, Eric; Ben Jemaa, Yousra; Motamed, Cina

    2018-03-01

    Computer-vision techniques for pedestrian detection and tracking have progressed considerably and become widely used in several applications. However, a quick glance at the literature shows minimal use of these techniques in pedestrian behavior and safety analysis, which might be due to the technical complexities of processing pedestrian videos. To extract pedestrian trajectories from a video automatically, all road users must be detected and tracked throughout the sequence, which is a challenging task, especially in a congested open-air urban space. A multi-pedestrian tracker based on an interframe detection-association process was proposed and evaluated. The tracker results are used to implement an automatic, video-processing-based tool for collecting data on pedestrians crossing the street. Variations in the instantaneous speed allowed the detection of the street-crossing phases (approach, waiting, and crossing), which are addressed for the first time in pedestrian road safety analysis to illustrate the causal relationships between pedestrian behaviors in the different phases. A comparison with a manual data collection method, by computing the root mean square error and the Pearson correlation coefficient, confirmed that the proposed procedures have significant potential to automate the data collection process.
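
    The phase detection from instantaneous speed can be sketched as a simple state rule: a pedestrian is approaching until the speed first drops near zero, waiting while it stays there, and crossing once movement resumes. The threshold and speeds below are illustrative, not the paper's values.

```python
def label_phases(speeds, walk_thresh=0.5):
    """Label each frame's phase from instantaneous speed (m/s).

    approach: moving before the first stop; waiting: near-zero speed;
    crossing: moving again after a waiting period has been observed.
    """
    phases, seen_wait = [], False
    for v in speeds:
        if v < walk_thresh:
            seen_wait = True
            phases.append("waiting")
        else:
            phases.append("crossing" if seen_wait else "approach")
    return phases

speeds = [1.3, 1.2, 0.2, 0.1, 0.1, 1.1, 1.4]  # one trajectory, m/s per frame
print(label_phases(speeds))
```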

  14. Profiling Animal Toxicants by Automatically Mining Public Bioassay Data: A Big Data Approach for Computational Toxicology

    PubMed Central

    Zhang, Jun; Hsieh, Jui-Hua; Zhu, Hao

    2014-01-01

    In vitro bioassays have been developed and are currently being evaluated as potential alternatives to traditional animal toxicity models. Already, the progress of high throughput screening techniques has resulted in an enormous amount of publicly available bioassay data having been generated for a large collection of compounds. When a compound is tested using a collection of various bioassays, all the testing results can be considered as providing a unique bio-profile for this compound, which records the responses induced when the compound interacts with different cellular systems or biological targets. Profiling compounds of environmental or pharmaceutical interest using useful toxicity bioassay data is a promising method to study complex animal toxicity. In this study, we developed an automatic virtual profiling tool to evaluate potential animal toxicants. First, we automatically acquired all PubChem bioassay data for a set of 4,841 compounds with publicly available rat acute toxicity results. Next, we developed a scoring system to evaluate the relevance between these extracted bioassays and animal acute toxicity. Finally, the top ranked bioassays were selected to profile the compounds of interest. The resulting response profiles proved to be useful to prioritize untested compounds for their animal toxicity potentials and form a potential in vitro toxicity testing panel. The protocol developed in this study could be combined with structure-activity approaches and used to explore additional publicly available bioassay datasets for modeling a broader range of animal toxicities. PMID:24950175

  15. Profiling animal toxicants by automatically mining public bioassay data: a big data approach for computational toxicology.

    PubMed

    Zhang, Jun; Hsieh, Jui-Hua; Zhu, Hao

    2014-01-01

    In vitro bioassays have been developed and are currently being evaluated as potential alternatives to traditional animal toxicity models. Already, the progress of high throughput screening techniques has resulted in an enormous amount of publicly available bioassay data having been generated for a large collection of compounds. When a compound is tested using a collection of various bioassays, all the testing results can be considered as providing a unique bio-profile for this compound, which records the responses induced when the compound interacts with different cellular systems or biological targets. Profiling compounds of environmental or pharmaceutical interest using useful toxicity bioassay data is a promising method to study complex animal toxicity. In this study, we developed an automatic virtual profiling tool to evaluate potential animal toxicants. First, we automatically acquired all PubChem bioassay data for a set of 4,841 compounds with publicly available rat acute toxicity results. Next, we developed a scoring system to evaluate the relevance between these extracted bioassays and animal acute toxicity. Finally, the top ranked bioassays were selected to profile the compounds of interest. The resulting response profiles proved to be useful to prioritize untested compounds for their animal toxicity potentials and form a potential in vitro toxicity testing panel. The protocol developed in this study could be combined with structure-activity approaches and used to explore additional publicly available bioassay datasets for modeling a broader range of animal toxicities.

  16. An Experimental Seismic Data and Parameter Exchange System for Interim NEAMTWS

    NASA Astrophysics Data System (ADS)

    Hanka, W.; Hoffmann, T.; Weber, B.; Heinloo, A.; Hoffmann, M.; Müller-Wrana, T.; Saul, J.

    2009-04-01

    In 2008, GFZ Potsdam started to operate its global earthquake monitoring system as an experimental seismic background data centre for the interim NEAMTWS (NE Atlantic and Mediterranean Tsunami Warning System). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project, was extended to test the export and import of individual processing results within a cluster of SC3 systems. The initiated NEAMTWS SC3 cluster presently consists of the 24/7 seismic services at IMP, IGN, LDG/EMSC and KOERI, whereas INGV and NOA are still pending. The GFZ virtual real-time seismic network (GEOFON Extended Virtual Network - GEVN) was substantially extended with many stations from Western European countries, optimizing the station distribution for NEAMTWS purposes. To augment the public seismic network (VEBSN - Virtual European Broadband Seismic Network), some participating centres provided additional private stations for NEAMTWS usage. In parallel to the data collection over the Internet, the GFZ VSAT hub for the secured data collection of the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and the first data links were established. In 2008 the experimental system could already prove its performance, since a number of relevant earthquakes happened in the NEAMTWS area. The results are very promising in terms of speed, as the automatic alerts (reliable solutions based on a minimum of 25 stations and disseminated by email and SMS) were issued within 2 1/2 to 4 minutes for Greece and 5 minutes for Iceland. They are also promising in terms of accuracy, since the epicentre coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning, usually did not differ substantially from the final solutions, and provided a good starting point for the operations of the interim NEAMTWS. However, although an automatic seismic system is a good first step, 24/7 manned RTWCs are mandatory for the regular manual verification of automatic seismic results and the estimation of the tsunami potential of a given event.

  17. Implementing electronic identification for performance recording in sheep: II. Cost-benefit analysis in meat and dairy farms.

    PubMed

    Ait-Saidi, A; Caja, G; Salama, A A K; Milán, M J

    2014-12-01

    Costs and secondary benefits of implementing electronic identification (e-ID) for performance recording (i.e., lambing, body weight, inventory, and milk yield) in dairy and meat ewes were assessed using the results of a previous study in which manual (M), semiautomatic (SA), and automatic (AU) data collection systems were compared. Ewes were identified with visual ear tags and electronic rumen boluses. The M system used visual identification, on-paper data recording, and manual data uploading to a computer. The SA system used e-ID with a handheld reader into which performances were typed and then automatically uploaded to a computer. The use of a personal digital assistant (PDA) for recording and automatic data uploading, which transforms M into an SA system, was also considered. The AU system was only used for BW recording and consisted of e-ID, automatic data recording by an electronic scale, and uploading to a computer. The cost-benefit study was applied to 2 reference sheep farms of 700 meat ewes, under extensive or intensive production systems, and of 400 dairy ewes, practicing once- or twice-a-day machine milking. Sensitivity analyses under voluntary and mandatory e-ID scenarios were also included. Benefits of using e-ID for SA or AU performance recording mainly depended on sheep farm purpose, number of test days per year, handheld reader and PDA prices, and flock size. Implementing e-ID for SA and AU performance recording saved approximately 50% of the time required by the M system and increased the reliability of the data collected. Use of e-ID increased the cost of performance recording in the voluntary e-ID scenario, paying back only part of the investment made (15 to 70%). In the mandatory e-ID scenario, in which the cost of e-ID devices was not included, savings paid 100% of the extra costs needed for using e-ID in all farm types and conditions. In both scenarios, the reader price was the most important extra cost (40 to 90%) of implementing e-ID on sheep farms. The calculated savings from using the PDA covered more than 100% of the implementation costs in all types of sheep farms, indicating that this device was cost-effective for sheep performance recording. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
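
    A back-of-the-envelope version of the comparison in this record: labour savings (roughly half the recording time, per the study) accumulated over the equipment's life, divided by the extra hardware cost. All numbers below are illustrative, not the paper's.

```python
def payback_fraction(hours_saved_per_year, wage_per_hour, reader_cost,
                     pda_cost, years):
    """Fraction of the extra e-ID hardware cost recovered by labour savings."""
    savings = years * hours_saved_per_year * wage_per_hour
    return savings / (reader_cost + pda_cost)

# Illustrative numbers only: e-ID halves recording time on a 700-ewe farm.
frac = payback_fraction(hours_saved_per_year=40, wage_per_hour=12,
                        reader_cost=1200, pda_cost=400, years=4)
print(f"payback: {frac:.0%}")  # > 100% means the investment pays for itself
```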

  18. Advanced instrumentation for the collection, retrieval, and processing of urban stormwater data

    USGS Publications Warehouse

    Robinson, Jerald B.; Bales, Jerad D.; Young, Wendi S.; ,

    1995-01-01

    The U.S. Geological Survey, in cooperation with the City of Charlotte and Mecklenburg County, North Carolina, has developed a data-collection network that uses advanced instrumentation to automatically collect, retrieve, and process urban stormwater data. Precipitation measurement and water-quality networks provide data for (1) planned watershed simulation models, (2) early warning of possible flooding, (3) computation of material export, and (4) characterization of water quality in relation to basin conditions. Advantages of advanced instrumentation include remote access to real-time data, reduced demands on and more efficient use of limited human resources, and direct importation of data into a geographical information system for display and graphic analysis.

  19. Automatic summary generating technology of vegetable traceability for information sharing

    NASA Astrophysics Data System (ADS)

    Zhenxuan, Zhang; Minjing, Peng

    2017-06-01

    In order to solve the problems of excessive data entry and the consequent high costs of data collection faced by farmers in vegetable traceability applications, an automatic summary-generating technology for sharing vegetable traceability information was proposed. The proposed technology is an effective way for farmers to share real-time vegetable planting information on social networking platforms to enhance their brands and attract more customers. In this research, the factors influencing vegetable traceability for customers were analyzed to establish the sub-indicators and target indicators, and a computing model was proposed based on the collected parameter values of the planted vegetables and standard legal requirements on food safety. The proposed standard parameter model involves five steps: accessing the database, establishing target indicators, establishing sub-indicators, establishing the standard reference model, and computing indicator scores. On the basis of establishing and optimizing food safety standards and the traceability system, the proposed technology could be accepted by more and more farmers and customers.
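
    The five-step scoring model can be illustrated as a weighted aggregation of sub-indicators, each scored against its standard limit. The indicator names, limits, and weights below are hypothetical, not from the paper.

```python
# Hypothetical sub-indicators and their weights in the target indicator.
WEIGHTS = {"pesticide_residue": 0.5, "heavy_metals": 0.3, "freshness_days": 0.2}

def sub_score(value, limit):
    """1.0 when the measured value is far below the standard limit,
    0.0 when it reaches or exceeds it."""
    return max(0.0, 1.0 - value / limit)

def traceability_score(measurements, limits):
    """Weighted aggregate of sub-indicators into one target indicator."""
    return sum(WEIGHTS[k] * sub_score(measurements[k], limits[k])
               for k in WEIGHTS)

measurements = {"pesticide_residue": 0.02, "heavy_metals": 0.01, "freshness_days": 2}
limits = {"pesticide_residue": 0.10, "heavy_metals": 0.05, "freshness_days": 7}
print(f"summary score: {traceability_score(measurements, limits):.2f}")
```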

  20. Weather Radar Studies

    DTIC Science & Technology

    1988-03-31

    In addition to radar operation and data-collection activities, a large data-analysis effort has been under way in support of automatic wind-shear detection algorithm development, covering data reduction and algorithm development (general-purpose software, Concurrent computer systems, Sun workstations, radar data analysis, and algorithm verification) and Mesonet/LLWAS data analysis.

  1. Driver behavior following an automatic steering intervention.

    PubMed

    Fricke, Nicola; Griesche, Stefan; Schieben, Anna; Hesse, Tobias; Baumann, Martin

    2015-10-01

    The study investigated driver behavior toward an automatic steering intervention of a collision mitigation system. Forty participants were tested in a driving simulator and confronted with an inevitable collision. They performed a naïve drive and afterwards a repeated exposure in which they were told to hold the steering wheel loosely. In a third drive they experienced a false alarm situation. Data on driving behavior, i.e. steering and braking behavior, as well as subjective data were assessed in the scenarios. Results showed that most participants held on to the steering wheel strongly or counter-steered during the system intervention in the first encounter. Moreover, subjective data collected after the first drive showed that the majority of drivers were not aware of the system intervention. Data from the repeated drive, in which participants were instructed to hold the steering wheel loosely, showed significantly more participants holding the steering wheel loosely and thus complying with the instruction. This study seems to imply that without knowledge and information from the system about an upcoming intervention, the most prevalent driving behavior is a strong reaction with the steering wheel, similar to an automatic steering reflex, which decreases the system's effectiveness. Results of the second drive show some potential for countermeasures, such as informing drivers shortly before a system intervention in order to prevent inhibiting reactions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. The STP (Solar-Terrestrial Physics) Semantic Web based on the RSS1.0 and the RDF

    NASA Astrophysics Data System (ADS)

    Kubo, T.; Murata, K. T.; Kimura, E.; Ishikura, S.; Shinohara, I.; Kasaba, Y.; Watari, S.; Matsuoka, D.

    2006-12-01

    In Solar-Terrestrial Physics (STP), it has been pointed out that the circulation and utilization of observation data among researchers are insufficient. To achieve interdisciplinary research, we need to overcome these circulation and utilization problems. Against this background, the authors' group has developed a world-wide database that manages meta-data of satellite and ground-based observation data files. It is noted that retrieving meta-data from the observation data and registering them in the database have so far been carried out by hand. Our goal is to establish the STP Semantic Web. The Semantic Web provides a common framework that allows a variety of data to be shared and reused across applications, enterprises, and communities. We also expect that secondary information related to observations, such as event information and associated news, will be shared over the networks. The most fundamental issue in this establishment is who generates, manages and provides meta-data in the Semantic Web. We developed an automatic meta-data collection system for the observation data using RSS (RDF Site Summary) 1.0. RSS1.0 is one of the XML-based markup languages based on the RDF (Resource Description Framework), designed for syndicating news and the contents of news-like sites. RSS1.0 is used to describe the STP meta-data, such as data file name, file server address and observation date. To describe meta-data of the STP beyond the RSS1.0 vocabulary, we defined original vocabularies for STP resources using the RDF Schema. The RDF describes technical terms of the STP along with the Dublin Core Metadata Element Set, which is a standard for cross-domain information resource descriptions. Researchers' information on the STP is described with FOAF, an RDF/XML vocabulary for creating machine-readable metadata describing people. Using RSS1.0 as a meta-data distribution method, the workflow from retrieving meta-data to registering them in the database is automated. This technique has been applied to several database systems, such as the DARTS database system and the NICT Space Weather Report Service. DARTS is a science database managed by ISAS/JAXA in Japan. We succeeded in automatically generating and collecting meta-data for CDF (Common Data Format) data, such as Reimei satellite data, provided by DARTS. We also created an RDF service for space weather reports and real-time global MHD simulation 3D data provided by NICT. Our Semantic Web system works as follows: the RSS1.0 documents generated at the data sites (ISAS and NICT) are automatically collected by a meta-data collection agent. The RDF documents are registered, and the agent extracts meta-data to store them in Sesame, an open-source RDF database with support for RDF Schema inferencing and querying. The RDF database provides advanced retrieval processing that takes properties and relations into account. Finally, the STP Semantic Web provides automatic processing and high-level search not only for observation data but also for space weather news, physical events, technical terms and researcher information related to the STP.
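
    Harvesting the meta-data from an RSS1.0/RDF document of the kind described reduces to reading each item's title, link, and dc:date. A minimal sketch with a made-up feed; the URL and element values are hypothetical, and only the standard RSS1.0 and Dublin Core namespaces are assumed.

```python
import xml.etree.ElementTree as ET

NS = {
    "rss": "http://purl.org/rss/1.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

SAMPLE = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns="http://purl.org/rss/1.0/"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <item rdf:about="ftp://example.org/reimei/20061201.cdf">
    <title>Reimei CDF 2006-12-01</title>
    <link>ftp://example.org/reimei/20061201.cdf</link>
    <dc:date>2006-12-01</dc:date>
  </item>
</rdf:RDF>"""

def harvest(rss_document: str):
    """Extract (file name, server address, observation date) meta-data triples."""
    root = ET.fromstring(rss_document)
    for item in root.findall("rss:item", NS):
        yield (item.findtext("rss:title", namespaces=NS),
               item.findtext("rss:link", namespaces=NS),
               item.findtext("dc:date", namespaces=NS))

for record in harvest(SAMPLE):
    print(record)  # ready to be stored in the meta-data database
```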

  3. Automatic monitoring of ecosystem structure and functions using integrated low-cost near surface sensors

    NASA Astrophysics Data System (ADS)

    Kim, J.; Ryu, Y.; Jiang, C.; Hwang, Y.

    2016-12-01

    Near surface sensors are able to acquire more reliable and detailed information with higher temporal resolution than satellite observations. Conventional near surface sensors usually work individually, and thus require considerable manpower from data collection through information extraction and sharing. Recent advances in the Internet of Things (IoT) provide unprecedented opportunities to integrate various low-cost sensors into an intelligent near surface observation system for monitoring ecosystem structure and functions. In this study, we developed a Smart Surface Sensing System (4S), which can automatically collect, transfer, process and analyze data, and then publish time-series results on a publicly available website. The system is composed of a Raspberry Pi micro-computer, an Arduino micro-controller, multi-spectral spectrometers made from Light Emitting Diodes (LEDs), visible and near-infrared cameras, and an Internet module. All components are connected with each other, and the Raspberry Pi intelligently controls the automatic data production chain. We performed intensive tests and calibrations in the lab, then conducted in-situ observations at a rice paddy field and a deciduous broadleaf forest. During the whole growth season, 4S continuously obtained landscape images, spectral reflectance in red, green, blue, and near-infrared, the normalized difference vegetation index (NDVI), the fraction of photosynthetically active radiation (fPAR), and leaf area index (LAI). We also compared 4S data with other independent measurements. NDVI obtained from 4S agreed well with a Jaz hyperspectrometer at both diurnal and seasonal scales (R2 = 0.92, RMSE = 0.059), and 4S-derived fPAR and LAI were comparable to LAI-2200 and destructive measurements in both magnitude and seasonal trajectory. We believe that the integrated low-cost near surface sensor system could help the research community monitor ecosystem structure and functions more closely and easily through a network system.
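
    As an illustration of one of the indices 4S derives, the sketch below computes NDVI from red and near-infrared reflectance; the reflectance values are illustrative, not 4S measurements.

      # A minimal sketch of the NDVI computation from red and near-infrared
      # reflectance channels; values are invented for the example.
      import numpy as np

      red = np.array([0.08, 0.06, 0.05])   # red-band reflectance over the season
      nir = np.array([0.40, 0.55, 0.62])   # near-infrared reflectance

      ndvi = (nir - red) / (nir + red)     # standard normalized difference
      print(ndvi.round(3))                 # rises as the canopy develops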

  4. Temporal Cyber Attack Detection.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, Joey Burton; Draelos, Timothy J.; Galiardi, Meghan

    Rigorous characterization of the performance and generalization ability of cyber defense systems is extremely difficult, making it hard to gauge uncertainty, and thus, confidence. This difficulty largely stems from a lack of labeled attack data that fully explores the potential adversarial space. Currently, the performance of cyber defense systems is typically evaluated in a qualitative manner, by manually inspecting the results of the system on live data and adjusting as needed. Additionally, machine learning has shown promise in deriving models that automatically learn indicators of compromise that are more robust than analyst-derived detectors. However, to generate these models, most algorithms require large amounts of labeled data (i.e., examples of attacks). Algorithms that do not require annotated data to derive models are similarly at a disadvantage, because labeled data are still necessary when evaluating performance. In this work, we explore the use of temporal generative models to learn cyber attack graph representations and automatically generate data for experimentation and evaluation. Training and evaluating cyber systems and machine learning models requires significant annotated data, which is typically collected and labeled by hand for one-off experiments. Automatically generating such data helps derive and evaluate detection models and ensures reproducibility of results. Experimentally, we demonstrate the efficacy of generative sequence analysis techniques in learning the structure of attack graphs, based on a realistic example. These derived models can then be used to generate more data. Additionally, we provide a roadmap for future research efforts in this area.
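
    As a minimal illustration of the idea (not the report's actual model), the sketch below samples synthetic attack sequences from a first-order Markov model over attack-graph states; the states and transition probabilities are invented for the example.

      # A minimal sketch of generating synthetic attack sequences from a learned
      # first-order Markov model over attack-graph states.
      import random

      transitions = {
          "recon":      [("exploit", 0.7), ("recon", 0.3)],
          "exploit":    [("escalate", 0.6), ("recon", 0.4)],
          "escalate":   [("exfiltrate", 0.8), ("escalate", 0.2)],
          "exfiltrate": [("end", 1.0)],
      }

      def generate(start="recon", max_len=10):
          """Sample one synthetic attack sequence from the model."""
          seq, state = [start], start
          while state != "end" and len(seq) < max_len:
              choices, weights = zip(*transitions[state])
              state = random.choices(choices, weights=weights)[0]
              seq.append(state)
          return seq

      print(generate())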

  5. Personalization algorithm for real-time activity recognition using PDA, wireless motion bands, and binary decision tree.

    PubMed

    Pärkkä, Juha; Cluitmans, Luc; Ermes, Miikka

    2010-09-01

    An inactive and sedentary lifestyle is a major problem in many industrialized countries today. Automatic recognition of the type of physical activity can be used to show users the distribution of their daily activities and to motivate them toward a more active lifestyle. In this study, an automatic activity-recognition system consisting of wireless motion bands and a PDA is evaluated. The system classifies raw sensor data into activity types online. It uses a decision tree classifier, which has low computational cost and low battery consumption. The classifier parameters can be personalized online by performing a short bout of an activity and telling the system which activity is being performed. Data were collected from seven volunteers during five everyday activities: lying, sitting/standing, walking, running, and cycling. The online system can detect these activities with 86.6% overall accuracy, and with 94.0% accuracy after classifier personalization.
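
    A minimal sketch of the decision-tree classification step is given below, assuming simple per-window features (mean and variance of acceleration magnitude); the feature values and training labels are illustrative, not the study's data.

      # A minimal sketch of decision-tree activity classification on per-window
      # accelerometer features; the training rows are invented for the example.
      from sklearn.tree import DecisionTreeClassifier

      # Each row: [mean acceleration magnitude, variance] for one window
      X_train = [[1.0, 0.01], [1.0, 0.05], [1.1, 0.30], [1.4, 1.20], [1.2, 0.80]]
      y_train = ["lying", "sitting/standing", "walking", "running", "cycling"]

      clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
      print(clf.predict([[1.35, 1.1]]))  # high variance -> likely "running"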

  6. 90-kilobar diamond-anvil high-pressure cell for use on an automatic diffractometer.

    PubMed

    Schiferl, D; Jamieson, J C; Lenko, J E

    1978-03-01

    A gasketed diamond-anvil high-pressure cell is described which can be used on a four-circle automatic diffractometer to collect x-ray intensity data from single-crystal samples subjected to truly hydrostatic pressures of over 90 kilobars. The force generating system exerts only forces normal to the diamond faces to obtain maximum reliability. A unique design allows exceptionally large open areas for maximum x-ray access and is particularly well suited for highly absorbing materials, as the x rays are not transmitted through the sample. Studies on ruby show that high-pressure crystal structure determinations may be done rapidly, reliably, and routinely with this system.

  7. Meteorological Automatic Weather Station (MAWS) Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdridge, Donna J; Kyrouac, Jenni A

    The Meteorological Automatic Weather Station (MAWS) is a surface meteorological station, manufactured by Vaisala, Inc., dedicated to the balloon-borne sounding system (BBSS), providing surface measurements of the thermodynamic state of the atmosphere and the wind speed and direction for each radiosonde profile. These data are automatically provided to the BBSS during the launch procedure and included in the radiosonde profile as the surface measurements of record for the sounding. The MAWS core set of measurements is: Barometric Pressure (hPa), Temperature (°C), Relative Humidity (%), Arithmetic-Averaged Wind Speed (m/s), and Vector-Averaged Wind Direction (deg). The sensors that collect the core variables are mounted at the standard heights defined for each variable.

  8. Argo workstation: a key component of operational oceanography

    NASA Astrophysics Data System (ADS)

    Dong, Mingmei; Xu, Shanshan; Miao, Qingsheng; Yue, Xinyang; Lu, Jiawei; Yang, Yang

    2018-02-01

    Operational oceanography requires quantity, quality, and availability of data sets, as well as timeliness and effectiveness of data products. Without a steady and strong operational system behind it, operational oceanography cannot proceed far. In this paper we describe an integrated platform named the Argo Workstation. It operates as a data processing and management system capable of data collection, automatic data quality control, visual data checking, statistical data search, and data service. Since it was set up, the Argo Workstation has provided global, high-quality Argo data to users every day, in a timely and effective manner. It has not only played a key role in operational oceanography but also set an example for other operational systems.
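
    As an illustration of one automatic quality-control step such a workstation might apply, the sketch below flags temperature values outside a gross global range; the limits follow the spirit of standard Argo real-time tests, but are assumptions here.

      # A minimal sketch of a gross-range quality-control check on float
      # temperatures; the limits are illustrative assumptions.
      def gross_range_flag(temps_c, lo=-2.5, hi=40.0):
          """Return a QC flag per value: 1 = good, 4 = bad (out of range)."""
          return [1 if lo <= t <= hi else 4 for t in temps_c]

      print(gross_range_flag([12.3, 3.9, 55.0, -3.1]))   # -> [1, 1, 4, 4]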

  9. An automated, broad-based, near real-time public health surveillance system using presentations to hospital Emergency Departments in New South Wales, Australia.

    PubMed

    Muscatello, David J; Churches, Tim; Kaldor, Jill; Zheng, Wei; Chiu, Clayton; Correll, Patricia; Jorm, Louisa

    2005-12-22

    In a climate of concern over bioterrorism threats and emergent diseases, public health authorities are trialling more timely surveillance systems. The 2003 Rugby World Cup (RWC) provided an opportunity to test the viability of a near real-time syndromic surveillance system in metropolitan Sydney, Australia. We describe the development and early results of this largely automated system that used data routinely collected in Emergency Departments (EDs). Twelve of 49 EDs in the Sydney metropolitan area automatically transmitted surveillance data from their existing information systems to a central database in near real-time. Information captured for each ED visit included patient demographic details, presenting problem and nursing assessment entered as free-text at triage time, physician-assigned provisional diagnosis codes, and status at departure from the ED. Both diagnoses from the EDs and triage text were used to assign syndrome categories. The text information was automatically classified into one or more of 26 syndrome categories using automated "naïve Bayes" text categorisation techniques. Automated processes were used to analyse both diagnosis and free text-based syndrome data and to produce web-based statistical summaries for daily review. An adjusted cumulative sum (cusum) was used to assess the statistical significance of trends. During the RWC the system did not identify any major public health threats associated with the tournament, mass gatherings or the influx of visitors. This was consistent with evidence from other sources, although two known outbreaks were already in progress before the tournament. The limited baseline available in early monitoring prevented the system from automatically identifying these ongoing outbreaks. Data capture was invisible to clinical staff in EDs and did not add to their workload. We have demonstrated the feasibility and potential utility of syndromic surveillance using routinely collected data from ED information systems. Key features of our system are its nil impact on clinical staff, and its use of statistical methods to assign syndrome categories based on clinical free-text information. The system is ongoing, and has expanded to cover 30 EDs. Results of formal evaluations of both the technical efficiency and the public health impacts of the system will be described subsequently.
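
    As an illustration of the classification step (using scikit-learn as a stand-in for the system's own classifier), the sketch below trains a naïve Bayes model on triage free text; the phrases and syndrome labels are invented for the example.

      # A minimal sketch of naive Bayes syndrome categorisation of triage text;
      # the training phrases and labels are illustrative assumptions.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      triage_text = ["fever cough shortness of breath",
                     "vomiting diarrhoea abdominal pain",
                     "high temp productive cough",
                     "nausea vomiting dehydrated"]
      syndrome = ["respiratory", "gastrointestinal",
                  "respiratory", "gastrointestinal"]

      model = make_pipeline(CountVectorizer(), MultinomialNB())
      model.fit(triage_text, syndrome)
      print(model.predict(["fever and cough"]))  # -> ['respiratory']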

  10. Water monitor system: Phase 1 test report

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Jeffers, E. L.

    1976-01-01

    An automatic water monitor system was tested with the objectives of assuring high-quality effluent standards and accelerating the practice of reclamation and reuse of water. The NASA water monitor system is described. Various components of the system, including the necessary sensors, the sample collection system, and the data acquisition and display system, are discussed. The test facility and the analysis methods are described. Test results are reviewed, and recommendations for water monitor system design improvements are presented.

  11. Temperatures kept cool in Southampton.

    PubMed

    McDonald, Katey

    2010-03-01

    According to Digitron, hospitals countrywide are seeing the benefits of DigiTrak, the company's automatic wireless temperature monitoring system. Katey McDonald, the company's marketing manager, outlines how the system replaces traditional methods of data collection by providing a single networked package, and describes its use for blood monitoring at Southampton General Hospital, with the help of Marie Cundall, advanced biomedical scientist and quality officer there.

  12. Computerized database management system for breast cancer patients.

    PubMed

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL (My Structured Query Language) was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in the system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that controls the MySQL database was developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is 50 to 59 years. Results suggest that the chance of developing breast cancer increases in older women and is reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
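
    The grouped queries behind such case studies can be illustrated as below, using Python's built-in sqlite3 as a stand-in for the MySQL back end; the table layout and rows are invented for the example.

      # A minimal sketch of an incidence-count query by risk factor, with
      # sqlite3 standing in for MySQL; schema and data are illustrative.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE patients (age INTEGER, race TEXT)")
      con.executemany("INSERT INTO patients VALUES (?, ?)",
                      [(52, "Malay"), (57, "Malay"), (55, "Chinese"), (63, "Indian")])

      # Incidence counts by race, mirroring the analysis described above
      for race, n in con.execute(
              "SELECT race, COUNT(*) FROM patients GROUP BY race ORDER BY COUNT(*) DESC"):
          print(race, n)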

  13. Collecting and distributing wearable sensor data: an embedded personal area network to local area network gateway server.

    PubMed

    Neuhaeuser, Jakob; D'Angelo, Lorenzo T

    2013-01-01

    The goal of the concept and of the device presented in this contribution is to collect sensor data from wearable sensors directly, automatically and wirelessly, and to make them available over a wired local area network. Several concepts in e-health and telemedicine make use of portable and wearable sensors to collect movement or activity data. Usually these data are collected either via a wireless personal area network or through a connection to the user's smartphone. However, users might not carry smartphones on them while inside a residential building such as a nursing home or a hospital, or even within their own home. In such areas, the use of other wireless communication technologies might also be limited. The presented system is an embedded server which can be deployed in several rooms in order to ensure live data collection in bigger buildings. The collection of data batches recorded out of range, as soon as a connection is established, is also possible. Both the system concept and its realization are presented.

  14. IA-Regional-Radio - Social Network for Radio Recommendation

    NASA Astrophysics Data System (ADS)

    Dziczkowski, Grzegorz; Bougueroua, Lamine; Wegrzyn-Wolska, Katarzyna

    This chapter describes the functions of a system proposed for music hit recommendation from a social network database. The system carries out automatic collection, evaluation and rating of music reviews, allows listeners to rate musical hits, and derives recommendations from listeners' profiles in the form of regional Internet radio. First, the system searches and retrieves probable music reviews from the Internet. Subsequently, the system carries out an evaluation and rating of those reviews. From this list of music hits, the system directly allows notation from our application. Finally, the system automatically creates the playlist broadcast each day depending on the region, the season of the year, the hours of the day and the age of the listeners. Our system uses linguistic and statistical methods for classifying music opinions, and data mining techniques for the recommendation part needed to create the broadcast list. The principal task is the creation of a popular intelligent radio adaptive to listeners' age and region - IA-Regional-Radio.

  15. The Future of the Perfusion Record: Automated Data Collection vs. Manual Recording

    PubMed Central

    Ottens, Jane; Baker, Robert A.; Newland, Richard F.; Mazzone, Annette

    2005-01-01

    Abstract: The perfusion record, whether manually recorded or computer generated, is a legal representation of the procedure. The handwritten perfusion record has been the most common method of recording events that occur during cardiopulmonary bypass. This record stands in significant contrast to the integrated data management systems now available, which provide continuous collection of data automatically or by means of a few keystrokes. Additionally, an increasing number of monitoring devices are available to assist in the management of patients on bypass. These devices are becoming more complex and provide more data for the perfusionist to monitor and record. Most of the data from these devices can be downloaded automatically into online data management systems, allowing more time for the perfusionist to concentrate on the patient while simultaneously producing a more accurate record. In this prospective report, we compared 17 cases that were recorded using both manual and electronic data collection techniques. The perfusionist in charge of the case recorded the perfusion using the manual technique, while a second perfusionist entered relevant events on the electronic record generated by the Stockert S3 Data Management System/Data Bahn (Munich, Germany). Analysis of the two types of perfusion records showed significant variations in the recorded information. Areas that showed the most inconsistency included measurement of the perfusion pressures, flow, blood temperatures, cardioplegia delivery details, and the recording of events, with the electronic record superior in the integrity of the data. In addition, the limitations of the electronic system were shown by the lack of electronic gas flow data in our hardware. Our results confirm the importance of accurate methods of recording perfusion events. The use of an automated system provides the opportunity to minimize transcription error and bias. This study highlights the limitations of spot recording of perfusion events in the overall record keeping for perfusion management. PMID:16524151

  16. Validation of an automatic system (DoubleCage) for detecting the location of animals during preference tests.

    PubMed

    Tsai, P P; Nagelschmidt, N; Kirchner, J; Stelzer, H D; Hackbarth, H

    2012-01-01

    Preference tests have often been performed for collecting information about animals' acceptance of environmental refinement objects. In numerous published studies animals were individually tested during preference experiments, as it is difficult to observe group-housed animals with an automatic system. Thus, videotaping is still the most favoured method for observing preferences of socially-housed animals. To reduce the observation workload and to be able to carry out preference testing of socially-housed animals, an automatic recording system (DoubleCage) was developed for determining the location of group-housed animals in a preference test set-up. This system is able to distinguish the transition of individual animals between two cages and to record up to 16 animals at the same time (four animals per cage). The present study evaluated the reliability of the DoubleCage system. The data recorded by the DoubleCage program and the data obtained by human observation were compared. The measurements of the DoubleCage system and manual observation of the videotapes are comparable and significantly correlated (P < 0.0001) with good agreement. Using the DoubleCage system enables precise and reliable recording of the preferences of group-housed animals and a considerable reduction of animal observation time.

  17. Design of wideband solar ultraviolet radiation intensity monitoring and control system

    NASA Astrophysics Data System (ADS)

    Ye, Linmao; Wu, Zhigang; Li, Yusheng; Yu, Guohe; Jin, Qi

    2009-08-01

    Based on the principles of SCM (Single Chip Microcomputer) and computer communication techniques, the system is composed of chips such as the ATMEL 89C51 microcontroller and the ADC0809 analog-to-digital converter, integrated circuits, and sensors for UV radiation, and is designed for monitoring and controlling the UV index. The system can automatically collect UV index data, analyze and check the history database, and study the pattern of UV radiation in the region.

  18. Implementation of an automated system for monitoring adherence to hemodialysis treatment: a report of seven years of experience.

    PubMed

    Bellazzi, Riccardo; Sacchi, Lucia; Caffi, Ezio; de Vincenzi, Amedeo; Nai, Maurizio; Manicone, Francesco; Larizza, Cristiana; Bellazzi, Roberto

    2012-05-01

    In this paper we present the clinical deployment and evaluation of a computerized system, EMOSTAT, aimed at improving the quality of hemodialysis sessions. EMOSTAT automatically imports data from the hemodialysis monitoring software tools and analyzes the delivered treatment with respect to six clinically relevant parameters. Failures-to-adhere (FtAs) to the planned treatment are detected and reported to the care-givers. EMOSTAT has been used for more than seven years in the management of a dialysis service located in Mede, Italy. A total of 72 patients were monitored and 21,251 dialysis sessions were collected. Data analysis was performed on the periods 2002-2005 and 2005-2008, corresponding to two different software releases. The system was used in everyday clinical practice for the entire period considered. The number of FtAs decreased significantly over the first period: bulk blood flow FtAs decreased after the introduction of the system. Hemodialysis sessions lasted longer in the second period. Co-occurring FtAs, highlighting the presence of complex FtA patterns, were also detected. EMOSTAT provides an effective way to re-focus the attention of the dialysis department on the treatment plan and on its implementation. The automatic data collection and the design philosophy of EMOSTAT allowed the routine use of the system. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
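
    The core FtA check can be illustrated as below: each delivered-treatment parameter is compared against its planned value within a tolerance. Parameter names, planned values and tolerances are illustrative assumptions, not EMOSTAT's actual rules.

      # A minimal sketch of failure-to-adhere (FtA) detection against a plan;
      # the parameters and tolerances are invented for the example.
      planned = {"blood_flow_ml_min": 300, "duration_min": 240}
      tolerance = {"blood_flow_ml_min": 0.10, "duration_min": 0.05}  # relative

      def detect_ftas(delivered):
          """Return the parameters whose delivered value deviates beyond tolerance."""
          return [p for p, target in planned.items()
                  if abs(delivered[p] - target) > tolerance[p] * target]

      print(detect_ftas({"blood_flow_ml_min": 250, "duration_min": 235}))
      # -> ['blood_flow_ml_min']  (flow deviates more than 10% from plan)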

  19. A control system based on field programmable gate array for papermaking sewage treatment

    NASA Astrophysics Data System (ADS)

    Zhang, Zi Sheng; Xie, Chang; Qing Xiong, Yan; Liu, Zhi Qiang; Li, Qing

    2013-03-01

    A sewage treatment control system was designed to improve the efficiency of a papermaking wastewater treatment system. The automation control system is based on a Field Programmable Gate Array (FPGA), coded in the Very-High-Speed Integrated Circuit Hardware Description Language (VHDL), and compiled and simulated with Quartus. To ensure the stability of the data used in the FPGA, the data are collected through temperature sensors, water level sensors and an online pH measurement system. The automatic control system is more responsive, and both the treatment efficiency and the processing power are increased. This work provides a new method for sewage treatment control.

  20. NASA Wrangler: Automated Cloud-Based Data Assembly in the RECOVER Wildfire Decision Support System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Carroll, Mark; Gill, Roger; Wooten, Margaret; Weber, Keith; Blair, Kindra; May, Jeffrey; Toombs, William

    2017-01-01

    NASA Wrangler is a loosely-coupled, event-driven, highly parallel data aggregation service designed to take advantage of the elastic resource capabilities of cloud computing. Wrangler automatically collects Earth observational data, climate model outputs, derived remote sensing data products, and historic biophysical data for pre-, active-, and post-wildfire decision making. It is a core service of the RECOVER decision support system, which is providing rapid-response GIS analytic capabilities to state and local government agencies. Wrangler reduces to minutes the time needed to assemble and deliver crucial wildfire-related data.

  1. Implementation and flight tests for the Digital Integrated Automatic Landing System (DIALS). Part 2: Complete set of flight data

    NASA Technical Reports Server (NTRS)

    Hueschen, R. M.

    1986-01-01

    Five flight tests of the Digital Integrated Automatic Landing System (DIALS) were conducted on the Advanced Transport Operating System (ATOPS) Transportation Research Vehicle (TSRV), a modified Boeing 737 aircraft for advanced controls and displays research. These flight tests were conducted at NASA's Wallops Flight Center using the Microwave Landing System (MLS) installation on Runway 22. This report is primarily a collection of data plots of all performance variables recorded for the entire five flight tests. A description and the source of the performance variables are included. Performance variables include inertial data, air data, automatic control commands, control servo positions, sensor data, DIALS guidance and control parameters, and Kalman filter data. The data illustrate low-overshoot captures of the localizer for intercept angles of 20 deg, 30 deg, 40 deg, and 50 deg, and low-overshoot captures of the glideslope for 3 deg, 4.5 deg, and 5 deg glideslopes. Flare maneuvers were successfully performed from the various glideslope angles, and good decrab maneuvers were performed in crosswinds of 6 knots. In 18 to 20 knot crosswind conditions, rudder limiting occurred, which caused lateral drifting although heading alignment was achieved.

  2. Effectiveness of an automatic manual wheelchair braking system in the prevention of falls.

    PubMed

    Martorello, Laura; Swanson, Edward

    2006-01-01

    The purpose of this study was to evaluate the effectiveness of an automatic manual wheelchair braking system in reducing falls among patients at high risk of falls while transferring to and from a manual wheelchair. The study design was a normative survey carried out through a written questionnaire sent to 60 skilled nursing facilities to collect data from medical charts identifying patients at high risk for falls who used an automatic wheelchair braking system. The facilities participating in the study reported a frequency of falls of high-risk patients while transferring to and from the wheelchair ranging from 2 to 10 per year, with a median fall rate per facility of 4 falls. One year after the installation of the automatic wheelchair braking system, participating facilities demonstrated a reduction to zero to three falls during transfers by high-risk patients, with a median fall rate of zero falls. This represents a statistically significant reduction of 78% in the fall rate of high-risk patients while transferring to and from the wheelchair, t(18) = 6.39, p < .0001. Incident reports of falls to and from manual wheelchairs were reviewed retrospectively for a 1-year period. This study suggests that high-risk fallers transferring to or from manual wheelchairs sustained significantly fewer falls when the Steddy Mate automatic braking system for manual wheelchairs was installed. The application of the automatic braking system gives clients, families/caregivers, and facility personnel an increased safety factor for the reduction of falls from the wheelchair.

  3. Cloud-Based Smart Health Monitoring System for Automatic Cardiovascular and Fall Risk Assessment in Hypertensive Patients.

    PubMed

    Melillo, P; Orrico, A; Scala, P; Crispino, F; Pecchia, L

    2015-10-01

    The aim of this paper is to describe the design and the preliminary validation of a platform developed to collect and automatically analyze biomedical signals for risk assessment of vascular events and falls in hypertensive patients. This m-health platform, based on cloud computing, was designed to be flexible, extensible, and transparent, and to provide proactive remote monitoring via data-mining functionalities. A retrospective study was conducted to train and test the platform. The developed system was able to predict a future vascular event within the next 12 months with an accuracy rate of 84% and to identify fallers with an accuracy rate of 72%. In an ongoing prospective trial, almost all the recruited patients accepted the system favorably, with a limited rate of non-adherence causing data losses (<20%). The developed platform supported clinical decisions by processing tele-monitored data and providing quick and accurate risk assessment of vascular events and falls.

  4. Automation of a laboratory particleboard press

    Treesearch

    Robert L. Geimer; Gordon H. Stevens; Richard E. Kinney

    1982-01-01

    A manually operated particleboard press was converted to a fully automatic, programable system with updated data collection capabilities. Improved control has permitted observations of very small changes in pressing variables resulting in the development of a technique capable of reducing press times by 70 percent. Accurate control of the press is obtained through an...

  5. Validation of Automated Scoring of Oral Reading

    ERIC Educational Resources Information Center

    Balogh, Jennifer; Bernstein, Jared; Cheng, Jian; Van Moere, Alistair; Townshend, Brent; Suzuki, Masanori

    2012-01-01

    A two-part experiment is presented that validates a new measurement tool for scoring oral reading ability. Data collected by the U.S. government in a large-scale literacy assessment of adults were analyzed by a system called VersaReader that uses automatic speech recognition and speech processing technologies to score oral reading fluency. In the…

  6. Implementation guide for turbidity threshold sampling: principles, procedures, and analysis

    Treesearch

    Jack Lewis; Rand Eads

    2009-01-01

    Turbidity Threshold Sampling uses real-time turbidity and river stage information to automatically collect water quality samples for estimating suspended sediment loads. The system uses a programmable data logger in conjunction with a stage measurement device, a turbidity sensor, and a pumping sampler. Specialized software enables the user to control the sampling...
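
    The triggering logic can be illustrated by the sketch below, which pumps a sample whenever turbidity crosses the next threshold in a preset ladder (rising-limb only, for brevity); the thresholds and readings are invented for the example.

      # A minimal sketch of threshold-triggered sampling on a turbidity stream;
      # real Turbidity Threshold Sampling also handles falling limbs and stage.
      thresholds = [50, 100, 200, 400]  # NTU, spaced to cover the rising limb

      def sampler(readings):
          next_idx = 0
          for t, ntu in readings:                     # (time, turbidity) stream
              if next_idx < len(thresholds) and ntu >= thresholds[next_idx]:
                  print(f"t={t}: trigger pump at {ntu} NTU")
                  next_idx += 1                       # arm the next threshold

      sampler([(0, 20), (1, 60), (2, 150), (3, 90), (4, 420)])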

  7. Automatic and continuous landslide monitoring: the Rotolon Web-based platform

    NASA Astrophysics Data System (ADS)

    Frigerio, Simone; Schenato, Luca; Mantovani, Matteo; Bossi, Giulia; Marcato, Gianluca; Cavalli, Marco; Pasuto, Alessandro

    2013-04-01

    Mount Rotolon (Eastern Italian Alps) is affected by a complex landslide that has been threatening the nearby village of Recoaro Terme since 1985. The first written record of a landslide occurrence dates back to 1798. After the last reactivation in November 2010 (637 mm of intense rainfall recorded in the 12 days prior to the event), a mass of approximately 320,000 m3 detached from the south flank of Mount Rotolon and evolved into a fast debris flow that ran for about 3 km along the stream bed. A real-time monitoring system was required to detect early indications of rapid movement, potentially saving lives and property. A web-based platform for automatic and continuous monitoring was designed as a first step in the implementation of an early-warning system. Measurements collected by the automated geotechnical and topographic instrumentation deployed over the landslide body are gathered in a central box station. After the calibration process, they are transmitted by web services to a local server, where graphs, maps, reports and alert announcements are automatically generated and updated. All the processed information is available via web browser with different access rights. The web environment provides the following advantages: 1) data are collected from different data sources and matched in a single server-side frame; 2) a remote user interface allows regular technical maintenance and direct access to the instruments; 3) the data management system is synchronized and automatically tested; 4) a graphical user interface in the browser provides a user-friendly tool for decision-makers to interact with a continuously updated system. At this site two monitoring systems are currently in operation: 1) a GB-InSAR radar interferometer (University of Florence - Department of Earth Science) and 2) an Automated Total Station (ATS) combined with an extensometer network in a Web-based solution (CNR-IRPI Padova). This work details the methodology, services and techniques adopted for the second monitoring solution. The activity directly interfaces with the local Civil Protection agency, the Regional Geological Service and local authorities with integrated roles and aims.

  8. Game-powered machine learning

    PubMed Central

    Barrington, Luke; Turnbull, Douglas; Lanckriet, Gert

    2012-01-01

    Searching for relevant content in a massive amount of multimedia information is facilitated by accurately annotating each image, video, or song with a large number of relevant semantic keywords, or tags. We introduce game-powered machine learning, an integrated approach to annotating multimedia content that combines the effectiveness of human computation, through online games, with the scalability of machine learning. We investigate this framework for labeling music. First, a socially-oriented music annotation game called Herd It collects reliable music annotations based on the “wisdom of the crowds.” Second, these annotated examples are used to train a supervised machine learning system. Third, the machine learning system actively directs the annotation games to collect new data that will most benefit future model iterations. Once trained, the system can automatically annotate a corpus of music much larger than what could be labeled using human computation alone. Automatically annotated songs can be retrieved based on their semantic relevance to text-based queries (e.g., “funky jazz with saxophone,” “spooky electronica,” etc.). Based on the results presented in this paper, we find that actively coupling annotation games with machine learning provides a reliable and scalable approach to making searchable massive amounts of multimedia data. PMID:22460786

  9. Game-powered machine learning.

    PubMed

    Barrington, Luke; Turnbull, Douglas; Lanckriet, Gert

    2012-04-24

    Searching for relevant content in a massive amount of multimedia information is facilitated by accurately annotating each image, video, or song with a large number of relevant semantic keywords, or tags. We introduce game-powered machine learning, an integrated approach to annotating multimedia content that combines the effectiveness of human computation, through online games, with the scalability of machine learning. We investigate this framework for labeling music. First, a socially-oriented music annotation game called Herd It collects reliable music annotations based on the "wisdom of the crowds." Second, these annotated examples are used to train a supervised machine learning system. Third, the machine learning system actively directs the annotation games to collect new data that will most benefit future model iterations. Once trained, the system can automatically annotate a corpus of music much larger than what could be labeled using human computation alone. Automatically annotated songs can be retrieved based on their semantic relevance to text-based queries (e.g., "funky jazz with saxophone," "spooky electronica," etc.). Based on the results presented in this paper, we find that actively coupling annotation games with machine learning provides a reliable and scalable approach to making searchable massive amounts of multimedia data.

  10. Isothermal thermogravimetric data acquisition analysis system

    NASA Technical Reports Server (NTRS)

    Cooper, Kenneth, Jr.

    1991-01-01

    The description of an Isothermal Thermogravimetric Analysis (TGA) Data Acquisition System is presented. The system consists of software and hardware to perform a wide variety of TGA experiments. The software is written in ANSI C using Borland's Turbo C++. The hardware consists of a 486/25 MHz machine with a Capital Equipment Corp. IEEE488 interface card. The interface is to a Hewlett Packard 3497A data acquisition system using two analog input cards and a digital actuator card. The system provides for 16 TGA rigs with weight and temperature measurements from each rig. Data collection is conducted in three phases. Acquisition is done at a rapid rate during initial startup, at a slower rate during extended data collection periods, and finally at a fast rate during shutdown. Parameters controlling the rate and duration of each phase are user programmable. Furnace control (raising and lowering) is also programmable. Provision is made for automatic restart in the event of power failure or other abnormal terminations. Initial trial runs were conducted to show system stability.
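
    The three-phase schedule can be illustrated as below: fast sampling at startup, slow sampling during the long hold, and fast sampling again at shutdown. Phase durations and rates are illustrative stand-ins for the user-programmable parameters.

      # A minimal sketch of the three-phase acquisition schedule; durations and
      # intervals stand in for the user-programmable parameters.
      phases = [               # (phase name, duration in s, sampling interval in s)
          ("startup",   60,   1),
          ("hold",    3600,  60),
          ("shutdown",  60,   1),
      ]

      def sample_times():
          """Yield the absolute times at which weight/temperature are read."""
          t0 = 0
          for _, duration, interval in phases:
              for t in range(t0, t0 + duration, interval):
                  yield t
              t0 += duration

      times = list(sample_times())
      print(len(times), times[:5], times[-3:])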

  11. Towards a smart glove: arousal recognition based on textile Electrodermal Response.

    PubMed

    Valenza, Gaetano; Lanata, Antonio; Scilingo, Enzo Pasquale; De Rossi, Danilo

    2010-01-01

    This paper investigates the possibility of using the Electrodermal Response, acquired by a sensing fabric glove with embedded textile electrodes, as a reliable means for emotion recognition. All the essential steps of an automatic recognition system are described, from the recording of the physiological data set to feature-based multiclass classification. Data were collected from 35 healthy volunteers during arousal elicitation by means of International Affective Picture System (IAPS) pictures. Experimental results show high discrimination after twenty steps of cross-validation.

  12. Design and fabrication of a prototype for an automatic transport system for transferring human and other wastes to an incinerator unit onboard spacecraft, phase A

    NASA Technical Reports Server (NTRS)

    Labak, L. J.; Remus, G. A.; Mansnerus, R.

    1971-01-01

    Three transport system concepts were experimentally evaluated for transferring human and nonhuman wastes from a collection site to an incineration unit onboard spacecraft. The operating parameters, merits, and shortcomings of a porous-pneumatic, nozzle-pneumatic, and a mechanical screw-feed system were determined. An analysis of the test data was made and a preliminary design of two prototype systems was prepared.

  13. Verifying the Comprehensive Nuclear-Test-Ban Treaty by Radioxenon Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringbom, Anders

    2005-05-24

    The current status of the ongoing establishment of a verification system for the Comprehensive Nuclear-Test-Ban Treaty using radioxenon detection is discussed. As an example of equipment used in this application the newly developed fully automatic noble gas sampling and detection system SAUNA is described, and data collected with this system are discussed. It is concluded that the most important remaining scientific challenges in the field concern event categorization and meteorological backtracking.

  14. An Automated Web Diary System for TeleHomeCare Patient Monitoring

    PubMed Central

    Ganzinger, Matthias; Demiris, George; Finkelstein, Stanley M.; Speedie, Stuart; Lundgren, Jan Marie

    2001-01-01

    The TeleHomeCare project monitors home care patients via the Internet. Each patient has a personalized homepage with an electronic diary for collecting the monitoring data with HTML forms. The web pages are generated dynamically using PHP. All data are stored in a MySQL database. Data are checked immediately by the system; if a value exceeds a predefined limit an alarm message is generated and sent automatically to the patient's case manager. Weekly graphical reports (PDF format) are also generated and sent by email to the same destination.
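
    The alarm rule can be illustrated by the sketch below (in Python, as a stand-in for the original PHP implementation): if a submitted diary value crosses a predefined limit, a message for the case manager is generated. The monitored variables and limits are illustrative assumptions.

      # A minimal sketch of the diary's alarm rule; variables, limits, and the
      # alert transport are illustrative (the original used PHP/MySQL and email).
      limits = {"weight_kg": 90.0, "fev1_l": 1.2}      # per-patient thresholds

      def check_entry(entry):
          alerts = []
          if entry["weight_kg"] > limits["weight_kg"]:  # weight above limit
              alerts.append(f"weight {entry['weight_kg']} kg exceeds limit")
          if entry["fev1_l"] < limits["fev1_l"]:        # lung function below limit
              alerts.append(f"FEV1 {entry['fev1_l']} L below limit")
          return alerts

      print(check_entry({"weight_kg": 92.5, "fev1_l": 1.0}))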

  15. Five Years of Citizen Science: Macroseismic Data Collection with the USGS Community Internet Intensity Maps ("Did You Feel It?")

    NASA Astrophysics Data System (ADS)

    Quitoriano, V.; Wald, D. J.; Dewey, J. W.; Hopper, M.; Tarr, A.

    2003-12-01

    The U.S. Geological Survey Community Internet Intensity Map (CIIM) is an automatic Web-based system for rapidly generating seismic intensity maps based on shaking and damage reports collected from Internet users immediately following felt earthquakes in the United States. The data collection procedure is fundamentally Citizen Science. The vast majority of data are contributed by non-specialists, describing their own experiences of earthquakes. Internet data contributed by the public have profoundly changed the approach, coverage and usefulness of intensity observation in the U.S. We now typically receive thousands of individual questionnaire responses for widely felt earthquakes. After five years, these total over 350,000 individual entries nationwide, including entries from all 50 States, the District of Columbia, as well as territories of Guam, the Virgin Islands and Puerto Rico. The widespread access and use of online felt reports have added unanticipated but welcome capacities to USGS earthquake reporting. We can more easily validate earthquake occurrence in poorly instrumented regions, identify and locate sonic booms, and readily gauge societal importance of earthquakes by the nature of the response. In some parts of the U.S., CIIM provides constraints on earthquake magnitudes and focal depths beyond those provided by instrumental data, and the data are robust enough to test regionalized models of ground-motion attenuation. CIIM invokes an enthusiastic response from members of the public who contribute to it; it clearly provides an important opportunity for public education and outreach. In this paper we provide background on advantages and limitations of on-line data collection and explore recent developments and improvements to the CIIM system, including improved quality assurance using a relational database and greater data availability for scientific and sociological studies. We also describe a number of post-processing tools and applications that make use of the extensive intensity data sets now gathered. These new applications include automatic location and magnitude determination, estimating ground motions from the intensity observations thereby augmenting ShakeMap, automatic geocoding to allow for more refined intensity localization, and recovering higher precision decimal intensities rather than limiting intensities to integer values. Because of differences in the data and procedure, CIIM intensities are not strictly comparable to intensities assigned with the Modified Mercalli scale. Hence, continued collection of traditional macroseismic data will be essential to calibrate our understanding of CIIM intensities, and, conversely, CIIM data will improve our understanding of conventional macroseismic intensities. CIIM can be found online at http://earthquake.usgs.gov under "Did You Feel It?".

  16. Cloud-ECG for real time ECG monitoring and analysis.

    PubMed

    Xia, Henian; Asif, Irfan; Zhao, Xiaopeng

    2013-06-01

    Recent advances in mobile technology and cloud computing have inspired numerous designs of cloud-based health care services and devices. Within the cloud system, medical data can be collected and transmitted automatically to medical professionals from anywhere, and feedback can be returned to patients through the network. In this article, we developed a cloud-based system for clients with mobile devices or web browsers. Specifically, we aim to address the issues regarding the usefulness of the ECG data collected from patients themselves. Algorithms for ECG enhancement, ECG quality evaluation and ECG parameter extraction were implemented in the system. The system was demonstrated by a use case, in which ECG data were uploaded to the web server from a mobile phone at a certain frequency and analysis was performed in real time on the server. The system has been proven to be functional, accurate and efficient. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    PubMed

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) was proven to be effective, applicable on a large scale, and suitable for any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
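
    The reproducibility statistic can be illustrated as below: the coefficient of variation of a weekly IQI series, plus its shift from the baseline value; the readings are invented for the example.

      # A minimal sketch of the COV reproducibility check on one image-quality
      # index; the weekly readings and baseline are illustrative.
      import statistics

      def cov(values):
          """Coefficient of variation, in percent."""
          return 100.0 * statistics.stdev(values) / statistics.mean(values)

      weekly_iqi = [1.02, 0.99, 1.01, 1.00, 0.97, 1.03]   # one index, one system
      baseline = 1.00

      print(f"COV = {cov(weekly_iqi):.1f}%")              # overall variability
      print(f"baseline shift = {100 * (statistics.mean(weekly_iqi) / baseline - 1):.1f}%")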

  18. 75 FR 62098 - Proposed Information Collection; Comment Request; Expanded Vessel Monitoring System Requirement...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-07

    ... Collection; Comment Request; Expanded Vessel Monitoring System Requirement in the Pacific Coast Groundfish... and use a vessel monitoring system (VMS) that automatically sends hourly position reports. Exemptions... declaration reporting system are not expected to change the public reporting burden. II. Method of Collection...

  19. Development of a portable bicycle/pedestrian monitoring system for safety enhancement

    NASA Astrophysics Data System (ADS)

    Usher, Colin; Daley, W. D. R.

    2015-03-01

    Pedestrians involved in roadway accidents account for nearly 12 percent of all traffic fatalities and 59,000 injuries each year. Most injuries occur when pedestrians attempt to cross roads, and there have been noted differences in accident rates midblock vs. at intersections. Collecting data on pedestrian behavior is a time-consuming manual process that is prone to error. This leads to a lack of quality information to guide the proper design of lane markings and traffic signals to enhance pedestrian safety. Researchers at the Georgia Tech Research Institute are developing and testing an automated system that can be rapidly deployed for data collection to support the analysis of pedestrian behavior at intersections and midblock crossings with and without traffic signals. This system analyzes the collected video data to automatically identify and characterize the number of pedestrians and their behavior. It consists of a mobile trailer with four high-definition pan-tilt cameras for data collection. The software is custom designed and uses state-of-the-art commercial pedestrian detection algorithms. We present the system hardware and software design, challenges, and results from the preliminary system testing. Preliminary results indicate the ability to provide representative quantitative data on pedestrian motion more efficiently than current techniques.

  20. Connecticut Biodiesel Power Generation Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grannis, Lee; York, Carla R.

    Sabre will continue support of the emissions equipment and VARS issues to ensure all are resolved and the system is functioning as expected, the remote data collection is to become more automated, and final project reports for data collection and system performance are to be generated. Sabre continued to support the emissions equipment and VARS issues to ensure all were resolved and the system functioned as expected. The remote data collection became more automated. Final project reports for data collection and system performance were generated and are part of this final report. Some system sensors were replaced due to a lightning strike. Sample data charts are shown at the end of the report. During the project, Sabre Engineering provided support to the project team with regard to troubleshooting technical issues and system integration with the local power utility company. The resulting lessons learned through Sabre's participation in the project have been valuable to the integrity of the data collected, as well as in providing BioPur Light & Power valuable insights into future operations and planning for possible expansion. The system monitoring and data collection system has been operating as designed and continues to provide relevant information to the system operators. The information routinely gathered automatically by the system also contributes to the REN and REC validations which are required to secure credit for these items. During the quarter, the remaining work on the operations and safety manual was completed and released for publication after screen shots were verified. The goal of this effort was to provide an accurate set of precautions and procedures for the technology system that can be replicated for other similar systems.

  1. Development and testing of a portable wind sensitive directional air sampler

    NASA Technical Reports Server (NTRS)

    Deyo, J.; Toma, J.; King, R. B.

    1975-01-01

    A portable wind sensitive directional air sampler was developed as part of an air pollution source identification system. The system is designed to identify sources of air pollution based on the directional collection of field air samples and their analysis for TSP and trace element characteristics. Sources can be identified by analyzing the data on the basis of pattern recognition concepts. The unit, designated Air Scout, receives wind direction signals from an associated wind vane. Air samples are collected on filter slides using a standard high volume air sampler drawing air through a porting arrangement which tracks the wind direction and permits collection of discrete samples. A preset timer controls the length of time each filter is in the sampling position. At the conclusion of the sampling period a new filter is automatically moved into sampling position displacing the previous filter to a storage compartment. Thus the Air Scout may be set up at a field location, loaded with up to 12 filter slides, and left to acquire air samples automatically, according to the wind, at any timer interval desired from 1 to 30 hours.

  2. Designing Extensible Data Management for Ocean Observatories, Platforms, and Devices

    NASA Astrophysics Data System (ADS)

    Graybeal, J.; Gomes, K.; McCann, M.; Schlining, B.; Schramm, R.; Wilkin, D.

    2002-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) has been collecting science data for 15 years from all kinds of oceanographic instruments and systems, and is building a next-generation observing system, the MBARI Ocean Observing System (MOOS). To meet the data management requirements of the MOOS, the Institute began developing a flexible, extensible data management solution, the Shore Side Data System (SSDS). This data management system must address a wide variety of oceanographic instruments and data sources, including instruments and platforms of the future. Our data management solution will address all elements of the data management challenge, from ingest (including suitable pre-definition of metadata) through to access and visualization. Key to its success will be ease of use, and automatic incorporation of new data streams and data sets. The data will be of many different forms, and come from many different types of instruments. Instruments will be designed for fixed locations (as with moorings), changing locations (drifters and AUVs), and cruise-based sampling. Data from airplanes, satellites, models, and external archives must also be considered. Providing an architecture which allows data from these varied sources to be automatically archived and processed, yet readily accessed, is only possible with the best practices in metadata definition, software design, and re-use of third-party components. The current status of SSDS development will be presented, including lessons learned from our science users and from previous data management designs.

  3. East Europe Report, Scientific Affairs, Number 749

    DTIC Science & Technology

    1982-07-23

    Scientific Associate A. Andreev, Corresponding Member D. Shopov, Scientific Associate B. Kunev, V. Idakiev and A. Bankova. A collective at the Economic... operational systems: manual, automatic and remote control; control system: ...; measurement: automatic and in percentage figures; power...

  4. SPAR reference manual. [for stress analysis

    NASA Technical Reports Server (NTRS)

    Whetstone, W. D.

    1974-01-01

    SPAR is a system of related programs which may be operated either in batch or demand (teletype) mode. Information exchange between programs is automatically accomplished through one or more direct access libraries, known collectively as the data complex. Card input is command-oriented, in free-field form. Capabilities available in the first production release of the system are fully documented, and include linear stress analysis, linear bifurcation buckling analysis, and linear vibrational analysis.

  5. Integrated exhaust gas analysis system for aircraft turbine engine component testing

    NASA Technical Reports Server (NTRS)

    Summers, R. L.; Anderson, R. C.

    1985-01-01

    An integrated exhaust gas analysis system was designed and installed in the hot-section facility at the Lewis Research Center. The system is designed to operate either manually or automatically and also to be operated from a remote station. The system measures oxygen, water vapor, total hydrocarbons, carbon monoxide, carbon dioxide, and oxides of nitrogen. Two microprocessors control the system and the analyzers, collect data and process them into engineering units, and present the data to the facility computers and the system operator. Within the design of this system there are innovative concepts and procedures that are of general interest and application to other gas analysis tasks.

  6. Study on Remote Monitoring System of Crossing and Spanning Tangent Tower

    NASA Astrophysics Data System (ADS)

    Chen, Da-bing; Zhang, Nai-long; Zhang, Meng-ge; Wang, Ze-hua; Zhang, Yan

    2017-05-01

    In order to grasp the vibration state of an overhead transmission line and ensure the operational security of the line, a remote monitoring system for crossing and spanning tangent towers was studied. With this system, the displacement, velocity and acceleration of the tower, together with local weather data, are collected automatically and displayed on a computer at the remote monitoring centre through a wireless network, realizing real-time collection and transmission of vibration signals. Application results show that the system offers excellent reliability and accuracy. The system can be used for remote monitoring of transmission towers of UHV power transmission lines and in large spanning areas.

  7. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support the integration of subjects' image data, although medical imaging looms large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system for clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images collected in the study so far have been correctly identified and successfully integrated into the corresponding subjects' eCRFs. Using this system, manual steps for the study personnel are reduced, and therefore errors, latency and costs are decreased. Our approach also increases data security and privacy.
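
    The color-calibration step can be illustrated by the sketch below, which scales each RGB channel so the card's white patch maps to its nominal value; the patch values are illustrative, and the App's actual calibration also corrects geometry and contrast.

      # A minimal sketch of per-channel color calibration against a reference
      # card's white patch; all pixel values are invented for the example.
      import numpy as np

      image = np.random.randint(0, 256, (4, 4, 3)).astype(float)  # stand-in photo
      measured_white = np.array([220.0, 205.0, 190.0])  # white patch as imaged
      nominal_white = np.array([255.0, 255.0, 255.0])   # its known reflectance

      gain = nominal_white / measured_white             # per-channel correction
      calibrated = np.clip(image * gain, 0, 255).astype(np.uint8)
      print(gain.round(3))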

  8. MEMS product engineering: methodology and tools

    NASA Astrophysics Data System (ADS)

    Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer

    2011-03-01

    The development of MEMS comprises the structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design and vice versa; product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology, the authors introduce a software environment that links commercial design tools from both areas into a common design flow. In this paper, emphasis is put on automatic, low-threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step, software tools are presented that automatically extract data from spreadsheets or file systems and put them in context with existing information. The developments are currently carried out in a European research project.
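
    As a concrete picture of such low-threshold acquisition, the sketch below sweeps a directory tree for spreadsheets and collects every sheet into one table tagged with its origin. It is a minimal sketch in Python with pandas; the directory layout and tag columns are assumptions, not the project's actual tool.

        # Sketch of low-threshold data acquisition: sweep a directory tree for
        # spreadsheets and gather every sheet into one provenance-tagged table.
        # Paths and tag column names are illustrative assumptions.
        from pathlib import Path

        import pandas as pd

        def harvest(root: str) -> pd.DataFrame:
            frames = []
            for path in Path(root).rglob("*.xlsx"):
                # sheet_name=None loads every sheet as a {name: DataFrame} dict
                for sheet, df in pd.read_excel(path, sheet_name=None).items():
                    frames.append(df.assign(source_file=str(path), source_sheet=sheet))
            return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()

        data = harvest("process_development/")
        print(data.groupby("source_file").size())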

  9. MAC, A System for Automatically IPR Identification, Collection and Distribution

    NASA Astrophysics Data System (ADS)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the digital world is a very hard challenge. The ease of creating multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. One of the industries most affected by this is the music industry, which has suffered huge losses over the last few years as a result. Moreover, this is also affecting the way that music rights collecting and distributing societies operate to assure correct music IPR identification, collection and distribution. In this article a system for automating this IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio fingerprinting technology. This paper presents the details of the system and a use-case scenario where the system is being used.
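
    The fingerprint-and-match idea can be caricatured in a few lines: reduce a signal to its dominant spectrogram peaks and compare peak sets. This is a didactic sketch only, not the identification technology the MAC system uses.

        # Toy audio fingerprint: hash pairs of consecutive spectrogram peak bins,
        # then match recordings by the overlap of their hash sets. Didactic only.
        import numpy as np
        from scipy.signal import spectrogram

        def fingerprint(audio: np.ndarray, rate: int) -> set:
            _, _, sxx = spectrogram(audio, fs=rate, nperseg=1024)
            peaks = np.argmax(sxx, axis=0)  # strongest frequency bin per frame
            # Pairing consecutive peaks makes the hash set shift-tolerant.
            return {(int(a), int(b)) for a, b in zip(peaks[:-1], peaks[1:])}

        def similarity(fp_a: set, fp_b: set) -> float:
            return len(fp_a & fp_b) / max(len(fp_a | fp_b), 1)

        rate = 8000
        t = np.arange(0, 2.0, 1.0 / rate)
        original = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
        noisy_copy = original + 0.01 * np.random.default_rng(0).normal(size=t.size)
        print(similarity(fingerprint(original, rate), fingerprint(noisy_copy, rate)))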

  10. Information Robots and Manipulators.

    ERIC Educational Resources Information Center

    Katys, G. P.; And Others

    In the modern conception, a robot is a complex automatic cybernetic system capable of executing various operations in the sphere of human activity and, in various respects, combining imitations of the physical and mental activity of man. Robots are a class of automatic information systems intended for search, collection, processing, and…

  11. MOLGENIS/connect: a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks.

    PubMed

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; van der Velde, K Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans; Swertz, Morris A

    2016-07-15

    While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. To address this challenge, we developed MOLGENIS/connect, a semi-automatic system to find, match and pool data from different sources. The system shortlists relevant source attributes from thousands of candidates using ontology-based query expansion to overcome variations in terminology. Then it generates algorithms that transform source attributes to a common target DataSchema. These include unit conversion, categorical value matching and complex conversion patterns (e.g. calculation of BMI). In comparison to human experts, MOLGENIS/connect was able to auto-generate 27% of the algorithms perfectly, with an additional 46% needing only minor editing, representing a reduction in the human effort and expertise needed to pool data. Availability and implementation: Source code, binaries and documentation are available as open-source under LGPLv3 from http://github.com/molgenis/molgenis and www.molgenis.org/connect. Contact: m.a.swertz@rug.nl. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
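
    The generated transformation algorithms mentioned above (unit conversion, categorical value matching, derived values such as BMI) amount to small mapping functions from source attributes to the target DataSchema. A minimal sketch under assumed source attribute names follows; MOLGENIS/connect emits such algorithms in its own scripting syntax rather than Python.

        # Sketch of the kind of attribute-mapping algorithm the system
        # auto-generates. Source attribute names are assumptions.
        def map_record(source: dict) -> dict:
            weight_kg = source["weight_lbs"] * 0.45359237        # unit conversion
            height_m = source["height_cm"] / 100.0               # unit conversion
            smoking = {"y": "current", "n": "never"}.get(source["smokes"], "unknown")
            return {
                "weight": weight_kg,
                "height": height_m,
                "bmi": weight_kg / height_m ** 2,                # complex conversion
                "smoking_status": smoking,                       # categorical matching
            }

        print(map_record({"weight_lbs": 165, "height_cm": 180, "smokes": "y"}))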

  12. MOLGENIS/connect: a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks

    PubMed Central

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; van der Velde, K. Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans

    2016-01-01

    Motivation: While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. Results: To address this challenge, we developed MOLGENIS/connect, a semi-automatic system to find, match and pool data from different sources. The system shortlists relevant source attributes from thousands of candidates using ontology-based query expansion to overcome variations in terminology. Then it generates algorithms that transform source attributes to a common target DataSchema. These include unit conversion, categorical value matching and complex conversion patterns (e.g. calculation of BMI). In comparison to human experts, MOLGENIS/connect was able to auto-generate 27% of the algorithms perfectly, with an additional 46% needing only minor editing, representing a reduction in the human effort and expertise needed to pool data. Availability and Implementation: Source code, binaries and documentation are available as open-source under LGPLv3 from http://github.com/molgenis/molgenis and www.molgenis.org/connect. Contact: m.a.swertz@rug.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153686

  13. AMP: A platform for managing and mining data in the treatment of Autism Spectrum Disorder.

    PubMed

    Linstead, Erik; Burns, Ryan; Duy Nguyen; Tyler, David

    2016-08-01

    We introduce AMP (Autism Management Platform), an integrated health care information system for capturing, analyzing, and managing data associated with the diagnosis and treatment of Autism Spectrum Disorder in children. AMP's mobile application simplifies the means by which parents, guardians, and clinicians can collect and share multimedia data with one another, facilitating communication and reducing data redundancy while simplifying retrieval. Additionally, AMP provides an intelligent web interface and analytics platform that allows physicians and specialists to aggregate and mine patient data in real time and to give relevant feedback from which the system automatically learns data-filtering preferences over time. Together, AMP's mobile app, web client, and analytics engine implement a rich set of features that streamline the data collection and analysis process in the context of a secure and easy-to-use system, so that data may be more effectively leveraged to guide treatment.

  14. A microcomputer-based data acquisition and control system for the direct shear, ring shear, triaxial shear, and consolidation tests

    USGS Publications Warehouse

    Powers, Philip S.

    1983-01-01

    This report is intended to provide internal documentation for the U.S. Geological Survey laboratory's automatic data acquisition system. The operating procedures for each type of test are designed to independently lead a first-time user through the various stages of using the computer to control the test. Continuing advances in computer technology and the availability of desktop microcomputers with a wide variety of peripheral equipment at a reasonable cost can create an efficient automated geotechnical testing environment. A geotechnical testing environment is shown in figure 1. Using an automatic data acquisition system, laboratory test data from a variety of sensors can be collected, and manually or automatically recorded on a magnetic device at the same apparent time. The responses of a test can be displayed graphically on a CRT in a matter of seconds, giving the investigator an opportunity to evaluate the test data, and to make timely, informed decisions on such matters as whether to continue testing, abandon a test, or modify procedures. Data can be retrieved and results reported in tabular form, or graphic plots, suitable for publication. Thermistors, thermocouples, load cells, pressure transducers, and linear variable differential transformers are typical sensors which are incorporated in automated systems. The geotechnical tests which are most practical to automate are the long-term tests which often require readings to be recorded outside normal work hours and on weekends. Automation applications include incremental load consolidation tests, constant-rate-of-strain consolidation tests, direct shear tests, ring shear tests, and triaxial shear tests.

  15. An automated, broad-based, near real-time public health surveillance system using presentations to hospital Emergency Departments in New South Wales, Australia

    PubMed Central

    Muscatello, David J; Churches, Tim; Kaldor, Jill; Zheng, Wei; Chiu, Clayton; Correll, Patricia; Jorm, Louisa

    2005-01-01

    Background: In a climate of concern over bioterrorism threats and emergent diseases, public health authorities are trialling more timely surveillance systems. The 2003 Rugby World Cup (RWC) provided an opportunity to test the viability of a near real-time syndromic surveillance system in metropolitan Sydney, Australia. We describe the development and early results of this largely automated system that used data routinely collected in Emergency Departments (EDs). Methods: Twelve of 49 EDs in the Sydney metropolitan area automatically transmitted surveillance data from their existing information systems to a central database in near real-time. Information captured for each ED visit included patient demographic details, presenting problem and nursing assessment entered as free-text at triage time, physician-assigned provisional diagnosis codes, and status at departure from the ED. Both diagnoses from the EDs and triage text were used to assign syndrome categories. The text information was automatically classified into one or more of 26 syndrome categories using automated "naïve Bayes" text categorisation techniques. Automated processes were used to analyse both diagnosis and free text-based syndrome data and to produce web-based statistical summaries for daily review. An adjusted cumulative sum (cusum) was used to assess the statistical significance of trends. Results: During the RWC the system did not identify any major public health threats associated with the tournament, mass gatherings or the influx of visitors. This was consistent with evidence from other sources, although two known outbreaks were already in progress before the tournament. Limited baseline in early monitoring prevented the system from automatically identifying these ongoing outbreaks. Data capture was invisible to clinical staff in EDs and did not add to their workload. Conclusion: We have demonstrated the feasibility and potential utility of syndromic surveillance using routinely collected data from ED information systems. Key features of our system are its nil impact on clinical staff, and its use of statistical methods to assign syndrome categories based on clinical free text information. The system is ongoing, and has expanded to cover 30 EDs. Results of formal evaluations of both the technical efficiency and the public health impacts of the system will be described subsequently. PMID:16372902
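
    The free-text categorization step described above can be approximated with standard components: a bag-of-words model feeding a naïve Bayes classifier. The sketch below uses scikit-learn with invented triage strings and a single label per note; the actual system assigns up to 26 syndrome categories, and its training corpus is not reproduced here.

        # Minimal sketch of naive Bayes categorization of triage free text.
        # Training notes and labels are invented, single-label for simplicity.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        triage_notes = [
            "fever and dry cough for three days",
            "vomiting and diarrhoea overnight",
            "itchy rash and high temperature",
            "loose stools with abdominal cramps",
        ]
        syndromes = ["respiratory", "gastrointestinal", "rash", "gastrointestinal"]

        model = make_pipeline(CountVectorizer(), MultinomialNB())
        model.fit(triage_notes, syndromes)
        print(model.predict(["child with fever and cough"]))  # -> ['respiratory']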

  16. Inductive System Health Monitoring

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    2004-01-01

    The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real-time monitoring. IMS uses nominal data sets collected either directly from the system or from simulations to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. IMS is able to monitor the system by comparing real-time operational data with these classes. We present a description of the learning and monitoring methods used by IMS and summarize some recent IMS results.
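
    The IMS idea, learning classes of nominal sensor vectors from archived data and flagging live vectors that fall far from every class, can be imitated with ordinary clustering. The sketch below uses k-means and a fixed distance threshold as stand-ins; IMS's actual class-building algorithm differs.

        # Sketch of inductive monitoring: cluster archived nominal sensor
        # vectors, then flag live vectors far from every cluster center.
        # k and the threshold rule are arbitrary illustrative choices.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        nominal = rng.normal(loc=[20.0, 1.0], scale=0.1, size=(500, 2))

        classes = KMeans(n_clusters=4, n_init=10, random_state=0).fit(nominal)
        threshold = 3 * float(np.median(classes.transform(nominal).min(axis=1)))

        def is_anomalous(sample: np.ndarray) -> bool:
            return float(classes.transform(sample.reshape(1, -1)).min()) > threshold

        print(is_anomalous(np.array([20.0, 1.05])))  # nominal reading -> False
        print(is_anomalous(np.array([25.0, 0.20])))  # off-nominal     -> True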

  17. A system design of data acquisition and processing for side-scatter lidar

    NASA Astrophysics Data System (ADS)

    Zhang, ZhanYe; Xie, ChenBo; Wang, ZhenZhu; Kuang, ZhiQiang; Deng, Qian; Tao, ZongMing; Liu, Dong; Wang, Yingjian

    2018-03-01

    A system for collecting data from a side-scatter lidar based on a charge-coupled device (CCD) is designed and implemented. The data acquisition system is based on the Microsoft .NET framework, and C# is used to call the CCD's dynamic link library (DLL) to realize real-time data acquisition and processing. The software stores data as txt files for later analysis. The system is able to operate the CCD device in all-day, automatic, continuous and high-frequency data acquisition and processing conditions, capturing 24-hour information on the intensity of light scattered by the atmosphere and retrieving the spatial and temporal properties of aerosol particles. Experimental results show that the system is convenient for observing aerosol optical characteristics near the surface.
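
    The acquisition loop, a host program repeatedly pulling frames from the camera through a vendor DLL, has the same shape in any language. The sketch below uses Python's ctypes against a hypothetical "ccd_driver.dll" with invented entry points, purely to illustrate the pattern; the system described above does this from C# on .NET.

        # Illustration of driving a vendor CCD DLL in a continuous capture loop.
        # "ccd_driver.dll" and its exported functions are hypothetical.
        import ctypes
        import time

        ccd = ctypes.WinDLL("ccd_driver.dll")  # hypothetical vendor DLL
        ccd.OpenCamera.restype = ctypes.c_int
        ccd.ReadFrame.argtypes = [ctypes.POINTER(ctypes.c_ushort), ctypes.c_int]

        WIDTH, HEIGHT = 2048, 2048
        frame = (ctypes.c_ushort * (WIDTH * HEIGHT))()

        if ccd.OpenCamera() == 0:
            for _ in range(60):  # one hour at a one-minute cadence
                ccd.ReadFrame(frame, WIDTH * HEIGHT)
                name = time.strftime("scan_%Y%m%d_%H%M%S.txt")
                with open(name, "w") as f:  # txt output, as in the record
                    f.write(" ".join(str(v) for v in frame[:WIDTH]))
                time.sleep(60)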

  18. A design for the geoinformatics system

    NASA Astrophysics Data System (ADS)

    Allison, M. L.

    2002-12-01

    Informatics integrates and applies information technologies with scientific and technical disciplines. A geoinformatics system targets the spatially based sciences. The system is not a master database, but will collect pertinent information from disparate databases distributed around the world. Seamless interoperability of databases promises quantum leaps in productivity not only for scientific researchers but also for many areas of society including business and government. The system will incorporate: acquisition of analog and digital legacy data; efficient information and data retrieval mechanisms (via data mining and web services); accessibility to and application of visualization, analysis, and modeling capabilities; online workspace, software, and tutorials; GIS; integration with online scientific journal aggregates and digital libraries; access to real time data collection and dissemination; user-defined automatic notification and quality control filtering for selection of new resources; and application to field techniques such as mapping. In practical terms, such a system will provide the ability to gather data over the Web from a variety of distributed sources, regardless of computer operating systems, database formats, and servers. Search engines will gather data about any geographic location, above, on, or below ground, covering any geologic time, and at any scale or detail. A distributed network of digital geolibraries can archive permanent copies of databases at risk of being discontinued and those that continue to be maintained by the data authors. The geoinformatics system will generate results from widely distributed sources to function as a dynamic data network. Instead of posting a variety of pre-made tables, charts, or maps based on static databases, the interactive dynamic system creates these products on the fly, each time an inquiry is made, using the latest information in the appropriate databases. Thus, in the dynamic system, a map generated today may differ from one created yesterday and one to be created tomorrow, because the databases used to make it are constantly (and sometimes automatically) being updated.

  19. Urine sampling and collection system

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Reinhardt, C. G.

    1971-01-01

    This specification defines the performance and design requirements for the urine sampling and collection system engineering model and establishes requirements for its design, development, and test. The model shall provide conceptual verification of a system applicable to manned space flight which will automatically provide for collection, volume sensing, and sampling of urine.

  20. Deep Learning Methods for Quantifying Invasive Benthic Species in the Great Lakes

    NASA Astrophysics Data System (ADS)

    Billings, G.; Skinner, K.; Johnson-Roberson, M.

    2017-12-01

    In recent decades, invasive species such as the round goby and dreissenid mussels have greatly impacted the Great Lakes ecosystem. It is critical to monitor these species, model their distribution, and quantify the impacts on the native fisheries and surrounding ecosystem in order to develop an effective management response. However, data collection in underwater environments is challenging and expensive. Furthermore, the round goby is typically found in rocky habitats, which are inaccessible to standard survey techniques such as bottom trawling. In this work we propose a robotic system for visual data collection to automatically detect and quantify invasive round gobies and mussels in the Great Lakes. Robotic platforms equipped with cameras can perform efficient, cost-effective, low-bias benthic surveys. This data collection can be further optimized through automatic detection and annotation of the target species. Deep learning methods have shown success in image recognition tasks. However, these methods often rely on labelled training datasets of up to millions of images. Hand labeling large numbers of images is expensive and often impracticable. Furthermore, data collected in the field may be sparse when only considering images that contain the objects of interest. It is easier to collect dense, clean data in controlled lab settings, but such data are not a realistic representation of real field environments. In this work, we propose a deep learning approach to generate a large set of labelled training data representative of underwater environments in the field. To generate these images, we first draw random sample images of individual fish and mussels from a library of images captured in a controlled lab environment. Next, these randomly drawn samples are automatically merged into natural background images. Finally, we use a generative adversarial network (GAN) that incorporates constraints from the physical model of underwater light propagation to simulate the process of underwater image formation in various water conditions. The output of the GAN will be realistic-looking annotated underwater images. This generated dataset will be used to train a classifier to identify round gobies and mussels in order to measure the biomass and abundance of these invasive species in the Great Lakes.
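
    The first, non-GAN stage of that pipeline, pasting lab-captured specimens into natural background images so the labels come for free, is easy to sketch. The file names below are placeholders, and the GAN-based water-column restyling is deliberately left out.

        # Sketch of the composite stage: paste a lab-photographed specimen
        # (with alpha channel) at a random position on a seafloor background
        # and emit its bounding box as the training label.
        import random

        from PIL import Image

        def composite(background_path: str, specimen_path: str):
            bg = Image.open(background_path).convert("RGBA")
            fg = Image.open(specimen_path).convert("RGBA")  # assumed smaller than bg
            x = random.randint(0, bg.width - fg.width)
            y = random.randint(0, bg.height - fg.height)
            bg.alpha_composite(fg, dest=(x, y))
            label = ("round_goby", x, y, x + fg.width, y + fg.height)
            return bg.convert("RGB"), label

        image, bbox = composite("seafloor_001.png", "goby_lab_017.png")
        image.save("train_000001.jpg")
        print(bbox)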

  1. Automatic Coding of Short Text Responses via Clustering in Educational Assessment

    ERIC Educational Resources Information Center

    Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank

    2016-01-01

    Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
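
    The clustering route to automatic coding can be pictured with off-the-shelf parts: vectorize the short responses, cluster them, and let a human label each cluster once instead of coding every response. A sketch with invented responses follows; the study's actual pipeline and data are not reproduced.

        # Sketch of coding short text responses by clustering: TF-IDF vectors
        # plus k-means. Responses are invented.
        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import TfidfVectorizer

        responses = [
            "the sun heats the water so it evaporates",
            "water evaporates because of the sun's heat",
            "plants need light to grow",
            "sunlight makes the plants grow",
        ]
        X = TfidfVectorizer().fit_transform(responses)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        for cluster, text in sorted(zip(labels, responses)):
            print(cluster, text)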

  2. A real-time measurement system for long-life flood monitoring and warning applications.

    PubMed

    Marin-Perez, Rafael; García-Pintado, Javier; Gómez, Antonio Skarmeta

    2012-01-01

    A flood warning system incorporates telemetered rainfall and flow/water-level data measured at various locations in the catchment area. Real-time accurate data collection is required for this use, and sensor networks improve the system capabilities. However, existing sensor nodes struggle to satisfy the hydrological requirements in terms of autonomy, sensor hardware compatibility, reliability and long-range communication. We describe the design and development of a real-time measurement system for flood monitoring, and its deployment in a flash-flood prone 650 km² semiarid watershed in Southern Spain. A purpose-developed low-power, long-range communication device, DatalogV1, provides automatic data gathering and reliable transmission. DatalogV1 incorporates self-monitoring for adapting measurement schedules, for consumption management, and to capture events of interest. Two tests are used to assess the success of the development. The results show an autonomous and robust monitoring system for long-term collection of water-level data in many sparse locations during flood events.

  3. A Real-Time Measurement System for Long-Life Flood Monitoring and Warning Applications

    PubMed Central

    Marin-Perez, Rafael; García-Pintado, Javier; Gómez, Antonio Skarmeta

    2012-01-01

    A flood warning system incorporates telemetered rainfall and flow/water-level data measured at various locations in the catchment area. Real-time accurate data collection is required for this use, and sensor networks improve the system capabilities. However, existing sensor nodes struggle to satisfy the hydrological requirements in terms of autonomy, sensor hardware compatibility, reliability and long-range communication. We describe the design and development of a real-time measurement system for flood monitoring, and its deployment in a flash-flood prone 650 km² semiarid watershed in Southern Spain. A purpose-developed low-power, long-range communication device, DatalogV1, provides automatic data gathering and reliable transmission. DatalogV1 incorporates self-monitoring for adapting measurement schedules, for consumption management, and to capture events of interest. Two tests are used to assess the success of the development. The results show an autonomous and robust monitoring system for long-term collection of water-level data in many sparse locations during flood events. PMID:22666028
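
    The self-monitoring behaviour credited to DatalogV1 in the two records above, densifying the measurement schedule when an event of interest begins, reduces to a small control loop. The thresholds and the sensor stub below are invented; the actual device is embedded hardware, not Python.

        # Sketch of an adaptive sampling schedule: sample slowly to save power,
        # switch to a fast schedule while the water level is rising quickly.
        import random
        import time

        def read_level_cm() -> float:  # stub for the water-level sensor
            return 120.0 + random.uniform(-1.0, 1.0)

        SLOW, FAST = 600, 30   # seconds between samples
        RISE_ALARM = 5.0       # rise in cm per interval that triggers event mode

        last = read_level_cm()
        for _ in range(12):
            level = read_level_cm()
            interval = FAST if level - last >= RISE_ALARM else SLOW
            print(f"level={level:.1f} cm, next sample in {interval} s")
            last = level
            time.sleep(interval)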

  4. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    DTIC Science & Technology

    2017-01-01

    files, organized by location. The data were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis library. (ERDC/CHL TR-17-2, Coastal Inlets Research Program, January 2017; Brandan M. Scully, Coastal and Hydraulics Laboratory.)

  5. MxCuBE: a synchrotron beamline control environment customized for macromolecular crystallography experiments

    PubMed Central

    Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren; Bowler, Matthew W.; Brockhauser, Sandor; Flot, David; Gordon, Elspeth J.; Hall, David R.; Lavault, Bernard; McCarthy, Andrew A.; McCarthy, Joanne; Mitchell, Edward; Monaco, Stéphanie; Mueller-Dieckmann, Christoph; Nurizzo, Didier; Ravelli, Raimond B. G.; Thibault, Xavier; Walsh, Martin A.; Leonard, Gordon A.; McSweeney, Sean M.

    2010-01-01

    The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. PMID:20724792

  6. Making automated computer program documentation a feature of total system design

    NASA Technical Reports Server (NTRS)

    Wolf, A. W.

    1970-01-01

    It is pointed out that in large-scale computer software systems, program documents are too often fraught with errors, out of date, poorly written, and sometimes nonexistent in whole or in part. The means are described by which many of these typical system documentation problems were overcome in a large and dynamic software project. A systems approach was employed which encompassed such items as: (1) configuration management; (2) standards and conventions; (3) collection of program information into central data banks; (4) interaction among executive, compiler, central data banks, and configuration management; and (5) automatic documentation. A complete description of the overall system is given.

  7. A robust and non-obtrusive automatic event tracking system for operating room management to improve patient care.

    PubMed

    Huang, Albert Y; Joerger, Guillaume; Salmon, Remi; Dunkin, Brian; Sherman, Vadim; Bass, Barbara L; Garbey, Marc

    2016-08-01

    Optimization of OR management is a complex problem, as each OR has different procedures throughout the day, inevitably resulting in scheduling delays, variations in time durations and overall suboptimal performance. There exists a need for a system that automatically tracks procedural progress in the OR in real time. This would allow for efficient monitoring of operating room states and the targeting of sources of inefficiency and points of improvement. We placed three wireless sensors (a floor-mounted pressure sensor, a ventilator-mounted bellows motion sensor and ambient light detector, and a general room motion detector) in two ORs at our institution and tracked cases 24 h a day for over 4 months. We collected data on 238 total cases (107 laparoscopic cases). A total of 176 turnover times were also captured, and we found that the average turnover time between cases was 35 min while the institutional goal was 30 min. Deeper examination showed that 38% of laparoscopic cases had some aspect of suboptimal activity, with the time between extubation and the patient exiting the OR being the biggest contributor (16%). Our automated system allows for robust, wireless, real-time OR monitoring as well as data collection and retrospective data analyses. We plan to continue expanding our system and to project the data in real time for all OR personnel to see. At the same time, we plan on adding key pieces of technology, such as RFID and other radio-frequency systems, to track patients and physicians to further increase efficiency and patient safety.

  8. Time-critical Database Condition Data Handling in the CMS Experiment During the First Data Taking Period

    NASA Astrophysics Data System (ADS)

    Cavallari, Francesca; de Gruttola, Michele; Di Guida, Salvatore; Govi, Giacomo; Innocente, Vincenzo; Pfeiffer, Andreas; Pierro, Antonio

    2011-12-01

    Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation the database monitor is used to provide simple timing information and the history of all transactions for all database accounts, and in the case of faults it is used to return simple error messages and more complete debugging information.

  9. Calling on a million minds for community annotation in WikiProteins

    PubMed Central

    Mons, Barend; Ashburner, Michael; Chichester, Christine; van Mulligen, Erik; Weeber, Marc; den Dunnen, Johan; van Ommen, Gert-Jan; Musen, Mark; Cockerill, Matthew; Hermjakob, Henning; Mons, Albert; Packer, Abel; Pacheco, Roberto; Lewis, Suzanna; Berkeley, Alfred; Melton, William; Barris, Nickolas; Wales, Jimmy; Meijssen, Gerard; Moeller, Erik; Roes, Peter Jan; Borner, Katy; Bairoch, Amos

    2008-01-01

    WikiProteins enables community annotation in a Wiki-based system. Extracts of major data sources have been fused into an editable environment that links out to the original sources. Data from community edits create automatic copies of the original data. Semantic technology captures concepts co-occurring in one sentence and thus potential factual statements. In addition, indirect associations between concepts have been calculated. We call on a 'million minds' to annotate a 'million concepts' and to collect facts from the literature with the reward of collaborative knowledge discovery. The system is available for beta testing at . PMID:18507872

  10. 77 FR 58170 - Proposed Renewal of Existing Information Collection; Fire Protection (Underground Coal Mines)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-19

    ... the locations of automatic fire warning sensors and the intended air flow direction at these locations...) requires that a qualified person examine the automatic fire sensor and warning device systems on a weekly....1103-8(b) requires that a record of the weekly automatic fire sensor functional tests be maintained by...

  11. INFORMATION STORAGE AND RETRIEVAL, REPORTS ON EVALUATION PROCEDURES AND RESULTS 1965-1967.

    ERIC Educational Resources Information Center

    Salton, Gerald

    A detailed analysis of the retrieval evaluation results obtained with the automatic SMART document retrieval system for document collections in the fields of aerodynamics, computer science, and documentation is given in this report. The various components of fully automatic document retrieval systems are discussed in detail, including the forms of…

  12. Data transmission system with distributed microprocessors

    DOEpatents

    Nambu, Shigeo

    1985-01-01

    A data transmission system having a common request line and a special request line in addition to a transmission line. The special request line has priority over the common request line. A plurality of node stations are multi-drop connected to the transmission line. Among the node stations, a supervising station is connected to the special request line and takes precedence over other slave stations to become a master station. The master station collects data from the slave stations. The station connected to the common request line can assign the master control function to any station requesting it within a short period of time. Each station has an auto response control circuit. The master station automatically collects data through the auto response control circuit independently of the microprocessors of the slave stations.

  13. Nondestructive Vibratory Testing and Evaluation Procedure for Military Roads and Streets.

    DTIC Science & Technology

    1984-07-01

    the addition of an automatic data acquisition system to the instrumentation control panel. This system, presently available, would automatically… the data used to further develop and define the basic correlations. c. Consideration be given to installing an automatic data acquisition system to… glows red any time the force generator is not fully elevated. Depressing this switch will stop the automatic cycle at any point and clear all system

  14. High-speed railway signal trackside equipment patrol inspection system

    NASA Astrophysics Data System (ADS)

    Wu, Nan

    2018-03-01

    The high-speed railway signal trackside equipment patrol inspection system comprehensively applies TDI (time delay integration), a high-speed and highly responsive CMOS architecture, low-illumination photosensitive techniques, image data compression, machine vision and related technologies. Installed on a high-speed railway inspection train, it collects, manages and analyses images of the appearance of signal trackside equipment while the train is running. The system automatically filters the signal trackside equipment images out of a large volume of background images and identifies equipment changes by comparison with the original image data. Combining ledger data with train location information, the system accurately locates the trackside equipment, guiding maintenance.

  15. Inexpensive automated paging system for use at remote research sites

    USGS Publications Warehouse

    Sargent, S.L.; Dey, W.S.; Keefer, D.A.

    1998-01-01

    The use of a flow-activated automatic sampler at a remote research site required personnel to periodically visit the site to collect samples and reset the automatic sampler. To reduce site visits, a cellular telephone was modified for activation by a datalogger. The purpose of this study was to demonstrate the use and benefit of the modified telephone. Both the power switch and the speed-dial button on the telephone were bypassed and wired to a relay driver. The datalogger was programmed to compare values of a monitored environmental parameter with a target value. When the target value was reached or exceeded, the datalogger pulsed a relay driver, activating power to the telephone. A separate relay activated the speed dial, dialing the number of a tone-only pager. The use of this system has saved time and reduced travel costs by reducing the number of trips to the site, without the loss of any data.

  16. SCAN+

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth Krebs, John Svoboda

    2009-11-01

    SCAN+ is a software application specifically designed to control the positioning of a gamma spectrometer by a two-dimensional translation system above spent fuel bundles located in a sealed spent fuel cask. The gamma spectrometer collects gamma spectrum information for the purpose of spent fuel cask fuel loading verification. SCAN+ performs manual and automatic gamma spectrometer positioning functions as well as exercising control of the gamma spectrometer data acquisition functions. Cask configuration files are used to determine the positions of spent fuel bundles. Cask scanning files are used to determine the desired scan paths for scanning a spent fuel cask, allowing for automatic unattended cask scanning that may take several hours.

  17. Fast massive preventive security and information communication systems

    NASA Astrophysics Data System (ADS)

    Akopian, David; Chen, Philip; Miryakar, Susheel; Kumar, Abhinav

    2008-04-01

    We present a fast, massive information communication system for data collection from distributed sources such as cell phone users. One important application is preventive notification systems, where timely notification and evidence communication can help to improve safety and security through wide public involvement, by ensuring easy-to-access and easy-to-communicate information systems. The technology significantly simplifies the response to events and will help, e.g., special agencies to gather crucial information in time and respond as quickly as possible. Cellular phones are nowadays affordable for most residents and have become a common personal accessory. The paper describes several ways to design such systems, including the existing internet access capabilities of cell phones and downloadable specialized software, and we provide examples of such designs. The main idea is to structure information in a predetermined way and communicate it through a centralized gate-server which automatically processes information and forwards it to the proper destination. The gate-server eliminates the need to know contact data and the specific local community infrastructure. All cell phones will have self-localizing capability according to the FCC E911 mandate, so the communicated information can be further tagged automatically with location and time information.

  18. EPA requirements and programs

    NASA Technical Reports Server (NTRS)

    Koutsandreas, J. D.

    1975-01-01

    The proposed ERTS-DCS system is designed to give EPA the capability to evaluate, through demonstrable hardware, the effectiveness of automated data collection techniques. The total effectiveness of any system depends on many factors, including equipment cost, installation, maintainability, logistic support, growth potential, flexibility and failure rate. This can best be accomplished by installing the system at an operational environmental control agency (CAMP station) to ensure that valid data are being obtained and processed. Consequently, it is imperative that the equipment interface not compromise the validity of the sensor data, nor should the experimental system affect the present operations of the CAMP station. Since both the system presently in use and the automatic system would operate in parallel, confirmation and comparison are readily obtained.

  19. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software supports various scoring systems (Larsen score and Ratingen-Rau score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, semi-automatic image analysis for joint detection and measurement of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs of hands and feet from more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. Use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.

  20. Analysing trends and forecasting malaria epidemics in Madagascar using a sentinel surveillance network: a web-based application.

    PubMed

    Girond, Florian; Randrianasolo, Laurence; Randriamampionona, Lea; Rakotomanana, Fanjasoa; Randrianarivelojosia, Milijaona; Ratsitorahina, Maherisoa; Brou, Télesphore Yao; Herbreteau, Vincent; Mangeas, Morgan; Zigiumugabe, Sixte; Hedje, Judith; Rogier, Christophe; Piola, Patrice

    2017-02-13

    The use of a malaria early warning system (MEWS) to trigger prompt public health interventions is a key step in adding value to the epidemiological data routinely collected by sentinel surveillance systems. This study describes a system using various epidemic thresholds and a forecasting component with the support of new technologies to improve the performance of a sentinel MEWS. Malaria-related data from 21 sentinel sites collected by Short Message Service are automatically analysed to detect malaria trends and malaria outbreak alerts with automated feedback reports. Roll Back Malaria partners can, through a user-friendly web-based tool, visualize potential outbreaks and generate a forecasting model. The system already demonstrated its ability to detect malaria outbreaks in Madagascar in 2014. This approach aims to maximize the usefulness of a sentinel surveillance system to predict and detect epidemics in limited-resource environments.
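
    A common family of epidemic thresholds behind such early warning systems compares the current weekly case count with the historical mean plus a multiple of the standard deviation for the same calendar week. The sketch below shows that rule with invented counts; the Madagascar system combines several thresholds with a forecasting component not shown here.

        # Sketch of a simple weekly epidemic threshold: alert when the current
        # count exceeds the historical mean + 2 standard deviations. Invented data.
        from statistics import mean, stdev

        def outbreak_alert(history: list, current: int, k: float = 2.0) -> bool:
            return current > mean(history) + k * stdev(history)

        week_23_history = [14, 9, 17, 12, 11]  # same week in five previous years
        print(outbreak_alert(week_23_history, current=31))  # True -> raise alert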

  1. 78 FR 55060 - Proposed Information Collection; Comment Request; Expanded Vessel Monitoring System Requirement...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-09

    ... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; Expanded Vessel Monitoring System Requirement in the Pacific Coast Groundfish... and use a vessel monitoring system (VMS) that automatically sends hourly position reports. Exemptions...

  2. Audio-Visual Situational Awareness for General Aviation Pilots

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Lodha, Suresh K.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Weather is one of the major causes of general aviation accidents. Researchers are addressing this problem from various perspectives including improving meteorological forecasting techniques, collecting additional weather data automatically via on-board sensors and "flight" modems, and improving weather data dissemination and presentation. We approach the problem from the improved presentation perspective and propose weather visualization and interaction methods tailored for general aviation pilots. Our system, Aviation Weather Data Visualization Environment (AWE), utilizes information visualization techniques, a direct manipulation graphical interface, and a speech-based interface to improve a pilot's situational awareness of relevant weather data. The system design is based on a user study and feedback from pilots.

  3. Streamlining Metadata and Data Management for Evolving Digital Libraries

    NASA Astrophysics Data System (ADS)

    Clark, D.; Miller, S. P.; Peckman, U.; Smith, J.; Aerni, S.; Helly, J.; Sutton, D.; Chase, A.

    2003-12-01

    What began two years ago as an effort to stabilize the Scripps Institution of Oceanography (SIO) data archives from more than 700 cruises going back 50 years, has now become the operational fully-searchable "SIOExplorer" digital library, complete with thousands of historic photographs, images, maps, full text documents, binary data files, and 3D visualization experiences, totaling nearly 2 terabytes of digital content. Coping with data diversity and complexity has proven to be more challenging than dealing with large volumes of digital data. SIOExplorer has been built with scalability in mind, so that the addition of new data types and entire new collections may be accomplished with ease. It is a federated system, currently interoperating with three independent data-publishing authorities, each responsible for their own quality control, metadata specifications, and content selection. The IT architecture implemented at the San Diego Supercomputer Center (SDSC) streamlines the integration of additional projects in other disciplines with a suite of metadata management and collection building tools for "arbitrary digital objects." Metadata are automatically harvested from data files into domain-specific metadata blocks, and mapped into various specification standards as needed. Metadata can be browsed and objects can be viewed onscreen or downloaded for further analysis, with automatic proprietary-hold request management.

  4. 76 FR 58301 - Proposed Extension of Existing Information Collection; Automatic Fire Sensor and Warning Device...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    .... Sec. 75.1103-8(b) and (c). MSHA expects to subsume these provisions into OMB 1219-0054, Fire....1103-5(a)(2)(ii) Automatic fire sensor and warning device systems and the package is at OMB for its 3.... 75.1103-5(a) Automatic fire warning devices; actions, response in October 2010; OMB 1219-0127...

  5. Renewable Energy at NASA's Johnson Space Center

    NASA Technical Reports Server (NTRS)

    McDowall, Lindsay

    2014-01-01

    NASA's Johnson Space Center has implemented a great number of renewable energy systems. Renewable energy systems are necessary to research and implement if we humans are expected to continue to grow and thrive on this planet. These systems generate energy using renewable sources - water, wind, sun - things that we will not run out of. Johnson Space Center is helping to pave the way by installing and studying various renewable energy systems. The objective of this report is to examine the completed renewable energy projects at NASA's Johnson Space Center over a span of ten years, beginning in 2003 and ending in early 2014. The report analyzes the success of each project based on actual vs. projected savings and actual vs. projected efficiency. Additionally, both positive and negative experiences are documented so that lessons may be learned from past experiences. NASA is incorporating renewable energy wherever it can, including into buildings. According to the 2012 JSC Annual Sustainability Report, there are 321,660 square feet of green building space on JSC's campus, and the two projects discussed here are major contributors to that statistic. These buildings were designed to meet various Leadership in Energy and Environmental Design (LEED) certification criteria; LEED-certified buildings use 30 to 50 percent less energy and water compared to non-LEED buildings. The objectives of this project were to examine data from the renewable energy systems in two of the green buildings onsite - Building 12 and Building 20. In Building 12, data were examined from the solar photovoltaic arrays; in Building 20, from the solar water heater system. By examining the data from the two buildings, it could be determined whether the renewable energy systems are operating efficiently. In Building 12, the data from the solar photovoltaic arrays show that the system is continuously collecting energy from the sun. Building 12 has two solar inverters, located on the second floor, that collected the data from the solar photovoltaic arrays; the data examined are the total energy produced by the system. These are cumulative amounts, so the last recorded point shows all of the energy collected since the start of the system's operation. These data were manually collected from the solar inverters; the data are also automatically recorded through EBI. Analysis of both sets of data determined that the EBI data were faulty. For example, the manually collected data show that a total of 73 kWh of energy was collected between 1/16/2014 and 1/22/2014, whereas the EBI data report that approximately 17,800 kWh was collected during the same time frame. Not only is this far more than the manually recorded total for that week, it also exceeds the total energy collected since the start of collection as recorded from the inverters, which points to a malfunction in the automatic recording of the energy. In Building 20, data from the solar water heater dating back many months were examined, and it was found that the pump for the solar water heater system was not operating properly. The pump operates on a solar energy system, meaning that it collects energy throughout the day from the sun; because of this, the pump should stop operating shortly after the sun sets, for lack of sunlight, and the recorded flow rate should drop to zero. The data show instead that the pump operates continuously, even during the night. It was also observed that the majority of the time the pump would not turn on at all, despite good weather conditions. This led to the conclusion that the pump is malfunctioning and needs to be examined and fixed.

  6. Machine-Aided Indexing in Practice: An Encounter with Automatic Indexing of the Third Kind.

    ERIC Educational Resources Information Center

    Klingbiel, Paul H.

    This three-part report includes a brief history of the Defense Documentation Center (DDC) with a description of the collections and their accessibility; categorization of automatic indexing into three kinds with a brief description of the DDC system of machine-aided indexing; and an indication of some operational experiences with the system.…

  7. An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.

    2009-12-01

    For several years GFZ Potsdam has been operating a global earthquake monitoring system. Since the beginning of 2008, this system has also been used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS), the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project and capable of acquiring, archiving and processing real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore not only real-time waveform data are routed to the attached warning centers through GFZ, but also processing results. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network - GEVN) in Europe was substantially extended by adding many stations from Western European countries, optimizing the station distribution. In parallel to data collection over the Internet, a GFZ VSAT hub for secured data collection from the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and the first data links were established through this backbone. For the Southeast Asia region, a VSAT hub was established in Jakarta already in 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by email and SMS. Manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy, since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has been shown to work well as a first step in starting a tsunami warning process. However, for the secure assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers all have a reliable 24/7 service.

  8. Popular song and lyrics synchronization and its application to music information retrieval

    NASA Astrophysics Data System (ADS)

    Chen, Kai; Gao, Sheng; Zhu, Yongwei; Sun, Qibin

    2006-01-01

    An automatic synchronization system for popular songs and their lyrics is presented in this paper. The system includes two main components: a) automatic detection of vocal/non-vocal segments in the audio signal and b) automatic alignment of the acoustic signal of the song with its lyrics using speech recognition techniques, positioning the boundaries of the lyrics in the acoustic realization at multiple levels simultaneously (e.g. the word/syllable level and the phrase level). GMM models and a set of HMM-based acoustic model units are carefully designed and trained for the detection and alignment. To eliminate the severe mismatch due to the diversity of musical signals and the sparse training data available, an unsupervised adaptation technique, maximum likelihood linear regression (MLLR), is exploited to tailor the models to the real environment, which improves the robustness of the synchronization system. To further reduce the effect of missed non-vocal music on alignment, a novel grammar net is built to direct the alignment. To our knowledge, this is the first automatic synchronization system based only on low-level acoustic features such as MFCCs. We evaluate the system on a Chinese song dataset collected from 3 popular singers. We obtain 76.1% boundary accuracy at the syllable level (BAS) and 81.5% boundary accuracy at the phrase level (BAP) using fully automatic vocal/non-vocal detection and alignment. The synchronization system has many applications, such as multi-modality (audio and textual) content-based popular song browsing and retrieval. Through this study, we would like to open up the discussion of some challenging problems in developing a robust synchronization system for large-scale databases.
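
    The vocal/non-vocal front end, MFCC features scored by two competing GMMs, is straightforward to prototype. A sketch with librosa and scikit-learn follows; the file names are placeholders for labelled training segments, and the MLLR adaptation step is omitted.

        # Sketch of GMM-based vocal/non-vocal detection on MFCC features.
        # "vocal.wav" / "non_vocal.wav" stand in for labelled training audio.
        import librosa
        import numpy as np
        from sklearn.mixture import GaussianMixture

        def mfcc_frames(path: str) -> np.ndarray:
            y, sr = librosa.load(path, sr=16000)
            return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T  # frames x coeffs

        vocal_gmm = GaussianMixture(n_components=8, random_state=0).fit(mfcc_frames("vocal.wav"))
        music_gmm = GaussianMixture(n_components=8, random_state=0).fit(mfcc_frames("non_vocal.wav"))

        test = mfcc_frames("song_excerpt.wav")
        is_vocal = vocal_gmm.score_samples(test) > music_gmm.score_samples(test)
        print(f"{is_vocal.mean():.0%} of frames classified as vocal")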

  9. Automatic reconstruction of a bacterial regulatory network using Natural Language Processing

    PubMed Central

    Rodríguez-Penagos, Carlos; Salgado, Heladia; Martínez-Flores, Irma; Collado-Vides, Julio

    2007-01-01

    Background: Manual curation of biological databases, an expensive and labor-intensive process, is essential for high quality integrated data. In this paper we report the implementation of a state-of-the-art Natural Language Processing system that creates computer-readable networks of regulatory interactions directly from different collections of abstracts and full-text papers. Our major aim is to understand how automatic annotation using Text-Mining techniques can complement manual curation of biological databases. We implemented a rule-based system to generate networks from different sets of documents dealing with regulation in Escherichia coli K-12. Results: Performance evaluation is based on the most comprehensive transcriptional regulation database for any organism, the manually-curated RegulonDB, 45% of which we were able to recreate automatically. From our automated analysis we were also able to find some new interactions from papers not already curated, or that were missed in the manual filtering and review of the literature. We also put forward a novel Regulatory Interaction Markup Language better suited than SBML for simultaneously representing data of interest for biologists and text miners. Conclusion: Manual curation of the output of automatic processing of text is a good way to complement a more detailed review of the literature, either for validating the results of what has been already annotated, or for discovering facts and information that might have been overlooked at the triage or curation stages. PMID:17683642
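
    A rule-based extractor of this kind can be caricatured as sentence patterns over text. The toy sketch below uses one regular-expression family; the actual system applies a full NLP pipeline with tagging, parsing and curated dictionaries.

        # Toy rule-based extraction of regulatory interactions of the form
        # "<TF> activates/represses <gene>". Example sentences are invented.
        import re

        PATTERN = re.compile(
            r"\b([A-Z][A-Za-z0-9]+) (activates|represses|regulates) ([A-Za-z][A-Za-z0-9]*)"
        )

        text = ("CRP activates araB under low glucose. "
                "FNR represses ndh during aerobic growth.")

        network = PATTERN.findall(text)
        print(network)  # [('CRP', 'activates', 'araB'), ('FNR', 'represses', 'ndh')]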

  10. Classifying seismic waveforms from scratch: a case study in the alpine environment

    NASA Astrophysics Data System (ADS)

    Hammer, C.; Ohrnberger, M.; Fäh, D.

    2013-01-01

    Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step for successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task. Applying standard detection techniques like the STA/LTA trigger still requires manual control for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows the classification to be started from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action allows classifier properties to be learned from a single waveform example and some hours of background recording. Besides a reduction of the required workload, this also enables the detection of very rare events. Especially the latter feature provides a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast set-up of a well-working classification system.
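
    A minimal sketch of the approach, assuming the hmmlearn package and synthetic stand-ins for the per-class feature sequences: one Gaussian HMM is fitted per event class, and a candidate waveform is assigned to the class whose model scores it highest.

        import numpy as np
        from hmmlearn import hmm

        def fit_class_model(features, n_states=4):
            # features: (n_frames, n_dims) feature sequence for one event class.
            model = hmm.GaussianHMM(n_components=n_states,
                                    covariance_type="diag", n_iter=50)
            model.fit(features)
            return model

        rng = np.random.default_rng(0)
        train = {"rockfall": rng.normal(0.0, 1.0, (200, 6)),   # stand-ins for
                 "quake": rng.normal(1.0, 2.0, (200, 6)),      # real spectral
                 "noise": rng.normal(0.0, 0.3, (200, 6))}      # features
        models = {name: fit_class_model(f) for name, f in train.items()}

        def classify(features):
            # Pick the class whose HMM assigns the highest log-likelihood.
            return max(models, key=lambda name: models[name].score(features))

        print(classify(rng.normal(0.0, 1.0, (50, 6))))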

  11. Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera.

    PubMed

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft's Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation.
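
    The tie-point matching at the heart of image-based point cloud generation can be sketched with OpenCV (an assumption; neither SOCET SET nor Photosynth internals are described in the record): ORB features are detected in two overlapping frames and matched with a cross-checked brute-force matcher. The frame file names are hypothetical.

        import cv2

        img1 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("frame_002.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=4000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        # Cross-checked Hamming matching keeps mutually-best tie points only.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

        # Matched pixel coordinates feed image orientation and dense matching.
        pts1 = [kp1[m.queryIdx].pt for m in matches]
        pts2 = [kp2[m.trainIdx].pt for m in matches]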

  12. Research on Application of Automatic Weather Station Based on Internet of Things

    NASA Astrophysics Data System (ADS)

    Jianyun, Chen; Yunfan, Sun; Chunyan, Lin

    2017-12-01

    In this paper, the Internet of Things is briefly introduced, and its application to weather stations is then studied. A method of data acquisition and transmission based on the NB-IoT communication mode is proposed. With Internet of Things technology, digital sensors and independent power supplies as the technical basis, the intelligent interconnection of the automatic weather station is realized, forming an automatic weather station based on the Internet of Things. A network structure for the automatic weather station based on Internet of Things technology is constructed to realize the independent operation of intelligent sensors and wireless data transmission. Research on the networked collection and dissemination of meteorological data, with analysis through the data platform, lays the groundwork for meteorological information publishing standards and provides the data interface for networked meteorological information receiving terminals, serving the aims of the smart city and smart meteorological services.
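
    The record does not detail the application protocol above NB-IoT; a common pattern for such telemetry is periodic upload of readings to the data platform. The sketch below posts JSON over HTTP with the requests library; the endpoint, station ID and fixed sensor values are hypothetical, and a production deployment might instead use MQTT or CoAP.

        import time
        import requests

        ENDPOINT = "https://dataplatform.example.org/api/obs"  # hypothetical
        STATION = "aws-042"                                    # hypothetical

        while True:
            # Fixed values stand in for the station's digital sensor drivers.
            obs = {"station": STATION, "t": time.time(),
                   "temp_c": 21.4, "rh_pct": 63.0, "wind_ms": 3.2}
            requests.post(ENDPOINT, json=obs, timeout=10)
            time.sleep(60)  # one observation per minute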

  13. Activity classification using realistic data from wearable sensors.

    PubMed

    Pärkkä, Juha; Ermes, Miikka; Korpipää, Panu; Mäntyjärvi, Jani; Peltola, Johannes; Korhonen, Ilkka

    2006-01-01

    Automatic classification of everyday activities can be used for promotion of health-enhancing physical activities and a healthier lifestyle. In this paper, methods used for classification of everyday activities such as walking, running, and cycling are described. The aim of the study was to find out how to recognize activities, which sensors are useful and what kind of signal processing and classification is required. A large and realistic library of sensor data was collected. Sixteen test persons took part in the data collection, resulting in approximately 31 h of annotated, 35-channel data recorded in an everyday environment. The test persons carried a set of wearable sensors while performing several activities during the 2-h measurement session. Classification results of three classifiers are shown: a custom decision tree, an automatically generated decision tree, and an artificial neural network. The classification accuracies using leave-one-subject-out cross validation range from 58 to 97% for the custom decision tree classifier, from 56 to 97% for the automatically generated decision tree, and from 22 to 96% for the artificial neural network. Total classification accuracy is 82% for the custom decision tree classifier, 86% for the automatically generated decision tree, and 82% for the artificial neural network.
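
    The evaluation protocol can be made concrete with a short sketch, assuming scikit-learn and synthetic stand-ins for the windowed sensor features, per-window activity labels and subject IDs: leave-one-subject-out cross-validation of a decision tree trains on fifteen subjects and tests on the held-out one.

        import numpy as np
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1600, 35))           # windowed features, 35 channels
        y = rng.integers(0, 5, size=1600)         # activity label per window
        subjects = np.repeat(np.arange(16), 100)  # 16 test persons

        scores = cross_val_score(DecisionTreeClassifier(max_depth=8), X, y,
                                 groups=subjects, cv=LeaveOneGroupOut())
        print(scores.min(), scores.mean(), scores.max())  # per-subject spread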

  14. Passenger Flow Estimation and Characteristics Expansion

    DOT National Transportation Integrated Search

    2016-04-01

    Mark R. McCord (ORCID ID 0000-0002-6293-3143) The objectives of this study are to investigate the estimation of bus passenger boarding-to-alighting (B2A) flows using Automatic Passenger Count (APC) and Automatic Fare Collection (AFC) (fare-box) data ...

  15. [Research on automatic external defibrillator based on DSP].

    PubMed

    Jing, Jun; Ding, Jingyan; Zhang, Wei; Hong, Wenxue

    2012-10-01

    Electrical defibrillation is the most effective way to treat ventricular tachycardia (VT) and ventricular fibrillation (VF). An automatic external defibrillator based on DSP is introduced in this paper. The whole design consists of the signal collection module, the microprocessor control module, the display module, the defibrillation module and the automatic recognition algorithm for VF and non-VF, etc. This automatic external defibrillator achieves real-time ECG signal acquisition, synchronous ECG waveform display, data transfer to a USB flash drive, and automatic defibrillation when a shockable rhythm appears.

  16. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
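
    A toy rendition of the evaluation flow (the patent does not disclose its mathematical model, so the step parameters and metrics below are invented for illustration): per-step data are compiled, summed, and rolled up into aggregate process metrics.

        from dataclasses import dataclass

        @dataclass
        class ProcessStep:
            name: str
            cycle_time_min: float  # touch time per unit
            queue_time_min: float  # waiting time before the step

        def evaluate(steps):
            # Compile per-step data, then sum it into aggregate metrics.
            touch = sum(s.cycle_time_min for s in steps)
            queue = sum(s.queue_time_min for s in steps)
            lead = touch + queue
            return {"lead_time_min": lead, "touch_time_min": touch,
                    "process_cycle_efficiency": touch / lead}

        print(evaluate([ProcessStep("cut", 2.0, 30.0),
                        ProcessStep("weld", 5.0, 90.0),
                        ProcessStep("paint", 3.0, 60.0)]))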

  17. A new standardized data collection system for interdisciplinary thyroid cancer management: Thyroid COBRA.

    PubMed

    Tagliaferri, Luca; Gobitti, Carlo; Colloca, Giuseppe Ferdinando; Boldrini, Luca; Farina, Eleonora; Furlan, Carlo; Paiar, Fabiola; Vianello, Federica; Basso, Michela; Cerizza, Lorenzo; Monari, Fabio; Simontacchi, Gabriele; Gambacorta, Maria Antonietta; Lenkowicz, Jacopo; Dinapoli, Nicola; Lanzotti, Vito; Mazzarotto, Renzo; Russi, Elvio; Mangoni, Monica

    2018-07-01

    The big data approach offers a powerful alternative to evidence-based medicine: it could guide cancer management thanks to the application of machine learning to large-scale data. The aim of the Thyroid COBRA (Consortium for Brachytherapy Data Analysis) project is to develop a standardized web data collection system focused on thyroid cancer. The Metabolic Radiotherapy Working Group of the Italian Association of Radiation Oncology (AIRO) endorsed the implementation of a consortium directed at thyroid cancer management and data collection. The agreement conditions, the ontology of the collected data and the related software services were defined by a multicentre ad hoc working group (WG). Six Italian cancer centres initially started the project and defined and signed the Thyroid COBRA consortium agreement. Three data set tiers were identified: Registry, Procedures and Research. The COBRA Storage System (C-SS) appeared not to be time-consuming and to respect privacy, as data can be extracted directly from each centre's storage platform through a secured connection that ensures reliable encryption of sensitive data. Automatic data archiving could be performed directly from the hospital image storage system or the radiotherapy treatment planning systems. The C-SS architecture will allow "cloud storage" or "distributed learning" approaches for predictive model definition and the further development of clinical decision support tools. The development of the Thyroid COBRA data storage system C-SS through a multicentre consortium approach appeared to be a feasible tool for setting up a complex, privacy-preserving data sharing system oriented to the management of thyroid cancer and, in the near future, of every cancer type. Copyright © 2018 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  18. Description of a Remote Ionospheric Scintillation Data Collection Facility

    DOT National Transportation Integrated Search

    1973-03-01

    An experimental technique is described which measures L-band ionospheric scintillation at a remote, unmanned site. Details of an automatic data collection facility are presented. The remote facility comprises an L-band receiver, and a complete VHF co...

  19. Electronic Toll And Traffic Management Systems, National Cooperative Highway Research Program Synthesis

    DOT National Transportation Integrated Search

    1993-01-01

    Keywords: electronic toll collection (ETC) and traffic management (ETTM), automatic vehicle identification (AVI). Electronic toll collection and traffic management (ETTM) systems are not a futuristic dream; they are operating or are being tested today i...

  20. Automation, Operation, and Data Analysis in the Cryogenic, High Accuracy, Refraction Measuring System (CHARMS)

    NASA Technical Reports Server (NTRS)

    Frey, Bradley J.; Leviton, Douglas B.

    2005-01-01

    The Cryogenic High Accuracy Refraction Measuring System (CHARMS) at NASA's Goddard Space Flight Center has been enhanced in a number of ways in the last year to allow the system to accurately collect refracted beam deviation readings automatically over a range of temperatures from 15 K to well beyond room temperature with high sampling density in both wavelength and temperature. The engineering details which make this possible are presented. The methods by which the most accurate angular measurements are made and the corresponding data reduction methods used to reduce thousands of observed angles to a handful of refractive index values are also discussed.

  1. Automation, Operation, and Data Analysis in the Cryogenic, High Accuracy, Refraction Measuring System (CHARMS)

    NASA Technical Reports Server (NTRS)

    Frey, Bradley; Leviton, Douglas

    2005-01-01

    The Cryogenic High Accuracy Refraction Measuring System (CHARMS) at NASA's Goddard Space Flight Center has been enhanced in a number of ways in the last year to allow the system to accurately collect refracted beam deviation readings automatically over a range of temperatures from 15 K to well beyond room temperature with high sampling density in both wavelength and temperature. The engineering details which make this possible are presented. The methods by which the most accurate angular measurements are made and the corresponding data reduction methods used to reduce thousands of observed angles to a handful of refractive index values are also discussed.

  2. A laser technique for characterizing the geometry of plant canopies

    NASA Technical Reports Server (NTRS)

    Vanderbilt, V. C.; Silva, L. F.; Bauer, M. E.

    1977-01-01

    The interception of solar power by the canopy is investigated as a function of solar zenith angle (time), component of the canopy, and depth into the canopy. The projected foliage area, cumulative leaf area, and view factors within the canopy are examined as a function of the same parameters. Two systems are proposed that are capable of describing the geometrical aspects of a vegetative canopy and of operation in an automatic mode. Either system would provide sufficient data to yield a numerical map of the foliage area in the canopy. Both systems would involve the collection of large data sets in a short time period using minimal manpower.

  3. Remote Safety Monitoring for Elderly Persons Based on Omni-Vision Analysis

    PubMed Central

    Xiang, Yun; Tang, Yi-ping; Ma, Bao-qing; Yan, Hang-chen; Jiang, Jun; Tian, Xu-yuan

    2015-01-01

    Remote monitoring service for elderly persons is important as the aged populations in most developed countries continue growing. To monitor the safety and health of the elderly population, we propose a novel omni-directional vision sensor based system, which can detect and track object motion, recognize human posture, and analyze human behavior automatically. In this work, we have made the following contributions: (1) we develop a remote safety monitoring system which can provide real-time and automatic health care for elderly persons and (2) we design a novel motion history/energy image based algorithm for moving object tracking. Our system can accurately and efficiently collect, analyze, and transfer elderly activity information and provide health care in real time. Experimental results show that our technique can improve the data analysis efficiency by 58.5% for object tracking. Moreover, for the human posture recognition application, the success rate can reach 98.6% on average. PMID:25978761

  4. Computer assisted data analysis in intensive care: the ICDEV project--development of a scientific database system for intensive care (Intensive Care Data Evaluation Project).

    PubMed

    Metnitz, P G; Laback, P; Popow, C; Laback, O; Lenz, K; Hiesmayr, M

    1995-01-01

    Patient Data Management Systems (PDMS) for ICUs collect, present and store clinical data. Analysis of these digitally stored data is desirable for various purposes, such as quality control or research. The aim of the Intensive Care Data Evaluation project (ICDEV) was to provide a database tool for the analysis of data recorded at various ICUs of the University Clinics of Vienna, General Hospital of Vienna, where two different PDMSs are used: CareVue 9000 (Hewlett Packard, Andover, USA) at two ICUs (one medical ICU and one neonatal ICU) and PICIS Chart+ (PICIS, Paris, France) at one cardiothoracic ICU. CONCEPT AND METHODS: Clinically oriented analysis of the data collected in a PDMS at an ICU was the starting point of the development. After defining the database structure we established a client-server based database system under Microsoft Windows NT and developed a user-friendly data querying application using Microsoft Visual C++ and Visual Basic. ICDEV was successfully installed at three different ICUs; adjustments to the different PDMS configurations were done within a few days. The database structure developed by us enables a powerful query concept representing an 'EXPERT QUESTION COMPILER' which may help to answer almost any clinical question. Several program modules facilitate queries at the patient, group and unit level. Results from ICDEV queries are automatically transferred to Microsoft Excel for display (in the form of configurable tables and graphs) and further processing. The ICDEV concept is configurable for adjustment to different intensive care information systems and can be used to support computerized quality control. However, as long as no sufficient artifact recognition or data validation software exists for automatically recorded patient data, the reliability of these data and their usage for computer-assisted quality control remain unclear and should be further studied.

  5. Text recognition and correction for automated data collection by mobile devices

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    Participatory sensing is an approach which allows mobile devices such as mobile phones to be used for data collection, analysis and sharing processes by individuals. Data collection is the first and most important part of a participatory sensing system, but it is time consuming for the participants. In this paper, we discuss automatic data collection approaches for reducing the time required for collection, and increasing the amount of collected data. In this context, we explore automated text recognition on images of store receipts which are captured by mobile phone cameras, and the correction of the recognized text. Accordingly, our first goal is to evaluate the performance of the Optical Character Recognition (OCR) method with respect to data collection from store receipt images. Images captured by mobile phones exhibit some typical problems, and common image processing methods cannot handle some of them. Consequently, the second goal is to address these types of problems through our proposed Knowledge Based Correction (KBC) method used in support of the OCR, and also to evaluate the KBC method with respect to the improvement on the accurate recognition rate. Results of the experiments show that the KBC method improves the accurate data recognition rate noticeably.
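
    The OCR stage plus a dictionary-style correction step can be sketched as follows; pytesseract stands in for the OCR engine and difflib's closest-match lookup for the paper's Knowledge Based Correction, whose actual rules are not reproduced. The receipt image name and product vocabulary are hypothetical.

        import difflib
        from PIL import Image
        import pytesseract

        VOCAB = ["MILK", "BREAD", "BUTTER", "COFFEE", "TOTAL"]  # toy lexicon

        def correct_token(token, cutoff=0.6):
            # Snap a noisy OCR token to its closest vocabulary entry, if any.
            hits = difflib.get_close_matches(token.upper(), VOCAB,
                                             n=1, cutoff=cutoff)
            return hits[0] if hits else token

        raw = pytesseract.image_to_string(Image.open("receipt.jpg"))
        cleaned = " ".join(correct_token(t) for t in raw.split())
        print(cleaned)  # e.g. a misrecognized 'M1LK' becomes 'MILK'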

  6. USGS "Did You Feel It?" internet-based macroseismic intensity maps

    USGS Publications Warehouse

    Wald, D.J.; Quitoriano, V.; Worden, B.; Hopper, M.; Dewey, J.W.

    2011-01-01

    The U.S. Geological Survey (USGS) "Did You Feel It?" (DYFI) system is an automated approach for rapidly collecting macroseismic intensity data from Internet users' shaking and damage reports and generating intensity maps immediately following earthquakes; it has been operating for over a decade (1999-2011). DYFI-based intensity maps made rapidly available through the DYFI system fundamentally depart from more traditional maps made available in the past. The maps are made more quickly, provide more complete coverage and higher resolution, provide for citizen input and interaction, and allow data collection at rates and quantities never before considered. These aspects of Internet data collection, in turn, allow for data analyses, graphics, and ways to communicate with the public, opportunities not possible with traditional data-collection approaches. Yet web-based contributions also pose considerable challenges, as discussed herein. After a decade of operational experience with the DYFI system and users, we document refinements to the processing and algorithmic procedures since DYFI was first conceived. We also describe a number of automatic post-processing tools, operations, applications, and research directions, all of which utilize the extensive DYFI intensity datasets now gathered in near-real time. DYFI can be found online at the website http://earthquake.usgs.gov/dyfi/. © 2011 by the Istituto Nazionale di Geofisica e Vulcanologia.

  7. A new approach to configurable primary data collection.

    PubMed

    Stanek, J; Babkin, E; Zubov, M

    2016-09-01

    The formats, semantics and operational rules of data processing tasks in genomics (and health in general) are highly divergent and can change rapidly. In such an environment, the problem of consistent transformation and loading of heterogeneous input data into various target repositories becomes a critical success factor. The objective of the project was to design a new conceptual approach to configurable data transformation, de-identification, and submission of health and genomic data sets. The main motivation was to facilitate automated or human-driven data uploading, as well as consolidation of heterogeneous sources in large genomic or health projects. Modern methods of on-demand specialization of generic software components were applied. For specification of input-output data and required data collection activities, we propose a simple data model of flat tables as well as a domain-oriented graphical interface and a portable representation of transformations in XML. Using these methods, the prototype of the Configurable Data Collection System (CDCS) was implemented in the Java programming language with Swing graphical interfaces. The core logic of transformations was implemented as a library of reusable plugins. The solution is implemented as a software prototype of a configurable service-oriented system for semi-automatic data collection, transformation, sanitization and safe uploading to heterogeneous data repositories (CDCS). To address the dynamic nature of data schemas and data collection processes, the CDCS prototype facilitates interactive, user-driven configuration of the data collection process and extends its basic functionality with a wide range of third-party plugins. Notably, our solution also allows for the reduction of manual data entry for data originally missing in the output data sets. First experiments and feedback from domain experts confirm that the prototype is flexible, configurable and extensible; runs well on data owners' systems; and is not dependent on vendor standards. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. An Automatic Baseline Regulation in a Highly Integrated Receiver Chip for JUNO

    NASA Astrophysics Data System (ADS)

    Muralidharan, P.; Zambanini, A.; Karagounis, M.; Grewing, C.; Liebau, D.; Nielinger, D.; Robens, M.; Kruth, A.; Peters, C.; Parkalian, N.; Yegin, U.; van Waasen, S.

    2017-09-01

    This paper describes the data processing unit and an automatic baseline regulation of a highly integrated readout chip (Vulcan) for JUNO. The chip collects data continuously at 1 Gsample/s. The primary data processing performed in the integrated circuit can help reduce the memory and data processing effort in the subsequent stages. In addition, a baseline regulator compensating for shifts in the baseline is described.
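
    Baseline regulation amounts to tracking and removing a slowly drifting offset. A minimal digital analogue, not the Vulcan circuit itself (whose implementation the record does not detail), is an exponential moving average used as the baseline estimate:

        import numpy as np

        def regulate_baseline(samples, alpha=0.001):
            # Track slow drift with an exponential moving average; subtract it.
            baseline = float(samples[0])
            out = np.empty(len(samples))
            for i, s in enumerate(samples):
                baseline += alpha * (s - baseline)  # slow follower ignores pulses
                out[i] = s - baseline
            return out

        # Synthetic stream: linear drift plus one pulse near sample 5000.
        t = np.arange(10_000)
        stream = 0.002 * t + np.where(abs(t - 5000) < 20, 50.0, 0.0)
        corrected = regulate_baseline(stream)  # pulse survives, drift is gone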

  9. Systems and methods for analyzing building operations sensor data

    DOEpatents

    Mezic, Igor; Eisenhower, Bryan A.

    2015-05-26

    Systems and methods are disclosed for analyzing building sensor information and decomposing the information therein to a more manageable and more useful form. Certain embodiments integrate energy-based and spectral-based analysis methods with parameter sampling and uncertainty/sensitivity analysis to achieve a more comprehensive perspective of building behavior. The results of this analysis may be presented to a user via a plurality of visualizations and/or used to automatically adjust certain building operations. In certain embodiments, advanced spectral techniques, including Koopman-based operations, are employed to discern features from the collected building sensor data.
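
    To give the spectral side some substance, the sketch below computes a basic dynamic mode decomposition, one common Koopman-based technique, from a snapshot matrix of building sensor readings; the synthetic data and truncation rank are stand-ins, and the patent's actual operations are not reproduced.

        import numpy as np

        def dmd(X, r=4):
            # X: (n_sensors, n_times) snapshots; returns r modes + eigenvalues.
            X1, X2 = X[:, :-1], X[:, 1:]
            U, s, Vh = np.linalg.svd(X1, full_matrices=False)
            U, s, Vh = U[:, :r], s[:r], Vh[:r]
            A_tilde = U.conj().T @ X2 @ Vh.conj().T / s  # reduced operator
            eigvals, W = np.linalg.eig(A_tilde)
            modes = (X2 @ Vh.conj().T / s) @ W           # spatial sensor modes
            return modes, eigvals

        rng = np.random.default_rng(2)
        X = rng.normal(size=(12, 500))  # 12 sensors, 500 samples (synthetic)
        modes, eigvals = dmd(X)
        print(np.abs(eigvals))  # |eigenvalue| ~ growth/decay of each mode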

  10. Automatic Collision Avoidance Technology (ACAT)

    NASA Technical Reports Server (NTRS)

    Swihart, Donald E.; Skoog, Mark A.

    2007-01-01

    This document represents two views of the Automatic Collision Avoidance Technology (ACAT). One viewgraph presentation reviews the development and system design of ACAT. Two types of ACAT exist: Automatic Ground Collision Avoidance (AGCAS) and Automatic Air Collision Avoidance (AACAS). The AGCAS uses Digital Terrain Elevation Data (DTED) for mapping functions, and uses navigation data to place the aircraft on the map. It then scans the DTED in front of and around the aircraft and uses the future aircraft trajectory (5 g) to provide an automatic fly-up maneuver when required. The AACAS uses a data link to determine position and closing rate. It contains several canned maneuvers to avoid collision. Automatic maneuvers can occur at the last instant, and both aircraft maneuver when using the data link. The system can use a sensor in place of the data link. The second viewgraph presentation reviews the development of a flight test and an evaluation of the test. A review of the operation and a comparison of the AGCAS with a pilot's performance are given. The same review is given for the AACAS.

  11. Distributed health care imaging information systems

    NASA Astrophysics Data System (ADS)

    Thompson, Mary R.; Johnston, William E.; Guojun, Jin; Lee, Jason; Tierney, Brian; Terdiman, Joseph F.

    1997-05-01

    We have developed an ATM network-based system to collect and catalogue cardio-angiogram videos from the source at a Kaiser central facility and make them available for viewing by doctors at primary care Kaiser facilities. This is an example of the general problem of diagnostic data being generated at tertiary facilities, while the images, or other large data objects they produce, need to be used from a variety of other locations such as doctors' offices or local hospitals. We describe the use of a highly distributed computing and storage architecture to provide all aspects of collecting, storing, analyzing, and accessing such large data objects in a metropolitan area ATM network. Our large data-object management system provides the network interface between the object sources, the data management system and the users of the data. As the data is being stored, a cataloguing system automatically creates and stores condensed versions of the data, textual metadata and pointers to the original data. The catalogue system provides a Web-based graphical interface to the data. The user is able to view the low-resolution data with a standard Internet connection and Web browser. If high resolution is required, a high-speed connection and special application programs can be used to view the high-resolution original data.

  12. SYRIAC: The systematic review information automated collection system, a data warehouse for facilitating automated biomedical text classification.

    PubMed

    Yang, Jianji J; Cohen, Aaron M; McDonagh, Marian S

    2008-11-06

    Automatic document classification can be valuable in increasing the efficiency of updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and the source data format is inconsistent. To approach this problem, we built an automated system to streamline the required steps, from initial notification of an update in the source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation data sets for SR text mining research.

  13. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables

    NASA Astrophysics Data System (ADS)

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-12-01

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.

  14. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables.

    PubMed

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-12-07

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.

  15. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Treesearch

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  16. Design of a National Retail Data Monitor for Public Health Surveillance

    PubMed Central

    Wagner, Michael M.; Robinson, J. Michael; Tsui, Fu-Chiang; Espino, Jeremy U.; Hogan, William R.

    2003-01-01

    The National Retail Data Monitor receives data daily from 10,000 stores, including pharmacies, that sell health care products. These stores belong to national chains that process sales data centrally and utilize Universal Product Codes and scanners to collect sales information at the cash register. The high degree of retail sales data automation enables the monitor to collect information from thousands of store locations in near to real time for use in public health surveillance. The monitor provides user interfaces that display summary sales data on timelines and maps. Algorithms monitor the data automatically on a daily basis to detect unusual patterns of sales. The project provides the resulting data and analyses, free of charge, to health departments nationwide. Future plans include continued enrollment and support of health departments, developing methods to make the service financially self-supporting, and further refinement of the data collection system to reduce the time latency of data receipt and analysis. PMID:12807802
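
    The daily monitoring step can be illustrated with a simple trailing-window detector on synthetic sales counts; the window size and threshold below are invented for illustration and are not the monitor's published algorithm.

        import numpy as np

        def unusual_days(daily_sales, window=28, z_thresh=3.0):
            # Flag days whose count sits far above the trailing window's mean.
            flags = []
            for i in range(window, len(daily_sales)):
                past = daily_sales[i - window:i]
                mu, sigma = past.mean(), past.std(ddof=1)
                if sigma > 0 and (daily_sales[i] - mu) / sigma > z_thresh:
                    flags.append(i)
            return flags

        rng = np.random.default_rng(3)
        sales = rng.poisson(120, size=365).astype(float)
        sales[200] = 260  # injected spike, e.g. a surge in remedy sales
        print(unusual_days(sales))  # expect day 200 among the flagged days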

  17. Real-Time Wireless Data Acquisition System

    NASA Technical Reports Server (NTRS)

    Valencia, Emilio J.; Perotti, Jose; Lucena, Angel; Mata, Carlos

    2007-01-01

    Current and future aerospace requirements demand the creation of a new breed of sensing devices, with emphasis on reduced weight, power consumption, and physical size. This new generation of sensors must possess a high degree of intelligence to provide critical data efficiently and in real-time. Intelligence will include self-calibration, self-health assessment, and pre-processing of raw data at the sensor level. Most of these features are already incorporated in the Wireless Sensors Network (SensorNet™), developed by the Instrumentation Group at Kennedy Space Center (KSC). A system based on the SensorNet™ architecture consists of data collection point(s) called Central Stations (CS) and intelligent sensors called Remote Stations (RS), where one or more CSs can be accommodated depending on the specific application. The CS's major function is to establish communications with the Remote Stations and to poll each RS for data and health information. The CS also collects, stores and distributes these data to the appropriate systems requiring the information. The system has the ability to perform point-to-point, multi-point and relay mode communications with autonomous self-diagnosis of each communications link. Upon detection of a communication failure, the system automatically reconfigures to establish new communication paths, which are automatically and autonomously selected as the best paths by the system based on the existing operating environment. The data acquisition system currently under development at KSC consists of the SensorNet™ wireless sensors as the remote stations and a central station called the Radio Frequency Health Node (RFHN). The RFHN is the central station that remotely communicates with the SensorNet™ sensors to control them and to receive data. The system's salient feature is the ability to provide deterministic sensor data with accurate time stamps for both time-critical and non-time-critical applications. Current wireless standards such as Zigbee™ and Bluetooth® do not have these capabilities and cannot meet the needs that are provided by the SensorNet technology. Additionally, the system has the ability to automatically reconfigure the wireless communication link to a secondary frequency if interference is encountered, and can autonomously search for a sensor that was perceived to be lost using the relay capabilities of the sensors and the secondary frequency. The RFHN and SensorNet designs are based on modular architectures that allow for future increases in capability and the ability to expand or upgrade with relative ease. The RFHN and SensorNet sensors can also perform data processing, which forms a distributed processing architecture allowing the system to pass along information rather than just sending "raw data points" to the next higher level system. With a relatively small size, weight and power consumption, this system has the potential for both spacecraft and aircraft applications as well as ground applications that require time-critical data.

  18. Automatic limb identification and sleeping parameters assessment for pressure ulcer prevention.

    PubMed

    Baran Pouyan, Maziyar; Birjandtalab, Javad; Nourani, Mehrdad; Matthew Pompeo, M D

    2016-08-01

    Pressure ulcers (PUs) are common among vulnerable patients such as elderly, bedridden and diabetic. PUs are very painful for patients and costly for hospitals and nursing homes. Assessment of sleeping parameters on at-risk limbs is critical for ulcer prevention. An effective assessment depends on automatic identification and tracking of at-risk limbs. An accurate limb identification can be used to analyze the pressure distribution and assess risk for each limb. In this paper, we propose a graph-based clustering approach to extract the body limbs from the pressure data collected by a commercial pressure map system. A robust signature-based technique is employed to automatically label each limb. Finally, an assessment technique is applied to evaluate the experienced stress by each limb over time. The experimental results indicate high performance and more than 94% average accuracy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
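
    The paper's graph-based clustering is not reproduced here, but a simpler stand-in conveys the idea of isolating contiguous loaded regions in a pressure map: threshold the frame and label its connected components with scipy. The synthetic map and threshold are illustrative only.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(4)
        frame = rng.random((32, 16)) * 5   # synthetic pressure map
        frame[5:12, 3:7] += 40             # blob standing in for one limb
        frame[20:28, 8:13] += 35           # a second limb-like blob

        mask = frame > 20                  # keep only clearly loaded cells
        labels, n = ndimage.label(mask)    # connected-component labeling
        areas = ndimage.sum(mask, labels, index=range(1, n + 1))
        means = ndimage.mean(frame, labels, index=range(1, n + 1))
        print(n, areas, means)  # candidate limbs, contact areas, mean loads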

  19. Vadose zone monitoring strategies to control water flux dynamics and changes in soil hydraulic properties.

    NASA Astrophysics Data System (ADS)

    Valdes-Abellan, Javier; Jiménez-Martínez, Joaquin; Candela, Lucila

    2013-04-01

    For monitoring the vadose zone, different strategies can be chosen, depending on the objectives and the scale of observation. The effects of non-conventional water use on the vadose zone might produce impacts in porous media which could lead to changes in soil hydraulic properties, among others. Controlling these possible effects requires an accurate monitoring strategy that tracks the volumetric water content, θ, and the soil pressure head, h, along the studied profile. According to the available literature, different monitoring systems have been deployed independently, but comparative studies between different techniques have received less attention. An experimental plot of 9x5 m2 was set up with automatic and non-automatic sensors to monitor θ and h down to 1.5 m depth. The non-automatic system consisted of ten Jet Fill tensiometers at 30, 45, 60, 90 and 120 cm (Soil Moisture®) and a polycarbonate access tube of 44 mm (i.d.) for soil moisture measurements with a TRIME FM TDR portable probe (IMKO®). Vertical installation was carefully performed; measurements with this system were manual, twice a week for θ and three times per week for h. The automatic system was composed of five 5TE sensors (Decagon Devices®) installed at 20, 40, 60, 90 and 120 cm for θ measurements and one MPS1 sensor (Decagon Devices®) at 60 cm depth for h. Installation took place laterally, in a 40-50 cm long hole bored into the side of an excavated trench. All automatic sensors recorded hourly and stored their readings in a data logger. Boundary conditions were controlled with a volume meter and a meteorological station, and ET was modelled with the Penman-Monteith equation. Soil characterization included bulk density, gravimetric water content, grain size distribution, saturated hydraulic conductivity and soil water retention curves, determined following laboratory standards. Soil mineralogy was determined by X-ray diffractometry. Unsaturated soil hydraulic parameters were model-fitted with the SWRC-fit code and with ROSETTA based on soil textural fractions. Simulation of water flow using the automatic and non-automatic data was carried out with HYDRUS-1D independently. Good agreement between the collected automatic and non-automatic data and the modelled results can be recognized; the general trend was captured, except for outlier values, as expected. Slight differences were found between the hydraulic properties obtained from laboratory determinations and those obtained from inverse modelling of the two approaches. Differences of up to 14% in the flux through the lower boundary were detected between the two strategies. According to the results, automatic sensors have higher resolution and are therefore more appropriate for detecting subtle changes in soil hydraulic properties. Nevertheless, if the aim of the research is to monitor the general trend of water dynamics, no significant differences were observed between the two systems.

  20. Automatic system of collection of parameters and control of receiving equipment of the radiotelescope of VLBI complex "Quasar"

    NASA Astrophysics Data System (ADS)

    Syrovoy, Sergey

    At present, very long baseline interferometry (VLBI) is becoming more and more globalized, turning into a worldwide network of observation posts, so the inclusion of the developing Russian "Quasar" system in the world VLBI community is of great importance to us. An important part of the operation of a radio telescope within a VLBI network is ensuring the optimal interaction of its sub-systems, which can only be achieved by automating the whole observation process. The possibility of RTF-32 participating in international VLBI observation sessions is taken into account in the system development; such observations follow the established experiment technology based on the Mark-IV Field System. In this paper, the description and the structural and functional schemes of the system for automatic collection of parameters and control of the receiving complex of the RTF-32 radio telescope, intended to solve this problem, are given. The most important tasks of the system being developed are to ensure remote monitoring and control of the following radio telescope systems: 1. the receiver system, which consists of five dual-channel radiometers for the 21-18 cm, 13 cm, 6 cm, 3.5 cm and 1.35 cm bands; 2. the radio telescope pointing system; 3. the time-frequency synchronization system, which consists of a hydrogen frequency standard, a system of ultrahigh-frequency oscillators and generators of picosecond pulses; 4. the signal transformation system; 5. the signal registration system; 6. the system for measuring the electrical characteristics of the atmosphere; 7. the power supply system. The part of the automatic system that provides remote monitoring and control of the radio telescope pointing system, both in local mode and when working under the control of the Field System computer, has been put into operation and is currently functioning. The part of the automatic system providing monitoring and control of the receiving system of the radio telescope is now being developed: the functional scheme has been designed, an experimental model of the device connecting the control PC with the terminal has been produced, the algorithms for receiver control in the different observation modes have been developed, and the questions of interaction with the Field System computer have been solved. The RTF-32 radio telescope is capable of functioning in two modes, radio-astronomical and radio-interferometric; the control of the signal transformation and signal registration systems differs between these modes and is entrusted to the Field System computer. The automation of the collection of meteorological data and of the parameters of the radio telescope's power supply system is the last stage of the development of the presented system.

  1. Integrating technology into complex intervention trial processes: a case study.

    PubMed

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. A combination of commercially available hardware and a bespoke online database designed to support data collection, intervention fidelity and trial progress provides a viable option for streamlining trial processes in a multicentre complex intervention trial. There is scope to further extend the system to cater for larger trials and add further functionality such as automatic reporting facilities and participant management support. ISRCTN65378754, registered on 13 March 2014.

  2. Realization of the ergonomics design and automatic control of the fundus cameras

    NASA Astrophysics Data System (ADS)

    Zeng, Chi-liang; Xiao, Ze-xin; Deng, Shi-chao; Yu, Xin-ye

    2012-12-01

    The principle of ergonomic design in fundus cameras is to extend user comfort through automatic control. Firstly, a 3D positional numerical control system is designed for positioning the pupils of patients undergoing fundus examinations. This system consists of an electronically controlled chin bracket that moves up and down, lateral movement of the binocular with the detector, and automatic refocusing on the edges of the eye pupils. Secondly, an auto-focusing device for the object plane of the patient's fundus is designed, which collects fundus images automatically whether or not the patient's eyes are ametropic. Finally, a moving visual target is developed for expanding the fields of the fundus images.

  3. Increasing Accuracy: A New Design and Algorithm for Automatically Measuring Weights, Travel Direction and Radio Frequency Identification (RFID) of Penguins.

    PubMed

    Afanasyev, Vsevolod; Buldyrev, Sergey V; Dunn, Michael J; Robst, Jeremy; Preston, Mark; Bremner, Steve F; Briggs, Dirk R; Brown, Ruth; Adlard, Stacey; Peat, Helen J

    2015-01-01

    A fully automated weighbridge using a new algorithm and mechanics integrated with a Radio Frequency Identification System is described. It is currently in use collecting data on Macaroni penguins (Eudyptes chrysolophus) at Bird Island, South Georgia. The technology allows researchers to collect very large, highly accurate datasets of both penguin weight and direction of their travel into or out of a breeding colony, providing important contributory information to help understand penguin breeding success, reproductive output and availability of prey. Reliable discrimination between single and multiple penguin crossings is demonstrated. Passive radio frequency tags implanted into penguins allow researchers to match weight and trip direction to individual birds. Low unit and operation costs, low maintenance needs, simple operator requirements and accurate time stamping of every record are all important features of this type of weighbridge, as is its proven ability to operate 24 hours a day throughout a breeding season, regardless of temperature or weather conditions. Users are able to define required levels of accuracy by adjusting filters and raw data are automatically recorded and stored allowing for a range of processing options. This paper presents the underlying principles, design specification and system description, provides evidence of the weighbridge's accurate performance and demonstrates how its design is a significant improvement on existing systems.
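
    One simplified way a weight trace can separate single from multiple crossings, assuming synthetic load-cell samples: a stable plateau inside a plausible single-bird mass range suggests one penguin, while a plateau well above it suggests several birds on the bridge at once. The mass range and tolerances are invented for illustration and are not the paper's algorithm.

        import numpy as np

        SINGLE_KG = (2.5, 6.5)  # illustrative single-bird mass range

        def classify_crossing(trace_kg, tol=0.15):
            # Treat the median as the plateau; demand most samples sit on it.
            plateau = float(np.median(trace_kg))
            stable = np.abs(trace_kg - plateau) < tol * plateau
            if stable.mean() < 0.5:
                return "unstable", plateau  # no clean plateau: reject record
            if SINGLE_KG[0] <= plateau <= SINGLE_KG[1]:
                return "single", plateau
            return "multiple", plateau

        rng = np.random.default_rng(5)
        trace = 4.1 + 0.05 * rng.normal(size=80)  # one bird crossing
        print(classify_crossing(trace))           # ('single', ~4.1)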

  4. Faster, efficient and secure collection of research images: the utilization of cloud technology to expand the OMI-DB

    NASA Astrophysics Data System (ADS)

    Patel, M. N.; Young, K.; Halling-Brown, M. D.

    2018-03-01

    The demand for medical images for research is ever increasing owing to the rapid rise of novel machine learning approaches for early detection and diagnosis. The OPTIMAM Medical Image Database (OMI-DB)1,2 was created to provide a centralized, fully annotated dataset for research. The database contains both processed and unprocessed images, associated data, annotations and expert-determined ground truths. Since the inception of the database in early 2011, the volume of images and associated data collected has increased dramatically owing to automation of the collection pipeline and the inclusion of new sites. Currently, these data are stored at each respective collection site and synced periodically to a central store. This leads to a large data footprint at each site, requiring large and expensive physical on-site storage. Here, we propose an update to the OMI-DB collection system whereby all the data are automatically transferred to the cloud on collection. This change in the data collection paradigm reduces the reliance on physical servers at each site, allows greater scope for future expansion, removes the need for dedicated backups and improves security. Moreover, with the number of applications for access to the data increasing rapidly as the dataset matures, cloud technology facilitates faster sharing of data and better auditing of data access. Such updates, although they may sound trivial, require substantial modification of the existing pipeline to ensure data integrity and security compliance. Here, we describe the extensions to the OMI-DB collection pipeline and discuss the relative merits of the new system.

  5. Overnight non-contact continuous vital signs monitoring using an intelligent automatic beam-steering Doppler sensor at 2.4 GHz.

    PubMed

    Batchu, S; Narasimhachar, H; Mayeda, J C; Hall, T; Lopez, J; Nguyen, T; Banister, R E; Lie, D Y C

    2017-07-01

    Doppler-based non-contact vital signs (NCVS) sensors can monitor heart rates, respiration rates, and motions of patients without physically touching them. We have developed a novel single-board Doppler-based phased-array antenna NCVS biosensor system that can perform robust overnight continuous NCVS monitoring with intelligent automatic subject tracking and optimal beam steering algorithms. Our NCVS sensor achieved overnight continuous vital signs monitoring with a heart-rate accuracy of over 94% (i.e., within ±5 beats per minute vs. a reference sensor), analyzed from over 400,000 data points collected during each overnight monitoring period of ~6 hours at a distance of 1.75 meters. The data suggest our intelligent phased-array NCVS sensor can be very attractive for continuous monitoring of low-acuity patients.
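
    At its simplest, extracting a heart rate from a demodulated Doppler channel means finding the dominant spectral peak in the cardiac band; the sketch below does this with an FFT on synthetic data and is a stand-in, not the sensor's tracking or beam-steering algorithm.

        import numpy as np

        def heart_rate_bpm(signal, fs, band=(0.8, 3.0)):
            # Dominant frequency in the cardiac band, converted to BPM.
            spec = np.abs(np.fft.rfft(signal - signal.mean()))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return 60.0 * freqs[mask][np.argmax(spec[mask])]

        fs = 100.0                         # synthetic sample rate (Hz)
        t = np.arange(0, 30, 1 / fs)       # 30 s analysis window
        sig = np.sin(2 * np.pi * 1.2 * t)  # 1.2 Hz cardiac tone
        sig += 0.3 * np.random.default_rng(6).normal(size=t.size)
        print(heart_rate_bpm(sig, fs))     # ~72 BPM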

  6. VoIP attacks detection engine based on neural network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Slachta, Jiri

    2015-05-01

    Security is crucial for any system nowadays, especially communications. One of the most successful protocols in the field of communication over IP networks is the Session Initiation Protocol (SIP). It is an open protocol used by different kinds of applications, both open-source and proprietary. High penetration and its text-based principle have made SIP the number one target in IP telephony infrastructure, so the security of SIP servers is essential. To keep up with hackers and detect potential malicious attacks, a security administrator needs to monitor and evaluate SIP traffic in the network. But monitoring and the subsequent evaluation can easily overwhelm the security administrator, typically in networks with a number of SIP servers, many users, and logically or geographically separated segments. The proposed solution lies in automatic attack detection systems. The article covers the detection of VoIP attacks through a distributed network of nodes; an aggregation server then analyzes the gathered data with an artificial neural network, a multilayer perceptron trained with a set of collected attacks. Attack data can also be preprocessed and verified with a self-organizing map. The source data is detected by a distributed network of detection nodes, each containing a honeypot application and a traffic monitoring mechanism. Aggregating the data from each node creates the input for the neural network. Automatic classification on a centralized server with low false-positive detection reduces the cost of attack detection resources. The detection system uses a modular design for easy deployment in the final infrastructure. The centralized server collects and processes the detected traffic and also maintains all detection nodes.
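
    A minimal stand-in for the aggregation server's classifier, assuming scikit-learn and synthetic feature vectors describing observed SIP traffic (the paper's feature set and network topology are not reproduced):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(7)
        # Synthetic features, e.g. request rate, method mix, source entropy.
        benign = rng.normal(0.0, 1.0, size=(500, 8))
        attack = rng.normal(2.0, 1.5, size=(500, 8))
        X = np.vstack([benign, attack])
        y = np.array([0] * 500 + [1] * 500)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500,
                            random_state=0)
        clf.fit(X_tr, y_tr)  # multilayer perceptron, as in the article
        print("holdout accuracy:", clf.score(X_te, y_te))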

  7. High-speed data search

    NASA Technical Reports Server (NTRS)

    Driscoll, James N.

    1994-01-01

    The high-speed data search system developed for KSC incorporates existing and emerging information retrieval technology to help a user intelligently and rapidly locate information found in large textual databases. This technology includes: natural language input; statistical ranking of retrieved information; an artificial intelligence concept called semantics, where 'surface level' knowledge found in text is used to improve the ranking of retrieved information; and relevance feedback, where user judgements about viewed information are used to automatically modify the search for further information. Semantics and relevance feedback are features of the system which are not available commercially. The system also demonstrates a focus on paragraphs of information to decide relevance, and it can be used (without modification) to intelligently search all kinds of document collections, such as collections of legal documents, medical documents, news stories, patents, and so forth. The purpose of this paper is to demonstrate the usefulness of statistical ranking, our semantic improvement, and relevance feedback.
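
    The statistical ranking and relevance feedback the record describes can be sketched with a TF-IDF vector space model and a Rocchio-style query update, standard techniques assumed here since the KSC system's internals are not given; the document snippets are invented.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = ["valve pressure anomaly during tanking",
                "crew transport schedule update",
                "pressure sensor calibration procedure"]
        vec = TfidfVectorizer()
        D = vec.fit_transform(docs)

        # Statistical ranking: cosine similarity between query and documents.
        q = vec.transform(["pressure anomaly"])
        rank = cosine_similarity(q, D).ravel().argsort()[::-1]
        print([docs[i] for i in rank])

        # Rocchio relevance feedback: pull the query toward a judged-relevant
        # document, then re-rank the collection with the updated query.
        q_new = 1.0 * q + 0.75 * D[rank[0]]
        rerank = cosine_similarity(q_new, D).ravel().argsort()[::-1]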

  8. Point Cloud Generation from Aerial Image Data Acquired by a Quadrocopter Type Micro Unmanned Aerial Vehicle and a Digital Still Camera

    PubMed Central

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems' SOCET SET classical commercial photogrammetric software and the other is built on Microsoft®'s Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, although some artifacts were also detected. The point clouds from the Photosynth processing were sparser and noisier, largely because the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and show that, with rigorous processing, it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for the properties of the imaging sensor, the data collection, and the processing of UAV image data to ensure accurate point cloud generation. PMID:22368479

  9. Design description of the Tangaye Village photovoltaic power system

    NASA Astrophysics Data System (ADS)

    Martz, J. E.; Ratajczak, A. F.

    1982-06-01

    The engineering design of a stand-alone photovoltaic (PV) powered grain mill and water pump for the village of Tangaye, Upper Volta is described. The socioeconomic effects of reducing the time required by women in rural areas for drawing water and grinding grain were studied. The suitability of photovoltaic technology for use in rural areas by people of limited technical training was demonstrated. The PV system consists of a 1.8-kW (peak) solar cell array, 540 ampere-hours of battery storage, instrumentation, automatic controls, and a data collection and storage system. The PV system is situated near an improved village well and supplies d.c. power to a grain mill and a water pump. The array is located in a fenced area, and the mill, battery, instruments, controls, and data system are in a mill building. A water storage tank is located near the well. The system employs automatic controls which provide battery charge regulation and system over- and under-voltage protection. This report includes descriptions of the engineering design of the system and of the load that it serves; a discussion of PV array and battery sizing methodology; descriptions of the mechanical and electrical designs including the array, battery, controls, and instrumentation; and a discussion of the safety features. The system became operational on March 1, 1979.
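
    A back-of-the-envelope sketch of the array and battery sizing arithmetic such a report discusses; every number below (load, sun hours, efficiencies, bus voltage) is an illustrative assumption, not a Tangaye design value, apart from the 1.8-kW array and 540-Ah battery quoted above:

      # Illustrative stand-alone PV sizing, not the report's actual method.
      daily_load_wh = 7000.0      # assumed mill + pump consumption per day
      peak_sun_hours = 5.5        # assumed site insolation, h/day
      system_efficiency = 0.75    # assumed wiring/battery/derating losses

      array_wp = daily_load_wh / (peak_sun_hours * system_efficiency)
      print(f"required array: {array_wp:.0f} Wp (report: 1800 Wp)")

      days_of_autonomy = 2.0      # assumed storage for sunless days
      bus_voltage = 120.0         # assumed d.c. bus voltage
      usable_fraction = 0.5       # assumed maximum depth of discharge
      battery_ah = (daily_load_wh * days_of_autonomy
                    / (bus_voltage * usable_fraction))
      print(f"required battery: {battery_ah:.0f} Ah (report: 540 Ah)")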

  10. Design description of the Tangaye Village photovoltaic power system

    NASA Technical Reports Server (NTRS)

    Martz, J. E.; Ratajczak, A. F.

    1982-01-01

    The engineering design of a stand-alone photovoltaic (PV) powered grain mill and water pump for the village of Tangaye, Upper Volta is described. The socioeconomic effects of reducing the time required by women in rural areas for drawing water and grinding grain were studied. The suitability of photovoltaic technology for use in rural areas by people of limited technical training was demonstrated. The PV system consists of a 1.8-kW (peak) solar cell array, 540 ampere-hours of battery storage, instrumentation, automatic controls, and a data collection and storage system. The PV system is situated near an improved village well and supplies d.c. power to a grain mill and a water pump. The array is located in a fenced area, and the mill, battery, instruments, controls, and data system are in a mill building. A water storage tank is located near the well. The system employs automatic controls which provide battery charge regulation and system over- and under-voltage protection. This report includes descriptions of the engineering design of the system and of the load that it serves; a discussion of PV array and battery sizing methodology; descriptions of the mechanical and electrical designs including the array, battery, controls, and instrumentation; and a discussion of the safety features. The system became operational on March 1, 1979.

  11. EDMUS, a European database for multiple sclerosis.

    PubMed

    Confavreux, C; Compston, D A; Hommes, O R; McDonald, W I; Thompson, A J

    1992-08-01

    EDMUS is a minimal descriptive record developed for research purposes to document clinical and laboratory data in patients with multiple sclerosis (MS). It has been designed by a committee of the European Concerted Action for MS, organised under the auspices of the Commission of the European Communities. The software is user-friendly and fast, with a minimal set of obligatory data. Priority has been given to analytical data, and the system is capable of automatically generating data, such as the diagnosis classification, using appropriate algorithms. This procedure saves time, ensures a uniform approach to individual cases and allows automatic updating of the classification whenever additional information becomes available. It is also compatible with future developments and requirements, since new algorithms can be entered in the programme when necessary. The system is flexible and may be adapted to the user's needs. It runs on Apple and IBM-PC personal microcomputers. Great care has been taken to preserve the confidentiality of the data. It is anticipated that this "common" language will enable the collection of appropriate cases for specific purposes, including population-based studies of MS, and will be particularly useful in projects where the collaboration of several centres is needed to recruit a critical number of patients.

  12. Automatic humidification system to support the assessment of food drying processes

    NASA Astrophysics Data System (ADS)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows control strategies to be created and improved for supplying drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build the experimental curves.
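
    A minimal sketch of the scaling step described above, mapping raw controller counts to engineering units; the count range and the probe wiring are assumptions for illustration:

      def scale_signal(raw_counts, raw_min=0, raw_max=27648,
                       eng_min=0.0, eng_max=100.0):
          # Linearly map a raw controller value (e.g., an analog input
          # word) to engineering units such as %RH or degrees C.
          span = (raw_counts - raw_min) / (raw_max - raw_min)
          return eng_min + span * (eng_max - eng_min)

      # Example: a humidity probe wired to a 0-27648 analog input word.
      print(scale_signal(13824))    # -> 50.0 (%RH at mid-scale)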

  13. Collection, storage, retrieval, and publication of water-resources data

    USGS Publications Warehouse

    Showen, C. R.

    1978-01-01

    This publication represents a series of papers devoted to the subject of the collection, storage, retrieval, and publication of hydrologic data. The papers were presented by members of the U.S. Geological Survey at the International Seminar on Organization and Operation of Hydrologic Services, Ottawa, Canada, July 15-16, 1976, sponsored by the World Meteorological Organization. The first paper, 'Standardization of Hydrologic Measurements,' by George F. Smoot discusses the need for standardization of the methods and instruments used in measuring hydrologic data. The second paper, 'Use of Earth Satellites for Automation of Hydrologic Data Collection,' by Richard W. Paulson discusses the use of inexpensive battery-operated radios to transmit real-time hydrologic data to earth satellites and back to ground receiving stations for computer processing. The third paper, 'Operation Hydrometeorological Data-Collection System for the Columbia River,' by Nicholas A. Kallio discusses the operation of a complex water-management system for a large river basin utilizing the latest automatic telemetry and processing devices. The fourth paper, 'Storage and Retrieval of Water-Resources Data,' by Charles R. Showen discusses the U.S. Geological Survey's National Water Data Storage and Retrieval System (WATSTORE) and its use in processing water-resources data. The final paper, 'Publication of Water Resources Data,' by S. M. Lang and C. B. Ham discusses the requirement for publication of water-resources data to meet the needs of a widespread audience and for archival purposes. (See W78-09324 thru W78-09328) (Woodard-USGS)

  14. A Security Monitoring Framework For Virtualization Based HEP Infrastructures

    NASA Astrophysics Data System (ADS)

    Gomez Ramirez, A.; Martinez Pedreira, M.; Grigoras, C.; Betev, L.; Lara, C.; Kebschull, U.; ALICE Collaboration

    2017-10-01

    High Energy Physics (HEP) distributed computing infrastructures require automatic tools to monitor, analyze, and react to potential security incidents. These tools should collect and inspect data such as resource consumption, logs, and sequences of system calls to detect anomalies that indicate the presence of a malicious agent. They should also be able to perform automated reactions to attacks without administrator intervention. We describe a novel framework that accomplishes these requirements, with a proof-of-concept implementation for the ALICE experiment at CERN. We show how we achieve a fully virtualized environment that improves security by isolating services and Jobs without a significant performance impact. We also describe a dataset collected for Machine Learning based Intrusion Prevention and Detection Systems on Grid computing. This dataset is composed of resource consumption measurements (such as CPU, RAM, and network traffic), logfiles from operating system services, and system call data collected from production Jobs running in an ALICE Grid test site, together with a large set of malware samples collected from security research sites. Based on this dataset, we will proceed to develop Machine Learning algorithms able to detect malicious Jobs.
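
    A minimal sketch of the kind of anomaly detection such a dataset enables, here an isolation forest over per-job resource features; the feature values and the scikit-learn model are illustrative assumptions, not the framework's eventual algorithms:

      import numpy as np
      from sklearn.ensemble import IsolationForest

      rng = np.random.default_rng(0)
      # Illustrative per-job features: CPU %, RAM MB, net KB/s, syscall rate.
      normal_jobs = rng.normal([50, 800, 120, 300], [10, 100, 30, 50],
                               size=(500, 4))
      suspect_job = np.array([[95.0, 2500.0, 900.0, 1200.0]])

      detector = IsolationForest(contamination=0.01, random_state=0)
      detector.fit(normal_jobs)
      print(detector.predict(suspect_job))   # -> [-1] flags an anomaly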

  15. Pivot/Remote: a distributed database for remote data entry in multi-center clinical trials.

    PubMed

    Higgins, S B; Jiang, K; Plummer, W D; Edens, T R; Stroud, M J; Swindell, B B; Wheeler, A P; Bernard, G R

    1995-01-01

    1. INTRODUCTION. Data collection is a critical component of multi-center clinical trials. Clinical trials conducted in intensive care units (ICU) are even more difficult because the acute nature of illnesses in ICU settings requires that masses of data be collected in a short time. More than a thousand data points are routinely collected for each study patient. The majority of clinical trials are still "paper-based," even if a remote data entry (RDE) system is utilized. The typical RDE system consists of a computer housed in the clinical center (CC) office and connected by modem to a centralized data coordinating center (DCC). Study data must first be recorded on a paper case report form (CRF), transcribed into the RDE system, and transmitted to the DCC. This approach requires additional monitoring since both the paper CRF and the study database must be verified. The paper-based RDE system cannot take full advantage of automatic data checking routines. Much of the effort (and expense) of a clinical trial goes into ensuring that study data match the original patient data. 2. METHODS. We have developed an RDE system, Pivot/Remote, that eliminates the need for paper-based CRFs. It creates an innovative, distributed database, which resides partially at the study CCs and partially at the DCC. Pivot/Remote is descended from technology introduced with Pivot [1]. Study data are collected at the bedside with laptop computers. A graphical user interface (GUI) allows the display of electronic CRFs that closely mimic the normal paper-based forms. Data entry time is the same as for paper CRFs. Pull-down menus, displaying the possible responses, simplify the process of entering data. Edit checks are performed on most data items. For example, entered dates must conform to the temporal logic imposed by the study, and data must conform to an acceptable range of values. Calculations, such as computing the subject's age or the APACHE II score, are made automatically as the data are entered. Data that are collected serially (BP, HR, etc.) can be displayed graphically in a trend form along with other related variables. An audit trail is created that automatically tracks all changes to the original data, making it possible to reconstruct the CRF at any point in time. On-line help provides information on the study protocol as well as assistance with the use of the system. Electronic security makes it possible to lock certain parts of the CRF once it has been monitored. Completed CRFs are transmitted to the DCC via electronic mail, where they are reviewed and merged into the study database. Questions about subject data are transmitted back to the CC via electronic mail. This approach to maintaining the study database is unique in that the study data files are distributed between the CCs and the DCC. Until a subject's CRF is monitored (verified against the original patient data residing in the hospital record), it logically resides at the CC where it was collected. Copies are transmitted to the DCC and are only read there. Any pre-monitoring changes must be made to the data at the CC. Once the subject's CRF is monitored, it logically moves to the DCC, and any subsequent changes are made at the DCC, with copies of the CRF flowing back to the CC. 3. DISCUSSION. Pivot/Remote eliminates the need for paper forms by utilizing portable computers that can be used at the patient bedside. A GUI makes it possible to enter data quickly. Because the user gets instant feedback on possible error conditions while the original data are close at hand, time is saved. The ability to display trended data or variables in the context of other data allows the detection of erroneous conditions beyond simple range checks. The logical construction of the database minimizes the problem of managing dual databases (at the CC and DCC) and keeps CC personnel in the loop until all changes are made.
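
    A small sketch of the kinds of edit checks described above (temporal logic, range checks); the field names and limits are illustrative, not Pivot/Remote's actual rules:

      from datetime import date

      def check_crf_entry(enrollment, event, heart_rate, age_years):
          # Return a list of edit-check failures for one CRF item set.
          errors = []
          if event < enrollment:                    # temporal logic
              errors.append("event date precedes enrollment")
          if not 20 <= heart_rate <= 250:           # range check
              errors.append("heart rate out of plausible range")
          if not 0 <= age_years <= 120:
              errors.append("age out of plausible range")
          return errors

      print(check_crf_entry(date(1994, 3, 1), date(1994, 2, 27),
                            310.0, 54.0))
      # -> ['event date precedes enrollment',
      #     'heart rate out of plausible range']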

  16. SYRIAC: The SYstematic Review Information Automated Collection System A Data Warehouse for Facilitating Automated Biomedical Text Classification

    PubMed Central

    Yang, Jianji J.; Cohen, Aaron M.; McDonagh, Marian S.

    2008-01-01

    Automatic document classification can be valuable in increasing the efficiency of updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and the source data format is inconsistent. To approach this problem, we built an automated system to streamline the required steps, from the initial notification of updates in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation datasets for SR text mining research. PMID:18999194

  17. Automatic Co-Registration of QuickBird Data for Change Detection Applications

    NASA Technical Reports Server (NTRS)

    Bryant, Nevin A.; Logan, Thomas L.; Zobrist, Albert L.

    2006-01-01

    This viewgraph presentation reviews the use of the Automatic Fusion of Image Data System (AFIDS) for the automatic co-registration of QuickBird data to ascertain whether changes have occurred in images. The process is outlined, and views from Iraq and Los Angeles are shown to illustrate the process.

  18. Speech input system for meat inspection and pathological coding used thereby

    NASA Astrophysics Data System (ADS)

    Abe, Shozo

    Meat inspection is one of the exclusive and important jobs of veterinarians, though it is not well known in general. As the inspection must be conducted skillfully during a series of continuous operations in a slaughterhouse, the development of automatic inspection systems has long been required. We employed a hands-free speech input system to record the inspection data, because inspectors have to use both hands to handle the internal organs of cattle and check their health condition with the naked eye. The data collected by the inspectors are transferred to a speech recognizer and then stored as controllable data for each animal inspected. The control of terms to be input, such as pathological conditions, and their coding are also important in this speech input system, and practical examples are shown.

  19. [Development of a medical equipment support information system based on PDF portable document].

    PubMed

    Cheng, Jiangbo; Wang, Weidong

    2010-07-01

    Based on the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and kept in electronic archives. The workflow of medical equipment support work was analysed, and all work processes were recorded in portable electronic documents. Using XML middleware technology and an SQL Server database, the system provides process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized and digital, automatic and efficient, orderly and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.

  20. A real time dynamic data acquisition and processing system for velocity, density, and total temperature fluctuation measurements

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1991-01-01

    The real-time Dynamic Data Acquisition and Processing System (DDAPS) is described, which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAPS replaces both a recording mechanism and a separate data processing system. DDAPS receives input from hot wire anemometers. Amplifiers and filters condition the signals with computer-controlled modules. The analog signals are simultaneously digitized and digitally recorded on disk. Automatic acquisition collects the necessary calibration and environment data. Hot wire sensitivities are generated and applied to the hot wire data to compute fluctuations. The presentation of the raw and processed data is accomplished on demand. The interface to DDAPS is described along with the internal mechanisms of DDAPS. A summary of operations relevant to the use of the DDAPS is also provided.

  1. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open, commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate, and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  2. Control law system for X-Wing aircraft

    NASA Technical Reports Server (NTRS)

    Lawrence, Thomas H. (Inventor); Gold, Phillip J. (Inventor)

    1990-01-01

    Control law system for the collective axis, as well as the pitch and roll axes, of an X-Wing aircraft and for the pneumatic valving controlling circulation control blowing for the rotor. As to the collective axis, the system gives the pilot single-lever direct lift control and ensures that maximum cyclic blowing control power is available in transition. Angle-of-attack decoupling is provided in rotary wing flight, and mechanical collective is used to augment pneumatic roll control when appropriate. Automatic gain variations with airspeed and rotor speed are provided, so a unitary set of control laws works in all three X-Wing flight modes. As to the pitch and roll axes, the system produces essentially the same aircraft response regardless of flight mode or condition. Undesirable cross-couplings are compensated for in a manner unnoticeable to the pilot, without requiring pilot action as the flight mode or condition is changed. A hub moment feedback scheme is implemented, utilizing a P+I controller, significantly improving bandwidth. Limits protect the aircraft structure from inadvertent damage. As to the pneumatic valving, the system automatically provides the pressure required at each valve azimuth location, as dictated by collective, cyclic, and higher harmonic blowing commands. Variations in the required control phase angle are automatically introduced, and variations in plenum pressure are compensated for. The required switching for leading, trailing, and dual edge blowing is automated using a simple table look-up procedure. Non-linearities due to the valve characteristics of circulation control lift are linearized by map look-ups.
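
    A hedged sketch of a discrete P+I loop of the kind the hub moment feedback scheme uses; the gains, sample rate, and variable names are illustrative assumptions, not the actual X-Wing control law:

      class PIController:
          # Discrete proportional-plus-integral (P+I) controller.
          def __init__(self, kp, ki, dt):
              self.kp, self.ki, self.dt = kp, ki, dt
              self.integral = 0.0

          def update(self, command, feedback):
              error = command - feedback      # e.g., hub moment error
              self.integral += error * self.dt
              return self.kp * error + self.ki * self.integral

      # One loop step at an assumed 100 Hz sample rate with assumed gains.
      ctl = PIController(kp=0.8, ki=2.0, dt=0.01)
      print(ctl.update(command=1.0, feedback=0.85))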

  3. BiobankConnect: software to rapidly connect data elements for pooled analysis across biobanks using ontological and lexical indexing.

    PubMed

    Pang, Chao; Hendriksen, Dennis; Dijkstra, Martijn; van der Velde, K Joeri; Kuiper, Joel; Hillege, Hans L; Swertz, Morris A

    2015-01-01

    Pooling data across biobanks is necessary to increase statistical power, reveal more subtle associations, and synergize the value of data sources. However, searching for desired data elements among the thousands of available elements and harmonizing differences in terminology, data collection, and structure, is arduous and time consuming. To speed up biobank data pooling we developed BiobankConnect, a system to semi-automatically match desired data elements to available elements by: (1) annotating the desired elements with ontology terms using BioPortal; (2) automatically expanding the query for these elements with synonyms and subclass information using OntoCAT; (3) automatically searching available elements for these expanded terms using Lucene lexical matching; and (4) shortlisting relevant matches sorted by matching score. We evaluated BiobankConnect using human curated matches from EU-BioSHaRE, searching for 32 desired data elements in 7461 available elements from six biobanks. We found 0.75 precision at rank 1 and 0.74 recall at rank 10 compared to a manually curated set of relevant matches. In addition, best matches chosen by BioSHaRE experts ranked first in 63.0% and in the top 10 in 98.4% of cases, indicating that our system has the potential to significantly reduce manual matching work. BiobankConnect provides an easy user interface to significantly speed up the biobank harmonization process. It may also prove useful for other forms of biomedical data integration. All the software can be downloaded as a MOLGENIS open source app from http://www.github.com/molgenis, with a demo available at http://www.biobankconnect.org. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.
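
    A toy sketch of the expand-then-match idea in steps (2)-(3) above; the synonym table and the scoring function are simple stand-ins for OntoCAT expansion and Lucene lexical matching:

      # Toy query expansion + lexical matching, standing in for the
      # OntoCAT/Lucene pipeline described above.
      synonyms = {"blood pressure": ["bp", "arterial pressure"]}

      def expand(term):
          return [term] + synonyms.get(term, [])

      def score(query_terms, element_description):
          desc = element_description.lower()
          return sum(1 for t in query_terms if t in desc)

      available = ["Systolic BP measured at baseline visit",
                   "Participant smoking status",
                   "Mean arterial pressure, sitting"]
      query = expand("blood pressure")
      print(sorted(available, key=lambda e: score(query, e), reverse=True))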

  4. BiobankConnect: software to rapidly connect data elements for pooled analysis across biobanks using ontological and lexical indexing

    PubMed Central

    Pang, Chao; Hendriksen, Dennis; Dijkstra, Martijn; van der Velde, K Joeri; Kuiper, Joel; Hillege, Hans L; Swertz, Morris A

    2015-01-01

    Objective Pooling data across biobanks is necessary to increase statistical power, reveal more subtle associations, and synergize the value of data sources. However, searching for desired data elements among the thousands of available elements and harmonizing differences in terminology, data collection, and structure, is arduous and time consuming. Materials and methods To speed up biobank data pooling we developed BiobankConnect, a system to semi-automatically match desired data elements to available elements by: (1) annotating the desired elements with ontology terms using BioPortal; (2) automatically expanding the query for these elements with synonyms and subclass information using OntoCAT; (3) automatically searching available elements for these expanded terms using Lucene lexical matching; and (4) shortlisting relevant matches sorted by matching score. Results We evaluated BiobankConnect using human curated matches from EU-BioSHaRE, searching for 32 desired data elements in 7461 available elements from six biobanks. We found 0.75 precision at rank 1 and 0.74 recall at rank 10 compared to a manually curated set of relevant matches. In addition, best matches chosen by BioSHaRE experts ranked first in 63.0% and in the top 10 in 98.4% of cases, indicating that our system has the potential to significantly reduce manual matching work. Conclusions BiobankConnect provides an easy user interface to significantly speed up the biobank harmonization process. It may also prove useful for other forms of biomedical data integration. All the software can be downloaded as a MOLGENIS open source app from http://www.github.com/molgenis, with a demo available at http://www.biobankconnect.org. PMID:25361575

  5. A preliminary study into performing routine tube output and automatic exposure control quality assurance using radiology information system data.

    PubMed

    Charnock, P; Jones, R; Fazakerley, J; Wilde, R; Dunn, A F

    2011-09-01

    Data are currently being collected from hospital radiology information systems in the North West of the UK for the purposes of both clinical audit and patient dose audit. Could these data also be used to satisfy quality assurance (QA) requirements according to UK guidance? From 2008 to 2009, 731,653 records were submitted from eight hospitals in the North West of England. For automatic exposure control (AEC) QA, the protocol from Institute of Physics and Engineering in Medicine (IPEM) report 91 recommends that the milliampere-seconds (mAs) be monitored for repeatability and reproducibility using a suitable phantom at 70-81 kV. Abdomen AP and chest PA examinations were analysed to find the most common kilovoltage used; these records were then used to plot the average monthly mAs over time. IPEM report 91 also recommends that a range of commonly used clinical settings be used to check output reproducibility and repeatability. For each tube, the dose-area product values were plotted over time for the two most common sets of exposure factors. The results show that it is possible to perform checks of AEC system performance; however, more work is required before tube output performance can be monitored. Procedurally, the management system requires work, and the benefits to the workflow would need to be demonstrated.
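
    A sketch of the repeatability check described above, assuming monthly mean mAs values extracted from RIS records for one tube at a fixed kV; the 10% tolerance is an illustrative assumption, not an IPEM report 91 figure:

      import statistics

      def mas_repeatability(monthly_mas, tolerance=0.10):
          # Flag months whose mean mAs deviates from the overall mean
          # by more than the given fractional tolerance.
          baseline = statistics.mean(monthly_mas)
          return [(month, value) for month, value in enumerate(monthly_mas)
                  if abs(value - baseline) / baseline > tolerance]

      # Chest PA at the most common kV, one mean value per month.
      readings = [2.1, 2.0, 2.2, 2.1, 2.6, 2.1]
      print(mas_repeatability(readings))    # -> [(4, 2.6)]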

  6. Charged Coupled Device Debris Telescope Observations of the Geosynchronous Orbital Debris Environment - Observing Year: 1998

    NASA Technical Reports Server (NTRS)

    Jarvis, K. S.; Thumm, T. L.; Matney, M. J.; Jorgensen, K.; Stansbery, E. G.; Africano, J. L.; Sydney, P. F.; Mulrooney, M. K.

    2002-01-01

    NASA has been using the charged coupled device (CCD) debris telescope (CDT), a transportable 32-cm Schmidt telescope located near Cloudcroft, New Mexico, to help characterize the debris environment in geosynchronous Earth orbit (GEO). The CDT is equipped with a SITe 512 x 512 CCD camera whose 24-μm (12.5 arc sec) pixels produce a 1.7 x 1.7-deg field of view. The CDT system can therefore detect 17th-magnitude objects in a 20-sec integration, corresponding to an object of approximately 0.6-m diameter and 0.20 albedo at 36,000 km. The telescope pointing and CCD operation are computer controlled to collect data automatically for an entire night. The CDT has collected more than 1500 hrs of data since November 1997. This report describes the collection and analysis of 58 nights (approx. 420 hrs) of data acquired in 1998.

  7. Instrumentation for a dry-pond detention study

    USGS Publications Warehouse

    Pope, L.M.; Jennings, M.E.; Thibodeaux, K.G.

    1988-01-01

    A 12.3-acre, fully urbanized, residential land-use catchment was instrumented by the U.S. Geological Survey in Topeka, Kansas. Hydraulic instrumentation for flow measurement includes two types of flumes, a pipe-insert flume and a culvert-inlet (manhole) flume. Samples of rainfall and runoff for water-quality analyses were collected by automatic, 3-liter, 24-sample-capacity water samplers controlled by multichannel data loggers. Ancillary equipment included a raingage and a wet/dry atmospheric-deposition sampler. Nineteen stormwater runoff events were monitored at the site using the instrumentation system. The system has a high reliability of data capture and permits an accurate determination of stormwater loads.

  8. Energy Balance Bowen Ratio (EBBR) Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, D. R.

    2016-01-01

    The Energy Balance Bowen Ratio (EBBR) system produces 30-minute estimates of the vertical fluxes of sensible and latent heat at the local surface. Flux estimates are calculated from observations of net radiation, soil surface heat flux, and the vertical gradients of temperature and relative humidity (RH). Meteorological data collected by the EBBR are used to calculate bulk aerodynamic fluxes, which are used in the Bulk Aerodynamic Technique (BA) EBBR value-added product (VAP) to replace sunrise and sunset spikes in the flux data. A unique aspect of the system is the automatic exchange mechanism (AEM), which helps to reduce errors from instrument offset drift.
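
    The flux partitioning described above follows the standard Bowen ratio energy balance; a minimal sketch, where the psychrometric constant and the sample values are illustrative and this is not the ARM VAP code:

      # Standard Bowen-ratio energy-balance partitioning.
      def ebbr_fluxes(net_radiation, soil_heat_flux, d_temp, d_vapor_kpa,
                      gamma=0.066):
          # Split available energy (W/m^2) into sensible (H) and latent
          # (LE) heat fluxes from vertical gradients of temperature (K)
          # and vapor pressure (kPa); gamma is the psychrometric
          # constant, kPa/K.
          bowen = gamma * d_temp / d_vapor_kpa      # beta = H / LE
          le = (net_radiation - soil_heat_flux) / (1.0 + bowen)
          return bowen * le, le                     # H, LE

      print(ebbr_fluxes(400.0, 50.0, d_temp=1.2, d_vapor_kpa=0.30))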

  9. Energy Balance Bowen Ratio Station (EBBR) Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, DR

    2011-02-23

    The energy balance Bowen ratio (EBBR) system produces 30-minute estimates of the vertical fluxes of sensible and latent heat at the local surface. Flux estimates are calculated from observations of net radiation, soil surface heat flux, and the vertical gradients of temperature and relative humidity (RH). Meteorological data collected by the EBBR are used to calculate bulk aerodynamic fluxes, which are used in the Bulk Aerodynamic Technique (BA) EBBR value-added product (VAP) to replace sunrise and sunset spikes in the flux data. A unique aspect of the system is the automatic exchange mechanism (AEM), which helps to reduce errors from instrument offset drift.

  10. A system for automatic analysis of blood pressure data for digital computer entry

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1972-01-01

    The operation of an automatic blood pressure data system is described. The analog blood pressure signal is analyzed by three separate circuits: systolic, diastolic, and cycle defect. The digital computer output is displayed on a teletype, a paper tape punch, and a video screen. An illustration of the system is included.

  11. Intelligence Surveillance And Reconnaissance Full Motion Video Automatic Anomaly Detection Of Crowd Movements: System Requirements For Airborne Application

    DTIC Science & Technology

    The collection of Intelligence, Surveillance, and Reconnaissance (ISR) Full Motion Video (FMV) is growing at an exponential rate, and the manual... intelligence for the warfighter. This paper will address the question of how automatic pattern extraction, based on computer vision, can extract anomalies in

  12. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  13. SPINS: standardized protein NMR storage. A data dictionary and object-oriented relational database for archiving protein NMR spectra.

    PubMed

    Baran, Michael C; Moseley, Hunter N B; Sahota, Gurmukh; Montelione, Gaetano T

    2002-10-01

    Modern protein NMR spectroscopy laboratories have a rapidly growing need for an easily queried local archival system of raw experimental NMR datasets. SPINS (Standardized ProteIn Nmr Storage) is an object-oriented relational database that provides facilities for high-volume NMR data archival, organization of analyses, and dissemination of results to the public domain by automatic preparation of the header files required for submission of data to the BioMagResBank (BMRB). The current version of SPINS coordinates the process from data collection to BMRB deposition of raw NMR data by standardizing and integrating the storage and retrieval of these data in a local laboratory file system. Additional facilities include a data mining query tool, graphical database administration tools, and an NMRStar v2.1.1 file generator. SPINS also includes a user-friendly internet-based graphical user interface, which is optionally integrated with Varian VNMR NMR data collection software. This paper provides an overview of the data model underlying the SPINS database system, a description of its implementation in Oracle, and an outline of future plans for the SPINS project.

  14. High-resolution seismic-reflection profiles collected aboard R/V James M. Gilliss Cruise GS-7903-3, over the Atlantic Continental Slope and Rise off New England

    USGS Publications Warehouse

    Bailey, Norman G.; Aaron, John M.

    1982-01-01

    During June 1979, the U.S. Geological Survey (USGS) collected 4,032 km of single-channel seismic-reflection data from the Atlantic Continental Slope and Rise off New England. The work was conducted aboard R/V JAMES M. GILLISS (cruise GS-7903-3). The purpose of the cruise was to determine the characteristics of mass sediment movement on the Continental Slope, and to study and correlate the stratigraphy of the Jurassic and Cretaceous strata lying north and south of the New England seamount chain. Seismic instrumentation included 40-in3, 160-in3, and 500-in3 airguns; a Teledyne 800-joule minisparker system; a 3.5-kHz to 7-kHz, hull-mounted tunable transducer; and a 7-channel analog tape recorder. Navigation control during the cruise was provided by a Western Integrated Navigation System capable of integrating satellite, rho-rho Loran-C, hyperbolic Loran-C, gyrocompass, and doppler speed-log position data. The prime navigation sensor was the rho-rho Loran-C, automatically recorded at 20-second intervals and manually plotted every 15 minutes, backed up by hyperbolic Loran-C fixes automatically recorded every 5 minutes. Of the 4,032 km of data collected, 3,257 km of 3.5-kHz, minisparker, and 40-in3 airgun data were for the sediment-slump study, and the other 775 km of 3.5-kHz, minisparker, 160-in3 airgun, and 500-in3 airgun data were for the deep stratigraphy study. Overall, the quality of the data is excellent, with good resolution and penetration. The original data may be examined at the U.S. Geological Survey, Woods Hole, MA 02543. Copies of the data can be purchased only from the National Geophysical and Solar-Terrestrial Data Center, NOAA/EDIS/NGSDC, Code D621, 325 Broadway, Boulder, CO 80303 (303-497-6338).

  15. Design and pilot evaluation of the RAH-66 Comanche Core AFCS

    NASA Technical Reports Server (NTRS)

    Fogler, Donald L., Jr.; Keller, James F.

    1993-01-01

    This paper addresses the design and pilot evaluation of the Core Automatic Flight Control System (AFCS) for the Reconnaissance/Attack Helicopter (RAH-66) Comanche. During the period from November 1991 through February 1992, the RAH-66 Comanche control laws were evaluated through a structured pilot acceptance test using a motion base simulator. Design requirements, descriptions of the control law design, and handling qualities data collected from ADS-33 maneuvers are presented.

  16. Automatic Calibration of an Airborne Imaging System to an Inertial Navigation Unit

    NASA Technical Reports Server (NTRS)

    Ansar, Adnan I.; Clouse, Daniel S.; McHenry, Michael C.; Zarzhitsky, Dimitri V.; Pagdett, Curtis W.

    2013-01-01

    This software automatically calibrates a camera or an imaging array to an inertial navigation system (INS) that is rigidly mounted to the array or imager. In effect, it recovers the coordinate frame transformation between the reference frame of the imager and the reference frame of the INS. This innovation can automatically derive the camera-to-INS alignment using image data only. The assumption is that the camera fixates on an area while the aircraft flies an orbit. The system then, fully automatically, solves for the camera orientation in the INS frame. No manual intervention or ground tie-point data is required.

  17. Telematic integration of health data: a practicable contribution.

    PubMed

    Guerriero, Lorenzo; Ferdeghini, Ezio M; Viola, Silvia R; Porro, Ivan; Testi, Angela; Bedini, Remo

    2011-09-01

    Patients' clinical and healthcare data should be available virtually everywhere, both to provide a more efficient and effective medical approach to their pathologies and to enable public healthcare decision makers to verify the efficacy and efficiency of the adopted healthcare processes. Unfortunately, the customised solutions adopted by many local Health Information Systems in Italy make it difficult to share the stored data outside their own environment. In recent years, worldwide initiatives have aimed to overcome this sharing limitation. An important issue during the transition towards standardised, integrated information systems is the possible loss of previously collected data. The project presented here realises a suitable architecture able to guarantee the reliable, automatic, user-transparent storage and retrieval of information from both modern and legacy systems. The technical and management solutions provided by the project avoid data loss and overlap, and allow data integration and organisation suitable for data-mining and data-warehousing analysis.

  18. Evaluation of carbon emission reductions promoted by private driving restrictions based on automatic fare collection data in Beijing, China.

    PubMed

    Zhang, Wandi; Chen, Feng; Wang, Zijia; Huang, Jianling; Wang, Bo

    2017-11-01

    Public transportation automatic fare collection (AFC) systems are able to continuously record large amounts of passenger travel information, providing massive, low-cost data for research on regulations pertaining to public transport. These data can be used not only to analyze the characteristics of passengers' trips but also to evaluate transport policies that promote a travel mode shift and emission reduction. In this study, models combining card, survey, and geographic information system (GIS) data are established, with a research focus on the private driving restriction policies being implemented in an ever-increasing number of cities. The study aims to evaluate the impact of these policies on the travel mode shift, as well as the associated carbon emission reductions. The private driving restriction policy implemented in Beijing is taken as an example. The impact of the restriction policy on the travel mode shift from cars to subways is analyzed through a model based on metro AFC data. The routing paths of these passengers are also analyzed based on the GIS method and on survey data, while the associated carbon emission reductions are estimated. The analysis method used in this study can provide a reference for the application of big data in evaluating transport policies. Motor vehicles have become the most prevalent source of emissions, and consequently of air pollution, within Chinese cities. The evaluation of the effects of driving restriction policies on the travel mode shift and vehicle emissions will be useful for other cities in the future. Transport big data play an important supporting role in estimating the travel mode shift and the emission reductions considered; they can help the relevant departments to estimate the effects of traffic alleviation and environmental improvement before such restriction policies are implemented and provide a reference for the associated decisions.
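
    A sketch of the emission-reduction arithmetic such an evaluation implies; the shifted-trip count, trip length, and emission factors below are illustrative assumptions, not the paper's Beijing estimates:

      # Illustrative mode-shift emission accounting, not the paper's model.
      shifted_trips_per_day = 120_000   # car trips moved to subway (AFC)
      mean_trip_km = 12.0               # assumed average trip length
      car_gco2_per_km = 180.0           # assumed private-car factor
      subway_gco2_per_pkm = 30.0        # assumed subway factor per rider-km

      daily_reduction_t = (shifted_trips_per_day * mean_trip_km
                           * (car_gco2_per_km - subway_gco2_per_pkm) / 1e6)
      print(f"estimated reduction: {daily_reduction_t:.0f} t CO2/day")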

  19. Arrhythmia Classification Based on Multi-Domain Feature Extraction for an ECG Recognition System.

    PubMed

    Li, Hongqiang; Yuan, Danyang; Wang, Youxi; Cui, Dianyin; Cao, Lu

    2016-10-20

    Automatic recognition of arrhythmias is particularly important in the diagnosis of heart diseases. This study presents an electrocardiogram (ECG) recognition system based on multi-domain feature extraction to classify ECG beats. An improved wavelet threshold method for ECG signal pre-processing is applied to remove noise interference. A novel multi-domain feature extraction method is proposed; this method employs kernel-independent component analysis in nonlinear feature extraction and uses discrete wavelet transform to extract frequency domain features. The proposed system utilises a support vector machine classifier optimized with a genetic algorithm to recognize different types of heartbeats. An ECG acquisition experimental platform, in which ECG beats are collected as ECG data for classification, is constructed to demonstrate the effectiveness of the system in ECG beat classification. The presented system, when applied to the MIT-BIH arrhythmia database, achieves a high classification accuracy of 98.8%. Experimental results based on the ECG acquisition experimental platform show that the system obtains a satisfactory classification accuracy of 97.3% and is able to classify ECG beats efficiently for the automatic identification of cardiac arrhythmias.
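
    A compressed sketch of the pipeline described above (wavelet-threshold denoising, wavelet-domain features, SVM classification); PyWavelets, scikit-learn, and the synthetic beats are stand-ins for the authors' kernel-ICA features and genetic-algorithm-tuned classifier:

      import numpy as np
      import pywt
      from sklearn.svm import SVC

      def denoise(beat, wavelet="db4", level=3):
          # Soft wavelet-threshold denoising of one ECG beat.
          coeffs = pywt.wavedec(beat, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745
          thr = sigma * np.sqrt(2 * np.log(len(beat)))
          coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                  for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[:len(beat)]

      def features(beat):
          # Crude frequency-domain features: energy per wavelet sub-band.
          return np.array([np.sum(c ** 2)
                           for c in pywt.wavedec(beat, "db4", level=3)])

      rng = np.random.default_rng(1)
      beats = rng.standard_normal((200, 128))    # synthetic "beats"
      labels = rng.integers(0, 2, 200)           # synthetic classes
      X = np.array([features(denoise(b)) for b in beats])
      clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, labels)
      print("training accuracy:", clf.score(X, labels))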

  20. Arrhythmia Classification Based on Multi-Domain Feature Extraction for an ECG Recognition System

    PubMed Central

    Li, Hongqiang; Yuan, Danyang; Wang, Youxi; Cui, Dianyin; Cao, Lu

    2016-01-01

    Automatic recognition of arrhythmias is particularly important in the diagnosis of heart diseases. This study presents an electrocardiogram (ECG) recognition system based on multi-domain feature extraction to classify ECG beats. An improved wavelet threshold method for ECG signal pre-processing is applied to remove noise interference. A novel multi-domain feature extraction method is proposed; this method employs kernel-independent component analysis in nonlinear feature extraction and uses discrete wavelet transform to extract frequency domain features. The proposed system utilises a support vector machine classifier optimized with a genetic algorithm to recognize different types of heartbeats. An ECG acquisition experimental platform, in which ECG beats are collected as ECG data for classification, is constructed to demonstrate the effectiveness of the system in ECG beat classification. The presented system, when applied to the MIT-BIH arrhythmia database, achieves a high classification accuracy of 98.8%. Experimental results based on the ECG acquisition experimental platform show that the system obtains a satisfactory classification accuracy of 97.3% and is able to classify ECG beats efficiently for the automatic identification of cardiac arrhythmias. PMID:27775596

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamieson, Kevin; Davis, IV, Warren L.

    Active learning methods automatically adapt data collection by selecting the most informative samples in order to accelerate machine learning. Because of this, real-world testing and comparing active learning algorithms requires collecting new datasets (adaptively), rather than simply applying algorithms to benchmark datasets, as is the norm in (passive) machine learning research. To facilitate the development, testing and deployment of active learning for real applications, we have built an open-source software system for large-scale active learning research and experimentation. The system, called NEXT, provides a unique platform for real-world, reproducible active learning research. This paper details the challenges of building the system and demonstrates its capabilities with several experiments. The results show how experimentation can help expose strengths and weaknesses of active learning algorithms, in sometimes unexpected and enlightening ways.
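
    A minimal sketch of the kind of adaptive data collection such a platform hosts, here plain uncertainty sampling with a logistic model; the synthetic pool and the use of scikit-learn are illustrative, not NEXT's own algorithms:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      X_pool = rng.standard_normal((500, 5))
      y_pool = (X_pool @ rng.standard_normal(5) > 0).astype(int)

      # Seed with one labeled example from each class, then query the
      # pool point whose prediction is closest to the decision boundary.
      labeled = [int(np.where(y_pool == 0)[0][0]),
                 int(np.where(y_pool == 1)[0][0])]
      for _ in range(20):                        # adaptive query rounds
          model = LogisticRegression().fit(X_pool[labeled], y_pool[labeled])
          proba = model.predict_proba(X_pool)[:, 1]
          uncertainty = np.abs(proba - 0.5)
          uncertainty[labeled] = np.inf          # never re-query a label
          labeled.append(int(np.argmin(uncertainty)))

      print("queried indices:", labeled[2:])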

  2. Urine sampling and collection system optimization and testing

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Geating, J. A.; Koesterer, M. G.

    1975-01-01

    A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.

  3. ARAN/GIS video integration.

    DOT National Transportation Integrated Search

    2001-08-01

    The Maine Department of Transportation (MDOT) operates an Automatic Road ANalyzer (ARAN) to collect roadway information to make pavement quality assessments. Nearly 9000 miles of roadway data are collected by the ARAN in a two-year cycle. The A...

  4. Convolutional neural networks with balanced batches for facial expressions recognition

    NASA Astrophysics Data System (ADS)

    Battini Sönmez, Elena; Cangelosi, Angelo

    2017-03-01

    This paper considers the issue of fully automatic emotion classification on 2D faces. In spite of the great effort made in recent years, traditional machine learning approaches based on hand-crafted feature extraction followed by a classification stage have failed to produce a real-time automatic facial expression recognition system. The proposed architecture uses Convolutional Neural Networks (CNN), which are built as a collection of interconnected processing elements loosely modeled on the human brain. The basic idea of CNNs is to learn a hierarchical representation of the input data, which results in better classification performance. In this work we present a block-based CNN algorithm, which uses noise as a data augmentation technique and builds batches with a balanced number of samples per class. The proposed architecture is a very simple yet powerful CNN, which can yield state-of-the-art accuracy on the very competitive benchmark of the Extended Cohn-Kanade database.
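
    A small sketch of the balanced-batch construction described above, drawing an equal number of samples per class and adding Gaussian noise as the augmentation; the shapes and noise scale are illustrative assumptions:

      import numpy as np

      def balanced_batch(X, y, per_class, noise_std=0.05, rng=None):
          # Draw an equal number of samples from every class and add
          # Gaussian noise as a simple data augmentation.
          rng = rng or np.random.default_rng()
          xs, ys = [], []
          for c in np.unique(y):
              idx = rng.choice(np.where(y == c)[0], per_class, replace=True)
              xs.append(X[idx] + rng.normal(0.0, noise_std, X[idx].shape))
              ys.append(np.full(per_class, c))
          return np.concatenate(xs), np.concatenate(ys)

      X = np.random.rand(300, 48, 48)     # toy face crops
      y = np.random.randint(0, 7, 300)    # 7 expression classes
      xb, yb = balanced_batch(X, y, per_class=8)
      print(xb.shape, np.bincount(yb))    # equal counts per class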

  5. A Machine Vision System for Automatically Grading Hardwood Lumber - (Proceedings)

    Treesearch

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; Philip A. Araman; Robert L. Brisbon

    1990-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  6. Design and analysis of an automatic method of measuring silicon-controlled-rectifier holding current

    NASA Technical Reports Server (NTRS)

    Maslowski, E. A.

    1971-01-01

    The design of an automated SCR holding-current measurement system is described. The circuits used in the measurement system were designed to meet the major requirements of automatic data acquisition, reliability, and repeatability. Performance data are presented and compared with calibration data. The data verified the accuracy of the measurement system. Data taken over a 48-hr period showed that the measurement system operated satisfactorily and met all the design requirements.

  7. 10 CFR 95.49 - Security of automatic data processing (ADP) systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Security of automatic data processing (ADP) systems. 95.49 Section 95.49 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.49 Security of...

  8. Image acquisition device of inspection robot based on adaptive rotation regulation of polarizer

    NASA Astrophysics Data System (ADS)

    Dong, Maoqi; Wang, Xingguang; Liang, Tao; Yang, Guoqing; Zhang, Chuangyou; Gao, Faqin

    2017-12-01

    An image acquisition device for an inspection robot with adaptive polarizer rotation adjustment is proposed. The device includes the inspection robot body, the image collecting mechanism, the polarizer, and the automatic polarizer actuating device. The image acquisition mechanism is mounted at the front of the inspection robot body to collect image data of the equipment in the substation. The polarizer is fixed on the automatic actuating device and installed in front of the image acquisition mechanism, such that the optical axis of the camera passes perpendicularly through the polarizer and the polarizer rotates about the optical axis of the visible-light camera. The simulation results show that the system solves the image blur caused by glare, reflections, and shadow, so that the robot can observe the details of the operating status of electrical equipment. Full coverage of the substation equipment inspection robot's observation targets is achieved, which ensures the safe operation of the substation equipment.

  9. An approach to the real time risk evaluation system of boreal forest fire

    NASA Astrophysics Data System (ADS)

    Nakau, K.; Fukuda, M.; Kimura, K.; Hayasaka, H.; Tani, H.; Kushida, K.

    2005-12-01

    Huge boreal forest fires may cause massive impacts not only on greenhouse gas emissions but also on local communities. Thus, it is important to control forest fires. We collected data on boreal forest fires, as satellite imagery and simultaneous fire observations, in Alaska and eastern Siberia during the summer fire seasons of the past three years. Fire observation data were collected from aircraft flying between Japan and Europe. Fire detection results were compared with the observed data to evaluate the accuracy and timeliness of automatic detection. NOAA and MODIS satellite images covering Alaska and eastern Siberia were collected. We are also developing a fire expansion simulation model to forecast the possible fire expansion area. On the basis of the fire expansion forecast, a risk analysis of possible fire expansion will be performed as a decision aid for fire-fighting activities. To identify the risk of boreal forest fire and public concern about forest fire, we collected local newspapers in Fairbanks, AK, and discuss the statistics of fire-related articles in the newspaper.

  10. 77 FR 64383 - Proposed Information Collection (Verification of VA Benefits) Activity: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0406] Proposed Information Collection... any VA-guaranteed loans on an automatic basis. DATES: Written comments and recommendations on the... written comments on the collection of information through the Federal Docket Management System (FDMS) at...

  11. HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation

    NASA Astrophysics Data System (ADS)

    Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.

    2006-03-01

    As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals, living or deceased. HIPAA requires security services supporting the implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. These controls, as proposed in the HIPAA Security Standards, take the form of audit trails here. Audit trails can be used for surveillance purposes, to detect when interesting events might be happening that warrant further investigation, or they can be used forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed the HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running in each PACS component computer and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. RIS-integrated PACS managers can then monitor and control the entire RIS-integrated PACS operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation and gives the preliminary results obtained with this monitoring system on a clinical RIS-integrated PACS.
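
    A minimal sketch of the agent-to-server audit flow described above; the event fields follow the spirit of DICOM Supplement 95 audit messages but are simplified, and the server address is a placeholder:

      import json
      import socket

      def send_audit_event(server, port, event):
          # Agent side: ship one audit-trail event as a JSON line.
          with socket.create_connection((server, port)) as s:
              s.sendall((json.dumps(event) + "\n").encode())

      event = {                      # simplified audit-trail record
          "event_id": "DataAccess",
          "outcome": "Success",
          "user_id": "radiologist01",
          "patient_id": "anon-4711",
          "host": socket.gethostname(),
      }
      # send_audit_event("monitor.example.org", 5140, event)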

  12. 78 FR 23546 - Proposed Extension of Approval of Information Collection; Comment Request: Virginia Graeme Baker...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-19

    ..., 2011. In addition to the anti-entrapment devices or systems, each public pool and spa in the United... release system; suction-limiting vent system; gravity drainage system; automatic pump shut-off system or...

  13. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

    To log board angle values during balance training, a dedicated measurement system had to be developed; it provides the data for a balance study using a smartcard. Data acquisition is automatic. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory is used. To read the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard reading programme is written in Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory, and each event is automatically saved to a log file for exact documentation. This system makes study development easy and time-saving.

  14. Documentation of Heritage Structures Through Geo-Crowdsourcing and Web-Mapping

    NASA Astrophysics Data System (ADS)

    Dhonju, H. K.; Xiao, W.; Shakya, B.; Mills, J. P.; Sarhosis, V.

    2017-09-01

    Heritage documentation has become increasingly urgent due to both natural impacts and human influences. The documentation of countless heritage sites around the globe is a massive project that requires significant amounts of financial and labour resources. With the concepts of volunteered geographic information (VGI) and citizen science, heritage data such as digital photographs can be collected through online crowd participation. Whilst photographs are not strictly geographic data, they can be geo-tagged by the participants. They can also be automatically geo-referenced into a global coordinate system if collected via mobile phones which are now ubiquitous. With the assistance of web-mapping, an online geo-crowdsourcing platform has been developed to collect and display heritage structure photographs. Details of platform development are presented in this paper. The prototype is demonstrated with several heritage examples. Potential applications and advancements are discussed.
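
    For instance, the geo-tag embedded by a smartphone camera can be read directly from a photograph's EXIF block. The sketch below assumes a recent version of the Pillow library and a hypothetical image file; the tag numbers follow the EXIF standard.

        from PIL import Image

        GPS_IFD = 0x8825  # EXIF pointer to the GPS information block

        def _to_degrees(dms, ref):
            """Convert EXIF (degrees, minutes, seconds) rationals to signed decimal degrees."""
            deg = float(dms[0]) + float(dms[1]) / 60.0 + float(dms[2]) / 3600.0
            return -deg if ref in ("S", "W") else deg

        def photo_location(path):
            """Return (lat, lon) of a geo-tagged photo, or None if no GPS data present."""
            exif = Image.open(path).getexif()
            gps = exif.get_ifd(GPS_IFD)
            if not gps:
                return None
            # EXIF GPS tags: 1 = LatitudeRef, 2 = Latitude, 3 = LongitudeRef, 4 = Longitude
            lat = _to_degrees(gps[2], gps[1])
            lon = _to_degrees(gps[4], gps[3])
            return lat, lon

        print(photo_location("temple_facade.jpg"))  # hypothetical crowdsourced image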

  15. Mobile text messaging solutions for obesity prevention

    NASA Astrophysics Data System (ADS)

    Akopian, David; Jayaram, Varun; Aaleswara, Lakshmipathi; Esfahanian, Moosa; Mojica, Cynthia; Parra-Medina, Deborah; Kaghyan, Sahak

    2011-02-01

    Cellular telephony has become a bright example of the co-evolution of human society and information technology. This trend is also reflected in health care and health promotion projects, which have included cell phones in the data collection and communication chain. While many successful projects have been realized, a review of phone-based data collection techniques reveals that the existing technologies do not completely address health promotion research needs. The paper presents approaches which close this gap by extending existing versatile platforms. The messaging systems are designed for health-promotion research to prevent obesity and obesity-related health disparities among low-income Latino adolescent girls. Messaging and polling mechanisms are used to communicate with the target constituency and automatically process response data. Preliminary survey data provide insight into phone availability and technology perception for the study group.

  16. Automated content and quality assessment of full-motion-video for the generation of meta data

    NASA Astrophysics Data System (ADS)

    Harguess, Josh

    2015-05-01

    Virtually all of the video data (and full-motion video (FMV)) currently collected and stored in support of missions has been corrupted to various extents by image acquisition and compression artifacts. Additionally, video collected by wide-area motion imagery (WAMI) surveillance systems, unmanned aerial vehicles (UAVs) and similar sources is often of low quality or otherwise corrupted, so that it is not worth storing or analyzing. To make progress on automatic video analysis, the first problem to solve is deciding whether the content of the video is worth analyzing at all. We present work in progress addressing three types of scenes typically found in real-world data stored in support of Department of Defense (DoD) missions: no or very little motion in the scene, large occlusions in the scene, and fast camera motion. Each of these produces video that is generally not usable by an analyst or automated algorithm for mission support and should therefore be removed or flagged to the user as such. We utilize recent computer vision advances in motion detection and optical flow to automatically assess FMV and to identify and generate meta-data (or tags) for video segments exhibiting the unwanted scenarios described above. Results are shown on representative real-world video data.
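
    A minimal sketch of this kind of screening, using OpenCV's Farneback dense optical flow (the abstract does not name a specific flow algorithm, and the thresholds and file name below are illustrative assumptions): near-zero median flow suggests a static scene, while large median flow over the whole frame suggests fast camera motion.

        import cv2
        import numpy as np

        def flag_frame_pair(prev_gray, curr_gray, static_thresh=0.1, fast_thresh=8.0):
            """Classify a frame pair as 'static', 'fast-camera-motion', or 'ok'."""
            flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag = np.linalg.norm(flow, axis=2)   # per-pixel motion magnitude
            med = np.median(mag)                 # robust summary of global motion
            if med < static_thresh:
                return "static"
            if med > fast_thresh:
                return "fast-camera-motion"
            return "ok"

        cap = cv2.VideoCapture("clip.mp4")       # hypothetical FMV clip
        ok, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            print(flag_frame_pair(prev, gray))
            prev = gray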

  17. Mining the Mind Research Network: A Novel Framework for Exploring Large Scale, Heterogeneous Translational Neuroscience Research Data Sources

    PubMed Central

    Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.

    2009-01-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such an NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult; optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, offering advantages that can greatly accelerate research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large-scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147

  18. Applications of digital image acquisition in anthropometry

    NASA Technical Reports Server (NTRS)

    Woolford, B.; Lewis, J. L.

    1981-01-01

    A description is given of a video kinesimeter, a device for the automatic real-time collection of kinematic and dynamic data. Based on the detection of a single bright spot by three TV cameras, the system provides automatic real-time recording of three-dimensional position and force data. It comprises three cameras, two incandescent lights, a voltage comparator circuit, a central control unit, and a mass storage device. The control unit determines the signal threshold for each camera before testing, sequences the lights, synchronizes and analyzes the scan voltages from the three cameras, digitizes force from a dynamometer, and codes the data for transmission to a floppy disk for recording. Two of the three cameras face each other along the 'X' axis; the third camera, which faces the center of the line between the first two, defines the 'Y' axis. An image from the 'Y' camera and either 'X' camera is necessary for determining the three-dimensional coordinates of the point.
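
    Under an idealized orthographic-view assumption (not stated in the original, which used a calibrated perspective setup), the camera geometry described above reduces to combining the coordinates seen by the 'Y' camera, which observes the X-Z plane, with those of an 'X' camera, which observes the Y-Z plane.

        def reconstruct_point(y_cam_xy, x_cam_xy):
            """Combine 2D detections of the single bright spot into a 3D point.
            y_cam_xy: (x, z) image coordinates from the 'Y'-axis camera.
            x_cam_xy: (y, z) image coordinates from an 'X'-axis camera.
            Assumes orthographic views with identically scaled, aligned image axes."""
            x, z1 = y_cam_xy
            y, z2 = x_cam_xy
            z = 0.5 * (z1 + z2)  # both cameras see height; average the two estimates
            return (x, y, z)

        print(reconstruct_point((0.12, 1.05), (-0.30, 1.01)))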

  19. A Machine Vision System for Automatically Grading Hardwood Lumber - (Industrial Metrology)

    Treesearch

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas T. Drayer; Philip A. Araman; Robert L. Brisbon

    1992-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  20. Systems and methods for data quality control and cleansing

    DOEpatents

    Wenzel, Michael; Boettcher, Andrew; Drees, Kirk; Kummer, James

    2016-05-31

    A method for detecting and cleansing suspect building automation system data is shown and described. The method includes using processing electronics to automatically determine which of a plurality of error detectors and which of a plurality of data cleansers to use with building automation system data. The method further includes using processing electronics to automatically detect errors in the data and cleanse the data using a subset of the error detectors and a subset of the cleansers.
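
    A minimal sketch of the detector/cleanser pattern the claim describes; the specific detectors and cleansers here (a range check and a spike check paired with linear interpolation) are illustrative assumptions, not the patent's.

        import numpy as np

        def detect_out_of_range(x, lo, hi):
            return (x < lo) | (x > hi)

        def detect_spikes(x, z=4.0):
            """Flag points more than z robust standard deviations from the median."""
            mad = np.median(np.abs(x - np.median(x))) + 1e-12
            return np.abs(x - np.median(x)) > z * 1.4826 * mad

        def cleanse_interpolate(x, bad):
            """Replace flagged samples by linear interpolation over good neighbours."""
            x = x.astype(float).copy()
            idx = np.arange(len(x))
            x[bad] = np.interp(idx[bad], idx[~bad], x[~bad])
            return x

        temps = np.array([21.0, 21.2, 85.0, 21.4, 21.3, -40.0, 21.5])  # suspect samples
        bad = detect_out_of_range(temps, 10, 35) | detect_spikes(temps)
        print(cleanse_interpolate(temps, bad))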

  1. Design of an automatic production monitoring system on job shop manufacturing

    NASA Astrophysics Data System (ADS)

    Prasetyo, Hoedi; Sugiarto, Yohanes; Rosyidi, Cucuk Nur

    2018-02-01

    Every production process requires a monitoring system so that the desired efficiency and productivity can be checked at any time. Such a system is also needed in job shop manufacturing, where performance is mainly driven by the manufacturing lead time; processing time is one of the factors that affect the manufacturing lead time. In a conventional company, processing time is recorded manually by the operator on a sheet of paper, a method that is prone to errors. This paper aims to overcome this problem with a system able to record and monitor processing times automatically. The solution is realized using an electric current sensor, barcodes, RFID, a wireless network and a Windows-based application. An automatic monitoring device is attached to the production machine and equipped with a touch-screen LCD so that the operator can use it easily. Operator identity is recorded through RFID embedded in the operator's ID card. The workpiece data are retrieved from the database by scanning the barcode listed on its monitoring sheet. A sensor mounted on the machine measures the actual machining time. The system's outputs are the actual processing time and machine capacity information. The system is connected wirelessly to the firm's workshop planning application. Test results indicated that all functions of the system run properly, enabling supervisors, PPIC or higher-level management staff to monitor processing times quickly and with better accuracy.
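
    The core of such a device is simple: sample the motor current and accumulate the time the machine draws more than an idle-level current. A sketch under assumed names, sampling rate, and threshold:

        import numpy as np

        def machining_time(current_samples, dt_s, on_threshold_a=2.0):
            """Estimate actual machining time from motor-current samples.
            current_samples: amperes, sampled every dt_s seconds.
            on_threshold_a: current above which the machine is considered cutting."""
            running = np.asarray(current_samples) > on_threshold_a
            return running.sum() * dt_s  # seconds of machine-on time

        # one sample per second: idle, then a 4 s cut, then idle again
        samples = [0.3, 0.4, 6.1, 6.3, 6.2, 6.0, 0.5, 0.4]
        print(machining_time(samples, dt_s=1.0))  # -> 4.0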

  2. Comparison of manual versus semiautomatic milk recording systems in dairy goats.

    PubMed

    Ait-Saidi, A; Caja, G; Carné, S; Salama, A A K; Ghirardi, J J

    2008-04-01

    A total of 24 Murciano-Granadina dairy goats in early-mid lactation were used to compare the labor time and data collection efficiency of manual (M) vs. semiautomated (SA) systems for milk recording. Goats were milked once daily in a 2 x 12 parallel platform, with 6 milking units on each side. The M system used visual identification (ID) by large plastic ear tags, on-paper data recording, and manual uploading of data to a computer. The SA system used electronic ID, automatic ID, manual data recording on the reader keyboard, and automatic data uploading to the computer by Bluetooth connection. Data were collected for groups of 2 x 12 goats on 15 test days for each system during a period of 70 d. Time data were converted to a decimal scale. No difference in milk recording time between M and SA (1.32 +/- 0.03 and 1.34 +/- 0.03 min/goat, respectively) was observed. The time needed to transfer data to the computer was greater for M than for SA (0.20 +/- 0.01 vs. 0.05 +/- 0.01 min/goat). Overall milk recording time was greater for M than for SA (1.52 +/- 0.04 vs. 1.39 +/- 0.04 min/goat), the latter decreasing with operator training. Time for transferring milk recording data to the computer was 4.81 +/- 0.34 and 1.09 +/- 0.10 min for M and SA groups of 24 goats, respectively, but increased by only 0.19 min in SA for each additional 24 goats. No difference in errors of data acquisition was detected between the M and SA systems during milk recording (0.6%), but an additional 1.1% error was found in the M system during data uploading. Predicted differences between M and SA increased with the number of goats processed on the test day. The reduction in labor time cost ranged from €0.5 to 12.9 (US$0.7 to 17.4) per milk recording for herds of 24 to 480 goats, and amounted to 40% of the electronic ID costs. In conclusion, electronic ID was more efficient in terms of labor costs and resulted in fewer data errors, the benefit being greater with trained operators and larger goat herds.

  3. ORAC-DR -- integral field spectroscopy data reduction

    NASA Astrophysics Data System (ADS)

    Todd, Stephen

    ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce integral field unit (IFU) data collected at the United Kingdom Infrared Telescope (UKIRT) with the UIST instrument.

  4. Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR

    NASA Astrophysics Data System (ADS)

    Gao, Yang; Zhong, Ruofei; Tang, Tao; Wang, Liuzhao; Liu, Xianlin

    2017-08-01

    Pavement markings provide an important foundation for keeping road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful in developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information: they directly obtain the three-dimensional (3D) coordinates of objects, capturing both spatial data and the return intensity of 3D objects in a fast and efficient way, while the RGB attributes of data points can be obtained from the system's panoramic camera. In this paper, we present a novel method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. The method uses the differential grayscale of RGB color, laser pulse reflection intensity, and differential intensity to identify and extract pavement markings. We use point cloud density to remove noise and morphological operations to eliminate errors. We tested the method on different road sections in Beijing, China, and Buffalo, NY, USA. The results indicated that both correctness (p) and completeness (r) were higher than 90%. The method can be applied to extract pavement markings from the huge point clouds produced by mobile LiDAR.
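
    A reduced sketch of the intensity-plus-density idea (the thresholds and grid size are assumptions, and the paper additionally uses RGB and differential attributes): keep high-intensity returns, then drop points whose ground-grid cell is sparsely populated.

        import numpy as np

        def extract_markings(points, intensity, i_thresh=0.7, cell=0.2, min_pts=5):
            """points: (N, 3) xyz array; intensity: (N,) normalized to 0..1.
            Returns the subset of points likely to belong to painted markings."""
            keep = intensity > i_thresh                 # markings reflect strongly
            pts = points[keep]
            # density filter: count candidates per (cell x cell) ground cell
            cells = np.floor(pts[:, :2] / cell).astype(int)
            _, inverse, counts = np.unique(cells, axis=0, return_inverse=True,
                                           return_counts=True)
            dense = counts[inverse] >= min_pts          # drop sparse, noisy cells
            return pts[dense]

        rng = np.random.default_rng(1)
        pts = rng.uniform(0, 10, size=(5000, 3))
        inten = rng.uniform(0, 1, size=5000)
        print(extract_markings(pts, inten).shape)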

  5. Data management and data enrichment for systems biology projects.

    PubMed

    Wittig, Ulrike; Rey, Maja; Weidemann, Andreas; Müller, Wolfgang

    2017-11-10

    Collecting, curating, interlinking, and sharing high-quality data are central to de.NBI-SysBio, the systems biology data management service center within the de.NBI network (German Network for Bioinformatics Infrastructure). The work of the center is guided by the FAIR principles for scientific data management and stewardship. FAIR stands for the four foundational principles Findability, Accessibility, Interoperability, and Reusability, which were established to enhance the ability of machines to automatically find, access, exchange and use data. In this overview paper we describe three tools (SABIO-RK, Excemplify, SEEK) that exemplify the contribution of de.NBI-SysBio services to FAIR storage and exchange of data, models, and experimental methods. The interconnectivity of the tools and the data workflow within systems biology projects are explained. For many years we have been the German partner in the FAIRDOM initiative (http://fair-dom.org) to establish a European data and model management service facility for systems biology.

  6. Cloud Detection from Satellite Imagery: A Comparison of Expert-Generated and Automatically-Generated Decision Trees

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar

    2004-01-01

    Automated cloud detection and tracking is an important step in assessing global climate change via remote sensing. Cloud masks, which indicate whether individual pixels depict clouds, are included in many of the data products based on data acquired on board Earth satellites. Many cloud-mask algorithms take the form of decision trees, which employ sequential tests that scientists designed based on empirical astrophysics studies and astrophysics simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In this study we explored the potential benefits of automatically learned decision trees for detecting clouds from images acquired using the Advanced Very High Resolution Radiometer (AVHRR) instrument on board the NOAA-14 weather satellite of the National Oceanic and Atmospheric Administration. We constructed three decision trees for a sample of 8 km daily AVHRR data from 2000 using a decision-tree learning procedure provided within MATLAB(R), and compared the accuracy of the decision trees to the accuracy of the cloud mask. We used ground observations collected by the National Aeronautics and Space Administration's Clouds and the Earth's Radiant Energy System S'COOL project as the gold standard. For the sample data, the accuracy of the automatically learned decision trees was greater than the accuracy of the cloud masks included in the AVHRR data product.
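
    The learning step can be reproduced in outline with any decision-tree learner; scikit-learn is used below in place of the MATLAB procedure, and the feature names and synthetic training data are assumptions.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n = 1000
        # assumed per-pixel features: two reflectances and a brightness temperature
        X = np.column_stack([rng.uniform(0, 1, n),        # channel-1 reflectance
                             rng.uniform(0, 1, n),        # channel-2 reflectance
                             rng.uniform(220, 300, n)])   # channel-4 temperature (K)
        # synthetic labels: cold, bright pixels tend to be cloudy
        y = ((X[:, 2] < 260) & (X[:, 0] > 0.4)).astype(int)

        tree = DecisionTreeClassifier(max_depth=4).fit(X[:800], y[:800])
        print("held-out accuracy:", tree.score(X[800:], y[800:]))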

  7. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    NASA Astrophysics Data System (ADS)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically ineffective. Photogrammetric techniques can be deployed to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aerial Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data were collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week, with the in-situ measurements timed to coincide with the UAV data acquisition; this correlation aimed at investigating optimal flight conditions and parameter settings for image acquisition. The collected images are processed in a state-of-the-art tool to generate dense 3D point clouds, and an algorithm was developed to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the manual reference measurements, allowing the automatic growth estimation process to be evaluated. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.
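
    Once a dense point cloud is available, a height estimate for a segmented tree reduces to the difference between the canopy top and the local ground level. The percentile-based ground estimate below is an assumption, not necessarily the paper's algorithm.

        import numpy as np

        def tree_height(tree_points):
            """tree_points: (N, 3) xyz points belonging to one segmented tree.
            Ground level is taken as a low z-percentile to resist outliers."""
            z = tree_points[:, 2]
            ground = np.percentile(z, 1)   # robust local ground estimate
            top = z.max()                  # tree top
            return top - ground

        rng = np.random.default_rng(3)
        pts = np.column_stack([rng.normal(0, 0.5, 2000),
                               rng.normal(0, 0.5, 2000),
                               rng.uniform(100.0, 101.8, 2000)])  # ~1.8 m young pine
        print(round(tree_height(pts), 2))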

  8. Remote sensing-based detection and quantification of roadway debris following natural disasters

    NASA Astrophysics Data System (ADS)

    Axel, Colin; van Aardt, Jan A. N.; Aros-Vera, Felipe; Holguín-Veras, José

    2016-05-01

    Rapid knowledge of road network conditions is vital to formulating an efficient emergency response plan following any major disaster. Fallen buildings, immobile vehicles, and other forms of debris often render roads impassable to responders. The status of roadways is generally determined through time- and resource-heavy methods, such as field surveys and manual interpretation of remotely sensed imagery. Airborne lidar systems provide an alternative, cost-effective option for performing network assessments: the 3D data can be collected quickly over a wide area and provide valuable insight into the geometry and structure of the scene. This paper presents a method for automatically detecting and characterizing debris in roadways using airborne lidar data. Points falling within the road extent are extracted from the point cloud and clustered into individual objects using region growing. Objects are classified as debris or non-debris using surface properties and contextual cues. Debris piles are reconstructed as surfaces using alpha shapes, from which an estimate of debris volume can be computed. Results using real lidar data collected after a natural disaster are presented. Initial results indicate that accurate debris maps can be automatically generated using the proposed method. These debris maps would be an invaluable asset to disaster management and emergency response teams attempting to reach survivors despite a crippled transportation network.
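
    The paper reconstructs debris surfaces with alpha shapes; as a simpler stand-in, the volume of a detected debris cluster can be approximated with a 3D convex hull, which will overestimate concave piles. SciPy's ConvexHull is assumed below.

        import numpy as np
        from scipy.spatial import ConvexHull

        def debris_volume(cluster_points):
            """Approximate volume (m^3) of a debris point cluster via its convex hull."""
            return ConvexHull(cluster_points).volume

        rng = np.random.default_rng(7)
        pile = rng.uniform([0, 0, 0], [2.0, 3.0, 0.8], size=(500, 3))  # ~2x3x0.8 m pile
        print(round(debris_volume(pile), 2), "m^3")  # close to, but below, 4.8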

  9. Towards the automatic scanning of indoors with robots.

    PubMed

    Adán, Antonio; Quintana, Blanca; Vázquez, Andres S; Olivares, Alberto; Parra, Eduardo; Prieto, Samuel

    2015-05-19

    This paper is framed in the research fields of 3D digitization and intelligent 3D data processing. Our objective is to develop a set of techniques for the automatic creation of simple three-dimensional indoor models with mobile robots. The document presents the principal steps of the process, the experimental setup and the results achieved. We distinguish between the stages concerning intelligent data acquisition and 3D data processing; this paper focuses on the first stage. We show how the mobile robot, which carries a 3D scanner, is able, on the one hand, to make decisions about the next best scanner position and, on the other hand, to navigate autonomously in the scene with the help of the data collected from earlier scans. After this stage, millions of 3D data points are converted into a simplified 3D indoor model. The robot imposes a stopping criterion when the whole point cloud covers the essential parts of the scene. The system has been tested under real conditions indoors with promising results. Future work will extend the method to much more complex and larger scenarios.

  10. Towards the Automatic Scanning of Indoors with Robots

    PubMed Central

    Adán, Antonio; Quintana, Blanca; Vázquez, Andres S.; Olivares, Alberto; Parra, Eduardo; Prieto, Samuel

    2015-01-01

    This paper is framed in the research fields of 3D digitization and intelligent 3D data processing. Our objective is to develop a set of techniques for the automatic creation of simple three-dimensional indoor models with mobile robots. The document presents the principal steps of the process, the experimental setup and the results achieved. We distinguish between the stages concerning intelligent data acquisition and 3D data processing; this paper focuses on the first stage. We show how the mobile robot, which carries a 3D scanner, is able, on the one hand, to make decisions about the next best scanner position and, on the other hand, to navigate autonomously in the scene with the help of the data collected from earlier scans. After this stage, millions of 3D data points are converted into a simplified 3D indoor model. The robot imposes a stopping criterion when the whole point cloud covers the essential parts of the scene. The system has been tested under real conditions indoors with promising results. Future work will extend the method to much more complex and larger scenarios. PMID:25996513

  11. Generalized Self-Organizing Maps for Automatic Determination of the Number of Clusters and Their Multiprototypes in Cluster Analysis.

    PubMed

    Gorzalczany, Marian B; Rudzinski, Filip

    2017-06-07

    This paper presents a generalization of self-organizing maps with 1-D neighborhoods (neuron chains) that can be effectively applied to complex cluster analysis problems. The essence of the generalization consists in introducing mechanisms that allow the neuron chain, during learning, to disconnect into subchains, to reconnect some of the subchains again, and to dynamically regulate the overall number of neurons in the system. These features enable the network, working in a fully unsupervised way (i.e., using unlabeled data without a predefined number of clusters), to automatically generate collections of multiprototypes able to represent a broad range of clusters in data sets. First, the operation of the proposed approach is illustrated on synthetic data sets. Then, the technique is tested on several real-life, complex, multidimensional benchmark data sets available from the University of California at Irvine (UCI) Machine Learning repository and the Knowledge Extraction based on Evolutionary Learning data set repository. A sensitivity analysis of the approach to changes in control parameters and a comparative analysis with an alternative approach are also performed.
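
    For context, the classical SOM with a 1-D neighborhood that the paper generalizes can be sketched in a few lines. The subchain disconnect/reconnect and neuron-count mechanisms that constitute the paper's contribution are deliberately omitted, and the learning-rate and neighborhood parameters are assumptions.

        import numpy as np

        def train_som_chain(data, n_neurons=20, epochs=50, lr=0.5, sigma=3.0, seed=0):
            """Classical self-organizing map with a 1-D neighborhood (neuron chain)."""
            rng = np.random.default_rng(seed)
            w = data[rng.choice(len(data), n_neurons, replace=False)].astype(float)
            pos = np.arange(n_neurons)                      # index along the chain
            for t in range(epochs):
                decay = np.exp(-t / epochs)                 # shrink lr and sigma over time
                for x in data[rng.permutation(len(data))]:
                    winner = np.argmin(np.linalg.norm(w - x, axis=1))
                    h = np.exp(-(pos - winner) ** 2 / (2 * (sigma * decay) ** 2))
                    w += lr * decay * h[:, None] * (x - w)  # pull neighbourhood toward x
            return w

        rng = np.random.default_rng(1)
        data = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(6, 1, (200, 2))])
        print(train_som_chain(data).round(1))               # prototypes cover both clusters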

  12. Automatic measurement; Mesures automatiques (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringeard, C.

    1974-11-28

    Thanks to its ability to sequence operations and to memorize the data collected, the computer can introduce a statistical approach into the evaluation of a result. To benefit fully from the advantages of automation, a special effort was made to reduce programming time to a minimum and to simplify link-ups between the existing system and instruments from different sources. The practical solution adopted by the test laboratory of the C.E.A. Centralized Administration Groupe (GEC) is given.

  13. Validation of Computerized Automatic Calculation of the Sequential Organ Failure Assessment Score

    PubMed Central

    Harrison, Andrew M.; Pickering, Brian W.; Herasevich, Vitaly

    2013-01-01

    Purpose. To validate the use of a computer program for the automatic calculation of the sequential organ failure assessment (SOFA) score, as compared to the gold standard of manual chart review. Materials and Methods. Adult admissions (age > 18 years) to the medical ICU with a length of stay greater than 24 hours were studied in the setting of an academic tertiary referral center. A retrospective cross-sectional analysis was performed using a derivation cohort to compare automatic calculation of the SOFA score to the gold standard of manual chart review. After critical appraisal of sources of disagreement, another analysis was performed using an independent validation cohort. Then, a prospective observational analysis was performed using an implementation of this computer program in AWARE Dashboard, which is an existing real-time patient EMR system for use in the ICU. Results. Good agreement between the manual and automatic SOFA calculations was observed for both the derivation (N=94) and validation (N=268) cohorts: 0.02 ± 2.33 and 0.29 ± 1.75 points, respectively. These results were validated in AWARE (N=60). Conclusion. This EMR-based automatic tool accurately calculates SOFA scores and can facilitate ICU decisions without the need for manual data collection. This tool can also be employed in a real-time electronic environment. PMID:23936639
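
    Three of the six SOFA organ scores can be computed from routine laboratory values alone. The sketch below covers that portion (coagulation, liver, and renal subscores) using the cut-offs of the standard SOFA definition; the full score also needs respiratory, cardiovascular, and neurological data, and any clinical use would need verification against the local protocol.

        def sofa_labs(platelets_k_per_ul, bilirubin_mg_dl, creatinine_mg_dl):
            """Partial SOFA score from labs (coagulation + liver + renal subscores)."""
            def band(value, cuts):
                # cuts are ascending thresholds; score = number of thresholds crossed
                return sum(value >= c for c in cuts)

            coag = 4 - band(platelets_k_per_ul, (20, 50, 100, 150))   # lower is worse
            liver = band(bilirubin_mg_dl, (1.2, 2.0, 6.0, 12.0))
            renal = band(creatinine_mg_dl, (1.2, 2.0, 3.5, 5.0))
            return coag + liver + renal

        print(sofa_labs(platelets_k_per_ul=90, bilirubin_mg_dl=2.5,
                        creatinine_mg_dl=1.0))
        # coagulation 2 + liver 2 + renal 0 -> 4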

  14. The Additional Secondary Phase Correction System for AIS Signals

    PubMed Central

    Wang, Xiaoye; Zhang, Shufang; Sun, Xiaowen

    2017-01-01

    This paper looks at the development and implementation of a real-time correction system for the additional secondary phase factor (ASF) of the Automatic Identification System (AIS) signal. A large number of test data were collected using the developed ASF correction system, and the propagation characteristics of the AIS signal transmitted at sea and the real-time ASF correction algorithm for the AIS signal were analyzed and verified. Accounting for the different receiver hardware in the land-based positioning system and the variation of actual environmental factors, the ASF correction system corrects the original measurements of positioning receivers in real time and provides corrected positioning accuracy within 10 m. PMID:28362330

  15. Creating a medical English-Swedish dictionary using interactive word alignment.

    PubMed

    Nyström, Mikael; Merkel, Magnus; Ahrenberg, Lars; Zweigenbaum, Pierre; Petersson, Håkan; Ahlfeldt, Hans

    2006-10-12

    This paper reports on a parallel collection of rubrics from the medical terminology systems ICD-10, ICF, MeSH, NCSP and KSH97-P and its use for the semi-automatic creation of an English-Swedish dictionary of medical terminology. The methods presented are relevant for many West European language pairs other than English-Swedish. The medical terminology systems were collected in electronic format in both English and Swedish and the rubrics were extracted in parallel language pairs. Initially, interactive word alignment was used to create training data from a sample. The training data were then utilized in automatic word alignment to generate candidate term pairs, and the final step was manual verification of the term pair candidates. A dictionary of 31,000 verified entries was created in less than three man-weeks, with considerably less time and effort than a manual approach would require, and without compromising quality. As a side effect of our work we found 40 different translation problems in the terminology systems; these results indicate the power of the method for finding inconsistencies in terminology translations. We also report on some factors that may contribute to making the process of dictionary creation with similar tools even more expedient. Finally, the contribution is discussed in relation to other ongoing efforts in constructing medical lexicons for non-English languages. In three man-weeks we were able to produce a medical English-Swedish dictionary consisting of 31,000 entries and also found hidden translation errors in the utilized medical terminology systems.

  16. Creating a medical English-Swedish dictionary using interactive word alignment

    PubMed Central

    Nyström, Mikael; Merkel, Magnus; Ahrenberg, Lars; Zweigenbaum, Pierre; Petersson, Håkan; Åhlfeldt, Hans

    2006-01-01

    Background This paper reports on a parallel collection of rubrics from the medical terminology systems ICD-10, ICF, MeSH, NCSP and KSH97-P and its use for the semi-automatic creation of an English-Swedish dictionary of medical terminology. The methods presented are relevant for many West European language pairs other than English-Swedish. Methods The medical terminology systems were collected in electronic format in both English and Swedish and the rubrics were extracted in parallel language pairs. Initially, interactive word alignment was used to create training data from a sample. The training data were then utilized in automatic word alignment to generate candidate term pairs, and the final step was manual verification of the term pair candidates. Results A dictionary of 31,000 verified entries was created in less than three man-weeks, with considerably less time and effort than a manual approach would require, and without compromising quality. As a side effect of our work we found 40 different translation problems in the terminology systems; these results indicate the power of the method for finding inconsistencies in terminology translations. We also report on some factors that may contribute to making the process of dictionary creation with similar tools even more expedient. Finally, the contribution is discussed in relation to other ongoing efforts in constructing medical lexicons for non-English languages. Conclusion In three man-weeks we were able to produce a medical English-Swedish dictionary consisting of 31,000 entries and also found hidden translation errors in the utilized medical terminology systems. PMID:17034649
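
    Candidate term pairs can be generated from parallel rubrics with simple co-occurrence statistics. The Dice-coefficient scorer below is a common baseline and an assumption here, since the paper relies on dedicated interactive and automatic word-alignment tools.

        from collections import Counter
        from itertools import product

        def dice_candidates(parallel_rubrics, min_score=0.5):
            """parallel_rubrics: list of (english_rubric, swedish_rubric) strings.
            Returns (en_word, sv_word, score) candidate pairs ranked by Dice."""
            en_freq, sv_freq, pair_freq = Counter(), Counter(), Counter()
            for en, sv in parallel_rubrics:
                en_words, sv_words = set(en.lower().split()), set(sv.lower().split())
                en_freq.update(en_words)
                sv_freq.update(sv_words)
                pair_freq.update(product(en_words, sv_words))
            scored = [(e, s, 2 * c / (en_freq[e] + sv_freq[s]))
                      for (e, s), c in pair_freq.items()]
            return sorted((x for x in scored if x[2] >= min_score),
                          key=lambda x: -x[2])

        rubrics = [("acute appendicitis", "akut appendicit"),
                   ("acute bronchitis", "akut bronkit"),
                   ("chronic bronchitis", "kronisk bronkit")]
        print(dice_candidates(rubrics)[:5])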

  17. Development of Mobile Mapping System for 3D Road Asset Inventory.

    PubMed

    Sairam, Nivedita; Nagarajan, Sudhagar; Ornitz, Scott

    2016-03-12

    Asset management is an important component of an infrastructure project, and a significant cost is involved in maintaining and updating asset information. Data collection is the most time-consuming task in the development of an asset management system. To reduce the time and cost involved in data collection, this paper proposes a low-cost Mobile Mapping System equipped with a laser scanner and cameras. First, the feasibility of low-cost sensors for 3D asset inventory is discussed by deriving appropriate sensor models. Then, through calibration procedures, the respective alignments of the laser scanner, cameras, Inertial Measurement Unit and GPS (Global Positioning System) antenna are determined. The efficiency of the Mobile Mapping System is evaluated by mounting it on a truck and a golf cart. Using the derived sensor models, geo-referenced images and 3D point clouds are produced. After validating the quality of the derived data, the paper provides a framework to extract road assets both automatically and manually using techniques implementing RANSAC plane fitting and edge extraction algorithms. Finally, the scope of such extraction techniques is discussed, along with a sample GIS (Geographic Information System) database structure for a unified 3D asset inventory.

  18. Development of Mobile Mapping System for 3D Road Asset Inventory

    PubMed Central

    Sairam, Nivedita; Nagarajan, Sudhagar; Ornitz, Scott

    2016-01-01

    Asset management is an important component of an infrastructure project, and a significant cost is involved in maintaining and updating asset information. Data collection is the most time-consuming task in the development of an asset management system. To reduce the time and cost involved in data collection, this paper proposes a low-cost Mobile Mapping System equipped with a laser scanner and cameras. First, the feasibility of low-cost sensors for 3D asset inventory is discussed by deriving appropriate sensor models. Then, through calibration procedures, the respective alignments of the laser scanner, cameras, Inertial Measurement Unit and GPS (Global Positioning System) antenna are determined. The efficiency of the Mobile Mapping System is evaluated by mounting it on a truck and a golf cart. Using the derived sensor models, geo-referenced images and 3D point clouds are produced. After validating the quality of the derived data, the paper provides a framework to extract road assets both automatically and manually using techniques implementing RANSAC plane fitting and edge extraction algorithms. Finally, the scope of such extraction techniques is discussed, along with a sample GIS (Geographic Information System) database structure for a unified 3D asset inventory. PMID:26985897
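
    A compact version of the RANSAC plane-fitting step mentioned above; the iteration count and inlier tolerance are assumptions.

        import numpy as np

        def ransac_plane(points, n_iter=200, tol=0.05, seed=0):
            """Fit a plane to (N, 3) points; returns (unit normal, d) with n.p + d = 0."""
            rng = np.random.default_rng(seed)
            best, best_inliers = None, -1
            for _ in range(n_iter):
                p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
                n = np.cross(p2 - p1, p3 - p1)
                if np.linalg.norm(n) < 1e-9:        # degenerate (collinear) sample
                    continue
                n = n / np.linalg.norm(n)
                d = -n.dot(p1)
                inliers = np.sum(np.abs(points @ n + d) < tol)
                if inliers > best_inliers:
                    best, best_inliers = (n, d), inliers
            return best

        # mostly flat "road" with some off-plane clutter
        rng = np.random.default_rng(1)
        road = np.column_stack([rng.uniform(0, 10, 900), rng.uniform(0, 10, 900),
                                rng.normal(0.0, 0.01, 900)])
        clutter = rng.uniform(0, 10, (100, 3))
        n, d = ransac_plane(np.vstack([road, clutter]))
        print(n.round(2), round(d, 2))   # normal should be close to (0, 0, 1)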

  19. The Environment-Power System Analysis Tool development program [for spacecraft power supplies]

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Wilcox, Katherine G.; Stevens, N. John; Putnam, Rand M.; Roche, James C.

    1989-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide engineers with the ability to assess the effects of a broad range of environmental interactions on space power systems. A unique user-interface-data-dictionary code architecture oversees a collection of existing and future environmental modeling codes (e.g., neutral density) and physical interaction models (e.g., sheath ionization). The user-interface presents the engineer with tables, graphs, and plots which, under supervision of the data dictionary, are automatically updated in response to parameter change. EPSAT thus provides the engineer with a comprehensive and responsive environmental assessment tool and the scientist with a framework into which new environmental or physical models can be easily incorporated.

  20. Integrating clinical and biological information in a shanghai biobank: an introduction to the sample repository and information sharing platform project.

    PubMed

    Cui, Wenbin; Zheng, Peiyong; Yang, Jiahong; Zhao, Rong; Gao, Jiechun; Yu, Guangjun

    2015-02-01

    Biobanks are important resources and central tools for translational medicine, which brings scientific research outcomes to clinical practice. The key purpose of biobanking in translational medicine and other medical research is to provide biological samples that are integrated with clinical information. In 2008, the Shanghai Municipal Government launched the "Shanghai Tissue Bank" in an effort to promote research in translational medicine. A sharing service platform has now been constructed to integrate clinical and biological information that can be used in diverse medical and pharmaceutical research studies. The platform collects two kinds of data: sample data and clinical data. The sample data are obtained from the hospital biobank management system and mainly include the donors' age, gender, marital status, sample source, sample type, collection time, deposit time, and storage method. The clinical data are collected from the "Hospital-Link" system (a medical information sharing system that connects 23 tertiary hospitals in Shanghai); the main contents include the donors' corresponding medication information, test reports, inspection reports, and hospital information. As of the end of September 2014, the project had collected 16,020 donors and 148,282 samples from 12 medical institutions, and had automatically acquired the donors' corresponding clinical data from the "Hospital-Link" system for 6830 occurrences. This project will contribute to scientific research at medical institutions in Shanghai and will also support the development of the biopharmaceutical industry. In this article, we describe the significance, construction phases, application prospects, and benefits of the sample repository and information sharing service platform.

  1. Hydrologic Monitoring in the Deep Subsurface to Support Repository Performance

    NASA Astrophysics Data System (ADS)

    Hubbell, J. M.; Heath, G. L.; Scott, C. L.

    2007-12-01

    The INL has installed and operated several vadose zone and ground water monitoring systems at arid and humid sites to depths of about 200 m. Some of these systems have been in continuous operation for over 12 years. It is important that the systems be physically robust and simple, yet versatile enough to operate for extended time periods with little or no maintenance. Monitoring instruments are frequently installed and run to characterize the site, collect data during site operation, and continue running for long-term stewardship, necessitating sensors that can be maintained or serviced. Sensors are carefully chosen based on the perceived data requirements over the life of the site. Emphasis is given to direct measurements such as tensiometers (portable and advanced), neutron probes, drain gauges, temperature, and wells or sampling for fluids and gases. Complementary data can include TDR/capacitance, radiation detectors, and larger-scale geophysical techniques (3-D resistivity and EM) for volumetric measurements. Commercially available instruments may have to be modified for use at greater depths, to allow multiple instruments in a single borehole, or to perform the intended monitoring function. Access tubes (some open at the bottom) can be placed to allow insertion of multiple sensors (radiation, neutron and portable sensors/samplers), future drilling/sampling, and installation of new instruments at a later time. The installation techniques and backfill materials must be chosen, and the measurement technique tested, to ensure representative data collection for the parameters of interest. The data collection system can be linked to climatic data (precipitation, barometric pressure, snow depth, runoff, surface water sources) that may influence the site's subsurface hydrology. The instruments are then connected to a real-time automated system that collects, stores, and provides access to the data. These systems allow easy access, automatic data quality checks with notification, and processing and presentation of the data in real time through the web; they can also be designed to manipulate and test the system remotely. Data from several sites will be presented showing that continuous monitoring is necessary to detect rapid changes in the deep vadose zone and ground water at fractured rock sites.

  2. Mapping AIS coverage for trusted surveillance

    NASA Astrophysics Data System (ADS)

    Lapinski, Anna-Liesa S.; Isenor, Anthony W.

    2010-10-01

    Automatic Identification System (AIS) is an unattended vessel reporting system developed for collision avoidance. Shipboard AIS equipment automatically broadcasts vessel positional data at regular intervals. The real-time position and identity data from a vessel are received by other vessels in the area, thereby assisting with local navigation. AIS broadcasts are also beneficial to those concerned with coastal and harbour security, and land-based AIS receiving stations can collect them as well. However, reception at a land station depends on the ship's position relative to the receiving station. For AIS to be used as a trusted surveillance system, the characteristics of the AIS coverage area in the vicinity of the station (or stations) should be understood. This paper presents some results of a method being investigated at DRDC Atlantic (Canada) to map the AIS coverage characteristics of a dynamic AIS reception network. The method is shown to clearly distinguish AIS reception edges from edges caused by vessel traffic patterns. The method can also be used to identify temporal changes in the coverage area, an important characteristic for local maritime security surveillance activities. Future research using the coverage estimate technique is also proposed to support surveillance activities.
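
    Coverage mapping of this kind can be approximated by binning received position reports into geographic cells; the grid resolution, the region around Halifax, and the synthetic reports below are assumptions for illustration only.

        import numpy as np

        def coverage_grid(lats, lons, lat_range=(43.0, 46.0),
                          lon_range=(-66.0, -62.0), n_bins=60):
            """Count AIS position reports per lat/lon cell around a receiving station."""
            counts, lat_edges, lon_edges = np.histogram2d(
                lats, lons, bins=n_bins, range=[lat_range, lon_range])
            return counts, lat_edges, lon_edges

        rng = np.random.default_rng(5)
        lats = rng.normal(44.6, 0.4, 10_000)   # synthetic reports near a station
        lons = rng.normal(-63.6, 0.5, 10_000)
        counts, _, _ = coverage_grid(lats, lons)
        print("cells with any reception:", int((counts > 0).sum()))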

  3. TEODOOR, a blueprint for distributed terrestrial observation data infrastructures

    NASA Astrophysics Data System (ADS)

    Kunkel, Ralf; Sorg, Jürgen; Abbrent, Martin; Borg, Erik; Gasche, Rainer; Kolditz, Olaf; Neidl, Frank; Priesack, Eckart; Stender, Vivien

    2017-04-01

    TERENO (TERrestrial ENvironmental Observatories) is an initiative funded by the large research infrastructure program of the Helmholtz Association of Germany. Four observation platforms to facilitate the investigation of the consequences of global change for terrestrial ecosystems, and their socioeconomic implications, were implemented and equipped from 2007 until 2013; data collection, however, is planned to continue for at least 30 years. TERENO provides series of system variables (e.g. precipitation, runoff, groundwater level, soil moisture, water vapor and trace gas fluxes) for the analysis and prognosis of global change consequences using integrated model systems, which will be used to derive efficient prevention, mitigation and adaptation strategies. Each platform is operated by a different Helmholtz institution, which maintains its local data infrastructure. Within the individual observatories, areas with intensive measurement programs have been implemented. Different sensors provide information on physical parameters such as soil moisture, temperature, ground water levels and gas fluxes. Sensor data from more than 900 stations are collected automatically, at rates from one measurement per 2 h up to one per 20 s, summing to about 2,500,000 data values per day. In addition, three weather radar devices create raster data 12 to 60 times per hour. The data are automatically imported into local relational database systems using a common data quality assessment framework that handles the processing and assessment of heterogeneous environmental observation data. Custom workflows are developed, starting with the way data are imported into the data infrastructure, and data levels are defined that capture the underlying data processing, the stage of quality assessment, and data accessibility. To facilitate the acquisition, provision, integration, management and exchange of heterogeneous geospatial resources within a scientific and non-scientific environment, the distributed spatial data infrastructure TEODOOR (TEreno Online Data RepOsitORry) has been built up. The individual observatories are connected via OGC-compliant web services, while the TERENO Data Discovery Portal (DDP) enables data discovery, visualization and data access. Currently, free access to data from more than 900 monitoring stations is provided.

  4. MO-F-16A-06: Implementation of a Radiation Exposure Monitoring System for Surveillance of Multi-Modality Radiation Dose Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, B; Kanal, K; Dickinson, R

    2014-06-15

    Purpose: We have implemented a commercially available Radiation Exposure Monitoring System (REMS) to enhance the processes of radiation dose data collection, analysis and alerting developed over the past decade at our sites of practice. REMS allows for consolidation of multiple radiation dose information sources and quicker alerting than previously developed processes. Methods: Thirty-nine x-ray producing imaging modalities were interfaced with the REMS: thirteen computed tomography scanners, sixteen angiography/interventional systems, nine digital radiography systems and one mammography system. A number of methodologies were used to provide dose data to the REMS: Modality Performed Procedure Step (MPPS) messages, DICOM Radiation Dose Structured Reports (RDSR), and DICOM header information. Once interfaced, the dosimetry information from each device underwent validation (first 15–20 exams) before release for viewing by end-users: physicians, medical physicists, technologists and administrators. Results: Before REMS, our diagnostic physics group pulled dosimetry data from seven disparate databases throughout the radiology, radiation oncology, cardiology, electrophysiology, anesthesiology/pain management and vascular surgery departments at two major medical centers and four associated outpatient clinics. With the REMS implementation, we now have one authoritative source of dose information for alerting, longitudinal analysis, dashboard/graphics generation and benchmarking. REMS provides immediate automatic dose alerts utilizing thresholds calculated through daily statistical analysis. This has streamlined our Closing the Loop process for estimated skin exposures in excess of our institution-specific substantial radiation dose level, which previously relied on technologist notification of the diagnostic physics group and a daily report from the radiology information system (RIS). REMS also automatically calculates the CT size-specific dose estimate (SSDE) and provides two-dimensional angulation dose maps for angiography/interventional procedures. Conclusion: REMS implementation has streamlined and consolidated the dosimetry data collection and analysis process at our institutions while eliminating manual entry error and providing immediate alerting and access to dosimetry data to both physicists and physicians. Brent Stewart has funded research through GE Healthcare.
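
    For reference, the SSDE calculation REMS automates is a published conversion: SSDE multiplies the scanner-reported CTDIvol by a factor that depends on the patient's effective diameter. The sketch below uses the exponential fit for the 32 cm body phantom; treat the coefficients as quoted from AAPM Report 204 rather than from this abstract.

        import math

        def ssde_32cm(ctdi_vol_mgy, effective_diameter_cm):
            """Size-specific dose estimate (mGy) from CTDIvol referenced to the
            32 cm body phantom, using the AAPM Report 204 exponential fit."""
            f = 3.704369 * math.exp(-0.03671937 * effective_diameter_cm)
            return f * ctdi_vol_mgy

        print(round(ssde_32cm(ctdi_vol_mgy=10.0, effective_diameter_cm=28.0), 1))
        # ~13.3 mGy: smaller-than-phantom patients receive more dose than CTDIvol suggests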

  5. Enhancing acronym/abbreviation knowledge bases with semantic information.

    PubMed

    Torii, Manabu; Liu, Hongfang

    2007-10-11

    In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs, short forms) with their definitions (denoted as LFs, long forms) is highly needed. Toward the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of pairs (SF, LF) derived from text, we i) assess the coverage of LFs and pairs (SF, LF) in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic categories and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF, LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
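
    The learning step can be sketched with a standard bag-of-words text classifier; scikit-learn, the choice of learner, and the toy long forms and labels below are assumptions, since the abstract does not commit to a specific classifier.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # toy long forms labeled with coarse semantic groups (illustrative only)
        long_forms = ["magnetic resonance imaging", "computed tomography",
                      "tumor necrosis factor", "epidermal growth factor",
                      "congestive heart failure", "chronic obstructive pulmonary disease"]
        groups = ["PROC", "PROC", "CHEM", "CHEM", "DISO", "DISO"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
        clf.fit(long_forms, groups)
        print(clf.predict(["vascular endothelial growth factor"]))  # likely CHEM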

  6. Web platform using digital image processing and geographic information system tools: a Brazilian case study on dengue.

    PubMed

    Brasil, Lourdes M; Gomes, Marília M F; Miosso, Cristiano J; da Silva, Marlete M; Amvame-Nze, Georges D

    2015-07-16

    Dengue fever is endemic in Asia, the Americas, the East of the Mediterranean and the Western Pacific. According to the World Health Organization, it is one of the diseases of greatest impact on health, affecting millions of people each year worldwide. Fast detection of increases in populations of the transmitting vector, the Aedes aegypti mosquito, is essential to avoid dengue outbreaks. Unfortunately, in several countries, such as Brazil, the current methods for detecting population changes and disseminating this information are too slow to allow efficient allocation of resources to fight outbreaks. To reduce the delay in providing information about A. aegypti population changes, we propose, develop, and evaluate a system for counting the eggs found in special traps and providing the collected data through a web structure with geographical location resources. One of the most useful tools for the detection and surveillance of arthropods is the ovitrap, a special trap built to collect mosquito eggs; the egg counting process is still usually performed manually in countries such as Brazil. We implement and evaluate a novel system for automatically counting the eggs found on the ovitraps' cardboards. The proposed system is based on digital image processing (DIP) techniques together with a Web-based Semi-Automatic Counting System (SCSA-WEB). All collected data are georeferenced in a geographic information system (GIS) and made available on a Web platform. The work was developed in Gama's administrative region, in Brasília, Brazil, with the aid of the Environmental Surveillance Directory (DIVAL-Gama) and Brasília's Board of Health (SSDF), in partnership with the University of Brasília (UnB). The system was built on a field survey carried out over three months by health professionals, who provided 84 cardboards, sized 15 × 5 cm, from 84 ovitraps. In developing the system, we conducted the following steps: i. Obtain images of the eggs on an ovitrap's cardboard with a microscope. ii. Apply the proposed image-processing-based semi-automatic counting system. The system uses the Java programming language and Java Server Faces, a framework suite for web application development; this approach allows simple migration to any operating system platform and future applications on mobile devices. iii. Collect and store all data in a database (DB) and georeference them in a GIS. The database management system used to develop the DB is based on PostgreSQL. The GIS assists in the visualization and spatial analysis of digital maps, allowing the location of dengue outbreaks in the region of study; this also facilitates the planning, analysis, and evaluation of temporal and spatial epidemiology, as required by the Brazilian Health Care Control Center. iv. Deploy the SCSA-WEB, DB and GIS on a single Web platform. The statistical results obtained by DIP were satisfactory when compared with the SCSA-WEB's semi-automated egg counts. The results also indicate that the time spent on manual counting is considerably reduced when using our fully automated DIP algorithm and the semi-automated SCSA-WEB. The developed georeferencing Web platform proves to be of great support for future visualization together with statistical and trace analysis of the disease. The analyses suggest that our algorithm for automatic egg counting expedites the work of the laboratory technician, considerably reducing counting time and error rates. We believe that this kind of integrated platform can simplify the decision-making process of the Brazilian Health Care Control Center.
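
    The automatic counting step can be sketched with classical image processing, which is one plausible reading of the paper's DIP stage (the thresholding choice, area limits, and file name below are assumptions): segment dark eggs against the light cardboard and count connected components of plausible egg size.

        import cv2

        def count_eggs(image_path, min_area=15, max_area=400):
            """Count dark egg-like blobs on a light ovitrap cardboard image."""
            gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                raise FileNotFoundError(image_path)
            # Otsu threshold, inverted so the dark eggs become foreground
            _, binary = cv2.threshold(gray, 0, 255,
                                      cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            n, _, stats, _ = cv2.connectedComponentsWithStats(binary)
            # component 0 is the background; filter the rest by area
            areas = stats[1:, cv2.CC_STAT_AREA]
            return int(((areas >= min_area) & (areas <= max_area)).sum())

        print(count_eggs("ovitrap_cardboard.png"))  # hypothetical microscope image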

  7. A unified approach for development of Urdu Corpus for OCR and demographic purpose

    NASA Astrophysics Data System (ADS)

    Choudhary, Prakash; Nain, Neeta; Ahmed, Mushtaq

    2015-02-01

    This paper presents a methodology for the development of an Urdu handwritten text image corpus and the application of corpus linguistics to OCR and information retrieval from handwritten documents. Compared to other language scripts, Urdu script is somewhat complicated for data entry: entering a single character requires a combination of multiple keystrokes. Here, a mixed approach is proposed and demonstrated for building an Urdu corpus for OCR and demographic data collection. The demographic part of the database could be used to train a system to fetch data automatically, which would help simplify the existing manual data-processing tasks involved in data collection for input forms such as Passport, Ration Card, Voting Card, AADHAR, driving licence, Indian Railway reservation, and census data. This would increase the participation of the Urdu language community in understanding and benefiting from Government schemes. To make the database available and applicable across a broad area of corpus linguistics, we propose a methodology for data collection, mark-up, digital transcription, and XML metadata annotation for benchmarking.

  8. Quality assurance and quality control for autonomously collected geoscience data

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Richardson, A.; Labrecque, D.

    2006-12-01

    The growing interest in processes, coupled with the reduction in cost and complexity of sensors that allow continuous data collection and transmission, is giving rise to vast amounts of semi-autonomously collected data. Such data are typically collected from a range of physical and chemical sensors and transmitted, either at the time of collection or periodically as a collection of measurements, to a central server. Such setups can collect vast amounts of data: in cases where power is not an issue, one datapoint can be collected every minute, resulting in tens of thousands of data points per month per sensor. Especially when multiple sensors are deployed, it is infeasible to examine each individual datapoint for each individual sensor, and users typically look at aggregates of the data on a periodic basis (once a week to once every few months). Such aggregation, and the time lag between data collection and data evaluation, impacts the ability to rapidly identify and resolve data issues. Thus, there is a need to integrate data QA/QC rules and procedures into the data collection process, implemented such that data are analyzed for compliance the moment they arrive at the server and any issues result in notification of cognizant personnel. Typical issues encountered in the field range from complete system failure (no data arriving at all), to complete sensor failure (data are collected but meaningless), to partial sensor failure (the sensor gives erratic readings or starts to exhibit a bias), to partial power loss (the system collects and transmits data only intermittently). We have implemented a suite of such rules and tests as part of the INL-developed performance monitoring system. These rules are invoked as part of a data QA/QC workflow and result in quality indicators for each datapoint as well as user alerts in case of issues. Tests applied to the data include tests on individual datapoints, tests on suites of datapoints, and tests applied over the whole dataset. Examples include: did the data arrive on time, are the received data in a valid format, are all measurements present, are the data within valid range, were the data collected at appropriate time intervals, are the statistics of the data changing over time, and were the data collected within an appropriate instrument calibration window? This approach, executed automatically on all incoming data, provides end users with confidence and auditability regarding the quality and usability of autonomously collected data.
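
    A minimal version of such per-datapoint rule checks; the rule set, thresholds, and flag names are assumptions, not the INL system's.

        from datetime import datetime, timedelta

        def qa_flags(readings, expected_interval=timedelta(minutes=1),
                     valid_range=(-40.0, 60.0), max_repeats=10):
            """readings: list of (timestamp, value). Returns one flag-set per
            datapoint: 'late', 'out_of_range', and/or 'stuck'."""
            flags, repeat_run = [], 0
            for i, (ts, value) in enumerate(readings):
                f = set()
                if i > 0:
                    if ts - readings[i - 1][0] > 1.5 * expected_interval:
                        f.add("late")                  # gap in the data stream
                    repeat_run = repeat_run + 1 if value == readings[i - 1][1] else 0
                    if repeat_run >= max_repeats:
                        f.add("stuck")                 # suspiciously constant sensor
                if not valid_range[0] <= value <= valid_range[1]:
                    f.add("out_of_range")
                flags.append(f)
            return flags

        t0 = datetime(2006, 7, 1)
        data = [(t0 + timedelta(minutes=m), 21.0 + 0.1 * m) for m in range(5)]
        data.append((t0 + timedelta(minutes=9), 99.0))  # late and out of range
        print(qa_flags(data))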

  9. Simulation analysis of a microcomputer-based, low-cost Omega navigation system

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.; Salter, R. J., Jr.

    1976-01-01

    The current status of research on a proposed microcomputer-based, low-cost Omega Navigation System (ONS) is described. The design approach emphasizes minimum hardware, maximum software, and the use of a low-cost, commercially available microcomputer. Currently under investigation is the implementation of a low-cost navigation processor and its interface with an Omega sensor to complete the hardware-based ONS. Sensor-processor functions are simulated to determine how many of them can be handled by innovative software. An input database of live Omega ground and flight test data was created. The Omega sensor and microcomputer interface modules used to collect the data are functionally described. Automatic synchronization to the Omega transmission pattern is described as an example of the algorithms developed using this database.

  10. Minimum Energy Routing through Interactive Techniques (MERIT) modeling

    NASA Technical Reports Server (NTRS)

    Wylie, Donald P.

    1988-01-01

    The MERIT program is designed to demonstrate the feasibility of fuel savings by airlines through improved route selection using wind observations from their own fleets. After a discussion of weather and aircraft data, manual and automatic correction of wind fields, and short-range prediction models, it is concluded that improvements in wind information are possible if a system is developed for analyzing wind observations and correcting the forecasts made by the major models. One data handling system, McIDAS, can easily collect and display wind observations and model forecasts. Changing the wind forecasts beyond the time of the most recent observations is more difficult; an Australian Mesoscale Model was tested with promising but not definitive results.

  11. Method and apparatus for reading meters from a video image

    DOEpatents

    Lewis, Trevor J.; Ferguson, Jeffrey J.

    1997-01-01

    A method and system to enable acquisition of data about an environment from one or more meters using video images. One or more meters are imaged by a video camera and the video signal is digitized. Then, each region of the digital image which corresponds to the indicator of the meter is calibrated and the video signal is analyzed to determine the value indicated by each meter indicator. Finally, from the value indicated by each meter indicator in the calibrated region, a meter reading is generated. The method and system offer the advantages of automatic data collection in a relatively non-intrusive manner without making any complicated or expensive electronic connections, and without requiring intensive manpower.
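
    As a concrete illustration of the analysis step, the sketch below estimates a reading from one calibrated dial region by finding the needle with a Hough transform and interpolating its angle. The region coordinates, the angle-to-value calibration, and the needle-detection heuristic are assumptions for illustration, not the patented method itself.

        # Hedged sketch: read an analog dial from a calibrated image region.
        import cv2
        import numpy as np

        def read_dial(frame, region, angle_min, angle_max, value_min, value_max):
            x, y, w, h = region                   # calibrated sub-image of one meter
            roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
            edges = cv2.Canny(roi, 50, 150)
            lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=30,
                                    minLineLength=int(0.4 * min(w, h)), maxLineGap=5)
            if lines is None:
                return None                       # needle not found in this frame
            # Treat the longest detected segment as the needle.
            x1, y1, x2, y2 = max(lines[:, 0],
                                 key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
            angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))
            # Map the calibrated angle span linearly onto the meter scale.
            frac = (angle - angle_min) / (angle_max - angle_min)
            return value_min + frac * (value_max - value_min)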

  12. Heat stress effects on Holstein dairy cows' rumination.

    PubMed

    Moretti, R; Biffani, S; Chessa, S; Bozzi, R

    2017-12-01

    The objective of this study was to investigate the relationship between the temperature-humidity index (THI) and rumination time (RT), in order to explore its possible use as a tool for improving animal welfare. During summer 2015 (1 June to 31 August), data from a Holstein dairy farm in northern Italy were collected, along with environmental data (i.e. ambient temperature and relative humidity) recorded by a weather station installed inside the barn. Rumination data were collected through the Heatime® HR system (SCR Engineers Ltd., Hadarim, Netanya, Israel), an automatic system composed of a neck collar with a tag that records the RT and activity of each cow. A significant negative correlation was observed between RT and THI. Mixed linear models were fitted, including animal and test day as random effects, and parity, milk production level and date of last calving as fixed effects. A statistically significant effect of THI on RT was identified, with RT decreasing as THI increased.
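
    The THI itself is a simple function of ambient temperature and relative humidity. A minimal sketch follows, using one widely cited formulation; the abstract does not state which THI variant the study used, so the formula choice here is an assumption.

        # Hedged sketch: temperature-humidity index (one common formulation).
        def thi(temp_c, rel_humidity_pct):
            """THI from ambient temperature (deg C) and relative humidity (%)."""
            t_f = 1.8 * temp_c + 32.0            # convert to Fahrenheit
            return t_f - (0.55 - 0.0055 * rel_humidity_pct) * (t_f - 26.0)

        # Example: 30 deg C at 70% RH -> THI of about 76, above the heat-stress
        # thresholds (~68-72) commonly cited in the dairy literature.
        print(round(thi(30.0, 70.0), 1))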

  13. Rainfall, discharge, and water-quality data during stormwater monitoring, H-1 storm drain, Oahu, Hawaii, July 1, 2009, to June 30, 2010

    USGS Publications Warehouse

    Presley, Todd K.; Jamison, Marcael T.J.

    2010-01-01

    Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. The program is designed to assess the effects of highway runoff and urban runoff collected by the H-1 storm drain on the Manoa-Palolo Drainage Canal. This report summarizes rainfall, discharge, and water-quality data collected between July 1, 2009, and June 30, 2010. As part of this program, rainfall and continuous discharge data were collected at the H-1 storm drain. During the year, the sampling strategy and sample processing methods were modified to improve the characterization of the effects of discharge from the storm drain on the Manoa-Palolo Drainage Canal. From July 1, 2009, to February 1, 2010, samples were collected from only the H-1 storm drain. Beginning February 2, 2010, samples were collected simultaneously from the H-1 storm drain and from the Manoa-Palolo Drainage Canal at a location about 50 feet upstream of the discharge point of the H-1 storm drain. Three storms were sampled during July 1, 2009, to June 30, 2010. All samples were collected using automatic samplers. For the storm of August 12, 2009, grab samples (for oil and grease, and total petroleum hydrocarbons) and a composite sample were collected. The composite sample was analyzed for total suspended solids, nutrients, and selected dissolved and total (filtered and unfiltered) trace metals (cadmium, chromium, nickel, copper, lead, and zinc). Two storms were sampled in March 2010 at the H-1 storm drain and from the Manoa-Palolo Drainage Canal. Two samples were collected during the storm of March 4, 2010, and six samples were collected during the storm of March 8, 2010. These two storms were sampled using the modified strategy, in which discrete samples from the automatic sampler were processed and analyzed individually rather than as a composite sample, with samples collected simultaneously from the H-1 storm drain and from the Manoa-Palolo Drainage Canal. The discrete samples were analyzed for some or all of the following constituents: total suspended solids, nutrients, oil and grease, and selected dissolved (filtered) trace metals (cadmium, chromium, nickel, copper, lead, and zinc). Five quality-assurance/quality-control samples were analyzed during the year. These samples included one laboratory-duplicate, one field-duplicate, and one matrix-spike sample prepared and analyzed with the storm samples. In addition, two inorganic blank-water samples, one at the H-1 storm drain and one at the Manoa-Palolo Drainage Canal, were collected by running blank water (water purified of all inorganic constituents) through the sampling and processing systems after cleaning the automatic sampler lines, to verify that the sampling lines were not contaminated.

  14. Collaboration spotting for dental science.

    PubMed

    Leonardi, E; Agocs, A; Fragkiskos, S; Kasfikis, N; Le Goff, J M; Cristalli, M P; Luzzi, V; Polimeni, A

    2014-10-06

    The goal of the Collaboration Spotting project is to create an automatic system to collect information about publications and patents related to a given technology, to identify the key players involved, and to highlight collaborations and related technologies. The collected information can be visualized in a web browser as interactive graphical maps showing in an intuitive way the players and their collaborations (Sociogram) and the relations among the technologies (Technogram). We propose to use the system to study technologies related to Dental Science. In order to create a Sociogram, we create a logical filter based on a set of keywords related to the technology under study. This filter is used to extract a list of publications from the Web of Science™ database. The list is validated by an expert in the technology and sent to CERN where it is inserted in the Collaboration Spotting database. Here, an automatic software system uses the data to generate the final maps. We studied a set of recent technologies related to bone regeneration procedures of oro-maxillo-facial critical size defects, namely the use of Porous HydroxyApatite (HA) as a bone substitute alone (bone graft) or as a tridimensional support (scaffold) for insemination and differentiation ex vivo of Mesenchymal Stem Cells. We produced the Sociograms for these technologies and the resulting maps are now accessible on-line. The Collaboration Spotting system allows the automatic creation of interactive maps to show the current and historical state of research on a specific technology. These maps are an ideal tool both for researchers who want to assess the state-of-the-art in a given technology, and for research organizations who want to evaluate their contribution to the technological development in a given field. We demonstrated that the system can be used for Dental Science and produced the maps for an initial set of technologies in this field. We now plan to enlarge the set of mapped technologies in order to make the Collaboration Spotting system a useful reference tool for Dental Science research.
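
    A Sociogram of this kind reduces, at its core, to counting co-occurrences of players across the validated publication list. The sketch below shows that reduction with a hypothetical record structure; the actual Collaboration Spotting pipeline at CERN is more elaborate and is not reproduced here.

        # Hedged sketch: derive weighted collaboration edges from publications.
        from collections import Counter
        from itertools import combinations

        publications = [  # hypothetical validated records
            {"title": "HA scaffolds for bone regeneration", "orgs": ["Univ A", "Univ B"]},
            {"title": "Stem cells on porous HA", "orgs": ["Univ A", "Univ B", "Lab C"]},
        ]

        edges = Counter()
        for pub in publications:
            for a, b in combinations(sorted(set(pub["orgs"])), 2):
                edges[(a, b)] += 1               # edge weight = number of joint papers

        for (a, b), weight in edges.most_common():
            print(f"{a} -- {b}: {weight} joint publication(s)")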

  15. Collaboration Spotting for oral medicine.

    PubMed

    Leonardi, E; Agocs, A; Fragkiskos, S; Kasfikis, N; Le Goff, J M; Cristalli, M P; Luzzi, V; Polimeni, A

    2014-09-01

    The goal of the Collaboration Spotting project is to create an automatic system to collect information about publications and patents related to a given technology, to identify the key players involved, and to highlight collaborations and related technologies. The collected information can be visualized in a web browser as interactive graphical maps showing in an intuitive way the players and their collaborations (Sociogram) and the relations among the technologies (Technogram). We propose to use the system to study technologies related to oral medicine. In order to create a sociogram, we create a logical filter based on a set of keywords related to the technology under study. This filter is used to extract a list of publications from the Web of Science™ database. The list is validated by an expert in the technology and sent to CERN where it is inserted in the Collaboration Spotting database. Here, an automatic software system uses the data to generate the final maps. We studied a set of recent technologies related to bone regeneration procedures of oro-maxillo-facial critical size defects, namely the use of porous hydroxyapatite (HA) as a bone substitute alone (bone graft) or as a tridimensional support (scaffold) for insemination and differentiation ex vivo of mesenchymal stem cells. We produced the sociograms for these technologies and the resulting maps are now accessible on-line. The Collaboration Spotting system allows the automatic creation of interactive maps to show the current and historical state of research on a specific technology. These maps are an ideal tool both for researchers who want to assess the state-of-the-art in a given technology, and for research organizations who want to evaluate their contribution to the technological development in a given field. We demonstrated that the system can be used in oral medicine and produced the maps for an initial set of technologies in this field. We now plan to enlarge the set of mapped technologies in order to make the Collaboration Spotting system a useful reference tool for oral medicine research.

  16. Military applications of automatic speech recognition and future requirements

    NASA Technical Reports Server (NTRS)

    Beek, Bruno; Cupples, Edward J.

    1977-01-01

    An updated summary of the state-of-the-art of automatic speech recognition and its relevance to military applications is provided. A number of potential systems for military applications are under development. These include: (1) digital narrowband communication systems; (2) automatic speech verification; (3) on-line cartographic processing unit; (4) word recognition for militarized tactical data system; and (5) voice recognition and synthesis for aircraft cockpit.

  17. Design and Implementation of a Wireless Sensor and Actuator Network to Support the Intelligent Control of Efficient Energy Usage.

    PubMed

    Blanco, Jesús; García, Andrés; Morenas, Javier de Las

    2018-06-09

    Energy saving has become a major concern in today's developed societies. This paper presents a Wireless Sensor and Actuator Network (WSAN) designed to support an automatic intelligent system, based on the Internet of Things (IoT), which enables responsible consumption of energy. The proposed overall system performs efficient energy management of devices, machines and processes, optimizing their operation to reduce their overall energy usage at any given time. For this purpose, relevant data are collected from intelligent sensors, which are installed at the required locations, as well as from the energy market through the Internet. This information is analysed to provide knowledge about energy utilization and to improve efficiency. The system takes autonomous decisions automatically, based on the available information and the specific requirements in each case. The proposed system has been implemented and tested in a food factory. Results show a marked improvement in energy efficiency and substantial energy and cost savings.

  18. Integration of external metadata into the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Berger, Katharina; Levavasseur, Guillaume; Stockhause, Martina; Lautenschlager, Michael

    2015-04-01

    International projects with high data volumes usually disseminate their data through a federated data infrastructure, e.g. the Earth System Grid Federation (ESGF). The ESGF aims to make geographically distributed data seamlessly discoverable and accessible. Additional data-related information is currently collected and stored in separate repositories by each data provider. This scattered but useful information is unavailable, or only partly available, to ESGF users. Examples of such additional information systems are ES-DOC/metafor for model and simulation information, IPSL's versioning information, CHARMe for user annotations, DKRZ's quality information, and data citation information. The ESGF Quality Control working team (esgf-qcwt) aims to integrate these valuable pieces of additional information into the ESGF in order to make them available to users and data archive managers by (i) integrating external information into the ESGF portal, (ii) integrating links to external information objects into the ESGF metadata index, e.g. by the use of PIDs (Persistent IDentifiers), and (iii) automating the collection of external information during the ESGF data publication process. For the sixth phase of CMIP (the Coupled Model Intercomparison Project), the ESGF metadata index is to be enriched with additional information on data citation, file version, etc. This information will support users directly and can be automatically exploited by higher-level services (human and machine readability).

  19. 23 CFR 950.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS ELECTRONIC TOLL... Express Lanes Demonstration Program, and the Interstate System Construction Toll Pilot Program. Electronic toll collection means the ability for vehicle operators to pay tolls automatically without slowing down...

  20. A Study about the 3S-based Great Ruins Monitoring and Early-warning System

    NASA Astrophysics Data System (ADS)

    Xuefeng, W.; Zhongyuan, H.; Gongli, L.; Li, Z.

    2015-08-01

    Large-scale urbanization, new countryside construction, frequent natural disasters, and natural corrosion pose a severe threat to great ruins; damage to cultural relics and occupation of ruin sites are not uncommon. Current ruins monitoring mainly relies on general-purpose data processing systems that cannot effectively manage, display, analyze, and share the relics monitoring data. Moreover, such general software systems require deploying a large number of devices or apparatuses, yet they are suited only to small-scale relics monitoring. This paper therefore proposes a method that uses stereoscopic cartographic satellite technology to improve and supplement the great ruins monitoring index system, combining it with GIS and GPS to establish a highly automatic, real-time, intelligent monitoring and early-warning system for great ruins. The system is intended to realize collection, processing, updating, spatial visualization, analysis, distribution, and sharing of the monitoring data, and to provide scientific and effective data for relics protection, scientific planning, reasonable development, and sustainable utilization.

  1. Automatic Fare Collection Equipment, Reliability and Maintainability Assessment Plan for Urban Rail Transit Properties

    DOT National Transportation Integrated Search

    1981-03-01

    This project was conducted as part of UMTA's Rail Transit Fare Collection Program developed by the Transportation Systems Center of the U.S. Department of Transportation. The report presents a generalized survey methodology for conducting assessments...

  2. Rainfall, Discharge, and Water-Quality Data During Stormwater Monitoring, July 1, 2008, to June 30, 2009 - Halawa Stream Drainage Basin and the H-1 Storm Drain, Oahu, Hawaii

    USGS Publications Warehouse

    Presley, Todd K.; Jamison, Marcael T.J.

    2009-01-01

    Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. The program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream, and to assess the effects from the H-1 storm drain on Manoa Stream. For this program, rainfall data were collected at three stations, continuous discharge data at five stations, and water-quality data at six stations, which include the five continuous discharge stations. This report summarizes rainfall, discharge, and water-quality data collected between July 1, 2008, and June 30, 2009. Within the Halawa Stream drainage area, three storms (October 25 and December 11, 2008, and February 3, 2009) were sampled during July 1, 2008, to June 30, 2009. A total of 43 environmental samples were collected during these three storms. During the storm of October 25, 2008, 31 samples were collected and analyzed individually for metals only. The 12 samples from the other two storms were analyzed for some or all of the following analytes: total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, and zinc). Additionally, grab samples were analyzed for some or all of the following analytes: oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Some grab and composite samples were analyzed for only a partial list of these analytes, either because samples could not be delivered to the laboratory in a timely manner, or because an insufficient volume of sample was collected by the automatic samplers. Two quality-assurance/quality-control samples were collected after cleaning automatic sampler lines to verify that the sampling lines were not contaminated. Four environmental samples were collected at the H-1 Storm Drain during July 1, 2008, to June 30, 2009. An oil and grease sample and a composite sample were collected during the storm of November 15, 2008, and two composite samples were collected during the January 11, 2009, storm. All samples at this site were collected using an automatic sampler. Samples were analyzed for some or all of the following analytes: total suspended solids, nutrients, oil and grease, total petroleum hydrocarbons, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). One quality-assurance/quality-control sample was collected after cleaning automatic sampler lines to verify that the sampling lines were not contaminated. During the storm of January 11, 2009, the two composite samples collected at the H-1 Storm Drain were collected about three hours apart. Higher constituent concentrations were detected in the first composite sample relative to the second, although the average discharge was higher during the period when the second sample was collected.

  3. Mobile SO2 and NO2 DIAL Lidar system for enforcement use

    NASA Astrophysics Data System (ADS)

    Cunningham, David L.; Pence, William H.; Moody, Stephen E.

    1994-06-01

    A self-contained mobile differential absorption lidar (DIAL) system intended for measuring SO2 and NO2 concentrations from stationary combustion sources has been completed for enforcement use. The system uses tunable Ti:sapphire laser technology, with nonlinear conversion to the blue and UV absorption wavelengths. Separate tunable laser oscillators at slightly offset wavelengths are pumped on alternate pulses of a 20 Hz doubled Nd:YAG pump laser; the outputs are amplified in a common amplifier, doubled or tripled, and transmitted toward a target region via a two-mirror beam director. Scattered atmospheric returns are collected by a 0.27-m-diameter telescope, detected with a filtered photomultiplier, and digitized and stored for analysis. Extensive software-based control and display windows are provided for operator interaction with the system. The DIAL system is built into a small motor coach. Gasoline-powered electrical generation, laser cooling, and air conditioning services are provided. Separate computers are used for simultaneous data collection and data analysis activities, with shared database access. A laser printer supplies hardcopy output. The system includes the capability for automatic data collection at a series of scanner angles, and computer processing to present results in a variety of formats. Plumes from coal-fired and mixed-fuel-fired combustors have been examined for NO2 and SO2 content. Noise levels of a few parts per million are reached with averaging times of less than one minute.
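
    For context, the standard two-wavelength DIAL retrieval estimates the mean number density N of the absorbing gas in a range bin from the ratio of on- and off-resonance returns; in the usual textbook notation,

        N(R) = \frac{1}{2\,\Delta\sigma\,\Delta R}\,
               \ln\!\left[\frac{P_{\mathrm{off}}(R+\Delta R)\,P_{\mathrm{on}}(R)}
                               {P_{\mathrm{on}}(R+\Delta R)\,P_{\mathrm{off}}(R)}\right]

    where P_on and P_off are the received powers at the two wavelengths, ΔR is the range-bin length, and Δσ is the differential absorption cross section between the on and off wavelengths. This is the generic form of the retrieval, not necessarily the exact processing implemented in the system described above.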

  4. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    NASA Astrophysics Data System (ADS)

    Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain, and LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data to extract ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of derived data for geographical information systems (GIS), mapping, navigation, etc. Regardless of what the scan data will be used for, an automatic process is required to handle the large amount of data collected, because manual processing is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to preliminarily segment the entire point cloud into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used, on the one hand, to identify the upper and lower parts of each building in an urban scene, needed to model building façades; and, on the other hand, to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm is developed for this second phase, segmenting the uniform surfaces into building roofs, roads and ground, again based on topological relationships and height variation analysis. The proposed approach has been tested on two areas: the first a housing complex and the second a primary school. It led to successful classification of the building, vegetation and road classes.
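
    The height-variation test at the heart of the first phase can be pictured as a simple grid statistic: cells whose points span only a small height range are candidate uniform surfaces (roofs, roads, ground). The sketch below illustrates this under assumed grid size and threshold values; it is a simplification of the paper's method, not a reimplementation.

        # Hedged sketch: split candidate uniform surfaces by height variation.
        import numpy as np

        def split_uniform_surfaces(points, cell=1.0, max_spread=0.2):
            """points: (N, 3) array of x, y, z; returns mask of 'uniform' points."""
            ij = np.floor(points[:, :2] / cell).astype(np.int64)
            keys = ij[:, 0] * 1_000_003 + ij[:, 1]   # hash each XY grid cell
            mask = np.zeros(len(points), dtype=bool)
            for key in np.unique(keys):
                sel = keys == key
                z = points[sel, 2]
                if z.max() - z.min() <= max_spread:  # small spread -> uniform surface
                    mask[sel] = True
            return mask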

  5. Indoor air quality analysis based on Hadoop

    NASA Astrophysics Data System (ADS)

    Tuo, Wang; Yunhua, Sun; Song, Tian; Liang, Yu; Weihong, Cui

    2014-03-01

    The air of an office environment is the object of this research. Data on temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by a sensor monitoring system, and all the data are stored in the HBase database of a Hadoop platform. With the help of HBase's column-oriented, versioned storage (the time column is added automatically), time-series data sets are built on the primary row key and timestamp. The parallel-computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing trends in parameter values at different times of the same day and at the same time across different dates, the impact of human and other factors on the room microenvironment is assessed in relation to the movement of office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper.
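
    The row-key design is what makes such time-series scans cheap in HBase: one row per sensor and timestamp means a key-prefix scan returns a sensor's series in order. The sketch below mimics that layout and an hourly MapReduce-style aggregation using an in-memory stand-in for the table; the key format is an assumption for illustration.

        # Hedged sketch: HBase-style row keys and hourly aggregation.
        from collections import defaultdict
        from datetime import datetime

        table = {}  # stand-in for an HBase table: row key -> {column: value}

        def put(sensor_id, ts, readings):
            table[f"{sensor_id}#{ts:%Y%m%d%H%M%S}"] = readings

        put("office-01", datetime(2013, 11, 5, 9, 0, 4), {"co2_ppm": 620})
        put("office-01", datetime(2013, 11, 5, 9, 0, 12), {"co2_ppm": 628})

        # Map: emit (hour, co2); reduce: average per hour.
        hourly = defaultdict(list)
        for key, cols in table.items():
            _, stamp = key.split("#")
            hourly[stamp[:10]].append(cols["co2_ppm"])  # group by YYYYMMDDHH
        for hour, vals in sorted(hourly.items()):
            print(hour, sum(vals) / len(vals))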

  6. Evaluation of an automatic-timed insecticide application system for backyard mosquito control.

    PubMed

    Cilek, J E; Hallmon, C F; Johnson, R

    2008-12-01

    Several manufacturers and pest management companies have begun to market and install outdoor automatically timed insecticide application systems that claim to provide an envelope of protection against host-seeking mosquitoes within a defined area, e.g., residential backyards. A typical system consists of a multi-gallon reservoir attached to a continuous loop of plastic tubing with multiple single spray head nozzles. Nozzles are usually placed along the perimeter of a backyard in landscaping or other areas suitable for mosquito harborage. This array is then connected to a programmable electric pump set to automatically apply an insecticide at predetermined intervals. An operational field study was conducted to evaluate this technology using previously installed MistAway systems at 3 residences in northwestern Florida. This system applied a mist-like application of 0.05% AI synergized pyrethrins for 45 sec at dawn and again at dusk in each backyard. Twice-weekly collections from ABC suction light traps, baited with carbon dioxide, were used as the evaluation tool. Female mosquitoes from treatment backyards were compared with trap collections from 3 backyards without automatic misting systems used as controls. We found that weekly mosquito reduction was highly variable and ranged from 98% to 14% during the 35-wk study. Because the primary method of reduction by these application systems was not well understood, a MistAway system was installed in an outdoor simulated residential backyard to determine the exposure pathway under controlled conditions with field cage and excised-leaf bioassays. Using laboratory-reared females of Aedes albopictus and Culex quinquefasciatus in those assays, we found that reduction by the MistAway system was primarily achieved by direct exposure of the mosquitoes to the insecticide application and not from residual deposits on treated vegetation.

  7. Fourth Conference on Artificial Intelligence for Space Applications

    NASA Technical Reports Server (NTRS)

    Odell, Stephen L. (Compiler); Denton, Judith S. (Compiler); Vereen, Mary (Compiler)

    1988-01-01

    Proceedings of a conference held in Huntsville, Alabama, on November 15-16, 1988. The Fourth Conference on Artificial Intelligence for Space Applications brings together diverse technical and scientific work in order to help those who employ AI methods in space applications to identify common goals and to address issues of general interest in the AI community. Topics include the following: space applications of expert systems in fault diagnostics, in telemetry monitoring and data collection, in design and systems integration; and in planning and scheduling; knowledge representation, capture, verification, and management; robotics and vision; adaptive learning; and automatic programming.

  8. Telescope Automation and Remote Observing System (TAROS)

    NASA Astrophysics Data System (ADS)

    Wilson, G.; Czezowski, A.; Hovey, G. R.; Jarnyk, M. A.; Nielsen, J.; Roberts, B.; Sebo, K.; Smith, D.; Vaccarella, A.; Young, P.

    2005-12-01

    TAROS is a system that will allow Australian National University telescopes at remote locations to be operated automatically, or interactively with authenticated control via the internet. TAROS is operated through a Java front-end GUI and employs several Java technologies, such as the Java Message Service (JMS) for communication between the telescope and the remote observer, the Java Native Interface to integrate existing data acquisition software written in C++ (CICADA) with new Java programs, and the JSky collection of Java GUI components for parts of the remote observer client. In this poster the design and implementation of TAROS are described.

  9. [Wound information management system: a standardized scheme for acquisition, storage and management of wound information].

    PubMed

    Liu, Hu; Su, Rong-jia; Wu, Min-jie; Zhang, Yi; Qiu, Xiang-jun; Feng, Jian-gang; Xie, Ting; Lu, Shu-liang

    2012-06-01

    To form an objective, standardized, and convenient wound information management scheme by means of a wound information management system. A wound information management system was set up comprising an acquisition terminal, defined wound descriptions, a data bank, and related software. The efficacy of this system was evaluated in clinical practice. The acquisition terminal was composed of a third-generation mobile phone and its software. It was feasible to access the wound information, including descriptions, images, and therapeutic plans, from the data bank by mobile phone. Over 4 months, a total of 232 wound treatment records were entered, and standardized data for 38 patients were formed from them automatically. This system can provide standardized wound information management through standardized techniques for the acquisition, transmission, and storage of wound information. It can be used widely in hospitals, especially primary medical institutions. The system's data resource makes large-sample epidemiological studies possible in the future.

  10. Automatic real-time control of suspended sediment based upon high frequency in situ measurements of nephelometric turbidity

    Treesearch

    Jack Lewis; Rand Eads

    1998-01-01

    For estimating suspended sediment concentration (SSC) in rivers, turbidity is potentially a much better predictor than water discharge. Since about 1990, it has been feasible to automatically collect high-frequency turbidity data at remote sites using battery-powered turbidity probes that are properly mounted in the river or stream. With sensors calibrated...
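
    The essence of turbidity-based sampling control is a controller that fires the pumping sampler as turbidity crosses preset thresholds on the rising and falling limbs of a storm. A minimal sketch follows, with hypothetical thresholds and a print statement standing in for the sampler trigger; this is an illustration of the idea, not the authors' published algorithm.

        # Hedged sketch: threshold-crossing logic for a turbidity-paced sampler.
        def make_sampler(thresholds):
            taken = set()
            def on_reading(turbidity_ntu, rising):
                for t in thresholds:
                    crossed = turbidity_ntu >= t if rising else turbidity_ntu <= t
                    if crossed and (t, rising) not in taken:
                        taken.add((t, rising))
                        limb = "rising" if rising else "falling"
                        print(f"trigger sample: {turbidity_ntu} NTU, {limb} limb, threshold {t}")
            return on_reading

        sample = make_sampler([50, 100, 200, 400])
        sample(120, rising=True)    # fires the 50 and 100 NTU rising thresholds
        sample(60, rising=False)    # fires the 100, 200 and 400 NTU falling thresholds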

  11. Implementation and results of an integrated data quality assurance protocol in a randomized controlled trial in Uttar Pradesh, India.

    PubMed

    Gass, Jonathon D; Misra, Anamika; Yadav, Mahendra Nath Singh; Sana, Fatima; Singh, Chetna; Mankar, Anup; Neal, Brandon J; Fisher-Bowman, Jennifer; Maisonneuve, Jenny; Delaney, Megan Marx; Kumar, Krishan; Singh, Vinay Pratap; Sharma, Narender; Gawande, Atul; Semrau, Katherine; Hirschhorn, Lisa R

    2017-09-07

    There are few published standards or methodological guidelines for integrating Data Quality Assurance (DQA) protocols into large-scale health systems research trials, especially in resource-limited settings. The BetterBirth Trial is a matched-pair, cluster-randomized controlled trial (RCT) of the BetterBirth Program, which seeks to improve quality of facility-based deliveries and reduce 7-day maternal and neonatal mortality and maternal morbidity in Uttar Pradesh, India. In the trial, over 6300 deliveries were observed and over 153,000 mother-baby pairs across 120 study sites were followed to assess health outcomes. We designed and implemented a robust and integrated DQA system to sustain high-quality data throughout the trial. We designed the Data Quality Monitoring and Improvement System (DQMIS) to reinforce six dimensions of data quality: accuracy, reliability, timeliness, completeness, precision, and integrity. The DQMIS was comprised of five functional components: 1) a monitoring and evaluation team to support the system; 2) a DQA protocol, including data collection audits and targets, rapid data feedback, and supportive supervision; 3) training; 4) standard operating procedures for data collection; and 5) an electronic data collection and reporting system. Routine audits by supervisors included double data entry, simultaneous delivery observations, and review of recorded calls to patients. Data feedback reports identified errors automatically, facilitating supportive supervision through a continuous quality improvement model. The five functional components of the DQMIS successfully reinforced data reliability, timeliness, completeness, precision, and integrity. The DQMIS also resulted in 98.33% accuracy across all data collection activities in the trial. All data collection activities demonstrated improvement in accuracy throughout implementation. Data collectors demonstrated a statistically significant (p = 0.0004) increase in accuracy throughout consecutive audits. The DQMIS was successful, despite an increase from 20 to 130 data collectors. In the absence of widely disseminated data quality methods and standards for large RCT interventions in limited-resource settings, we developed an integrated DQA system, combining auditing, rapid data feedback, and supportive supervision, which ensured high-quality data and could serve as a model for future health systems research trials. Future efforts should focus on standardization of DQA processes for health systems research. ClinicalTrials.gov identifier, NCT02148952 . Registered on 13 February 2014.
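
    Of the audit mechanisms listed, double data entry is the most mechanical: two independently entered copies of a record are compared field by field and an agreement rate is reported. A minimal sketch with hypothetical field names, not the BetterBirth DQMIS itself:

        # Hedged sketch: field-level comparison for a double-data-entry audit.
        def audit(first_entry, second_entry):
            fields = sorted(set(first_entry) | set(second_entry))
            mismatches = [f for f in fields if first_entry.get(f) != second_entry.get(f)]
            agreement = 100.0 * (1 - len(mismatches) / len(fields))
            return mismatches, agreement

        rec_a = {"delivery_id": "D-1042", "birth_weight_g": 2850, "outcome": "alive"}
        rec_b = {"delivery_id": "D-1042", "birth_weight_g": 2580, "outcome": "alive"}
        errors, agreement = audit(rec_a, rec_b)
        print(errors, f"{agreement:.1f}% field agreement")  # ['birth_weight_g'] 66.7%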

  12. Demonstration of Advanced Geophysics and Classification Methods on Munitions Response Sites: Closed Castner Range Fort Bliss, TX

    DTIC Science & Technology

    2016-04-01

    ...were replaced with two ski-shaped runners, and a new mount above the array was used to hold the Inertial Measurement Unit (IMU) and Trimble R8 Real...to collect a cued data measurement (Figure 9). The instrument's pitch, roll, and yaw angles were measured automatically by the IMU. These angles and

  13. An Electronic Pillbox for Continuous Monitoring of Medication Adherence

    PubMed Central

    Hayes, Tamara. L.; Hunt, John M.; Adami, Andre; Kaye, Jeffrey A.

    2010-01-01

    We have developed an instrumented pillbox, called a MedTracker, which allows monitoring of medication adherence on a continuous basis. This device improves on existing systems by providing mobility, frequent and automatic data collection, more detailed information about nonadherence and medication errors, and the familiar interface of a 7-day drug store pillbox. We report on the design of the MedTracker, and on the results of a field trial in 39 homes to evaluate the device. PMID:17946369

  14. Using RGB-D sensors and evolutionary algorithms for the optimization of workstation layouts.

    PubMed

    Diego-Mas, Jose Antonio; Poveda-Bautista, Rocio; Garzon-Leal, Diana

    2017-11-01

    RGB-D sensors can collect postural data in an automated way. However, the application of these devices in real work environments requires overcoming problems such as lack of accuracy or occlusion of body parts. This work presents the use of RGB-D sensors and genetic algorithms for the optimization of workstation layouts. RGB-D sensors are used to capture workers' movements when they reach for objects on workbenches. The collected data are then used to optimize the workstation layout by means of genetic algorithms considering multiple ergonomic criteria. Results show that the typical drawbacks of using RGB-D sensors for body tracking are not a problem for this application, and that the combination with intelligent algorithms can automate the layout design process. The procedure described can be used to automatically suggest new layouts when workers or production processes change, to adapt layouts to specific workers based on how they perform their tasks, or to obtain layouts simultaneously optimized for several production processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. The USGS ``Did You Feel It?'' Internet-based Macroseismic Intensity Maps: Lessons Learned from a Decade of Online Data Collection (Invited)

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Quitoriano, V. R.; Hopper, M.; Mathias, S.; Dewey, J. W.

    2010-12-01

    Over the past decade, the U.S. Geological Survey’s “Did You Feel It?” (DYFI) system has automatically collected shaking and damage reports from Internet users immediately following earthquakes. This 10-yr stint of citizen-based science preceded the recently in vogue notion of "crowdsourcing" by nearly a decade. DYFI is a rapid and vast source of macroseismic data, providing quantitative and qualitative information about shaking intensities for earthquakes in the US and around the globe. Statistics attest to the abundance and rapid availability of these Internet-based macroseismic data: Over 1.8 million entries have been logged over the decade, and there are 30 events each with over 10,000 responses (230 events have over 1,000 entries). The greatest number of responses to date for an earthquake is over 78,000 for the April 2010, M7.2 Baja California, Mexico, event. Questionnaire response rates have reached 62,000 per hour (1,000 per min!), obviously requiring substantial web resource allocation and capacity. Outside the US, DYFI has gathered over 189,000 entries in 9,500 cities covering 140 countries since its global inception in late 2004. The rapid intensity data are automatically used in the Global ShakeMap (GSM) system, providing intensity constraints near population centers and in places without instrumental coverage (most of the world), and allowing for bias correction to the empirical prediction equations employed. ShakeMap has also been recently refined to automatically use macroseismic input data in their native form, and treat their uncertainties rigorously in concert with ground-motion data. Recent DYFI system improvements include a graphical user interface that allows seismic analysts to perform common functions, including map triggering and resizing, as well as sorting, searching, geocoding, and flagging entries. New web-based geolocation and geocoding services are being incorporated into DYFI for improving the accuracy of the users’ locations. A database containing the entire DYFI archive facilitates research by streamlining the selection, organization and export of data. For example, recent quantitative analyses of uncertainties of DYFI data provide confidence in their use: Averaging ten or more responses at a given location results in uncertainties of less than 0.2 intensity units. Systems comparable or complementary to DYFI now operate in several countries, and collaborative efforts to uniformly collect and exchange data in near real time are being further explored. From our experience with DYFI, essential components of an Internet-based citizen science portal include i) easy-to-use forms, ii) instant feedback so that users may see their contributions (validating their experience), iii) open space for first-person accounts (catharsis) and discussion of effects not covered in the questionnaire, and iv) routinely addressing user comments and questions. In addition, online user-friendly tools now include common searches, statistics, sorting of responses, time-entry histories, comparisons of data with empirical intensity estimates, and an easily-downloadable data format for researchers. A number of these functions were originally recommended by users, again emphasizing the need to attend to user feedback.

  16. A guidebook for using automatic passenger counter data for national transit database (NTD) reporting : summary.

    DOT National Transportation Integrated Search

    2010-01-01

    Planning for and funding public transportation is based on ridership. This information can be collected manually, but increasingly, it is collected by Automated Passenger Counters (APC). These electronic devices can be based on different technologies...

  17. Automatic Road Sign Inventory Using Mobile Mapping Systems

    NASA Astrophysics Data System (ADS)

    Soilán, M.; Riveiro, B.; Martínez-Sánchez, J.; Arias, P.

    2016-06-01

    The periodic inspection of certain infrastructure features plays a key role in road network safety and preservation, and in developing optimal maintenance plans that minimize the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information about the road network. Furthermore, time-stamped RGB imagery that is synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from the point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs: as the point cloud resolution is insufficient for this, RGB imagery is used, projecting the 3D points onto the corresponding images and analysing the RGB data within the bounding box defined by the projected points. The methodology was tested in urban and road environments in Spain, obtaining global recall greater than 95% and an F-score greater than 90%. In this way, inventory data are obtained in a fast, reliable manner and can be applied to improve maintenance planning for the road network, or to feed a Spatial Information System (SIS), so that road sign information is available for use in a Smart City context.
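
    The projection step named above is a standard pinhole-camera mapping from georeferenced points into a synchronized image. The sketch below assumes the intrinsic matrix K and the pose (R, t) come from the MMS calibration and trajectory; the numeric values shown are illustrative only.

        # Hedged sketch: project 3D sign points into an image, then crop RGB.
        import numpy as np

        def project(points_world, K, R, t):
            """points_world: (N, 3) array; returns (N, 2) pixel coordinates."""
            cam = R @ points_world.T + t.reshape(3, 1)   # world -> camera frame
            uvw = K @ cam                                # camera -> image plane
            return (uvw[:2] / uvw[2]).T                  # perspective divide

        K = np.array([[1000.0, 0, 640], [0, 1000.0, 480], [0, 0, 1]])
        pts = np.array([[2.0, 1.0, 10.0], [2.5, 1.2, 10.0]])  # sign points (m)
        px = project(pts, K, np.eye(3), np.zeros(3))
        # Bounding box of the projected points defines the RGB region to analyse.
        print(px.min(axis=0), px.max(axis=0))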

  18. A Retrospective Review of the Clinical Characteristics and Blood Glucose Data from Cellnovo System Users using Data Collected from the Cellnovo Online Platform.

    PubMed

    Hautier-Suply, Olivia; Friedmann, Yasmin; Shapley, Julian

    2018-04-01

    Technological advances have led to innovative insulin delivery systems for patients with type 1 diabetes mellitus. In particular, the combination of miniature engineering and software algorithms contained in continuous subcutaneous insulin infusion (CSII) system pumps provides the user and the healthcare practitioner with an opportunity to review and adjust blood glucose (BG) levels according to system feedback, and to modify or programme the regimen according to their needs. While CSII pumps record a number of data parameters such as BG level, carbohydrate intake, activity and insulin delivered, these data are generally 'locked in' and can only be accessed by uploading to a cloud-based system, so the information is not contemporaneous. The Cellnovo Diabetes Management System (Cellnovo, Bridgend, UK) allows data to be transmitted securely and wirelessly in real time to a secure server, where it is retrieved by the Cellnovo Online platform, enabling continuous access by the user and by clinicians. In this article, the authors describe a retrospective review of the patient data automatically uploaded to the Cellnovo Online platform. Baseline clinical and demographic characteristics collected at the start of pump therapy are shown for all patients, and BG data from a sub-cohort of patients who have been using the system for at least 6 months and who take and record an average of three BG level tests per day are presented to demonstrate glycaemic data over time.

  19. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as the Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read, study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
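
    As an example of the kind of automatic score calculation mentioned, the sketch below computes a Child-Pugh score and class. Cut-off conventions vary slightly between sources, so the boundaries used here are one common choice, not necessarily those of the study's database.

        # Hedged sketch: Child-Pugh score (5-6 = class A, 7-9 = B, 10-15 = C).
        def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, enceph):
            """ascites/enceph: 'none', 'mild', or 'severe' (grade mapping assumed)."""
            def band(value, low, high, reverse=False):
                pts = 1 if value < low else (2 if value <= high else 3)
                return 4 - pts if reverse else pts
            grade = {"none": 1, "mild": 2, "severe": 3}
            score = (band(bilirubin_mg_dl, 2.0, 3.0)               # mg/dL
                     + band(albumin_g_dl, 2.8, 3.5, reverse=True)  # g/dL, higher is better
                     + band(inr, 1.7, 2.3)
                     + grade[ascites] + grade[enceph])
            return score, "A" if score <= 6 else ("B" if score <= 9 else "C")

        print(child_pugh(1.5, 3.8, 1.2, "none", "none"))           # (5, 'A')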

  20. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    PubMed

    Wildeman, Maarten A; Zandbergen, Jeroen; Vincent, Andrew; Herdini, Camelia; Middeldorp, Jaap M; Fles, Renske; Dalesio, Otilia; van der Donk, Emile; Tan, I Bing

    2011-08-08

    Data collection by electronic medical record (EMR) systems has been proven helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands, a web-based system was selected to enable all participating centres to access data easily. This study assesses whether the introduction of a clinical trial data management service (CTDMS) composed of electronic case report forms (eCRFs) can result in effective data collection and treatment monitoring. Data items entered were automatically checked for inconsistencies when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both primary and secondary items, over the first five months of the trial. In the first five months, 51 patients were entered. The primary data error rate was 1.6%, whilst that for secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5%, respectively. The presented analysis shows that, five months after the introduction of the CTDMS, the primary and secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of its data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols.

  1. Water Mapping Using Multispectral Airborne LIDAR Data

    NASA Astrophysics Data System (ADS)

    Yan, W. Y.; Shaker, A.; LaRocque, P. E.

    2018-04-01

    This study investigates the use of the world's first multispectral airborne LiDAR sensor, Optech Titan, manufactured by Teledyne Optech, for automatic land-water classification, with a particular focus on near-shore regions and river environments. Although there are recent studies utilizing airborne LiDAR data for shoreline detection and water surface mapping, the majority of them only perform experimental testing on clipped data subsets or rely on data fusion with aerial/satellite imagery. In addition, most of the existing approaches require manual intervention or existing tidal/datum data for the collection of training samples. To tackle the drawbacks of previous approaches, we propose and develop an automatic data processing workflow for land-water classification using multispectral airborne LiDAR data. Depending on the nature of the study scene, two methods are proposed for automatic training data selection. The first method utilizes the elevation/intensity histogram fitted with a Gaussian mixture model (GMM) to preliminarily split the land and water bodies. The second method mainly relies on a newly developed scan line elevation intensity ratio (SLIER) to estimate the water surface data points. Regardless of the training method used, feature spaces can be constructed using the multispectral LiDAR intensity, elevation, and other features derived from these parameters. The comprehensive workflow was tested with two datasets collected over different near-shore and river environments, where the overall accuracy was better than 96%.
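
    The first selection method can be sketched directly: fit a two-component Gaussian mixture to the elevation (or intensity) values and take the lower-mean component as water. The input file and feature choice below are assumptions for illustration, not the paper's exact configuration.

        # Hedged sketch: GMM-based preliminary land/water split.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        elevation = np.load("lidar_elevation.npy").reshape(-1, 1)  # hypothetical input

        gmm = GaussianMixture(n_components=2, random_state=0).fit(elevation)
        labels = gmm.predict(elevation)
        water_component = int(np.argmin(gmm.means_.ravel()))  # lower mean = water
        water_mask = labels == water_component
        print(f"{water_mask.mean():.1%} of points preliminarily labelled water")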

  2. The development of the Athens Emotional States Inventory (AESI): collection, validation and automatic processing of emotionally loaded sentences.

    PubMed

    Chaspari, Theodora; Soldatos, Constantin; Maragos, Petros

    2015-01-01

    This work concerns the development of ecologically valid procedures for collecting reliable and unbiased emotional data, towards computer interfaces with social and affective intelligence targeting patients with mental disorders. The Athens Emotional States Inventory (AESI), presented here, proposes the design, recording and validation of an audiovisual database for five emotional states: anger, fear, joy, sadness and neutral. The items of the AESI consist of sentences whose content is indicative of the corresponding emotion. Emotional content was assessed through a survey of 40 young participants with a questionnaire following a Latin square design. The emotional sentences that were correctly identified by 85% of the participants were recorded in a soundproof room with microphones and cameras. A preliminary validation of the AESI is performed through automatic emotion recognition experiments from speech. The resulting database contains 696 utterances recorded in the Greek language by 20 native speakers and has a total duration of approximately 28 min. Speech classification results yield accuracy up to 75.15% for automatically recognizing the emotions in the AESI. These results indicate the usefulness of our approach for collecting emotional data with reliable content, balanced across classes and with reduced environmental variability.

  3. Using Activity-Related Behavioural Features towards More Effective Automatic Stress Detection

    PubMed Central

    Giakoumis, Dimitris; Drosou, Anastasios; Cipresso, Pietro; Tzovaras, Dimitrios; Hassapis, George; Gaggioli, Andrea; Riva, Giuseppe

    2012-01-01

    This paper introduces activity-related behavioural features that can be automatically extracted from a computer system, with the aim of increasing the effectiveness of automatic stress detection. The proposed features are based on processing of appropriate video and accelerometer recordings taken from the monitored subjects. For the purposes of the present study, an experiment was conducted that utilized a stress-induction protocol based on the Stroop colour word test. Video, accelerometer and biosignal (Electrocardiogram and Galvanic Skin Response) recordings were collected from nineteen participants. Then, an explorative study was conducted following a methodology mainly based on spatiotemporal descriptors (Motion History Images) extracted from video sequences. A large set of activity-related behavioural features, potentially useful for automatic stress detection, were proposed and examined. Experimental evaluation showed that several of these behavioural features significantly correlate with self-reported stress. Moreover, it was found that the use of the proposed features can significantly enhance the performance of typical automatic stress detection systems, which are commonly based on biosignal processing. PMID:23028461
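
    A Motion History Image can be maintained with a simple per-pixel update: pixels that move now are stamped with the current time, and motion older than a fixed duration decays to zero. The sketch below uses frame differencing as the motion mask, which is an assumption for illustration rather than the paper's exact pipeline.

        # Hedged sketch: Motion History Image (MHI) update with frame differencing.
        import numpy as np

        def update_mhi(mhi, prev_frame, frame, timestamp, duration=1.0, thresh=30):
            diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
            motion = diff > thresh
            mhi[motion] = timestamp                             # stamp moving pixels
            mhi[~motion & (mhi < timestamp - duration)] = 0.0   # expire old motion
            return mhi

        # Usage: grayscale uint8 frames; mhi starts as float32 zeros of same shape.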

  4. Risk Identification in a Smart Monitoring System Used to Preserve Artefacts Based on Textile Materials

    NASA Astrophysics Data System (ADS)

    Diaconescu, V. D.; Scripcariu, L.; Mătăsaru, P. D.; Diaconescu, M. R.; Ignat, C. A.

    2018-06-01

    Exhibited artefacts based on textile materials can be affected by environmental conditions. A smart monitoring system that commands an adaptive automatic environment control system is proposed for indoor exhibition spaces containing various textile artefacts. All exhibited objects are monitored by multiple multi-sensor nodes containing temperature, relative humidity and light sensors. Data collected periodically from the entire sensor network are stored in a database and statistically processed in order to identify and classify the environmental risk. Risk consequences are analyzed according to the risk class, and the smart system commands different control measures in order to stabilize the indoor environmental conditions at the recommended values and prevent material degradation.

  5. MAIL LOG, program theory, volume 1. [Scout project automatic data system

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

  6. The CCD/Transit Instrument (CTI) data-analysis system

    NASA Technical Reports Server (NTRS)

    Cawson, M. G. M.; Mcgraw, J. T.; Keane, M. J.

    1995-01-01

    The automated software system for archiving, analyzing, and interrogating data from the CCD/Transit Instrument (CTI) is described. The CTI collects up to 450 Mbytes of image-data each clear night in the form of a narrow strip of sky observed in two colors. The large data-volumes and the scientific aims of the project make it imperative that the data are analyzed within the 24-hour period following the observations. To this end a fully automatic and self evaluating software system has been developed. The data are collected from the telescope in real-time and then transported to Tucson for analysis. Verification is performed by visual inspection of random subsets of the data and obvious cosmic rays are detected and removed before permanent archival is made to the optical disc. The analysis phase is performed by a pair of linked algorithms, one operating on the absolute pixel-values and the other on the spatial derivative of the data. In this way both isolated and merged images are reliably detected in a single pass. In order to isolate the latter algorithm from the effects of noise spikes a 3x3 Hanning filter is applied to the raw data before the analysis is run. The algorithms reduce the input pixel-data to a database of measured parameters for each image which has been found. A contrast filter is applied in order to assign a detection-probability to each image and then x-y calibration and intensity calibration are performed using known reference stars in the strip. These are added to as necessary by secondary standards boot-strapped from the CTI data itself. The final stages involve merging the new data into the CTI Master-list and History-list and the automatic comparison of each new detection with a set of pre-defined templates in parameter-space to find interesting objects such as supernovae, quasars and variable stars. Each stage of the processing from verification to interesting image selection is performed under a data-logging system which both controls the pipe-lining of data through the system and records key performance monitor parameters which are built into the software. Furthermore, the data from each stage are stored in databases to facilitate evaluation, and all stages offer the facility to enter keyword-indexed free-format text into the data-logging system. In this way a large measure of certification is built into the system to provide the necessary confidence in the end results.
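
    The "3x3 Hanning filter" mentioned in the analysis phase is, in its common image-processing form, a separable binomial smoothing kernel. A sketch of that interpretation follows; since the paper does not give the exact kernel, treating it as the binomial 3x3 kernel is an assumption.

        # Hedged sketch: 3x3 Hanning (binomial) smoothing before differentiation.
        import numpy as np
        from scipy.signal import convolve2d

        kernel = np.array([[1, 2, 1],
                           [2, 4, 2],
                           [1, 2, 1]], dtype=float) / 16.0

        def smooth(strip):
            """strip: 2D array of raw pixel values from the CTI strip."""
            return convolve2d(strip, kernel, mode="same", boundary="symm")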

  7. Development of a web-based register for the Dutch national study on biologicals in JIA: www.ABC-register.nl.

    PubMed

    Prince, F H M; Ferket, I S; Kamphuis, S; Armbrust, W; Ten Cate, R; Hoppenreijs, E P A H; Koopman-Keemink, Y; van Rossum, M A J; van Santen-Hoeufft, M; Twilt, M; van Suijlekom-Smit, L W A

    2008-09-01

    Most clinical studies use paper case record forms (CRFs) to collect data. In the Dutch multi-centre observational study on biologicals we encountered several disadvantages of paper CRFs: delays in data collection, lack of overview of the collected data, and difficulty obtaining up-to-date interim reports. We therefore wanted to create a more effective method of data collection than paper CRFs for a multi-centre study. We designed a web-based register with the intention of making it easy to use for participating physicians while remaining accurate and up-to-date. Security requirements were taken into account to safeguard patient data. The web-based register was tested with data from 161 juvenile idiopathic arthritis patients from nine different centres. Internal validity was obtained and user-friendliness guaranteed. To ensure the completeness of the data, automatically generated e-mail alerts were implemented in the web-based register. More transparency of the data was achieved by including the option to automatically generate interim reports from the web-based register. The safety was tested and approved. By digitalizing the CRF we achieved our aim of providing easy, rapid, and safe access to the database, and contributed to a new way of data collection. Although the web-based register was designed for the current multi-centre observational study, this type of instrument can also be applied to other types of studies. We expect that collaborative study groups in particular will find it an efficient tool for collecting data.

  8. Towards data integration automation for the French rare disease registry.

    PubMed

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data on rare disease patients seen in different hospitals. To avoid duplicate data entry by health professionals, the project plans to deploy connectors to the existing systems to automatically retrieve data. Given the heterogeneity of the data and the large number of source systems, automating the creation of connectors is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration process. The generated mappings are formalized as exploitable mapping expressions. Following this methodology, a process was tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach used was enhanced and more good mappings were detected. Nonetheless, further improvements could be made to deal with the semantic issue and to process other data types.

  9. Towards data integration automation for the French rare disease registry

    PubMed Central

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data on rare disease patients seen in different hospitals. To avoid duplicate data entry by health professionals, the project plans to deploy connectors to the existing systems to automatically retrieve data. Given the heterogeneity of the data and the large number of source systems, automating the creation of connectors is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration process. The generated mappings are formalized as exploitable mapping expressions. Following this methodology, a process was tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach used was enhanced and more good mappings were detected. Nonetheless, further improvements could be made to deal with the semantic issue and to process other data types. PMID:26958224

  10. Automatic Car Identification - an Evaluation

    DOT National Transportation Integrated Search

    1972-03-01

    In response to a Federal Railroad Administration request, the Transportation Systems Center evaluated the Automatic Car Identification System (ACI) used on the nation's railroads. The ACI scanner was found to be adequate for reliable data output whil...

  11. Automatically exposing OpenLifeData via SADI semantic Web Services.

    PubMed

    González, Alejandro Rodríguez; Callahan, Alison; Cruz-Toledo, José; Garcia, Adrian; Egaña Aranguren, Mikel; Dumontier, Michel; Wilkinson, Mark D

    2014-01-01

    Two distinct trends are emerging with respect to how data is shared, collected, and analyzed within the bioinformatics community. First, Linked Data, exposed as SPARQL endpoints, promises to make data easier to collect and integrate by moving towards the harmonization of data syntax, descriptive vocabularies, and identifiers, as well as providing a standardized mechanism for data access. Second, Web Services, often linked together into workflows, normalize data access and create transparent, reproducible scientific methodologies that can, in principle, be re-used and customized to suit new scientific questions. Constructing queries that traverse semantically rich Linked Data requires substantial expertise, yet traditional RESTful or SOAP Web Services cannot adequately describe the content of a SPARQL endpoint. We propose that content-driven Semantic Web Services can enable facile discovery of Linked Data, independent of their location. We use a well-curated Linked Dataset, OpenLifeData, and utilize its descriptive metadata to automatically configure a series of more than 22,000 Semantic Web Services that expose all of its content via the SADI set of design principles. The OpenLifeData SADI services are discoverable via queries to the SHARE registry and easy to integrate into new or existing bioinformatics workflows and analytical pipelines. We demonstrate the utility of this system through a comparison of Web Service-mediated data access with traditional SPARQL, and note that this approach not only simplifies data retrieval, but simultaneously provides protection against resource-intensive queries. We show, through a variety of different clients and examples of varying complexity, that data from the myriad OpenLifeData datasets can be recovered without any need for prior knowledge of the content or structure of the SPARQL endpoints. We also demonstrate that, via clients such as SHARE, the complexity of federated SPARQL queries is dramatically reduced.
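
    For contrast, the "traditional SPARQL" access mode against which the services are compared can be sketched as a direct query to an endpoint; the endpoint URL below is a placeholder, not an actual OpenLifeData address.

    ```python
    # Sketch of direct SPARQL access via the standard SPARQL HTTP protocol.
    # The endpoint URL is a hypothetical placeholder.
    import requests

    ENDPOINT = "https://example.org/sparql"   # hypothetical SPARQL endpoint
    QUERY = """
    SELECT ?s ?label WHERE {
      ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label .
    } LIMIT 10
    """

    resp = requests.get(ENDPOINT,
                        params={"query": QUERY},
                        headers={"Accept": "application/sparql-results+json"},
                        timeout=30)
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["s"]["value"], row["label"]["value"])
    ```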

  12. SU-E-T-50: Automatic Validation of Megavoltage Beams Modeled for Clinical Use in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melchior, M; Salinas Aranda, F; 21st Century Oncology, Ft. Myers, FL

    2014-06-01

    Purpose: To automatically validate megavoltage beams modeled in the XiO™ 4.50 (Elekta, Stockholm, Sweden) and Varian Eclipse™ treatment planning systems (TPS) (Varian Associates, Palo Alto, CA, USA), reducing validation time before beam-on for clinical use. Methods: A software application that can automatically read and analyze DICOM RT Dose and W2CAD files was developed in the MatLab integrated development environment. TPS-calculated dose distributions, in DICOM RT Dose format, were compared with dose values measured in different Varian Clinac beams, in W2CAD format. The experimental beam data used were those acquired for beam commissioning, collected on a water phantom with a 2D automatic beam scanning system. Two methods were chosen to evaluate the fit of the dose distributions: gamma analysis and the point tests described in Appendix E of IAEA TECDOC-1583. Depth dose curves and beam profiles were evaluated for both open and wedged beams. The tolerance parameters chosen for gamma analysis are 3% and 3 mm for dose and distance, respectively. Absolute dose was measured independently at the points proposed in Appendix E of TECDOC-1583 to validate the software results. Results: TPS-calculated depth dose distributions agree with measured beam data within the fixed precision values at all depths analyzed. Measured beam dose profiles match TPS-calculated doses with high accuracy in both open and wedged beams. The analysis of depth and profile dose distribution fits shows gamma values < 1. Relative errors at the points proposed in Appendix E of TECDOC-1583 meet the tolerances recommended therein. Independent absolute dose measurements at those points confirm the software results. Conclusion: Automatic validation of megavoltage beams modeled for clinical use was accomplished. The software tool developed proved efficient, giving users a convenient and reliable environment in which to decide whether or not to accept a beam model for clinical use. Validation time before beam-on for clinical use was reduced to a few hours.
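
    A minimal one-dimensional sketch of the gamma analysis used above, with the quoted 3%/3 mm tolerances, might look like this; real commissioning software works on 2D/3D dose grids with interpolation and careful normalization, so this is illustrative only.

    ```python
    # Minimal 1D gamma-index sketch for a depth-dose curve with 3%/3 mm
    # criteria (global normalization). Illustrative only.
    import numpy as np

    def gamma_1d(x_mm, measured, calculated, dose_tol=0.03, dist_tol_mm=3.0):
        """Return the gamma value at each measured point."""
        d_max = measured.max()
        gammas = []
        for xr, dr in zip(x_mm, measured):
            dd = (calculated - dr) / (dose_tol * d_max)   # dose difference term
            dx = (x_mm - xr) / dist_tol_mm                # distance term
            gammas.append(np.sqrt(dd**2 + dx**2).min())
        return np.array(gammas)

    x = np.arange(0.0, 100.0, 1.0)                 # depth in mm
    meas = np.exp(-x / 80.0)                       # toy depth-dose curve
    calc = np.exp(-(x - 0.5) / 80.0)               # slightly shifted model
    g = gamma_1d(x, meas, calc)
    print(f"pass rate: {(g <= 1.0).mean():.1%}")   # fraction with gamma <= 1
    ```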

  13. Automatic sample Dewar for MX beam-line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charignon, T.; Tanchon, J.; Trollier, T.

    2014-01-29

    It is very common for crystals of large biological macromolecules to show considerable variation in diffraction quality. In order to increase the number of samples that are tested for diffraction quality before any full data collection at the ESRF*, an automatic sample Dewar has been implemented. The design and performance of the Dewar are reported in this paper. The automatic sample Dewar has a 240-sample capacity with automatic loading/unloading ports. The storage Dewar can work with robots and can be integrated into a fully automatic MX** beam-line. The samples are positioned in front of the loading/unloading ports by an automatic rotating plate. A view port has been implemented for reading the data-matrix code on each sample loaded in the Dewar. Finally, the Dewar is insulated with polyurethane foam, which keeps liquid nitrogen consumption below 1.6 L/h; the static insulation also makes vacuum equipment and maintenance unnecessary. This Dewar will be useful for increasing the number of samples tested at synchrotrons.

  14. Network-Capable Application Process and Wireless Intelligent Sensors for ISHM

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray

    2011-01-01

    Intelligent sensor technology and systems are increasingly becoming attractive means to serve as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS) in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where the embedded memory is not available. An HTML-based user interface provides a visual tool to interface with those distributed sensors that a TEDS is associated with, to automate the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can achieve support for intelligent process in Integrated Systems Health Management (ISHM) for the purpose of monitoring, detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage. This invention enables wide-area sensing and employs numerous globally distributed sensing devices that observe the physical world through the existing sensor network. This innovation enables distributed storage, distributed processing, distributed intelligence, and the availability of DiaK (Data, Information, and Knowledge) to any element as needed. It also enables the simultaneous execution of multiple processes, and represents models that contribute to the determination of the condition and health of each element in the system. The NCAP (intelligent process) can configure data-collection and filtering processes in reaction to sensed data, allowing it to decide when and how to adapt collection and processing with regard to sophisticated analysis of data derived from multiple sensors. The user will be able to view the sensing device network as a single unit that supports a high-level query language. Each query would be able to operate over data collected from across the global sensor network just as a search query encompasses millions of Web pages. The sensor web can preserve ubiquitous information access between the querier and the queried data. Pervasive monitoring of the physical world raises significant data and privacy concerns. This innovation enables different authorities to control portions of the sensing infrastructure, and sensor service authors may wish to compose services across authority boundaries.
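
    A loose, illustrative model of the fields a TEDS might carry, including a "virtual TEDS" loaded from a file for legacy sensors, is sketched below; the actual IEEE 1451 templates define a far more detailed binary layout, so every field and name here is an assumption.

    ```python
    # Loose illustrative model of TEDS content; the real IEEE 1451 binary
    # templates are more involved. All fields and values are assumptions.
    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class Teds:
        manufacturer_id: int
        model_number: int
        serial_number: int
        measurement_unit: str       # e.g. 'kPa'
        min_value: float            # calibrated range
        max_value: float
        calibration_date: str

    def load_virtual_teds(path: str) -> Teds:
        """Load a 'virtual TEDS' file for a legacy sensor without
        embedded memory (file format assumed to be JSON here)."""
        with open(path) as fh:
            return Teds(**json.load(fh))

    teds = Teds(17, 4021, 991, "kPa", 0.0, 700.0, "2011-01-15")
    print(json.dumps(asdict(teds), indent=2))
    ```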

  15. Behavioral and physiological changes around estrus events identified using multiple automated monitoring technologies.

    PubMed

    Dolecheck, K A; Silvia, W J; Heersche, G; Chang, Y M; Ray, D L; Stone, A E; Wadsworth, B A; Bewley, J M

    2015-12-01

    This study included 2 objectives. The first objective was to describe estrus-related changes in parameters automatically recorded by the CowManager SensOor (Agis Automatisering, Harmelen, the Netherlands), DVM bolus (DVM Systems LLC, Greeley, CO), HR Tag (SCR Engineers Ltd., Netanya, Israel), IceQube (IceRobotics Ltd., Edinburgh, UK), and Track a Cow (Animart Inc., Beaver Dam, WI). This objective was accomplished using 35 cows in 3 groups between January and June 2013 at the University of Kentucky Coldstream Dairy. We used a modified Ovsynch with G7G protocol to partially synchronize ovulation, ending after the last PGF2α injection (d 0) to allow estrus expression. Visual observation for standing estrus was conducted for four 30-min periods at 0330, 1000, 1430, and 2200h on d 2, 3, 4, and 5. Eighteen of the 35 cows stood to be mounted at least once during the observation period. These cows were used to compare differences between the 6h before and after the first standing event (estrus) and the 2wk preceding that period (nonestrus) for all technology parameters. Differences between estrus and nonestrus were observed for CowManager SensOor minutes feeding per hour, minutes of high ear activity per hour, and minutes ruminating per hour; twice daily DVM bolus reticulorumen temperature; HR Tag neck activity per 2h and minutes ruminating per 2h; IceQube lying bouts per hour, minutes lying per hour, and number of steps per hour; and Track a Cow leg activity per hour and minutes lying per hour. No difference between estrus and nonestrus was observed for CowManager SensOor ear surface temperature per hour. The second objective of this study was to explore the estrus detection potential of machine-learning techniques using automatically collected data. Three machine-learning techniques (random forest, linear discriminant analysis, and neural network) were applied to automatically collected parameter data from the 18 cows observed in standing estrus. Machine learning accuracy for all technologies ranged from 91.0 to 100.0%. When we compared visual observation with progesterone profiles of all 32 cows, we found 65.6% accuracy. Based on these results, machine-learning techniques have potential to be applied to automatically collected technology data for estrus detection. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
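
    As a sketch of the second objective, one of the three named techniques (a random forest) can be trained on per-window sensor features; the feature names and toy data below are assumptions, not the study's dataset.

    ```python
    # Sketch of random-forest estrus classification on sensor features.
    # Feature names and the synthetic data are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n = 200
    # Hypothetical per-window features: neck activity, minutes ruminating,
    # minutes lying, steps per hour.
    X_nonestrus = rng.normal([40.0, 30.0, 35.0, 60.0], 8.0, size=(n, 4))
    X_estrus = rng.normal([70.0, 18.0, 20.0, 110.0], 8.0, size=(n, 4))
    X = np.vstack([X_nonestrus, X_estrus])
    y = np.array([0] * n + [1] * n)   # 0 = nonestrus window, 1 = estrus window

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```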

  16. Visualization and characterization of users in a citizen science project

    NASA Astrophysics Data System (ADS)

    Morais, Alessandra M. M.; Raddick, Jordan; Coelho dos Santos, Rafael D.

    2013-05-01

    Recent technological advances have allowed the creation and use of internet-based systems where many users can collaborate in gathering and sharing information for specific or general purposes: social networks, e-commerce review systems, collaborative knowledge systems, etc. Since most of the data collected in these systems is user-generated, understanding the motivations and general behavior of users is a very important issue. Of particular interest are citizen science projects, where users without scientific training are asked to collaborate in labeling and classifying information (either automatically, by giving away idle computer time, or manually, by actually viewing data and providing information about it). Understanding the behavior of users of these types of data collection systems may help increase user involvement, categorize users according to different parameters, facilitate their collaboration with the systems, guide the design of better user interfaces, and allow better planning and deployment of similar projects and systems. The behavior of these users can be estimated through analysis of their collaboration track: records of which user did what and when can be easily and unobtrusively collected in several different ways, the simplest being a log of activities. In this paper we present results on the visualization and characterization of almost 150,000 users with more than 80,000,000 collaborations with a citizen science project, Galaxy Zoo I, which asked users to classify galaxy images. Basic visualization techniques are not applicable due to the number of users, so techniques that characterize users' behavior based on feature extraction and clustering are used.
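
    The feature-extraction-plus-clustering approach might be sketched as follows; the per-user features chosen here (total classifications, active days, mean gap between contributions) are plausible stand-ins, not necessarily those used in the paper.

    ```python
    # Sketch: cluster users by activity features extracted from a log.
    # The three features and the synthetic populations are assumptions.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # One row per user: [n_classifications, n_active_days, mean_gap_hours]
    features = np.vstack([
        rng.normal([30, 3, 48], [10, 1, 12], size=(500, 3)),     # casual users
        rng.normal([5000, 200, 2], [800, 30, 1], size=(50, 3)),  # power users
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
        StandardScaler().fit_transform(features))
    print(np.bincount(labels))   # cluster sizes roughly match the two groups
    ```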

  17. Bio-optical data integration based on a 4 D database system approach

    NASA Astrophysics Data System (ADS)

    Imai, N. N.; Shimabukuro, M. H.; Carmo, A. F. C.; Alcantara, E. H.; Rodrigues, T. W. P.; Watanabe, F. S. Y.

    2015-04-01

    Bio-optical characterization of water bodies requires spatio-temporal data about inherent optical properties and apparent optical properties, which allow comprehension of the underwater light field and support the development of models for monitoring water quality. Measurements are taken to represent the optical properties along a column of water, so the spectral data must be related to depth. However, the spatial positions of measurement may differ because the collecting instruments vary, and the records do not necessarily refer to the same wavelengths. An additional difficulty is that distinct instruments store data in different formats. A data integration approach is needed to make these large, multi-source data sets suitable for analysis. It then becomes possible to evaluate semi-empirical models, even automatically, preceded by preliminary quality-control tasks. This work presents a solution for the stated scenario based on a spatial (geographic) database approach, with the adoption of an object-relational Database Management System (DBMS), chosen for its ability to represent all data collected in the field together with data obtained by laboratory analysis and the remote sensing images taken at the time of field data collection. This data integration approach leads to a 4D representation, since its coordinate system includes 3D spatial coordinates (planimetric and depth) and the time at which each datum was taken. The PostgreSQL DBMS, extended by the PostGIS module, was adopted to provide the ability to manage spatial/geospatial data. A prototype was developed that offers the main tools an analyst needs to prepare the data sets for analysis.

  18. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, the semi-active RFID watt-hour meter is applied to automatic verification lines and intelligent warehouse management. Through the transmission system, the test and auxiliary systems, and the monitoring system, it realizes the scheduling, binding, control, and data exchange of watt-hour meters, among other functions, giving more accurate positioning, more efficient management, rapid data updates, and all information at a glance. The approach effectively improves the quality, efficiency, and automation of verification, and realizes more efficient data and warehouse management.

  19. iRODS: A Distributed Data Management Cyberinfrastructure for Observatories

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; Vernon, F.

    2007-12-01

    Large-scale and long-term preservation of both observational and synthesized data requires a system that virtualizes data management concepts. A methodology is needed that can work across long distances in space (distribution) and long-periods in time (preservation). The system needs to manage data stored on multiple types of storage systems including new systems that become available in the future. This concept is called infrastructure independence, and is typically implemented through virtualization mechanisms. Data grids are built upon concepts of data and trust virtualization. These concepts enable the management of collections of data that are distributed across multiple institutions, stored on multiple types of storage systems, and accessed by multiple types of clients. Data virtualization ensures that the name spaces used to identify files, users, and storage systems are persistent, even when files are migrated onto future technology. This is required to preserve authenticity, the link between the record and descriptive and provenance metadata. Trust virtualization ensures that access controls remain invariant as files are moved within the data grid. This is required to track the chain of custody of records over time. The Storage Resource Broker (http://www.sdsc.edu/srb) is one such data grid used in a wide variety of applications in earth and space sciences such as ROADNet (roadnet.ucsd.edu), SEEK (seek.ecoinformatics.org), GEON (www.geongrid.org) and NOAO (www.noao.edu). Recent extensions to data grids provide one more level of virtualization - policy or management virtualization. Management virtualization ensures that execution of management policies can be automated, and that rules can be created that verify assertions about the shared collections of data. When dealing with distributed large-scale data over long periods of time, the policies used to manage the data and provide assurances about the authenticity of the data become paramount. The integrated Rule-Oriented Data System (iRODS) (http://irods.sdsc.edu) provides the mechanisms needed to describe not only management policies, but also to track how the policies are applied and their execution results. The iRODS data grid maps management policies to rules that control the execution of the remote micro-services. As an example, a rule can be created that automatically creates a replica whenever a file is added to a specific collection, or extracts its metadata automatically and registers it in a searchable catalog. For the replication operation, the persistent state information consists of the replica location, the creation date, the owner, the replica size, etc. The mechanism used by iRODS for providing policy virtualization is based on well-defined functions, called micro-services, which are chained into alternative workflows using rules. A rule engine, based on the event-condition-action paradigm executes the rule-based workflows after an event. Rules can be deferred to a pre-determined time or executed on a periodic basis. As the data management policies evolve, the iRODS system can implement new rules, new micro-services, and new state information (metadata content) needed to manage the new policies. Each sub- collection can be managed using a different set of policies. The discussion of the concepts in rule-based policy virtualization and its application to long-term and large-scale data management for observatories such as ORION and NEON will be the basis of the paper.
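
    The event-condition-action paradigm behind the iRODS rule engine can be illustrated with a minimal sketch; note this is not iRODS's actual rule language or micro-service API, just the control pattern the abstract describes.

    ```python
    # Minimal event-condition-action sketch of the rule paradigm (illustrative
    # only; iRODS has its own rule language and micro-service API).
    from typing import Callable

    Rule = tuple[Callable[[dict], bool], Callable[[dict], None]]
    rules: list[Rule] = []

    def on_event(condition, action):
        """Register a rule: when 'condition' holds for an event, run 'action'."""
        rules.append((condition, action))

    def fire(event: dict):
        """Dispatch an event through all registered rules."""
        for condition, action in rules:
            if condition(event):
                action(event)

    # Policy: replicate any file added to the 'observations' collection.
    on_event(lambda e: e["type"] == "file_added"
                       and e["collection"] == "observations",
             lambda e: print(f"replicating {e['path']} to secondary storage"))

    fire({"type": "file_added", "collection": "observations",
          "path": "/obs/2007/day001.dat"})
    ```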

  20. 75 FR 56662 - Proposed Information Collection (Application for Authority To Close Loans on an Automatic Basis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-16

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0252] Proposed Information Collection (Application for Authority To Close Loans on an Automatic Basis--Nonsupervised Lenders) Activity: Comment... on an automatic basis. DATES: Written comments and recommendations on the proposed collection of...

  1. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy).

    PubMed

    Tagliaferri, Luca; Kovács, György; Autorino, Rosa; Budrukkar, Ashwini; Guinot, Jose Luis; Hildebrand, Guido; Johansson, Bengt; Monge, Rafael Martìnez; Meyer, Jens E; Niehoff, Peter; Rovirosa, Angeles; Takàcsi-Nagy, Zoltàn; Dinapoli, Nicola; Lanzotti, Vito; Damiani, Andrea; Soror, Tamer; Valentini, Vincenzo

    2016-08-01

    The aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. The GEC-ESTRO (Groupe Européen de Curiethérapie - European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data-set), and the necessary COBRA software services, as well as in the peer review of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task group. Eleven centers from 6 countries signed the agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA Storage System (C-SS) is not time-consuming because, thanks to the use of "brokers", data can be extracted directly from each center's storage systems through a connection with a structured query language database (SQL-DB), Microsoft Access(®), FileMaker Pro(®), or Microsoft Excel(®). The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of "on-purpose data projection". The C-SS architecture protects privacy because it never exposes data that could identify an individual patient. The C-SS can also benefit from so-called "distributed learning" approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are shared. Setting up a consortium proved a feasible and practical way to create an international, multi-system data-sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not interfere with each center's own data storage technologies, procedures, and habits. Furthermore, the method preserves the privacy of all patients.

  2. Sentry: An Automated Close Approach Monitoring System for Near-Earth Objects

    NASA Astrophysics Data System (ADS)

    Chamberlin, A. B.; Chesley, S. R.; Chodas, P. W.; Giorgini, J. D.; Keesey, M. S.; Wimberly, R. N.; Yeomans, D. K.

    2001-11-01

    In response to international concern about potential asteroid impacts on Earth, NASA's Near-Earth Object (NEO) Program Office has implemented a new system called "Sentry" to automatically update the orbits of all NEOs on a daily basis and compute Earth close approaches up to 100 years into the future. Results are published on our web site (http://neo.jpl.nasa.gov/) and updated orbits and ephemerides made available via the JPL Horizons ephemeris service (http://ssd.jpl.nasa.gov/horizons.html). Sentry collects new and revised astrometric observations from the Minor Planet Center (MPC) via their electronic circulars (MPECs) in near real time as well as radar and optical astrometry sent directly from observers. NEO discoveries and identifications are detected in MPECs and processed appropriately. In addition to these daily updates, Sentry synchronizes with each monthly batch of MPC astrometry and automatically updates all NEO observation files. Daily and monthly processing of NEO astrometry is managed using a queuing system which allows for manual intervention of selected NEOs without interfering with the automatic system. At the heart of Sentry is a fully automatic orbit determination program which handles outlier rejection and ensures convergence in the new solution. Updated orbital elements and their covariances are published via Horizons and our NEO web site, typically within 24 hours. A new version of Horizons, in development, will allow computation of ephemeris uncertainties using covariance data. The positions of NEOs with updated orbits are numerically integrated up to 100 years into the future and each close approach to any perturbing body in our dynamic model (all planets, Moon, Ceres, Pallas, Vesta) is recorded. Significant approaches are flagged for extended analysis including Monte Carlo studies. Results, such as minimum encounter distances and future Earth impact probabilities, are published on our NEO web site.

  3. Legal Medicine Information System using CDISC ODM.

    PubMed

    Kiuchi, Takahiro; Yoshida, Ken-ichi; Kotani, Hirokazu; Tamaki, Keiji; Nagai, Hisashi; Harada, Kazuki; Ishikawa, Hirono

    2013-11-01

    We have developed a new database system for forensic autopsies, called the Legal Medicine Information System, using the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM). This system comprises two subsystems, namely the Institutional Database System (IDS) located in each institute and containing personal information, and the Central Anonymous Database System (CADS) located in the University Hospital Medical Information Network Center containing only anonymous information. CDISC ODM is used as the data transfer protocol between the two subsystems. Using the IDS, forensic pathologists and other staff can register and search for institutional autopsy information, print death certificates, and extract data for statistical analysis. They can also submit anonymous autopsy information to the CADS semi-automatically. This reduces the burden of double data entry, the time-lag of central data collection, and anxiety regarding legal and ethical issues. Using the CADS, various studies on the causes of death can be conducted quickly and easily, and the results can be used to prevent similar accidents, diseases, and abuse. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
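
    A minimal sketch of producing an ODM-style XML document for transfer between the institutional and central systems is shown below; the attributes used are a simplified subset of the real CDISC ODM schema, and all OIDs are invented placeholders.

    ```python
    # Sketch of a minimal ODM-style XML document. Element names follow the
    # CDISC ODM vocabulary, but the attribute set is simplified and all OIDs
    # are invented placeholders.
    import xml.etree.ElementTree as ET

    odm = ET.Element("ODM", FileType="Transactional", FileOID="LMIS-0001")
    clinical = ET.SubElement(odm, "ClinicalData", StudyOID="LMIS",
                             MetaDataVersionOID="v1")
    subject = ET.SubElement(clinical, "SubjectData", SubjectKey="ANON-12345")
    event = ET.SubElement(subject, "StudyEventData", StudyEventOID="AUTOPSY")
    form = ET.SubElement(event, "FormData", FormOID="CAUSE_OF_DEATH")

    print(ET.tostring(odm, encoding="unicode"))
    ```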

  4. Instrumentation and Automation of Wastewater Collection and Treatment Systems.

    ERIC Educational Resources Information Center

    Roesler, Joseph F.; Cummins, Michael D.

    1978-01-01

    Presents a literature review of the use of instrumentation and automation of wastewater treatment systems, covering publications of 1976-77. This review includes automatic control systems and cost effectiveness of automation of wastewater treatment. A list of 115 references is also presented. (HM)

  5. Automatic trajectory measurement of large numbers of crowded objects

    NASA Astrophysics Data System (ADS)

    Li, Hui; Liu, Ye; Chen, Yan Qiu

    2013-06-01

    Complex motion patterns of natural systems, such as fish schools, bird flocks, and cell groups, have attracted great attention from scientists for years. Trajectory measurement of individuals is vital for quantitative, high-throughput study of their collective behaviors. However, such data are rare, mainly due to the challenges of detecting and tracking large numbers of objects with similar visual features and frequent occlusions. We present an automatic and effective framework to measure trajectories of large numbers of crowded oval-shaped objects, such as fish and cells. We first use a novel dual ellipse locator to detect the coarse position of each individual and then propose a variance-minimization active contour method to obtain optimal segmentation results. For tracking, the cost matrix for assignment between consecutive frames is learned by a random forest classifier from many spatial, texture, and shape features. The optimal trajectories are found for the whole image sequence by solving two linear assignment problems. We evaluate the proposed method on many challenging data sets.
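
    The frame-to-frame linking step reduces to a linear assignment problem over a cost matrix; the sketch below uses plain Euclidean distance as the cost, whereas the paper learns a richer cost with a random forest classifier.

    ```python
    # Sketch of frame-to-frame linking as a linear assignment problem.
    # Euclidean distance stands in for the paper's learned cost.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    prev = np.array([[10.0, 12.0], [40.0, 42.0], [70.0, 75.0]])  # detections at t
    curr = np.array([[41.0, 41.0], [11.0, 13.0], [69.0, 77.0]])  # detections at t+1

    cost = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    for r, c in zip(rows, cols):
        print(f"object {r} at t -> detection {c} at t+1 (cost {cost[r, c]:.2f})")
    ```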

  6. Automatic optical detection and classification of marine animals around MHK converters using machine vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunton, Steven

    Optical systems provide valuable information for evaluating interactions and associations between organisms and MHK energy converters and for capturing potentially rare encounters between marine organisms and MHK devices. The deluge of optical data from cabled monitoring packages makes expert review time-consuming and expensive. We propose algorithms and a processing framework to automatically extract events of interest from underwater video. The open-source software framework consists of background subtraction, filtering, feature extraction, and hierarchical classification algorithms. This classification pipeline was validated on real-world data collected with an experimental underwater monitoring package. An event detection rate of 100% was achieved using robust principal components analysis (RPCA), Fourier feature extraction, and a support vector machine (SVM) binary classifier. The detected events were then further classified into more complex classes: algae | invertebrate | vertebrate, one species | multiple species of fish, and interest rank. Greater than 80% accuracy was achieved using a combination of machine learning techniques.
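
    The detection front end can be sketched with off-the-shelf background subtraction; OpenCV's MOG2 model below stands in for the robust PCA step reported in the paper, and the file name and foreground threshold are assumptions.

    ```python
    # Sketch of event detection via background subtraction. MOG2 stands in
    # here for the paper's robust PCA step; threshold and path are assumed.
    import cv2

    cap = cv2.VideoCapture("underwater.mp4")   # path is a placeholder
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

    events = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Flag the frame as an event if enough foreground pixels survive.
        if cv2.countNonZero(mask) > 0.01 * mask.size:
            events.append(frame_idx)
        frame_idx += 1
    cap.release()
    print(f"{len(events)} candidate event frames")
    ```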

  7. Automatic identification of physical activity types and sedentary behaviors from triaxial accelerometer: laboratory-based calibrations are not enough.

    PubMed

    Bastian, Thomas; Maire, Aurélia; Dugas, Julien; Ataya, Abbas; Villars, Clément; Gris, Florence; Perrin, Emilie; Caritu, Yanis; Doron, Maeva; Blanc, Stéphane; Jallon, Pierre; Simon, Chantal

    2015-03-15

    "Objective" methods to monitor physical activity and sedentary patterns in free-living conditions are necessary to further our understanding of their impacts on health. In recent years, many software solutions capable of automatically identifying activity types from portable accelerometry data have been developed, with promising results in controlled conditions, but virtually no reports on field tests. An automatic classification algorithm initially developed using laboratory-acquired data (59 subjects engaging in a set of 24 standardized activities) to discriminate between 8 activity classes (lying, slouching, sitting, standing, walking, running, and cycling) was applied to data collected in the field. Twenty volunteers equipped with a hip-worn triaxial accelerometer performed at their own pace an activity set that included, among others, activities such as walking the streets, running, cycling, and taking the bus. Performances of the laboratory-calibrated classification algorithm were compared with those of an alternative version of the same model including field-collected data in the learning set. Despite good results in laboratory conditions, the performances of the laboratory-calibrated algorithm (assessed by confusion matrices) decreased for several activities when applied to free-living data. Recalibrating the algorithm with data closer to real-life conditions and from an independent group of subjects proved useful, especially for the detection of sedentary behaviors while in transports, thereby improving the detection of overall sitting (sensitivity: laboratory model = 24.9%; recalibrated model = 95.7%). Automatic identification methods should be developed using data acquired in free-living conditions rather than data from standardized laboratory activity sets only, and their limits carefully tested before they are used in field studies. Copyright © 2015 the American Physiological Society.

  8. Introduction To ITS/CVO Participant Manual, Course 1

    DOT National Transportation Integrated Search

    1999-08-01

    WEIGH-IN-MOTION OR WIM, COMMERCIAL VEHICLE INFORMATION SYSTEMS AND NETWORKS OR CVISN, AUTOMATIC VEHICLE IDENTIFICATION OR AVI, AUTOMATIC VEHICLE LOCATION OR AVL, ELECTRONIC DATA INTERCHANGE OR EDI, GLOBAL POSITIONING SYSTEM OR GPS, INTERNET OR WORD W...

  9. The Wettzell System Monitoring Concept and First Realizations

    NASA Technical Reports Server (NTRS)

    Ettl, Martin; Neidhardt, Alexander; Muehlbauer, Matthias; Ploetz, Christian; Beaudoin, Christopher

    2010-01-01

    Automated monitoring of operational system parameters for the geodetic space techniques is becoming more important in order to improve the geodetic data and to ensure the safety and stability of automatic and remote-controlled observations. Therefore, the Wettzell group has developed the system monitoring software SysMon, which is based on a reliable, remotely controllable hardware/software realization. A multi-layered data logging system based on a fanless, robust industrial PC with an internal database system is used to collect data from several external serial, bus, or PCI-based sensors. The internal communication is realized with Remote Procedure Calls (RPC) and uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. Each data monitoring stream can be configured individually via configuration files to define the logging rates or analog-digital conversion parameters. First realizations are currently installed at the new laser ranging system at Wettzell, to address safety issues, and at the VLBI station O'Higgins, as a meteorological data logger. The system monitoring concept should be realized for the Wettzell radio telescope in the near future.

  10. Implementing Single Source: The STARBRITE Proof-of-Concept Study

    PubMed Central

    Kush, Rebecca; Alschuler, Liora; Ruggeri, Roberto; Cassells, Sally; Gupta, Nitin; Bain, Landen; Claise, Karen; Shah, Monica; Nahm, Meredith

    2007-01-01

    Objective Inefficiencies in clinical trial data collection cause delays, increase costs, and may reduce clinician participation in medical research. In this proof-of-concept study, we examine the feasibility of using point-of-care data capture for both the medical record and clinical research in the setting of a working clinical trial. We hypothesized that by doing so, we could increase reuse of patient data, eliminate redundant data entry, and minimize disruption to clinic workflow. Design We developed and used a point-of-care electronic data capture system to record data during patient visits. The standards-based system was used for clinical research and to generate the clinic note for the medical record. The system worked in parallel with data collection procedures already in place for an ongoing multicenter clinical trial. Our system was iteratively designed after analyzing case report forms and clinic notes, and observing clinic workflow patterns and business procedures. Existing data standards from CDISC and HL7 were used for database insertion and clinical document exchange. Results Our system was successfully integrated into the clinic environment and used in two live test cases without disrupting existing workflow. Analyses performed during system design yielded detailed information on practical issues affecting implementation of systems that automatically extract, store, and reuse healthcare data. Conclusion Although subject to the limitations of a small feasibility study, our study demonstrates that electronic patient data can be reused for prospective multicenter clinical research and patient care, and demonstrates a need for further development of therapeutic area standards that can facilitate researcher use of healthcare data. PMID:17600107

  11. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, X; Li, S; Zheng, D

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, and streamlining it is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as 'one-click' using the data logging of an electrometer. Methods: The beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data and to monitor and check data quality. Results: For each photon energy, output factors were measured for five configurations: an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes using the manual approach. The automation avoided the redundant Linac status checks between fields required in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data log, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60-degree wedge. Conclusion: Automated output factor measurements can save time by 40% compared with the conventional manual approach. This work laid the groundwork for further automation of Linac commissioning.
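
    The log-analysis step (performed in the study by a Matlab program) can be sketched as segmenting the 0.5-second charge log into beam-on plateaus and taking one stable reading per field; the file layout and thresholds below are assumptions.

    ```python
    # Sketch of output-factor extraction from a continuous electrometer log.
    # File layout, beam-on threshold, and segment-length guard are assumed.
    import numpy as np

    log = np.loadtxt("electrometer_log.csv", delimiter=",")  # one reading per row
    beam_on = log > 0.05 * log.max()          # crude beam-on detector

    # Find contiguous beam-on segments, then average the stable middle samples.
    on_idx = np.flatnonzero(beam_on)
    segments = np.split(on_idx, np.flatnonzero(np.diff(on_idx) > 1) + 1)
    readings = [log[s[len(s) // 4 : -(len(s) // 4)]].mean()
                for s in segments if len(s) > 8]

    ref = readings[0]                          # e.g., the reference field
    print([round(r / ref, 4) for r in readings])  # output factors vs reference
    ```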

  12. The National Shipbuilding Research Program. Photogrammetric Dimensioning of Distributive Systems Models. Phase 1

    DTIC Science & Technology

    1978-08-01

    ... accepts piping geometry as one of its basic inputs; whether this geometry comes from arrangement drawings or models is of no real consequence. ... Geometric data is taken from the catalogue and automatically merged with the piping geometry data. Also, fitting orientation is automatically ... systems require a number of data manipulation routines to convert raw digitized data into logical pipe geometry acceptable to a computer-aided piping design ...

  13. Operational testing of system for automatic sleep analysis

    NASA Technical Reports Server (NTRS)

    Kellaway, P.

    1972-01-01

    Tables on the performance, under operational conditions, of an automatic sleep monitoring system are presented. Data were recorded from patients who were undergoing heart and great-vessel surgery. This study resulted in cap, electrode, and preamplifier improvements. Children were used to test the sleep analyzer and the medical console write-out units. From these data, an automatic voltage control circuit for the analyzer was developed. Special circuitry for obviating the possibility of incorrect sleep staging due to the presence of movement artifacts was also developed as a result of the study.

  14. 48 CFR 245.608-72 - Screening excess automatic data processing equipment (ADPE).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... data processing equipment (ADPE). 245.608-72 Section 245.608-72 Federal Acquisition Regulations System... Reporting, Redistribution, and Disposal of Contractor Inventory 245.608-72 Screening excess automatic data... Agency, Defense Automation Resources Management Program Division (DARMP). DARMP does all required...

  15. Grading Multiple Choice Exams with Low-Cost and Portable Computer-Vision Techniques

    NASA Astrophysics Data System (ADS)

    Fisteus, Jesus Arias; Pardo, Abelardo; García, Norberto Fernández

    2013-08-01

    Although technology for automatic grading of multiple choice exams has existed for several decades, it is not yet as widely available or affordable as it should be. The main reasons preventing this adoption are the cost and the complexity of the setup procedures. In this paper, Eyegrade, a system for automatic grading of multiple choice exams is presented. While most current solutions are based on expensive scanners, Eyegrade offers a truly low-cost solution requiring only a regular off-the-shelf webcam. Additionally, Eyegrade performs both mark recognition as well as optical character recognition of handwritten student identification numbers, which avoids the use of bubbles in the answer sheet. When compared with similar webcam-based systems, the user interface in Eyegrade has been designed to provide a more efficient and error-free data collection procedure. The tool has been validated with a set of experiments that show the ease of use (both setup and operation), the reduction in grading time, and an increase in the reliability of the results when compared with conventional, more expensive systems.

  16. [Micron]ADS-B Detect and Avoid Flight Tests on Phantom 4 Unmanned Aircraft System

    NASA Technical Reports Server (NTRS)

    Arteaga, Ricardo; Dandachy, Mike; Truong, Hong; Aruljothi, Arun; Vedantam, Mihir; Epperson, Kraettli; McCartney, Reed

    2018-01-01

    Researchers at the National Aeronautics and Space Administration Armstrong Flight Research Center in Edwards, California and Vigilant Aerospace Systems collaborated for the flight-test demonstration of an Automatic Dependent Surveillance-Broadcast based collision avoidance technology on a small unmanned aircraft system equipped with the uAvionix Automatic Dependent Surveillance-Broadcast transponder. The purpose of the testing was to demonstrate that National Aeronautics and Space Administration / Vigilant software and algorithms, commercialized as FlightHorizon UAS™, are compatible with uAvionix hardware systems and the DJI Phantom 4 small unmanned aircraft system. The testing and demonstrations were necessary for both parties to further develop and certify the technology in three key areas: flights beyond visual line of sight, collision avoidance, and autonomous operations. The National Aeronautics and Space Administration and Vigilant Aerospace Systems have developed and successfully flight-tested an Automatic Dependent Surveillance-Broadcast Detect and Avoid system on the Phantom 4 small unmanned aircraft system. The Automatic Dependent Surveillance-Broadcast Detect and Avoid system architecture is especially suited for small unmanned aircraft systems because it integrates: 1) miniaturized Automatic Dependent Surveillance-Broadcast hardware; 2) radio data-link communications; 3) software algorithms for real-time Automatic Dependent Surveillance-Broadcast data integration, conflict detection, and alerting; and 4) a synthetic vision display using a fully-integrated National Aeronautics and Space Administration geobrowser for three-dimensional graphical representations of ownship and air traffic situational awareness. The flight-test objectives were to evaluate the performance of Automatic Dependent Surveillance-Broadcast Detect and Avoid collision avoidance technology as installed on two small unmanned aircraft systems. In December 2016, four flight tests were conducted at Edwards Air Force Base. Researchers in the ground control station were able to verify the Automatic Dependent Surveillance-Broadcast target detection and collision avoidance resolutions on their displays.

  17. Pedestrian mobile mapping system for indoor environments based on MEMS IMU and range camera

    NASA Astrophysics Data System (ADS)

    Haala, N.; Fritsch, D.; Peter, M.; Khosravani, A. M.

    2011-12-01

    This paper describes an approach for modeling building interiors based on a mobile device that integrates modules for pedestrian navigation and low-cost 3D data collection. Personal navigation is realized by a foot-mounted, low-cost MEMS IMU, while 3D data capture for subsequent indoor modeling uses a low-cost range camera originally developed for gaming applications. Both steps, navigation and modeling, are supported by additional information provided by the automatic interpretation of evacuation plans. Such emergency plans are compulsory for public buildings in a number of countries. They consist of an approximate floor plan, the current position, and escape routes. Additionally, semantic information like stairs, elevators, or the floor number is available. After the user has captured an image of such a floor plan, this information is made explicit again by an automatic raster-to-vector conversion. The resulting coarse indoor model then provides constraints at stairs and building walls, which restrict the potential movement of the user. This information is used to support pedestrian navigation by eliminating the drift effects of the low-cost sensor system. The approximate indoor building model additionally provides a priori information during subsequent indoor modeling. Within this process, the low-cost range camera Kinect is used to collect multiple 3D point clouds, which are aligned by a suitable matching step and then further analyzed to refine the coarse building model.
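
    The point-cloud alignment ("suitable matching") step could be realized with ICP; the sketch below uses Open3D as an assumed library choice, with placeholder file names and an assumed 5 cm correspondence gate.

    ```python
    # Sketch of point-cloud alignment with ICP. Open3D is an assumed library
    # choice (the paper does not name one); file names are placeholders.
    import numpy as np
    import open3d as o3d

    source = o3d.io.read_point_cloud("scan_000.pcd")
    target = o3d.io.read_point_cloud("scan_001.pcd")

    result = o3d.pipelines.registration.registration_icp(
        source, target,
        max_correspondence_distance=0.05,   # 5 cm gating, an assumption
        init=np.eye(4),
        estimation_method=o3d.pipelines.registration
                             .TransformationEstimationPointToPoint())

    print("fitness:", result.fitness)
    source.transform(result.transformation)  # merge into the model frame
    ```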

  18. Autonomous collection of dynamically-cued multi-sensor imagery

    NASA Astrophysics Data System (ADS)

    Daniel, Brian; Wilson, Michael L.; Edelberg, Jason; Jensen, Mark; Johnson, Troy; Anderson, Scott

    2011-05-01

    The availability of imagery simultaneously collected from sensors of disparate modalities enhances an image analyst's situational awareness and expands the overall detection capability to a larger array of target classes. Dynamic cooperation between sensors is increasingly important for the collection of coincident data from multiple sensors either on the same or on different platforms suitable for UAV deployment. Of particular interest is autonomous collaboration between wide area survey detection, high-resolution inspection, and RF sensors that span large segments of the electromagnetic spectrum. The Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL) is building sensors with such networked communications capability and is conducting field tests to demonstrate the feasibility of collaborative sensor data collection and exploitation. Example survey / detection sensors include: NuSAR (NRL Unmanned SAR), a UAV compatible synthetic aperture radar system; microHSI, an NRL developed lightweight hyper-spectral imager; RASAR (Real-time Autonomous SAR), a lightweight podded synthetic aperture radar; and N-WAPSS-16 (Nighttime Wide-Area Persistent Surveillance Sensor-16Mpix), a MWIR large array gimbaled system. From these sensors, detected target cues are automatically sent to the NRL/SDL developed EyePod, a high-resolution, narrow FOV EO/IR sensor, for target inspection. In addition to this cooperative data collection, EyePod's real-time, autonomous target tracking capabilities will be demonstrated. Preliminary results and target analysis will be presented.

  19. Method and apparatus for reading meters from a video image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, T.J.; Ferguson, J.J.

    1995-12-31

    A method and system enable acquisition of data about an environment from one or more meters using video images. One or more meters are imaged by a video camera and the video signal is digitized. Then, each region of the digital image which corresponds to the indicator of a meter is calibrated and the video signal is analyzed to determine the value indicated by each meter indicator. Finally, from the value indicated by each meter indicator in the calibrated region, a meter reading is generated. The method and system offer the advantages of automatic data collection in a relatively non-intrusive manner, without making any complicated or expensive electronic connections and without requiring intensive manpower.

  20. Method and apparatus for reading meters from a video image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, T.J.; Ferguson, J.J.

    1997-09-30

    A method and system to enable acquisition of data about an environment from one or more meters using video images. One or more meters are imaged by a video camera and the video signal is digitized. Then, each region of the digital image which corresponds to the indicator of a meter is calibrated and the video signal is analyzed to determine the value indicated by each meter indicator. Finally, from the value indicated by each meter indicator in the calibrated region, a meter reading is generated. The method and system offer the advantages of automatic data collection in a relatively non-intrusive manner, without making any complicated or expensive electronic connections and without requiring intensive manpower. 1 fig.
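
    The patents' idea of calibrating a dial region and converting an indicator position to a reading can be sketched as follows; the needle-segmentation threshold and the angle-to-value calibration numbers are assumptions for illustration.

    ```python
    # Illustrative sketch of the patents' idea: calibrate a dial region,
    # estimate the needle angle, and map angle to a reading. The threshold
    # and angle-to-value calibration numbers are assumptions.
    import numpy as np
    import cv2

    def read_dial(gray: np.ndarray, center, zero_deg=225.0, span_deg=270.0,
                  full_scale=100.0) -> float:
        """Estimate a dial reading from a grayscale image of the meter face."""
        _, needle = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
        ys, xs = np.nonzero(needle)                 # dark (needle) pixels
        cx, cy = center
        angles = np.degrees(np.arctan2(cy - ys, xs - cx))  # image y grows downward
        theta = np.median(angles) % 360.0
        swept = (zero_deg - theta) % 360.0          # clockwise sweep from zero mark
        return full_scale * swept / span_deg

    frame = cv2.imread("meter.png", cv2.IMREAD_GRAYSCALE)  # placeholder image
    print(round(read_dial(frame, center=(120, 120)), 1))
    ```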

  1. V/STOLAND digital avionics system for XV-15 tilt rotor

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1980-01-01

    A digital flight control system for the tilt rotor research aircraft provides sophisticated navigation, guidance, control, display, and data acquisition capabilities for performing terminal-area navigation, guidance, and control research. All functions of the XV-15 V/STOLAND system were demonstrated on the NASA-ARC S-19 simulation facility under a comprehensive dynamic acceptance test. The most noteworthy accomplishments of the system are: (1) automatic configuration control of a tilt-rotor aircraft over the total operating range; (2) total hands-off landing to touchdown on various selectable straight-in glide slopes and on a flight path that includes a two-revolution helix; (3) automatic guidance along a programmed three-dimensional reference flight path; (4) navigation data for the automatic guidance computed on board, based on VOR/DME, TACAN, or MLS navaid data; and (5) integration of a large set of functions in a single computer, utilizing 16k words of storage for programs and data.

  2. Preliminary evaluation of a nest usage sensor to detect double nest occupations of laying hens.

    PubMed

    Zaninelli, Mauro; Costa, Annamaria; Tangorra, Francesco Maria; Rossi, Luciana; Agazzi, Alessandro; Savoini, Giovanni

    2015-01-26

    Conventional cage systems will be replaced by housing systems that allow hens to move freely. These systems may improve hens' welfare, but they lead to some disadvantages: disease, bone fractures, cannibalism, piling and lower egg production. New selection criteria for existing commercial strains should be identified considering individual data about laying performance and the behavior of hens. Many recording systems have been developed to collect these data. However, the management of double nest occupations remains critical for the correct egg-to-hen assignment. To limit such events, most systems adopt specific trap devices and additional mechanical components. Others, instead, only prevent these occurrences by narrowing the nest, without any detection and management. The aim of this study was to develop and test a nest usage "sensor", based on imaging analysis, that is able to automatically detect a double nest occupation. Results showed that the developed sensor correctly identified the double nest occupation occurrences. Therefore, the imaging analysis resulted in being a useful solution that could simplify the nest construction for this type of recording system, allowing the collection of more precise and accurate data, since double nest occupations would be managed and the normal laying behavior of hens would not be discouraged by the presence of the trap devices.

  3. Preliminary Evaluation of a Nest Usage Sensor to Detect Double Nest Occupations of Laying Hens

    PubMed Central

    Zaninelli, Mauro; Costa, Annamaria; Tangorra, Francesco Maria; Rossi, Luciana; Agazzi, Alessandro; Savoini, Giovanni

    2015-01-01

    Conventional cage systems will be replaced by housing systems that allow hens to move freely. These systems may improve hens' welfare, but they lead to some disadvantages: disease, bone fractures, cannibalism, piling and lower egg production. New selection criteria for existing commercial strains should be identified considering individual data about the laying performance and behavior of hens. Many recording systems have been developed to collect these data. However, the management of double nest occupations remains critical for correct egg-to-hen assignment. To limit such events, most systems adopt specific trap devices and additional mechanical components. Others, instead, only prevent these occurrences by narrowing the nest, without any detection and management. The aim of this study was to develop and test a nest usage “sensor”, based on imaging analysis, that is able to automatically detect a double nest occupation. Results showed that the developed sensor correctly identified the double nest occupation occurrences. Imaging analysis therefore proved to be a useful solution that could simplify the nest construction for this type of recording system, allowing the collection of more precise and accurate data, since double nest occupations would be managed and the normal laying behavior of hens would not be discouraged by the presence of trap devices. PMID:25629704
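
    As a rough illustration of how such an imaging-based nest sensor might work (this is not the authors' published method), a double occupation can be flagged by background subtraction over the nest camera view followed by a count of sufficiently large foreground blobs. The difference threshold and minimum hen area below are placeholder values.

    ```python
    # Illustrative sketch, not the authors' algorithm: background subtraction
    # over the nest camera view, then a count of large foreground blobs.
    import cv2

    def double_occupation(nest_gray, background_gray, min_hen_area=5000):
        diff = cv2.absdiff(nest_gray, background_gray)
        _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        hens = [c for c in contours if cv2.contourArea(c) >= min_hen_area]
        return len(hens) >= 2     # True -> double occupation detected
    ```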

  4. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among and within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor, which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library, which collects performance data; and a visualization tool-set, which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.

  5. Development of an automated analysis system for data from flow cytometric intracellular cytokine staining assays from clinical vaccine trials

    PubMed Central

    Shulman, Nick; Bellew, Matthew; Snelling, George; Carter, Donald; Huang, Yunda; Li, Hongli; Self, Steven G.; McElrath, M. Juliana; De Rosa, Stephen C.

    2008-01-01

    Background: Intracellular cytokine staining (ICS) by multiparameter flow cytometry is one of the primary methods for determining T cell immunogenicity in HIV-1 clinical vaccine trials. Data analysis requires considerable expertise and time. The amount of data is quickly increasing as more and larger trials are performed, and thus there is a critical need for high-throughput methods of data analysis. Methods: A web-based flow cytometric analysis system, LabKey Flow, was developed for analyses of data from standardized ICS assays. A gating template was created manually in commercially available flow cytometric analysis software. Using this template, the system automatically compensated and analyzed all data sets. Quality control queries were designed to identify potentially incorrect sample collections. Results: Comparison of the semi-automated analysis performed by LabKey Flow and the manual analysis performed using FlowJo software demonstrated excellent concordance (concordance correlation coefficient >0.990). Manual inspection of the analyses performed by LabKey Flow for 8-color ICS data files from several clinical vaccine trials indicates that template gates can appropriately be used for most data sets. Conclusions: The semi-automated LabKey Flow analysis system can accurately analyze large ICS data files. Routine use of the system does not require specialized expertise. This high-throughput analysis will provide great utility for rapid evaluation of complex multiparameter flow cytometric measurements collected from large clinical trials. PMID:18615598
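
    The agreement figure quoted above is Lin's concordance correlation coefficient, which penalizes both poor correlation and systematic offset between the manual and semi-automated measurements. A minimal implementation of that statistic:

    ```python
    import numpy as np

    def concordance_ccc(x, y):
        """Lin's concordance correlation coefficient for paired measurements."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()                 # population variances
        cov = ((x - mx) * (y - my)).mean()        # population covariance
        return 2.0 * cov / (vx + vy + (mx - my) ** 2)

    # Values near 1 (the paper reports > 0.990) indicate near-perfect agreement.
    ```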

  6. Evaluation of longitudinal tracking and data mining for an imaging informatics-based multiple sclerosis e-folder (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ma, Kevin C.; Forsyth, Sydney; Amezcua, Lilyana; Liu, Brent J.

    2017-03-01

    We have designed and developed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification, allowing patient tracking. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatment and data analysis. The system quantifies lesion volumes and identifies and registers lesion locations to track shifts in the volume and number of lesions in a longitudinal study. We aim to evaluate the two most important features of the system, data mining and longitudinal lesion tracking, to demonstrate the MS eFolder's capability to improve clinical workflow efficiency and outcome analysis for research. To evaluate the data mining capabilities, we collected radiological and neurological data from 72 patients, 36 Caucasian and 36 Hispanic, matched by gender, disease duration, and age. Data analysis on those patients based on ethnicity was performed, and the analysis results are displayed by the system's web-based user interface. The data mining module was able to successfully separate Hispanic and Caucasian patients and compare their disease profiles. For longitudinal lesion tracking, we collected 4 longitudinal cases and simulated different lesion growth patterns over the following year. The eFolder was able to detect changes in lesion volume and to identify the lesions with the greatest change. The data mining and lesion tracking evaluation results show the high potential of the eFolder's usefulness in patient care and informatics research for multiple sclerosis.

  7. Preparing Electronic Clinical Data for Quality Improvement and Comparative Effectiveness Research: The SCOAP CERTAIN Automation and Validation Project

    PubMed Central

    Devine, Emily Beth; Capurro, Daniel; van Eaton, Erik; Alfonso-Cristancho, Rafael; Devlin, Allison; Yanez, N. David; Yetisgen-Yildiz, Meliha; Flum, David R.; Tarczy-Hornoch, Peter

    2013-01-01

    Background: The field of clinical research informatics includes the creation of clinical data repositories (CDRs) used to conduct quality improvement (QI) activities and comparative effectiveness research (CER). Ideally, CDR data are accurately and directly abstracted from disparate electronic health records (EHRs), across diverse health systems. Objective: Investigators from Washington State's Surgical Care Outcomes and Assessment Program (SCOAP) Comparative Effectiveness Research Translation Network (CERTAIN) are creating such a CDR. This manuscript describes the automation and validation methods used to create this digital infrastructure. Methods: SCOAP is a QI benchmarking initiative. Data are manually abstracted from EHRs and entered into a data management system. CERTAIN investigators are now deploying Caradigm's Amalga™ tool to facilitate automated abstraction of data from multiple, disparate EHRs. Concordance is calculated to compare automatically abstracted data with manually abstracted data. Performance measures are calculated between Amalga and each parent EHR. Validation takes place in repeated loops, with improvements made over time. When automated abstraction reaches the current benchmark for abstraction accuracy (95%), it will 'go live' at each site. Progress to Date: A technical analysis was completed at 14 sites. Five sites are contributing; the remaining sites prioritized meeting Meaningful Use criteria. Participating sites are contributing 15-18 unique data feeds, totaling 13 surgical registry use cases. Common feeds are registration, laboratory, transcription/dictation, radiology, and medications. Approximately 50% of 1,320 designated data elements are being automatically abstracted: 25% from structured data and 25% from text mining. Conclusion: In semi-automating data abstraction and conducting a rigorous validation, CERTAIN investigators will semi-automate data collection to conduct QI and CER, while advancing the Learning Healthcare System. PMID:25848565

  8. Sample collection system for gel electrophoresis

    DOEpatents

    Olivares, Jose A.; Stark, Peter C.; Dunbar, John M.; Hill, Karen K.; Kuske, Cheryl R.; Roybal, Gustavo

    2004-09-21

    An automatic sample collection system for use with an electrophoretic slab gel system is presented. The collection system can be used with a slab gel having one or more lanes. A detector is used to detect particle bands on the slab gel within a detection zone. Such detectors may use a laser to excite fluorescently labeled particles. The fluorescent light emitted from the excited particles is transmitted to low-level light detection electronics. Upon the detection of a particle of interest within the detection zone, a syringe pump is activated, sending a stream of buffer solution across the lane of the slab gel. The buffer solution collects the sample of interest and carries it through a collection port into a sample collection vial.

  9. Multistation alarm system for eruptive activity based on the automatic classification of volcanic tremor: specifications and performance

    NASA Astrophysics Data System (ADS)

    Langer, Horst; Falsaperla, Susanna; Messina, Alfio; Spampinato, Salvatore

    2015-04-01

    With over fifty eruptive episodes (Strombolian activity, lava fountains, and lava flows) between 2006 and 2013, Mt Etna, Italy, underscored its role as the most active volcano in Europe. Seven paroxysmal lava fountains at the South East Crater occurred in 2007-2008 and 46 at the New South East Crater between 2011 and 2013. Month-long lava emissions affected the upper eastern flank of the volcano in 2006 and 2008-2009. On this background, effective monitoring and forecasting of volcanic phenomena are a first-order issue for their potential socio-economic impact in a densely populated region like the town of Catania and its surroundings. For example, explosive activity has often formed thick ash clouds with widespread tephra fall able to disrupt the air traffic, as well as to cause severe problems at infrastructures, such as highways and roads. For timely information on changes in the state of the volcano and the possible onset of dangerous eruptive phenomena, the analysis of the continuous background seismic signal, the so-called volcanic tremor, turned out to be of paramount importance. Changes in the state of the volcano as well as in its eruptive style are usually concurrent with variations of the spectral characteristics (amplitude and frequency content) of tremor. The huge amount of digital data continuously acquired by INGV's broadband seismic stations every day makes a manual analysis difficult, and techniques of automatic classification of the tremor signal are therefore applied. The application of unsupervised classification techniques to the tremor data revealed significant changes well before the onset of the eruptive episodes. This evidence led to the development of specific software packages for real-time processing of the tremor data. The operational characteristics of these tools - fail-safe design, robustness with respect to noise and data outages, as well as computational efficiency - allowed the identification of criteria for automatic alarm flagging. The system is hitherto one of the main automatic alerting tools to identify impending eruptive events at Etna. The currently operating software, named KKAnalysis, is applied to the data stream continuously recorded at two seismic stations. The data are merged with reference datasets of past eruptive episodes. In doing so, the results of pattern classification can be immediately compared to previous eruptive scenarios. Given the rich material collected in recent years, here we propose the application of the alert system to a wider range of stations (up to a total of eleven) at different elevations (1200-3050 m) and distances (1-8 km) from the summit craters. Critical alert parameters were empirically defined to obtain an optimal tuning of the alert system for each station. To verify the robustness of this new, multistation alert system, a dataset encompassing about eight years of continuous seismic records (since 2006) was processed offline using KKAnalysis and collateral software. Then, we analyzed the performance of the classifier in terms of timing and spatial distribution of the stations.
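
    The abstract does not spell out how the per-station classifications are combined into one alarm; a plausible minimal scheme, offered purely as a sketch, is concurrent voting: each station's classifier labels its latest tremor window, and an alarm is flagged only when enough stations agree, which tolerates noise and data outages at individual stations. The label names and station quorum below are assumptions.

    ```python
    # Hypothetical alarm-combination rule, a sketch only.
    from collections import Counter

    def flag_alarm(station_labels, min_stations=3):
        """station_labels: dict of station id -> latest window classification,
        e.g. {'ST01': 'eruptive', 'ST02': 'background', ...}; stations with
        data outages are simply absent and therefore cannot vote."""
        counts = Counter(station_labels.values())
        return counts.get('eruptive', 0) >= min_stations
    ```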

  10. Automatic vehicle identification technology applications to toll collection services

    DOT National Transportation Integrated Search

    1997-01-01

    Intelligent transportation systems technologies are being developed and applied throughout transportation systems in the United States. An example of this type of innovation can be seen on toll roads where a driver is required to deposit a toll in order...

  11. Automatic lesion tracking for a PET/CT based computer aided cancer therapy monitoring system

    NASA Astrophysics Data System (ADS)

    Opfer, Roland; Brenner, Winfried; Carlsen, Ingwer; Renisch, Steffen; Sabczynski, Jörg; Wiemker, Rafael

    2008-03-01

    Response assessment of cancer therapy is a crucial component of a more effective and patient-individualized cancer therapy. Integrated PET/CT systems provide the opportunity to combine morphologic with functional information. However, dealing simultaneously with several PET/CT scans poses a serious workflow problem. It can be a difficult and tedious task to extract response criteria based upon an integrated analysis of PET and CT images and to track these criteria over time. In order to improve the workflow for serial analysis of PET/CT scans, we introduce in this paper a fast lesion tracking algorithm. We combine a global multi-resolution rigid registration algorithm with a local block matching and a local region growing algorithm. Whenever the user clicks on a lesion in the baseline PET scan, the course of standardized uptake values (SUV) is automatically identified and shown to the user as a graph plot. We have validated our method on data collected from 7 patients. Each patient underwent two or three PET/CT scans during the course of a cancer therapy. An experienced nuclear medicine physician manually measured the courses of the maximum SUVs for altogether 18 lesions. The automatic detection of the corresponding lesions yielded SUV measurements nearly identical to the manually measured SUVs: across the 38 measured maximum SUVs derived from manually and automatically detected lesions, we observed a correlation of 0.9994 and an average error of 0.4 SUV units.
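
    Once the serial PET volumes are registered and the clicked lesion is segmented, the plotted SUV course is simply the per-scan maximum over the lesion mask. A minimal sketch, assuming registration and segmentation have already produced one matching mask per volume:

    ```python
    import numpy as np

    def suvmax_course(pet_volumes, lesion_masks):
        """Per-scan maximum SUV over the tracked lesion, i.e. the values
        plotted in the graph shown to the user."""
        return [float(vol[mask].max())
                for vol, mask in zip(pet_volumes, lesion_masks)]
    ```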

  12. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Mary

    2014-09-19

    Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.

  13. Understanding ITS/CVO Technology Applications, Student Manual, Course 3

    DOT National Transportation Integrated Search

    1999-01-01

    WEIGH-IN-MOTION OR WIM, COMMERCIAL VEHICLE INFORMATION SYSTEMS AND NETWORKS OR CVISN, AUTOMATIC VEHICLE IDENTIFICATION OR AVI, AUTOMATIC VEHICLE LOCATION OR AVL, ELECTRONIC DATA INTERCHANGE OR EDI, GLOBAL POSITIONING SYSTEM OR GPS, INTERNET OR WORLD WIDE WEB...

  14. A comparison of oxygen saturation data in inpatients with low oxygen saturation using automated continuous monitoring and intermittent manual data charting.

    PubMed

    Taenzer, Andreas H; Pyke, Joshua; Herrick, Michael D; Dodds, Thomas M; McGrath, Susan P

    2014-02-01

    The manual collection and charting of traditional vital signs data in inpatient populations have been shown to be inaccurate when compared with true physiologic values. This issue has not been examined with respect to oxygen saturation data despite the increased use of this measurement in systems designed to assess the risk of patient deterioration. Of particular note is the lack of available data examining the accuracy of oxygen saturation charting in a particularly vulnerable group of patients who have prolonged oxygen desaturations (mean SpO2 <90% over at least 15 minutes). In addition, no data are currently available that investigate the often suspected "wake up" effect, resulting from a nurse entering a patient's room to obtain vital signs. In this study, we compared oxygen saturation data recorded manually with data collected by an automated continuous monitoring system in 16 inpatients considered to be at high risk for deterioration (average SpO2 values <90% collected by the automated system in a 15-minute interval before a manual charting event). Data were sampled from the automatic collection system from 2 periods: over a 15-minute period that ended 5 minutes before the time of the manual data collection and charting, and over a 5-minute range before and after the time of the manual data collection and charting. Average saturations from prolonged baseline desaturations (15-minute period) were compared with both the manual and automated data sampled at the time of the nurse's visit to analyze for systematic change and to investigate the presence of an arousal effect. The manually charted data were higher than those recorded by the automated system. Manually recorded data were on average 6.5% (confidence interval, 4.0%-9.0%) higher in oxygen saturation. No significant arousal effect resulting from the nurse's visit to the patient's room was detected. In a cohort of patients with prolonged desaturations, manual recordings of SpO2 did not reflect physiologic patient state when compared with continuous automated sampling. Currently, early warning scores depend on manual vital sign recordings in many settings; the study data suggest that SpO2 ought to be added to the list of vital sign values that have been shown to be recorded inaccurately.

  15. Method and system for spatial data input, manipulation and distribution via an adaptive wireless transceiver

    NASA Technical Reports Server (NTRS)

    Wang, Ray (Inventor)

    2009-01-01

    A method and system for spatial data input, manipulation, and distribution via an adaptive wireless transceiver. The method and system include a wireless transceiver for automatically and adaptively controlling wireless transmissions using a Waveform-DNA method. The wireless transceiver can operate simultaneously over both short and long distances. The wireless transceiver is automatically adaptive, and wireless devices can send and receive wireless digital and analog data from various sources rapidly, in real time, via available networks and network services.

  16. [Use of the Elektronika-T3-16M special-purpose computer for the automatic processing of cytophotometric and cytofluorimetric data].

    PubMed

    Loktionov, A S; Prianishnikov, V A

    1981-05-01

    A system has been proposed to provide the automatic analysis of data on: a) point cytophotometry, b) two-wave cytophotometry, c) cytofluorimetry. The system provides the input of the data from a photomultiplier to a specialized computer "Electronica-T3-16M" in addition to the simultaneous statistical analysis of these data. Information on the programs used is presented. The advantages of the system, compared with some commercially available cytophotometers, are indicated.

  17. Computer automation of ultrasonic testing. [inspection of ultrasonic welding

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Kerlin, E. E.; Gardner, A. H.; Dunmyer, D.; Wells, T. G.; Robinson, A. R.; Kunselman, J. S.; Walker, T. C.

    1974-01-01

    Report describes a prototype computer-automated ultrasonic system developed for the inspection of weldments. This system can be operated in three modes: manual, automatic, and computer-controlled. In the computer-controlled mode, the system will automatically acquire, process, analyze, store, and display ultrasonic inspection data in real-time. Flaw size (in cross-section), location (depth), and type (porosity-like or crack-like) can be automatically discerned and displayed. The results and pertinent parameters are recorded.

  18. Using Machine Learning to Increase Research Efficiency: A New Approach in Environmental Sciences

    USDA-ARS?s Scientific Manuscript database

    Data collection has evolved from tedious in-person fieldwork to automatic data gathering from multiple sensors remotely. Scientists in the environmental sciences have not fully exploited this data deluge, including legacy and new data, because the traditional scientific method is focused on small, high qu...

  19. 40 CFR 92.131 - Smoke, data analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Smoke, data analysis. 92.131 Section... analysis. The following procedure shall be used to analyze the smoke test data: (a) Locate each throttle... performed by direct analysis of the recorder traces, or by computer analysis of data collected by automatic...

  20. Detection of clinical mastitis with sensor data from automatic milking systems is improved by using decision-tree induction.

    PubMed

    Kamphuis, C; Mollenhorst, H; Heesterbeek, J A P; Hogeveen, H

    2010-08-01

    The objective was to develop and validate a clinical mastitis (CM) detection model by means of decision-tree induction. For farmers milking with an automatic milking system (AMS), it is desirable that the detection model has a high level of sensitivity (Se), especially for more severe cases of CM, at a very high specificity (Sp). In addition, an alert for CM should be generated preferably at the quarter milking (QM) at which the CM infection is visible for the first time. Data were collected from 9 Dutch dairy herds milking automatically during a 2.5-yr period. Data included sensor data (electrical conductivity, color, and yield) at the QM level and visual observations of quarters with CM recorded by the farmers. Visual observations of quarters with CM were combined with sensor data of the most recent automatic milking recorded for that same quarter, within a 24-h time window before the visual assessment time. Sensor data of 3.5 million QM were collected, of which 348 QM were combined with a CM observation. Data were divided into a training set, including two-thirds of all data, and a test set. Cows in the training set were not included in the test set and vice versa. A decision-tree model was trained using only clear examples of healthy (n=24,717) or diseased (n=243) QM. The model was tested on 105 QM with CM and a random sample of 50,000 QM without CM. While keeping the Se at a level comparable to that of models currently used by AMS, the decision-tree model was able to decrease the number of false-positive alerts by more than 50%. At an Sp of 99%, 40% of the CM cases were detected. Sixty-four percent of the severe CM cases were detected, but only 12.5% of the CM cases that were scored as watery milk. The Se increased considerably from 40% to 66.7% when the time window increased from less than 24h before the CM observation to a window from 24h before to 24h after the CM observation. Even at very wide time windows, however, it was impossible to reach an Se of 100%. This indicates the inability to detect all CM cases based on sensor data alone. Sensitivity levels varied largely when the decision tree was validated per herd. This trend was confirmed when decision trees were trained using data from 8 herds and tested on data from the ninth herd. This indicates that when using the decision tree as a generic CM detection model in practice, some herds will continue having difficulties in detecting CM using mastitis alert lists, whereas others will perform well. Copyright (c) 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
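
    A hedged sketch of the study's general approach (decision-tree induction on quarter-milking sensor features) using scikit-learn; only the healthy/diseased class counts (24,717 vs. 243) mirror the training set described above, while the feature distributions are fabricated placeholders, not the Dutch herd data:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import confusion_matrix
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for [electrical conductivity, colour, yield] per QM.
    healthy = rng.normal([5.5, 0.2, 2.0], 0.3, size=(24717, 3))
    diseased = rng.normal([7.0, 0.5, 1.4], 0.5, size=(243, 3))
    X = np.vstack([healthy, diseased])
    y = np.array([0] * len(healthy) + [1] * len(diseased))
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    tree = DecisionTreeClassifier(class_weight="balanced", min_samples_leaf=25)
    tree.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
    print(f"Se = {tp / (tp + fn):.2f}, Sp = {tn / (tn + fp):.2f}")
    ```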

  1. New taxonomy and old collections: integrating DNA barcoding into the collection curation process.

    PubMed

    Puillandre, N; Bouchet, P; Boisselier-Dubayle, M-C; Brisset, J; Buge, B; Castelin, M; Chagnoux, S; Christophe, T; Corbari, L; Lambourdière, J; Lozouet, P; Marani, G; Rivasseau, A; Silva, N; Terryn, Y; Tillier, S; Utge, J; Samadi, S

    2012-05-01

    Because they house large biodiversity collections and are also research centres with sequencing facilities, natural history museums are well placed to develop DNA barcoding best practices. The main difficulty is generally the vouchering system: it must ensure that all data produced remain attached to the corresponding specimen, from the field to publication in articles and online databases. The Museum National d'Histoire Naturelle in Paris is one of the leading laboratories in the Marine Barcode of Life (MarBOL) project, which was used as a pilot programme to include barcode collections for marine molluscs and crustaceans. The system is based on two relational databases. The first one classically records the data (locality and identification) attached to the specimens. In the second one, tissue-clippings, DNA extractions (both preserved in 2D barcode tubes) and PCR data (including primers) are linked to the corresponding specimen. All the steps of the process [sampling event, specimen identification, molecular processing, data submission to Barcode Of Life Database (BOLD) and GenBank] are thus linked together. Furthermore, we have developed several web-based tools to automatically upload data into the system, control the quality of the sequences produced and facilitate the submission to online databases. This work is the result of a joint effort from several teams in the Museum National d'Histoire Naturelle (MNHN), but also from a collaborative network of taxonomists and molecular systematists outside the museum, resulting in the vouchering so far of ∼41,000 sequences and the production of ∼11,000 COI sequences. © 2012 Blackwell Publishing Ltd.

  2. Rainfall, Streamflow, and Water-Quality Data During Stormwater Monitoring, Halawa Stream Drainage Basin, Oahu, Hawaii, July 1, 2005 to June 30, 2006

    USGS Publications Warehouse

    Presley, Todd K.; Jamison, Marcael T.J.; Young-Smith, Stacie T. M.

    2006-01-01

    Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous discharge data at one station, continuous streamflow data at two stations, and water-quality data at five stations, which include the continuous discharge and streamflow stations. This report summarizes rainfall, discharge, streamflow, and water-quality data collected between July 1, 2005 and June 30, 2006. A total of 23 samples was collected over five storms during July 1, 2005 to June 30, 2006. The goal was to collect grab samples nearly simultaneously at all five stations, and flow-weighted time-composite samples at the three stations equipped with automatic samplers; however, all five storms were partially sampled owing to lack of flow at the time of sampling at some sites, or because some samples collected by the automatic sampler did not represent water from the storm. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.

  3. View_SPECPR: Software for Plotting Spectra (Installation Manual and User's Guide, Version 1.2)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2008-01-01

    This document describes procedures for installing and using the 'View_SPECPR' software system to plot spectra stored in SPECPR (SPECtrum Processing Routines) files. The View_SPECPR software is comprised of programs written in IDL (Interactive Data Language) that run within the ENVI (ENvironment for Visualizing Images) image processing system. SPECPR files are used by earth-remote-sensing scientists and planetary scientists for storing spectra collected by laboratory, field, and remote sensing instruments. A widely distributed SPECPR file is the U.S. Geological Survey (USGS) spectral library that contains thousands of spectra of minerals, vegetation, and man-made materials (Clark and others, 2007). SPECPR files contain reflectance data and associated wavelength and spectral resolution data, as well as meta-data on the time and date of collection and spectrometer settings. Furthermore, the SPECPR file automatically tracks changes to data records through its 'history' fields. For more details on the format and content of SPECPR files, see Clark (1993). For more details on ENVI, see ITT (2008). This program has been updated using an ENVI 4.5/IDL7.0 full license operating on a Windows XP operating system and requires the installation of the iTools components of IDL7.0; however, this program should work with full licenses on UNIX/LINUX systems. This software has not been tested with ENVI licenses on Windows Vista or Apple Operating Systems.

  4. Clustering-Based Ensemble Learning for Activity Recognition in Smart Homes

    PubMed Central

    Jurek, Anna; Nugent, Chris; Bi, Yaxin; Wu, Shengli

    2014-01-01

    Application of sensor-based technology within activity monitoring systems is becoming a popular technique within the smart environment paradigm. Nevertheless, the use of such an approach generates complex constructs of data, which subsequently requires the use of intricate activity recognition techniques to automatically infer the underlying activity. This paper explores a cluster-based ensemble method as a new solution for the purposes of activity recognition within smart environments. With this approach activities are modelled as collections of clusters built on different subsets of features. A classification process is performed by assigning a new instance to its closest cluster from each collection. Two different sensor data representations have been investigated, namely numeric and binary. Following the evaluation of the proposed methodology it has been demonstrated that the cluster-based ensemble method can be successfully applied as a viable option for activity recognition. Results following exposure to data collected from a range of activities indicated that the ensemble method had the ability to perform with accuracies of 94.2% and 97.5% for numeric and binary data, respectively. These results outperformed a range of single classifiers considered as benchmarks. PMID:25014095

  5. Clustering-based ensemble learning for activity recognition in smart homes.

    PubMed

    Jurek, Anna; Nugent, Chris; Bi, Yaxin; Wu, Shengli

    2014-07-10

    Application of sensor-based technology within activity monitoring systems is becoming a popular technique within the smart environment paradigm. Nevertheless, the use of such an approach generates complex constructs of data, which subsequently requires the use of intricate activity recognition techniques to automatically infer the underlying activity. This paper explores a cluster-based ensemble method as a new solution for the purposes of activity recognition within smart environments. With this approach activities are modelled as collections of clusters built on different subsets of features. A classification process is performed by assigning a new instance to its closest cluster from each collection. Two different sensor data representations have been investigated, namely numeric and binary. Following the evaluation of the proposed methodology it has been demonstrated that the cluster-based ensemble method can be successfully applied as a viable option for activity recognition. Results following exposure to data collected from a range of activities indicated that the ensemble method had the ability to perform with accuracies of 94.2% and 97.5% for numeric and binary data, respectively. These results outperformed a range of single classifiers considered as benchmarks.
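
    A compact sketch of the cluster-based ensemble idea described above: one k-means clustering is built per feature subset, each cluster is labelled with the majority activity of its training members, and a new instance receives one vote from its nearest cluster in each clustering. The subsets, k, and the voting rule are illustrative assumptions, not the paper's exact configuration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    class ClusterEnsemble:
        def __init__(self, feature_subsets, k=8):
            self.subsets = feature_subsets
            self.members = [KMeans(n_clusters=k, n_init=10) for _ in feature_subsets]
            self.cluster_labels = []

        def fit(self, X, y):                     # y: integer activity labels
            for km, cols in zip(self.members, self.subsets):
                ids = km.fit_predict(X[:, cols])
                # Majority class of the training points in each cluster.
                self.cluster_labels.append(
                    [np.bincount(y[ids == c]).argmax() for c in range(km.n_clusters)])
            return self

        def predict(self, X):
            votes = np.stack([np.take(labels, km.predict(X[:, cols]))
                              for km, cols, labels
                              in zip(self.members, self.subsets, self.cluster_labels)])
            return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

    # e.g. ClusterEnsemble([[0, 1, 2], [3, 4, 5]], k=4).fit(X, y).predict(X_new)
    ```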

  6. Science information systems: Archive, access, and retrieval

    NASA Technical Reports Server (NTRS)

    Campbell, William J.

    1991-01-01

    The objective of this research is to develop technology for the automated characterization and interactive retrieval and visualization of very large, complex scientific data sets. Technologies will be developed for the following specific areas: (1) rapidly archiving data sets; (2) automatically characterizing and labeling data in near real-time; (3) providing users with the ability to browse the contents of databases efficiently and effectively; (4) providing users with the ability to access and retrieve system-independent data sets electronically; and (5) automatically alerting scientists to anomalies detected in data.

  7. Conceptual design of novel IP-conveyor-belt Weissenberg-mode data-collection system with multi-readers for macromolecular crystallography. A comparison between Galaxy and Super Galaxy.

    PubMed

    Sakabe, N; Sakabe, K; Sasaki, K

    2004-01-01

    Galaxy is a Weissenberg-type high-speed, high-resolution and highly accurate fully automatic data-collection system using two cylindrical IP-cassettes, each with a radius of 400 mm and a width of 450 mm. It was originally developed for static three-dimensional analysis using X-ray diffraction and was installed on bending-magnet beamline BL6C at the Photon Factory. It was found, however, that Galaxy was also very useful for time-resolved protein crystallography on a time scale of minutes. This has prompted us to design a new IP-conveyor-belt Weissenberg-mode data-collection system called Super Galaxy for time-resolved crystallography with improved time and crystallographic resolution over that achievable with Galaxy. Super Galaxy was designed with a half-cylinder-shaped cassette with a radius of 420 mm and a width of 690 mm. Using 1.0 Å incident X-rays, these dimensions correspond to maximum resolutions of 0.71 Å in the vertical direction and 1.58 Å in the horizontal. Upper and lower screens can be used to set the frame size of the recorded image. This function is useful not only to reduce the frame-exchange time but also to save disk space on the data server. The use of an IP-conveyor-belt and many IP-readers makes Super Galaxy well suited for time-resolved, monochromatic X-ray crystallography at a very intense third-generation SR beamline. Here, Galaxy and a conceptual design for Super Galaxy are described, and their suitability for use as data-collection systems for macromolecular time-resolved monochromatic X-ray crystallography is compared.
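
    The quoted detector-edge resolutions follow from Bragg's law; as a consistency check for the vertical limit, assuming the half-cylinder geometry subtends a maximum scattering angle of roughly 2θ = 90° in the vertical plane:

    ```latex
    % Bragg's law gives the minimum resolvable d-spacing at the detector edge:
    d_{\min} = \frac{\lambda}{2 \sin\theta_{\max}}
    % Vertical direction, half-cylinder geometry (2\theta_{\max} \approx 90^{\circ}):
    d_{\min}^{\mathrm{vert}} \approx \frac{1.0\,\text{\AA}}{2 \sin 45^{\circ}} \approx 0.71\,\text{\AA}
    ```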

  8. RDFBuilder: a tool to automatically build RDF-based interfaces for MAGE-OM microarray data sources.

    PubMed

    Anguita, Alberto; Martin, Luis; Garcia-Remesal, Miguel; Maojo, Victor

    2013-07-01

    This paper presents RDFBuilder, a tool that enables RDF-based access to MAGE-ML-compliant microarray databases. We have developed a system that automatically transforms the MAGE-OM model and microarray data stored in the ArrayExpress database into RDF format. Additionally, the system automatically enables a SPARQL endpoint. This allows users to execute SPARQL queries for retrieving microarray data, either from specific experiments or from more than one experiment at a time. Our system optimizes response times by caching and reusing information from previous queries. In this paper, we describe our methods for achieving this transformation. We show that our approach is complementary to other existing initiatives, such as Bio2RDF, for accessing and retrieving data from the ArrayExpress database. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
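
    Once such an endpoint is exposed, microarray records can be pulled with a few lines of SPARQL. The endpoint URL and MAGE-OM predicate names below are placeholders for illustration, not RDFBuilder's actual vocabulary:

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://example.org/rdfbuilder/sparql")  # placeholder
    sparql.setQuery("""
        PREFIX mage: <http://example.org/mage-om#>
        SELECT ?experiment ?value WHERE {
            ?experiment mage:hasBioAssayData ?data .
            ?data mage:measuredValue ?value .
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["experiment"]["value"], row["value"]["value"])
    ```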

  9. Emerging technology becomes an opportunity for EOS

    NASA Astrophysics Data System (ADS)

    Fargion, Giulietta S.; Harberts, Robert; Masek, Jeffrey G.

    1996-11-01

    During the last decade, we have seen an explosive growth in our ability to collect and generate data. When implemented, NASA's Earth Observing System Data and Information System (EOSDIS) will receive about 50 gigabytes of remotely sensed image data per hour. This will generate an urgent need for new techniques and tools that can automatically and intelligently assist in transforming this abundance of data into useful knowledge. Some emerging technologies that address these challenges include data mining and knowledge discovery in databases (KDD). The most basic data mining application is a content-based search (examples include finding images of particular meteorological phenomena or identifying data that have been previously mined or interpreted). In order that these technologies be effectively exploited for EOSDIS development, a better understanding of data mining and the requirements for using this technology is necessary. The authors are currently undertaking a project exploring the requirements and options of content-based search and data mining for use on EOSDIS. The scope of the project is to develop a prototype with which to investigate user interface concepts, requirements, and designs relevant for the EOSDIS Core System (ECS) subsystems utilizing these techniques. The goal is to identify a generic handling of these functions. This prototype will help identify opportunities which the earth science community and EOSDIS can use to meet the challenges of collecting, searching, retrieving, and interacting with abundant data resources in highly productive ways.

  10. Training effectiveness of an intelligent tutoring system for a propulsion console trainer

    NASA Technical Reports Server (NTRS)

    Johnson, Debra Steele

    1990-01-01

    A formative evaluation was conducted on an Intelligent Tutoring System (ITS) developed for tasks performed on the Propulsion Console. The ITS, which was developed primarily as a research tool, provides training on use of the Manual Select Keyboard (MSK). Three subjects completed three phases of training using the ITS: declarative, speed, and automaticity training. Data were collected on several performance dimensions, including training time, number of trials performed in each training phase, and number of errors. Information was also collected regarding the user interface and content of training. Suggestions for refining the ITS are discussed. Further, future potential uses and limitations of the ITS are discussed. The results provide an initial demonstration of the effectiveness of the Propulsion Console ITS and indicate the potential benefits of this form of training tool for related tasks.

  11. A World-Wide Net of Solar Radio Spectrometers: e-CALLISTO

    NASA Astrophysics Data System (ADS)

    Benz, A. O.; Monstein, C.; Meyer, H.; Manoharan, P. K.; Ramesh, R.; Altyntsev, A.; Lara, A.; Paez, J.; Cho, K.-S.

    2009-04-01

    Radio spectrometers of the CALLISTO type to observe solar flares have been distributed to nine locations around the globe. The instruments observe automatically; their data are collected every day via the internet and stored in a central database. A public web interface exists through which data can be browsed and retrieved. The nine instruments form a network called e-CALLISTO. It is still growing in the number of stations, as redundancy is desirable for full 24 h coverage of the solar radio emission in the meter and low decimeter band. The e-CALLISTO system has already proven to be a valuable new tool for monitoring solar activity and for space weather research.

  12. Saturn S-2 Automatic Software System /SASS/

    NASA Technical Reports Server (NTRS)

    Parker, P. E.

    1967-01-01

    SATURN S-2 Automatic Software System /SASS/ was designed and implemented to aid SATURN S-2 program development and to increase the overall operating efficiency within the S-2 data laboratory. This program is written in FORTRAN 2 for SDS 920 computers.

  13. Detection and identification of benthic communities and shoreline features in Biscayne Bay

    NASA Technical Reports Server (NTRS)

    Kolipinski, M. C.; Higer, A. L.

    1970-01-01

    Progress made in the development of a technique for identifying and delineating benthic and shoreline communities using multispectral imagery is described. Images were collected with a multispectral scanner system mounted in a C-47 aircraft. Concurrent with the overflight, ecological ground- and sea-truth information was collected at 19 sites in the bay and on the shore. Preliminary processing of the scanner imagery with a CDC 1604 digital computer provided the optimum channels for discernment among different underwater and coastal objects. Automatic mapping of the benthic plants by multiband imagery and the mapping of isotherms and hydrodynamic parameters by digital model can become an effective predictive ecological tool when coupled together. Using the two systems, it appears possible to predict conditions that could adversely affect the benthic communities. With the advent of the ERTS satellites and space platforms, imagery data could be obtained which, when used in conjunction with water-level and meteorological data, would provide for continuous ecological monitoring.

  14. Image Registration Workshop Proceedings

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline (Editor)

    1997-01-01

    Automatic image registration has often been considered as a preliminary step for higher-level processing, such as object recognition or data fusion. But with the unprecedented amounts of data which are being and will continue to be generated by newly developed sensors, the very topic of automatic image registration has become an important research topic. This workshop presents a collection of very high quality work which has been grouped in four main areas: (1) theoretical aspects of image registration; (2) applications to satellite imagery; (3) applications to medical imagery; and (4) image registration for computer vision research.

  15. Going mobile with a multiaccess service for the management of diabetic patients.

    PubMed

    Lanzola, Giordano; Capozzi, Davide; D'Annunzio, Giuseppe; Ferrari, Pietro; Bellazzi, Riccardo; Larizza, Cristiana

    2007-09-01

    Diabetes mellitus is one of the chronic diseases for which the largest number of telemedicine systems has been developed. Our research group has been involved since 1996 in two projects funded by the European Union proposing innovative architectures and services according to the best current medical practices and advances in the information technology area. We propose an enhanced architecture for telemedicine giving rise to a multitier application. The lower tier is represented by a mobile phone hosting the patient unit, which is able to acquire data and provide first-level advice to the patient. The patient unit also facilitates interaction with the health care center, representing the higher tier, by automatically uploading data and receiving back any therapeutic plan supplied by the physician. On the patient's side the mobile phone exploits Bluetooth technology and therefore acts as a hub for a wireless network, possibly including several devices in addition to the glucometer. A new system architecture based on mobile technology is being used to implement several prototypes for assessing its functionality. A subsequent effort will be undertaken to exploit the new system within a pilot study for the follow-up of patients cared for at a major hospital located in northern Italy. We expect that the new architecture will enhance the interaction between patient and caring physician, simplifying and improving metabolic control. In addition to sending glycemic data to the caring center, we also plan to automatically download the therapeutic protocols provided by the physician to the insulin pump and collect data from multiple sensors.

  16. The WAIS Melt Monitor: An automated ice core melting system for meltwater sample handling and the collection of high resolution microparticle size distribution data

    NASA Astrophysics Data System (ADS)

    Breton, D. J.; Koffman, B. G.; Kreutz, K. J.; Hamilton, G. S.

    2010-12-01

    Paleoclimate data are often extracted from ice cores by careful geochemical analysis of meltwater samples. The analysis of the microparticles found in ice cores can also yield unique clues about atmospheric dust loading and transport, dust provenance and past environmental conditions. Determination of microparticle concentration, size distribution and chemical makeup as a function of depth is especially difficult because the particle size measurement either consumes or contaminates the meltwater, preventing further geochemical analysis. Here we describe a microcontroller-based ice core melting system which allows the collection of separate microparticle and chemistry samples from the same depth intervals in the ice core, while logging and accurately depth-tagging real-time electrical conductivity and particle size distribution data. This system was designed specifically to support microparticle analysis of the WAIS Divide WDC06A deep ice core, but many of the subsystems are applicable to more general ice core melting operations. Major system components include: a rotary encoder to measure ice core melt displacement with 0.1 millimeter accuracy; a meltwater tracking system to assign core depths to conductivity, particle and sample vial data; an optical debubbler level control system to protect the Abakus laser particle counter from damage due to air bubbles; a Rabbit 3700 microcontroller which communicates with a host PC, collects encoder and optical sensor data and autonomously operates Gilson peristaltic pumps and fraction collectors to provide automatic sample handling; melt monitor control software operating on a standard PC allowing the user to control and view the status of the system; and data logging software operating on the same PC to collect data from the melting, electrical conductivity and microparticle measurement systems. Because microparticle samples can easily be contaminated, we use optical air bubble sensors and high resolution ice core density profiles to guide the melting process. The combination of these data allows us to analyze melt head performance, minimize outer-to-inner fraction contamination and avoid melt head flooding. The WAIS Melt Monitor system allows the collection of real-time, sub-annual microparticle and electrical conductivity data while producing and storing enough sample for traditional Coulter-Counter particle measurements as well as long-term acid leaching of bioactive metals (e.g., Fe, Co, Cd, Cu, Zn) prior to chemical analysis.
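
    The core of the depth-tagging described above is a conversion from encoder counts to absolute core depth, applied as a stamp on every conductivity and particle reading. A simplified sketch: the counts-per-millimetre figure reflects the quoted 0.1 mm encoder accuracy, while the core-top depth is an invented example value.

    ```python
    COUNTS_PER_MM = 10.0          # one encoder count = 0.1 mm of melted core
    CORE_TOP_DEPTH_M = 120.000    # example depth of the top of the core stick

    def depth_of(encoder_counts):
        """Absolute depth (m) currently at the melt head."""
        return CORE_TOP_DEPTH_M + encoder_counts / COUNTS_PER_MM / 1000.0

    def tag_reading(encoder_counts, conductivity, particle_counts):
        """Stamp one incoming measurement with its core depth."""
        return {"depth_m": round(depth_of(encoder_counts), 4),
                "conductivity": conductivity,
                "particles": particle_counts}
    ```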

  17. Investigating energy-saving potentials in the cloud.

    PubMed

    Lee, Da-Sheng

    2014-02-20

    Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain-store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation on the web may impact the energy usage data collected online. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similarity conditions. Similarity conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and exposure frequency on the web. By calculating the EUI difference or pure technical efficiency of the stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the saving potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential and avoids wasting money on site visits.

  18. Investigating Energy-Saving Potentials in the Cloud

    PubMed Central

    Lee, Da-Sheng

    2014-01-01

    Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain-store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation on the web may impact the energy usage data collected online. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similarity conditions. Similarity conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and exposure frequency on the web. By calculating the EUI difference or pure technical efficiency of the stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the saving potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential and avoids wasting money on site visits. PMID:24561405
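
    The EUI at the heart of the comparison is annual energy consumption normalised by floor area; the saving potential then falls out of the EUI gap between similarity-matched stores. A minimal sketch with made-up numbers:

    ```python
    def energy_use_index(monthly_kwh, floor_area_m2):
        """EUI in kWh per square metre per year."""
        return sum(monthly_kwh) / floor_area_m2

    store_a = energy_use_index([3200] * 12, 85.0)   # scraped utility records
    store_b = energy_use_index([2500] * 12, 85.0)   # similarity-matched store
    saving_potential = (store_a - store_b) / store_a
    print(f"saving potential of store A: {saving_potential:.1%}")
    ```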

  19. Automatic-repeat-request error control schemes

    NASA Technical Reports Server (NTRS)

    Lin, S.; Costello, D. J., Jr.; Miller, M. J.

    1983-01-01

    Error detection incorporated with automatic-repeat-request (ARQ) is widely used for error control in data communication systems. This method of error control is simple and provides high system reliability. If a properly chosen code is used for error detection, virtually error-free data transmission can be attained. Various types of ARQ and hybrid ARQ schemes, and error detection using linear block codes are surveyed.
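
    The basic ARQ loop is short enough to sketch: frames carry a CRC, the receiver acknowledges only frames whose CRC verifies, and the sender retransmits on timeout. The frame layout, timeout, and `channel` object below are illustrative assumptions, not a specific scheme from the survey:

    ```python
    # Stop-and-wait ARQ in miniature, with CRC-32 error detection.
    import zlib

    def make_frame(seq: int, payload: bytes) -> bytes:
        body = bytes([seq]) + payload
        return body + zlib.crc32(body).to_bytes(4, "big")

    def check_frame(frame: bytes):
        body, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
        return body if zlib.crc32(body) == crc else None   # None -> stay silent

    def send(channel, seq: int, payload: bytes, max_retries: int = 5) -> bool:
        frame = make_frame(seq, payload)
        for _ in range(max_retries):
            channel.transmit(frame)                        # hypothetical API
            if channel.wait_ack(timeout=0.5) == seq:       # hypothetical API
                return True                                # delivered
        return False                                       # give up after retries
    ```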

  20. Auspice: Automatic Service Planning in Cloud/Grid Environments

    NASA Astrophysics Data System (ADS)

    Chiu, David; Agrawal, Gagan

    Recent scientific advances have fostered a mounting number of services and data sets available for utilization. These resources, though scattered across disparate locations, are often loosely coupled both semantically and operationally. This loosely coupled relationship implies the possibility of linking together operations and data sets to answer queries. This task, generally known as automatic service composition, therefore abstracts the process of complex scientific workflow planning from the user. We have been exploring a metadata-driven approach toward automatic service workflow composition, among other enabling mechanisms, in our system, Auspice: Automatic Service Planning in Cloud/Grid Environments. In this paper, we present a complete overview of our system's unique features and outlooks for future deployment as the Cloud computing paradigm becomes increasingly eminent in enabling scientific computing.

  1. Headway Deviation Effects on Bus Passenger Loads : Analysis of Tri-Met's Archived AVL-APC Data

    DOT National Transportation Integrated Search

    2003-01-01

    In this paper we empirically analyze the relationship between transit service headway deviations and passenger loads, using archived data from Tri-Met's automatic vehicle location and automatic passenger counter systems. The analysis employs two-stage...

  2. Evolution of user analysis on the grid in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.; ATLAS Collaboration

    2017-10-01

    More than one thousand physicists analyse data collected by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN through 150 computing facilities around the world. Efficient distributed analysis requires optimal resource usage and the interplay of several factors: robust grid and software infrastructures, and system capability to adapt to different workloads. The continuous automatic validation of grid sites and the user support provided by a dedicated team of expert shifters have proven to provide a solid distributed analysis system for ATLAS users. Typical user workflows on the grid, and their associated metrics, are discussed. Measurements of user job performance and typical requirements are also shown.

  3. Manual editing of automatically recorded data in an anesthesia information management system.

    PubMed

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  4. Computer-Assisted Search Of Large Textual Data Bases

    NASA Technical Reports Server (NTRS)

    Driscoll, James R.

    1995-01-01

    "QA" denotes high-speed computer system for searching diverse collections of documents including (but not limited to) technical reference manuals, legal documents, medical documents, news releases, and patents. Incorporates previously available and emerging information-retrieval technology to help user intelligently and rapidly locate information found in large textual data bases. Technology includes provision for inquiries in natural language; statistical ranking of retrieved information; artificial-intelligence implementation of semantics, in which "surface level" knowledge found in text used to improve ranking of retrieved information; and relevance feedback, in which user's judgements of relevance of some retrieved documents used automatically to modify search for further information.

  5. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection.

    PubMed

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, version 10 (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using precisionM, recallM, F-measureM, accuracy, and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures (approaching 85% to 90% for most metrics) with a feature subset size of 30. The proposed system also showed an approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and the four baselines. The proposed system is feasible and practical for automatic classification of ICD-10-coded causes of death from autopsy reports. It assists pathologists in accurately and rapidly determining the underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports.

  6. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection

    PubMed Central

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Objectives Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Methods Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, version 10 (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five feature subset sizes, and five machine learning classifiers. Model performance was evaluated using precisionM, recallM, F-measureM, accuracy, and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Results Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures (approaching 85% to 90% for most metrics) with a feature subset size of 30. The proposed system also showed an approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and the four baselines. Conclusion The proposed system is feasible and practical for automatic classification of ICD-10-coded causes of death from autopsy reports. It assists pathologists in accurately and rapidly determining the underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports. PMID:28166263
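
    A minimal sketch of the winning configuration, assuming scikit-learn and a toy corpus (the labels, reports, and reduced k are placeholders; the paper used a feature subset size of 30 on a real corpus): unigram features, chi-squared feature selection, and a random forest classifier.

    ```python
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.feature_selection import SelectKBest, chi2
    from sklearn.pipeline import Pipeline

    reports = ["fell from height, multiple fractures noted",
               "road traffic accident, severe head injury",
               "drowning, water in lungs, pulmonary oedema"]
    causes = ["fall", "transport", "drowning"]        # stand-ins for ICD-10 categories

    clf = Pipeline([
        ("unigrams", CountVectorizer(ngram_range=(1, 1))),   # unigram features
        ("select", SelectKBest(chi2, k=5)),                  # the paper used k=30
        ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
    ])
    clf.fit(reports, causes)
    print(clf.predict(["pedestrian struck in traffic, head injury"]))
    ```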

  7. Construction Management Activities of Governmental Agencies in the New England Area During the Construction Phase.

    DTIC Science & Technology

    1980-02-01

    automatic data exchange ... 56 There are currently 12 Data Systems available: 1. Integrated Disbursing and Accounting (IDA) 2. Integrated Program Management...construction project progress through the use of a CPM scheduling and progress reporting system. It automatically generates invoices for payment and payment...posted on the project. Water will be drained daily from tanks of vehicle air brake systems. Rigging, hooks, pendants and slings will be examined

  8. The Northern California Earthquake Management System: A Unified System From Realtime Monitoring to Data Distribution

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.

    2006-12-01

    The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing from triggering and locating through magnitude and moment tensor calculation and Shakemap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves both realtime processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.

  9. Automatic Pre-Hospital Vital Signs Waveform and Trend Data Capture Fills Quality Management, Triage and Outcome Prediction Gaps

    PubMed Central

    Mackenzie, Colin F; Hu, Peter; Sen, Ayan; Dutton, Rick; Seebode, Steve; Floccare, Doug; Scalea, Tom

    2008-01-01

    Trauma Triage errors are frequent and costly. What happens in pre-hospital care remains anecdotal because of the dual responsibility of treatment (resuscitation and stabilization) and documentation in a time-critical environment. Continuous pre-hospital vital signs waveforms and numerical trends were automatically collected in our study. Abnormalities of pulse oximeter oxygen saturation (< 95%) and validated heart rate (> 100/min) showed better prediction of injury severity, need for immediate blood transfusion, intra-abdominal surgery, tracheal intubation and chest tube insertion than Trauma Registry data or Pre-hospital provider estimations. Automated means of data collection introduced the potential for more accurate and objective reporting of patient vital signs helping in evaluating quality of care and establishing performance indicators and benchmarks. Addition of novel and existing non-invasive monitors and waveform analyses could make the pulse oximeter the decision aid of choice to improve trauma patient triage. PMID:18999022

  10. An electric propulsion long term test facility

    NASA Technical Reports Server (NTRS)

    Trump, G.; James, E.; Vetrone, R.; Bechtel, R.

    1979-01-01

    An existing test facility was modified to provide for extended testing of multiple electric propulsion thruster subsystems. A program to document thruster subsystem characteristics as a function of time is currently in progress. The facility is capable of simultaneously operating three 2.7-kW, 30-cm mercury ion thrusters and their power processing units. Each thruster is installed via a separate air lock so that it can be extended into the 7m x 10m main chamber without violating vacuum integrity. The thrusters exhaust into a 3m x 5m frozen mercury target. An array of cryopanels collects sputtered target material. Power processor units are tested in an adjacent 1.5m x 2m vacuum chamber or an accompanying forced convection enclosure. The thruster subsystems and the test facility are designed for automatic unattended operation, with thruster operation computer controlled. Test data are recorded by a central data collection system scanning 200 channels of data per second every two minutes. Results of the Systems Demonstration Test, a short shakedown test of 500 hours, and facility performance during the first year of testing are presented.

  11. A PC-based computer package for automatic detection and location of earthquakes: Application to a seismic network in eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano

    A few automated data acquisition and processing systems operate on mainframes, some run on UNIX-based workstations, and others on personal computers equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and its real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure, MSA (multi-station analysis), for signal detection, phase grouping, and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of the S-waves. The on-line application to the latter data set shows that the automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
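
    PC-Seism's MSA procedure itself is not published as code here; as a stand-in, the sketch below implements the classic single-station STA/LTA trigger that underlies many automatic pickers, assuming a 100 Hz trace and illustrative window lengths:

    ```python
    import numpy as np

    def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0, threshold=3.5):
        """Return sample indices where the STA/LTA ratio first exceeds threshold."""
        sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
        energy = trace.astype(float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        sta = (csum[lta_n:] - csum[lta_n - sta_n:-sta_n]) / sta_n   # short-term average
        lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n                # long-term average
        ratio = sta / np.maximum(lta, 1e-12)
        onsets = np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold))
        return onsets + lta_n   # shift back to absolute sample numbers

    fs = 100.0
    trace = np.random.randn(12000)
    trace[6000:6400] += 8 * np.random.randn(400)   # synthetic "event" at t = 60 s
    print(sta_lta(trace, fs))
    ```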

  12. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal switching system (Autocal) for the data acquisition system of the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications, and the overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.

  13. A procedure for automating CFD simulations of an inlet-bleed problem

    NASA Technical Reports Server (NTRS)

    Chyu, Wei J.; Rimlinger, Mark J.; Shih, Tom I.-P.

    1995-01-01

    A procedure was developed to improve the turn-around time for computational fluid dynamics (CFD) simulations of an inlet-bleed problem involving oblique shock-wave/boundary-layer interactions on a flat plate with bleed into a plenum through one or more circular holes. This procedure is embodied in a preprocessor called AUTOMAT. With AUTOMAT, once data for the geometry and flow conditions have been specified (either interactively or via a namelist), it will automatically generate all input files needed to perform a three-dimensional Navier-Stokes simulation of the prescribed inlet-bleed problem by using the PEGASUS and OVERFLOW codes. The input files automatically generated by AUTOMAT include those for the grid system and those for the initial and boundary conditions. The grid systems automatically generated by AUTOMAT are multi-block structured grids of the overlapping type. Results obtained by using AUTOMAT are presented to illustrate its capability.

  14. From ontology selection and semantic web to an integrated information system for food-borne diseases and food safety.

    PubMed

    Yan, Xianghe; Peng, Yun; Meng, Jianghong; Ruzante, Juliana; Fratamico, Pina M; Huang, Lihan; Juneja, Vijay; Needleman, David S

    2011-01-01

    Several factors have hindered effective use of information and resources related to food safety: inconsistency among semantically heterogeneous data resources, lack of knowledge on the profiling of food-borne pathogens, and knowledge gaps among research communities, government risk assessors/managers, and end-users of the information. This paper discusses technical aspects of the establishment of a comprehensive food safety information system, consisting of the following steps: (a) computational collection and compilation of publicly available information, including published pathogen genomic, proteomic, and metabolomic data; (b) development of ontology libraries on food-borne pathogens and design of automatic algorithms with formal inference and fuzzy and probabilistic reasoning to address the consistency and accuracy of distributed information resources (e.g., PulseNet, FoodNet, OutbreakNet, PubMed, NCBI, EMBL, and other online genetic databases and information); (c) integration of collected pathogen profiling data, Foodrisk.org ( http://www.foodrisk.org ), PMP, Combase, and other relevant information into a user-friendly, searchable, "homogeneous" information system available to scientists in academia, the food industry, and government agencies; and (d) development of a computational model in the semantic web for greater adaptability and robustness.

  15. Customized laboratory information management system for a clinical and research leukemia cytogenetics laboratory.

    PubMed

    Bakshi, Sonal R; Shukla, Shilin N; Shah, Pankaj M

    2009-01-01

    We developed a Microsoft Access-based laboratory management system to facilitate database management for leukemia patients referred for cytogenetic tests, namely karyotyping and fluorescence in situ hybridization (FISH). The database is custom-made for entry of patient data, clinical details, sample details, and cytogenetics test results, and for data mining in various ongoing research areas. A number of clinical research laboratory-related tasks are carried out faster using specific "queries." The tasks include tracking the clinical progression of a particular patient across multiple visits, treatment response, morphological and cytogenetic response, survival time, automatic grouping of patients by inclusion criteria in a research project, tracking various sample processing steps, turn-around time, and revenue generated. Since 2005 we have collected over 5,000 samples. The database is easily updated and is being adapted for various data maintenance and mining needs.

  16. A real-time freehand ultrasound calibration system with automatic accuracy feedback and control.

    PubMed

    Chen, Thomas Kuiran; Thurston, Adrian D; Ellis, Randy E; Abolmaesumi, Purang

    2009-01-01

    This article describes a fully automatic, real-time, freehand ultrasound calibration system. The system was designed to be simple and sterilizable, intended for operating-room usage. The calibration system employed an automatic-error-retrieval and accuracy-control mechanism based on a set of ground-truth data. Extensive validations were conducted on a data set of 10,000 images in 50 independent calibration trials to thoroughly investigate the accuracy, robustness, and performance of the calibration system. On average, the calibration accuracy (measured in three-dimensional reconstruction error against a known ground truth) of all 50 trials was 0.66 mm. In addition, the calibration errors converged to submillimeter in 98% of all trials within 12.5 s on average. Overall, the calibration system was able to consistently, efficiently and robustly achieve high calibration accuracy with real-time performance.
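
    A minimal sketch of the accuracy-control idea, assuming a homogeneous 4x4 calibration transform and a set of ground-truth points (the function names and the 1 mm gate are illustrative, not the published system's implementation):

    ```python
    import numpy as np

    def reconstruction_error(T, image_pts, truth_pts):
        """Mean 3-D distance between transformed image points and ground truth.

        T is a 4x4 homogeneous calibration transform; points are Nx3 arrays in mm.
        """
        homo = np.hstack([image_pts, np.ones((len(image_pts), 1))])
        mapped = (T @ homo.T).T[:, :3]
        return np.linalg.norm(mapped - truth_pts, axis=1).mean()

    def accept_calibration(T, image_pts, truth_pts, tol_mm=1.0):
        # Accuracy-control gate: keep collecting frames until error is submillimeter.
        return reconstruction_error(T, image_pts, truth_pts) < tol_mm

    T = np.eye(4)
    pts = np.random.rand(20, 3) * 50
    print(accept_calibration(T, pts, pts + 0.3))   # ~0.52 mm error -> accepted
    ```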

  17. 75 FR 61487 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-05

    ... by more than one system, automatic monitoring equipment must be installed at the base station to prevent activation of the transmitter when signals of co-channel stations are present and activation would... of the interconnected base station transmitter. A statement must be submitted to the Commission...

  18. Acquisition and use of Orlando, Florida and Continental Airbus radar flight test data

    NASA Technical Reports Server (NTRS)

    Eide, Michael C.; Mathews, Bruce

    1992-01-01

    Westinghouse is developing a lookdown pulse Doppler radar for production as the sensor and processor of a forward looking hazardous windshear detection and avoidance system. A data collection prototype of that product was ready for flight testing in Orlando to encounter low level windshear in corroboration with the FAA Terminal Doppler Weather Radar (TDWR). Airborne real-time processing and display of the hazard factor were demonstrated with TDWR-facilitated intercepts and penetrations of over 80 microbursts in a three day period, including microbursts with hazard factors in excess of 0.16 (with 500 ft. PIREP altitude loss) and the hazard factor display at 6 n.mi. of a visually transparent ('dry') microburst with TDWR-corroborated outflow reflectivities of +5 dBz. Range-gated Doppler spectrum data were recorded for subsequent development and refinement of hazard factor detection and urban clutter rejection algorithms. Following Orlando, the data collection radar received supplemental type certification for revenue service on a Continental Airlines Airbus, operating automatically and without interfering with its ARINC 708 radar, to allow Westinghouse to confirm its understanding of commercial aircraft installation, interface realities, and urban airport clutter. A number of software upgrades, all of which were verified at the Receiver-Transmitter-Processor (RTP) hardware bench with Orlando microburst data to produce the desired advanced-warning hazard factor detection, included some preliminary loads with automatic (sliding window average hazard factor) detection and annunciation recording. The current (14-APR-92) configured software is free from false and/or nuisance alerts (CAUTIONS, WARNINGS, etc.) for all take-off and landing approaches, under 2500 ft. altitude to weight-on-wheels, into all encountered airports, including Newark (NJ), LAX, Denver, Houston, Cleveland, etc. Using the Orlando data collected on hazardous microbursts, Westinghouse has developed a lookdown pulse Doppler radar product with signal and data processing algorithms which detect realistic microburst hazards, and has demonstrated that those algorithms produce no false or nuisance alerts in urban airport ground moving vehicle (GMTI) and/or clutter environments.

  19. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model, which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
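
    AutoBayes's generated C/C++ is not reproduced here; as a hand-written stand-in, the sketch below shows the kind of closed-form EM updates for a two-component Gaussian mixture that the segmentation model mentioned above calls for (1-D pixel intensities, synthetic data):

    ```python
    import numpy as np

    def em_two_gaussians(x, iters=50):
        """EM for a two-component 1-D Gaussian mixture (e.g. background vs. object)."""
        mu = np.percentile(x, [25, 75]).astype(float)      # crude initialisation
        sigma = np.array([x.std(), x.std()])
        weight = np.array([0.5, 0.5])
        for _ in range(iters):
            # E-step: responsibility of each component for each pixel.
            pdf = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
            resp = weight * pdf
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: closed-form updates (the kind AutoBayes derives symbolically).
            n_k = resp.sum(axis=0)
            mu = (resp * x[:, None]).sum(axis=0) / n_k
            sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
            weight = n_k / len(x)
        return mu, sigma, weight

    rng = np.random.default_rng(0)
    pixels = np.concatenate([rng.normal(10, 2, 5000), rng.normal(40, 5, 1000)])
    print(em_two_gaussians(pixels))   # recovers means near 10 and 40
    ```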

  20. The ISAC-CNR micrometeorological base and database in Lecce

    NASA Astrophysics Data System (ADS)

    Martano, P.; Grasso, F.; Elefante, C.

    2010-09-01

    The micrometeorological base of CNR-ISAC in Lecce, in the south-east of Italy, has been active since 2002, collecting experimental data on the surface-atmosphere transfer of momentum, heat, and water vapour. It operates in a periurban site inside the Salento University campus and has been improved over the past years in terms of active sensors to give a fairly complete description of soil-atmosphere vertical transfer. It is composed of a 16 m mast with fast-response (eddy correlation) instrumentation and an ancillary automatic meteorological station that also collects soil data at two depths. Fast-response data are pre-processed into half-hour averaged statistics. All collected data are available in a web database (www.basesperimentale.le.isac.cnr.it) where they can be visualized or downloaded. A real-time automated connection between the base and the database is also in progress. At present the Lecce database is also a pilot reference structure for the Climate Change Section of the CNR-DTA GIIDA project (National Research Council - Earth and Environment Department, Interdisciplinary and Interoperational Management of Environmental Data), aimed at building a spatial data infrastructure between different CNR-DTA structures collecting environmental data. This will allow easier search and availability of a great deal of environmental information, in terms of data associated with international quality standards and metadata systems (GEOSS, GMES, INSPIRE).
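
    A minimal sketch of the half-hour averaging applied to fast-response (eddy correlation) data, assuming a 20 Hz vertical wind and scalar series; the variable names and synthetic data are illustrative:

    ```python
    import numpy as np

    def eddy_covariance(w, c, fs=20.0, block_minutes=30):
        """Half-hour averaged covariance flux from fast-response series.

        w: vertical wind speed (m/s); c: a scalar such as sonic temperature.
        Returns one covariance w'c' per 30-minute block, the base's averaging unit.
        """
        n = int(block_minutes * 60 * fs)            # samples per half-hour block
        fluxes = []
        for start in range(0, len(w) - n + 1, n):
            wb, cb = w[start:start + n], c[start:start + n]
            # Reynolds decomposition: subtract the block mean to get fluctuations.
            fluxes.append(np.mean((wb - wb.mean()) * (cb - cb.mean())))
        return np.array(fluxes)

    fs = 20.0
    n_samples = int(2 * 1800 * fs)                  # one hour of 20 Hz data
    w = np.random.randn(n_samples) * 0.3
    temp = 20 + 0.5 * w + np.random.randn(n_samples) * 0.1   # correlated scalar
    print(eddy_covariance(w, temp, fs))             # ~0.045 K m/s per block
    ```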

  1. Improved automatic adjustment of density and contrast in FCR system using neural network

    NASA Astrophysics Data System (ADS)

    Takeo, Hideya; Nakajima, Nobuyoshi; Ishida, Masamitsu; Kato, Hisatoyo

    1994-05-01

    The FCR system automatically adjusts image density and contrast by analyzing the histogram of the image data in the radiation field. The advanced image recognition methods proposed in this paper, based on neural network technology, can improve the automatic adjustment performance. There are two methods, both using a three-layer neural network with backpropagation. In one method the image data are input directly to the input layer; in the other, the histogram data are input. The former is effective for imaging menus such as the shoulder joint, where the position of the region of interest on the histogram changes with differences in positioning; the latter is effective for imaging menus such as the pediatric chest, where the histogram shape changes with differences in positioning. We experimentally confirmed the validity of these methods with respect to automatic adjustment performance, compared with conventional histogram analysis methods.

  2. Automatic Sleep Stage Determination by Multi-Valued Decision Making Based on Conditional Probability with Optimal Parameters

    NASA Astrophysics Data System (ADS)

    Wang, Bei; Sugi, Takenao; Wang, Xingyu; Nakamura, Masatoshi

    Data for human sleep studies may be affected by internal and external influences. Recorded sleep data contain complex and stochastic factors, which increase the difficulty of applying computerized sleep stage determination techniques in clinical practice. The aim of this study is to develop an automatic sleep stage determination system which is optimized for variable sleep data. The methodology comprises two modules: expert knowledge database construction and automatic sleep stage determination. Visual inspection by a qualified clinician is used to obtain the probability density functions of the parameters during the learning process of expert knowledge database construction. Parameter selection is introduced in order to make the algorithm flexible. Automatic sleep stage determination is then performed based on conditional probability. The results showed close agreement with visual inspection by a clinician. The developed system can meet customized requirements in hospitals and institutions.
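
    A minimal sketch of the conditional-probability decision step, assuming per-stage Gaussian parameter PDFs learned from clinician-scored epochs (the stage names, parameter values, and priors are hypothetical):

    ```python
    import numpy as np

    # Toy expert knowledge database: per-stage Gaussian PDFs of two epoch
    # parameters (e.g. EEG alpha power, EMG level), plus stage priors.
    stages = {
        "Wake": {"mean": np.array([0.8, 0.7]), "std": np.array([0.2, 0.2]), "prior": 0.2},
        "REM":  {"mean": np.array([0.5, 0.1]), "std": np.array([0.2, 0.1]), "prior": 0.2},
        "NREM": {"mean": np.array([0.2, 0.3]), "std": np.array([0.1, 0.2]), "prior": 0.6},
    }

    def log_gauss(x, mean, std):
        return -0.5 * np.sum(((x - mean) / std) ** 2 + np.log(2 * np.pi * std ** 2))

    def determine_stage(epoch_params):
        # Pick the stage maximizing P(stage | params) ~ P(params | stage) * P(stage).
        scores = {s: log_gauss(epoch_params, d["mean"], d["std"]) + np.log(d["prior"])
                  for s, d in stages.items()}
        return max(scores, key=scores.get)

    print(determine_stage(np.array([0.75, 0.65])))   # -> "Wake"
    ```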

  3. EXIF Custom: Automatic image metadata extraction for Scratchpads and Drupal.

    PubMed

    Baker, Ed

    2013-01-01

    Many institutions and individuals use embedded metadata to aid in the management of their image collections. Many desktop image management solutions such as Adobe Bridge and online tools such as Flickr also make use of embedded metadata to describe, categorise and license images. Until now Scratchpads (a data management system and virtual research environment for biodiversity) have not made use of these metadata, and users have had to manually re-enter this information if they have wanted to display it on their Scratchpad site. The Drupal module described here allows users to map metadata embedded in their images to the associated field in the Scratchpads image form using one or more customised mappings. The module works seamlessly with the bulk image uploader used on Scratchpads and it is therefore possible to upload hundreds of images easily with automatic metadata (EXIF, XMP and IPTC) extraction and mapping.

  4. EXIF Custom: Automatic image metadata extraction for Scratchpads and Drupal

    PubMed Central

    2013-01-01

    Abstract Many institutions and individuals use embedded metadata to aid in the management of their image collections. Many desktop image management solutions such as Adobe Bridge and online tools such as Flickr also make use of embedded metadata to describe, categorise and license images. Until now Scratchpads (a data management system and virtual research environment for biodiversity) have not made use of these metadata, and users have had to manually re-enter this information if they have wanted to display it on their Scratchpad site. The Drupal module described here allows users to map metadata embedded in their images to the associated field in the Scratchpads image form using one or more customised mappings. The module works seamlessly with the bulk image uploader used on Scratchpads and it is therefore possible to upload hundreds of images easily with automatic metadata (EXIF, XMP and IPTC) extraction and mapping. PMID:24723768
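
    The module itself is written for Drupal; as a language-neutral illustration, the Python sketch below extracts EXIF tags with Pillow and applies a customised mapping to form fields (the field names and file path are hypothetical, and XMP/IPTC handling is omitted):

    ```python
    from PIL import ExifTags, Image

    # Hypothetical mapping from EXIF tag names to Scratchpads image-form fields.
    FIELD_MAP = {"Artist": "creator", "Copyright": "licence", "DateTime": "date_captured"}

    def extract_mapped_metadata(path):
        exif = Image.open(path).getexif()
        # Translate numeric EXIF tag IDs into human-readable tag names.
        named = {ExifTags.TAGS.get(tag_id, str(tag_id)): value
                 for tag_id, value in exif.items()}
        # Keep only the tags the customised mapping knows about.
        return {field: named[tag] for tag, field in FIELD_MAP.items() if tag in named}

    print(extract_mapped_metadata("specimen_0001.jpg"))
    ```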

  5. Microcomputer control soft tube measuring-testing instrument

    NASA Astrophysics Data System (ADS)

    Zhou, Yanzhou; Jiang, Xiu-Zhen; Wang, Wen-Yi

    1993-09-01

    Soft tubes are key and easily damaged parts used in large numbers by transportation vehicles. For a long time, measuring and testing of the tubes was done by hand. In cooperation with the Harbin Railway Bureau, we have recently developed a new kind of automatic measuring and testing instrument. In this paper, the instrument's structure, properties, and measuring principle are presented in detail. The centre of the system is an INTEL 80C31 single-chip processor. It collects and processes data, displays the results on LEDs, and also controls electromagnetic valves and motors. Five soft tubes are measured and tested at the same time, and the whole process is completed automatically. Counter-electromagnetic-disturbance methods are adopted efficiently in both hardware and software, so the performance of the instrument is significantly improved. Long-term operation shows the instrument to be reliable and practical; it solves a quite difficult problem in railway transportation.

  6. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    NASA Technical Reports Server (NTRS)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
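
    A minimal sketch of generating and evaluating one such tree, assuming scikit-learn and synthetic module metrics (the 74-metric layout and the high-effort rule are placeholders, not the study's data):

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n_modules = 4700
    X = rng.random((n_modules, 74))                  # 74 metrics per software module
    # Hypothetical ground truth: modules with large size/churn need high effort.
    y = (X[:, 0] + X[:, 1] > 1.2).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
    print(f"high-effort modules correctly identified: {tree.score(X_te, y_te):.1%}")
    ```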

  7. Feasibility study ASCS remote sensing/compliance determination system

    NASA Technical Reports Server (NTRS)

    Duggan, I. E.; Minter, T. C., Jr.; Moore, B. H.; Nosworthy, C. T.

    1973-01-01

    A short-term technical study was performed by the MSC Earth Observations Division to determine the feasibility of the proposed Agricultural Stabilization and Conservation Service Automatic Remote Sensing/Compliance Determination System. For the study, the term automatic was interpreted as applying to an automated remote-sensing system that includes data acquisition, processing, and management.

  8. Automatic Detection of Storm Damages Using High-Altitude Photogrammetric Imaging

    NASA Astrophysics Data System (ADS)

    Litkey, P.; Nurminen, K.; Honkavaara, E.

    2013-05-01

    The risks of storms that cause damage in forests are increasing due to climate change. Quickly detecting fallen trees, assessing their number, and collecting them efficiently are of great importance for economic and environmental reasons. Visually detecting and delineating storm damage is a laborious and error-prone process; thus, it is important to develop cost-efficient and highly automated methods. The objective of our research project is to investigate and develop a reliable and efficient method for automatic storm damage detection based on airborne imagery collected after a storm. The method requires before-storm and after-storm surface models. A difference surface is calculated from the two DSMs, and the locations where significant changes have appeared are automatically detected. In our previous research we used a four-year-old airborne laser scanning surface model as the before-storm surface. The after-storm DSM was produced from the photogrammetric images using the Next Generation Automatic Terrain Extraction (NGATE) algorithm of the Socet Set software. We obtained 100% accuracy in the detection of major storm damage. In this investigation we will further evaluate the sensitivity of the storm-damage detection process. We will investigate the potential of national airborne photography, which is collected in the leaf-off season, to produce a before-storm DSM automatically using image matching. We will also compare the impact of the terrain extraction algorithm on the results. Our results will also promote the potential of national open source data sets in the management of natural disasters.
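
    A minimal sketch of the DSM-differencing step, assuming gridded before/after surface models in metres (the drop threshold and minimum patch size are illustrative, not the project's calibrated values):

    ```python
    import numpy as np

    def storm_damage_mask(dsm_before, dsm_after, drop_threshold=10.0, min_cells=20):
        """Flag cells where the canopy surface dropped by more than drop_threshold m."""
        diff = dsm_after - dsm_before          # negative where trees have fallen
        mask = diff < -drop_threshold
        # Ignore isolated noise: require a minimum number of changed cells.
        return mask if mask.sum() >= min_cells else np.zeros_like(mask)

    before = np.full((100, 100), 25.0)         # 25 m canopy surface
    after = before.copy()
    after[40:60, 40:60] = 2.0                  # fallen patch exposes near-ground level
    print(storm_damage_mask(before, after).sum())   # -> 400 damaged cells
    ```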

  9. Preparing a collection of radiology examinations for distribution and retrieval.

    PubMed

    Demner-Fushman, Dina; Kohli, Marc D; Rosenman, Marc B; Shooshan, Sonya E; Rodriguez, Laritza; Antani, Sameer; Thoma, George R; McDonald, Clement J

    2016-03-01

    Clinical documents made available for secondary use play an increasingly important role in discovery of clinical knowledge, development of research methods, and education. An important step in facilitating secondary use of clinical document collections is easy access to descriptions and samples that represent the content of the collections. This paper presents an approach to developing a collection of radiology examinations, including both the images and radiologist narrative reports, and making them publicly available in a searchable database. The authors collected 3996 radiology reports from the Indiana Network for Patient Care and 8121 associated images from the hospitals' picture archiving systems. The images and reports were de-identified automatically and then the automatic de-identification was manually verified. The authors coded the key findings of the reports and empirically assessed the benefits of manual coding on retrieval. The automatic de-identification of the narrative was aggressive and achieved 100% precision at the cost of rendering a few findings uninterpretable. Automatic de-identification of images was not quite as perfect. Images for two of 3996 patients (0.05%) showed protected health information. Manual encoding of findings improved retrieval precision. Stringent de-identification methods can remove all identifiers from text radiology reports. DICOM de-identification of images does not remove all identifying information and needs special attention to images scanned from film. Adding manual coding to the radiologist narrative reports significantly improved relevancy of the retrieved clinical documents. The de-identified Indiana chest X-ray collection is available for searching and downloading from the National Library of Medicine (http://openi.nlm.nih.gov/). Published by Oxford University Press on behalf of the American Medical Informatics Association 2015. This work is written by US Government employees and is in the public domain in the US.

  10. Automatic 3d Building Model Generations with Airborne LiDAR Data

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems have become more and more popular because of their potential for obtaining accurate point clouds of vegetation and man-made objects on the earth's surface quickly. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, three-dimensional (3D) modelling and simulation, change detection, engineering works, map revision, coastal management, and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, a simple and quick approach to automatic 3D building model generation is needed for the many studies that include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is pursued. An approach is proposed that includes automatic point-based classification of the raw LiDAR point cloud, using hierarchical rules, for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results, using different test areas identified in the study area. The proposed approach was tested in the study area in Zekeriyakoy, Istanbul, which includes partly open areas, forest areas, and many types of buildings, using the TerraScan module of TerraSolid. The 3D building models were generated automatically using the results of the automatic point-based classification. The results of this research on the study area verified that 3D building models can be generated successfully and automatically from raw LiDAR point cloud data.

  11. A simulator evaluation of an automatic terminal approach system

    NASA Technical Reports Server (NTRS)

    Hinton, D. A.

    1983-01-01

    The automatic terminal approach system (ATAS) is a concept for improving the pilot/machine interface with cockpit automation. The ATAS can automatically fly a published instrument approach by using stored instrument approach data to automatically tune airplane avionics, control the airplane's autopilot, and display status information to the pilot. A piloted simulation study was conducted to determine the feasibility of an ATAS, determine pilot acceptance, and examine pilot/ATAS interaction. Seven instrument-rated pilots each flew four instrument approaches with a base-line heading select autopilot mode. The ATAS runs resulted in lower flight technical error, lower pilot workload, and fewer blunders than with the baseline autopilot. The ATAS status display enabled the pilots to maintain situational awareness during the automatic approaches. The system was well accepted by the pilots.

  12. The SmartOR: a distributed sensor network to improve operating room efficiency.

    PubMed

    Huang, Albert Y; Joerger, Guillaume; Fikfak, Vid; Salmon, Remi; Dunkin, Brian J; Bass, Barbara L; Garbey, Marc

    2017-09-01

    Despite the significant expense of OR time, best practice achieves only 70% efficiency. Compounding this problem is a lack of real-time data. Most current OR utilization programs require manual data entry. Automated systems require installation and maintenance of expensive tracking hardware throughout the institution. This study developed an inexpensive, automated OR utilization system and analyzed data from multiple operating rooms. OR activity was deconstructed into four room states. A sensor network was then developed to automatically capture these states using only three sensors, a local wireless network, and a data capture computer. Two systems were then installed in two ORs, with recordings captured 24/7. The SmartOR recorded the following events: any room activity, patient entry/exit time, anesthesia time, laparoscopy time, room turnover time, and time of preoperative patient identification by the surgeon. From November 2014 to December 2015, data on 1003 cases were collected. The mean turnover time was 36 min, and 38% of cases met the institutional goal of ≤30 min. Data analysis also identified outlier cases (>1 SD from the mean) in the domains of time from patient entry into the OR to intubation (11% of cases) and time from extubation to patient exit from the OR (11% of cases). Time from surgeon identification of the patient to scheduled procedure start time was 11 min (institutional bylaws require 20 min before the scheduled start time), yet OR teams required 22 min on average to bring a patient into the room after surgeon identification. The SmartOR automatically and reliably captures data on OR room state and, in real time, identifies outlier cases that may be examined more closely to improve efficiency. As no manual entry is required, the data are indisputable and allow OR teams to maintain a patient-centric focus.
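
    A minimal sketch of mapping three binary sensor readings to four room states, with hypothetical sensor semantics (the published system's exact sensor set and rules are not reproduced):

    ```python
    from enum import Enum

    class RoomState(Enum):
        IDLE = 0          # no activity
        TURNOVER = 1      # activity, no patient
        PATIENT_IN = 2    # patient present, anesthesia off
        ANESTHESIA = 3    # anesthesia machine active

    def infer_state(motion, patient_present, anesthesia_on):
        """Map three binary sensor readings to one of four room states."""
        if anesthesia_on:
            return RoomState.ANESTHESIA
        if patient_present:
            return RoomState.PATIENT_IN
        if motion:
            return RoomState.TURNOVER
        return RoomState.IDLE

    # Replay a (toy) sensor log; state transitions bound the turnover intervals.
    log = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 0)]
    print([infer_state(*reading).name for reading in log])
    ```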

  13. High precision dual-axis tracking solar wireless charging system based on the four quadrant photoelectric sensor

    NASA Astrophysics Data System (ADS)

    Liu, Zhilong; Wang, Biao; Tong, Weichao

    2015-08-01

    This paper presents an automatic solar-tracking wireless charging system based on a four-quadrant photoelectric sensor. The system tracks the sun automatically in real time to receive maximum energy and charges the load wirelessly through electromagnetic coupling. Using a four-quadrant photoelectric sensor responsive to the solar spectrum, the system obtains the current azimuth and elevation angles of the light by comparing the solar energy incident on each quadrant of the sensor. The system drives the solar panels with a biaxial movement mechanism, rotating and tilting until the panel and the light are perpendicular to each other. This maximizes the use of solar energy and requires no external power supply, achieving energy self-sufficiency. The collected solar energy can charge portable devices and loads wirelessly through close electromagnetic field coupling. Experimental data show that the four-quadrant photoelectric sensor measures the light angle sensitively: when tracking the sun, the azimuth deviation is less than 0.8° and the elevation deviation is less than 0.6°. The utilization efficiency of a conventional solar cell is only 10%-20%; the four-quadrant dual-axis tracking raises the utilization rate to 25%-35%. The electromagnetic coupling efficiency of the wireless charging reached 60%.
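
    A minimal sketch of the quadrant-comparison step, assuming an A|B over C|D quadrant layout and normalized photocurrents (the signs and drive directions are illustrative, not the paper's exact geometry):

    ```python
    def pointing_error(q_a, q_b, q_c, q_d):
        """Normalized pointing errors from four quadrant photocurrents.

        When the light spot is centred, the four signals are equal and both
        errors are zero; the sign of each error tells each axis which way to
        drive the biaxial mechanism.
        """
        total = q_a + q_b + q_c + q_d
        azimuth_err = ((q_a + q_c) - (q_b + q_d)) / total     # left minus right
        elevation_err = ((q_a + q_b) - (q_c + q_d)) / total   # top minus bottom
        return azimuth_err, elevation_err

    az, el = pointing_error(1.2, 1.0, 1.1, 0.9)
    print(f"drive azimuth {'left' if az > 0 else 'right'}, "
          f"elevation {'down' if el > 0 else 'up'}")
    ```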

  14. GET REAL!

    EPA Science Inventory

    Combined sewer overflow (CSO) is a significant source of pollution in receiving waters. However, implementing a real-time control scheme operates automatic regulators more efficiently to maximize a collection system's storage, treatment, and transport capacities, reducing the vol...

  15. Word Naming in the L1 and L2: A Dynamic Perspective on Automatization and the Degree of Semantic Involvement in Naming.

    PubMed

    Plat, Rika; Lowie, Wander; de Bot, Kees

    2017-01-01

    Reaction time data have long been collected in order to gain insight into the underlying mechanisms involved in language processing. Means analyses often attempt to break down what factors relate to what portion of the total reaction time. From a dynamic systems theory perspective or an interaction dominant view of language processing, it is impossible to isolate discrete factors contributing to language processing, since these continually and interactively play a role. Non-linear analyses offer the tools to investigate the underlying process of language use in time, without having to isolate discrete factors. Patterns of variability in reaction time data may disclose the relative contribution of automatic (grapheme-to-phoneme conversion) processing and attention-demanding (semantic) processing. The presence of a fractal structure in the variability of a reaction time series indicates automaticity in the mental structures contributing to a task. A decorrelated pattern of variability will indicate a higher degree of attention-demanding processing. A focus on variability patterns allows us to examine the relative contribution of automatic and attention-demanding processing when a speaker is using the mother tongue (L1) or a second language (L2). A word naming task conducted in the L1 (Dutch) and L2 (English) shows L1 word processing to rely more on automatic spelling-to-sound conversion than L2 word processing. A word naming task with a semantic categorization subtask showed more reliance on attention-demanding semantic processing when using the L2. A comparison to L1 English data shows this was not only due to the amount of language use or language dominance, but also to the difference in orthographic depth between Dutch and English. An important implication of this finding is that when the same task is used to test and compare different languages, one cannot straightforwardly assume the same cognitive sub processes are involved to an equal degree using the same task in different languages.
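
    A minimal sketch of one common way to estimate fractal structure in a reaction-time series, detrended fluctuation analysis (the paper's exact analysis pipeline is not reproduced; the window sizes and synthetic data are illustrative):

    ```python
    import numpy as np

    def dfa_alpha(rt_series, scales=(4, 8, 16, 32, 64)):
        """Detrended fluctuation analysis exponent of a reaction-time series.

        alpha ~ 0.5 indicates decorrelated (white-noise) variability, read as
        attention-demanding processing; alpha ~ 1.0 indicates fractal (1/f)
        structure, read as automaticity.
        """
        profile = np.cumsum(rt_series - np.mean(rt_series))   # integrated series
        flucts = []
        for s in scales:
            f2 = []
            for w in range(len(profile) // s):
                seg = profile[w * s:(w + 1) * s]
                x = np.arange(s)
                trend = np.polyval(np.polyfit(x, seg, 1), x)  # local linear detrend
                f2.append(np.mean((seg - trend) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha

    rng = np.random.default_rng(1)
    print(dfa_alpha(rng.normal(500, 50, 1024)))   # ~0.5 for uncorrelated RTs
    ```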

  16. Implementation of an Electronic Data Collection Tool to Monitor Nursing-Sensitive Indicators in a Large Academic Health Sciences Centre.

    PubMed

    Backman, Chantal; Vanderloo, Saskia; Momtahan, Kathy; d'Entremont, Barb; Freeman, Lisa; Kachuik, Lynn; Rossy, Dianne; Mille, Toba; Mojaverian, Naghmeh; Lemire-Rodger, Ginette; Forster, Alan

    2015-09-01

    Monitoring the quality of nursing care is essential to identify patients at risk, measure adherence to hospital policies and evaluate the effectiveness of best practice interventions. However, monitoring nursing-sensitive indicators (NSI) is a challenge. Prevalence surveys are one method used by some organizations to monitor NSI, which are patient outcomes that are directly affected by the quantity or quality of nursing care that the patient receives. The aim of this paper is to describe the development of an innovative electronic data collection tool to monitor NSI. In the preliminary development work, we designed a mobile computing application with pre-populated patient census information to collect the nursing quality data. In subsequent phases, we refined this process by designing an electronic trigger using The Ottawa Hospital's Patient Safety Learning System, which automatically generated a case report form for each inpatient based on the hospital's daily patient census on the day of the prevalence survey. Both of these electronic data collection tools were accessible on tablet computers, which substantially reduced data collection, analysis and reporting time compared to previous paper-based methods. The electronic trigger provided improved completeness of the data. This work leveraged the use of tablet computers combined with a web-based application for patient data collection at point of care. Overall, the electronic methods improved data completeness and timeliness compared to traditional paper-based methods. This initiative has resulted in the ability to collect and report on NSI organization-wide to advance decision-making support and identify quality improvement opportunities within the organization. Copyright © 2015 Longwoods Publishing.

  17. Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection

    DTIC Science & Technology

    2017-03-20

    The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning...to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM...classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers

  18. Pinyon, Version 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Logan; Hackenberg, Robert

    2017-02-13

    Pinyon is a tool that stores steps involved in creating a model derived from a collection of data. The main function of Pinyon is to store descriptions of calculations used to analyze or visualize the data in a database, and allow users to view the results of these calculations via a web interface. Additionally, users may also use the web interface to make adjustments to the calculations and rerun the entire collection of analysis steps automatically.

  19. Distribution of Software Changes for Battlefield Computer Systems: A lingering Problem

    DTIC Science & Technology

    1983-06-03

    Defense, 10 June 1963), pp. 1-4. 3 Ibid. 4Automatic Data Processing Systems, Book - 1 Introduction (U.S. Army Signal School, Fort Monmouth, New Jersey, 15...January 1960) , passim. 5Automatic Data Processing Systems, Book - 2 Army Use of ADPS (U.S. Army Signal School, Fort Monmouth, New Jersey, 15 October...execute an application or utility program. It controls how the computer functions during a given operation. Utility programs are merely general use

  20. Automatic Identification of Critical Data Items in a Database to Mitigate the Effects of Malicious Insiders

    NASA Astrophysics Data System (ADS)

    White, Jonathan; Panda, Brajendra

    A major concern for computer system security is the threat from malicious insiders who target and abuse critical data items in the system. In this paper, we propose a solution to enable automatic identification of critical data items in a database by way of data dependency relationships. This identification of critical data items is necessary because insider threats often target mission critical data in order to accomplish malicious tasks. Unfortunately, currently available systems fail to address this problem in a comprehensive manner. It is more difficult for non-experts to identify these critical data items because of their lack of familiarity and due to the fact that data systems are constantly changing. By identifying the critical data items automatically, security engineers will be better prepared to protect what is critical to the mission of the organization and also have the ability to focus their security efforts on these critical data items. We have developed an algorithm that scans the database logs and forms a directed graph showing which items influence a large number of other items and at what frequency this influence occurs. This graph is traversed to reveal the data items which have a large influence throughout the database system by using a novel metric based formula. These items are critical to the system because if they are maliciously altered or stolen, the malicious alterations will spread throughout the system, delaying recovery and causing a much more malignant effect. As these items have significant influence, they are deemed to be critical and worthy of extra security measures. Our proposal is not intended to replace existing intrusion detection systems, but rather is intended to complement current and future technologies. Our proposal has never been performed before, and our experimental results have shown that it is very effective in revealing critical data items automatically.
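
    A minimal sketch of the idea, assuming a simplified log format and a frequency-weighted, hop-damped influence score as a stand-in for the paper's metric:

    ```python
    from collections import defaultdict

    # Each log record: (written_item, items_read_to_compute_it).
    log = [("B", ["A"]), ("C", ["A"]), ("D", ["B", "C"]),
           ("E", ["D"]), ("C", ["A"]), ("F", ["A"])]

    # Directed graph: edge x -> y weighted by how often x influenced a write of y.
    graph = defaultdict(lambda: defaultdict(int))
    for written, read_set in log:
        for r in read_set:
            graph[r][written] += 1

    def influence(item, damping=0.5, depth=3):
        """Frequency-weighted count of items reachable from `item`.

        Direct influence counts fully; transitive influence is damped per hop,
        approximating how far a malicious alteration would spread.
        """
        score, frontier = 0.0, {item: 1.0}
        for _ in range(depth):
            nxt = defaultdict(float)
            for node, w in frontier.items():
                for child, freq in graph.get(node, {}).items():
                    score += w * freq
                    nxt[child] += w * damping
            frontier = nxt
        return score

    items = {w for w, _ in log} | {r for _, rs in log for r in rs}
    print(sorted(items, key=influence, reverse=True))   # "A" ranks as most critical
    ```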

  1. IoT in Radiology: Using Raspberry Pi to Automatically Log Telephone Calls in the Reading Room.

    PubMed

    Chen, Po-Hao; Cross, Nathan

    2018-05-03

    The work environment in medical imaging, including distractions, ergonomics, distance, temperature, humidity, and lighting conditions, generates a paucity of data and is difficult to analyze. The emergence of the Internet of Things (IoT) and the decreasing cost of single-board computers like the Raspberry Pi put creating customized hardware to collect data from the clinical environment within the reach of a clinical imaging informaticist. This article walks the reader through a series of basic projects using a variety of sensors and devices in conjunction with a Pi to gather data, culminating in a complex example designed to automatically detect and log telephone calls.
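
    The event-logging core of such a project is simple. The sketch below is a hypothetical, minimal version: it assumes an external ring-detector circuit wired to BCM pin 17 of the Pi (the pin choice and circuit are assumptions, not from the article) and appends a timestamp to a CSV file each time the line fires, using the gpiozero library.

        # Minimal sketch: log a timestamp whenever a digital input fires.
        import csv
        from datetime import datetime
        from signal import pause

        from gpiozero import Button  # treats a debounced digital input as a "button"

        sensor = Button(17)  # BCM pin 17; the ring-detector circuit is external

        def log_call():
            with open("call_log.csv", "a", newline="") as f:
                csv.writer(f).writerow([datetime.now().isoformat()])

        sensor.when_pressed = log_call  # fires once per detected ring event
        pause()                         # keep the script alive, waiting for events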

  2. Automatic Recognition of Indoor Navigation Elements from Kinect Point Clouds

    NASA Astrophysics Data System (ADS)

    Zeng, L.; Kang, Z.

    2017-09-01

    This paper addresses automatic recognition of the navigation elements defined by the IndoorGML data standard: doors, stairways, and walls. The data used are indoor 3D point clouds collected by a Kinect v2 by means of ORB-SLAM. Compared with lidar this is cheaper and more convenient, but the point clouds suffer from noise, registration error, and large data volume. Hence, we adopt a shape descriptor proposed by Osada, a histogram of distances between randomly chosen point pairs, merge it with other descriptors, and use a random forest classifier to recognize the navigation elements (doors, stairways, and walls) in the Kinect point clouds. The approach acquires navigation elements and their 3D locations from each data frame through point-cloud segmentation, boundary extraction, feature calculation, and classification. Finally, the acquired navigation elements and their information are used to generate the state data of the indoor navigation module automatically. The experimental results demonstrate a high recognition accuracy for the proposed method.
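
    The descriptor-plus-classifier pipeline can be sketched compactly. The code below is an illustration under stated assumptions, not the paper's implementation: it computes Osada's D2 descriptor (a histogram of distances between random point pairs) on synthetic stand-ins for segmented clouds and trains a random forest on the result.

        # Sketch: D2 shape descriptor + random forest on synthetic point clouds.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)

        def d2_descriptor(points, n_pairs=2000, bins=32, max_dist=3.0):
            """Histogram of distances between randomly chosen point pairs."""
            i = rng.integers(0, len(points), size=n_pairs)
            j = rng.integers(0, len(points), size=n_pairs)
            d = np.linalg.norm(points[i] - points[j], axis=1)
            hist, _ = np.histogram(d, bins=bins, range=(0.0, max_dist), density=True)
            return hist

        def make_cloud(kind, n=500):
            """Synthetic stand-ins: a planar 'wall' and a stepped 'stairway'."""
            pts = rng.uniform(0, 2, size=(n, 3))
            if kind == "wall":
                pts[:, 2] *= 0.01                        # flatten into a plane
            else:
                pts[:, 2] = np.floor(pts[:, 2] * 4) / 4  # quantize into steps
            return pts

        X = np.array([d2_descriptor(make_cloud(k)) for k in ["wall", "stair"] * 50])
        y = np.array([0, 1] * 50)

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:80], y[:80])
        print("held-out accuracy:", clf.score(X[80:], y[80:]))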

  3. On-line data collection platform for national dose surveys in diagnostic and interventional radiology.

    PubMed

    Vassileva, J; Simeonov, F; Avramova-Cholakova, S

    2015-07-01

    According to the Bulgarian regulation for radiation protection at medical exposure, the National Centre of Radiobiology and Radiation Protection (NCRRP) is responsible for performing national dose surveys in diagnostic and interventional radiology and nuclear medicine and for establishing national diagnostic reference levels (DRLs). The next national dose survey is under preparation, to be performed in the period 2015-16, with the aim of covering conventional radiography, mammography, conventional fluoroscopy, interventional and fluoroscopy-guided procedures, and CT. It will be performed electronically using a centralised on-line data collection platform established by the NCRRP; the aim is to increase the response rate and to improve accuracy by reducing human error. The concept of the on-line dose data collection platform is presented. It provides radiological facilities with a tool to determine local typical patient doses, and the NCRRP with a basis for establishing national DRLs. Future work will include automatic retrieval of dose data from hospital picture archiving and communication systems. The on-line data collection platform is expected to facilitate the process of dose audit and optimisation of radiological procedures in Bulgarian hospitals.

  4. Addressing fundamental architectural challenges of an activity-based intelligence and advanced analytics (ABIAA) system

    NASA Astrophysics Data System (ADS)

    Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.

    2015-06-01

    The domain of geospatial intelligence analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based tipping and cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than human analysts can process, while maintaining the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short-term requirements; however, they have serious limitations that preclude their effective use in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations, providing an overview of an advanced ABIAA system and comparisons with legacy systems. It concludes with a recommended strategy and an incremental approach to the research, development, and construction of a fully automated ABIAA system.

  5. Automatic Identification System (AIS) Collection and Reach-back System: System Description

    DTIC Science & Technology

    2014-08-20

    ... installation. 8.1.1 Power: The HPS requires three NEMA 5 outlets, one each for the RPC, BPC, and KG-175D TACLANE-Micro COMSEC device. The HPS draws less than ... [acronym list:] Military Sealift Command; NEMA, National Electrical Manufacturers Association; NMEA, National Marine Electronics Association; NRL, Naval Research Laboratory; OTH-G ...

  6. 75 FR 29520 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... section 1404(b) of the Act (``Drain Cover Standard''). In addition to the anti-entrapment devices or... system; gravity drainage system; automatic pump shut-off system or drain disablement. The Pool and Spa... the drain covers, anti-entrapment device/systems, sump or equalizer lines at the site; and report on...

  7. Instance-Based Question Answering

    DTIC Science & Technology

    2006-12-01

    ... answer clustering, composition, and scoring. Moreover, with the effort dedicated to improving monolingual system performance, system parameters are ... text collections: document type, manual or automatic annotations (if any), and stylistic and notational differences in technical terms. ... a forum in which cross-language retrieval systems and question answering systems are tested for various European languages. The CLEF QA monolingual task ...

  8. Container-code recognition system based on computer vision and deep neural networks

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Li, Tianjian; Jiang, Li; Liang, Xiaoyao

    2018-04-01

    Automatic container-code recognition has become a crucial requirement for the ship transportation industry in recent years. In this paper, an automatic container-code recognition system based on computer vision and deep neural networks is proposed. The system consists of two modules: a detection module and a recognition module. The detection module applies both computer-vision algorithms and neural networks, and combines their results to avoid the drawbacks of either method; the combined detection results are also collected for online training of the neural networks. The recognition module exploits both character segmentation and end-to-end recognition, and outputs the recognition result that passes verification. When the recognition module produces a false recognition, the result is corrected and collected for online training of the end-to-end recognition sub-module. By combining several algorithms, the system is able to deal with more situations, and the online training mechanism improves the performance of the neural networks at runtime. The proposed system achieves an overall recognition accuracy of 93%.
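
    The character-segmentation half of such a recognition module is easy to illustrate. The sketch below is a generic, hypothetical version, not the paper's implementation: given a binarized code region, it splits characters wherever a run of blank pixel columns separates two inked regions.

        # Sketch: segment characters in a binary image at empty column gaps.
        import numpy as np

        def segment_columns(binary):
            """Return (start, end) column spans of connected inked regions."""
            ink = binary.sum(axis=0) > 0  # which columns contain any ink
            segments, start = [], None
            for x, has_ink in enumerate(ink):
                if has_ink and start is None:
                    start = x
                elif not has_ink and start is not None:
                    segments.append((start, x))
                    start = None
            if start is not None:
                segments.append((start, len(ink)))
            return segments

        # Toy "image": three blobs separated by blank columns.
        img = np.zeros((8, 20), dtype=int)
        img[:, 1:4] = 1; img[:, 7:10] = 1; img[:, 14:18] = 1
        print(segment_columns(img))  # [(1, 4), (7, 10), (14, 18)]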

  9. Application of the Golden Software Surfer mapping software for automation of visualisation of meteorological and oceanographic data in IMGW Maritime Branch.

    NASA Astrophysics Data System (ADS)

    Piliczewski, B.

    2003-04-01

    The Golden Software Surfer package has been used in the IMGW Maritime Branch for more than ten years. This tool provides ActiveX Automation objects, which allow scripts to control practically every feature of Surfer. These objects can be accessed from any Automation-enabled environment, such as Visual Basic or Excel. Several applications based on Surfer have been developed at IMGW. The first example is an on-line oceanographic service, which presents forecasts of water temperature, sea level, and currents originating from the HIROMB model and is automatically updated every day. Surfer was also utilised in MERMAID, an international project supported by the EC under the 5th Framework Programme. The main aim of this project was to create a prototype of an Internet-based data brokerage system, which would enable users to search for, extract, purchase, and download datasets containing meteorological or oceanographic data. During the project, IMGW developed an online application called Mermaid Viewer, which enables communication with the data broker and automatic visualisation of the downloaded data using Surfer. Both of the above-mentioned applications were developed in Visual Basic. Adopting Surfer for the monitoring service, which provides access to the data collected in monitoring of the Baltic Sea environment, is currently under consideration.
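
    The same Automation objects can be driven from Python on a machine with Surfer installed. The sketch below is an assumption-laden illustration: the file paths are invented, and the method names follow Golden Software's published Automation examples but should be verified against the installed version's documentation. It grids an XYZ data file and adds a contour map to a new plot document.

        # Hedged sketch: scripting Surfer via ActiveX Automation from Python.
        # Requires Windows, an installed copy of Surfer, and the pywin32 package.
        import win32com.client

        # EnsureDispatch generates early-binding wrappers so keyword args work.
        app = win32com.client.gencache.EnsureDispatch("Surfer.Application")
        app.Visible = True

        # Grid an XYZ data file, then display the grid as a contour map.
        app.GridData(DataFile=r"C:\data\sst_today.dat", OutGrid=r"C:\data\sst_today.grd")
        plot = app.Documents.Add()  # a new plot document
        plot.Shapes.AddContourMap(GridFileName=r"C:\data\sst_today.grd")
        plot.SaveAs(r"C:\data\sst_today.srf")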

  10. Lynx: Automatic Elderly Behavior Prediction in Home Telecare

    PubMed Central

    Lopez-Guede, Jose Manuel; Moreno-Fernandez-de-Leceta, Aitor; Martinez-Garcia, Alexeiw; Graña, Manuel

    2015-01-01

    This paper introduces Lynx, an intelligent system for personal safety in home environments, oriented to elderly people living independently, which encompasses a decision-support machine for automatic home risk prevention, tested in real-life environments to respond to situations in real time. The system described in this paper prevents such risks through advanced analytic methods supported by an expert knowledge system. It is minimally intrusive, using plug-and-play sensors and machine learning algorithms to learn the elder's daily activity, taking the person's health records into account. If the system detects that something unusual is happening (in a wide sense), or that something is wrong relative to the user's health habits or medical recommendations, it sends a real-time alarm to the family, care center, or medical agents, without human intervention. The system feeds on information from sensors deployed in the home and knowledge of the subject's physical activities, which can be collected by mobile applications and enriched by personalized health information from clinical reports encoded in the system. The system's usability and reliability have been tested in real-life conditions, with an accuracy greater than 81%. PMID:26783514
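
    The learn-normal-then-alarm pattern at the heart of such a system can be sketched briefly. The example below is an illustration only, not Lynx's algorithm: the daily-activity features are invented, and scikit-learn's IsolationForest stands in for whatever model the authors actually use.

        # Sketch: model normal daily activity, flag anomalous days.
        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(2)

        # One row per day: [hours slept, kitchen visits, door events, steps/100]
        normal_days = np.column_stack([
            rng.normal(7.5, 0.5, 60),
            rng.normal(5.0, 1.0, 60),
            rng.normal(2.0, 0.5, 60),
            rng.normal(30.0, 4.0, 60),
        ])
        model = IsolationForest(contamination=0.05, random_state=0).fit(normal_days)

        today = np.array([[3.1, 0.0, 0.0, 4.0]])  # very little activity today
        if model.predict(today)[0] == -1:          # -1 means "anomalous"
            print("unusual day: notify family / care center")  # alert hook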

  12. The State of Retrieval System Evaluation.

    ERIC Educational Resources Information Center

    Salton, Gerald

    1992-01-01

    The current state of information retrieval (IR) evaluation is reviewed with criticisms directed at the available test collections and the research and evaluation methodologies used, including precision and recall rates for online searches and laboratory tests not including real users. Automatic text retrieval systems are also discussed. (32…

  13. A Limited-Vocabulary, Multi-Speaker Automatic Isolated Word Recognition System.

    ERIC Educational Resources Information Center

    Paul, James E., Jr.

    Techniques for automatic recognition of isolated words are investigated, and a computer simulation of a word recognition system is effected. Considered in detail are data acquisition and digitizing, word detection, amplitude and time normalization, short-time spectral estimation including spectral windowing, spectral envelope approximation,…

  14. Mode Transitions in Glass Cockpit Aircraft: Results of a Field Study

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Kirlik, Alex; Shafto, Michael (Technical Monitor)

    1995-01-01

    One consequence of increased levels of automation in complex control systems is the presence of modes. A mode is a particular configuration of a control system that defines how human command inputs are interpreted. In complex systems, modes also often determine a specific allocation of control authority between the human and automated systems. Even in simple static devices (e.g., electronic watches, word processors), the presence of modes has been found to cause problems in either the acquisition or the production of skilled performance. Many of these problems arise from the fact that the selection of a mode causes device behavior to be mediated by hidden internal state information. For these simple systems, many of these interaction problems can be solved by designing appropriate feedback to communicate internal state information to the human operator. In complex dynamic systems, however, the design issues associated with modes seem to transcend the problem of merely communicating internal state information via displayed feedback. In complex supervisory control systems (e.g., aircraft, spacecraft, military command and control), a key function of modes is the selection of a particular configuration of control authority between the human operator and automated control systems. One mode may result in full manual control, another may result in a mix of manual and automatic control, while a third may result in full automatic control over the entire system. The human operator selects an appropriate mode as a function of current goals, operating conditions, and operating procedures. Thus, the operator is put in the position of essentially trying to control two coupled dynamic systems: the target system itself, and a highly complex suite of automation controlling the target system. From a historical perspective, it should probably not come as a surprise that very little information is available to guide the design of mode-oriented control systems. The topic of function allocation (i.e., the proper division of control authority among human and computer) has a long history in human-machine systems research. Although this research has produced some relevant guidelines, a design approach capable of defining appropriate allocations of control function between the human and automation is not yet available. As a result, the function allocation decision itself has been allocated to the operator, to be performed in real time during the operation of mode-oriented control systems. A variety of documented aircraft accidents and incidents suggest that the real-time selection and monitoring of control modes is a weak link in the effective operation of complex supervisory control systems. Research in human-machine systems and human-computer interaction has barely scraped the surface of the problem of understanding how operators manage this task. The purpose of this paper is to present the results of a field study which examined how operators manage mode selection in a complex supervisory control system. Data on mode engagements using the Boeing B757/767 auto-flight system were collected during approach and descent into four major airports on the East Coast of the United States. Protocols documenting mode selections, automatic mode changes, pilot actions, quantitative records of flight-path variables, and verbal reports during and after mode engagements were collected by an observer in the jumpseat. Observations were conducted on two typical trips between three airports. Each trip was replicated 11 times, yielding a total of 22 trips and 66 legs on which data were collected. All data concerned the same flight numbers and, therefore, the same time of day, the same type of aircraft, and identical operational environments (e.g., ATC facilities, weather patterns, traffic flow, etc.).

  15. Online service for monitoring the ionosphere based on data from the global navigation satellite system

    NASA Astrophysics Data System (ADS)

    Aleshin, I. M.; Alpatov, V. V.; Vasil'ev, A. E.; Burguchev, S. S.; Kholodkov, K. I.; Budnikov, P. A.; Molodtsov, D. A.; Koryagin, V. N.; Perederin, F. V.

    2014-07-01

    A service is described that makes possible the effective construction of a three-dimensional ionospheric model based on the data of ground receivers of signals from global navigation satellite positioning systems (GNSS). The obtained image has a high resolution, mainly because data from the IPG GNSS network of the Federal Service for Hydrometeorology and Environmental Monitoring (Rosgidromet) are used. A specially developed format and its implementation in the form of SQL structures are used to collect, transmit, and store data. The method of high-altitude radio tomography is used to construct the three-dimensional model. The operation of all system components (from registration point organization to the procedure for constructing the electron density three-dimensional distribution and publication of the total electron content map on the Internet) has been described in detail. The three-dimensional image of the ionosphere, obtained automatically, is compared with the ionosonde measurements, calculated using the two-dimensional low-altitude tomography method and averaged by the ionospheric model.

  16. A web-based system architecture for ontology-based data integration in the domain of IT benchmarking

    NASA Astrophysics Data System (ADS)

    Pfaff, Matthias; Krcmar, Helmut

    2018-03-01

    In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.

  17. Automatic Method for Building Indoor Boundary Models from Dense Point Clouds Collected by Laser Scanners

    PubMed Central

    Valero, Enrique; Adán, Antonio; Cerrada, Carlos

    2012-01-01

    In this paper we present a method that automatically yields Boundary Representation Models (B-rep) for indoors after processing dense point clouds collected by laser scanners from key locations through an existing facility. Our objective is particularly focused on providing single models which contain the shape, location and relationship of primitive structural elements of inhabited scenarios such as walls, ceilings and floors. We propose a discretization of the space in order to accurately segment the 3D data and generate complete B-rep models of indoors in which faces, edges and vertices are coherently connected. The approach has been tested in real scenarios with data coming from laser scanners yielding promising results. We have deeply evaluated the results by analyzing how reliably these elements can be detected and how accurately they are modeled. PMID:23443369

  18. Runway Incursion Prevention System ADS-B and DGPS Data Link Analysis Dallas-Fort Worth International Airport

    NASA Technical Reports Server (NTRS)

    Timmerman, J.; Jones, Denise R. (Technical Monitor)

    2001-01-01

    A Runway Incursion Prevention System (RIPS) was tested at the Dallas-Ft. Worth International Airport in October 2000. The system integrated airborne and ground components to provide both pilots and controllers with enhanced situational awareness, supplemental guidance cues, a real-time display of traffic information, and warning of runway incursions, in order to prevent runway incidents while also improving operational capability. Rockwell Collins provided and supported a prototype Automatic Dependent Surveillance-Broadcast (ADS-B) system using 1090 MHz and a prototype Differential GPS (DGPS) system onboard the NASA Boeing 757 research aircraft. This report describes the Rockwell Collins contributions to the RIPS flight test, summarizes the development process, and analyzes both ADS-B and DGPS data collected during the flight test. In addition, results are reported for interoperability tests conducted between the NASA Advanced General Aviation Transport Experiments (AGATE) ADS-B flight test system and the NASA Boeing 757 ADS-B system.

  19. Automation of surface observations program

    NASA Technical Reports Server (NTRS)

    Short, Steve E.

    1988-01-01

    At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.

  20. Computer-based information collection and feedback for Norwegian municipal health and social services.

    PubMed

    Yang, J J

    1995-01-01

    Norway is governed by a three-tier parliamentary system in which each tier is governed by a popularly elected body: the national parliament, the county councils, and the municipality councils. This three-tier system is in many ways also reflected in the organization, management, and financing of health and social services. A large amount of information (e.g., statistics and annual reports) flows between the three levels of management. In order to have a proper and efficient information flow, the Norwegian Ministry of Health and Social Affairs has, since 1992, been conducting a nation-wide project for information collection from, and feedback to, municipal health and social services (see Figure 1). In this presentation, we will present the basic idea behind The Wheel. We will also discuss some of the major activities in, and experiences from, the project of using information technology to implement an electronic Wheel [1]. The basic issues to consider in implementing such a system are: obtaining a unified information basis, to increase data quality and to compile "definition catalogs" that contain commonly agreed-upon definitions of central concepts and data sets used in the municipal health and social services [2]; achieving electronic data collection, both in terms of automatic selection and aggregation of relevant data from operational systems in the municipalities and in terms of using electronic forms; experimenting with various ways of electronically feeding back statistics and other comparative data to the municipalities; and providing the municipal users with appropriate tools for using the statistics that are fed back.

  1. Sma3s: a three-step modular annotator for large sequence datasets.

    PubMed

    Muñoz-Mérida, Antonio; Viguera, Enrique; Claros, M Gonzalo; Trelles, Oswaldo; Pérez-Pulido, Antonio J

    2014-08-01

    Automatic sequence annotation is an essential component of modern 'omics' studies, which aim to extract information from large collections of sequence data. Most existing tools use sequence homology to establish evolutionary relationships and assign putative functions to sequences. However, it can be difficult to define a similarity threshold that achieves sufficient coverage without sacrificing annotation quality. Defining the correct configuration is critical and can be challenging for non-specialist users. Thus, the development of robust automatic annotation techniques that generate high-quality annotations without needing expert knowledge would be very valuable for the research community. We present Sma3s, a tool for automatically annotating very large collections of biological sequences from any kind of gene library or genome. Sma3s is composed of three modules that progressively annotate query sequences using either: (i) very similar homologues, (ii) orthologous sequences or (iii) terms enriched in groups of homologous sequences. We trained the system using several random sets of known sequences, demonstrating average sensitivity and specificity values of ~85%. In conclusion, Sma3s is a versatile tool for high-throughput annotation of a wide variety of sequence datasets that outperforms the accuracy of other well-established annotation algorithms, and it can enrich existing database annotations and uncover previously hidden features. Importantly, Sma3s has already been used in the functional annotation of two published transcriptomes.

  2. Automatic Radiated Susceptibility Test System for Payload Equipment

    NASA Technical Reports Server (NTRS)

    Ngo, Hoai T.; Sturman, John C.; Sargent, Noel B.

    1995-01-01

    An automatic radiated susceptibility test system (ARSTS) was developed for NASA Lewis Research Center's Electromagnetic Interference Laboratory. According to MSFC-SPEC 521B, any electrical or electronic equipment that will be transported by Spacelab and the Space Shuttle must be tested for susceptibility to electromagnetic interference. This state-of-the-art automatic test system performs the necessary calculations; analyzes, processes, and records a great quantity of measured data; and monitors the equipment being tested in real time, with minimal user intervention. ARSTS reduces costly test time, increases test accuracy, and provides reliable test results.

  3. Model-driven approach to data collection and reporting for quality improvement

    PubMed Central

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek

    2014-01-01

    Continuous data collection and analysis have been shown essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. PMID:24874182
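
    The generative step, from a team-authored model to a working data-collection artifact, is the crux of the approach. The sketch below is a toy illustration of that idea, not the IDM format or WISH itself: a plain dictionary stands in for the improvement model, and it is compiled into a SQLite table that could back a data-entry form. All field names are invented.

        # Toy sketch: compile a declarative improvement model into a schema.
        import sqlite3

        model = {
            "project": "copd_admissions",
            "items": [
                {"name": "admission_date", "type": "TEXT"},
                {"name": "length_of_stay", "type": "INTEGER"},
                {"name": "readmitted_30d", "type": "INTEGER"},  # 0/1 flag
            ],
        }

        def build_schema(model, db):
            """Create one table per model, one column per data item."""
            cols = ", ".join(f"{item['name']} {item['type']}" for item in model["items"])
            db.execute(f"CREATE TABLE {model['project']} (id INTEGER PRIMARY KEY, {cols})")

        db = sqlite3.connect(":memory:")
        build_schema(model, db)
        print(db.execute(f"PRAGMA table_info({model['project']})").fetchall())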

  4. STARS 2.0: 2nd-generation open-source archiving and query software

    NASA Astrophysics Data System (ADS)

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving: we identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model of taking pictures at the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query: we identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms for observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, in-house, with little status and error reporting and no error recovery, to a stored search result that can be monitored and transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.

  5. Strategies for automatic planning: A collection of ideas

    NASA Technical Reports Server (NTRS)

    Collins, Carol; George, Julia; Zamani, Elaine

    1989-01-01

    The main goal of the Jet Propulsion Laboratory (JPL) is to obtain science return from interplanetary probes. The uplink process is concerned with communicating commands to a spacecraft in order to achieve science objectives. There are two main parts to the development of the command file which is sent to a spacecraft. First, the activity planning process integrates the science requests for utilization of spacecraft time into a feasible sequence. Then the command generation process converts the sequence into a set of commands. The development of a feasible sequence plan is an expensive and labor-intensive process requiring many months of effort. In order to save time and manpower in the uplink process, automation of parts of this process is desired. There is an ongoing effort to develop automatic planning systems. This has met with some success, but has also been informative about the nature of the effort. It is now clear that innovative techniques and state-of-the-art technology will be required in order to produce a system which can provide automatic sequence planning. As part of this effort to develop automatic planning systems, we conducted a survey of the literature, looking for known techniques which may be applicable to our work. Descriptions of and references for these methods are given, together with ideas for applying the techniques to automatic planning.

  6. P19-S Managing Proteomics Data from Data Generation and Data Warehousing to Central Data Repository and Journal Reviewing Processes

    PubMed Central

    Thiele, H.; Glandorf, J.; Koerting, G.; Reidegeld, K.; Blüggel, M.; Meyer, H.; Stephan, C.

    2007-01-01

    In today's proteomics research, various techniques, instruments, and bioinformatics tools are necessary to manage the large amount of heterogeneous data, with automatic quality control, to produce reliable and comparable results. A data-processing pipeline is therefore mandatory for data validation and comparison in a data-warehousing system. The proteome bioinformatics platform ProteinScape has been proven to cover these needs. The reprocessing of HUPO BPP participants' MS data was done within ProteinScape, and the reprocessed information was transferred into the global data repository PRIDE. ProteinScape, as a data-warehousing system, covers two main aspects: archiving the relevant data of the proteomics workflow, and information extraction functionality (protein identification, quantification, and generation of biological knowledge). As a strategy for automatic data validation, different protein search engines are integrated. Result analysis is performed using a decoy database search strategy, which allows the false-positive identification rate to be measured. Peptide identifications across different workflows, different MS techniques, and different search engines are merged to obtain a quality-controlled protein list. The proteomics identifications database (PRIDE), as a public data repository, is an archiving system where data are finally stored and no longer changed by further processing steps. Data submission to PRIDE is open to proteomics laboratories generating protein and peptide identifications. An export tool has been developed for transferring all relevant HUPO BPP data from ProteinScape into PRIDE using the PRIDE XML format. The EU-funded ProDaC project will coordinate the development of software tools covering international standards for the representation of proteomics data. The implementation of data submission pipelines and systematic data collection in public standards-compliant repositories will cover all aspects, from the generation of MS data in each laboratory to the conversion of all the annotating information and identifications to a standardized format. Such datasets can be used in the course of publishing in scientific journals.
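
    The decoy-database strategy mentioned above reduces to simple arithmetic: the number of accepted decoy hits estimates the number of false positives among accepted target hits. A minimal sketch, with invented scores:

        # Sketch: estimate FDR from target and decoy search scores.
        def fdr_at_threshold(target_scores, decoy_scores, threshold):
            """Estimated FDR = accepted decoys / accepted targets above the cut."""
            t = sum(s >= threshold for s in target_scores)
            d = sum(s >= threshold for s in decoy_scores)
            return (d / t) if t else 0.0

        targets = [92, 80, 77, 74, 61, 55, 52, 40]  # scores vs. real database
        decoys = [63, 51, 44, 38, 30, 22, 18, 12]   # scores vs. reversed database

        # Raise the score threshold until the estimated FDR is acceptable.
        for cut in (40, 50, 60, 70):
            print(f"score >= {cut}: estimated FDR = "
                  f"{fdr_at_threshold(targets, decoys, cut):.2f}")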

  7. Landslide monitoring and early warning systems in Lower Austria - current situation and new developments

    NASA Astrophysics Data System (ADS)

    Thiebes, Benni; Glade, Thomas; Schweigl, Joachim; Jäger, Stefan; Canli, Ekrem

    2014-05-01

    Landslides represent significant hazards in the mountainous areas of Austria. The Regional Geological Surveys are responsible for informing and protecting the population and for mitigating damage to infrastructure. Efforts of the Regional Geological Survey of Lower Austria include detailed site investigations, the planning and installation of protective structures (e.g. rock fall nets), and preventive measures such as regional-scale landslide susceptibility assessments. For potentially endangered areas where protection works are not feasible, or would simply be too costly, monitoring systems have been installed. However, these systems are predominantly not automatic and require regular field visits to take measurements. It is therefore difficult to establish any relation between initiating and controlling factors, and thus to fully understand the underlying process mechanisms, which is essential for any early warning system. Consequently, the implementation of new state-of-the-art monitoring and early warning systems has been started. In this presentation, the design of four landslide monitoring and early warning systems is introduced. The investigated landslide process types include a deep-seated landslide, a rock fall site, a complex earth flow, and a debris flow catchment. The monitoring equipment was chosen according to the landslide processes and their activity; it aims to allow for detailed investigation of process mechanisms in relation to their triggers and for reliable prediction of future landslide activity. The deep-seated landslide will be investigated with manual and automatic inclinometers to gain detailed insights into subsurface displacements. In addition, TDR sensors and a weather station will be employed to better understand the influence of rainfall on subsurface hydrology. At the rockfall site, a wireless sensor network will be installed to provide real-time information on the acceleration and inclination of potentially unstable blocks. The movement of the earth flow will be monitored by differential GPS to obtain high-precision information on displacements of marked points, and photogrammetry based on octocopter surveys will provide spatial information on movement patterns. A similar approach will be followed for the debris flow catchment; here, the focus lies on monitoring the landslide failures in the source area, which prepare the material for subsequent debris flow transport. In addition to the methods already mentioned, repeated terrestrial laser scanning campaigns will be used to monitor geomorphological changes at all sites. All important data, whether single measurements, episodic or continuous monitoring data for a given point (e.g. rainfall, inclination), or data of spatial character (e.g. LiDAR measurements), are collected and analysed on an external server. Analysis methods, such as progressive failure analysis, are carried out automatically on the field measurements. The data and results from all monitoring sites are visualised on a web-based platform which enables registered users to analyse the respective information in near-real time. Moreover, thresholds can be defined which trigger automated warning messages to the involved scientists when field measurements exceed them. The described system will enable scientists and decision-makers to access the latest data from the monitoring systems, and automatic alarms are raised when thresholds are exceeded to inform them about potentially hazardous changes. Thereby, more efficient hazard management and early warning can be achieved. Keywords: landslide, rockfall, debris flow, earth flow, monitoring, early warning system.
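
    The threshold-alert step described above is simple to express. The sketch below is a generic illustration, not the project's software; the sensor names and limits are invented.

        # Sketch: compare the latest measurements against per-sensor thresholds.
        thresholds = {"rainfall_mm_24h": 80.0, "inclination_deg": 1.5, "displacement_mm": 12.0}
        latest = {"rainfall_mm_24h": 95.2, "inclination_deg": 0.4, "displacement_mm": 13.1}

        def exceeded(latest, thresholds):
            """Return (sensor, value, limit) for every threshold crossing."""
            return [(k, v, thresholds[k]) for k, v in latest.items() if v > thresholds[k]]

        for sensor, value, limit in exceeded(latest, thresholds):
            # a real system would e-mail or SMS the registered scientists here
            print(f"WARNING: {sensor} = {value} exceeds threshold {limit}")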

  8. A system for programming experiments and for recording and analyzing data automatically

    PubMed Central

    Herrick, Robert M.; Denelsbeck, John S.

    1963-01-01

    A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently-wired, electronic logic components, (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data. PMID:14055967

  9. Relationship of Automatic Data Processing Training Curriculum and Methodology in the Federal Government.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

    A conference, held in Washington, D. C., in 1967 by the Association for Educational Data Systems and the U.S. Office of Education, attempted to lay the groundwork for an efficient automatic data processing training program for the Federal Government utilizing new instructional methodologies. The rapid growth of computer applications and computer…

  10. Open-Source Programming for Automated Generation of Graphene Raman Spectral Maps

    NASA Astrophysics Data System (ADS)

    Vendola, P.; Blades, M.; Pierre, W.; Jedlicka, S.; Rotkin, S. V.

    Raman microscopy is a useful tool for studying the structural characteristics of graphene deposited onto substrates. However, extracting useful information from the Raman spectra requires data processing and 2D map generation. An existing home-built confocal Raman microscope was optimized for graphene samples and programmed to automatically generate Raman spectral maps across a specified area. In particular, an open source data collection scheme was generated to allow the efficient collection and analysis of the Raman spectral data for future use. NSF ECCS-1509786.
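
    The map-generation step reduces to integrating a spectral band at each raster position. The sketch below is a generic illustration with synthetic data; the G-band window near 1580 cm^-1 is an assumption about what one would map for graphene.

        # Sketch: build a 2-D Raman map by integrating a band per (x, y) pixel.
        import numpy as np

        wavenumbers = np.linspace(1200, 2900, 850)          # spectral axis, 1/cm
        nx, ny = 20, 20
        spectra = np.random.rand(nx, ny, wavenumbers.size)  # stand-in for measurements

        band = (wavenumbers > 1560) & (wavenumbers < 1600)  # G-band window (assumed)
        g_map = spectra[:, :, band].sum(axis=2)             # integrated intensity

        print(g_map.shape)  # (20, 20) image, ready to plot as a heat map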

  11. Observing control and data reduction at the UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, Alan; Economou, Frossie; Wright, Gillian S.; Currie, Malcolm J.

    1998-07-01

    For the past seven years, observing with the major instruments at the United Kingdom Infrared Telescope (UKIRT) has been semi-automated, using ASCII files to configure the instruments and then sequence a series of exposures and telescope movements to acquire the data. For one instrument, automatic data reduction completes the cycle. The emergence of recent software technologies has suggested an evolution of this successful system to provide a friendlier and more powerful interface to observing at UKIRT. The Observatory Reduction and Acquisition Control (ORAC) project is now underway to construct this system. A key aim of ORAC is to allow a more complete description of the observing program, including the target sources and the recipe that will be used to provide on-line data reduction. Remote observation preparation and submission will also be supported. In parallel, the observatory control system will be upgraded to use these descriptions for more automatic observing, while retaining the 'classical' interactive observing mode. The final component of the project is an improved automatic data reduction system, allowing on-line reduction of data at the telescope while retaining the flexibility to cope with changing observing techniques and instruments. The user will also automatically be provided with the scripts used for the real-time reduction, to help support post-observing data reduction. The overall project goal is to improve the scientific productivity of the telescope, but it should also reduce the ongoing support requirements, and it has the eventual goal of supporting queue-scheduled observing.

  12. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced the data files that were infused into the ISDS. Real-time EGSEs include an ICDS simulator, calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The calibration GSE (Radiometer Active Test Source) provides a choice among multiple targets for the radiometer's external calibration. The power supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry, and auxiliary data in near-real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.

  13. Solving Common Mathematical Problems

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Mathematical Solutions Toolset is a collection of five software programs that rapidly solve some common mathematical problems. The programs consist of a set of Microsoft Excel worksheets. The programs provide for entry of input data and display of output data in a user-friendly, menu-driven format, and for automatic execution once the input data has been entered.

  14. Achieving continuous improvement in laboratory organization through performance measurements: a seven-year experience.

    PubMed

    Salinas, Maria; López-Garrigós, Maite; Gutiérrez, Mercedes; Lugo, Javier; Sirvent, Jose Vicente; Uris, Joaquin

    2010-01-01

    Laboratory performance can be measured using a set of model key performance indicators (KPIs). The design and implementation of KPIs are important issues. KPI results from seven years are reported, and their implementation, monitoring, objectives, interventions, result reporting, and delivery are analyzed. The KPIs of the entire laboratory process were obtained from Laboratory Information System (LIS) registers. These were collected automatically using a data warehouse application, spreadsheets, and external quality program reports. Customer satisfaction was assessed using surveys. Nine model laboratory KPIs were proposed and measured, and the results of some examples of KPIs used in our laboratory are reported. Corrective measures or the implementation of objectives led to improvement in the associated KPI results. Measurement of laboratory performance using KPIs and a data warehouse application that continuously collects registers and calculates KPIs confirmed the reliability of the indicators, their acceptability and usability for users, and continuous process improvement.

  15. 78 FR 46419 - Proposed Information Collection (Application for Authority To Close Loans on an Automatic Basis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0252] Proposed Information Collection (Application for Authority To Close Loans on an Automatic Basis--Nonsupervised Lenders) Activity: Comment... solicits comments for information needed to authorize nonsupervised lenders to close loans on an automatic...

  16. An automatic locating system for cloud-to-ground lightning. [which utilizes a microcomputer

    NASA Technical Reports Server (NTRS)

    Krider, E. P.; Pifer, A. E.; Uman, M. A.

    1980-01-01

    Automatic locating systems which respond to cloud-to-ground lightning and which discriminate against cloud discharges and background noise are described. Subsystems of the locating system, which include the direction finder and the position analyzer, are discussed. The direction finder senses the electromagnetic fields radiated by lightning on two orthogonal magnetic loop antennas and on a flat-plate electric antenna. The position analyzer is a preprogrammed microcomputer system which automatically computes, maps, and records lightning locations in real time using data inputs from the direction finder. The use of the locating systems for wildfire management and fire weather forecasting is discussed.

  17. Computer-aided personal interviewing. A new technique for data collection in epidemiologic surveys.

    PubMed

    Birkett, N J

    1988-03-01

    Most epidemiologic studies involve the collection of data directly from selected respondents. Traditionally, interviewers are provided with the interview in booklet form on paper and answers are recorded therein. On receipt at the study office, the interview results are coded, transcribed, and keypunched for analysis. The author's team has developed a method of personal interviewing which uses a structured interview stored on a lap-sized computer. Responses are entered into the computer and are subject to immediate error-checking and correction. All skip-patterns are automatic. Data entry to the final data-base involves no manual data transcription. A pilot evaluation with a preliminary version of the system using tape-recorded interviews in a test/re-test methodology revealed a slightly higher error rate, probably related to weaknesses in the pilot system and the training process. Computer interviews tended to be longer but other features of the interview process were not affected by computer. The author's team has now completed 2,505 interviews using this system in a community-based blood pressure survey. It has been well accepted by both interviewers and respondents. Failure to complete an interview on the computer was uncommon (5 per cent) and well-handled by paper back-up questionnaires. The results show that computer-aided personal interviewing in the home is feasible but that further evaluation is needed to establish the impact of this methodology on overall data quality.

  18. Speeding response, saving lives : automatic vehicle location capabilities for emergency services.

    DOT National Transportation Integrated Search

    1999-01-01

    Information from automatic vehicle location systems, when combined with computer-aided dispatch software, can provide a rich source of data for analyzing emergency vehicle operations and evaluating agency performance.

  19. The use of automatic programming techniques for fault tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Wild, C.

    1985-01-01

    It is conjectured that the production of software for ultra-reliable computing systems such as required by Space Station, aircraft, nuclear power plants and the like will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection as well as the automatic generation of assertions and test cases from abstract data type specifications is outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  20. Guidelines for Automatic Data Processing Physical Security and Risk Management. Federal Information Processing Standards Publication 31.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC.

    These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…
