Sample records for computer monitor means

  1. Effects of Dual Monitor Computer Work Versus Laptop Work on Cervical Muscular and Proprioceptive Characteristics of Males and Females.

    PubMed

    Farias Zuniga, Amanda M; Côté, Julie N

    2017-06-01

    The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computer sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain (p < .01) and right upper trapezius RMS (p = .03) increased significantly over time regardless of workstation. Right cervical erector spinae RMS and cervical NMI were smaller, while degrees of overshoot (mean = 4.15°) and end position error (mean = 1.26°) were larger in DualMon regardless of time. Effects on muscle activity were more pronounced in males, whereas effects on proprioception were more pronounced in females. Results suggest that compared to laptop work, DualMon work is effective in reducing cervical muscle activity, dissociating cervical connectivity, and maintaining more typical neck repositioning patterns, suggesting some health-protective effects. This evidence could be considered when deciding on computer workstation designs.
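
    For illustration, a minimal numpy sketch of how the reported EMG amplitude (RMS) and variability (CV) metrics can be computed from a raw trace. The window length, sampling rate, and function name are assumptions, not the study's actual pipeline:

```python
import numpy as np

def emg_rms_cv(signal, fs, window_s=1.0):
    """Windowed RMS amplitude and its coefficient of variation (CV).

    signal : 1-D EMG trace (assumed already band-pass filtered upstream)
    fs     : sampling rate in Hz
    """
    n = int(fs * window_s)
    windows = signal[: len(signal) // n * n].reshape(-1, n)
    rms = np.sqrt(np.mean(windows ** 2, axis=1))   # amplitude per window
    cv = np.std(rms) / np.mean(rms) * 100.0        # variability, in percent
    return rms, cv

# Example: 90 minutes of synthetic 1 kHz EMG
rng = np.random.default_rng(0)
emg = rng.normal(0, 0.05, size=90 * 60 * 1000)
rms, cv = emg_rms_cv(emg, fs=1000)
print(f"mean RMS = {rms.mean():.4f}, CV = {cv:.1f}%")
```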

  2. Computer controlled fluorometer device and method of operating same

    DOEpatents

    Kolber, Z.; Falkowski, P.

    1990-07-17

    A computer controlled fluorometer device and method of operating same, said device being made to include a pump flash source and a probe flash source and one or more sample chambers in combination with a light condenser lens system and associated filters and reflectors and collimators, as well as signal conditioning and monitoring means and a programmable computer means and a software programmable source of background irradiance that is operable according to the method of the invention to rapidly, efficiently and accurately measure photosynthetic activity by precisely monitoring and recording changes in fluorescence yield produced by a controlled series of predetermined cycles of probe and pump flashes from the respective probe and pump sources that are controlled by the computer means. 13 figs.

  3. Computer controlled fluorometer device and method of operating same

    DOEpatents

    Kolber, Zbigniew; Falkowski, Paul

    1990-01-01

    A computer controlled fluorometer device and method of operating same, said device being made to include a pump flash source and a probe flash source and one or more sample chambers in combination with a light condenser lens system and associated filters and reflectors and collimators, as well as signal conditioning and monitoring means and a programmable computer means and a software programmable source of background irradiance that is operable according to the method of the invention to rapidly, efficiently and accurately measure photosynthetic activity by precisely monitoring and recording changes in fluorescence yield produced by a controlled series of predetermined cycles of probe and pump flashes from the respective probe and pump sources that are controlled by the computer means.

  4. Unobtrusive measurement of daily computer use to detect mild cognitive impairment

    PubMed Central

    Kaye, Jeffrey; Mattek, Nora; Dodge, Hiroko H; Campbell, Ian; Hayes, Tamara; Austin, Daniel; Hatt, William; Wild, Katherine; Jimison, Holly; Pavel, Michael

    2013-01-01

    Background: Mild disturbances of higher order activities of daily living are present in people diagnosed with mild cognitive impairment (MCI). These deficits may be difficult to detect among those still living independently. Unobtrusive continuous assessment of a complex activity such as home computer use may detect mild functional changes and identify MCI. We sought to determine whether long-term changes in remotely monitored computer use differ in persons with MCI in comparison to cognitively intact volunteers. Methods: Participants enrolled in a longitudinal cohort study of unobtrusive in-home technologies to detect cognitive and motor decline in independently living seniors were assessed for computer usage (number of days with use, mean daily usage and coefficient of variation of use) measured by remotely monitoring computer session start and end times. Results: Over 230,000 computer sessions from 113 computer users (mean age, 85; 38 with MCI) were acquired during a mean of 36 months. In mixed effects models there was no difference in computer usage at baseline between MCI and intact participants controlling for age, sex, education, race and computer experience. However, over time, between MCI and intact participants, there was a significant decrease in number of days with use (p=0.01), mean daily usage (~1% greater decrease/month; p=0.009) and an increase in day-to-day use variability (p=0.002). Conclusions: Computer use change can be unobtrusively monitored and indicate individuals with MCI. With 79% of those 55–64 years old now online, this may be an ecologically valid and efficient approach to track subtle clinically meaningful change with aging. PMID:23688576

  5. Unobtrusive measurement of daily computer use to detect mild cognitive impairment.

    PubMed

    Kaye, Jeffrey; Mattek, Nora; Dodge, Hiroko H; Campbell, Ian; Hayes, Tamara; Austin, Daniel; Hatt, William; Wild, Katherine; Jimison, Holly; Pavel, Michael

    2014-01-01

    Mild disturbances of higher order activities of daily living are present in people diagnosed with mild cognitive impairment (MCI). These deficits may be difficult to detect among those still living independently. Unobtrusive continuous assessment of a complex activity such as home computer use may detect mild functional changes and identify MCI. We sought to determine whether long-term changes in remotely monitored computer use differ in persons with MCI in comparison with cognitively intact volunteers. Participants enrolled in a longitudinal cohort study of unobtrusive in-home technologies to detect cognitive and motor decline in independently living seniors were assessed for computer use (number of days with use, mean daily use, and coefficient of variation of use) measured by remotely monitoring computer session start and end times. More than 230,000 computer sessions from 113 computer users (mean age, 85 years; 38 with MCI) were acquired during a mean of 36 months. In mixed-effects models, there was no difference in computer use at baseline between MCI and intact participants controlling for age, sex, education, race, and computer experience. However, over time, between MCI and intact participants, there was a significant decrease in number of days with use (P = .01), mean daily use (∼1% greater decrease/month; P = .009), and an increase in day-to-day use variability (P = .002). Computer use change can be monitored unobtrusively and indicates individuals with MCI. With 79% of those 55 to 64 years old now online, this may be an ecologically valid and efficient approach to track subtle, clinically meaningful change with aging. Copyright © 2014 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
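
    To make the usage metrics concrete, a small pandas sketch computing the three reported measures (days with use, mean daily use, day-to-day coefficient of variation) from hypothetical session start/end times:

```python
import pandas as pd

# Hypothetical session log: one row per remotely monitored computer session
sessions = pd.DataFrame({
    "start": pd.to_datetime(["2013-01-01 09:00", "2013-01-01 20:15",
                             "2013-01-03 10:30"]),
    "end":   pd.to_datetime(["2013-01-01 09:40", "2013-01-01 20:45",
                             "2013-01-03 11:10"]),
})
sessions["minutes"] = (sessions["end"] - sessions["start"]).dt.total_seconds() / 60

# Reindex over the calendar so days without any session count as zero use
daily = (sessions.groupby(sessions["start"].dt.date)["minutes"].sum()
         .reindex(pd.date_range("2013-01-01", "2013-01-03").date, fill_value=0.0))

days_with_use = int((daily > 0).sum())      # number of days with use
mean_daily_use = daily.mean()               # mean daily use, minutes
cv_of_use = daily.std() / daily.mean()      # day-to-day variability (CV)
print(days_with_use, round(mean_daily_use, 1), round(cv_of_use, 2))
```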

  6. Computer vision-based analysis of foods: a non-destructive colour measurement tool to monitor quality and safety.

    PubMed

    Mogol, Burçe Ataç; Gökmen, Vural

    2014-05-01

    Computer vision-based image analysis has been widely used in the food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with acrylamide content of potato chips or cookies. Alternatively, the porosity index, an important physical property of breadcrumb, can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
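
    As an illustration of the mean-colour approach, a short sketch with scikit-image that converts a photo to CIE Lab and averages the a* channel. It assumes an sRGB image with the background already cropped or masked; real pipelines segment the product from the background first:

```python
from skimage import io, color

def mean_cie_a_star(path):
    """Mean CIE a* (green-red axis) over a food image."""
    rgb = io.imread(path)             # H x W x 3
    lab = color.rgb2lab(rgb)          # skimage rescales uint8 internally
    return float(lab[..., 1].mean())  # channel 1 is a*
```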

  7. Design of a specialized computer for on-line monitoring of cardiac stroke volume

    NASA Technical Reports Server (NTRS)

    Webb, J. A., Jr.; Gebben, V. D.

    1972-01-01

    The design of a specialized analog computer for on-line determination of cardiac stroke volume by means of a modified version of the pressure pulse contour method is presented. The design consists of an analog circuit for computation and a timing circuit for detecting necessary events on the pressure waveform. Readouts of arterial pressures, systolic duration, heart rate, percent change in stroke volume, and percent change in cardiac output are provided for monitoring cardiac patients. Laboratory results showed that computational accuracy was within 3 percent, while animal experiments verified the operational capability of the computer. Patient safety considerations are also discussed.
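
    The pressure-pulse contour idea reduces, in its simplest form, to integrating the arterial pressure above the diastolic level over systole. The sketch below omits the modified method's corrections and assumes the systole bounds have already been detected by the timing circuit:

```python
import numpy as np

def relative_stroke_volume(p, fs, sys_start, sys_end, p_dia):
    """Stroke volume estimate up to a calibration constant k:
    SV ~ k * integral of (P(t) - P_dia) over systole.

    p                  : arterial pressure samples (mmHg)
    fs                 : sampling rate (Hz)
    sys_start, sys_end : sample indices of systolic upstroke / dicrotic notch
    p_dia              : diastolic pressure (mmHg)
    """
    seg = p[sys_start:sys_end]
    return np.trapz(seg - p_dia, dx=1.0 / fs)   # mmHg*s, scaled by k later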

  8. Ubiquitous human computing.

    PubMed

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  9. 48 CFR 23.701 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System... Contracting for Environmentally Preferable Products and Services, 23.701 Definitions. As used in this subpart— Computer monitor means a video display unit used with a computer. Desktop...

  10. Electromagnetic tracking of motion in the proximity of computer generated graphical stimuli: a tutorial.

    PubMed

    Schnabel, Ulf H; Hegenloh, Michael; Müller, Hermann J; Zehetleitner, Michael

    2013-09-01

    Electromagnetic motion-tracking systems have the advantage of capturing the tempo-spatial kinematics of movements independently of the visibility of the sensors. However, they are limited in that they cannot be used in the proximity of electromagnetic field sources, such as computer monitors. This prevents exploiting the tracking potential of the sensor system together with that of computer-generated visual stimulation. Here we present a solution for presenting computer-generated visual stimulation that does not distort the electromagnetic field required for precise motion tracking, by means of a back projection medium. In one experiment, we verify that cathode ray tube monitors, as well as thin-film-transistor monitors, distort electro-magnetic sensor signals even at a distance of 18 cm. Our back projection medium, by contrast, leads to no distortion of the motion-tracking signals even when the sensor is touching the medium. This novel solution permits combining the advantages of electromagnetic motion tracking with computer-generated visual stimulation.

  11. Energy Use and Power Levels in New Monitors and Personal Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of contribution to the overall unit energy consumption (UEC). Current ENERGY STAR monitor and computer criteria do not specify off or on power, but our results suggest opportunities for saving energy in these modes. Also, significant differences between CRT and LCD technology, and between field-measured and manufacturer-reported power levels, reveal the need for standard methods and metrics for measuring and comparing monitor power consumption.
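
    For concreteness, unit energy consumption (UEC) is just a duty-cycle-weighted sum of the measured power levels. The hours and wattages below are invented to show why the on and off modes dominate once sleep power is very low:

```python
# Hypothetical duty cycle (hours/year in each mode) and measured power (W)
hours = {"on": 2000, "sleep": 1800, "off": 4960}   # assumed usage profile
power = {"on": 65.0, "sleep": 2.0, "off": 1.5}     # field-measured levels

uec_kwh = sum(hours[m] * power[m] for m in hours) / 1000.0
print(f"unit energy consumption ~ {uec_kwh:.0f} kWh/year")
# With sleep power this low, 'on' and 'off' dominate the UEC, which is
# the trend the study reports.
```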

  12. A handheld computer as part of a portable in vivo knee joint load monitoring system

    PubMed Central

    Szivek, JA; Nandakumar, VS; Geffre, CP; Townsend, CP

    2009-01-01

    In vivo measurement of loads and pressures acting on articular cartilage in the knee joint during various activities and rehabilitative therapies following focal defect repair will provide a means of designing activities that encourage faster and more complete healing of focal defects. It was the goal of this study to develop a totally portable monitoring system that could be used during various activities and allow continuous monitoring of forces acting on the knee. In order to make the monitoring system portable, a handheld computer with custom software, a USB-powered miniature wireless receiver, and a battery-powered coil were developed to replace a currently used computer, AC-powered benchtop receiver, and power supply. A Dell handheld running the Windows Mobile operating system (OS), programmed using LabVIEW, was used to collect strain measurements. Measurements collected by the handheld-based system connected to the miniature wireless receiver were compared with the measurements collected by a hardwired system and a computer-based system during benchtop testing and in vivo testing. The newly developed handheld-based system had a maximum accuracy of 99% when compared to the computer-based system. PMID:19789715

  13. Research and implementation of monitoring technology for illegal external connections of classified computers

    NASA Astrophysics Data System (ADS)

    Zhang, Hong

    2017-06-01

    In recent years, with the continuous development and application of network technology, network security has gradually entered the public's field of vision. Unauthorized external connections from hosts on an internal network are a major source of network security threats. At present, most organizations pay a certain degree of attention to network security and have adopted many means and methods to prevent security problems, such as physically isolating the internal network and installing firewalls at the network exit. However, these measures are often undermined by human behavior that violates the safety rules. For example, a host that accesses the Internet over a wireless connection, or through a second network card, inadvertently forms a two-way connection between the external network and the internal computer [1]. As a result, important and confidential documents can leak even in circumstances of which the user is completely unaware. Out-of-band monitoring of classified computers can largely prevent such violations by monitoring the behavior of the offending connection. In this paper, we mainly research and discuss this monitoring technology for classified computers.

  14. The design of an m-Health monitoring system based on a cloud computing platform

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  15. Multisite Testing of the Discrete Address Beacon System (DABS).

    DTIC Science & Technology

    1981-07-01

    ...downlink messages from an airborne transponder, in addition to performing the lockout function. Each sensor may provide surveillance and... position and velocity. Depending upon the means used for scenario generation,... a distributed computer system containing 36 minicomputers, most of which are organized into groups (or ensembles) of four computers interfaced to a local data bus... computer subsystem, which monitors in real time all communication and aircraft...

  16. Introduction to Digital Logic Systems for Energy Monitoring and Control Systems.

    DTIC Science & Technology

    1985-05-01

    computer were first set down by Charles Babbage in 1830. An additional criterion was proposed by Von Neumann in 1947. These criteria state: (1) an input means... criteria requirements as set down by Babbage and Von Neumann. The computer equipment ("hardware") and internal operating system ("software")...

  17. User guide to a command and control system; a part of a prelaunch wind monitoring program

    NASA Technical Reports Server (NTRS)

    Cowgill, G. R.

    1976-01-01

    This user guide describes a set of programs called the Command and Control System (CCS) and its operation by the personnel supporting the wind monitoring portion of the launch mission. Wind data obtained by tracking balloons are sent by electronic means over telephone lines to other locations. A system called ADDJUST computes steering commands for the on-board computer and relays these data. Data are received and automatically stored in a microprocessor, then transferred via a real time program to the UNIVAC 1100/40 computer. At this point the data are available to be used by the Command and Control System.

  18. Analyzing Dental Implant Sites From Cone Beam Computed Tomography Scans on a Tablet Computer: A Comparative Study Between iPad and 3 Display Systems.

    PubMed

    Carrasco, Alejandro; Jalali, Elnaz; Dhingra, Ajay; Lurie, Alan; Yadav, Sumit; Tadinada, Aditya

    2017-06-01

    The aim of this study was to compare a medical-grade PACS (picture archiving and communication system) monitor, a consumer-grade monitor, a laptop computer, and a tablet computer for linear measurements of height and width for specific implant sites in the posterior maxilla and mandible, along with visualization of the associated anatomical structures. Cone beam computed tomography (CBCT) scans were evaluated. The images were reviewed using PACS-LCD monitor, consumer-grade LCD monitor using CB-Works software, a 13″ MacBook Pro, and an iPad 4 using OsiriX DICOM reader software. The operators had to identify anatomical structures in each display using a 2-point scale. User experience between PACS and iPad was also evaluated by means of a questionnaire. The measurements were very similar for each device. P-values were all greater than 0.05, indicating no significant difference between the monitors for each measurement. The intraoperator reliability was very high. The user experience was similar in each category with the most significant difference regarding the portability where the PACS display received the lowest score and the iPad received the highest score. The iPad with retina display was comparable with the medical-grade monitor, producing similar measurements and image visualization, and thus providing an inexpensive, portable, and reliable screen to analyze CBCT images in the operating room during the implant surgery.

  19. Monitoring task loading with multivariate EEG measures during complex forms of human-computer interaction

    NASA Technical Reports Server (NTRS)

    Smith, M. E.; Gevins, A.; Brown, H.; Karnik, A.; Du, R.

    2001-01-01

    Electroencephalographic (EEG) recordings were made while 16 participants performed versions of a personal-computer-based flight simulation task of low, moderate, or high difficulty. As task difficulty increased, frontal midline theta EEG activity increased and alpha band activity decreased. A participant-specific function that combined multiple EEG features to create a single load index was derived from a sample of each participant's data and then applied to new test data from that participant. Index values were computed for every 4 s of task data. Across participants, mean task load index values increased systematically with increasing task difficulty and differed significantly between the different task versions. Actual or potential applications of this research include the use of multivariate EEG-based methods to monitor task loading during naturalistic computer-based work.
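
    A toy version of the index construction: combine z-scored frontal midline theta power (which rises with load) and alpha power (which falls) into one scalar per 4-s epoch. The equal weights are placeholders for the participant-specific function the study derives from training data:

```python
import numpy as np

def load_index(theta_fm, alpha):
    """One load value per 4-s epoch: z-scored frontal-midline theta power
    minus z-scored alpha power. Equal weights stand in for the
    participant-specific function fitted in the study."""
    z = lambda v: (np.asarray(v) - np.mean(v)) / np.std(v)
    return z(theta_fm) - z(alpha)

# Synthetic epochs: theta rises and alpha falls as difficulty increases
theta = [1.0, 1.1, 1.4, 1.6]
alpha = [2.0, 1.8, 1.5, 1.2]
print(load_index(theta, alpha))   # monotonically increasing index
```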

  20. Error Analysis of Indirect Broadband Monitoring of Multilayer Optical Coatings using Computer Simulations

    NASA Astrophysics Data System (ADS)

    Semenov, Z. V.; Labusov, V. A.

    2017-11-01

    Results of studying the errors of indirect monitoring by means of computer simulations are reported. The monitoring method is based on measuring spectra of reflection from additional monitoring substrates in a wide spectral range. Special software (Deposition Control Simulator) is developed, which allows one to estimate the influence of the monitoring system parameters (noise of the photodetector array, operating spectral range of the spectrometer and errors of its calibration in terms of wavelengths, drift of the radiation source intensity, and errors in the refractive index of deposited materials) on the random and systematic errors of deposited layer thickness measurements. The direct and inverse problems of multilayer coatings are solved using the OptiReOpt library. Curves of the random and systematic errors of measurements of the deposited layer thickness as functions of the layer thickness are presented for various values of the system parameters. Recommendations are given on using the indirect monitoring method for the purpose of reducing the layer thickness measurement error.
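
    The error-simulation idea can be reduced to a toy Monte Carlo: simulate noisy reflectance spectra from a single-layer model, re-fit the thickness, and inspect the spread. Everything here (refractive indices, spectral range, noise level) is an assumed stand-in for the paper's Deposition Control Simulator, which handles full multilayer stacks via the OptiReOpt library:

```python
import numpy as np

def reflectance(d, lam, n_film=2.35, n_sub=1.52):
    """Normal-incidence reflectance of one dielectric layer on a substrate."""
    r01 = (1.0 - n_film) / (1.0 + n_film)        # air-film interface
    r12 = (n_film - n_sub) / (n_film + n_sub)    # film-substrate interface
    phase = np.exp(-2j * (2.0 * np.pi * n_film * d / lam))
    r = (r01 + r12 * phase) / (1.0 + r01 * r12 * phase)
    return np.abs(r) ** 2

lam = np.linspace(400e-9, 900e-9, 256)   # monitoring spectral range (assumed)
d_true = 120e-9                          # true deposited thickness
rng = np.random.default_rng(1)

def fit_thickness(measured):
    grid = np.linspace(60e-9, 200e-9, 701)
    sse = [np.sum((reflectance(d, lam) - measured) ** 2) for d in grid]
    return grid[int(np.argmin(sse))]

# Random/systematic error of the indirect measurement under detector noise
est = np.array([fit_thickness(reflectance(d_true, lam)
                              + rng.normal(0, 0.002, lam.size))
                for _ in range(100)])
print(f"systematic error = {est.mean() - d_true:.2e} m, "
      f"random error (std) = {est.std():.2e} m")
```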

  1. Construction of Expert Knowledge Monitoring and Assessment System Based on Integral Method of Knowledge Evaluation

    ERIC Educational Resources Information Center

    Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.

    2016-01-01

    Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…

  2. Intelligent Systems for Assessing Aging Changes: Home-Based, Unobtrusive, and Continuous Assessment of Aging

    PubMed Central

    Maxwell, Shoshana A.; Mattek, Nora; Hayes, Tamara L.; Dodge, Hiroko; Pavel, Misha; Jimison, Holly B.; Wild, Katherine; Boise, Linda; Zitzelberger, Tracy A.

    2011-01-01

    Objectives. To describe a longitudinal community cohort study, Intelligent Systems for Assessing Aging Changes, which has deployed an unobtrusive home-based assessment platform in many seniors' homes in the existing community. Methods. Several types of sensors have been installed in the homes of 265 elderly persons for an average of 33 months. Metrics assessed by the sensors include total daily activity, time out of home, and walking speed. Participants were given a computer as well as training, and computer usage was monitored. Participants are assessed annually with health and function questionnaires, physical examinations, and neuropsychological testing. Results. Mean age was 83.3 years, mean years of education was 15.5, and 73% of the cohort were women. During a 4-week snapshot, participants left their home twice a day on average for a total of 208 min per day. Mean in-home walking speed was 61.0 cm/s. Participants spent 43% of days on the computer averaging 76 min per day. Discussion. These results demonstrate for the first time the feasibility of engaging seniors in a large-scale deployment of in-home activity assessment technology and the successful collection of these activity metrics. We plan to use this platform to determine if continuous unobtrusive monitoring may detect incident cognitive decline. PMID:21743050

  3. Interpreting text messages with graphic facial expression by deaf and hearing people.

    PubMed

    Saegusa, Chihiro; Namatame, Miki; Watanabe, Katsumi

    2015-01-01

    In interpreting verbal messages, humans use not only verbal information but also non-verbal signals such as facial expression. For example, when a person says "yes" with a troubled face, what he or she really means appears ambiguous. In the present study, we examined how deaf and hearing people differ in perceiving real meanings in texts accompanied by representations of facial expression. Deaf and hearing participants were asked to imagine that the face presented on the computer monitor was asked a question from another person (e.g., do you like her?). They observed either a realistic or a schematic face with a different magnitude of positive or negative expression on a computer monitor. A balloon that contained either a positive or negative text response to the question appeared at the same time as the face. Then, participants rated how much the individual on the monitor really meant it (i.e., perceived earnestness), using a 7-point scale. Results showed that the facial expression significantly modulated the perceived earnestness. The influence of positive expression on negative text responses was relatively weaker than that of negative expression on positive responses (i.e., "no" tended to mean "no" irrespective of facial expression) for both participant groups. However, this asymmetrical effect was stronger in the hearing group. These results suggest that the contribution of facial expression in perceiving real meanings from text messages is qualitatively similar but quantitatively different between deaf and hearing people.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nitao, J J

    The goal of the Event Reconstruction Project is to find the location and strength of atmospheric release points, both stationary and moving. Source inversion relies on observational data as input. The methodology is sufficiently general to allow various forms of data. In this report, the authors focus primarily on concentration measurements obtained at point monitoring locations at various times. The algorithms being investigated in the Project are MCMC (Markov Chain Monte Carlo) and SMC (Sequential Monte Carlo) methods, classical inversion methods, and hybrids of these. They refer the reader to the report by Johannesson et al. (2004) for explanations of these methods. These methods require computing the concentrations at all monitoring locations for a given "proposed" source characteristic (location and strength history). It is anticipated that the largest portion of the CPU time will be spent performing this computation. MCMC and SMC will require this computation to be done at least tens of thousands of times. Therefore, an efficient means of computing forward model predictions is important to making the inversion practical. In this report they show how Green's functions and reciprocal Green's functions can significantly accelerate forward model computations. First, instead of computing a plume for each possible source strength history, they compute plumes from unit impulse sources only. By using linear superposition, they can obtain the response for any strength history. This response is given by the forward Green's function. Second, they may use the law of reciprocity. Suppose that they require the concentration at a single monitoring point x_m due to a potential (unit impulse) source located at x_s. Instead of computing a plume with source location x_s, they compute a "reciprocal plume" whose (unit impulse) source is at the monitoring location x_m. The reciprocal plume is computed using a reversed-direction wind field; the wind field and transport coefficients must also be appropriately time-reversed. Reciprocity says that the concentration of the reciprocal plume at x_s is related to the desired concentration at x_m. Since there are far fewer monitoring points than potential source locations, the number of forward model computations is drastically reduced.
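
    The superposition step can be shown in a few lines: once the unit-impulse (Green's function) response at a monitor is known, the concentration time series for any proposed strength history is a discrete convolution. The numbers below are made up for illustration:

```python
import numpy as np

# G[k]: concentration at monitor x_m, k steps after a unit impulse at a
# candidate source x_s (from one forward run, or one reciprocal run with
# the time-reversed wind field).
G = np.array([0.0, 0.8, 0.5, 0.2, 0.05])

# q[k]: proposed source strength history (mass released per step)
q = np.array([1.0, 2.0, 0.0, 0.5])

# Linear superposition: c(t) = sum_k q[k] * G[t - k], i.e. a convolution.
# Each new strength history costs one convolve, not one plume simulation.
c = np.convolve(q, G)
print(c)
```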

  5. Quantification of abnormal intracranial pressure waves and isotope cisternography for diagnosis of occult communicating hydrocephalus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoso, E.R.; Piatek, D.; Del Bigio, M.R.

    1989-01-01

    Nineteen consecutive patients with suspected occult communicating hydrocephalus were investigated by means of clinical evaluation, neuropsychological testing, isotope cisternography, computed tomography scanning, and continuous intracranial pressure monitoring. Semi-quantitative grading systems were used in the evaluation of the clinical, neuropsychological, and cisternographic assessments. Clinical examination, neuropsychological testing, and computed tomography scanning were repeated 3 months after ventriculoperitoneal shunting. All patients showed abnormal intracranial pressure waves and all improved after shunting. There was close correlation between number, peak, and pulse pressures of B waves and the mean intracranial pressure. However, quantification of B waves by means of number, frequency, and amplitude did not help in predicting the degree of clinical improvement postshunting. The most sensitive predictor of favorable response to shunting was enlargement of the temporal horns on computed tomography scan. Furthermore, the size of temporal horns correlated with mean intracranial pressure. There was no correlation between abnormalities on isotope cisternography and clinical improvement.

  6. Monitoring and decision making by people in man machine systems

    NASA Technical Reports Server (NTRS)

    Johannsen, G.

    1979-01-01

    The analysis of human monitoring and decision making behavior, as well as its modeling, is described. Classical and optimal-control-theoretic monitoring models are surveyed. The relationship between attention allocation and eye movements is discussed. As an example of applications, the evaluation of predictor displays by means of the optimal control model is explained. Fault detection involving continuous signals and decision making behavior of a human operator engaged in fault diagnosis during different operation and maintenance situations are illustrated. Computer-aided decision making is considered as a queueing problem. It is shown to what extent computer aids can be based on the state of human activity as measured by psychophysiological quantities. Finally, management information systems for different application areas are mentioned. The possibilities of mathematical modeling of human behavior in complex man-machine systems are also critically assessed.

  7. Low-cost, email-based system for self blood pressure monitoring at home.

    PubMed

    Nakajima, Kazuki; Nambu, Masayuki; Kiryu, Tohru; Tamura, Toshiyo; Sasaki, Kazuo

    2006-01-01

    We have developed a low-cost monitoring system, which allows subjects to send blood pressure (BP) data obtained at home to health-care professionals by email. The system consists of a wrist BP monitor and a personal computer (PC) with an Internet connection. The wrist BP monitor includes an advanced positioning sensor to verify that the wrist is placed properly at heart level. Subjects at home can self-measure their BP every day, automatically transfer the BP data to their PC each week, and then send a comma-separated values (CSV) file to their health-care professional by email. In a feasibility study, 10 subjects used the system for a mean period of 207 days (SD 149). The mean percent achievement of measurement in the 10 subjects was 84% (SD 12). There was a seasonal variation in systolic and diastolic BP, which was inversely correlated with temperature. Eight of the 10 subjects evaluated the system favourably. The results of the present study demonstrate the feasibility of our email-based system for self-monitoring of blood pressure. Its low cost means that it may have widespread application in future home telecare studies.
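
    As a sketch of the reporting pathway (file layout, addresses, server, and credentials are all placeholders), the weekly CSV-by-email step might look like this, using only the Python standard library:

```python
import csv, smtplib, ssl
from email.message import EmailMessage

readings = [("2006-01-01", 128, 82), ("2006-01-02", 131, 85)]  # date, sys, dia

with open("bp_week.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "systolic_mmHg", "diastolic_mmHg"])
    writer.writerows(readings)

msg = EmailMessage()
msg["Subject"] = "Weekly BP report"
msg["From"] = "patient@example.com"              # placeholder addresses
msg["To"] = "clinic@example.com"
msg.set_content("Attached: this week's blood pressure readings.")
with open("bp_week.csv", "rb") as f:
    msg.add_attachment(f.read(), maintype="text", subtype="csv",
                       filename="bp_week.csv")

with smtplib.SMTP_SSL("smtp.example.com", 465,
                      context=ssl.create_default_context()) as s:
    s.login("patient@example.com", "app-password")  # placeholder credentials
    s.send_message(msg)
```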

  8. The vertical monitor position for presbyopic computer users with progressive lenses: how to reach clear vision and comfortable head posture.

    PubMed

    Weidling, Patrick; Jaschinski, Wolfgang

    2015-01-01

    When presbyopic employees are wearing general-purpose progressive lenses, they have clear vision only with a lower gaze inclination to the computer monitor, given the head assumes a comfortable inclination. Therefore, in the present intervention field study the monitor position was lowered, also with the aim to reduce musculoskeletal symptoms. A comparison group comprised users of lenses that do not restrict the field of clear vision. The lower monitor positions led the participants to lower their head inclination, which was linearly associated with a significant reduction in musculoskeletal symptoms. However, for progressive lenses a lower head inclination means a lower zone of clear vision, so that clear vision of the complete monitor was not achieved, rather the monitor should have been placed even lower. The procedures of this study may be useful for optimising the individual monitor position depending on the comfortable head and gaze inclination and the vertical zone of clear vision of progressive lenses. For users of general-purpose progressive lenses, it is suggested that low monitor positions allow for clear vision at the monitor and for a physiologically favourable head inclination. Employees may improve their workplace using a flyer providing ergonomic-optometric information.

  9. A Machine-Learning and Filtering Based Data Assimilation Framework for Geologic Carbon Sequestration Monitoring Optimization

    NASA Astrophysics Data System (ADS)

    Chen, B.; Harp, D. R.; Lin, Y.; Keating, E. H.; Pawar, R.

    2017-12-01

    Monitoring is a crucial aspect of geologic carbon sequestration (GCS) risk management. It has gained importance as a means to ensure CO2 is safely and permanently stored underground throughout the lifecycle of a GCS project. Three issues are often involved in a monitoring project: (i) where is the optimal location to place the monitoring well(s), (ii) what type of data (pressure, rate and/or CO2 concentration) should be measured, and (iii) what is the optimal frequency to collect the data. In order to address these important issues, a filtering-based data assimilation procedure is developed to perform the monitoring optimization. The optimal monitoring strategy is selected based on the uncertainty reduction of the objective of interest (e.g., cumulative CO2 leak) for all potential monitoring strategies. To reduce the computational cost of the filtering-based data assimilation process, two machine-learning algorithms, Support Vector Regression (SVR) and Multivariate Adaptive Regression Splines (MARS), are used to develop computationally efficient reduced-order models (ROMs) from full numerical simulations of CO2 and brine flow. The proposed framework for GCS monitoring optimization is demonstrated with two examples: a simple 3D synthetic case and a real field case named Rock Spring Uplift carbon storage site in Southwestern Wyoming.
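
    A minimal sketch of the ROM step with scikit-learn's SVR, using synthetic stand-ins for the simulator inputs and the cumulative-leak output:

```python
import numpy as np
from sklearn.svm import SVR

# X: simulation inputs (e.g., permeability, injection rate, well position);
# y: simulated cumulative CO2 leak. Both are hypothetical stand-ins for
# the full-physics runs described in the abstract.
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(300, 3))
y = X[:, 0] ** 2 + 0.5 * np.sin(3 * X[:, 1]) + 0.1 * X[:, 2]

rom = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:250], y[:250])
print("ROM R^2 on held-out runs:", rom.score(X[250:], y[250:]))

# The filtering loop then queries rom.predict thousands of times per
# candidate monitoring design instead of re-running the simulator.
```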

  10. Real-time, high frequency QRS electrocardiograph

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T. (Inventor); DePalma, Jude L. (Inventor); Moradi, Saeed (Inventor)

    2006-01-01

    Real time cardiac electrical data are received from a patient, manipulated to determine various useful aspects of the ECG signal, and displayed in real time in a useful form on a computer screen or monitor. The monitor displays the high frequency data from the QRS complex in units of microvolts, juxtaposed with a display of conventional ECG data in units of millivolts or microvolts. The high frequency data are analyzed for their root mean square (RMS) voltage values and the discrete RMS values and related parameters are displayed in real time. The high frequency data from the QRS complex are analyzed with imbedded algorithms to determine the presence or absence of reduced amplitude zones, referred to herein as RAZs. RAZs are displayed as go, no-go signals on the computer monitor. The RMS and related values of the high frequency components are displayed as time varying signals, and the presence or absence of RAZs may be similarly displayed over time.
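
    A hedged sketch of the HF-QRS computation: band-pass the ECG to the conventional 150-250 Hz high-frequency band and take the RMS. The beat detection, alignment, and signal averaging used in real systems are omitted:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def hf_qrs_rms(ecg, fs, band=(150.0, 250.0)):
    """RMS voltage of the 150-250 Hz high-frequency QRS band.

    fs must exceed 500 Hz for this band to be representable.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return float(np.sqrt(np.mean(filtfilt(b, a, ecg) ** 2)))
```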

  11. Multivariate Spatial Condition Mapping Using Subtractive Fuzzy Cluster Means

    PubMed Central

    Sabit, Hakilo; Al-Anbuky, Adnan

    2014-01-01

    Wireless sensor networks are usually deployed for monitoring given physical phenomena taking place in a specific space and over a specific duration of time. The spatio-temporal distribution of these phenomena often correlates to certain physical events. To appropriately characterise these events-phenomena relationships over a given space for a given time frame, we require continuous monitoring of the conditions. WSNs are perfectly suited for these tasks, due to their inherent robustness. This paper presents a subtractive fuzzy cluster means algorithm and its application in data stream mining for wireless sensor systems over a cloud-computing-like architecture, which we call sensor cloud data stream mining. Benchmarking on standard mining algorithms, the k-means and the FCM algorithms, we have demonstrated that the subtractive fuzzy cluster means model can perform high quality distributed data stream mining tasks comparable to centralised data stream mining. PMID:25313495
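
    For orientation, a compact sketch of plain subtractive clustering (Chiu-style potentials), the building block behind the paper's subtractive fuzzy cluster means; the radii and single-ratio stopping rule are simplified assumptions:

```python
import numpy as np

def subtractive_centers(X, ra=0.5, stop_ratio=0.15):
    """Pick points with the highest 'potential' (local density) as
    cluster centers, subtracting each winner's influence in turn."""
    alpha = 4.0 / ra ** 2
    beta = 4.0 / (1.5 * ra) ** 2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    p = np.exp(-alpha * d2).sum(1)            # potential of every point
    centers, p0 = [], p.max()
    while p.max() > stop_ratio * p0:
        i = int(np.argmax(p))
        centers.append(X[i])
        p -= p[i] * np.exp(-beta * ((X - X[i]) ** 2).sum(-1))
    return np.array(centers)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.05, (40, 2)), rng.normal(1, 0.05, (40, 2))])
print(subtractive_centers(X))   # two centers, near (0,0) and (1,1)
```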

  12. Algorithm for computing descriptive statistics for very large data sets and the exa-scale era

    NASA Astrophysics Data System (ADS)

    Beekman, Izaak

    2017-11-01

    An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that traditionally would be relegated to post-processing, and can be used to monitor the statistical convergence and estimate the error/residual in the quantity, which is useful for uncertainty quantification as well. Today, data may be generated at an overwhelming rate by numerical simulations and proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest amount of time required to obtain good results. This algorithm extends work by Pébay (Sandia Report SAND2008-6212). Pébay's algorithms are recast into a converging delta formulation, with provably favorable properties. The mean, variance, covariances and arbitrary higher order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high Reynolds number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
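
    The single-pass idea can be illustrated with the classic Welford update, the simplest instance of a converging-delta formulation (the paper's SPOCS algorithm extends this to covariances and arbitrary higher moments, which are not reproduced here):

```python
class RunningStats:
    """Single-pass (online) mean/variance with a numerically stable
    Welford update; one value at a time, O(1) memory."""

    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        self.n += 1
        delta = x - self.mean          # the "converging delta"
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")

stats = RunningStats()
for x in (2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0):
    stats.push(x)
print(stats.mean, stats.variance)   # 5.0 and ~4.571 (sample variance)
```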

  13. Water quality real-time monitoring system via biological detection based on video analysis

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Fei, Yuan

    2017-11-01

    With the development of society, water pollution has become one of the most serious problems in China. Real-time water quality monitoring is therefore an important part of human activities and water pollution prevention. In this paper, the behavior of zebrafish was monitored by computer vision. First, the moving target was extracted by a saliency detection method and tracked by fitting an ellipse model. Then the motion parameters were extracted by the optical flow method, and the data were monitored in real time by means of Hinkley warnings and threshold warnings. We achieved graded warnings across a number of dimensions by means of a comprehensive toxicity index. The experimental results show that the system can achieve accurate real-time monitoring.
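
    A minimal sketch of the Hinkley-style warning on a motion-parameter stream; the detector below is the standard Page-Hinkley test for a downward mean shift, with made-up speeds and thresholds:

```python
def page_hinkley_drop(stream, delta=0.1, threshold=5.0):
    """Page-Hinkley test for a sustained drop in the mean of a stream."""
    mean, mt, max_mt = 0.0, 0.0, 0.0
    for n, x in enumerate(stream, start=1):
        mean += (x - mean) / n            # running mean
        mt += x - mean + delta            # cumulative signed deviation
        max_mt = max(max_mt, mt)
        if max_mt - mt > threshold:       # sustained drop detected
            return n
    return None

speeds = [5.0] * 50 + [2.5] * 50          # toxicant slows the fish (synthetic)
print("alarm at sample", page_hinkley_drop(speeds))   # fires shortly after 50
```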

  14. Attitudes of heart failure patients and health care providers towards mobile phone-based remote monitoring.

    PubMed

    Seto, Emily; Leonard, Kevin J; Masino, Caterina; Cafazzo, Joseph A; Barnsley, Jan; Ross, Heather J

    2010-11-29

    Mobile phone-based remote patient monitoring systems have been proposed for heart failure management because they are relatively inexpensive and enable patients to be monitored anywhere. However, little is known about whether patients and their health care providers are willing and able to use this technology. The objective of our study was to assess the attitudes of heart failure patients and their health care providers from a heart function clinic in a large urban teaching hospital toward the use of mobile phone-based remote monitoring. A questionnaire regarding attitudes toward home monitoring and technology was administered to 100 heart failure patients (94/100 returned a completed questionnaire). Semi-structured interviews were also conducted with 20 heart failure patients and 16 clinicians to determine the perceived benefits and barriers to using mobile phone-based remote monitoring, as well as their willingness and ability to use the technology. The survey results indicated that the patients were very comfortable using mobile phones (mean rating 4.5, SD 0.6, on a five-point Likert scale), even more so than with using computers (mean 4.1, SD 1.1). The difference in comfort level between mobile phones and computers was statistically significant (P < .001). Patients were also confident in using mobile phones to view health information (mean 4.4, SD 0.9). Patients and clinicians were willing to use the system as long as several conditions were met, including providing a system that was easy to use with clear tangible benefits, maintaining good patient-provider communication, and not increasing clinical workload. Clinicians cited several barriers to implementation of such a system, including lack of remuneration for telephone interactions with patients and medicolegal implications. Patients and clinicians want to use mobile phone-based remote monitoring and believe that they would be able to use the technology. However, they have several reservations, such as potential increased clinical workload, medicolegal issues, and difficulty of use for some patients due to lack of visual acuity or manual dexterity.

  15. Instrumentation, performance visualization, and debugging tools for multiprocessors

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.

    1991-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging and tuning parallel programs become intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.

  16. An Interactive Graphics Program for Investigating Digital Signal Processing.

    ERIC Educational Resources Information Center

    Miller, Billy K.; And Others

    1983-01-01

    Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)

  17. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.

  18. Utilising handheld computers to monitor and support patients receiving chemotherapy: results of a UK-based feasibility study.

    PubMed

    Kearney, N; Kidd, L; Miller, M; Sage, M; Khorrami, J; McGee, M; Cassidy, J; Niven, K; Gray, P

    2006-07-01

    Recent changes in cancer service provision mean that many patients spend only a limited time in hospital and must therefore cope with and manage treatment-related side effects at home. Information technology can provide innovative solutions in promoting patient care through information provision, enhancing communication, monitoring treatment-related side effects and promoting self-care. The aim of this feasibility study was to evaluate the acceptability of using handheld computers as a symptom assessment and management tool for patients receiving chemotherapy for cancer. A convenience sample of patients (n = 18) and health professionals (n = 9) at one Scottish cancer centre was recruited. Patients used the handheld computer to record and send daily symptom reports to the cancer centre and receive instant, tailored symptom management advice during two treatment cycles. Both patients' and health professionals' perceptions of the handheld computer system were evaluated at baseline and at the end of the project. Patients believed the handheld computer had improved their symptom management and felt comfortable in using it. The health professionals also found the handheld computer to be helpful in assessing and managing patients' symptoms. This project suggests that a handheld-computer-based symptom management tool is feasible and acceptable to both patients and health professionals in complementing the care of patients receiving chemotherapy.

  19. Electric Fuel Pump Condition Monitor System Using Electrical Signature Analysis

    DOEpatents

    Haynes, Howard D [Knoxville, TN]; Cox, Daryl F [Knoxville, TN]; Welch, Donald E [Oak Ridge, TN]

    2005-09-13

    A pump diagnostic system and method comprising current sensing probes clamped on electrical motor leads of a pump for sensing only current signals on incoming motor power, a signal processor having a means for buffering and anti-aliasing current signals into a pump motor current signal, and a computer having a means for analyzing, displaying, and reporting motor current signatures from the motor current signal to determine pump health using integrated motor and pump diagnostic parameters.
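
    The core of electrical signature analysis is spectral inspection of the motor current; below is a hedged sketch (windowing and FFT only, with none of the patent's integrated diagnostic parameters):

```python
import numpy as np

def current_signature(i_motor, fs):
    """Amplitude spectrum of a motor-current lead. Pump degradation shows
    up as new peaks/sidebands around the line frequency; thresholds and
    the patented integrated diagnostics are not modeled here."""
    x = i_motor - np.mean(i_motor)                 # remove DC offset
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, spec
```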

  20. Bayesian Asymmetric Regression as a Means to Estimate and Evaluate Oral Reading Fluency Slopes

    ERIC Educational Resources Information Center

    Solomon, Benjamin G.; Forsberg, Ole J.

    2017-01-01

    Bayesian techniques have become increasingly present in the social sciences, fueled by advances in computer speed and the development of user-friendly software. In this paper, we forward the use of Bayesian Asymmetric Regression (BAR) to monitor intervention responsiveness when using Curriculum-Based Measurement (CBM) to assess oral reading…

  1. 30 CFR 75.1912 - Fire suppression systems for permanent underground diesel fuel storage facilities.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... electrical system failure. (g) Electrically operated detection and actuation circuits shall be monitored and... operated, a means shall be provided to indicate the functional readiness status of the detection system. (h... susceptible to alteration or recorded electronically in a secured computer system that is not susceptible to...

  2. 30 CFR 75.1912 - Fire suppression systems for permanent underground diesel fuel storage facilities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... electrical system failure. (g) Electrically operated detection and actuation circuits shall be monitored and... operated, a means shall be provided to indicate the functional readiness status of the detection system. (h... susceptible to alteration or recorded electronically in a secured computer system that is not susceptible to...

  3. 30 CFR 75.1912 - Fire suppression systems for permanent underground diesel fuel storage facilities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... electrical system failure. (g) Electrically operated detection and actuation circuits shall be monitored and... operated, a means shall be provided to indicate the functional readiness status of the detection system. (h... susceptible to alteration or recorded electronically in a secured computer system that is not susceptible to...

  4. 30 CFR 75.1912 - Fire suppression systems for permanent underground diesel fuel storage facilities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... electrical system failure. (g) Electrically operated detection and actuation circuits shall be monitored and... operated, a means shall be provided to indicate the functional readiness status of the detection system. (h... susceptible to alteration or recorded electronically in a secured computer system that is not susceptible to...

  5. 30 CFR 75.1912 - Fire suppression systems for permanent underground diesel fuel storage facilities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... electrical system failure. (g) Electrically operated detection and actuation circuits shall be monitored and... operated, a means shall be provided to indicate the functional readiness status of the detection system. (h... susceptible to alteration or recorded electronically in a secured computer system that is not susceptible to...

  6. Pervasive sensing

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2000-11-01

    The coordinated exploitation of modern communication, micro- sensor and computer technologies makes it possible to give global reach to our senses. Web-cameras for vision, web- microphones for hearing and web-'noses' for smelling, plus the abilities to sense many factors we cannot ordinarily perceive, are either available or will be soon. Applications include (1) determination of weather and environmental conditions on dense grids or over large areas, (2) monitoring of energy usage in buildings, (3) sensing the condition of hardware in electrical power distribution and information systems, (4) improving process control and other manufacturing, (5) development of intelligent terrestrial, marine, aeronautical and space transportation systems, (6) managing the continuum of routine security monitoring, diverse crises and military actions, and (7) medicine, notably the monitoring of the physiology and living conditions of individuals. Some of the emerging capabilities, such as the ability to measure remotely the conditions inside of people in real time, raise interesting social concerns centered on privacy issues. Methods for sensor data fusion and designs for human-computer interfaces are both crucial for the full realization of the potential of pervasive sensing. Computer-generated virtual reality, augmented with real-time sensor data, should be an effective means for presenting information from distributed sensors.

  7. VLSI implementation of a new LMS-based algorithm for noise removal in ECG signal

    NASA Astrophysics Data System (ADS)

    Satheeskumaran, S.; Sabrigiriraj, M.

    2016-06-01

    Least mean square (LMS)-based adaptive filters are widely deployed for removing artefacts in the electrocardiogram (ECG) because they require few computations, but they possess a high mean square error (MSE) in noisy environments. The transform-domain variable step-size LMS algorithm reduces the MSE at the cost of computational complexity. In this paper, a variable step-size delayed LMS adaptive filter is used to remove the artefacts from the ECG signal for improved feature extraction. Dedicated digital signal processors provide fast processing, but they are not flexible. By using field programmable gate arrays, pipelined architectures can be used to enhance system performance. The pipelined architecture can enhance the operating efficiency of the adaptive filter and reduce power consumption. This technique provides a high signal-to-noise ratio and low MSE with reduced computational complexity; hence, it is a useful method for monitoring patients with heart-related problems.
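
    A rough Python model of a variable step-size delayed LMS noise canceller, to make the idea concrete; the step-size adaptation rule and all constants are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def vs_dlms(x_ref, d, taps=16, mu0=0.01, rho=1e-4, delay=4):
    """Variable step-size delayed LMS (DLMS) noise canceller.

    x_ref : noise reference input (e.g., from an auxiliary sensor)
    d     : primary input, ECG plus noise correlated with x_ref
    Returns the error signal, which is the cleaned-ECG estimate.
    The weight update uses the error from `delay` iterations earlier,
    mimicking a pipelined hardware implementation.
    """
    w = np.zeros(taps)
    mu = mu0
    e = np.zeros(len(d))
    for n in range(taps, len(d)):
        u = x_ref[n - taps:n][::-1]       # current tap-delay-line contents
        e[n] = d[n] - w @ u               # filter output error
        nd = n - delay                    # delayed index for the update
        if nd >= taps:
            ud = x_ref[nd - taps:nd][::-1]
            w += mu * e[nd] * ud          # delayed coefficient update
            # illustrative variable step size: track the squared error
            mu = min(0.05, max(1e-4, mu + rho * (e[nd] ** 2 - mu)))
    return e
```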

  8. Converting Static Image Datasets to Spiking Neuromorphic Datasets Using Saccades.

    PubMed

    Orchard, Garrick; Jayawant, Ajinkya; Cohen, Gregory K; Thakor, Nitish

    2015-01-01

    Creating datasets for Neuromorphic Vision is a challenging task. A lack of available recordings from Neuromorphic Vision sensors means that data must typically be recorded specifically for dataset creation rather than collected and labeled from existing data. The task is further complicated by a desire to simultaneously provide traditional frame-based recordings to allow for direct comparison with traditional Computer Vision algorithms. Here we propose a method for converting existing Computer Vision static image datasets into Neuromorphic Vision datasets using an actuated pan-tilt camera platform. Moving the sensor rather than the scene or image is a more biologically realistic approach to sensing and eliminates timing artifacts introduced by monitor updates when simulating motion on a computer monitor. We present the conversion of two popular image datasets (MNIST and Caltech101), which have played important roles in the development of Computer Vision, and we provide performance metrics on these datasets using spike-based recognition algorithms. This work contributes datasets for future use in the field, as well as results from spike-based algorithms against which future works can compare. Furthermore, by converting datasets already popular in Computer Vision, we enable more direct comparison with frame-based approaches.

  9. Design of Test Articles and Monitoring System for the Characterization of HIRF Effects on a Fault-Tolerant Computer Communication System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Miner, Paul S.; Koppen, Sandra V.

    2008-01-01

    This report describes the design of the test articles and monitoring systems developed to characterize the response of a fault-tolerant computer communication system when stressed beyond the theoretical limits for guaranteed correct performance. A high-intensity radiated electromagnetic field (HIRF) environment was selected as the means of injecting faults, as such environments are known to have the potential to cause arbitrary and coincident common-mode fault manifestations that can overwhelm redundancy management mechanisms. The monitors generate stimuli for the systems-under-test (SUTs) and collect data in real-time on the internal state and the response at the external interfaces. A real-time health assessment capability was developed to support the automation of the test. A detailed description of the nature and structure of the collected data is included. The goal of the report is to provide insight into the design and operation of these systems, and to serve as a reference document for use in post-test analyses.

  10. Real-time, high frequency QRS electrocardiograph with reduced amplitude zone detection

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T. (Inventor); DePalma, Jude L. (Inventor); Moradi, Saeed (Inventor)

    2009-01-01

    Real-time cardiac electrical data are received from a patient, manipulated to determine various useful aspects of the ECG signal, and displayed in real time in a useful form on a computer screen or monitor. The monitor displays the high frequency data from the QRS complex in units of microvolts, juxtaposed with a display of conventional ECG data in units of millivolts or microvolts. The high frequency data are analyzed for their root mean square (RMS) voltage values, and the discrete RMS values and related parameters are displayed in real time. The high frequency data from the QRS complex are analyzed with embedded algorithms to determine the presence or absence of reduced amplitude zones, referred to herein as "RAZs". RAZs are displayed as "go, no-go" signals on the computer monitor. The RMS and related values of the high frequency components are displayed as time-varying signals, and the presence or absence of RAZs may be similarly displayed over time.
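
    The RMS computation on band-limited high-frequency QRS content can be sketched as follows; the 150-250 Hz band and fourth-order Butterworth filter are common choices in the HF-QRS literature, not details taken from this patent.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def hf_qrs_rms(ecg_uv, fs, band=(150.0, 250.0)):
        """RMS amplitude (in microvolts) of high-frequency QRS content.

        `fs` is the sampling rate in Hz and must exceed twice the upper
        band edge; `band` is the assumed HF-QRS passband.
        """
        nyq = fs / 2.0
        b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
        hf = filtfilt(b, a, ecg_uv)          # zero-phase band-pass filtering
        return float(np.sqrt(np.mean(hf ** 2)))
    ```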

  11. Pure-tone audiometry outside a sound booth using earphone attenuation, integrated noise monitoring, and automation.

    PubMed

    Swanepoel, De Wet; Matthysen, Cornelia; Eikelboom, Robert H; Clark, Jackie L; Hall, James W

    2015-01-01

    Accessibility of audiometry is hindered by the cost of sound booths and a shortage of hearing health personnel. This study investigated the validity of an automated mobile diagnostic audiometer with increased attenuation and real-time noise monitoring for clinical testing outside a sound booth. Attenuation characteristics and reference ambient noise levels for the computer-based audiometer (KUDUwave) were evaluated alongside the validity of environmental noise monitoring. Clinical validity was determined by comparing air- and bone-conduction thresholds obtained inside and outside the sound booth in 23 normal-hearing subjects (age range 20-75 years; average age 35.5), with a subgroup of 11 subjects retested to establish test-retest reliability. Improved passive attenuation and valid environmental noise monitoring were demonstrated. Clinically, air-conduction thresholds inside and outside the sound booth corresponded within 5 dB in more than 90% of instances (mean absolute difference 3.3 dB ± 3.2 SD). Bone-conduction thresholds corresponded within 5 dB in 80% of comparisons between test environments, with a mean absolute difference of 4.6 dB (3.7 SD). Threshold differences were not statistically significant. Mean absolute test-retest differences outside the sound booth were similar to those in the booth. Diagnostic pure-tone audiometry outside a sound booth, using automated testing, improved passive attenuation, and real-time environmental noise monitoring, demonstrated reliable hearing assessments.

  12. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  13. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  14. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  15. Improved air trapping evaluation in chest computed tomography in children with cystic fibrosis using real-time spirometric monitoring and biofeedback.

    PubMed

    Kongstad, Thomas; Buchvald, Frederik F; Green, Kent; Lindblad, Anders; Robinson, Terry E; Nielsen, Kim G

    2013-12-01

    The quality of chest computed tomography (CT) images in children is dependent upon a sufficient breath hold during CT scanning. This study evaluates the influence of spirometric breath hold monitoring with biofeedback software on inspiratory and expiratory chest CT lung density measures, and on trapped air (TA) scoring, in children with cystic fibrosis (CF). This matters because TA is an important component of early and progressive CF lung disease. A cross-sectional comparison study was completed for chest CT imaging in two cohorts of CF children with comparable disease severity, using spirometric breath hold monitoring and biofeedback software (Copenhagen (COP)) or unmonitored breath hold manoeuvres (Gothenburg (GOT)). Inspiratory-expiratory lung density differences were calculated, and TA was scored to assess the difference between the two cohorts. Eighty-four chest CTs were evaluated. The mean (95% CI) change in inspiratory-expiratory lung density differences was 436 Hounsfield Units (HU) (408 to 464) in the COP cohort with spirometric breath hold monitoring versus 229 HU (188 to 269) in the GOT cohort with unmonitored breath hold manoeuvres (p<0.0001). The mean (95% CI) TA score was 6.93 (6.05 to 7.82) in COP patients and 3.81 (2.89 to 4.73) in GOT patients (p<0.0001). In children with comparable CF lung disease, spirometric breath hold monitoring during examination yielded a large difference in lung volume between inhalation and exhalation, and allowed for a significantly greater measured change in lung density and TA score, compared to unmonitored breath hold manoeuvres. This has implications for the clinical use of chest CT, especially in children with early CF lung disease. Copyright © 2013 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  16. Hand held data collection and monitoring system for nuclear facilities

    DOEpatents

    Brayton, D.D.; Scharold, P.G.; Thornton, M.W.; Marquez, D.L.

    1999-01-26

    An apparatus and method are disclosed for a data collection and monitoring system that utilizes a pen-based hand-held computer unit containing interaction software that allows the user to review maintenance procedures, collect data, compare data with historical trends and safety limits, and input new information at various collection sites. The system has a means to allow automatic transfer of the collected data to a main computer database for further review, reporting, and distribution purposes, and for uploading updated collection and maintenance procedures. The hand-held computer has a running to-do list so that sample collection and other general tasks, such as housekeeping, are automatically scheduled for timely completion. A done list helps users keep track of all completed tasks. The built-in checklist assures that work processes will meet the applicable processes and procedures. Users can handwrite comments or drawings with an electronic pen that allows them to enter information directly on the screen. 15 figs.

  17. Hand held data collection and monitoring system for nuclear facilities

    DOEpatents

    Brayton, Darryl D.; Scharold, Paul G.; Thornton, Michael W.; Marquez, Diana L.

    1999-01-01

    An apparatus and method are disclosed for a data collection and monitoring system that utilizes a pen-based hand-held computer unit containing interaction software that allows the user to review maintenance procedures, collect data, compare data with historical trends and safety limits, and input new information at various collection sites. The system has a means to allow automatic transfer of the collected data to a main computer database for further review, reporting, and distribution purposes, and for uploading updated collection and maintenance procedures. The hand-held computer has a running to-do list so that sample collection and other general tasks, such as housekeeping, are automatically scheduled for timely completion. A done list helps users keep track of all completed tasks. The built-in checklist assures that work processes will meet the applicable processes and procedures. Users can handwrite comments or drawings with an electronic pen that allows them to enter information directly on the screen.

  18. Technical Note: scuda: A software platform for cumulative dose assessment.

    PubMed

    Park, Seyoun; McNutt, Todd; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon

    2016-10-01

    Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (scuda) that can be seamlessly integrated into the clinical workflow. scuda consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.
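
    As a rough illustration of the accumulation step only (not scuda's GPU-accelerated implementation), daily dose grids can be pulled back onto the planning CT frame through each fraction's deformation field and summed. The array shapes and the convention that the field maps planning-CT voxel coordinates to daily-CT coordinates are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    def accumulate_dose(daily_doses, deformation_fields):
        """Sum daily dose grids after warping each onto the planning CT.

        `daily_doses`: list of (nz, ny, nx) dose arrays on the daily CTs.
        `deformation_fields`: list of (3, nz, ny, nx) displacement arrays
        mapping planning-CT voxel coordinates to daily-CT coordinates.
        """
        total = np.zeros_like(daily_doses[0], dtype=float)
        grid = np.indices(total.shape).astype(float)         # identity coords
        for dose, dvf in zip(daily_doses, deformation_fields):
            coords = grid + dvf                              # deformed sample points
            total += map_coordinates(dose, coords, order=1)  # trilinear pull-back
        return total
    ```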

  19. Technical Note: SCUDA: A software platform for cumulative dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Seyoun; McNutt, Todd; Quon, Harry

    Purpose: Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. Conclusions: The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.

  20. Streamflow monitoring and statistics for development of water rights claims for Wild and Scenic Rivers, Owyhee Canyonlands Wilderness, Idaho, 2012

    USGS Publications Warehouse

    Wood, Molly S.; Fosness, Ryan L.

    2013-01-01

    The U.S. Geological Survey, in cooperation with the Bureau of Land Management (BLM), collected streamflow data in 2012 and estimated streamflow statistics for stream segments designated "Wild," "Scenic," or "Recreational" under the National Wild and Scenic Rivers System in the Owyhee Canyonlands Wilderness in southwestern Idaho. The streamflow statistics were used by BLM to develop and file a draft federal reserved water right claim in autumn 2012 to protect federally designated "outstanding remarkable values" in the stream segments. BLM determined that the daily mean streamflows equaled or exceeded 20 and 80 percent of the time during bimonthly periods (two periods per month), together with the bankfull streamflow, are important streamflow thresholds for maintaining outstanding remarkable values. Prior to this study, streamflow statistics estimated using available datasets and tools for the Owyhee Canyonlands Wilderness were inaccurate for use in the water rights claim. Streamflow measurements were made at varying intervals during February–September 2012 at 14 monitoring sites; 2 of the monitoring sites were equipped with telemetered streamgaging equipment. Synthetic streamflow records were created for 11 of the 14 monitoring sites using a partial-record method or a drainage-area-ratio method. Streamflow records were obtained directly from an operating, long-term streamgage at one monitoring site, and from discontinued streamgages at two monitoring sites. For 10 sites analyzed using the partial-record method, discrete measurements were related to daily mean streamflow at a nearby, telemetered "index" streamgage. Resulting regression equations were used to estimate daily mean and annual peak streamflow at the monitoring sites during the full period of record for the index sites. A synthetic streamflow record for Sheep Creek was developed using a drainage-area-ratio method, because measured streamflows did not relate well to any index site to allow use of the partial-record method. The synthetic and actual daily mean streamflow records were used to estimate daily mean streamflow that was exceeded 80, 50, and 20 percent of the time (80-, 50-, and 20-percent exceedances) for bimonthly and annual periods. Bankfull streamflow statistics were calculated by fitting the synthetic and actual annual peak streamflow records to a log Pearson Type III distribution using Bulletin 17B guidelines in the U.S. Geological Survey PeakFQ program. The coefficients of determination (R²) for the regressions between the monitoring and index sites ranged from 0.74 for Wickahoney Creek to 0.98 for the West Fork Bruneau River and Deep Creek. Among all sites, confidence in the computed streamflow statistics is highest for the East Fork Owyhee River and the West Fork Bruneau River, on the basis of regression statistics, visual fit of the related data, and the range and number of streamflow measurements. Streamflow statistics for sites with the greatest uncertainty included Big Jacks, Little Jacks, Cottonwood, Wickahoney, and Sheep Creeks. The uncertainty in computed streamflow statistics was due to a number of factors, including the distance of index sites relative to monitoring sites, relatively low streamflow conditions that occurred during the study, and the limited number and range of streamflow measurements. However, the computed streamflow statistics are considered the best possible estimates given available datasets in the remote study area.
Streamflow measurements over a wider range of hydrologic and climatic conditions would improve the relations between streamflow characteristics at monitoring and index sites. Additionally, field surveys are needed to verify if the streamflows selected for the water rights claims are sufficient for maintaining outstanding remarkable values in the Wild and Scenic rivers included in the study.
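
    A simplified sketch of the two record-synthesis methods named above. Actual USGS practice (e.g., MOVE-type regressions and Bulletin 17B peak-flow fitting in PeakFQ) involves considerably more care, so treat this as schematic only.

    ```python
    import numpy as np

    def fit_partial_record(q_site, q_index):
        """Fit a log-log relation Q_site = a * Q_index**b from concurrent
        discrete measurements at the monitoring site and index streamgage."""
        b, log_a = np.polyfit(np.log(q_index), np.log(q_site), 1)
        return np.exp(log_a), b

    def synthesize_record(q_index_daily, a, b):
        """Estimate daily mean streamflow at the monitoring site from the
        index streamgage's daily record using the fitted relation."""
        return a * np.asarray(q_index_daily, dtype=float) ** b

    def drainage_area_ratio(q_index_daily, area_site, area_index):
        """Fallback (as used for Sheep Creek): scale the index record by
        the ratio of drainage areas."""
        return np.asarray(q_index_daily, dtype=float) * (area_site / area_index)
    ```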

  1. Toothguide Trainer tests with color vision deficiency simulation monitor.

    PubMed

    Borbély, Judit; Varsányi, Balázs; Fejérdy, Pál; Hermann, Péter; Jakstat, Holger A

    2010-01-01

    The aim of this study was to evaluate whether simulated severe red and green color vision deficiency (CVD) influenced color matching results and to investigate whether training with the Toothguide Trainer (TT) computer program enabled better color matching results. A total of 31 color-normal dental students participated in the study. Every participant had to pass the Ishihara Test; participants with a red/green color vision deficiency were excluded. A lecture on tooth color matching was given, and individual training with TT was performed. To measure individual tooth color matching results in normal and color-deficient display modes, the TT final exam was displayed on a calibrated monitor that served as a hardware-based method of simulating protanopy and deuteranopy. Data from the TT final exams were collected in normal and in severe red and green CVD-simulating monitor display modes. Color difference values for each participant in each display mode were computed (ΣΔE*ab), and the respective means and standard deviations were calculated. Student's t-test was used for statistical evaluation. Participants made larger ΔE*ab errors in severe color vision deficient display modes than in the normal monitor mode. TT tests showed a significant (p<0.05) difference in the tooth color matching results of the severe green color vision deficiency simulation mode compared to normal vision mode. Students' shade matching results were significantly better after training (p=0.009). Computer-simulated severe color vision deficiency mode resulted in significantly worse color matching quality compared to normal color vision mode. The Toothguide Trainer computer program improved color matching results. Copyright © 2010 Elsevier Ltd. All rights reserved.
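
    The ΣΔE*ab scores above are sums of standard CIELAB color differences. For reference, the CIE76 form of ΔE*ab (the abstract does not state which variant was computed):

    ```python
    import math

    def delta_e_ab(lab1, lab2):
        """CIE76 color difference between two CIELAB triples (L*, a*, b*):
        sqrt(dL*^2 + da*^2 + db*^2)."""
        return math.dist(lab1, lab2)

    # Per participant, such differences are summed over all matched shades.
    print(round(delta_e_ab((62.0, 3.1, 18.4), (60.5, 2.8, 20.1)), 2))
    ```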

  2. Bubble nucleation in simple and molecular liquids via the largest spherical cavity method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Miguel A., E-mail: m.gonzalez12@imperial.ac.uk; Department of Chemistry, Imperial College London, London SW7 2AZ; Abascal, José L. F.

    2015-04-21

    In this work, we propose a methodology to compute bubble nucleation free energy barriers using trajectories generated via molecular dynamics simulations. We follow the bubble nucleation process by means of a local order parameter, defined by the volume of the largest spherical cavity (LSC) formed in the nucleating trajectories. This order parameter simplifies considerably the monitoring of the nucleation events, as compared with previous approaches, which require ad hoc criteria to classify the atoms and molecules as liquid or vapor. The combination of the LSC and the mean first passage time technique can then be used to obtain the free energy curves. Upon computation of the cavity distribution function, the nucleation rate and free-energy barrier can then be computed. We test our method against recent computations of bubble nucleation in simple liquids and water at negative pressures. We obtain free-energy barriers in good agreement with the previous works. The LSC method provides a versatile and computationally efficient route to estimate the volume of critical bubbles and the nucleation rate, and to compute bubble nucleation free energies in both simple and molecular liquids.

  3. Parallel computer processing and modeling: applications for the ICU

    NASA Astrophysics Data System (ADS)

    Baxter, Grant; Pranger, L. Alex; Draghic, Nicole; Sims, Nathaniel M.; Wiesmann, William P.

    2003-07-01

    Current patient monitoring procedures in hospital intensive care units (ICUs) generate vast quantities of medical data, much of which is considered extemporaneous and not evaluated. Although sophisticated monitors to analyze individual types of patient data are routinely used in the hospital setting, this equipment lacks high-order signal analysis tools for detecting long-term trends and correlations between different signals within a patient data set. Without the ability to continuously analyze disjoint sets of patient data, it is difficult to detect slow-forming complications. As a result, the early onset of conditions such as pneumonia or sepsis may not be apparent until the advanced stages. We report here on the development of a distributed software architecture test bed and software medical models to analyze both asynchronous and continuous patient data in real time. Hardware and software have been developed to support a multi-node distributed computer cluster capable of amassing data from multiple patient monitors and projecting near- and long-term outcomes based upon the application of physiologic models to the incoming patient data stream. One computer acts as a central coordinating node; additional computers accommodate processing needs. A simple, non-clinical model for sepsis detection was implemented on the system for demonstration purposes. This work shows exceptional promise as a highly effective means to rapidly predict and thereby mitigate the effect of nosocomial infections.

  4. A computer-controlled scintiscanning system and associated computer graphic techniques for study of regional distribution of blood flow.

    NASA Technical Reports Server (NTRS)

    Coulam, C. M.; Dunnette, W. H.; Wood, E. H.

    1970-01-01

    Two methods whereby a digital computer may be used to regulate a scintiscanning process are discussed from the viewpoint of computer input-output software. The computer's function, in this case, is to govern the data acquisition and storage, and to display the results to the investigator in a meaningful manner, both during and subsequent to the scanning process. Several methods (such as three-dimensional maps, contour plots, and wall-reflection maps) have been developed by means of which the computer can graphically display the data on-line, for real-time monitoring purposes, during the scanning procedure and subsequently for detailed analysis of the data obtained. A computer-governed method for converting scintiscan data recorded over the dorsal or ventral surfaces of the thorax into fractions of pulmonary blood flow traversing the right and left lungs is presented.

  5. Monitoring minimization of grade B environments based on risk assessment using three-dimensional airflow measurements and computer simulation.

    PubMed

    Katayama, Hirohito; Higo, Takashi; Tokunaga, Yuji; Katoh, Shigeo; Hiyama, Yukio; Morikawa, Kaoru

    2008-01-01

    A practical, risk-based monitoring approach using the combined data collected from actual experiments and computer simulations was developed for the qualification of an EU GMP Annex 1 Grade B, ISO Class 7 area. This approach can locate representative sampling points for microbial contamination risk assessment and minimize their number. We conducted a case study on an aseptic clean room, newly constructed and specifically designed for the use of a restricted access barrier system (RABS). Hotspots were located using a previously published empirical measurement method, three-dimensional airflow analysis. Local mean age of air (LMAA) values were calculated based on computer simulations. Comparable results were found using actual measurements and simulations, demonstrating the potential usefulness of such tools in estimating contamination risks based on the airflow characteristics of a clean room. Intensive microbial monitoring and particle monitoring at the Grade B environmental qualification stage, as well as three-dimensional airflow analysis, were also conducted to reveal contamination hotspots. We found representative hotspots were located at perforated panels covering the air exhausts where the major piston airflows collect in the Grade B room, as well as at any locations within the room that were identified as having stagnant air. However, we also found that the floor surface air around the exit airway of the RABS EU GMP Annex 1 Grade A, ISO Class 5 area was always remarkably clean, possibly due to the immediate sweep of the piston airflow, which prevents dispersed human microbes from falling in a Stokes-type manner on settling plates placed on the floor around the Grade A exit airway. In addition, this airflow is expected to be clean with a significantly low LMAA. Based on these observed results, we propose a simplified daily monitoring program to monitor microbial contamination in Grade B environments. To locate hotspots, we propose using a combination of computer simulation, actual airflow measurements, and intensive environmental monitoring at the qualification stage. Thereafter, instead of particle or microbial air monitoring, we recommend the use of microbial surface monitoring at the main air exhaust. These measures would be sufficient to assure the efficiency of the monitoring program, as well as to minimize the number of surface sampling points used in environments surrounding a RABS.

  6. Method for monitoring environmental and corrosion

    DOEpatents

    Glass, Robert S.; Clarke, Jr., Willis L.; Ciarlo, Dino R.

    1995-01-01

    A corrosion sensor array incorporating individual elements for measuring various elements and ions, such as chloride, sulfide, copper, and hydrogen (pH), and elements for evaluating the instantaneous corrosion properties of structural materials. The exact combination and number of elements measured or monitored would depend upon the environmental conditions and the materials used that are subject to corrosive effects. Such a corrosion monitoring system, embedded in or mounted on a structure exposed to the environment, would serve as an early warning system for the onset of severe corrosion problems for the structure, thus providing safety as well as economic benefits. The sensor array is connected to an electronics/computational system, which provides a means for data collection and analysis.

  7. Centre of Gravity Plethysmography--A Means of Detecting Mass Transfer of Fluid within the Body.

    ERIC Educational Resources Information Center

    Buck, Michael

    1988-01-01

    Describes the monitoring of the redistribution of blood by using a technique which detects changes in the center of gravity of the body. Provides information about the principles and application, construction of apparatus, operating routines, and use of the computer as a recorder. Includes suggested investigations, demonstrations, and diagrams.…

  8. Monitoring system and methods for a distributed and recoverable digital control system

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2010-01-01

    A monitoring system and methods are provided for a distributed and recoverable digital control system. The monitoring system generally comprises two independent monitoring planes within the control system. The first monitoring plane is internal to the computing units in the control system, and the second monitoring plane is external to the computing units. The internal first monitoring plane includes two in-line monitors. The first internal monitor is a self-checking, lock-step-processing monitor with integrated rapid recovery capability. The second internal monitor includes one or more reasonableness monitors, which compare actual effector position with commanded effector position. The external second monitoring plane includes two monitors: the first includes a pre-recovery computing monitor, and the second includes a post-recovery computing monitor. Various methods for implementing the monitoring functions are also disclosed.

  9. An Embedded Device for Real-Time Noninvasive Intracranial Pressure Estimation.

    PubMed

    Matthews, Jonathan M; Fanelli, Andrea; Heldt, Thomas

    2018-01-01

    The monitoring of intracranial pressure (ICP) is indicated for diagnosing and guiding therapy in many neurological conditions. Current monitoring methods, however, are highly invasive, limiting their use to the most critically ill patients only. Our goal is to develop and test an embedded device that performs all necessary mathematical operations in real time for noninvasive ICP (nICP) estimation based on a previously developed model-based approach that uses cerebral blood flow velocity (CBFV) and arterial blood pressure (ABP) waveforms. The nICP estimation algorithm, along with the required preprocessing steps, was implemented on an NXP LPC4337 microcontroller unit (MCU). A prototype device using the MCU was also developed, complete with display, recording functionality, and peripheral interfaces for ABP and CBFV monitoring hardware. The device produces an estimate of mean ICP once per minute and performs the necessary computations in 410 ms, on average. Real-time nICP estimates differed from the original batch-mode MATLAB implementation of the estimation algorithm by 0.63 mmHg (root-mean-square error). We have demonstrated that real-time nICP estimation is possible on a microprocessor platform, which offers the advantages of low cost, small size, and product modularity over a general-purpose computer. These attributes take a step toward the goal of real-time nICP estimation at the patient's bedside in a variety of clinical settings.

  10. An Improved Clustering Algorithm of Tunnel Monitoring Data for Cloud Computing

    PubMed Central

    Zhong, Luo; Tang, KunHao; Li, Lin; Yang, Guang; Ye, JingJing

    2014-01-01

    With the rapid development of urban construction, the number of urban tunnels is increasing and the data they produce become more and more complex. As a result, traditional clustering algorithms cannot handle the mass data produced by tunnel monitoring. To solve this problem, an improved parallel clustering algorithm based on k-means has been proposed. It is a clustering algorithm that uses MapReduce within cloud computing to process the data. It not only has the advantage of handling mass data but is also more efficient. Moreover, it is able to compute the average dissimilarity degree of each cluster in order to clean the abnormal data. PMID:24982971
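
    A single-machine Python mock of the map and reduce phases of parallel k-means; the paper's Hadoop deployment and its dissimilarity-based cleaning of abnormal data are not reproduced here.

    ```python
    import numpy as np

    def kmeans_mapreduce(points, centroids, iters=10):
        """Mock of one MapReduce k-means job per iteration:
        map    -> assign each point to its nearest centroid,
        reduce -> recompute each centroid as the mean of its points."""
        centroids = centroids.copy()
        for _ in range(iters):
            # map phase: emit (cluster_id, point) pairs
            dists = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
            labels = np.argmin(dists, axis=1)
            # reduce phase: aggregate the points for each cluster id
            for k in range(len(centroids)):
                members = points[labels == k]
                if len(members):
                    centroids[k] = members.mean(axis=0)
        return centroids, labels

    rng = np.random.default_rng(0)
    pts = rng.normal(size=(200, 3))
    cents, labs = kmeans_mapreduce(pts, pts[:4].copy())
    ```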

  11. LabVIEW Serial Driver Software for an Electronic Load

    NASA Technical Reports Server (NTRS)

    Scullin, Vincent; Garcia, Christopher

    2003-01-01

    A LabVIEW-language computer program enables monitoring and control of a Transistor Devices, Inc., Dynaload WCL232 (or equivalent) electronic load via an RS-232 serial communication link between the electronic load and a remote personal computer. (The electronic load can operate at constant voltage, current, power consumption, or resistance.) The program generates a graphical user interface (GUI) at the computer that looks and acts like the front panel of the electronic load. Once the electronic load has been placed in remote-control mode, this program first queries the electronic load for the present values of all its operational and limit settings, and then drops into a cycle in which it reports the instantaneous voltage, current, and power values in displays that resemble those on the electronic load while monitoring the GUI images of pushbuttons for control actions by the user. By means of the pushbutton images and associated prompts, the user can perform such operations as changing limit values, the operating mode, or the set point. The benefit of this software is that it relieves the user of the need to learn one method for operating the electronic load locally and another method for operating it remotely via a personal computer.
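
    The same query-then-poll pattern could be sketched in Python with pyserial instead of LabVIEW. The command strings below are placeholders, since the actual Dynaload WCL232 protocol must come from its manual; only the serial-port mechanics are generic.

    ```python
    import serial  # pyserial

    PORT, BAUD = "/dev/ttyS0", 9600  # adjust for the local setup

    with serial.Serial(PORT, BAUD, timeout=1.0) as link:
        # One-time query of operational and limit settings (placeholder command).
        link.write(b"QUERY SETTINGS\r\n")
        print(link.readline().decode(errors="replace").strip())
        # Monitoring cycle: poll instantaneous voltage/current/power readings.
        for _ in range(100):
            link.write(b"READ VIP\r\n")          # placeholder readback command
            reply = link.readline()
            if not reply:                        # timeout: device stopped answering
                break
            print(reply.decode(errors="replace").strip())
    ```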

  12. Wireless Augmented Reality Prototype (WARP)

    NASA Technical Reports Server (NTRS)

    Devereaux, A. S.

    1999-01-01

    Initiated in January, 1997, under NASA's Office of Life and Microgravity Sciences and Applications, the Wireless Augmented Reality Prototype (WARP) is a means to leverage recent advances in communications, displays, imaging sensors, biosensors, voice recognition and microelectronics to develop a hands-free, tetherless system capable of real-time personal display and control of computer system resources. Using WARP, an astronaut may efficiently operate and monitor any computer-controllable activity inside or outside the vehicle or station. The WARP concept is a lightweight, unobtrusive heads-up display with a wireless wearable control unit. Connectivity to the external system is achieved through a high-rate radio link from the WARP personal unit to a base station unit installed into any system PC. The radio link has been specially engineered to operate within the high-interference, high-multipath environment of a space shuttle or space station module. Through this virtual terminal, the astronaut will be able to view and manipulate imagery, text or video, using voice commands to control the terminal operations. WARP's hands-free access to computer-based instruction texts, diagrams and checklists replaces juggling manuals and clipboards, and tetherless computer system access allows free motion throughout a cabin while monitoring and operating equipment.

  13. Automated Power Systems Management (APSM)

    NASA Technical Reports Server (NTRS)

    Bridgeforth, A. O.

    1981-01-01

    A breadboard power system incorporating autonomous functions of monitoring, fault detection and recovery, command and control was developed, tested and evaluated to demonstrate technology feasibility. Autonomous functions including switching of redundant power processing elements, individual load fault removal, and battery charge/discharge control were implemented by means of a distributed microcomputer system within the power subsystem. Three local microcomputers provide the monitoring, control and command function interfaces between the central power subsystem microcomputer and the power sources, power processing and power distribution elements. The central microcomputer is the interface between the local microcomputers and the spacecraft central computer or ground test equipment.

  14. The evolution of computer monitoring of real time data during the Atlas Centaur launch countdown

    NASA Technical Reports Server (NTRS)

    Thomas, W. F.

    1981-01-01

    In the last decade, improvements in computer technology have provided new 'tools' for controlling and monitoring critical missile systems. Computers have gradually taken on a large role in monitoring all flight and ground systems on the Atlas Centaur. The wide-body Centaur, which will be launched in the Space Shuttle cargo bay, will use computers to an even greater extent; it is planned to use the wide-body Centaur to boost the Galileo spacecraft toward Jupiter in 1985. The critical systems that must be monitored prior to liftoff are examined. Computers have now been programmed to monitor all critical parameters continuously. At this time, two separate computer systems are used to monitor these parameters.

  15. Self-monitoring of dietary intake by young women: online food records completed on computer or smartphone are as accurate as paper-based food records but more acceptable.

    PubMed

    Hutchesson, Melinda J; Rollo, Megan E; Callister, Robin; Collins, Clare E

    2015-01-01

    Adherence and accuracy of self-monitoring of dietary intake influence success in weight management interventions. Information technologies such as computers and smartphones have the potential to improve adherence and accuracy by reducing the burden associated with monitoring dietary intake using traditional paper-based food records. We evaluated the acceptability and accuracy of three different 7-day food record methods (online accessed via computer, online accessed via smartphone, and paper-based). Young women (N=18; aged 23.4±2.9 years; body mass index 24.0±2.2) completed the three 7-day food records in random order with 7-day washout periods between each method. Total energy expenditure (TEE) was derived from resting energy expenditure (REE) measured by indirect calorimetry and physical activity level (PAL) derived from accelerometers (TEE=REE×PAL). Accuracy of the three methods was assessed by calculating absolute (energy intake [EI]-TEE) and percentage difference (EI/TEE×100) between self-reported EI and TEE. Acceptability was assessed via questionnaire. Mean±standard deviation TEE was 2,185±302 kcal/day and EI was 1,729±249 kcal/day, 1,675±287 kcal/day, and 1,682±352 kcal/day for computer, smartphone, and paper records, respectively. There were no significant differences in the absolute and percentage differences between EI and TEE for the three methods: computer, -510±389 kcal/day (78%); smartphone, -456±372 kcal/day (80%); and paper, -503±513 kcal/day (79%). Half of participants (n=9) preferred computer recording, 44.4% preferred smartphone, and 5.6% preferred paper-based records. Most participants (89%) least preferred the paper-based record. Because online food records completed on either computer or smartphone were as accurate as paper-based records but more acceptable, they should be considered when self-monitoring of dietary intake is recommended to young women. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
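
    The accuracy arithmetic is simple enough to restate in code; the numbers in the usage line are illustrative, not individual study data.

    ```python
    def reporting_accuracy(ei_kcal, ree_kcal, pal):
        """Absolute (EI - TEE) and percentage (EI / TEE x 100) reporting
        accuracy, with TEE = REE x PAL as defined in the study."""
        tee = ree_kcal * pal
        return ei_kcal - tee, 100.0 * ei_kcal / tee

    # Illustrative values: EI 1700 kcal/day, REE 1450 kcal/day, PAL 1.5.
    diff, pct = reporting_accuracy(1700, 1450, 1.5)
    print(f"{diff:.0f} kcal/day ({pct:.0f}%)")   # -475 kcal/day (78%)
    ```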

  16. Method for monitoring environmental and corrosion

    DOEpatents

    Glass, R.S.; Clarke, W.L. Jr.; Ciarlo, D.R.

    1995-08-01

    A corrosion sensor array is described incorporating individual elements for measuring various elements and ions, such as chloride, sulfide, copper, and hydrogen (pH), and elements for evaluating the instantaneous corrosion properties of structural materials. The exact combination and number of elements measured or monitored would depend upon the environmental conditions and the materials used that are subject to corrosive effects. Such a corrosion monitoring system, embedded in or mounted on a structure exposed to the environment, would serve as an early warning system for the onset of severe corrosion problems for the structure, thus providing safety as well as economic benefits. The sensor array is connected to an electronics/computational system, which provides a means for data collection and analysis. 7 figs.

  17. Interactive water monitoring system accessible by cordless telephone

    NASA Astrophysics Data System (ADS)

    Volpicelli, Richard; Andeweg, Pierre; Hagar, William G.

    1985-12-01

    A battery-operated, microcomputer-controlled monitoring device linked with a cordless telephone has been developed for remote measurements. This environmental sensor is self-contained and collects and processes data according to the information sent to its on-board computer system. An RCA model 1805 microprocessor forms the basic controller with a program encoded in memory for data acquisition and analysis. Signals from analog sensing devices used to monitor the environment are converted into digital signals and stored in random access memory of the microcomputer. This remote sensing system is linked to the laboratory by means of a cordless telephone whose base unit is connected to regular telephone lines. This offshore sensing system is simply accessed by a phone call originating from a computer terminal in the laboratory. Data acquisition is initiated upon request: Information continues to be processed and stored until the computer is reprogrammed by another phone call request. Information obtained may be recalled by a phone call after the desired environmental measurements are finished or while they are in progress. Data sampling parameters may be reset at any time, including in the middle of a measurement cycle. The range of the system is limited only by existing telephone grid systems and by the transmission characteristics of the cordless phone used as a communications link. This use of a cordless telephone, coupled with the on-board computer system, may be applied to other field studies requiring data transfer between an on-site analytical system and the laboratory.

  18. Preliminary feasibility analysis of a pressure modulator radiometer for remote sensing of tropospheric constituents

    NASA Technical Reports Server (NTRS)

    Orr, H. D., III; Rarig, P. L.

    1981-01-01

    A pressure modulator radiometer operated in a nadir viewing mode from the top of a midlatitude summer model of the atmosphere was theoretically studied for monitoring the mean volumetric mixing ratio of carbon monoxide in the troposphere. The mechanical characteristics of the instrument on the Nimbus 7 stratospheric and mesospheric sounder experiment are assumed and CO is assumed to be the only infrared active constituent. A line by line radiative transfer computer program is used to simulate the upwelling radiation reaching the top of the atmosphere. The performance of the instrument is examined as a function of the mean pressure in and the length of the instrument gas correlation cell. Instrument sensitivity is described in terms of signal to noise ratio for a 10 percent change in CO mixing ratio. Sensitivity to mixing ratio changes is also studied. It is concluded that tropospheric monitoring requires a pressure modulator drive having a larger swept volume and producing higher compression ratios at higher mean cell pressures than the Nimbus 7 design.

  19. 40 CFR 86.005-17 - On-board diagnostics.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... other available operating parameters), and functionality checks for computer output components (proper... considered acceptable. (e) Storing of computer codes. The OBD system shall record and store in computer... monitors that can be considered continuously operating monitors (e.g., misfire monitor, fuel system monitor...

  20. 40 CFR 86.005-17 - On-board diagnostics.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... other available operating parameters), and functionality checks for computer output components (proper... considered acceptable. (e) Storing of computer codes. The OBD system shall record and store in computer... monitors that can be considered continuously operating monitors (e.g., misfire monitor, fuel system monitor...

  1. An optimized network for phosphorus load monitoring for Lake Okeechobee, Florida

    USGS Publications Warehouse

    Gain, W.S.

    1997-01-01

    Phosphorus load data were evaluated for Lake Okeechobee, Florida, for water years 1982 through 1991. Standard errors for load estimates were computed from available phosphorus concentration and daily discharge data. Components of error were associated with uncertainty in concentration and discharge data and were calculated for existing conditions and for 6 alternative load-monitoring scenarios for each of 48 distinct inflows. Benefit-cost ratios were computed for each alternative monitoring scenario at each site by dividing estimated reductions in load uncertainty by the 5-year average costs of each scenario in 1992 dollars. Absolute and marginal benefit-cost ratios were compared in an iterative optimization scheme to determine the most cost-effective combination of discharge and concentration monitoring scenarios for the lake. If the current (1992) discharge-monitoring network around the lake is maintained, the water-quality sampling at each inflow site twice each year is continued, and the nature of loading remains the same, the standard error of computed mean-annual load is estimated at about 98 metric tons per year compared to an absolute loading rate (inflows and outflows) of 530 metric tons per year. This produces a relative uncertainty of nearly 20 percent. The standard error in load can be reduced to about 20 metric tons per year (4 percent) by adopting an optimized set of monitoring alternatives at a cost of an additional $200,000 per year. The final optimized network prescribes changes to improve both concentration and discharge monitoring. These changes include the addition of intensive sampling with automatic samplers at 11 sites, the initiation of event-based sampling by observers at another 5 sites, the continuation of periodic sampling 12 times per year at 1 site, the installation of acoustic velocity meters to improve discharge gaging at 9 sites, and the improvement of a discharge rating at 1 site.
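
    A toy version of the benefit-cost screening at the heart of the optimization; the published scheme iterates on absolute and marginal ratios across scenario combinations, so this greedy per-site pass is only a sketch, and all site and scenario names are made up.

    ```python
    def screen_scenarios(site_options):
        """For each site, pick the scenario with the best benefit-cost ratio:
        (reduction in load standard error, tons/yr) / (average annual cost, $)."""
        chosen = {}
        for site, options in site_options.items():
            ratios = [(reduction / cost, name)
                      for name, reduction, cost in options if cost > 0]
            if ratios:
                best_ratio, best_name = max(ratios)
                chosen[site] = (best_name, best_ratio)
        return chosen

    # Hypothetical example: two inflow sites, three candidate scenarios each.
    sites = {
        "inflow A": [("status quo", 0.0, 1_000), ("auto sampler", 8.0, 25_000),
                     ("observer sampling", 4.0, 9_000)],
        "inflow B": [("status quo", 0.0, 1_000), ("auto sampler", 2.0, 25_000),
                     ("observer sampling", 1.5, 9_000)],
    }
    print(screen_scenarios(sites))
    ```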

  2. An integrated decision support system for TRAC: A proposal

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    Optimal allocation and usage of resources are key to effective management. Resources of concern to TRAC are: Manpower (PSY), Money (Travel, contracts), Computing, Data, Models, etc. Management activities of TRAC include: Planning, Programming, Tasking, Monitoring, Updating, and Coordinating. Existing systems are insufficient: they are not completely automated, they are manpower intensive, and the potential for data inconsistency exists. A system is proposed which offers a means to integrate all project management activities of TRAC through the development of sophisticated software and by utilizing the existing computing systems and network resources. The systems integration proposal is examined in detail.

  3. Study to design and develop remote manipulator system

    NASA Technical Reports Server (NTRS)

    Hill, J. W.; Sword, A. J.

    1973-01-01

    Human performance measurement techniques for remote manipulation tasks and remote sensing techniques for manipulators are described. For common manipulation tasks, performance is monitored by means of an on-line computer capable of measuring the joint angles of both master and slave arms as a function of time. The computer programs allow measurements of the operator's strategy and of physical quantities such as task time and power consumed. The results are printed out after a test run to compare different experimental conditions. For tracking tasks, we describe a method of displaying errors in three dimensions and measuring the end-effector position in three dimensions.

  4. Wide-area, real-time monitoring and visualization system

    DOEpatents

    Budhraja, Vikram S.; Dyer, James D.; Martinez Morales, Carlos A.

    2013-03-19

    A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a data base, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.

  5. Wide-area, real-time monitoring and visualization system

    DOEpatents

    Budhraja, Vikram S [Los Angeles, CA; Dyer, James D [La Mirada, CA; Martinez Morales, Carlos A [Upland, CA

    2011-11-15

    A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a data base, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.

  6. Real-time performance monitoring and management system

    DOEpatents

    Budhraja, Vikram S [Los Angeles, CA; Dyer, James D [La Mirada, CA; Martinez Morales, Carlos A [Upland, CA

    2007-06-19

    A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a data base, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.

  7. Energy absorption buildup factors, exposure buildup factors and Kerma for optically stimulated luminescence materials and their tissue equivalence for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Singh, Vishwanath P.; Badiger, N. M.

    2014-11-01

    Optically stimulated luminescence (OSL) materials are sensitive dosimetric materials used for precise and accurate dose measurement of low-energy ionizing radiation. Low-dose measurement capability with improved sensitivity makes these dosimeters very useful for diagnostic imaging, personnel monitoring, and environmental radiation dosimetry. Gamma-ray energy absorption buildup factors and exposure buildup factors were computed for OSL materials using the five-parameter Geometric Progression (G-P) fitting method in the energy range 0.015-15 MeV for penetration depths up to 40 mean free paths. The computed energy absorption and exposure buildup factor values were studied as a function of penetration depth and incident photon energy. Effective atomic numbers and kerma relative to air of the selected OSL materials were computed, and their tissue equivalence was compared with that of water, PMMA, and ICRU standard tissues. The buildup factors and kerma relative to air were found to depend upon the effective atomic numbers. The buildup factors determined in the present work should be useful in radiation dosimetry, medical diagnostics and therapy, space dosimetry, accident dosimetry, and personnel monitoring.
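
    For reference, the five-parameter G-P expression itself, in the standard ANSI/ANS-6.4.3 form; the fit parameters b, c, a, X_K, and d are material- and energy-specific and are not reproduced from the paper.

    ```python
    import numpy as np

    def gp_buildup(x, b, c, a, xk, d):
        """Photon buildup factor at a penetration depth of x mean free paths,
        from the five G-P fitting parameters (valid up to about 40 mfp)."""
        k = c * x**a + d * (np.tanh(x / xk - 2.0) - np.tanh(-2.0)) \
                         / (1.0 - np.tanh(-2.0))
        if np.isclose(k, 1.0):
            return 1.0 + (b - 1.0) * x           # degenerate K = 1 case
        return 1.0 + (b - 1.0) * (k**x - 1.0) / (k - 1.0)
    ```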

  8. Design for Run-Time Monitor on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the underlying infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
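
    A minimal stand-in for the monitor-analyze-adapt loop, using the psutil package in place of RTM's library instrumentation and hardware performance counters; the threshold and the "adaptation" action are placeholders.

    ```python
    import psutil  # cross-platform process/system metrics

    def run_time_monitor(cycles=10, cpu_high=80.0):
        """Toy monitor loop: sample metrics, analyze, adapt configuration."""
        for _ in range(cycles):
            cpu = psutil.cpu_percent(interval=1.0)   # % CPU over a 1 s window
            mem = psutil.virtual_memory().percent    # % of RAM in use
            if cpu > cpu_high:                       # analysis step (placeholder)
                print(f"cpu={cpu:.0f}% mem={mem:.0f}% -> request more resources")
            else:
                print(f"cpu={cpu:.0f}% mem={mem:.0f}% -> configuration OK")

    run_time_monitor()
    ```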

  9. The complications and the position of the Codman MicroSensor™ ICP device: an analysis of 549 patients and 650 Sensors.

    PubMed

    Koskinen, Lars-Owe D; Grayson, David; Olivecrona, Magnus

    2013-11-01

    Complications and insertion depth of the Codman MicroSensor ICP monitoring device (CMS) are not well studied. To study complications and the insertion depth of the CMS in a clinical setting, we identified all patients who had their intracranial pressure (ICP) monitored using a CMS device between 2002 and 2010. The medical records and post-implantation computed tomography (CT) scans were analyzed for occurrence of infection, hemorrhage, and insertion depth. In all, 549 patients were monitored using 650 CMS devices. Mean monitoring time was 7.0 ± 4.9 days. The mean implantation depth was 21.3 ± 11.1 mm (0-88 mm). In 27 of the patients, a haematoma was identified; 26 of these were less than 1 ml, and one was 8 ml. No clinically significant bleeding was found. There was no statistically significant increase in the number of hemorrhages in presumed coagulopathic patients. The infection rate was 0.6 % and the calculated infection rate per 1,000 catheter days was 0.8. The risk for hemorrhagic and infectious complications when using the CMS for ICP monitoring is low. The depth of insertion varies considerably and should be taken into account if patients are treated with head elevation, since the pressure is measured at the tip of the sensor. To meet the need for ICP monitoring, an intraparenchymal ICP monitoring device should be preferred to the use of an external ventricular drainage (EVD).

  10. 40 CFR Appendix S to Part 50 - Interpretation of the Primary National Ambient Air Quality Standards for Oxides of Nitrogen...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...
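
    Stripped of the appendix's rounding and data-completeness rules (which this excerpt elides), the annual primary standard design value reduces to an annual mean of the monitored values; a sketch with hypothetical file and column names:

    ```python
    import pandas as pd

    # Hourly NO2 observations indexed by local standard time
    # (file and column names are hypothetical).
    obs = pd.read_csv("no2_hourly.csv", parse_dates=["time"], index_col="time")

    # Annual mean per calendar year at one monitoring site.
    annual_mean = obs["no2_ppb"].groupby(obs.index.year).mean()
    ```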

  11. 40 CFR Appendix S to Part 50 - Interpretation of the Primary National Ambient Air Quality Standards for Oxides of Nitrogen...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...

  12. 40 CFR Appendix S to Part 50 - Interpretation of the Primary National Ambient Air Quality Standards for Oxides of Nitrogen...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...

  13. 40 CFR Appendix S to Part 50 - Interpretation of the Primary National Ambient Air Quality Standards for Oxides of Nitrogen...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...

  14. 40 CFR Appendix S to Part 50 - Interpretation of the Primary National Ambient Air Quality Standards for Oxides of Nitrogen...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... measured from midnight to midnight (local standard time) that are used in NAAQS computations. Design values..., calculated as specified in section 5 of this appendix. The design values for the primary NAAQS are: (1) The annual mean value for a monitoring site for one year (referred to as the “annual primary standard design...

  15. Mean composite fire severity metrics computed with Google Earth engine offer improved accuracy and expanded mapping potential

    Treesearch

    Sean A. Parks; Lisa M. Holsinger; Morgan A. Voss; Rachel A. Loehman; Nathaniel P. Robinson

    2018-01-01

    Landsat-based fire severity datasets are an invaluable resource for monitoring and research purposes. These gridded fire severity datasets are generally produced with pre- and post-fire imagery to estimate the degree of fire-induced ecological change. Here, we introduce methods to produce three Landsat-based fire severity metrics using the Google Earth Engine (GEE)...
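
    The abstract is truncated before the metric definitions, so the following is only a hedged illustration of the general Landsat severity workflow in the GEE Python API (dNBR from pre- and post-fire NBR composites); the dataset ID, band names, dates, and geometry are all assumptions, not the authors' exact mean-composite formulation:

    ```python
    import ee
    ee.Initialize()

    # Hypothetical fire perimeter; Landsat 8 Collection 2 Level 2 bands.
    fire = ee.Geometry.Rectangle([-114.0, 46.5, -113.8, 46.7])

    def mean_nbr(start, end):
        col = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
               .filterBounds(fire)
               .filterDate(start, end))
        # NBR = (NIR - SWIR2) / (NIR + SWIR2)
        return col.map(lambda img: img.normalizedDifference(["SR_B5", "SR_B7"])).mean()

    # dNBR: pre-fire minus post-fire mean NBR composite.
    dnbr = mean_nbr("2017-06-01", "2017-09-01").subtract(
           mean_nbr("2018-06-01", "2018-09-01"))
    ```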

  16. Development and application of operational techniques for the inventory and monitoring of resources and uses for the Texas coastal zone

    NASA Technical Reports Server (NTRS)

    Harwood, P. (Principal Investigator); Malin, P.; Finley, R.; Mcculloch, S.; Murphy, D.; Hupp, B.; Schell, J. A.

    1977-01-01

    The author has identified the following significant results. Four LANDSAT scenes were analyzed for the Harbor Island area test sites to produce land cover and land use maps using both image interpretation and computer-assisted techniques. When evaluated against aerial photography, the mean accuracy for three scenes was 84% for the image interpretation product and 62% for the computer-assisted classification maps. Analysis of the fourth scene was not completed using the image interpretation technique because of a poor-quality false color composite, but was available from the computer-assisted technique. Preliminary results indicate that these LANDSAT products can be applied to a variety of planning and management activities in the Texas coastal zone.

  17. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. The knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use are discussed. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  18. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. This paper discusses the knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  19. Evaluation of surveillance methods for monitoring house fly abundance and activity on large commercial dairy operations.

    PubMed

    Gerry, Alec C; Higginbotham, G E; Periera, L N; Lam, A; Shelton, C R

    2011-06-01

    Relative house fly, Musca domestica L., activity at three large dairies in central California was monitored during the peak fly activity period from June to August 2005 by using spot cards, fly tapes, bait traps, and Alsynite traps. Counts for all monitoring methods were significantly related at two of three dairies, with spot card counts significantly related to fly tape counts recorded the same week, and both spot card counts and fly tape counts significantly related to bait trap counts 1-2 wk later. Mean fly counts differed significantly between dairies, but a significant interaction between dairies sampled and monitoring methods used demonstrates that between-dairy comparisons are unwise. Estimate precision was determined by the coefficient of variability (CV; SE/mean). Using a CV = 0.15 as a desired level of estimate precision and assuming an integrated pest management (IPM) action threshold near the peak house fly activity measured by each monitoring method, house fly monitoring at a large dairy would require 12 spot cards placed in midafternoon shaded fly resting sites near cattle or seven bait traps placed in open areas near cattle. Software (FlySpotter; http://ucanr.org/sites/FlySpotter/download/) using computer vision technology was developed to count fly spots on a scanned image of a spot card to dramatically reduce time invested in monitoring house flies. Counts provided by the FlySpotter software were highly correlated with visual counts. The use of spot cards for monitoring house flies is recommended for dairy IPM programs.
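
    FlySpotter's internals are not described here, but the generic approach to counting dark fly specks on a scanned white card is thresholding plus connected-component analysis; a minimal OpenCV sketch (file name and area threshold are assumptions):

    ```python
    import cv2

    img = cv2.imread("spot_card.png", cv2.IMREAD_GRAYSCALE)
    # Otsu threshold, inverted so dark specks become foreground.
    _, binary = cv2.threshold(img, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Connected components; skip label 0 (background) and tiny noise blobs.
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(binary)
    min_area = 4  # pixels; tune to the scan resolution
    spots = sum(1 for i in range(1, n_labels)
                if stats[i, cv2.CC_STAT_AREA] >= min_area)
    print(spots)
    ```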

  20. An affordable cuff-less blood pressure estimation solution.

    PubMed

    Jain, Monika; Kumar, Niranjan; Deb, Sujay

    2016-08-01

    This paper presents a cuff-less hypertension pre-screening device that non-invasively monitors Blood Pressure (BP) and Heart Rate (HR) continuously. The proposed device simultaneously records two clinically significant and highly correlated biomedical signals, viz., the Electrocardiogram (ECG) and Photoplethysmogram (PPG). The device provides a common data acquisition platform that can interface with a PC/laptop, smartphone/tablet, Raspberry Pi, etc. The hardware stores and processes the recorded ECG and PPG in order to extract the real-time BP and HR using a kernel regression approach. The BP and HR estimation error is measured in terms of normalized mean square error, Error Standard Deviation (ESD) and Mean Absolute Error (MAE), with respect to a clinically proven digital BP monitor (OMRON HBP1300). The computed error falls under the maximum allowable error specified by the Association for the Advancement of Medical Instrumentation: MAE ≤ 5 mmHg and ESD ≤ 8 mmHg. The results are also validated using a two-tailed dependent-sample t-test. The proposed device is a portable, low-cost home- and clinic-based solution for continuous health monitoring.
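
    The abstract names a kernel regression approach without further detail; as one concrete, hedged instance (not the authors' model), kernel ridge regression from scikit-learn mapping per-beat ECG/PPG features to cuff-referenced systolic BP, with file names and kernel settings assumed:

    ```python
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    X = np.load("ecg_ppg_features.npy")   # (n_beats, n_features), hypothetical
    y = np.load("reference_sbp.npy")      # cuff-referenced systolic BP

    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
    model.fit(X[:-100], y[:-100])
    err = model.predict(X[-100:]) - y[-100:]

    mae = np.mean(np.abs(err))   # Mean Absolute Error
    esd = np.std(err)            # Error Standard Deviation
    print(f"MAE={mae:.1f} mmHg, ESD={esd:.1f} mmHg")
    ```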

  1. Automatic vehicle monitoring systems study. Report of phase O. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A set of planning guidelines is presented to help law enforcement agencies and vehicle fleet operators decide which automatic vehicle monitoring (AVM) system could best meet their performance requirements. Improvements in emergency response times and resultant cost benefits obtainable with various operational and planned AVM systems may be synthesized and simulated by means of special computer programs for model city parameters applicable to small, medium, and large urban areas. Design characteristics of various AVM systems and the implementation requirements are illustrated and cost estimated for the vehicles, the fixed sites, and the base equipments. Vehicle location accuracies for different RF links and polling intervals are analyzed.

  2. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    PubMed Central

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data. PMID:22163811

  3. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    PubMed

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data.

  4. SCADA-based Operator Support System for Power Plant Equipment Fault Forecasting

    NASA Astrophysics Data System (ADS)

    Mayadevi, N.; Ushakumari, S. S.; Vinodchandra, S. S.

    2014-12-01

    Power plant equipment must be monitored closely to prevent failures from disrupting plant availability. Online monitoring technology integrated with hybrid forecasting techniques can be used to prevent plant equipment faults. A self-learning rule-based expert system is proposed in this paper for fault forecasting in power plants controlled by a supervisory control and data acquisition (SCADA) system. Self-learning utilizes associative data mining algorithms on the SCADA history database to form new rules that can dynamically update the knowledge base of the rule-based expert system. In this study, a number of popular associative learning algorithms are considered for rule formation. Data mining results show that the Tertius algorithm is best suited for developing a learning engine for power plants. For real-time monitoring of the plant condition, graphical models are constructed by K-means clustering. To build a time-series forecasting model, a multilayer perceptron (MLP) is used. Once created, the models are updated in the model library to provide an adaptive environment for the proposed system. A graphical user interface (GUI) illustrates the variation of all sensor values affecting a particular alarm/fault, as well as the step-by-step procedure for avoiding critical situations and consequent plant shutdown. The forecasting performance is evaluated by computing the mean absolute error and root mean square error of the predictions.
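
    As a sketch of the forecasting component alone (the Tertius rule mining and K-means graphical models are omitted, and the window length and network topology are assumptions), a lag-window MLP evaluated with the paper's two error metrics:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    series = np.load("scada_sensor.npy")  # one SCADA channel, hypothetical
    lags = 16
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = series[lags:]

    split = int(0.8 * len(X))
    mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000)
    mlp.fit(X[:split], y[:split])
    pred = mlp.predict(X[split:])

    print("MAE :", mean_absolute_error(y[split:], pred))
    print("RMSE:", np.sqrt(mean_squared_error(y[split:], pred)))
    ```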

  5. Visual-servoing optical microscopy

    DOEpatents

    Callahan, Daniel E.; Parvin, Bahram

    2009-06-09

    The present invention provides methods and devices for the knowledge-based discovery and optimization of differences between cell types. In particular, the present invention provides visual servoing optical microscopy, as well as analysis methods. The present invention provides means for the close monitoring of hundreds of individual, living cells over time: quantification of dynamic physiological responses in multiple channels; real-time digital image segmentation and analysis; intelligent, repetitive computer-applied cell stress and cell stimulation; and the ability to return to the same field of cells for long-term studies and observation. The present invention further provides means to optimize culture conditions for specific subpopulations of cells.

  6. Visual-servoing optical microscopy

    DOEpatents

    Callahan, Daniel E [Martinez, CA; Parvin, Bahram [Mill Valley, CA

    2011-05-24

    The present invention provides methods and devices for the knowledge-based discovery and optimization of differences between cell types. In particular, the present invention provides visual servoing optical microscopy, as well as analysis methods. The present invention provides means for the close monitoring of hundreds of individual, living cells over time; quantification of dynamic physiological responses in multiple channels; real-time digital image segmentation and analysis; intelligent, repetitive computer-applied cell stress and cell stimulation; and the ability to return to the same field of cells for long-term studies and observation. The present invention further provides means to optimize culture conditions for specific subpopulations of cells.

  7. Visual-servoing optical microscopy

    DOEpatents

    Callahan, Daniel E; Parvin, Bahram

    2013-10-01

    The present invention provides methods and devices for the knowledge-based discovery and optimization of differences between cell types. In particular, the present invention provides visual servoing optical microscopy, as well as analysis methods. The present invention provides means for the close monitoring of hundreds of individual, living cells over time; quantification of dynamic physiological responses in multiple channels; real-time digital image segmentation and analysis; intelligent, repetitive computer-applied cell stress and cell stimulation; and the ability to return to the same field of cells for long-term studies and observation. The present invention further provides means to optimize culture conditions for specific subpopulations of cells.

  8. Transparent Proxy for Secure E-Mail

    NASA Astrophysics Data System (ADS)

    Michalák, Juraj; Hudec, Ladislav

    2010-05-01

    The paper deals with the security of e-mail messages and e-mail server implementation by means of a transparent SMTP proxy. The security features include encryption and signing of transported messages. The goal is to design and implement a software proxy for secure e-mail including its monitoring, administration, encryption and signing keys administration. In particular, we focus on automatic public key on-the-fly encryption and signing of e-mail messages according to S/MIME standard by means of an embedded computer system whose function can be briefly described as a brouter with transparent SMTP proxy.
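
    A byte-level skeleton of such a transparent proxy is sketched below; the upstream address is a placeholder, and the actual S/MIME work (certificate handling, MIME rewriting during the DATA phase) is only marked as a hook, not implemented:

    ```python
    import asyncio

    UPSTREAM = ("smtp.internal.example", 25)  # placeholder SMTP server

    async def pipe(reader, writer, transform=None):
        # Forward bytes one direction; `transform` is where a real proxy
        # would buffer the DATA phase and apply S/MIME signing/encryption.
        while data := await reader.read(4096):
            writer.write(transform(data) if transform else data)
            await writer.drain()
        writer.close()

    async def handle(client_r, client_w):
        server_r, server_w = await asyncio.open_connection(*UPSTREAM)
        await asyncio.gather(pipe(client_r, server_w),   # client -> server
                             pipe(server_r, client_w))   # server -> client

    async def main():
        server = await asyncio.start_server(handle, "0.0.0.0", 2525)
        async with server:
            await server.serve_forever()

    asyncio.run(main())
    ```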

  9. Monitoring and control of amygdala neurofeedback involves distributed information processing in the human brain.

    PubMed

    Paret, Christian; Zähringer, Jenny; Ruf, Matthias; Gerchen, Martin Fungisai; Mall, Stephanie; Hendler, Talma; Schmahl, Christian; Ende, Gabriele

    2018-03-30

    Brain-computer interfaces provide conscious access to neural activity by means of brain-derived feedback ("neurofeedback"). An individual's abilities to monitor and control feedback are two necessary processes for effective neurofeedback therapy, yet their underlying functional neuroanatomy is still being debated. In this study, healthy subjects received visual feedback from their amygdala response to negative pictures. Activation and functional connectivity were analyzed to disentangle the role of brain regions in different processes. Feedback monitoring was mapped to the thalamus, ventromedial prefrontal cortex (vmPFC), ventral striatum (VS), and rostral PFC. The VS responded to feedback corresponding to instructions while rPFC activity differentiated between conditions and predicted amygdala regulation. Control involved the lateral PFC, anterior cingulate, and insula. Monitoring and control activity overlapped in the VS and thalamus. Extending current neural models of neurofeedback, this study introduces monitoring and control of feedback as anatomically dissociated processes, and suggests their important role in voluntary neuromodulation. © 2018 Wiley Periodicals, Inc.

  10. Role of Statistical Random-Effects Linear Models in Personalized Medicine.

    PubMed

    Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose

    2012-03-01

    Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization.
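
    The random-effects machinery is readily exercised in Python; a hedged sketch with statsmodels on a hypothetical therapeutic-drug-monitoring dataset, where the fitted model's per-patient random effects are the empirical Bayes quantities from which an individualized dosage would be derived:

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Steady-state concentration vs. dose across patients
    # (file and column names are hypothetical).
    df = pd.read_csv("tdm_records.csv")

    # Random intercept and dose slope for each patient.
    model = smf.mixedlm("concentration ~ dose", df,
                        groups=df["patient_id"], re_formula="~dose")
    fit = model.fit()

    # Empirical Bayes estimates of one patient's deviation from the
    # population parameters (patient ID is hypothetical).
    print(fit.random_effects["P001"])
    ```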

  11. Using a software-defined computer in teaching the basics of computer architecture and operation

    NASA Astrophysics Data System (ADS)

    Kosowska, Julia; Mazur, Grzegorz

    2017-08-01

    The paper describes the concept and implementation of the SDC_One software-defined computer designed for experimental and didactic purposes. Equipped with extensive hardware monitoring mechanisms, the device enables students to monitor the computer's operation on a bus-transfer-cycle or instruction-cycle basis, providing a practical illustration of basic aspects of a computer's operation. In the paper, we describe the hardware monitoring capabilities of SDC_One and some scenarios for using it in teaching the basics of computer architecture and microprocessor operation.

  12. The effect of baseline pressure errors on an intracranial pressure-derived index: results of a prospective observational study

    PubMed Central

    2014-01-01

    Background: In order to characterize the intracranial pressure-volume reserve capacity, the correlation coefficient (R) between the ICP wave amplitude (A) and the mean ICP level (P), the RAP index, has been used to improve the diagnostic value of ICP monitoring. Baseline pressure errors (BPEs), caused by spontaneous shifts or drifts in baseline pressure, cause erroneous readings of mean ICP. Consequently, BPEs could also affect ICP indices such as RAP, wherein the mean ICP is incorporated. Methods: A prospective, observational study was carried out on patients with aneurysmal subarachnoid hemorrhage (aSAH) undergoing ICP monitoring as part of their surveillance. Via the same burr hole in the skull, two separate ICP sensors were placed close to each other. For each consecutive 6-sec time window, the dynamic mean ICP wave amplitude (MWA; a measure of the amplitude of the single pressure waves) and the static mean ICP were computed. The RAP index was computed as the Pearson correlation coefficient between the MWA and the mean ICP for 40 6-sec time windows, i.e. every subsequent 4-min period (method 1). We compared this approach with a method of calculating RAP using a 4-min moving window updated every 6 seconds (method 2). Results: The study included 16 aSAH patients. We compared 43,653 4-min RAP observations of signals 1 and 2 (method 1), and 1,727,000 6-sec RAP observations (method 2). The two methods of calculating RAP produced similar results. Differences in RAP ≥0.4 in at least 7% of observations were seen in 5/16 (31%) patients. Moreover, the combination of a RAP of ≥0.6 in one signal and <0.6 in the other was seen in ≥13% of RAP observations in 4/16 (25%) patients, and in ≥8% in another 4/16 (25%) patients. The frequency of differences in RAP >0.2 was significantly associated with the frequency of BPEs (5 mmHg ≤ BPE <10 mmHg). Conclusions: Simultaneous monitoring from two separate, close-by ICP sensors reveals significant differences in RAP that correspond to the occurrence of BPEs. As differences in RAP are of magnitudes that may alter patient management, we do not advocate the use of RAP in the management of neurosurgical patients. PMID:25052470
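
    Both RAP computations described above are short enough to state directly; a NumPy sketch taking the per-6-second MWA and mean ICP series as input:

    ```python
    import numpy as np

    def rap_blocks(mwa, mean_icp, win=40):
        """Method 1: one RAP value per consecutive block of 40 six-second
        windows (4 min): the Pearson correlation of MWA vs. mean ICP."""
        return np.array([np.corrcoef(mwa[i:i + win], mean_icp[i:i + win])[0, 1]
                         for i in range(0, len(mwa) - win + 1, win)])

    def rap_moving(mwa, mean_icp, win=40):
        """Method 2: a 4-min window advanced one 6-s step at a time."""
        return np.array([np.corrcoef(mwa[i:i + win], mean_icp[i:i + win])[0, 1]
                         for i in range(len(mwa) - win + 1)])
    ```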

  13. Estimated nitrogen loads from selected tributaries in Connecticut draining to Long Island Sound, 1999–2009

    USGS Publications Warehouse

    Mullaney, John R.; Schwarz, Gregory E.

    2013-01-01

    The total nitrogen load to Long Island Sound from Connecticut and contributing areas to the north was estimated for October 1998 to September 2009. Discrete measurements of total nitrogen concentrations and continuous flow data from 37 water-quality monitoring stations in the Long Island Sound watershed were used to compute total annual nitrogen yields and loads. Total annual computed yields and basin characteristics were used to develop a generalized-least squares regression model for use in estimating the total nitrogen yields from unmonitored areas in coastal and central Connecticut. Significant variables in the regression included the percentage of developed land, percentage of row crops, point-source nitrogen yields from wastewater-treatment facilities, and annual mean streamflow. Computed annual median total nitrogen yields at individual monitoring stations ranged from less than 2,000 pounds per square mile in mostly forested basins (typically less than 10 percent developed land) to more than 13,000 pounds per square mile in urban basins (greater than 40 percent developed) with wastewater-treatment facilities and in one agricultural basin. Medians of computed total annual nitrogen yields for water years 1999–2009 at most stations were similar to those previously computed for water years 1988–98. However, computed medians of annual yields at several stations, including the Naugatuck River, Quinnipiac River, and Hockanum River, were lower than during 1988–98. Nitrogen yields estimated for 26 unmonitored areas downstream from monitoring stations ranged from less than 2,000 pounds per square mile to 34,000 pounds per square mile. Computed annual total nitrogen loads at the farthest downstream monitoring stations were combined with the corresponding estimates for the downstream unmonitored areas for a combined estimate of the total nitrogen load from the entire study area. Resulting combined total nitrogen loads ranged from 38 to 68 million pounds per year during water years 1999–2009. Total annual loads from the monitored basins represent 63 to 74 percent of the total load. Computed annual nitrogen loads from four stations near the Massachusetts border with Connecticut represent 52 to 54 percent of the total nitrogen load during water years 2008–9, the only years with data for all the border sites. During the latter part of the 1999–2009 study period, total nitrogen loads to Long Island Sound from the study area appeared to increase slightly. The apparent increase in loads may be due to higher than normal streamflows, which consequently increased nonpoint nitrogen loads during the study, offsetting major reductions of nitrogen from wastewater-treatment facilities. Nitrogen loads from wastewater treatment facilities declined as much as 2.3 million pounds per year in areas of Connecticut upstream from the monitoring stations and as much as 5.8 million pounds per year in unmonitored areas downstream in coastal and central Connecticut.

  14. Automatic liver tumor segmentation on computed tomography for patient treatment planning and monitoring

    PubMed Central

    Moghbel, Mehrdad; Mashohor, Syamsiah; Mahmud, Rozi; Saripan, M. Iqbal Bin

    2016-01-01

    Segmentation of liver tumors from Computed Tomography (CT) and tumor burden analysis play an important role in the choice of therapeutic strategies for liver diseases and treatment monitoring. In this paper, a new segmentation method for liver tumors from contrast-enhanced CT imaging is proposed. As manual segmentation of tumors for liver treatment planning is both labor intensive and time-consuming, a highly accurate automatic tumor segmentation is desired. The proposed framework is fully automatic, requiring no user interaction. The proposed segmentation, evaluated on real-world clinical data from patients, is based on a hybrid method integrating cuckoo optimization and the fuzzy c-means algorithm with the random walkers algorithm. The accuracy of the proposed method was validated using a clinical liver dataset with one of the highest numbers of tumors utilized for liver tumor segmentation (127 tumors in total), with further validation of the results by a consultant radiologist. The proposed method achieved one of the highest accuracies reported in the literature for liver tumor segmentation, with a mean overlap error of 22.78 % and Dice similarity coefficient of 0.75 on the 3Dircadb dataset and a mean overlap error of 15.61 % and Dice similarity coefficient of 0.81 on the MIDAS dataset. The proposed method outperformed most other tumor segmentation methods reported in the literature, representing an overlap error improvement of 6 % compared to one of the best performing automatic methods in the literature. The proposed framework provided consistently accurate results considering the number of tumors and the variations in tumor contrast enhancements and tumor appearances, while the tumor burden was estimated with a mean error of 0.84 % in the 3Dircadb dataset. PMID:27540353
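
    A compact sketch of the hybrid idea on a single 2-D slice: plain fuzzy c-means proposes intensity labels, high-confidence pixels become seeds, and scikit-image's random walker completes the segmentation. The cuckoo-optimization tuning step is omitted and the input file is hypothetical:

    ```python
    import numpy as np
    from skimage.segmentation import random_walker

    def fuzzy_cmeans(values, k=3, m=2.0, iters=50):
        """Plain fuzzy c-means on a 1-D intensity vector."""
        centers = np.linspace(values.min(), values.max(), k)
        for _ in range(iters):
            d = np.abs(values[:, None] - centers[None, :]) + 1e-9
            u = 1.0 / d ** (2.0 / (m - 1.0))        # membership weights
            u /= u.sum(axis=1, keepdims=True)
            centers = (u**m * values[:, None]).sum(0) / (u**m).sum(0)
        return u.argmax(axis=1), u.max(axis=1)

    ct = np.load("liver_roi.npy")                   # hypothetical CT slice
    labels, conf = fuzzy_cmeans(ct.ravel())
    seeds = np.zeros(ct.size, dtype=int)
    seeds[conf > 0.9] = labels[conf > 0.9] + 1      # confident pixels only
    segmentation = random_walker(ct, seeds.reshape(ct.shape), beta=130)
    ```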

  15. Autonomic Computing: Panacea or Poppycock?

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy; Hinchey, Mike

    2005-01-01

    Autonomic Computing arose out of a need for a means to cope with the rapidly growing complexity of integrating, managing, and operating computer-based systems, as well as a need to reduce the total cost of ownership of today's systems. Autonomic Computing (AC) as a discipline was proposed by IBM in 2001, with the vision to develop self-managing systems. As the name implies, the influence for the new paradigm is the human body's autonomic system, which regulates vital bodily functions such as the control of heart rate, the body's temperature and blood flow, all without conscious effort. The vision is to create selfware through self-* properties. The initial set of properties, in terms of objectives, was self-configuring, self-healing, self-optimizing and self-protecting, along with attributes of self-awareness, self-monitoring and self-adjusting. This self-* list has grown: self-anticipating, self-critical, self-defining, self-destructing, self-diagnosis, self-governing, self-organized, self-reflecting, and self-simulation, for instance.

  16. Processing of the WLCG monitoring data using NoSQL

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  17. Usefulness of LANDSAT data for monitoring plant development and range conditions in California's annual grassland

    NASA Technical Reports Server (NTRS)

    Carneggie, D. M.; Degloria, S. D.; Colwell, R. N.

    1975-01-01

    A network of sampling sites throughout the annual grassland region of California was established to correlate plant growth stages and forage production to climatic and other environmental factors. Plant growth and range conditions were further related to geographic location and seasonal variations. A sequence of LANDSAT data was obtained covering critical periods in the growth cycle. This was analyzed by both photointerpretation and computer aided techniques. Image characteristics and spectral reflectance data were then related to forage production, range condition, range site and changing growth conditions. It was determined that repeat sequences with LANDSAT color composite images do provide a means for monitoring changes in range condition. Spectral radiance data obtained from magnetic tape can be used to determine quantitatively the critical stages in the forage growth cycle. A computer ratioing technique provided a sensitive indicator of changes in growth stages and an indication of the relative differences in forage production between range sites.

  18. "Smart" Sensor Module

    NASA Technical Reports Server (NTRS)

    Mahajan, Ajay

    2007-01-01

    An assembly that contains a sensor, sensor-signal-conditioning circuitry, a sensor-readout analog-to-digital converter (ADC), data-storage circuitry, and a microprocessor that runs special-purpose software and communicates with one or more external computer(s) has been developed as a prototype of "smart" sensor modules for monitoring the integrity and functionality (the "health") of engineering systems. Although these modules are now being designed specifically for use on rocket-engine test stands, it is anticipated that they could also readily be designed to be incorporated into health-monitoring subsystems of such diverse engineering systems as spacecraft, aircraft, land vehicles, bridges, buildings, power plants, oilrigs, and defense installations. The figure is a simplified block diagram of the "smart" sensor module. The analog sensor readout signal is processed by the ADC, the digital output of which is fed to the microprocessor. By means of a standard RS-232 cable, the microprocessor is connected to a local personal computer (PC), from which software is downloaded into a random-access memory in the microprocessor. The local PC is also used to debug the software. Once the software is running, the local PC is disconnected and the module is controlled by, and all output data from the module are collected by, a remote PC via an Ethernet bus. Several smart sensor modules like this one could be connected to the same Ethernet bus and controlled by the single remote PC. The software running in the microprocessor includes driver programs for operation of the sensor, programs that implement self-assessment algorithms, programs that implement protocols for communication with the external computer(s), and programs that implement evolutionary methodologies to enable the module to improve its performance over time. The design of the module and of the health-monitoring system of which it is a part reflects the understanding that the main purpose of a health-monitoring system is to detect damage and, therefore, the health-monitoring system must be able to function effectively in the presence of damage and should be capable of distinguishing between damage to itself and damage to the system being monitored. A major benefit afforded by the self-assessment algorithms is that in the output of the module, the sensor data indicative of the health of the engineering system being monitored are coupled with a confidence factor that quantifies the degree of reliability of the data. Hence, the output includes information on the health of the sensor module itself in addition to information on the health of the engineering system being monitored.
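
    A toy sketch of the host-side loop the module implies: reading digitized samples over RS-232 with pyserial and attaching a confidence factor from a trivial self-assessment check. Port name, baud rate, line format, and ADC width are all assumptions:

    ```python
    import serial  # pyserial

    port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1.0)

    def read_sample():
        """Return (value, confidence) for one sensor reading."""
        line = port.readline().decode("ascii").strip()   # e.g. "417"
        value = int(line)
        # Toy self-assessment: readings pinned at the ADC rails are suspect
        # (a 10-bit ADC is assumed here).
        confidence = 0.1 if value in (0, 1023) else 1.0
        return value, confidence
    ```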

  19. Predicting adult pulmonary ventilation volume and wearing complianceby on-board accelerometry during personal level exposure assessments

    NASA Astrophysics Data System (ADS)

    Rodes, C. E.; Chillrud, S. N.; Haskell, W. L.; Intille, S. S.; Albinali, F.; Rosenberger, M. E.

    2012-09-01

    Background: Metabolic functions typically increase with human activity, but optimal methods to characterize activity levels for real-time predictions of ventilation volume (l min-1) during exposure assessments have not been available. Could tiny, triaxial accelerometers be incorporated into personal level monitors to define periods of acceptable wearing compliance, and allow the exposures (μg m-3) to be extended to potential doses in μg min-1 kg-1 of body weight? Objectives: In a pilot effort, we tested: 1) whether appropriately-processed accelerometer data could be utilized to predict compliance and in linear regressions to predict ventilation volumes in real-time as an on-board component of personal level exposure sensor systems, and 2) whether locating the exposure monitors on the chest in the breathing zone provided comparable accelerometric data to other locations more typically utilized (waist, thigh, wrist, etc.). Methods: Prototype exposure monitors from RTI International and Columbia University were worn on the chest by a pilot cohort of adults while conducting an array of scripted activities (all <10 METS), spanning common recumbent, sedentary, and ambulatory activity categories. Referee Wocket accelerometers placed at various body locations allowed comparison with the chest-located exposure sensor accelerometers. An Oxycon Mobile mask was used to measure oral-nasal ventilation volumes in-situ. For the subset of participants with complete data (n = 22), linear regressions were constructed (processed accelerometric variable versus ventilation rate) for each participant and exposure monitor type, and Pearson correlations computed to compare across scenarios. Results: Triaxial accelerometer data were demonstrated to be adequately sensitive indicators for predicting exposure monitor wearing compliance. Strong linear correlations (R values from 0.77 to 0.99) were observed for all participants for both exposure sensor accelerometer variables against ventilation volume for recumbent, sedentary, and ambulatory activities with MET values below approximately 6. The RTI monitors' mean R value of 0.91 was slightly higher than the Columbia monitors' mean of 0.86, due to utilizing a 20 Hz data rate instead of a slower 1 Hz rate. A nominal mean regression slope was computed for the RTI system across participants and showed a modest RSD of +/-36.6%. Comparison of the correlation values of the exposure monitors with the Wocket accelerometers at various body locations showed statistically identical regressions for all sensors at alternate hip, ankle, upper arm, thigh, and pocket locations, but not for the Wocket accelerometer located at the dominant-side wrist location (R = 0.57; p = 0.016). Conclusions: Even with a modest number of adult volunteers, the consistency and linearity of regression slopes for all subjects were very good, with excellent within-person Pearson correlations for the accelerometer versus ventilation volume data. Computing accelerometric standard deviations allowed good sensitivity for compliance assessments even for sedentary activities. These pilot findings supported the hypothesis that a common linear regression is likely to be usable for a wider range of adults to predict ventilation volumes from accelerometry data over a range of low to moderate energy level activities. The predicted volumes would then allow real-time estimates of potential dose, enabling more robust panel studies. The poorer correlation in predicting ventilation rate for an accelerometer located on the wrist suggested that this location should not be considered for predictions of ventilation volume.
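
    A minimal sketch of the per-participant regression, using the standard deviation of the triaxial acceleration magnitude per epoch as the predictor; the sampling rate, epoch length, and file layout are assumptions, not the study's exact processing:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    acc = np.load("chest_accel.npy")      # (n_samples, 3) at 20 Hz, hypothetical
    ve = np.load("ventilation_lpm.npy")   # one volume per 30-s epoch

    mag = np.linalg.norm(acc, axis=1)
    epochs = mag[: len(ve) * 600].reshape(len(ve), 600)  # 600 samples = 30 s
    feature = epochs.std(axis=1, keepdims=True)          # compliance/VE feature

    reg = LinearRegression().fit(feature, ve)
    print("R =", np.corrcoef(reg.predict(feature).ravel(), ve)[0, 1])
    ```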

  20. Incipient fault detection study for advanced spacecraft systems

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.

    1986-01-01

    A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) eight different analysis techniques including signature analysis, high frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition are compared; and (6) small sample statistical analysis is used to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
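
    Of the techniques listed, the cepstrum is compact enough to sketch; the real cepstrum below turns periodic impact trains typical of rotating-machinery faults into peaks at the corresponding quefrency (the windowing choice is an assumption):

    ```python
    import numpy as np

    def real_cepstrum(x):
        """Real cepstrum: inverse FFT of the log magnitude spectrum."""
        spectrum = np.fft.rfft(x * np.hanning(len(x)))
        return np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))
    ```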

  1. Influence of room lighting on grey-scale perception with a CRT-and a TFT monitor display.

    PubMed

    Haak, R; Wicht, M J; Hellmich, M; Nowak, G; Noack, M J

    2002-05-01

    To determine the influence of ambient lighting on grey-scale perception, a cathode-ray tube (CRT) and a thin film transistor (TFT) computer display were compared. A cathode ray tube (Nokia XS 446) and a liquid crystal display (Panasonic LC 50S) were used at reduced room lighting (70 lux) and under conditions recommended for a dental operatory (1000 lux). Twenty-seven observers twice examined a modified SMPTE test pattern covering grey-scale values [0 to 255; 255 to 0]. The corresponding contrast differences were allocated to four ranges of grey levels (I: 0-63; II: 64-127; III: 128-191; IV: 192-255). The influences of monitor type, grey-scale range and illumination were evaluated by means of repeated measures analysis of variance. Detection of differences in monochromatic intensity occurred significantly earlier with reduced lighting (P<0.0001). When full ambient lighting was used, the TFT display was superior to the CRT monitor in ranges II and III (P<0.0001), whereas no differences could be detected for grey intensities between 0 and 63 (P=0.71) and between 192 and 255 (P=0.36). Background lighting hampers grey-scale perception on computer displays. In this study of one TFT and one CRT monitor, the TFT in full ambient lighting was associated with earlier detection of grey-scale differences than the CRT.

  2. Corrosion sensor

    DOEpatents

    Glass, Robert S.; Clarke, Jr., Willis L.; Ciarlo, Dino R.

    1994-01-01

    A corrosion sensor array incorporating individual elements for measuring various elements and ions, such as chloride, sulfide, copper, hydrogen (pH), etc. and elements for evaluating the instantaneous corrosion properties of structural materials. The exact combination and number of elements measured or monitored would depend upon the environmental conditions and materials used which are subject to corrosive effects. Such a corrosion monitoring system embedded in or mounted on a structure exposed to the environment would serve as an early warning system for the onset of severe corrosion problems for the structure, thus providing a safety factor as well as economic factors. The sensor array is accessed to an electronics/computational system, which provides a means for data collection and analysis.

  3. Corrosion sensor

    DOEpatents

    Glass, R.S.; Clarke, W.L. Jr.; Ciarlo, D.R.

    1994-04-26

    A corrosion sensor array is described incorporating individual elements for measuring various elements and ions, such as chloride, sulfide, copper, hydrogen (pH), etc. and elements for evaluating the instantaneous corrosion properties of structural materials. The exact combination and number of elements measured or monitored would depend upon the environmental conditions and materials used which are subject to corrosive effects. Such a corrosion monitoring system embedded in or mounted on a structure exposed to the environment would serve as an early warning system for the onset of severe corrosion problems for the structure, thus providing a safety factor as well as economic factors. The sensor array is accessed to an electronics/computational system, which provides a means for data collection and analysis. 7 figures.

  4. Automatic visual monitoring of welding procedure in stainless steel kegs

    NASA Astrophysics Data System (ADS)

    Leo, Marco; Del Coco, Marco; Carcagnì, Pierluigi; Spagnolo, Paolo; Mazzeo, Pier Luigi; Distante, Cosimo; Zecca, Raffaele

    2018-05-01

    In this paper a system for automatic visual monitoring of the welding process in dry stainless steel kegs for food storage is proposed. In the considered manufacturing process the upper and lower skirts are welded to the vessel by means of Tungsten Inert Gas (TIG) welding. During the process several problems can arise: 1) residuals on the bottom; 2) darker weld; 3) excessive/poor penetration; and 4) outgrowths. The proposed system deals with all four aforementioned problems, and its inspection performance has been evaluated on a large set of kegs, demonstrating both reliability in terms of defect detection and suitability for introduction into the manufacturing system in terms of computational cost.

  5. Energy expenditure prediction via a footwear-based physical activity monitor: Accuracy and comparison to other devices

    NASA Astrophysics Data System (ADS)

    Dannecker, Kathryn

    2011-12-01

    Accurately estimating free-living energy expenditure (EE) is important for monitoring or altering energy balance and quantifying levels of physical activity. The use of accelerometers to monitor physical activity and estimate physical activity EE is common in both research and consumer settings. Recent advances in physical activity monitors include the ability to identify specific activities (e.g. stand vs. walk), which has resulted in improved EE estimation accuracy. Recently, a multi-sensor footwear-based physical activity monitor that is capable of achieving 98% activity identification accuracy has been developed. However, no study has determined the EE estimation accuracy for this monitor and compared it to other similar devices. Purpose. To determine the accuracy of physical activity EE estimation of a footwear-based physical activity monitor that uses an embedded accelerometer and insole pressure sensors, and to compare this accuracy against a variety of research and consumer physical activity monitors. Methods. Nineteen adults (10 male, 9 female), mass: 75.14 (17.1) kg, BMI: 25.07 (4.6) kg/m2 (mean (SD)), completed a four-hour stay in a room calorimeter. Participants wore a footwear-based physical activity monitor, as well as three physical activity monitoring devices used in research: hip-mounted Actical and Actigraph accelerometers and a multi-accelerometer IDEEA device with sensors secured to the limb and chest. In addition, participants wore two consumer devices: Philips DirectLife and Fitbit. Each individual performed a series of randomly assigned and ordered postures/activities including lying, sitting (quietly and using a computer), standing, walking, stepping, cycling, sweeping, as well as a period of self-selected activities. We developed branched (i.e. activity-specific) linear regression models to estimate EE from the footwear-based device, and we used the manufacturer's software to estimate EE for all other devices. Results. The shoe-based device was not significantly different from the mean measured EE (476(20) vs. 478(18) kcal) (Mean(SE)), respectively, and had the lowest root mean square error (RMSE) by two-fold (29.6 kcal (6.19%)). The IDEEA (445(23) kcal) and DirectLife (449(13) kcal) estimates of EE were also not different from the measured EE. The Actigraph, Fitbit and Actical devices significantly underestimated EE (339(19) kcal, 363(18) kcal and 383(17) kcal, respectively (p<.05)). Root mean square errors were 62.1 kcal (14%), 88.2 kcal (18%), 122.2 kcal (27%), 130.1 kcal (26%), and 143.2 kcal (28%) for DirectLife, IDEEA, Actigraph, Actical and Fitbit, respectively. Conclusions. The shoe-based physical activity monitor was able to accurately estimate EE. The research and consumer physical activity monitors tested have a wide range of accuracy when estimating EE. Given the similar hardware of these devices, these results suggest that the algorithms used to estimate EE are primarily responsible for their accuracy, particularly the ability of the shoe-based device to estimate EE based on activity classifications.
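
    The branched idea is simply one linear EE model per identified activity class, selected at prediction time; a hedged sketch (data layout and class coding are assumptions, not the study's fitted models):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    X = np.load("shoe_features.npy")       # accelerometer + insole pressure
    activity = np.load("activity_id.npy")  # e.g. 0=lie, 1=sit, 2=stand, ...
    ee = np.load("calorimeter_ee.npy")     # reference EE per epoch

    branches = {a: LinearRegression().fit(X[activity == a], ee[activity == a])
                for a in np.unique(activity)}

    def estimate_ee(x, a):
        """Route one epoch's features to its activity-specific model."""
        return branches[a].predict(x.reshape(1, -1))[0]
    ```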

  6. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damages of an instrumented structure without necessitating the mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance using Internet technology and resources. The main challenges in developing such a framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as to classify the data to trace anomalies in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.

  7. Newberry Combined Gravity 2016

    DOE Data Explorer

    Kelly Rose

    2016-01-22

    Newberry combined gravity from Zonge Int'l, processed for the EGS stimulation project at well 55-29. Includes data from both the Davenport 2006 collection and the OSU/4D EGS monitoring 2012 collection. Locations are NAD83, UTM Zone 10 North, meters. Elevation is NAVD88. Gravity is in milligals. Free air and observed gravity are included, along with the simple Bouguer anomaly and the terrain-corrected Bouguer anomaly. SBA230 means the simple Bouguer anomaly computed at 2.30 g/cc; CBA230 means the terrain-corrected Bouguer anomaly at 2.30 g/cc. The following densities are included (g/cc): 2.00, 2.10, 2.20, 2.30, 2.40, 2.50, 2.67.

  8. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting problems are related to the study of complex processes under real physical conditions. Solving such problems requires the use of high performance computing systems of various types, for example, GRID systems and HPC clusters. Given such time-consuming computational tasks, the need arises for software for automatic and unified monitoring of the computations. A complex computational task can be performed over different HPC systems; this requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem: it requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes a prototype of a cloud service intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for the computational management of these calculations, and presents the part of its concept aimed at initial data generation on HPC systems.

  9. Role of Statistical Random-Effects Linear Models in Personalized Medicine

    PubMed Central

    Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose

    2012-01-01

    Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization. PMID:23467392

  10. Mercury in Precipitation in Indiana, January 2004-December 2005

    USGS Publications Warehouse

    Risch, Martin R.; Fowler, Kathleen K.

    2008-01-01

    Mercury in precipitation was monitored during 2004-2005 at five locations in Indiana as part of the National Atmospheric Deposition Program-Mercury Deposition Network (NADP-MDN). Monitoring stations were operated at Roush Lake near Huntington, Clifty Falls State Park near Madison, Fort Harrison State Park near Indianapolis, Monroe County Regional Airport near Bloomington, and Indiana Dunes National Lakeshore near Porter. At these monitoring stations, precipitation amounts were measured continuously and weekly samples were collected for analysis of mercury by methods achieving detection limits as low as 0.05 ng/L (nanograms per liter). Wet deposition was computed as the product of mercury concentration and precipitation. The data were analyzed for seasonal patterns, temporal trends, and geographic differences. In the 2 years, 520 weekly samples were collected at the 5 monitoring stations and 448 of these samples had sufficient precipitation to compute mercury wet deposition. The 2-year mean mercury concentration at the five monitoring stations (normalized to the sample volume) was 10.6 ng/L. As a reference for comparison, the total mercury concentration in 41 percent of the samples analyzed was greater than the statewide Indiana water-quality standard for mercury (12 ng/L, protecting aquatic life) and 99 percent of the concentrations exceeded the most conservative Indiana water-quality criterion (1.3 ng/L, protecting wild mammals and birds). The normalized annual mercury concentration at Clifty Falls in 2004 was the fourth highest in the NADP-MDN in eastern North America that year. In 2005, the mercury concentrations at Clifty Falls and Indiana Dunes were the ninth highest in the NADP-MDN in eastern North America. At the five monitoring stations during the study period, the mean weekly total mercury deposition was 0.208 ug/m2 (micrograms per square meter) and mean annual total mercury deposition was 10.8 ug/m2. The annual mercury deposition at Clifty Falls in 2004 and 2005 was in the top 25 percent of the NADP-MDN stations in eastern North America. Mercury concentrations and deposition varied at the five monitoring stations during 2004-2005. Mercury concentrations in wet-deposition samples ranged from 1.2 to 116.6 ng/L and weekly mercury deposition ranged from 0.002 to 1.74 ug/m2. Data from weekly samples exhibited seasonal patterns. During April through September, total mercury concentrations and deposition were higher than the median for all samples. Annual precipitation at four of the five monitoring stations was within 10 percent of normal both years, with the exception of Indiana Dunes, where precipitation was 23 percent below normal in 2005. Episodes of high mercury deposition, which were the top 10 percent of weekly mercury deposition at the five monitoring stations, contributed 39 percent of all mercury deposition during 2004-2005. Mercury deposition more than 1.04 ug/m2 (5 times the mean weekly deposition) was recorded for 12 samples. These episodes of highest mercury deposition were recorded at all five monitoring stations, but the most (7 of 12) were at Clifty Falls and contributed 34.4 percent of the total deposition at that station during 2004-2005. Weekly samples with high mercury deposition may help to explain the differences in annual mercury deposition among the five monitoring stations in Indiana. A statistical evaluation of the monitoring data for 2001-2005 indicated several statistically significant temporal trends. 
A statewide (5-station) decrease (p = 0.007) in mercury deposition and a statewide decrease (p = 0.059) in mercury concentration were shown. Decreases in mercury deposition (p = 0.061 and p = 0.083) were observed at Roush Lake and Bloomington. A statistically significant trend was not observed for precipitation at the five monitoring stations during this 5-year period. A potential explanation for part of the statewide decrease in mercury concentration and mercury deposition was a 2
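
    The deposition computation named above reduces to simple unit bookkeeping; the sketch below is my own illustration of it (1 mm of precipitation over 1 m² is 1 L, so ng/L times mm gives ng/m²), with the weekly precipitation value assumed.

```python
# Wet deposition as the product of mercury concentration and precipitation
# depth: ng/L * mm = ng/m^2; dividing by 1000 gives micrograms per square
# meter. Illustrative only; not USGS code.
def weekly_hg_deposition(conc_ng_per_l: float, precip_mm: float) -> float:
    """Mercury wet deposition in micrograms per square meter."""
    return conc_ng_per_l * precip_mm / 1000.0

# The report's normalized mean concentration (10.6 ng/L) with a hypothetical
# 19.6 mm of weekly precipitation reproduces a value near the reported mean
# weekly deposition of 0.208 ug/m2:
print(f"{weekly_hg_deposition(10.6, 19.6):.3f} ug/m2")
```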

  11. Real-time human collaboration monitoring and intervention

    DOEpatents

    Merkle, Peter B.; Johnson, Curtis M.; Jones, Wendell B.; Yonas, Gerold; Doser, Adele B.; Warner, David J.

    2010-07-13

    A method of and apparatus for monitoring and intervening in, in real time, a collaboration between a plurality of subjects comprising measuring indicia of physiological and cognitive states of each of the plurality of subjects, communicating the indicia to a monitoring computer system, with the monitoring computer system, comparing the indicia with one or more models of previous collaborative performance of one or more of the plurality of subjects, and with the monitoring computer system, employing the results of the comparison to communicate commands or suggestions to one or more of the plurality of subjects.

  12. Local area network with fault-checking, priorities, and redundant backup

    NASA Technical Reports Server (NTRS)

    Morales, Sergio (Inventor); Friedman, Gary L. (Inventor)

    1989-01-01

    This invention is a redundant error detecting and correcting local area networked computer system having a plurality of nodes each including a network connector board within the node for connecting to an interfacing transceiver operably attached to a network cable. There is a first network cable disposed along a path to interconnect the nodes. The first network cable includes a plurality of first interfacing transceivers attached thereto. A second network cable is disposed in parallel with the first cable and, in like manner, includes a plurality of second interfacing transceivers attached thereto. There are a plurality of three position switches each having a signal input, three outputs for individual selective connection to the input, and a control input for receiving signals designating which of the outputs is to be connected to the signal input. Each of the switches includes means for designating a response address for responding to addressed signals appearing at the control input and each of the switches further has its signal input connected to a respective one of the input/output lines from the nodes. Also, one of the three outputs is connected to a respective one of the plurality of first interfacing transceivers. There is master switch control means having an output connected to the control inputs of the plurality of three position switches and an input for receiving directive signals for outputting addressed switch position signals to the three position switches as well as monitor and control computer means having a pair of network connector boards therein connected to respective ones of one of the first interfacing transceivers and one of the second interfacing transceivers and an output connected to the input of the master switch means for monitoring the status of the networked computer system by sending messages to the nodes and receiving and verifying messages therefrom and for sending control signals to the master switch to cause the master switch to cause respective ones of the nodes to use a desired one of the first and second cables for transmitting and receiving messages and for disconnecting desired ones of the nodes from both cables.

  13. Pulse oximetry-derived respiratory rate in general care floor patients.

    PubMed

    Addison, Paul S; Watson, James N; Mestek, Michael L; Ochs, James P; Uribe, Alberto A; Bergese, Sergio D

    2015-02-01

    Respiratory rate is recognized as a clinically important parameter for monitoring respiratory status on the general care floor (GCF). Currently, intermittent manual assessment of respiratory rate is the standard of care on the GCF. This technique has several clinically-relevant shortcomings, including the following: (1) it is not a continuous measurement, (2) it is prone to observer error, and (3) it is inefficient for the clinical staff. We report here on an algorithm designed to meet clinical needs by providing respiratory rate through a standard pulse oximeter. Finger photoplethysmograms were collected from a cohort of 63 GCF patients monitored during free breathing over a 25-min period. These were processed using a novel in-house algorithm based on continuous wavelet-transform technology within an infrastructure incorporating confidence-based averaging and logical decision-making processes. The computed oximeter respiratory rates (RRoxi) were compared to an end-tidal CO2 reference rate (RRETCO2). RRETCO2 ranged from a lowest recorded value of 4.7 breaths per minute (brpm) to a highest value of 32.0 brpm. The mean respiratory rate was 16.3 brpm with standard deviation of 4.7 brpm. Excellent agreement was found between RRoxi and RRETCO2, with a mean difference of -0.48 brpm and standard deviation of 1.77 brpm. These data demonstrate that our novel respiratory rate algorithm is a potentially viable method of monitoring respiratory rate in GCF patients. This technology provides the means to facilitate continuous monitoring of respiratory rate, coupled with arterial oxygen saturation and pulse rate, using a single non-invasive sensor in low acuity settings.
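
    The published algorithm is wavelet-based with confidence-based averaging; as a much simpler stand-in, the sketch below illustrates the underlying signal model on a synthetic photoplethysmogram, recovering the respiratory rate from the amplitude modulation of the cardiac pulse by a Fourier peak search. All parameters are assumed.

```python
# Toy respiratory-rate estimate from a synthetic PPG (a simplified stand-in
# for the CWT-based method described above).
import numpy as np

fs = 25.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)                # one minute of signal
rr_true = 16.0 / 60.0                       # 16 brpm expressed in Hz
# respiration amplitude-modulates a 1.2 Hz (72 bpm) cardiac pulse:
ppg = (1 + 0.3 * np.sin(2 * np.pi * rr_true * t)) * np.sin(2 * np.pi * 1.2 * t)

envelope = np.abs(ppg)                      # crude demodulation of the pulse
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
band = (freqs >= 4 / 60) & (freqs <= 40 / 60)   # search 4-40 brpm
rr_est = freqs[band][np.argmax(spec[band])] * 60
print(f"estimated respiratory rate: {rr_est:.1f} brpm")
```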

  14. Clinical and Radiographic Response of Extramedullary Leukemia in Patients Treated With Gemtuzumab Ozogamicin.

    PubMed

    McNeil, Michael J; Parisi, Marguerite T; Hijiya, Nobuko; Meshinchi, Soheil; Cooper, Todd; Tarlock, Katherine

    2018-05-04

    Extramedullary leukemia (EML) is common in pediatric acute leukemia and can present at diagnosis or relapse. CD33 is detected on the surface of myeloid blasts in many patients with acute myelogenous leukemia and is the target of the antibody-drug conjugate gemtuzumab ozogamicin (GO). Here we present 2 patients with CD33-positive EML treated with GO. They achieved significant responses, with reduction of EML on both clinical and radiographic exams, specifically fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography, demonstrating the potential of targeted therapy with GO as a means of treating EML in patients with CD33-positive leukemia and the utility of fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography monitoring in EML.

  15. Building a robust vehicle detection and classification module

    NASA Astrophysics Data System (ADS)

    Grigoryev, Anton; Khanipov, Timur; Koptelov, Ivan; Bocharov, Dmitry; Postnikov, Vassily; Nikolaev, Dmitry

    2015-12-01

    The growing adoption of intelligent transportation systems (ITS) and autonomous driving requires robust real-time solutions for various event and object detection problems. Most real-world systems still cannot rely on computer vision algorithms alone and employ a wide range of costly additional hardware such as LIDARs. In this paper we explore the engineering challenges encountered in building a highly robust visual vehicle detection and classification module that works under a broad range of environmental and road conditions. The resulting technology is competitive with traditional non-visual means of traffic monitoring. The main focus of the paper is on software and hardware architecture, algorithm selection, and domain-specific heuristics that help the computer vision system avoid implausible answers.
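
    The paper does not spell out its heuristics; purely as a hypothetical example of the genre, the filter below rejects candidate detections whose apparent size is implausible for their position in the image under a crude perspective prior.

```python
# Hypothetical plausibility filter for vehicle detections (illustrative
# only): vehicles farther up the image should appear smaller.
def plausible(box, img_height, min_ar=0.5, max_ar=3.5):
    """box = (x, y, w, h) in pixels, y measured from the top of the image."""
    x, y, w, h = box
    aspect = w / h
    # crude perspective prior: expected height shrinks with distance
    expected_h = 0.25 * img_height * ((y + h) / img_height) ** 2
    return min_ar <= aspect <= max_ar and 0.3 * expected_h <= h <= 3 * expected_h

detections = [(100, 400, 120, 80), (300, 50, 200, 180)]
print([plausible(b, img_height=600) for b in detections])  # [True, False]
```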

  16. Effects of computer monitor-emitted radiation on oxidant/antioxidant balance in cornea and lens from rats

    PubMed Central

    Namuslu, Mehmet; Devrim, Erdinç; Durak, İlker

    2009-01-01

    Purpose This study aims to investigate the possible effects of computer monitor-emitted radiation on the oxidant/antioxidant balance in corneal and lens tissues and to observe any protective effects of vitamin C (vit C). Methods Four groups (PC monitor, PC monitor plus vitamin C, vitamin C, and control) each consisting of ten Wistar rats were studied. The study lasted for three weeks. Vitamin C was administered in oral doses of 250 mg/kg/day. The computer and computer plus vitamin C groups were exposed to computer monitors while the other groups were not. Malondialdehyde (MDA) levels and superoxide dismutase (SOD), glutathione peroxidase (GSH-Px), and catalase (CAT) activities were measured in corneal and lens tissues of the rats. Results In corneal tissue, MDA levels and CAT activity were found to increase in the computer group compared with the control group. In the computer plus vitamin C group, MDA level, SOD, and GSH-Px activities were higher and CAT activity lower than those in the computer and control groups. Regarding lens tissue, in the computer group, MDA levels and GSH-Px activity were found to increase, as compared to the control and computer plus vitamin C groups, and SOD activity was higher than that of the control group. In the computer plus vitamin C group, SOD activity was found to be higher and CAT activity to be lower than those in the control group. Conclusion The results of this study suggest that computer-monitor radiation leads to oxidative stress in the corneal and lens tissues, and that vitamin C may prevent oxidative effects in the lens. PMID:19960068

  17. LANDSAT-1 data as it has been applied for land use and water quality data by the Virginia State Water Control Board. 1: The state project. 2: Monitoring water quality from LANDSAT

    NASA Technical Reports Server (NTRS)

    Trexler, P. L.; Barker, J. L.

    1975-01-01

    LANDSAT-1 imagery has been used for water quality and land use monitoring in and around the Swift Creek and Lake Chesdin Reservoirs in Virginia. This has proved useful by (1) helping determine valid reservoir sampling stations, (2) monitoring areas not accessible by land or water, (3) giving the State a viable means of obtaining Secchi depth readings in these inaccessible areas, (4) giving an overview of trends in changing sedimentation loadings over a given time period and classifying these waters into various categories, (5) enabling the State to inventory all major lakes and reservoirs and compute their acreage, (6) monitoring land use changes in any specific area, (7) evaluating possible long-term environmental effects of nearby developments, and (8) monitoring and predicting population shifts with possible impact on water quality problems. The main problems in the long-term use of such imagery appear to be cost and the lack of consistency caused by cloud cover limitations.

  18. Study on application of dynamic monitoring of land use based on mobile GIS technology

    NASA Astrophysics Data System (ADS)

    Tian, Jingyi; Chu, Jian; Guo, Jianxing; Wang, Lixin

    2006-10-01

    Land use dynamic monitoring is an important means of keeping land use data up to date in real time. Mobile GIS technology integrates GIS, GPS, and the Internet; it can update historical data in real time with site-collected data and realize large-scale data updates with high precision. Methods for monitoring land use change with mobile GIS technology are discussed. A mobile GIS terminal was developed in-house for this study from a GPS-25 OEM receiver and a notebook computer, operating in RTD (real-time difference) mode. The mobile GIS system for dynamic land use monitoring was developed with Visual C++ as the development platform, the MapObjects control as the graphics platform, and the MSComm control as the communication platform, realizing an organic integration of GPS, GPRS, and GIS. The system provides basic functions for data processing, graphic display, graphic editing, attribute query, and navigation. Qinhuangdao city was selected as the experimental area. The study results show that the integrated mobile GIS system for dynamic land use monitoring developed in this study has practical application value.

  19. Development of an Intelligent Monitoring and Control System for a Heterogeneous Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.; Lewandowski, Henry; Homer, Patrick T.; Schlichting, Richard D.

    1996-01-01

    The NASA Numerical Propulsion System Simulation (NPSS) project is exploring the use of computer simulation to facilitate the design of new jet engines. Several key issues raised in this research are being examined in an NPSS-related research project: zooming, monitoring and control, and support for heterogeneity. The design of a simulation executive that addresses each of these issues is described. In this work, the strategy of zooming, which allows codes that model at different levels of fidelity to be integrated within a single simulation, is applied to the fan component of a turbofan propulsion system. A prototype monitoring and control system has been designed for this simulation to support experimentation with expert system techniques for active control of the simulation. An interconnection system provides a transparent means of connecting the heterogeneous systems that comprise the prototype.

  20. District Computer Concerns: Checklist for Monitoring Instructional Use of Computers.

    ERIC Educational Resources Information Center

    Coe, Merilyn

    Designed to assist those involved with planning, organizing, and implementing computer use in schools, this checklist can be applied to: (1) assess the present state of instructional computer use in the district; (2) assist with the development of plans or guidelines for computer use; (3) support a start-up phase; and (4) monitor the…

  1. Vocal activity as a low cost and scalable index of seabird colony size

    USGS Publications Warehouse

    Borker, Abraham L.; McKown, Matthew W.; Ackerman, Joshua T.; Eagles-Smith, Collin A.; Tershy, Bernie R.; Croll, Donald A.

    2014-01-01

    Although wildlife conservation actions have increased globally in number and complexity, the lack of scalable, cost-effective monitoring methods limits adaptive management and the evaluation of conservation efficacy. Automated sensors and computer-aided analyses provide a scalable and increasingly cost-effective tool for conservation monitoring. A key assumption of automated acoustic monitoring of birds is that measures of acoustic activity at colony sites are correlated with the relative abundance of nesting birds. We tested this assumption for nesting Forster's terns (Sterna forsteri) in San Francisco Bay for 2 breeding seasons. Sensors recorded ambient sound at 7 colonies that had 15–111 nests in 2009 and 2010. Colonies were spaced at least 250 m apart and ranged from 36 to 2,571 m2. We used spectrogram cross-correlation to automate the detection of tern calls from recordings. We calculated mean seasonal call rate and compared it with mean active nest count at each colony. Acoustic activity explained 71% of the variation in nest abundance between breeding sites and 88% of the change in colony size between years. These results validate a primary assumption of acoustic indices; that is, for terns, acoustic activity is correlated to relative abundance, a fundamental step toward designing rigorous and scalable acoustic monitoring programs to measure the effectiveness of conservation actions for colonial birds and other acoustically active wildlife.
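
    A toy version of the detection step, with my own synthetic audio in place of field recordings, is sketched below: a call template's spectrogram is cross-correlated against the recording's spectrogram and peaks above a threshold are counted toward a call-rate index.

```python
# Simplified spectrogram cross-correlation call detector (illustrative,
# not the authors' pipeline).
import numpy as np
from scipy.signal import spectrogram, correlate2d

fs = 8000
t = np.arange(0, 5, 1 / fs)
call = np.sin(2 * np.pi * 2000 * t[: fs // 4])        # 0.25 s tonal "call"
audio = 0.05 * np.random.randn(len(t))                # background noise
for start in (fs, 3 * fs):                            # embed two calls
    audio[start : start + len(call)] += call

_, _, sxx = spectrogram(audio, fs, nperseg=256)
_, _, template = spectrogram(call, fs, nperseg=256)

score = correlate2d(sxx, template, mode="valid").ravel()
score /= score.max()
peaks = (score[1:-1] > 0.5) & (score[1:-1] > score[:-2]) & (score[1:-1] >= score[2:])
print("detected calls:", int(peaks.sum()))
```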

  2. Applications of advanced data analysis and expert system technologies in the ATLAS Trigger-DAQ Controls framework

    NASA Astrophysics Data System (ADS)

    Avolio, G.; Corso Radu, A.; Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-12-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment is a very complex distributed computing system, composed of more than 20000 applications running on more than 2000 computers. The TDAQ Controls system has to guarantee the smooth and synchronous operation of all TDAQ components and has to provide the means to minimize the downtime of the system caused by runtime failures. During data-taking runs, streams of information messages sent or published by running applications are the main sources of knowledge about the correctness of running operations. The huge flow of operational monitoring data produced is constantly monitored by experts in order to detect problems or misbehaviours. Given the scale of the system and the rates of data to be analyzed, automation of the system functionality in the areas of operational monitoring, system verification, error detection, and recovery is a strong requirement. To accomplish its objective, the Controls system includes high-level components based on advanced software technologies, namely rule-based Expert System and Complex Event Processing engines. The chosen techniques make it possible to formalize, store, and reuse the knowledge of experts and thus to assist the shifters in the ATLAS control room during data-taking activities.

  3. Hyper-resolution monitoring of urban flooding with social media and crowdsourcing data

    NASA Astrophysics Data System (ADS)

    Wang, Ruo-Qian; Mao, Huina; Wang, Yuan; Rae, Chris; Shaw, Wesley

    2018-02-01

    Hyper-resolution datasets for urban flooding are rare. This problem prevents detailed flood risk analysis, urban flood control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address this issue. Natural Language Processing and Computer Vision techniques were applied to data collected from Twitter and MyCoast (a crowdsourcing app). We found that these big-data-based flood monitoring approaches can complement existing means of flood data collection. The extracted information was validated against precipitation data and road closure reports to examine data quality. The two data collection approaches are compared and the two data mining methods are discussed. A series of suggestions is given to improve the data collection strategy.

  4. Benefits of adopting good radiation practices in reducing the whole body radiation dose to the nuclear medicine personnel during (18)F-fluorodeoxyglucose positron emission tomography/computed tomography imaging.

    PubMed

    Verma, Shashwat; Kheruka, Subhash Chand; Maurya, Anil Kumar; Kumar, Narvesh; Gambhir, Sanjay; Kumari, Sarita

    2016-01-01

    Positron emission tomography has been established as an important imaging modality in the management of patients, especially in oncology. The higher gamma radiation energy of positron-emitting isotopes poses an additional radiation safety problem: those working with this modality are likely to receive higher whole body doses than those working only in conventional nuclear medicine. Radiation exposure to personnel occurs while dispensing the dose, administering the activity, positioning the patient, and removing the intravenous (i.v.) cannula. The radiation dose to the Nuclear Medicine Physician (NMP) administering activity to patients, and to the technical staff assisting in these procedures, was estimated in a positron emission tomography/computed tomography (PET/CT) facility. An i.v. access was secured for the patient by placing a cannula, and blood sugar was monitored. The activity was then dispensed, measured in the dose calibrator, and administered to the patient by the NMP. Personnel doses received by the NMP and technical staff were measured using electronic pocket dosimeters. Radiation exposure levels at various working locations were assessed with a gamma survey meter. The radiation level at working distance while administering the radioactivity was found to be 106-170 μSv/h with a mean value of 126.5 ± 14.88 μSv/h, which was reduced to 4.2-14.2 μSv/h with a mean value of 7.16 ± 2.29 μSv/h after introduction of an L-bench for administration of radioactivity, a mean exposure level reduction of 94.45 ± 1.03%. The radiation level at working distance while removing the i.v. cannula post-scanning was found to be 25-70 μSv/h with a mean value of 37.4 ± 13.16 μSv/h, which was reduced to 1.0-5.0 μSv/h with a mean value of 2.77 ± 1.3 μSv/h after introduction of the L-bench for removal of the i.v. cannula, a mean exposure level reduction of 92.85 ± 1.78%. This study shows that good radiation practices are very helpful in reducing personnel radiation doses. Use of radiation protection devices such as the L-bench reduces exposure significantly. PET/CT staff members must use their personnel monitors diligently and consistently so that comparisons of their doses are meaningful from one monitoring period to the next.

  5. Automated technical validation--a real time expert system for decision support.

    PubMed

    de Graeve, J S; Cambus, J P; Gruson, A; Valdiguié, P M

    1996-04-15

    Dealing daily with various machines and various control specimens generates a volume of data that cannot be processed manually. To support decision-making, we wrote specific software that copes with traditional QC, with patient data (mean of normals, delta check), and with criteria related to the analytical equipment (flags and alarms). Four machines (3 Ektachem 700 and 1 Hitachi 911) analyzing 25 common chemical tests are controlled. Three different control specimens are run daily on the various pieces of equipment, plus one additional specimen once a week (regional survey). The data are collected on a 486 microcomputer connected to the central computer. For every parameter the standard deviation is compared with the published acceptable limits and the Westgard rules are evaluated. The mean of normals is continuously monitored. The final decision triggers either an audible alarm with a printout of the cause of rejection or, if no alarm occurs, the daily printout of recorded data, with or without Levey-Jennings graphs.
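
    As an illustration of the QC logic, the sketch below evaluates two common Westgard rules (1_3s and 2_2s) on a control series; the actual software also handles the mean of normals, delta checks, and analyzer flags, which are omitted here.

```python
# Minimal Westgard rule check for one analyte's control values
# (illustrative sketch, not the authors' software).
def westgard_violations(values, mean, sd):
    z = [(v - mean) / sd for v in values]
    violations = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                        # 1_3s: one value beyond 3 SD
            violations.append((i, "1_3s"))
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and zi * z[i - 1] > 0:
            violations.append((i, "2_2s"))     # 2_2s: two consecutive beyond 2 SD, same side
    return violations

# Daily control values with an assumed target mean of 100 and SD of 2:
print(westgard_violations([100.5, 102.1, 104.3, 104.6, 95.0], mean=100, sd=2))
# -> [(3, '2_2s')]
```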

  6. Combating adverse selection in secondary PC markets.

    PubMed

    Hickey, Stewart W; Fitzpatrick, Colin

    2008-04-15

    Adverse selection is a significant contributor to market failure in secondary personal computer (PC) markets. Signaling can act as a potential solution to adverse selection and facilitate superior remarketing of second-hand PCs. Signaling is a means whereby usage information can be utilized to enhance consumer perception of both the value and the utility of used PCs and, therefore, promote lifetime extension for these systems. This can help mitigate a large portion of the environmental impact associated with PC system manufacture. In this paper, the computer buying and selling behavior of consumers is characterized via a survey of 270 Irish residential users. Results confirm the existence of adverse selection in the Irish market, with 76% of potential buyers being unwilling to purchase and 45% of potential vendors being unwilling to sell a used PC. The so-called "closet effect" is also apparent, with 78% of users storing their PC after use has ceased. Results also indicate that consumers place a higher emphasis on specifications when considering a second-hand purchase. This contradicts their application needs, which are predominantly Internet (88%) and word-processing/spreadsheet/presentation applications (60%). Finally, a market solution utilizing self-monitoring and reporting technology (SMART) sensors for real-time usage monitoring is proposed that can change consumer attitudes with regard to second-hand computer equipment.

  7. A feasibility study of color flow Doppler vectorization for automated blood flow monitoring.

    PubMed

    Schorer, R; Badoual, A; Bastide, B; Vandebrouck, A; Licker, M; Sage, D

    2017-12-01

    An ongoing issue in vascular medicine is the measurement of blood flow. Catheterization remains the gold-standard measurement method, although non-invasive techniques are an area of intense research. We present a computational method for real-time measurement of blood flow from color flow Doppler data, with a focus on simplicity and monitoring rather than diagnostics, and analyze the performance of a proof-of-principle software implementation. We devised a geometrical model geared towards blood flow computation from a color flow Doppler signal and developed a software implementation requiring only a standard diagnostic ultrasound device. Detection performance was evaluated by computing flow and its determinants (flow speed, vessel area, and ultrasound beam angle of incidence) on purpose-designed synthetic and phantom-based arterial flow simulations. Flow was appropriately detected in all cases. Errors on synthetic images ranged from negligible to substantial depending on experimental conditions. Mean errors on measurements from our phantom flow simulation ranged from 1.2 to 40.2% for angle estimation and from 3.2 to 25.3% for real-time flow estimation. This study is a proof of concept showing that accurate measurement can be achieved through automated color flow Doppler signal extraction, giving the industry an opportunity for further optimization using raw ultrasound data.
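
    The quantities the method estimates combine in the standard Doppler flow relation, shown below as a worked example (my arithmetic, not the authors' algorithm): the measured velocity is the component along the beam, so the true flow needs the angle of incidence and the vessel's cross-sectional area.

```python
# Angle-corrected volumetric flow from color flow Doppler quantities.
import math

def doppler_flow_ml_per_min(v_measured_cm_s, angle_deg, diameter_mm):
    """Flow = angle-corrected mean velocity times lumen area."""
    v_true = v_measured_cm_s / math.cos(math.radians(angle_deg))  # cm/s
    area_cm2 = math.pi * (diameter_mm / 10 / 2) ** 2              # cm^2
    return v_true * area_cm2 * 60                                 # mL/min

# e.g. 30 cm/s measured at a 60 degree angle in a 6 mm vessel:
print(f"{doppler_flow_ml_per_min(30, 60, 6):.0f} mL/min")         # ~1018 mL/min
```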

  8. Overcoming Clinical Inertia: A Randomized Clinical Trial of a Telehealth Remote Monitoring Intervention Using Paired Glucose Testing in Adults With Type 2 Diabetes

    PubMed Central

    Blozis, Shelley A; Young, Heather M; Nesbitt, Thomas S; Quinn, Charlene C

    2015-01-01

    Background Type 2 diabetes mellitus is a worldwide challenge. Practice guidelines promote structured self-monitoring of blood glucose (SMBG) for informing health care providers about glycemic control and providing patient feedback to increase knowledge, self-efficacy, and behavior change. Paired glucose testing—pairs of glucose results obtained before and after a meal or physical activity—is a method of structured SMBG. However, frequent access to glucose data to interpret values and recommend actions is challenging. A complete feedback loop—data collection and interpretation combined with feedback to modify treatment—has been associated with improved outcomes, yet there remains limited integration of SMBG feedback in diabetes management. Incorporating telehealth remote monitoring and asynchronous electronic health record (EHR) feedback from certified diabetes educators (CDEs)—specialists in glucose pattern management—employs the complete feedback loop to improve outcomes. Objective The purpose of this study was to evaluate a telehealth remote monitoring intervention using paired glucose testing and asynchronous data analysis in adults with type 2 diabetes. The primary aim was change in glycated hemoglobin (A1c)—a measure of overall glucose management—between groups after 6 months. The secondary aims were change in self-reported Summary of Diabetes Self-Care Activities (SDSCA), Diabetes Empowerment Scale, and Diabetes Knowledge Test. Methods A 2-group randomized clinical trial was conducted comparing usual care to telehealth remote monitoring with paired glucose testing and asynchronous virtual visits. Participants were aged 30-70 years, not using insulin, with A1c levels between 7.5% and 10.9% (58-96 mmol/mol). The telehealth remote monitoring tablet computer transmitted glucose data and facilitated a complete feedback loop to educate participants, analyze actionable glucose data, and provide feedback. Data from paired glucose testing were analyzed asynchronously using computer-assisted pattern analysis and were shared with patients via the EHR weekly. CDEs called participants monthly to discuss paired glucose testing trends and treatment changes. Separate mixed-effects models were used to analyze data. Results Participants (N=90) were primarily white (64%, 56/87), mean age 58 (SD 11) years, mean body mass index 34.1 (SD 6.7) kg/m2, with diabetes for mean 8.2 (SD 5.4) years, and a mean A1c of 8.3% (SD 1.1; 67 mmol/mol). Both groups lowered A1c, with an estimated average decrease of 0.70 percentage points in the usual care group and 1.11 percentage points in the treatment group, a significant difference of 0.41 percentage points at 6 months (SE 0.08, t159=–2.87, P=.005). Change in medication (SE 0.21, t157=–3.37, P=.009) was significantly associated with lower A1c level. The treatment group significantly improved on the SDSCA subscales carbohydrate spacing (P=.04), monitoring glucose (P=.001), and foot care (P=.02). Conclusions An eHealth model incorporating a complete feedback loop with telehealth remote monitoring and paired glucose testing with asynchronous data analysis significantly improved A1c levels compared to usual care. Trial Registration Clinicaltrials.gov NCT01715649; https://www.clinicaltrials.gov/ct2/show/NCT01715649 (Archived by WebCite at http://www.webcitation.org/6ZinLl8D0). PMID:26199142

  9. Overcoming Clinical Inertia: A Randomized Clinical Trial of a Telehealth Remote Monitoring Intervention Using Paired Glucose Testing in Adults With Type 2 Diabetes.

    PubMed

    Greenwood, Deborah A; Blozis, Shelley A; Young, Heather M; Nesbitt, Thomas S; Quinn, Charlene C

    2015-07-21

    Type 2 diabetes mellitus is a worldwide challenge. Practice guidelines promote structured self-monitoring of blood glucose (SMBG) for informing health care providers about glycemic control and providing patient feedback to increase knowledge, self-efficacy, and behavior change. Paired glucose testing—pairs of glucose results obtained before and after a meal or physical activity—is a method of structured SMBG. However, frequent access to glucose data to interpret values and recommend actions is challenging. A complete feedback loop—data collection and interpretation combined with feedback to modify treatment—has been associated with improved outcomes, yet there remains limited integration of SMBG feedback in diabetes management. Incorporating telehealth remote monitoring and asynchronous electronic health record (EHR) feedback from certified diabetes educators (CDEs)—specialists in glucose pattern management—employs the complete feedback loop to improve outcomes. The purpose of this study was to evaluate a telehealth remote monitoring intervention using paired glucose testing and asynchronous data analysis in adults with type 2 diabetes. The primary aim was change in glycated hemoglobin (A(1c))—a measure of overall glucose management—between groups after 6 months. The secondary aims were change in self-reported Summary of Diabetes Self-Care Activities (SDSCA), Diabetes Empowerment Scale, and Diabetes Knowledge Test. A 2-group randomized clinical trial was conducted comparing usual care to telehealth remote monitoring with paired glucose testing and asynchronous virtual visits. Participants were aged 30-70 years, not using insulin, with A1c levels between 7.5% and 10.9% (58-96 mmol/mol). The telehealth remote monitoring tablet computer transmitted glucose data and facilitated a complete feedback loop to educate participants, analyze actionable glucose data, and provide feedback. Data from paired glucose testing were analyzed asynchronously using computer-assisted pattern analysis and were shared with patients via the EHR weekly. CDEs called participants monthly to discuss paired glucose testing trends and treatment changes. Separate mixed-effects models were used to analyze data. Participants (N=90) were primarily white (64%, 56/87), mean age 58 (SD 11) years, mean body mass index 34.1 (SD 6.7) kg/m2, with diabetes for mean 8.2 (SD 5.4) years, and a mean A(1c) of 8.3% (SD 1.1; 67 mmol/mol). Both groups lowered A(1c), with an estimated average decrease of 0.70 percentage points in the usual care group and 1.11 percentage points in the treatment group, a significant difference of 0.41 percentage points at 6 months (SE 0.08, t159=-2.87, P=.005). Change in medication (SE 0.21, t157=-3.37, P=.009) was significantly associated with lower A(1c) level. The treatment group significantly improved on the SDSCA subscales carbohydrate spacing (P=.04), monitoring glucose (P=.001), and foot care (P=.02). An eHealth model incorporating a complete feedback loop with telehealth remote monitoring and paired glucose testing with asynchronous data analysis significantly improved A(1c) levels compared to usual care. Clinicaltrials.gov NCT01715649; https://www.clinicaltrials.gov/ct2/show/NCT01715649 (Archived by WebCite at http://www.webcitation.org/6ZinLl8D0).

  10. Knowledge representation and user interface concepts to support mixed-initiative diagnosis

    NASA Technical Reports Server (NTRS)

    Sobelman, Beverly H.; Holtzblatt, Lester J.

    1989-01-01

    The Remote Maintenance Monitoring System (RMMS) provides automated support for the maintenance and repair of ModComp computer systems used in the Launch Processing System (LPS) at Kennedy Space Center. RMMS supports manual and automated diagnosis of intermittent hardware failures, providing an efficient means for accessing and analyzing the data generated by catastrophic failure recovery procedures. This paper describes the design and functionality of the user interface for interactive analysis of memory dump data, relating it to the underlying declarative representation of memory dumps.

  11. Apparatus for monitoring high temperature ultrasonic characterization

    DOEpatents

    Lanagan, Michael T.; Kupperman, David S.; Yaconi, George A.

    1998-01-01

    A method and an apparatus for nondestructively detecting and evaluating changes in the microstructural properties of a material by employing one or more magnetostrictive transducers linked to the material by means of one or more sonic signal conductors. The magnetostrictive transducer or transducers are connected to a pulser/receiver, which in turn is connected to an oscilloscope. The oscilloscope is connected to a computer that employs an algorithm to evaluate changes in the velocity of a signal transmitted to the material sample as a function of time and temperature.

  12. Radiological risk assessment of cosmic radiation at aviation altitudes (a trip from Houston Intercontinental Airport to Lagos International Airport).

    PubMed

    Enyinna, Paschal Ikenna

    2016-01-01

    Radiological risk parameters associated with aircrew members traveling from Houston Intercontinental Airport to Lagos International Airport have been computed using the computer software EPCARD (version 3.2). The mean annual effective dose of radiation was computed to be 2.94 mSv/year. This result is above the standard permissible limit of 1 mSv/year set for the public and pregnant aircrew members but below the limit set for occupationally exposed workers. The risk of cancer mortality and the excess career-time cancer risk ranged from 3.5 × 10−5 to 24.5 × 10−5 (average 14.7 × 10−5) and from 7 × 10−4 to 49 × 10−4 (average 29.4 × 10−4), respectively. Passengers and aircrew members should be aware of the extra cosmic radiation doses received during flights. All aircraft operators should monitor radiation doses incurred during aviation trips.
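
    The reported averages are consistent with multiplying the computed annual dose by a nominal mortality risk coefficient of about 5 × 10−5 per mSv (roughly the ICRP 5%/Sv figure) and, for career risk, by about 20 years of exposure; the check below is my reading, not the paper's stated method.

```python
# Back-of-the-envelope consistency check (assumed coefficients, not the
# paper's stated method).
dose_msv_per_year = 2.94        # mean annual effective dose from the paper
risk_coeff_per_msv = 5e-5       # assumed nominal mortality risk coefficient
career_years = 20               # assumed career length

annual_risk = dose_msv_per_year * risk_coeff_per_msv
career_risk = annual_risk * career_years
print(f"annual: {annual_risk:.2e}, career: {career_risk:.2e}")
# annual: 1.47e-04 (= 14.7 x 10^-5), career: 2.94e-03 (= 29.4 x 10^-4)
```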

  13. Radiological risk assessment of cosmic radiation at aviation altitudes (a trip from Houston Intercontinental Airport to Lagos International Airport)

    PubMed Central

    Enyinna, Paschal Ikenna

    2016-01-01

    Radiological risk parameters associated with aircrew members traveling from Houston Intercontinental Airport to Lagos International Airport have been computed using the computer software EPCARD (version 3.2). The mean annual effective dose of radiation was computed to be 2.94 mSv/year. This result is above the standard permissible limit of 1 mSv/year set for the public and pregnant aircrew members but below the limit set for occupationally exposed workers. The risk of cancer mortality and the excess career-time cancer risk ranged from 3.5 × 10−5 to 24.5 × 10−5 (average 14.7 × 10−5) and from 7 × 10−4 to 49 × 10−4 (average 29.4 × 10−4), respectively. Passengers and aircrew members should be aware of the extra cosmic radiation doses received during flights. All aircraft operators should monitor radiation doses incurred during aviation trips. PMID:27651568

  14. Workflow computing. Improving management and efficiency of pathology diagnostic services.

    PubMed

    Buffone, G J; Moreau, D; Beck, J R

    1996-04-01

    Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health-care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery with special reference to diagnostic pathology.

  15. Real-Time Mapping alert system; user's manual

    USGS Publications Warehouse

    Torres, L.A.

    1996-01-01

    The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field monitoring sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. These alert values can help keep water-resource specialists informed of current hydrologic conditions. The current alert status at monitoring sites is of critical importance during floods, hurricanes, and other extreme hydrologic events where quick analysis of the situation is needed. This manual provides instructions for using the Real-Time Mapping software, a series of computer programs developed by the U.S. Geological Survey for quick analysis of hydrologic conditions, and guides users through a basic interactive session. The software provides interactive graphics display and query of real-time information in a map-based, menu-driven environment.
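
    The alert logic described above amounts to comparing incoming values against predefined thresholds; the sketch below is a toy illustration with a hypothetical site number and threshold, not the actual USGS software.

```python
# Toy alert-value check for real-time hydrologic records (illustrative only).
thresholds = {("03353000", "stage_ft"): 12.0}   # hypothetical site/threshold

def alerts(records):
    """records: iterable of (site, parameter, value) tuples."""
    return [r for r in records
            if r[2] > thresholds.get((r[0], r[1]), float("inf"))]

data = [("03353000", "stage_ft", 13.4), ("03353000", "stage_ft", 9.8)]
print(alerts(data))   # only the 13.4 ft reading is an alert value
```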

  16. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  17. Computerization in industry causes problems for people with reading and writing difficulties (dyslexia).

    PubMed

    Knutsson, A

    1986-01-01

    For 10 years, computerization in industry has advanced at a rapid pace. A problem that has not received attention is that of people with reading and writing difficulties, who experience severe problems when they have to communicate with a computer monitor screen. These individuals are often embarrassed by their difficulties and conceal them from their fellow workers. A number of case studies are described which show the forms the problems can take. In one case, an employee was compelled to move from department to department as each was computerized in turn. Computers transform a large number of manual tasks in industry into jobs that call for reading and writing skills. Better education, at elementary school and in the workplace in connection with computerization, is the most important means of overcoming this problem. Moreover, computer programs could be written in a more human way.

  18. Computer-Assisted Monitoring Of A Complex System

    NASA Technical Reports Server (NTRS)

    Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.

    1995-01-01

    Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.

  19. Computation offloading for real-time health-monitoring devices.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Tuan Le; Hosseini, Anahita; Sarrafzadeh, Majid

    2016-08-01

    Among the major challenges in the development of real-time wearable health monitoring systems is optimizing battery life. One of the major techniques by which this objective can be achieved is computation offloading, in which portions of the computation are partitioned between the device and other resources such as a server or cloud. In this paper, we describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data between the wearable device and the mobile application as a function of desired classification accuracy.
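
    The paper's partitioning policy is not reproduced here; the sketch below shows the standard energy trade-off that underlies such schemes, with an accuracy condition standing in for the paper's accuracy-driven adjustment. All energy figures are assumed.

```python
# Offload when transmitting the data costs less energy than computing
# locally, or when the local lightweight classifier is too inaccurate
# (illustrative decision rule, not the authors' scheme).
def should_offload(n_bits, e_tx_per_bit, cycles, e_per_cycle,
                   local_accuracy, required_accuracy):
    e_transmit = n_bits * e_tx_per_bit     # radio energy to ship the data
    e_local = cycles * e_per_cycle         # CPU energy to process on-device
    return e_transmit < e_local or local_accuracy < required_accuracy

# e.g. a 2 kB feature window with hypothetical radio/CPU energy figures:
print(should_offload(n_bits=16_000, e_tx_per_bit=50e-9,
                     cycles=5e7, e_per_cycle=1e-9,
                     local_accuracy=0.88, required_accuracy=0.95))
```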

  20. Behavioral features recognition and oestrus detection based on fast approximate clustering algorithm in dairy cows

    NASA Astrophysics Data System (ADS)

    Tian, Fuyang; Cao, Dong; Dong, Xiaoning; Zhao, Xinqiang; Li, Fade; Wang, Zhonghua

    2017-06-01

    Recognition of behavioural features is an important means of detecting oestrus and sickness in dairy herds, and there is a need for heat-detection aids. The detection method in this paper is based on measuring the individual behavioural activity, standing time, and body temperature of dairy cows using a vibration sensor and a temperature sensor. Data on the behavioural activity index, standing time, lying time, and walking time were sent to a computer over a low-power wireless communication system. A fast approximate k-means algorithm (FAKM) is proposed to process the sensor data for behavioural feature recognition. As a result of technical progress in monitoring cows using computers, automatic oestrus detection has become possible.
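
    The FAKM algorithm itself is not public; as a stand-in, the sketch below clusters hypothetical daily activity features with scikit-learn's mini-batch k-means, another fast approximate k-means, to separate high-activity (oestrus-like) days from normal days.

```python
# Clustering assumed activity features with an approximate k-means
# (MiniBatchKMeans as a stand-in for the paper's FAKM).
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(0)
# columns: activity index, standing time (h), lying time (h), walking time (h)
normal = rng.normal([50, 12, 10, 2], [5, 1, 1, 0.5], size=(40, 4))
oestrus = rng.normal([90, 16, 6, 4], [5, 1, 1, 0.5], size=(5, 4))
X = np.vstack([normal, oestrus])

labels = MiniBatchKMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("days in the high-activity cluster:", np.where(labels == labels[-1])[0])
```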

  1. On-Site Determination and Monitoring of Real-Time Fluence Delivery for an Operating UV Reactor Based on a True Fluence Rate Detector.

    PubMed

    Li, Mengkai; Li, Wentao; Qiang, Zhimin; Blatchley, Ernest R

    2017-07-18

    At present, on-site fluence (distribution) determination and monitoring of an operating UV system represent a considerable challenge. The recently developed microfluorescent silica detector (MFSD) is able to measure the approximate true fluence rate (FR) at a fixed position in a UV reactor, which can be compared with a FR model directly. Hence it has provided a connection between model calculation and real-time fluence determination. In this study, an on-site determination and monitoring method of fluence delivery for an operating UV reactor was developed. True FR detectors, a UV transmittance (UVT) meter, and a flow rate meter were used for fundamental measurements. The fluence distribution, as well as the reduction equivalent fluence (REF), the 10th percentile dose in the UV fluence distribution (F10), the minimum fluence (Fmin), and the mean fluence (Fmean) of a test reactor, was calculated in advance by the combined use of computational fluid dynamics and FR field modeling. A field test was carried out on the test reactor for disinfection of a secondary water supply. The estimated real-time REF, F10, Fmin, and Fmean decreased 73.6%, 71.4%, 69.6%, and 72.9%, respectively, over a 6-month period, which was attributable to lamp output attenuation and sleeve fouling. The results were analyzed together with synchronous data from a previously developed triparameter UV monitoring system and a water temperature sensor. This study demonstrated an accurate method for on-site, real-time fluence determination that could be used to enhance the security of, and public confidence in, UV-based water treatment processes.
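
    The fluence-distribution metrics named above have straightforward definitions once per-particle fluences are available; the sketch below shows those definitions on assumed lognormal fluences, leaving out the coupled CFD and fluence-rate field modeling that actually produces them.

```python
# Definitions of F_min, F_10 and F_mean over simulated per-particle
# fluences (the fluence values here are assumed, not modeled).
import numpy as np

fluences = np.random.default_rng(1).lognormal(mean=3.0, sigma=0.4, size=10_000)

f_min = fluences.min()                # minimum fluence, F_min
f_10 = np.percentile(fluences, 10)    # 10th-percentile fluence, F_10
f_mean = fluences.mean()              # mean fluence, F_mean
print(f"F_min={f_min:.1f}, F_10={f_10:.1f}, F_mean={f_mean:.1f} mJ/cm^2")
```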

  2. Continuous non-contact vital sign monitoring in neonatal intensive care unit

    PubMed Central

    Guazzi, Alessandro; Jorge, João; Davis, Sara; Watkinson, Peter; Green, Gabrielle; Shenvi, Asha; McCormick, Kenny; Tarassenko, Lionel

    2014-01-01

    Current technologies to allow continuous monitoring of vital signs in pre-term infants in the hospital require adhesive electrodes or sensors to be in direct contact with the patient. These can cause stress, pain, and also damage the fragile skin of the infants. It has been established previously that the colour and volume changes in superficial blood vessels during the cardiac cycle can be measured using a digital video camera and ambient light, making it possible to obtain estimates of heart rate or breathing rate. Most of the papers in the literature on non-contact vital sign monitoring report results on adult healthy human volunteers in controlled environments for short periods of time. The authors' current clinical study involves the continuous monitoring of pre-term infants, for at least four consecutive days each, in the high-dependency care area of the Neonatal Intensive Care Unit (NICU) at the John Radcliffe Hospital in Oxford. The authors have further developed their video-based, non-contact monitoring methods to obtain continuous estimates of heart rate, respiratory rate and oxygen saturation for infants nursed in incubators. In this Letter, it is shown that continuous estimates of these three parameters can be computed with an accuracy which is clinically useful. During stable sections with minimal infant motion, the mean absolute error between the camera-derived estimates of heart rate and the reference value derived from the ECG is similar to the mean absolute error between the ECG-derived value and the heart rate value from a pulse oximeter. Continuous non-contact vital sign monitoring in the NICU using ambient light is feasible, and the authors have shown that clinically important events such as a bradycardia accompanied by a major desaturation can be identified with their algorithms for processing the video signal. PMID:26609384

  3. Continuous non-contact vital sign monitoring in neonatal intensive care unit.

    PubMed

    Villarroel, Mauricio; Guazzi, Alessandro; Jorge, João; Davis, Sara; Watkinson, Peter; Green, Gabrielle; Shenvi, Asha; McCormick, Kenny; Tarassenko, Lionel

    2014-09-01

    Current technologies to allow continuous monitoring of vital signs in pre-term infants in the hospital require adhesive electrodes or sensors to be in direct contact with the patient. These can cause stress, pain, and also damage the fragile skin of the infants. It has been established previously that the colour and volume changes in superficial blood vessels during the cardiac cycle can be measured using a digital video camera and ambient light, making it possible to obtain estimates of heart rate or breathing rate. Most of the papers in the literature on non-contact vital sign monitoring report results on adult healthy human volunteers in controlled environments for short periods of time. The authors' current clinical study involves the continuous monitoring of pre-term infants, for at least four consecutive days each, in the high-dependency care area of the Neonatal Intensive Care Unit (NICU) at the John Radcliffe Hospital in Oxford. The authors have further developed their video-based, non-contact monitoring methods to obtain continuous estimates of heart rate, respiratory rate and oxygen saturation for infants nursed in incubators. In this Letter, it is shown that continuous estimates of these three parameters can be computed with an accuracy which is clinically useful. During stable sections with minimal infant motion, the mean absolute error between the camera-derived estimates of heart rate and the reference value derived from the ECG is similar to the mean absolute error between the ECG-derived value and the heart rate value from a pulse oximeter. Continuous non-contact vital sign monitoring in the NICU using ambient light is feasible, and the authors have shown that clinically important events such as a bradycardia accompanied by a major desaturation can be identified with their algorithms for processing the video signal.
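
    A toy illustration of the underlying principle (not the authors' method) is sketched below: the mean intensity of a skin region varies with the cardiac cycle, so a spectral peak of that signal within a plausible neonatal band gives a heart-rate estimate. The signal here is synthetic.

```python
# Heart-rate estimate from a synthetic camera signal (illustrative only).
import numpy as np

fps = 30.0
t = np.arange(0, 30, 1 / fps)                 # 30 s of video frames
hr_true = 150 / 60.0                          # 150 bpm, typical neonate
roi_mean_green = (120
                  + 0.5 * np.sin(2 * np.pi * hr_true * t)
                  + 0.2 * np.random.default_rng(2).standard_normal(len(t)))

x = roi_mean_green - roi_mean_green.mean()
freqs = np.fft.rfftfreq(len(x), 1 / fps)
spec = np.abs(np.fft.rfft(x))
band = (freqs >= 1.5) & (freqs <= 4.0)        # 90-240 bpm neonatal band
print(f"estimated heart rate: {freqs[band][np.argmax(spec[band])] * 60:.0f} bpm")
```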

  4. Data processing for water monitoring system

    NASA Technical Reports Server (NTRS)

    Monford, L.; Linton, A. T.

    1978-01-01

    Water monitoring data acquisition system is structured about central computer that controls sampling and sensor operation, and analyzes and displays data in real time. Unit is essentially separated into two systems: computer system, and hard wire backup system which may function separately or with computer.

  5. Shortcomings of low-cost imaging systems for viewing computed radiographs.

    PubMed

    Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N

    2000-01-01

    To assess the potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (a PC with a common display card and color monitor), and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing significantly improved the perception of low-contrast details irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. Review at the radiological workstation was superior to review on the PC with image processing. The lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects the perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared with the color display; however, the color monitor was more strongly affected by high ambient illumination.

  6. Health status monitoring for ICU patients based on locally weighted principal component analysis.

    PubMed

    Ding, Yangyang; Ma, Xin; Wang, Youqing

    2018-03-01

    Intelligent status monitoring for critically ill patients can help medical staff quickly discover and assess changes in a patient's condition and then choose an appropriate treatment strategy. However, the general-purpose monitoring models now widely used have fixed structures that adapt poorly to changes in the status of intensive care unit (ICU) patients, so a more robust, efficient, and fast monitoring model tailored to the individual is needed. A data-driven learning approach combining locally weighted projection regression (LWPR) and principal component analysis (PCA) is proposed and applied to monitor the nonlinear process of patients' health status in the ICU. LWPR approximates the complex nonlinear process with local linear models, within which PCA can be applied for status monitoring; finally, a global weighted statistic is obtained for detecting possible abnormalities. Improved versions, such as LWPR-MPCA and LWPR-JPCA, were also developed and show superior performance. Eighteen subjects were selected from Physiobank's Multi-parameter Intelligent Monitoring for Intensive Care II (MIMIC II) database, and two vital signs of each subject were chosen for online monitoring. The proposed method was compared with several existing methods, including traditional PCA, partial least squares (PLS), just-in-time learning combined with modified PCA (L-PCA), and kernel PCA (KPCA). The experimental results demonstrated that the mean fault detection rate (FDR) of PCA was improved by 41.7% after adding LWPR. The mean FDR of LWPR-MPCA increased by 8.3% compared with the most recently reported method, L-PCA. Meanwhile, LWPR required less training time than the other methods, especially KPCA. LWPR is introduced into ICU patient monitoring for the first time here and achieves the best monitoring performance, including adaptability to changes in patient status, sensitivity for abnormality detection, fast learning speed, and low computational complexity. The algorithm is an excellent approach to establishing a personalized model for patients, which is the mainstream direction of modern medicine going forward, as well as to improving global monitoring performance. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
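
    The sketch below is my compact reduction of the locally weighted PCA idea, not the authors' code: historical samples are weighted by closeness to the current operating point, a weighted PCA is fitted, and a Hotelling-style T² statistic flags abnormality.

```python
# Locally weighted PCA monitoring statistic (illustrative reduction of
# the LWPR+PCA scheme).
import numpy as np

def local_t2(query, X, bandwidth=1.0, n_pc=1):
    d2 = ((X - query) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))        # locality weights
    mu = (w[:, None] * X).sum(axis=0) / w.sum()   # weighted local mean
    Xc = (X - mu) * np.sqrt(w)[:, None]
    cov = Xc.T @ Xc / w.sum()                     # weighted local covariance
    vals, vecs = np.linalg.eigh(cov)
    vals, vecs = vals[::-1][:n_pc], vecs[:, ::-1][:, :n_pc]
    score = (query - mu) @ vecs
    return float((score ** 2 / vals).sum())       # Hotelling-style T^2

rng = np.random.default_rng(3)
X = rng.normal(0, 1, size=(500, 2))               # two vital signs, normal state
print("normal  :", round(local_t2(np.array([0.5, 0.4]), X), 2))
print("abnormal:", round(local_t2(np.array([4.0, -4.0]), X), 2))
```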

  7. The monitoring and managing application of cloud computing based on Internet of Things.

    PubMed

    Luo, Shiliang; Ren, Bin

    2016-07-01

    Cloud computing and the Internet of Things are the two hot topics in the Internet application field. Both technologies are under active discussion and research, but far less so in the field of medical monitoring and management. In this paper, we therefore study and analyze the application of cloud computing and the Internet of Things to the medical field, and we combine the two techniques for medical monitoring and management. The model architecture for a remote monitoring cloud platform of healthcare information (RMCPHI) was established first, and the RMCPHI architecture was then analyzed. Finally, an efficient PSOSAA algorithm was proposed for the medical monitoring and management application of cloud computing. Simulation results showed that our proposed scheme can improve efficiency by about 50%. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. Solar radiation variability over La Réunion island and associated larger-scale dynamics

    NASA Astrophysics Data System (ADS)

    Mialhe, Pauline; Morel, Béatrice; Pohl, Benjamin; Bessafi, Miloud; Chabriat, Jean-Pierre

    2017-04-01

    This study examines the solar radiation variability over La Réunion island and its relationship with the large-scale circulation. The Satellite Application Facility on Climate Monitoring (CM SAF) produces a Shortwave Incoming Solar radiation (SIS) data record called Solar surfAce RAdiation Heliosat - East (SARAH-E). A comparison with in situ observations from the Météo-France measurement networks quantifies the skill of the SARAH-E grids, which we use as our dataset. As a first step, mean irradiance cycles are calculated to describe the diurnal and seasonal behaviour of SIS over La Réunion island. By analogy with climate anomalies, instantaneous deviations are computed after removal of the mean states. Finally, we associate these anomalies with the larger-scale atmospheric dynamics over the South West Indian Ocean by applying multivariate clustering analyses (Hierarchical Ascending Classification, k-means).
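
    A minimal k-means sketch for grouping daily anomaly maps into circulation regimes; the grid size, number of clusters, and random input are illustrative assumptions, not the study's configuration:

        # kmeans_regimes.py -- illustrative k-means sketch; not the study's code
        import numpy as np

        def kmeans(X, k, n_iter=100, seed=0):
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(n_iter):
                # Assign each flattened anomaly map to its nearest center.
                d2 = ((X[:, None, :] - centers[None]) ** 2).sum(axis=-1)
                labels = d2.argmin(axis=1)
                new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                                else centers[j] for j in range(k)])
                if np.allclose(new, centers):
                    break
                centers = new
            return labels, centers

        # Hypothetical input: 365 daily SIS anomaly maps on a 20x20 grid.
        anomalies = np.random.randn(365, 400)
        labels, centers = kmeans(anomalies, k=4)
        print(np.bincount(labels))  # number of days in each circulation regime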

  9. Reactor Operations Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M.M.

    1989-01-01

    The Reactor Operations Monitoring System (ROMS) is a VME based, parallel processor data acquisition and safety action system designed by the Equipment Engineering Section and Reactor Engineering Department of the Savannah River Site. The ROMS will be analyzing over 8 million signal samples per minute. Sixty-eight microprocessors are used in the ROMS in order to achieve a real-time data analysis. The ROMS is composed of multiple computer subsystems. Four redundant computer subsystems monitor 600 temperatures with 2400 thermocouples. Two computer subsystems share the monitoring of 600 reactor coolant flows. Additional computer subsystems are dedicated to monitoring 400 signals from assorted process sensors. Data from these computer subsystems are transferred to two redundant process display computer subsystems which present process information to reactor operators and to reactor control computers. The ROMS is also designed to carry out safety functions based on its analysis of process data. The safety functions include initiating a reactor scram (shutdown), the injection of neutron poison, and the loadshed of selected equipment. A complete development Reactor Operations Monitoring System has been built. It is located in the Program Development Center at the Savannah River Site and is currently being used by the Reactor Engineering Department in software development. The Equipment Engineering Section is designing and fabricating the process interface hardware. Upon proof of hardware and design concept, orders will be placed for the final five systems located in the three reactor areas, the reactor training simulator, and the hardware maintenance center.
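
    The redundant-subsystem design lends itself to m-out-of-n trip voting before a safety action is taken; the following sketch shows that generic safety pattern under assumed trip limits, not the actual ROMS logic:

        # trip_voting.py -- generic m-out-of-n safety voting; not the ROMS code
        def channel_trips(readings, limit):
            """Per-channel trip flags for redundant thermocouple readings."""
            return [r > limit for r in readings]

        def vote_trip(trips, m=2):
            """Initiate a safety action when at least m redundant channels trip."""
            return sum(trips) >= m

        readings = [512.0, 515.3, 649.8, 651.2]  # deg C, four redundant channels
        if vote_trip(channel_trips(readings, limit=630.0), m=2):
            print("SCRAM: high-temperature trip (2-out-of-4 vote)")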

  10. The Accuracy of Cognitive Monitoring during Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Garhart, Casey; Hannafin, Michael J.

    This study was conducted to determine the accuracy of learners' comprehension monitoring during computer-based instruction and to assess the relationship between enroute monitoring and different levels of learning. Participants were 50 university undergraduate students enrolled in an introductory educational psychology class. All students received…

  11. Computer-assisted learning and simulation systems in dentistry--a challenge to society.

    PubMed

    Welk, A; Splieth, Ch; Wierinck, E; Gilpatrick, R O; Meyer, G

    2006-07-01

    Computer technology is increasingly used in practical training at universities. However, in spite of their potential, computer-assisted learning (CAL) and computer-assisted simulation (CAS) systems still appear to be underutilized in dental education. Advantages, challenges, problems, and solutions of computer-assisted learning and simulation in dentistry are discussed by means of MEDLINE and open Internet platform searches and the key results of a study among German dental schools. The advantages of computer-assisted learning include, for example, self-paced and self-directed learning and increased motivation. It is useful for both objective theoretical and practical tests and for training students to handle complex cases. CAL can lead to more structured learning and can support training in evidence-based decision-making. The reasons for the still relatively rare implementation of CAL/CAS systems in dental education include a lack of funding, a shortage of studies on CAL/CAS, and the considerable effort required to integrate such systems into the curriculum. To overcome these obstacles to the use of computer technology, we should strive for multicenter research and development projects monitored by the appropriate national and international scientific societies, so that the potential of computer technology can be fully realized in graduate, postgraduate, and continuing dental education.

  12. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    PubMed

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and have used it effectively for more than 2 years. This system permits comprehensive monitoring during cardiosurgical operations by processing in real time the values of arterial and central venous pressure, pulmonary artery pressure, bioelectrical activity of the brain, and two temperature readings. Use of this CMS appreciably improved patient safety during surgery. The ability to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiology chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  13. A Job Monitoring and Accounting Tool for the LSF Batch System

    NASA Astrophysics Data System (ADS)

    Sarkar, Subir; Taneja, Sonia

    2011-12-01

    This paper presents a web-based job monitoring and group-and-user accounting tool for the LSF batch system. The user-oriented job monitoring displays a simple and compact quasi-real-time overview of the batch farm for both local and Grid jobs; for Grid jobs, the Distinguished Name (DN) of the Grid user is shown. The overview monitor provides the most up-to-date status of the batch farm at any time. The accounting tool works with the LSF accounting log files. The accounting information is shown for a few pre-defined time periods by default, but the same information can also be computed for any arbitrary time window. The tool has already proved to be an extremely useful means of validating the more extensive accounting tools available in the Grid world. Several sites are already using the tool, and more sites running the LSF batch system have shown interest. We discuss the various aspects that make the tool essential for site administrators and end-users alike and outline the current status of development as well as future plans.
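
    A sketch of the arbitrary-time-window accounting idea: aggregate per-user CPU time from finished-job records. The record format here is a hypothetical simplification for illustration, not the real LSF accounting (lsb.acct) format:

        # lsf_accounting_sketch.py -- hypothetical record format, not lsb.acct
        from collections import defaultdict
        from datetime import datetime

        # Each finished-job record: (finish_time_iso, user, group, cpu_seconds)
        records = [
            ("2011-03-01T10:15:00", "alice", "cms",   3600.0),
            ("2011-03-02T08:00:00", "bob",   "atlas", 7200.0),
            ("2011-03-09T23:30:00", "alice", "cms",   1800.0),
        ]

        def cpu_by_user(records, start, end):
            """Sum CPU seconds per user for jobs finishing in [start, end)."""
            totals = defaultdict(float)
            for ts, user, group, cpu in records:
                if start <= datetime.fromisoformat(ts) < end:
                    totals[user] += cpu
            return dict(totals)

        window = (datetime(2011, 3, 1), datetime(2011, 3, 8))
        print(cpu_by_user(records, *window))  # {'alice': 3600.0, 'bob': 7200.0}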

  14. OCCAM: a flexible, multi-purpose and extendable HPC cluster

    NASA Astrophysics Data System (ADS)

    Aldinucci, M.; Bagnasco, S.; Lusso, S.; Pasteris, P.; Rabellino, S.; Vallero, S.

    2017-10-01

    The Open Computing Cluster for Advanced data Manipulation (OCCAM) is a multipurpose flexible HPC cluster designed and operated by a collaboration between the University of Torino and the Sezione di Torino of the Istituto Nazionale di Fisica Nucleare. It aims to provide a flexible, reconfigurable, and extendable infrastructure catering to a wide range of scientific computing use cases, including solid-state chemistry, high-energy physics, computer science, big data analytics, computational biology, genomics, and many others. Furthermore, it will serve as a platform for R&D activities on computational technologies themselves, with topics ranging from GPU acceleration to cloud computing technologies. A heterogeneous and reconfigurable system like this poses a number of challenges: hardware resources may frequently change their availability and shareability status, which in turn affects the methods and means used to allocate, manage, optimize, bill, and monitor VMs, containers, virtual farms, jobs, and interactive bare-metal sessions. This work describes some of the use cases that prompted the design and construction of the HPC cluster, its architecture and resource provisioning model, along with a first characterization of its performance using synthetic benchmark tools and a few realistic use-case tests.

  15. Monitoring of body position and motion in children with severe cerebral palsy for 24 hours.

    PubMed

    Sato, Haruhiko; Iwasaki, Toshiyuki; Yokoyama, Misako; Inoue, Takenobu

    2014-01-01

    To investigate differences in position and body movements between children with severe cerebral palsy (CP) and children with typical development (TD) during the daytime and while asleep at night. Fifteen children with severe quadriplegic CP living at home (GMFCS level V; 7 males, 8 females; mean age 8 years 3 months; range 3-20 years) and 15 children with TD (6 males, 9 females; mean age 8 years 7 months; range 1-16 years) participated. Body position and movements were recorded for 24 h by a body position monitor and a physical activity monitor, respectively. The amount of time spent in one position and the durations of inactive periods during the daytime and during night-time sleep were computed and analyzed for group differences. In the children with CP, the mean longest time spent in one position during night-time sleep was longer than in the children with TD (5.6 ± 3.5 h versus 1.6 ± 1.2 h); in contrast, no significant difference was found between the groups during the daytime (1.9 ± 1.1 h versus 1.6 ± 0.7 h). The mean longest time the body remained inactive was longer in the children with CP during both daytime and night-time sleep (0.6 ± 0.3 h versus 0.3 ± 0.3 h for daytime; 1.4 ± 0.8 h versus 0.7 ± 0.3 h for night-time). Children with severe CP living at home showed prolonged immobilized posture during night-time sleep, when their caregivers are likely to be asleep as well. This may suggest that these children should receive postural care assistance at night.

  16. A Computer Interview for Multivariate Monitoring of Psychiatric Outcome.

    ERIC Educational Resources Information Center

    Stevenson, John F.; And Others

    Application of computer technology to psychiatric outcome measurement offers the promise of coping with increasing demands for extensive patient interviews repeated longitudinally. Described is the development of a cost-effective multi-dimensional tracking device to monitor psychiatric functioning, building on a previous local computer interview…

  17. Monitoring Collaborative Activities in Computer Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Persico, Donatella; Pozzi, Francesca; Sarti, Luigi

    2010-01-01

    Monitoring the learning process in computer supported collaborative learning (CSCL) environments is a key element for supporting the efficacy of tutor actions. This article proposes an approach for analysing learning processes in a CSCL environment to support tutors in their monitoring tasks. The approach entails tracking the interactions within…

  18. The Crazy Business of Internet Peeping, Privacy, and Anonymity.

    ERIC Educational Resources Information Center

    Van Horn, Royal

    2000-01-01

    Peeping software takes several forms and can be used on a network or to monitor a certain computer. E-Mail Plus, for example, hides inside a computer and sends exact copies of incoming or outgoing e-mail anywhere. School staff with monitored computers should demand e-mail privacy. (MLH)

  19. Relation of stream quality to streamflow, and estimated loads of selected water-quality constituents in the James and Rappahannock rivers near the fall line of Virginia, July 1988 through June 1990

    USGS Publications Warehouse

    Belval, D.L.; Campbell, J.P.; Woodside, M.D.

    1994-01-01

    This report presents the results of a study by the U.S. Geological Survey, in cooperation with the Virginia Department of Environmental Quality, Division of Intergovernmental Coordination, to monitor and estimate loads of selected nutrients and suspended solids discharged to Chesapeake Bay from two major tributaries in Virginia. From July 1988 through June 1990, monitoring consisted of collecting depth-integrated, cross-sectional samples from the James and Rappahannock Rivers during storm-flow conditions and at scheduled intervals. Water-quality constituents that were monitored included total suspended solids (residue, total at 105 degrees Celsius), dissolved nitrite plus nitrate, dissolved ammonia, total Kjeldahl nitrogen (ammonia plus organic), total nitrogen, total phosphorus, dissolved orthophosphorus, total organic carbon, and dissolved silica. Daily mean load estimates of each constituent were computed by month using a seven-parameter log-linear regression model with variables of time, discharge, and seasonality. Water-quality data and constituent-load estimates are included in the report in tabular and graphic form. The data and load estimates provided in this report will be used to calibrate computer models of the Chesapeake Bay region, evaluate the water quality of the Bay and the major influences on it, and assess the results of best-management practices in Virginia.
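
    For reference, the seven-parameter log-linear load model commonly used by the USGS has the following form (stated here from the general literature on this estimator, not quoted from the report), in LaTeX notation:

        \ln L = \beta_0 + \beta_1 \ln(Q/Q_c) + \beta_2 [\ln(Q/Q_c)]^2
                + \beta_3 (T - T_c) + \beta_4 (T - T_c)^2
                + \beta_5 \sin(2\pi T) + \beta_6 \cos(2\pi T) + \varepsilon

    where L is the constituent load, Q is discharge, T is decimal time, Q_c and T_c are centering constants, and \varepsilon is the model error.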

  20. Dynamic Computation Offloading for Low-Power Wearable Health Monitoring Systems.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Mortazavi, Bobak; Alshurafa, Nabil; Sarrafzadeh, Majid

    2017-03-01

    The objective of this paper is to describe and evaluate an algorithm to reduce power usage and increase battery lifetime for wearable health-monitoring devices. We describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data processing between the wearable device and mobile application as a function of desired classification accuracy. By making the correct offloading decision based on current system parameters, we show that we are able to reduce system power by as much as 20%. We demonstrate that computation offloading can be applied to real-time monitoring systems, and yields significant power savings. Making correct offloading decisions for health monitoring devices can extend battery life and improve adherence.
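
    A sketch of the offloading decision just described: process on the wearable when its energy cost is lower, otherwise ship the raw data to the phone. All cost figures and the accuracy rule are hypothetical placeholders, not values from the paper:

        # offload_decision.py -- illustrative offloading rule, not the paper's
        def offload(raw_bytes, local_mj_per_sample, tx_mj_per_byte,
                    samples, min_accuracy, local_accuracy):
            """Return True if raw data should be offloaded to the mobile app."""
            e_local = samples * local_mj_per_sample    # classify on the device
            e_offload = raw_bytes * tx_mj_per_byte     # transmit the raw stream
            if local_accuracy < min_accuracy:
                return True                            # must offload for accuracy
            return e_offload < e_local                 # else pick the cheaper path

        print(offload(raw_bytes=4096, local_mj_per_sample=0.9,
                      tx_mj_per_byte=0.002, samples=256,
                      min_accuracy=0.90, local_accuracy=0.93))  # -> True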

  1. Automated Ecological Assessment of Physical Activity: Advancing Direct Observation.

    PubMed

    Carlson, Jordan A; Liu, Bo; Sallis, James F; Kerr, Jacqueline; Hipp, J Aaron; Staggs, Vincent S; Papa, Amy; Dean, Kelsey; Vasconcelos, Nuno M

    2017-12-01

    Technological advances provide opportunities for automating direct observations of physical activity, which allow for continuous monitoring and feedback. This pilot study evaluated the initial validity of computer vision algorithms for ecological assessment of physical activity. The sample comprised 6630 seconds per camera (three cameras in total) of video capturing up to nine participants engaged in sitting, standing, walking, and jogging in an open outdoor space while wearing accelerometers. Computer vision algorithms were developed to assess the number and proportion of people in sedentary, light, moderate, and vigorous activity, and group-based metabolic equivalents of tasks (MET)-minutes. Means and standard deviations (SD) of bias/difference values, and intraclass correlation coefficients (ICC) assessed the criterion validity compared to accelerometry separately for each camera. The number and proportion of participants sedentary and in moderate-to-vigorous physical activity (MVPA) had small biases (within 20% of the criterion mean) and the ICCs were excellent (0.82-0.98). Total MET-minutes were slightly underestimated by 9.3-17.1% and the ICCs were good (0.68-0.79). The standard deviations of the bias estimates were moderate-to-large relative to the means. The computer vision algorithms appeared to have acceptable sample-level validity (i.e., across a sample of time intervals) and are promising for automated ecological assessment of activity in open outdoor settings, but further development and testing is needed before such tools can be used in a diverse range of settings.

  2. Automated Ecological Assessment of Physical Activity: Advancing Direct Observation

    PubMed Central

    Carlson, Jordan A.; Liu, Bo; Sallis, James F.; Kerr, Jacqueline; Papa, Amy; Dean, Kelsey; Vasconcelos, Nuno M.

    2017-01-01

    Technological advances provide opportunities for automating direct observations of physical activity, which allow for continuous monitoring and feedback. This pilot study evaluated the initial validity of computer vision algorithms for ecological assessment of physical activity. The sample comprised 6630 seconds per camera (three cameras in total) of video capturing up to nine participants engaged in sitting, standing, walking, and jogging in an open outdoor space while wearing accelerometers. Computer vision algorithms were developed to assess the number and proportion of people in sedentary, light, moderate, and vigorous activity, and group-based metabolic equivalents of tasks (MET)-minutes. Means and standard deviations (SD) of bias/difference values, and intraclass correlation coefficients (ICC) assessed the criterion validity compared to accelerometry separately for each camera. The number and proportion of participants sedentary and in moderate-to-vigorous physical activity (MVPA) had small biases (within 20% of the criterion mean) and the ICCs were excellent (0.82–0.98). Total MET-minutes were slightly underestimated by 9.3–17.1% and the ICCs were good (0.68–0.79). The standard deviations of the bias estimates were moderate-to-large relative to the means. The computer vision algorithms appeared to have acceptable sample-level validity (i.e., across a sample of time intervals) and are promising for automated ecological assessment of activity in open outdoor settings, but further development and testing is needed before such tools can be used in a diverse range of settings. PMID:29194358

  3. Quantitative Computed Tomography Ventriculography for Assessment and Monitoring of Hydrocephalus: A Pilot Study and Description of Method in Subarachnoid Hemorrhage.

    PubMed

    Multani, Jasjit Singh; Oermann, Eric Karl; Titano, Joseph; Mascitelli, Justin; Nicol, Kelly; Feng, Rui; Skovrlj, Branko; Pain, Margaret; Mocco, J D; Bederson, Joshua B; Costa, Anthony; Shrivastava, Raj

    2017-08-01

    There is no facile quantitative method for monitoring hydrocephalus (HCP). We propose quantitative computed tomography (CT) ventriculography (qCTV) as a novel computer vision tool for empirically assessing HCP in patients with subarachnoid hemorrhage (SAH). Twenty patients with SAH who were evaluated for ventriculoperitoneal shunt (VPS) placement were selected for inclusion. Ten patients with normal head computed tomography (CTH) findings were analyzed as negative controls. CTH scans were segmented both manually and automatically (by qCTV) to generate measures of ventricular volume. The median manually calculated ventricular volume was 36.1 cm³ (interquartile range [IQR], 30-115 cm³), similar to the median qCTV-measured volume of 37.5 cm³ (IQR, 32-118 cm³) (P = 0.796). Patients undergoing VPS placement demonstrated an increase in median ventricular volume on qCTV from 21 cm³ to 40 cm³ on day T-2 and to 51 cm³ by day 0, a change of 144%. By contrast, in patients who did not require shunting, median ventricular volume decreased from 16 cm³ to 14 cm³ on day T-2 and to 13 cm³ by day 0, an average overall volume decrease of 19% (P = 0.001). The average change in ventricular volume predicted which patients would require VPS placement, successfully identifying 7 of 10 patients (P = 0.004). Using an optimized cutoff of a change in ventricular volume of 2.5 cm³ identified all patients who went on to require VPS placement (10 of 10; P = 0.011). qCTV is a reliable means of quantifying ventricular volume and hydrocephalus. This technique offers a new tool for monitoring neurosurgical patients for hydrocephalus and may be beneficial in future research studies as well as in the routine care of patients with hydrocephalus. Copyright © 2017 Elsevier Inc. All rights reserved.
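
    A sketch of the decision rule the optimized cutoff implies, flagging a patient when serial qCTV volumes rise by more than 2.5 cm³; the scan spacing and the exact form of the rule are assumptions for illustration:

        # qctv_rule.py -- illustrative threshold rule; rule details assumed
        def needs_vps(volumes_cm3, cutoff_cm3=2.5):
            """volumes_cm3: serial qCTV ventricular volumes, oldest first."""
            return (volumes_cm3[-1] - volumes_cm3[0]) > cutoff_cm3

        print(needs_vps([21.0, 40.0, 51.0]))  # True: volume rising across scans
        print(needs_vps([16.0, 14.0, 13.0]))  # False: volume falling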

  4. Accommodative and convergence response to computer screen and printed text

    NASA Astrophysics Data System (ADS)

    Ferreira, Andreia; Lira, Madalena; Franco, Sandra

    2011-05-01

    The aim of this work was to find out whether differences exist in the accommodative and convergence responses to different computer monitors and to printed text, and to relate the horizontal heterophoria value and accommodative response to the symptoms associated with computer use. Two independent experiments were carried out. In the first, the accommodative response of 89 subjects was measured using the Grand Seiko WAM-5500 (Grand Seiko Co., Ltd., Japan). The accommodative response was measured for three computer monitors: a 17-inch cathode ray tube (CRT) and two liquid crystal displays, one 17-inch (LCD17) and one 15-inch (LCD15), as well as for a printed text. The displayed text was the same for all subjects and tests. The second experiment measured the habitual horizontal heterophoria of 80 subjects using the Von Graefe technique. The measurements were obtained with the same target presented on two different computer monitors, a 19-inch cathode ray tube (CRT) and a 19-inch liquid crystal display (LCD), and printed on paper. A short survey on the incidence and prevalence of symptoms was administered in both experiments. In the first experiment, the accommodative response was higher for the CRT and LCDs than for paper, with no significantly different response between the two LCD monitors. The second experiment showed that the heterophoria values were similar for all stimuli; on average, participants presented a small exophoria. In both experiments, asthenopia was the symptom with the highest incidence. There are different accommodative responses when reading on paper and on computer monitors, and this difference is larger for CRT monitors. On the other hand, there was no difference in convergence values between the computer monitors and paper. The symptoms associated with the use of computers are not related to the increase in accommodation or to the horizontal heterophoria values.

  5. Monitoring techniques and alarm procedures for CMS services and sites in WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molina-Perez, J.; Bonacorsi, D.; Gutsche, O.

    2012-01-01

    The CMS offline computing system is composed of roughly 80 sites (including the most experienced T3s) and a number of central services to distribute, process, and analyze data worldwide. A high level of stability and reliability is required from the underlying infrastructure and services, partially covered by local or automated monitoring and alarming systems such as Lemon and SLS; the former collects metrics from sensors installed on computing nodes and triggers alarms when values are out of range, while the latter measures the quality of service and warns managers when a service is affected. CMS has established computing shift procedures with personnel operating worldwide from remote Computing Centers, under the supervision of the Computing Run Coordinator at CERN. This dedicated 24/7 computing shift personnel contributes to detecting and reacting in a timely manner to any unexpected error, and hence ensures that CMS workflows are carried out efficiently and in a sustained manner. Synergy among all the involved actors is exploited to ensure the 24/7 monitoring, alarming, and troubleshooting of the CMS computing sites and services. We review the deployment of the monitoring and alarming procedures and report on the experience gained throughout the first two years of LHC operation. We describe the efficiency of the communication tools employed, the coherent monitoring framework, the proactive alarming systems, and the proficient troubleshooting procedures that helped the CMS computing facilities and infrastructure operate at high reliability levels.

  6. [The Key Technology Study on Cloud Computing Platform for ECG Monitoring Based on Regional Internet of Things].

    PubMed

    Yang, Shu; Qiu, Yuyan; Shi, Bo

    2016-09-01

    This paper explores methods of building a regional Internet of Things for ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types, and it studies the system architecture and key techniques of the cloud computing platform, including server load-balancing technology, reliable storage of massive numbers of small files, and the implementation of a quick search function.
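
    As one concrete example of the server load-balancing technique mentioned, a least-connections policy; the paper does not specify which policy its platform uses, and the node names here are hypothetical:

        # lb_sketch.py -- least-connections selection; policy assumed, not stated
        def pick_server(active_connections):
            """Return the server with the fewest active connections."""
            return min(active_connections, key=active_connections.get)

        print(pick_server({"ecg-node-1": 12, "ecg-node-2": 7, "ecg-node-3": 9}))
        # -> ecg-node-2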

  7. Antineutrino Monitoring of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Brdar, Vedran; Huber, Patrick; Kopp, Joachim

    2017-11-01

    Military and civilian applications of nuclear energy have left a significant amount of spent nuclear fuel over the past 70 years, and in many countries worldwide the use of nuclear energy is on the rise. The management of highly radioactive nuclear waste is therefore a pressing issue. In this paper, we explore antineutrino detectors as a tool for monitoring and safeguarding nuclear-waste material. We compute the flux and spectrum of antineutrinos emitted by spent nuclear fuel elements as a function of time, and we illustrate the usefulness of antineutrino detectors in several benchmark scenarios. In particular, we demonstrate how a measurement of the antineutrino flux can help reverify the contents of a dry storage cask if the conventional monitoring chain is disrupted. We then comment on the usefulness of antineutrino detectors at long-term storage facilities such as Yucca Mountain. Finally, we put forward antineutrino detection as a tool for locating underground "hot spots" in contaminated areas such as the Hanford site in Washington state.

  8. The Use of a Geographic Information System and Remote Sensing Technology for Monitoring Land Use and Soil Carbon Change in the Subtropical Dry Forest Life Zone of Puerto Rico

    NASA Technical Reports Server (NTRS)

    Velez-Rodriguez, Linda L. (Principal Investigator)

    1996-01-01

    Aerial photography, one of the first forms of remote sensing technology, has long been an invaluable means of monitoring activities and conditions at the Earth's surface. A Geographic Information System (GIS) uses computers to display and manipulate spatial data. This report presents the use of geographic information systems and remote sensing technology for monitoring land use and soil carbon change in the subtropical dry forest life zone of Puerto Rico. The research covered the south of Puerto Rico, which belongs to the subtropical dry forest life zone. The Guanica Commonwealth Forest Biosphere Reserve and the Jobos Bay National Estuarine Research Reserve are studied in detail because of their location in this life zone. Aerial photography, digital multispectral imagery, soil samples, soil survey maps, field inspections, and differential global positioning system (DGPS) observations were used.

  9. Monitoring tumor motion by real time 2D/3D registration during radiotherapy.

    PubMed

    Gendrin, Christelle; Furtado, Hugo; Weber, Christoph; Bloch, Christoph; Figl, Michael; Pawiro, Supriyanto Ardjo; Bergmann, Helmar; Stock, Markus; Fichtinger, Gabor; Georg, Dietmar; Birkfellner, Wolfgang

    2012-02-01

    In this paper, we investigate the possibility of using X-ray-based real-time 2D/3D registration for non-invasive tumor motion monitoring during radiotherapy. The 2D/3D registration scheme is implemented using general-purpose computation on graphics hardware (GPGPU) programming techniques and several algorithmic refinements in the registration process. Validation is conducted off-line using a phantom and five clinical patient data sets. The registration is performed on a region of interest (ROI) centered around the planned target volume (PTV). The phantom motion is measured with an rms error of 2.56 mm. For the patient data sets, a sinusoidal movement that clearly correlates with the breathing cycle is shown. Videos show a good match between the X-ray and digitally reconstructed radiograph (DRR) displacements. Mean registration time is 0.5 s. We have demonstrated that real-time organ motion monitoring using image-based markerless registration is feasible. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  10. Semireal Time Monitoring Of The Functional Movements Of The Mandible

    NASA Astrophysics Data System (ADS)

    Isaacson, Robert J.; Baumrind, Sheldon; Curry, Sean; Molthen, Robert A.

    1983-07-01

    Many branches of dental practice would benefit from the availability of a relatively accurate, precise, and efficient method for monitoring the movements of the human mandible during function. Mechanical analog systems have been utilized in the past but these are difficult to quantify, have limited accuracy due to frictional resistance of the components, and contain information only on the borders of the envelopes of possible movement of the landmarks measured (rather than on the functional paths of the landmarks which lie within their envelopes). Those electronic solutions which have been attempted thus far have been prohibitively expensive and time consuming for clinical use, have had lag times between data acquisition and display, or have involved such restrictions of freedom of motion as to render ambiguous the meaning of the data obtained. We report work aimed at developing a relatively non-restrictive semi-real time acoustical system for monitoring the functional movement of the mandible relative to the rest of the head. A set of three sparking devices is mounted to the mandibular component of a light, relatively non-constraining extra-oral harness and another set of three sparkers is attached to the harness' cranial or skull component. The sparkers are fired sequentially by a multiplexer and the sound associated with each firing is recorded by an array of three or more microphones. Computations based on the known speed of sound are used to evaluate the distances between the sparkers and the microphones. These data can then be transformed by computer to provide numeric or graphic information on the movement of selected mandibular landmarks with respect to the skull. Total elapsed time between the firing of the sparkers and the display of graphic information need not exceed 30-60 seconds using even a relatively modest modern computer.
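
    The core computation is time-of-flight ranging; a minimal sketch, assuming sound travels at 343 m/s (the trilateration step that turns distances into 3-D landmark positions is omitted):

        # spark_ranging.py -- time-of-flight to distance; 343 m/s assumed
        C_SOUND = 343.0  # m/s, speed of sound in air at ~20 degrees C

        def distances(times_of_flight_s):
            """Convert spark-to-microphone times of flight into distances (m)."""
            return [C_SOUND * t for t in times_of_flight_s]

        # One spark as heard by three microphones:
        print(distances([0.00051, 0.00089, 0.00067]))  # ~[0.175, 0.305, 0.230] m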

  11. Severe Traumatic Brain Injury at a Tertiary Referral Center in Tanzania: Epidemiology and Adherence to Brain Trauma Foundation Guidelines.

    PubMed

    Smart, Luke R; Mangat, Halinder S; Issarow, Benson; McClelland, Paul; Mayaya, Gerald; Kanumba, Emmanuel; Gerber, Linda M; Wu, Xian; Peck, Robert N; Ngayomela, Isidore; Fakhar, Malik; Stieg, Philip E; Härtl, Roger

    2017-09-01

    Severe traumatic brain injury (TBI) is a major cause of death and disability worldwide. Prospective TBI data from sub-Saharan Africa are sparse. This study examines epidemiology and explores management of patients with severe TBI and adherence to Brain Trauma Foundation Guidelines at a tertiary care referral hospital in Tanzania. Patients with severe TBI hospitalized at Bugando Medical Centre were recorded in a prospective registry including epidemiologic, clinical, treatment, and outcome data. Between September 2013 and October 2015, 371 patients with TBI were admitted; 33% (115/371) had severe TBI. Mean age was 32.0 ± 20.1 years, and most patients were male (80.0%). Vehicular injuries were the most common cause of injury (65.2%). Approximately half of the patients (47.8%) were hospitalized on the day of injury. Computed tomography of the brain was performed in 49.6% of patients, and 58.3% were admitted to the intensive care unit. Continuous arterial blood pressure monitoring and intracranial pressure monitoring were not performed in any patient. Of patients with severe TBI, 38.3% received hyperosmolar therapy, and 35.7% underwent craniotomy. The 2-week mortality was 34.8%. Mortality of patients with severe TBI at Bugando Medical Centre, Tanzania, is approximately twice that in high-income countries. Intensive care unit care, computed tomography imaging, and continuous arterial blood pressure and intracranial pressure monitoring are underused or unavailable in the tertiary referral hospital setting. Improving outcomes after severe TBI will require concerted investment in prehospital care and improved availability of intensive care unit resources, computed tomography, and expertise in multidisciplinary care. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. USE OF TRANSPORTABLE RADIATION DETECTION INSTRUMENTS TO ASSESS INTERNAL CONTAMINATION FROM INTAKES OF RADIONUCLIDES PART I: FIELD TESTS AND MONTE CARLO SIMULATIONS

    PubMed Central

    Anigstein, Robert; Erdman, Michael C.; Ansari, Armin

    2017-01-01

    The detonation of a radiological dispersion device or other radiological incidents could result in the dispersion of radioactive materials and intakes of radionuclides by affected individuals. Transportable radiation monitoring instruments could be used to measure photon radiation from radionuclides in the body for triaging individuals and assigning priorities to their bioassay samples for further assessments. Computer simulations and experimental measurements are required for these instruments to be used for assessing intakes of radionuclides. Count rates from calibrated sources of 60Co, 137Cs, and 241Am were measured on three instruments: a survey meter containing a 2.54 × 2.54-cm NaI(Tl) crystal, a thyroid probe using a 5.08 × 5.08-cm NaI(Tl) crystal, and a portal monitor incorporating two 3.81 × 7.62 × 182.9-cm polyvinyltoluene plastic scintillators. Computer models of the instruments and of the calibration sources were constructed, using engineering drawings and other data provided by the manufacturers. Count rates on the instruments were simulated using the Monte Carlo radiation transport code MCNPX. The computer simulations were within 16% of the measured count rates for all 20 measurements without using empirical radionuclide-dependent scaling factors, as reported by others. The weighted root-mean-square deviations (differences between measured and simulated count rates, added in quadrature and weighted by the variance of the difference) were 10.9% for the survey meter, 4.2% for the thyroid probe, and 0.9% for the portal monitor. These results validate earlier MCNPX models of these instruments that were used to develop calibration factors that enable these instruments to be used for assessing intakes and committed doses from several gamma-emitting radionuclides. PMID:27115229

  13. Use of Transportable Radiation Detection Instruments to Assess Internal Contamination From Intakes of Radionuclides Part I: Field Tests and Monte Carlo Simulations.

    PubMed

    Anigstein, Robert; Erdman, Michael C; Ansari, Armin

    2016-06-01

    The detonation of a radiological dispersion device or other radiological incidents could result in the dispersion of radioactive materials and intakes of radionuclides by affected individuals. Transportable radiation monitoring instruments could be used to measure photon radiation from radionuclides in the body for triaging individuals and assigning priorities to their bioassay samples for further assessments. Computer simulations and experimental measurements are required for these instruments to be used for assessing intakes of radionuclides. Count rates from calibrated sources of 60Co, 137Cs, and 241Am were measured on three instruments: a survey meter containing a 2.54 × 2.54-cm NaI(Tl) crystal, a thyroid probe using a 5.08 × 5.08-cm NaI(Tl) crystal, and a portal monitor incorporating two 3.81 × 7.62 × 182.9-cm polyvinyltoluene plastic scintillators. Computer models of the instruments and of the calibration sources were constructed, using engineering drawings and other data provided by the manufacturers. Count rates on the instruments were simulated using the Monte Carlo radiation transport code MCNPX. The computer simulations were within 16% of the measured count rates for all 20 measurements without using empirical radionuclide-dependent scaling factors, as reported by others. The weighted root-mean-square deviations (differences between measured and simulated count rates, added in quadrature and weighted by the variance of the difference) were 10.9% for the survey meter, 4.2% for the thyroid probe, and 0.9% for the portal monitor. These results validate earlier MCNPX models of these instruments that were used to develop calibration factors that enable these instruments to be used for assessing intakes and committed doses from several gamma-emitting radionuclides.

  14. Computer keyboard interaction as an indicator of early Parkinson’s disease

    PubMed Central

    Giancardo, L.; Sánchez-Ferro, A.; Arroyo-Gallego, T.; Butterworth, I.; Mendoza, C. S.; Montero, P.; Matarazzo, M.; Obeso, J. A.; Gray, M. L.; Estépar, R. San José

    2016-01-01

    Parkinson’s disease (PD) is a slowly progressing neurodegenerative disease with early manifestation of motor signs. Objective measurements of motor signs are of vital importance for diagnosing, monitoring and developing disease modifying therapies, particularly for the early stages of the disease when putative neuroprotective treatments could stop neurodegeneration. Current medical practice has limited tools to routinely monitor PD motor signs with enough frequency and without undue burden for patients and the healthcare system. In this paper, we present data indicating that the routine interaction with computer keyboards can be used to detect motor signs in the early stages of PD. We explore a solution that measures the key hold times (the time required to press and release a key) during the normal use of a computer without any change in hardware and converts it to a PD motor index. This is achieved by the automatic discovery of patterns in the time series of key hold times using an ensemble regression algorithm. This new approach discriminated early PD groups from controls with an AUC = 0.81 (n = 42/43; mean age = 59.0/60.1; women = 43%/60%; PD/controls). The performance was comparable or better than two other quantitative motor performance tests used clinically: alternating finger tapping (AUC = 0.75) and single key tapping (AUC = 0.61). PMID:29194358
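
    A sketch of the raw feature this approach builds on, key hold times, reduced to a toy motor index; the paper's actual index comes from an ensemble regression over automatically discovered patterns, which is not reproduced here:

        # keyhold_features.py -- toy motor index; summary stats are stand-ins
        import numpy as np

        def hold_times(press_release_pairs):
            """press_release_pairs: list of (t_press, t_release) in seconds."""
            return np.array([r - p for p, r in press_release_pairs])

        def motor_index(holds):
            # Toy index: longer and more variable holds give a higher score.
            return float(np.mean(holds) + np.std(holds))

        typing = [(0.00, 0.09), (0.31, 0.42), (0.77, 0.85), (1.20, 1.33)]
        print(round(motor_index(hold_times(typing)), 3))  # ~0.122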

  15. Computer keyboard interaction as an indicator of early Parkinson’s disease

    NASA Astrophysics Data System (ADS)

    Giancardo, L.; Sánchez-Ferro, A.; Arroyo-Gallego, T.; Butterworth, I.; Mendoza, C. S.; Montero, P.; Matarazzo, M.; Obeso, J. A.; Gray, M. L.; Estépar, R. San José

    2016-10-01

    Parkinson’s disease (PD) is a slowly progressing neurodegenerative disease with early manifestation of motor signs. Objective measurements of motor signs are of vital importance for diagnosing, monitoring and developing disease modifying therapies, particularly for the early stages of the disease when putative neuroprotective treatments could stop neurodegeneration. Current medical practice has limited tools to routinely monitor PD motor signs with enough frequency and without undue burden for patients and the healthcare system. In this paper, we present data indicating that the routine interaction with computer keyboards can be used to detect motor signs in the early stages of PD. We explore a solution that measures the key hold times (the time required to press and release a key) during the normal use of a computer without any change in hardware and converts it to a PD motor index. This is achieved by the automatic discovery of patterns in the time series of key hold times using an ensemble regression algorithm. This new approach discriminated early PD groups from controls with an AUC = 0.81 (n = 42/43; mean age = 59.0/60.1; women = 43%/60%; PD/controls). The performance was comparable or better than two other quantitative motor performance tests used clinically: alternating finger tapping (AUC = 0.75) and single key tapping (AUC = 0.61).

  16. Computer keyboard interaction as an indicator of early Parkinson's disease.

    PubMed

    Giancardo, L; Sánchez-Ferro, A; Arroyo-Gallego, T; Butterworth, I; Mendoza, C S; Montero, P; Matarazzo, M; Obeso, J A; Gray, M L; Estépar, R San José

    2016-10-05

    Parkinson's disease (PD) is a slowly progressing neurodegenerative disease with early manifestation of motor signs. Objective measurements of motor signs are of vital importance for diagnosing, monitoring and developing disease modifying therapies, particularly for the early stages of the disease when putative neuroprotective treatments could stop neurodegeneration. Current medical practice has limited tools to routinely monitor PD motor signs with enough frequency and without undue burden for patients and the healthcare system. In this paper, we present data indicating that the routine interaction with computer keyboards can be used to detect motor signs in the early stages of PD. We explore a solution that measures the key hold times (the time required to press and release a key) during the normal use of a computer without any change in hardware and converts it to a PD motor index. This is achieved by the automatic discovery of patterns in the time series of key hold times using an ensemble regression algorithm. This new approach discriminated early PD groups from controls with an AUC = 0.81 (n = 42/43; mean age = 59.0/60.1; women = 43%/60%; PD/controls). The performance was comparable or better than two other quantitative motor performance tests used clinically: alternating finger tapping (AUC = 0.75) and single key tapping (AUC = 0.61).

  17. Development of a cuffless blood pressure measurement system.

    PubMed

    Shyu, Liang-Yu; Kao, Yao-Lin; Tsai, Wen-Ya; Hu, Weichih

    2012-01-01

    This study constructs a novel blood pressure measurement device without an air cuff, to overcome problems of discomfort and portability. The proposed device measures blood pressure through a mechanism made of silicone rubber and a pressure transducer. The system uses a microcontroller to control the measurement procedure and to perform the necessary computation. To verify the feasibility of the constructed device, ten young volunteers were recruited. Ten blood pressure readings were obtained using the new system and compared with ten readings from a bedside monitor (Spacelabs Medical, model 90367). The results indicated that, when all readings were included, the mean, systolic, and diastolic pressures from the new system were all higher than those from the bedside monitor; the correlation coefficients between the two were 0.15, 0.18, and 0.29 for mean, systolic, and diastolic pressures, respectively. After excluding cases of irregular apparatus use, the correlation coefficients increased to 0.71, 0.60, and 0.41 for diastolic, mean, and systolic pressures, respectively. We conclude from these results that accuracy can be improved effectively by defining the user instructions more precisely. The irregular apparatus use mentioned above can be identified and eliminated by the microprocessor to provide reliable blood pressure measurement in practical applications in the future.

  18. Apparatus for monitoring high temperature ultrasonic characterization

    DOEpatents

    Lanagan, M.T.; Kupperman, D.S.; Yaconi, G.A.

    1998-03-24

    A method and an apparatus for nondestructively detecting and evaluating changes in the microstructural properties of a material by employing one or more magnetostrictive transducers linked to the material by means of one or more sonic signal conductors. The magnetostrictive transducer or transducers are connected to a pulser/receiver, which in turn is connected to an oscilloscope. The oscilloscope is connected to a computer that employs an algorithm to evaluate changes in the velocity of a signal transmitted to the material sample as a function of time and temperature. 6 figs.

  19. System and Method for Monitoring Distributed Asset Data

    NASA Technical Reports Server (NTRS)

    Gorinevsky, Dimitry (Inventor)

    2015-01-01

    A computer-based monitoring system, and a monitoring method implemented in computer software, for detecting, estimating, and reporting the condition states, their changes, and anomalies for many assets. The assets are of the same type, are operated over a period of time, and are outfitted with data collection systems. The proposed monitoring method accounts for the variability of working conditions for each asset by using a regression model that characterizes asset performance. Because the assets are of the same type but not identical, the method accounts for asset-to-asset variability; it also accounts for drifts and trends in the asset condition and data. The proposed monitoring system can perform distributed processing of massive amounts of historical data without discarding any useful information, in situations where moving all the asset data into one central computing system might be infeasible. The overall processing includes distributed preprocessing of data records from each asset to produce compressed data.
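
    A minimal sketch of the regression idea: fit a per-asset baseline of performance versus operating condition, then monitor residuals for drift. The model structure and data are illustrative assumptions, not the patented method:

        # asset_monitor.py -- illustrative residual monitoring, not the patent
        import numpy as np

        def fit_baseline(load, output):
            """Least-squares line: expected output as a function of load."""
            A = np.vstack([load, np.ones_like(load)]).T
            coef, *_ = np.linalg.lstsq(A, output, rcond=None)
            return coef  # (slope, intercept)

        def residual(coef, load, output):
            return output - (coef[0] * load + coef[1])

        load = np.array([0.2, 0.5, 0.8, 1.0])
        out = np.array([20.1, 50.3, 79.8, 99.9])
        coef = fit_baseline(load, out)
        print(residual(coef, 0.6, 45.0))  # large negative residual -> anomaly?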

  20. Early prediction of eruption site using lightning location data: An operational real-time system in Iceland

    NASA Astrophysics Data System (ADS)

    Arason, Þórður; Bjornsson, Halldór; Nína Petersen, Guðrún

    2013-04-01

    Eruption of subglacial volcanoes may lead to catastrophic floods, so early determination of the exact eruption site may be critical to civil protection evacuation plans. A system is being developed that automatically monitors and analyses volcanic lightning in Iceland. The system predicts the eruption site location from mean lightning locations, taking into account the upper-level wind. In estimating mean lightning locations, outliers are automatically omitted. A simple wind correction is performed based on the vector wind at the 500 hPa pressure level in the latest radiosonde from Keflavík airport. The system automatically creates a web page with maps and tables showing individual lightning locations and mean locations with and without wind corrections, along with estimates of uncertainty. A dormant automatic monitoring system, waiting for a rare event, potentially for several years, is quite susceptible to degradation during the waiting period, e.g. due to computer or other IT-system upgrades. However, ordinary weather thunderstorms in Iceland initiate the same special monitoring and automatic analysis as a volcanic eruption would, and such events will be used to detect anomalies and malfunctions in the system. The essential elements of this system are described, and an example is presented of how the system would have worked during the first hours of the Grímsvötn 2011 eruption. In that case the exact eruption site, within the Grímsvötn caldera, was first known about 15 hours into the eruption.
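
    A sketch of the two steps described above, an outlier-trimmed mean of lightning locations followed by a wind-vector correction; the trimming rule and drift-time factor are illustrative assumptions, not the operational system's values:

        # eruption_site.py -- illustrative estimator, not the operational system
        import numpy as np

        def trimmed_mean_location(xy, z_max=2.5):
            """Drop points beyond z_max robust-SDs from the median, then average."""
            xy = np.asarray(xy, dtype=float)
            med = np.median(xy, axis=0)
            d = np.linalg.norm(xy - med, axis=1)
            mad = np.median(d) + 1e-9
            keep = d / (1.4826 * mad) < z_max
            return xy[keep].mean(axis=0)

        def wind_corrected(site_xy, wind_uv_ms, drift_seconds=600.0):
            """Shift the mean location upwind by an assumed plume drift time."""
            return site_xy - np.asarray(wind_uv_ms) * drift_seconds / 1000.0  # km

        strikes = [(10.2, 4.9), (9.8, 5.1), (10.1, 5.0), (25.0, 30.0)]  # last = outlier
        mean_xy = trimmed_mean_location(strikes)
        print(wind_corrected(mean_xy, wind_uv_ms=(20.0, 5.0)))  # site estimate, km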

  1. Accuracy of CBCT for volumetric measurement of simulated periapical lesions.

    PubMed

    Ahlowalia, M S; Patel, S; Anwar, H M S; Cama, G; Austin, R S; Wilson, R; Mannocci, F

    2013-06-01

    To compare the accuracy of cone beam computed tomography (CBCT) and micro-computed tomography (μCT) when measuring the volume of bone cavities. Ten irregularly shaped cavities of varying dimensions were created in bovine bone specimens using a rotary diamond bur. The samples were then scanned using the Accuitomo 3D CBCT scanner. The scanned information was converted to the Digital Imaging and Communications in Medicine (DICOM) format for analysis. Once formatted, 10 trained and calibrated examiners segmented the scans and measured the volumes of the lesions. Intra- and interexaminer agreement was assessed by having each examiner re-segment each scan after a 2-week interval. Micro-CT scans were analysed by a single examiner. To obtain a physical reading of the artificially created cavities, replicas were made using dimensionally stable silicone impression material. After measuring the mass of each impression sample, the volume was calculated by dividing the mass by the density of the set impression material. These measurements were further corroborated by employing Archimedes' principle to measure the volume of each impression sample. Intraclass correlation was used to assess agreement. Both CBCT (mean volume: 175.9 mm³) and μCT (mean volume: 163.1 mm³) showed a high degree of agreement (intraclass correlation coefficient >0.9) when compared with both the weighed and the Archimedes' principle measurements (mean volume: 177.7 and 182.6 mm³, respectively). Cone beam computed tomography is an accurate means of measuring the volume of artificially created bone cavities in an ex vivo model. This may provide a valuable tool for monitoring the healing rate of apical periodontitis; further investigations are warranted. © 2012 International Endodontic Journal. Published by Blackwell Publishing Ltd.

  2. Beyond shared perceptions of trust and monitoring in teams: implications of asymmetry and dissensus.

    PubMed

    De Jong, Bart A; Dirks, Kurt T

    2012-03-01

    Past research has implicitly assumed that only mean levels of trust and monitoring in teams are critical for explaining their interrelations and their relationships with team performance. In this article, the authors argue that it is equally important to consider the dispersion in trust and monitoring that exists within teams. The authors introduce "trust asymmetry" and "monitoring dissensus" as critical dispersion properties of trust and monitoring and hypothesize that these moderate the relationships between mean monitoring, mean trust, and team performance. Data from a cross-lagged panel study and a partially lagged study support the hypotheses. The first study also offered support for an integrative model that includes mean and dispersion levels of both trust and monitoring. Overall, the studies provide a comprehensive and clear picture of how trust and monitoring emerge and function at the team level via mean and dispersion.

  3. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...

  4. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...

  5. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...

  6. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...

  7. Sediment inflow, outflow and deposition for Lakes Marion and Moultrie, South Carolina, October 1983-March 1985

    USGS Publications Warehouse

    Cooney, T.W.

    1988-01-01

    In 1941 a Coastal Plain reach of the Santee River was impounded to form Lake Marion and diverted into a diked-off part of the Cooper River basin to form Lake Moultrie. Rates of sediment inflow to and outflow from the lakes were determined by the U.S. Geological Survey for the periods July 1966 - June 1968 and October 1983 - March 1985. Total sediment discharge was estimated at two inflow stations, where continuous streamflow monitors and automatic suspended-sediment samplers were used to compute suspended-sediment discharge; bedload discharge was computed by the modified Einstein procedure. Suspended-sediment discharge was monitored at three outflow stations, with the suspended-sediment concentration measured on a weekly basis. During the 1983-1985 study, mean annual suspended-sediment inflow to Lakes Marion and Moultrie was estimated to be 722,000 tons and the outflow 175,000 tons, for a trap efficiency of 76% and a deposition rate of about 547,000 tons/year. This is about 33% less than the deposition rate determined during the 1966-68 study. The deposition rate for suspended and bedload sediment during the 1983-1985 study was about 650,000 tons/year. (USGS)
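
    The reported trap efficiency and deposition rate follow directly from the inflow and outflow estimates; in LaTeX notation:

        E = \frac{722{,}000 - 175{,}000}{722{,}000} \times 100\% \approx 76\%,
        \qquad
        D = 722{,}000 - 175{,}000 = 547{,}000 \ \text{tons/year}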

  8. Quantitative computed tomography assessment of transfusional iron overload.

    PubMed

    Wood, John C; Mo, Ashley; Gera, Aakansha; Koh, Montre; Coates, Thomas; Gilsanz, Vicente

    2011-06-01

    Quantitative computed tomography (QCT) has been proposed for iron quantification for more than 30 years; however, there has been little clinical validation. We compared liver attenuation by QCT with magnetic resonance imaging (MRI)-derived estimates of liver iron concentration (LIC) in 37 patients with transfusional siderosis. MRI and QCT measurements were performed as clinically indicated monitoring of LIC and vertebral bone density, respectively, over a 6-year period. The mean time difference between QCT and MRI studies was 14 d, with 25 studies performed on the same day. For liver attenuation outside the normal range, attenuation values rose linearly with LIC (r² = 0.94). However, intersubject variability in intrinsic liver attenuation prevented quantitation of LIC <8 mg/g dry weight of liver and was the dominant source of measurement uncertainty. Calculated QCT and MRI accuracies were equivalent for LIC values approaching 22 mg/g dry weight, with QCT having superior performance at higher LICs. Although not suitable for monitoring patients with good iron control, QCT may nonetheless represent a viable technique for liver iron quantitation in patients with moderate to severe iron overload in regions where MRI resources are limited, because of its low cost, availability, and high throughput. © 2011 Blackwell Publishing Ltd.

  9. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources, whether generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  10. Unobtrusive monitoring of computer interactions to detect cognitive status in elders.

    PubMed

    Jimison, Holly; Pavel, Misha; McKanna, James; Pavel, Jesse

    2004-09-01

    The U.S. has experienced a rapid growth in the use of computers by elders. E-mail, Web browsing, and computer games are among the most common routine activities for this group of users. In this paper, we describe techniques for unobtrusively monitoring naturally occurring computer interactions to detect sustained changes in cognitive performance. Researchers have demonstrated the importance of the early detection of cognitive decline. Users over the age of 75 are at risk for medically related cognitive problems and confusion, and early detection allows for more effective clinical intervention. In this paper, we present algorithms for inferring a user's cognitive performance using monitoring data from computer games and psychomotor measurements associated with keyboard entry and mouse movement. The inferences are then used to classify significant performance changes, and additionally, to adapt computer interfaces with tailored hints and assistance when needed. These methods were tested in a group of elders in a residential facility.
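The study's own algorithms are not reproduced in this record. As a rough illustration of the kind of psychomotor feature extraction it describes, the sketch below summarizes inter-keystroke intervals and flags a sustained drift from a user's baseline; the window size and z-threshold are illustrative assumptions, not values from the study.

```python
from statistics import mean, stdev

def keystroke_features(timestamps):
    """Summarize inter-keystroke intervals (seconds) for one session."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return mean(intervals), stdev(intervals)

def sustained_slowing(baseline_mean, baseline_sd, session_means, z=2.0, k=5):
    """Illustrative rule: flag a sustained change when the last k session
    means all exceed the baseline by more than z standard deviations."""
    recent = session_means[-k:]
    return len(recent) == k and all(
        m > baseline_mean + z * baseline_sd for m in recent)
```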

  11. Automatic vehicle monitoring systems study. Report of phase O. Volume 2: Problem definition and derivation of AVM system selection techniques

    NASA Technical Reports Server (NTRS)

    1976-01-01

A set of planning guidelines is presented to help law enforcement agencies and vehicle fleet operators decide which automatic vehicle monitoring (AVM) system could best meet their performance requirements. Improvements in emergency response times, and the resultant cost benefits obtainable with various operational and planned AVM systems, may be synthesized and simulated by means of special computer programs using model-city parameters applicable to small, medium, and large urban areas. Design characteristics of various AVM systems and their implementation requirements are illustrated, and costs are estimated for the vehicles, the fixed sites, and the base equipment. Vehicle location accuracies for different RF links and polling intervals are analyzed. Actual applications and coverage data are tabulated for seven cities whose police departments actively cooperated in the study.

  12. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    NASA Astrophysics Data System (ADS)

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer from the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed with the visual identity of the provided graphical elements in mind, and with re-usability of the visualization components across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with the separation of the data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, automating actions under well-defined conditions that correlate multiple data sources has become feasible. We also discuss the automated exclusion of degraded resources and their automated recovery in various activities.
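As a toy illustration of the data/visualization separation described above (not the actual Experiment Dashboard code), the data layer can expose records in a predefined format that any visualization tool consumes; all names below are hypothetical.

```python
import json

# Hypothetical data layer: expose accounting records in a predefined format.
def data_layer(records):
    return json.dumps({"format_version": 1, "rows": records})

# Hypothetical visualization layer: consumes only the predefined format,
# so filtering and searching work the same across different tools.
def visualization_layer(payload, site=None):
    rows = json.loads(payload)["rows"]
    return [r for r in rows if site is None or r["site"] == site]

payload = data_layer([{"site": "CERN-PROD", "cpu_hours": 1200.0}])
print(visualization_layer(payload, site="CERN-PROD"))
```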

  13. Computer-aided video exposure monitoring.

    PubMed

    Walsh, P T; Clark, R D; Flaherty, S; Gentry, S J

    2000-01-01

A computer-aided video exposure monitoring system was used to record exposure information. The system comprised a handheld camcorder, a portable video cassette recorder, a radio-telemetry transmitter/receiver, handheld or notebook computers for remote data logging, photoionization gas/vapor detectors (PIDs), and a personal aerosol monitor. The following workplaces were surveyed using the system: dry cleaning establishments, monitoring tetrachloroethylene in the air and in breath; printing works, monitoring white-spirit-type solvent; a tire manufacturing factory, monitoring rubber fume; and a slate quarry, monitoring respirable dust and quartz. The system based on the handheld computer, in particular, simplified the data acquisition process compared with earlier systems in use by our laboratory. The equipment is more compact and easier to operate, and allows more accurate calibration of the instrument reading on the video image. Although a variety of data display formats are possible, the best format for videos intended for educational and training purposes was the review-preview chart superimposed on the video image of the work process. Recommendations for reducing exposure by engineering controls or by modifying work practice were made possible through use of the video exposure system in the dry cleaning and tire manufacturing applications. The slate quarry work illustrated how the technique can be used to test ventilation configurations quickly to see their effect on the worker's personal exposure.

  14. Implementation of a computer-assisted monitoring system for the detection of adverse drug reactions in gastroenterology.

    PubMed

    Dormann, H; Criegee-Rieck, M; Neubert, A; Egger, T; Levy, M; Hahn, E G; Brune, K

    2004-02-01

To investigate the effectiveness of a computer monitoring system that detects adverse drug reactions (ADRs) by laboratory signals in gastroenterology, a prospective, 6-month, pharmaco-epidemiological survey was carried out on a gastroenterological ward at the University Hospital Erlangen-Nuremberg. Two methods were used to identify ADRs: (i) all charts were reviewed daily by physicians and clinical pharmacists; (ii) a computer monitoring system generated a daily list of automatic laboratory signals and alerts of ADRs, including patient data and dates of events. One hundred and nine ADRs were detected in 474 admissions (377 patients). The computer monitoring system generated 4,454 automatic laboratory signals from 39,819 laboratory parameters tested, and issued 2,328 alerts, 914 (39%) of which were associated with ADRs; 574 (25%) were associated with ADR-positive admissions. Of all the alerts generated, signals of hepatotoxicity (1,255), followed by coagulation disorders (407) and haematological toxicity (207), were most prevalent. Correspondingly, the prevailing ADRs concerned the metabolic and hepato-gastrointestinal system (61). The sensitivity was 91%: 69 of 76 ADR-positive patients were indicated by an alert. The specificity of alerts increased from 23% to 76% after implementation of an automatic laboratory signal trend-monitoring algorithm. This study shows that a computer monitoring system is a useful tool for the systematic and automated detection of ADRs in gastroenterological patients.
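The headline figures follow directly from the counts reported above; a worked sketch of the arithmetic:

```python
# Counts reported in the abstract.
adr_positive_patients = 76   # patients with at least one ADR
flagged_by_alert = 69        # of those, indicated by a computer alert
alerts, adr_alerts = 2328, 914

sensitivity = flagged_by_alert / adr_positive_patients   # ~0.908 -> 91%
alert_hit_rate = adr_alerts / alerts                     # ~0.393 -> 39%
print(f"sensitivity: {sensitivity:.0%}, alerts linked to ADRs: {alert_hit_rate:.0%}")
```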

  15. Progress Monitoring with Computer Adaptive Assessments: The Impact of Data Collection Schedule on Growth Estimates

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Van Norman, Ethan R.; Klingbeil, Dave A.; Parker, David C.

    2017-01-01

    Although extensive research exists on the use of curriculum-based measures for progress monitoring, little is known about using computer adaptive tests (CATs) for progress-monitoring purposes. The purpose of this study was to evaluate the impact of the frequency of data collection on individual and group growth estimates using a CAT. Data were…

  16. Behavior-Based Fault Monitoring

    DTIC Science & Technology

    1990-12-03

processor targeted for avionics and space applications. It appears that the signature monitoring technique can be extended to detect computer viruses as... most common approach is structural duplication. Although effective, duplication is too expensive for all but a few applications. Redundancy can also be... "Signature Monitoring and Encryption," Int. Conf. on Dependable Computing for Critical Applications, August 1989. 7. K.D. Wilken and J.P. Shen

  17. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    NASA Astrophysics Data System (ADS)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires not only monitoring servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) to maintain a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used to notify the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.
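A toy sketch of the agent pattern described above, in which sensors are sampled on each client and the samples are forwarded to a central measurement repository. The transport, port and host name are invented for illustration; Lemon's real sensor API is not shown here.

```python
import json
import socket
import time

# Illustrative sensor table (the load metric is Linux-only); an agent of
# this style grows by registering more callables here.
SENSORS = {
    "load1": lambda: float(open("/proc/loadavg").read().split()[0]),
}

def collect():
    return {"host": socket.gethostname(), "ts": time.time(),
            "samples": {name: read() for name, read in SENSORS.items()}}

def forward(sample, repository=("lemon-repo.example.org", 12345)):
    # Hypothetical UDP push to a central measurement repository.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(json.dumps(sample).encode(), repository)
```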

  18. The Use of a Microcomputer in Collecting Data from Cardiovascular Experiments on Muscle Relaxants

    PubMed Central

    Thut, Paul D.; Polansky, Gregg; Pruzansky, Elysa

    1983-01-01

The possible association of cardiovascular side-effects with potentially clinically useful non-depolarizing neuromuscular blocking drugs has been studied with the aid of a microcomputer. The maximal changes in heart rate, systolic, diastolic and mean arterial pressure, and pulse pressure were recorded in the onset, maximal-effect and recovery phases of relaxant activity in dogs anesthetized with isoflurane. The data collection system employed a Gould 2800S polygraph, an Apple II Plus microcomputer, a Cyborg Corp. ‘ISAAC’ 12-bit analog-to-digital converter, two 5 1/4″ floppy disk drives, a ‘Videoterm’ 80-column display board and a 12″ green phosphor monitor. Prior to development of the computer system, direct analysis of polygraph records took more than three times as long as the actual experiment. With the aid of the computer, analysis of data, tabular and graphic presentation, and narrative reports were completed within 15 minutes after the end of the experiment.

  19. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
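As an illustration of one of the algorithm families mentioned (Maximum Likelihood deconvolution), here is a minimal Richardson-Lucy iteration in Python with NumPy/SciPy. DeConv_Tool itself is written in IDL, so this is only a conceptual sketch of the update rule, not the package's code.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    """Maximum-likelihood (Richardson-Lucy) deconvolution sketch."""
    estimate = np.full(image.shape, 0.5)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)        # guard against division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```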

  20. Treatment of eight dogs with nasal tumours with alternating doses of doxorubicin and carboplatin in conjunction with oral piroxicam.

    PubMed

    Langova, V; Mutsaers, A J; Phillips, B; Straw, R

    2004-11-01

    To determine the efficacy and toxicity of chemotherapy in the treatment of canine nasal tumours. Retrospective clinical study Eight dogs with histologically confirmed nasal tumours were staged by means of complete blood count, serum biochemical analysis, cytological analysis of fine needle aspirate of the regional lymph nodes, thoracic radiographs and computed tomography scan of the nasal cavity. All dogs were treated with alternating doses of doxorubicin, carboplatin and oral piroxicam. All dogs were monitored for side effects of chemotherapy and evaluated for response to treatment by computed tomography scan of the nasal cavity after the first four treatments. Complete remission was achieved in four dogs, partial remission occurred in two dogs and two had stable disease on the basis of computed tomography evaluation. There was resolution of clinical signs after one to two doses of chemotherapy in all dogs. This chemotherapy protocol was efficacious and well tolerated in this series of eight cases of canine nasal tumours.

  1. Using computer, mobile and wearable technology enhanced interventions to reduce sedentary behaviour: a systematic review and meta-analysis.

    PubMed

    Stephenson, Aoife; McDonough, Suzanne M; Murphy, Marie H; Nugent, Chris D; Mair, Jacqueline L

    2017-08-11

High levels of sedentary behaviour (SB) are associated with negative health consequences. Technology-enhanced solutions such as mobile applications, activity monitors, prompting software, texts, emails and websites are being harnessed to reduce SB. The aim of this paper is to evaluate the effectiveness of such technology-enhanced interventions aimed at reducing SB in healthy adults and to examine the behaviour change techniques (BCTs) used. Five electronic databases were searched to identify randomised controlled trials (RCTs) published up to June 2016. Interventions using computer, mobile or wearable technologies to facilitate a reduction in SB, using a measure of sedentary time as an outcome, were eligible for inclusion. Risk of bias was assessed using the Cochrane Collaboration's tool and interventions were coded using the BCT Taxonomy (v1). Meta-analysis of 15/17 RCTs suggested that computer, mobile and wearable technology tools resulted in a mean reduction of -41.28 minutes per day (min/day) of sitting time (95% CI -60.99, -21.58; I² = 77%; n = 1402) in favour of the intervention group at end-point follow-up. The pooled effects showed mean reductions at short (≤3 months), medium (>3 to 6 months), and long-term follow-up (>6 months) of -42.42 min/day, -37.23 min/day and -1.65 min/day, respectively. Overall, 16/17 studies were deemed to have a high or unclear risk of bias, and 1/17 was judged to be at low risk of bias. A total of 46 BCTs (14 unique) were coded for the computer, mobile and wearable components of the interventions. The most frequently coded were "prompts and cues", "self-monitoring of behaviour", "social support (unspecified)" and "goal setting (behaviour)". Interventions using computer, mobile and wearable technologies can be effective in reducing SB. Effectiveness appeared most prominent in the short term and lessened over time. A range of BCTs have been implemented in these interventions. Future studies need to improve reporting of BCTs within interventions and address the methodological flaws identified within the review through the use of more rigorously controlled study designs with longer-term follow-ups, objective measures of SB and the incorporation of strategies to reduce attrition. The review protocol was registered with PROSPERO: CRD42016038187.
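The abstract does not spell out the pooling model, but mean differences in min/day with an I² statistic are typically pooled with a random-effects model; a sketch of the standard DerSimonian-Laird estimator, offered as an illustration rather than the review's exact procedure:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird), as commonly
    used to pool per-trial mean differences in sitting time (min/day)."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)              # fixed-effect mean
    q = np.sum(w * (y - fixed) ** 2)               # heterogeneity statistic
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, i2
```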

  2. Application of solar max ACRIM data to analyze solar-driven climatic variability on Earth

    NASA Technical Reports Server (NTRS)

    Hoffert, M. I.

    1986-01-01

Terrestrial climatic effects associated with solar variability have been proposed for at least a century, but could not be assessed quantitatively owing to observational uncertainties in solar flux variations. Measurements from 1980 to 1984 by the Active Cavity Radiometer Irradiance Monitor (ACRIM), capable of resolving fluctuations above the sensible atmosphere of less than 0.1% of the solar constant, permit direct albeit preliminary assessments of solar forcing effects on global temperatures during this period. The global temperature response to ACRIM-measured fluctuations was computed from 1980 to 1985 using the NYU transient climate model, which includes the thermal inertia effects of the world ocean, and the results were compared with observations of recent temperature trends. Monthly mean ACRIM-driven global surface temperature fluctuations computed with the climate model are an order of magnitude smaller, of order 0.01 C. In contrast, global mean surface temperature observations indicate an approximately 0.1 C increase during this period. Solar variability is therefore likely to have been a minor factor in global climate change during this period compared with variations in atmospheric albedo, greenhouse gases and internal self-induced oscillations. It was not possible to extend the applicability of the measured flux variations to longer periods, since a possible correlation of luminosity with annual solar activity is not supported by statistical analysis. Continuous monitoring of solar flux by satellite-based instruments over timescales of 20 years or more, comparable to the timescales for thermal relaxation of the oceans and of the solar cycle itself, is needed to resolve the question of long-term solar variation effects on climate.
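A transient response of this kind can be illustrated with a one-box energy-balance model in which ocean heat capacity damps the reaction to a small irradiance fluctuation. The parameter values below are generic textbook choices, not those of the NYU model.

```python
import numpy as np

# One-box energy balance: C dT/dt = dF - lam*T, forced by a ~0.1% solar
# fluctuation. Parameters are illustrative, not the NYU model's values.
C = 4.2e8      # J m^-2 K^-1, effective ocean mixed-layer heat capacity
lam = 1.0      # W m^-2 K^-1, climate feedback parameter
S0, albedo = 1361.0, 0.3

dt = 86400.0                    # one-day time step
days = 5 * 365
T = 0.0
for day in range(days):
    dS = S0 * 1e-3 * np.sin(2 * np.pi * day / 365.0)  # ~0.1% fluctuation
    dF = dS * (1 - albedo) / 4.0                      # global-mean forcing
    T += dt * (dF - lam * T) / C
print(f"temperature response after {days} days: {T:+.4f} K")
```

With these numbers the damped response stays at the hundredth-of-a-degree level, consistent with the order-0.01 C model fluctuations reported above.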

  3. Atmospheric Parameter Climatologies from AIRS: Monitoring Short-, and Longer-Term Climate Variabilities and 'Trends'

    NASA Technical Reports Server (NTRS)

    Molnar, Gyula; Susskind, Joel

    2008-01-01

The AIRS instrument is currently the best space-based tool to simultaneously monitor the vertical distribution of key climatically important atmospheric parameters as well as surface properties, and has provided high-quality data for more than 5 years. AIRS analysis results produced at the Goddard DAAC, based on Versions 4 and 5 of the AIRS retrieval algorithm, are currently available for public use. Here, we first present an assessment of the interrelationships of anomalies of various climate parameters (proxies of climate variability based on 5 full years of data, since Sept. 2002) at different spatial scales. We also present AIRS-retrievals-based global, regional and 1x1 degree grid-scale "trend" analyses of important atmospheric parameters for this 5-year period. Note that here "trend" simply means the linear fit to the anomaly time series (relative to the mean seasonal cycle) of various parameters at the above-mentioned spatial scales, and we present these to illustrate the usefulness of continuing AIRS-based climate observations. Preliminary validation efforts, in terms of intercomparisons of interannual variabilities with other available satellite data analysis results, are also addressed. For example, we show that the outgoing longwave radiation (OLR) interannual spatial variabilities from the available state-of-the-art CERES measurements and from the AIRS computations are in remarkably good agreement. Version 6 of the AIRS retrieval scheme (currently under development) promises to further improve bias agreement in the absolute values by implementing a more accurate radiative transfer model for the OLR computations and by improving surface emissivity retrievals.
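The definition of "trend" given above, a linear fit to anomalies relative to the mean seasonal cycle, can be made concrete in a few lines; a sketch for a monthly series:

```python
import numpy as np

def anomaly_trend(monthly_values):
    """Linear trend (units per year) of anomalies relative to the mean
    seasonal cycle, for a monthly time series starting in month 0."""
    x = np.asarray(monthly_values, float)
    months = np.arange(len(x))
    climatology = np.array([x[m::12].mean() for m in range(12)])
    anomalies = x - climatology[months % 12]
    slope_per_month = np.polyfit(months, anomalies, 1)[0]
    return slope_per_month * 12.0
```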

  4. Energy expenditure in adolescents playing new generation computer games.

    PubMed

    Graves, Lee; Stratton, Gareth; Ridgers, N D; Cable, N T

    2008-07-01

To compare the energy expenditure of adolescents when playing sedentary and new generation active computer games, a cross-sectional comparison of four computer games was carried out in research laboratories with six boys and five girls aged 13-15 years. Participants were fitted with a monitoring device validated to predict energy expenditure. They played four computer games for 15 minutes each; one of the games was sedentary (XBOX 360) and the other three were active (Wii Sports). Predicted energy expenditure was compared using repeated measures analysis of variance. Mean (standard deviation) predicted energy expenditure when playing Wii Sports bowling (190.6 (22.2) kJ/kg/min), tennis (202.5 (31.5) kJ/kg/min), and boxing (198.1 (33.9) kJ/kg/min) was significantly greater than when playing the sedentary game (125.5 (13.7) kJ/kg/min) (P<0.001). Predicted energy expenditure was at least 65.1 (95% confidence interval 47.3 to 82.9) kJ/kg/min greater when playing active rather than sedentary games. Playing new generation active computer games uses significantly more energy than playing sedentary computer games, but not as much energy as playing the sport itself. The energy used when playing active Wii Sports games was not of high enough intensity to contribute towards the recommended daily amount of exercise in children.

  5. Smart Multi-Level Tool for Remote Patient Monitoring Based on a Wireless Sensor Network and Mobile Augmented Reality

    PubMed Central

González, Fernando Cornelio Jiménez; Villegas, Osslan Osiris Vergara; Ramírez, Dulce Esperanza Torres; Sánchez, Vianey Guadalupe Cruz; Domínguez, Humberto Ochoa

    2014-01-01

    Technological innovations in the field of disease prevention and maintenance of patient health have enabled the evolution of fields such as monitoring systems. One of the main advances is the development of real-time monitors that use intelligent and wireless communication technology. In this paper, a system is presented for the remote monitoring of the body temperature and heart rate of a patient by means of a wireless sensor network (WSN) and mobile augmented reality (MAR). The combination of a WSN and MAR provides a novel alternative to remotely measure body temperature and heart rate in real time during patient care. The system is composed of (1) hardware such as Arduino microcontrollers (in the patient nodes), personal computers (for the nurse server), smartphones (for the mobile nurse monitor and the virtual patient file) and sensors (to measure body temperature and heart rate), (2) a network layer using WiFly technology, and (3) software such as LabView, Android SDK, and DroidAR. The results obtained from tests show that the system can perform effectively within a range of 20 m and requires ten minutes to stabilize the temperature sensor to detect hyperthermia, hypothermia or normal body temperature conditions. Additionally, the heart rate sensor can detect conditions of tachycardia and bradycardia. PMID:25230306
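The condition detection described (hypothermia, hyperthermia or normal temperature; tachycardia or bradycardia) amounts to threshold classification. The sketch below uses commonly cited adult cut-offs, which are assumptions for illustration rather than values taken from the paper.

```python
def classify_temperature(celsius):
    # Commonly cited adult cut-offs (assumed here, not from the paper).
    if celsius < 35.0:
        return "hypothermia"
    if celsius > 38.0:
        return "hyperthermia"
    return "normal"

def classify_heart_rate(bpm):
    # Standard adult resting heart-rate cut-offs (assumed here).
    if bpm > 100:
        return "tachycardia"
    if bpm < 60:
        return "bradycardia"
    return "normal"

print(classify_temperature(36.8), classify_heart_rate(72))
```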

  6. Smart multi-level tool for remote patient monitoring based on a wireless sensor network and mobile augmented reality.

    PubMed

    González, Fernando Cornelio Jiménez; Villegas, Osslan Osiris Vergara; Ramírez, Dulce Esperanza Torres; Sánchez, Vianey Guadalupe Cruz; Domínguez, Humberto Ochoa

    2014-09-16

    Technological innovations in the field of disease prevention and maintenance of patient health have enabled the evolution of fields such as monitoring systems. One of the main advances is the development of real-time monitors that use intelligent and wireless communication technology. In this paper, a system is presented for the remote monitoring of the body temperature and heart rate of a patient by means of a wireless sensor network (WSN) and mobile augmented reality (MAR). The combination of a WSN and MAR provides a novel alternative to remotely measure body temperature and heart rate in real time during patient care. The system is composed of (1) hardware such as Arduino microcontrollers (in the patient nodes), personal computers (for the nurse server), smartphones (for the mobile nurse monitor and the virtual patient file) and sensors (to measure body temperature and heart rate), (2) a network layer using WiFly technology, and (3) software such as LabView, Android SDK, and DroidAR. The results obtained from tests show that the system can perform effectively within a range of 20 m and requires ten minutes to stabilize the temperature sensor to detect hyperthermia, hypothermia or normal body temperature conditions. Additionally, the heart rate sensor can detect conditions of tachycardia and bradycardia.

  7. Software For Monitoring A Computer Network

    NASA Technical Reports Server (NTRS)

    Lee, Young H.

    1992-01-01

    SNMAT is rule-based expert-system computer program designed to assist personnel in monitoring status of computer network and identifying defective computers, workstations, and other components of network. Also assists in training network operators. Network for SNMAT located at Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. Intended to serve as data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. SNMAT expected to be adaptable to other computer networks; for example in management of repair, maintenance, and security, or in administration of planning systems, billing systems, or archives.

  8. Results of a 2011 national questionnaire for investigation of mean glandular dose from mammography in Japan.

    PubMed

    Asada, Y; Suzuki, S; Minami, K; Shirakawa, S

    2014-03-01

Diagnostic reference levels (DRLs) for mammography have yet to be created in Japan. A national questionnaire investigation into radiographic conditions in Japan was carried out for the purpose of creating DRLs. Items investigated included the following: tube voltage; tube current; current-time product; source-image distance; craniocaudal view; automatic exposure control (AEC) settings; name of mammography unit; image receptor system (computed radiography (CR), flat panel detector (FPD), or film/screen (F/S)); and supported or unsupported monitor diagnosis (including monitor resolution). The mean glandular dose (MGD) for mammography was estimated and compared with previous investigations. The MGD was 1.58 (0.48) mGy, which did not significantly differ from a 2007 investigation. In relation to image receptors, although no difference in average MGD values was observed between CR and FPD systems, F/S systems had a significantly lower value than both CR and FPD. Concerning digital systems (FPDs), the MGD value of the direct conversion system was significantly higher than that of the indirect conversion system. No significant difference in MGD value was evident concerning the type of monitor diagnosis for either the CR or the FPD digital systems; however, hard copies were used more often with CR. No significant difference in the MGD value was found in relation to monitor resolution. This report suggests ways to lower the doses received by patients undergoing mammography in Japan, and serves as reference data for DRLs based on a 4.2 cm compressed breast of 50% glandular composition. Furthermore, our findings suggest that further optimisation of FPD settings can promote a reduction in the MGD value.

  9. Worker-specific exposure monitor and method for surveillance of workers

    DOEpatents

    Lovejoy, Michael L.; Peeters, John P.; Johnson, A. Wayne

    2000-01-01

    A person-specific monitor that provides sensor information regarding hazards to which the person is exposed and means to geolocate the person at the time of the exposure. The monitor also includes means to communicate with a remote base station. Information from the monitor can be downloaded at the base station for long term storage and analysis. The base station can also include means to recharge the monitor.

  10. Psycho-physiological analysis of an aerobic dance programme for women

    PubMed Central

    Rockefeller, Kathleen A.; Burke, E. J.

    1979-01-01

    The purpose of this study was to determine: (1) the energy cost and (2) the psycho-physiological effects of an aerobic dance programme in young women. Twenty-one college-age women participated 40 minutes a day, three days a week, for a 10-week training period. Each work session included a five-minute warm-up period, a 30-minute stimulus period (including walk-runs) and a five-minute cool-down period. During the last four weeks of the training period, the following parameters were monitored in six of the subjects during two consecutive sessions: perceived exertion (RPE) utilising the Borg 6-20 scale, Mean = 13.19; heart rate (HR) monitored at regular intervals during the training session, Mean = 166.37; and estimated caloric expenditure based on measured oxygen consumption (V̇O2) utilising a Kofranyi-Michaelis respirometer, Mean = 289.32. Multivariate analysis of variance (MANOVA) computed between pre and post tests for the six dependent variables revealed a significant approximate F-ratio of 5.72 (p <.05). Univariate t-test analysis of mean changes revealed significant pre-post test differences for V̇O2 max expressed in ml/kg min-1, maximal pulmonary ventilation, maximal working capacity on the bicycle ergometer, submaximal HR and submaximal RPE. Body weight was not significantly altered. It was concluded that the aerobic dance training programme employed was of sufficient intensity to elicit significant physiological and psycho-physiological alterations in college-age women. PMID:465914

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkening, D.A.

    This paper discusses the extent to which bomber/cruise missile characteristics and activities can be monitored by national technical means (NTM). Particular attention is paid to those characteristics and activities relevant to arms control. National technical means--which refers to various technical means by which monitoring data can be gathered, usually involving satellite reconnaissance--are not the sole means for monitoring, though they may be the most dependable. This paper discusses the confidence one might have in monitoring bombers and cruise missiles.

  12. New and improved apparatus and method for monitoring the intensities of charged-particle beams

    DOEpatents

    Varma, M.N.; Baum, J.W.

    1981-01-16

Charged-particle beam monitoring means are disposed in the path of a charged-particle beam in an experimental device. The monitoring means comprise a beam monitoring component which is operable to prevent passage of a portion of the beam, while concomitantly permitting passage of another portion thereof for incidence in an experimental chamber, and providing a signal (I_m) indicative of the intensity of the beam portion which is not passed. Calibration means are disposed in the experimental chamber in the path of the said another beam portion and are operable to provide a signal (I_f) indicative of the intensity thereof. Means are provided to determine the ratio (R) between said signals whereby, after suitable calibration, the calibration means may be removed from the experimental chamber and the intensity of the said another beam portion determined by monitoring the monitoring means signal alone.
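The calibration logic of the patent can be written compactly: while the calibration means is in place, both signals are available and their ratio R is recorded; afterwards the passed-beam intensity is inferred from the monitor signal alone. A minimal sketch:

```python
# During calibration: the monitor signal I_m (blocked beam portion) and the
# calibration signal I_f (passed beam portion) are measured together.
def calibrate(i_m, i_f):
    return i_m / i_f          # ratio R

# After the calibration means is removed: infer the passed intensity
# from the monitor signal alone.
def passed_intensity(i_m, R):
    return i_m / R

R = calibrate(i_m=2.0e-9, i_f=8.0e-9)   # illustrative currents
print(passed_intensity(i_m=1.5e-9, R=R))
```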

  13. Fault tolerant features and experiments of ANTS distributed real-time system

    NASA Astrophysics Data System (ADS)

    Dominic-Savio, Patrick; Lo, Jien-Chung; Tufts, Donald W.

    1995-01-01

    The ANTS project at the University of Rhode Island introduces the concept of Active Nodal Task Seeking (ANTS) as a way to efficiently design and implement dependable, high-performance, distributed computing. This paper presents the fault tolerant design features that have been incorporated in the ANTS experimental system implementation. The results of performance evaluations and fault injection experiments are reported. The fault-tolerant version of ANTS categorizes all computing nodes into three groups. They are: the up-and-running green group, the self-diagnosing yellow group and the failed red group. Each available computing node will be placed in the yellow group periodically for a routine diagnosis. In addition, for long-life missions, ANTS uses a monitoring scheme to identify faulty computing nodes. In this monitoring scheme, the communication pattern of each computing node is monitored by two other nodes.
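A toy sketch of the grouping policy described, in which every available node is periodically placed in the yellow (self-diagnosing) group and moved to green or red depending on the outcome; the diagnosis itself is stubbed out.

```python
groups = {"green": {"n1", "n2", "n3"}, "yellow": set(), "red": set()}

def self_diagnose(node):
    return True  # stub: a real node would run its diagnostic routines here

def routine_checkup(node):
    """Move an available node through the yellow group for diagnosis."""
    if node in groups["green"]:
        groups["green"].discard(node)
        groups["yellow"].add(node)
        healthy = self_diagnose(node)
        groups["yellow"].discard(node)
        groups["green" if healthy else "red"].add(node)

routine_checkup("n2")
print(groups)
```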

  14. Commercial Smartphone-Based Devices and Smart Applications for Personalized Healthcare Monitoring and Management

    PubMed Central

    Vashist, Sandeep Kumar; Schneider, E. Marion; Luong, John H.T.

    2014-01-01

Smartphone-based devices and applications (SBDAs) with cost effectiveness and remote sensing are the most promising and effective means of delivering mobile healthcare (mHealthcare). Several SBDAs have been commercialized for the personalized monitoring and/or management of basic physiological parameters, such as blood pressure, weight, body analysis, pulse rate, electrocardiograph, blood glucose, blood oxygen saturation, sleep and physical activity. With advances in Bluetooth technology, software, cloud computing and remote sensing, SBDAs provide real-time on-site analysis and telemedicine opportunities in remote areas. This scenario is of utmost importance for developing countries, where smartphone users account for about 70% of the 6.8 billion cell phone subscribers worldwide and access to basic healthcare services is limited. The technology platform facilitates patient-doctor communication and enables patients to effectively manage and keep track of their medical conditions. Besides tremendous healthcare cost savings, SBDAs are critical for the monitoring and effective management of emerging epidemics and food contamination outbreaks. The next decade will witness pioneering advances and increasing applications of SBDAs in this exponentially growing field of mHealthcare. This article provides a critical review of commercial SBDAs that are being widely used for personalized healthcare monitoring and management. PMID:26852680

  15. Commercial Smartphone-Based Devices and Smart Applications for Personalized Healthcare Monitoring and Management.

    PubMed

    Vashist, Sandeep Kumar; Schneider, E Marion; Luong, John H T

    2014-08-18

Smartphone-based devices and applications (SBDAs) with cost effectiveness and remote sensing are the most promising and effective means of delivering mobile healthcare (mHealthcare). Several SBDAs have been commercialized for the personalized monitoring and/or management of basic physiological parameters, such as blood pressure, weight, body analysis, pulse rate, electrocardiograph, blood glucose, blood oxygen saturation, sleep and physical activity. With advances in Bluetooth technology, software, cloud computing and remote sensing, SBDAs provide real-time on-site analysis and telemedicine opportunities in remote areas. This scenario is of utmost importance for developing countries, where smartphone users account for about 70% of the 6.8 billion cell phone subscribers worldwide and access to basic healthcare services is limited. The technology platform facilitates patient-doctor communication and enables patients to effectively manage and keep track of their medical conditions. Besides tremendous healthcare cost savings, SBDAs are critical for the monitoring and effective management of emerging epidemics and food contamination outbreaks. The next decade will witness pioneering advances and increasing applications of SBDAs in this exponentially growing field of mHealthcare. This article provides a critical review of commercial SBDAs that are being widely used for personalized healthcare monitoring and management.

  16. A portable, inexpensive, wireless vital signs monitoring system.

    PubMed

    Kaputa, David; Price, David; Enderle, John D

    2010-01-01

    The University of Connecticut, Department of Biomedical Engineering has developed a device to be used by patients to collect physiological data outside of a medical facility. This device facilitates modes of data collection that would be expensive, inconvenient, or impossible to obtain by traditional means within the medical facility. Data can be collected on specific days, at specific times, during specific activities, or while traveling. The device uses biosensors to obtain information such as pulse oximetry (SpO2), heart rate, electrocardiogram (ECG), non-invasive blood pressure (NIBP), and weight which are sent via Bluetooth to an interactive monitoring device. The data can then be downloaded to an electronic storage device or transmitted to a company server, physician's office, or hospital. The data collection software is usable on any computer device with Bluetooth capability, thereby removing the need for special hardware for the monitoring device and reducing the total cost of the system. The modular biosensors can be added or removed as needed without changing the monitoring device software. The user is prompted by easy-to-follow instructions written in non-technical language. Additional features, such as screens with large buttons and large text, allow for use by those with limited vision or limited motor skills.
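The claim that modular biosensors can be added or removed without changing the monitoring-device software suggests a simple registry design; the sketch below is a hypothetical illustration of that idea, not the device's actual software.

```python
class BiosensorRegistry:
    """Sensors register themselves; the monitor software never changes."""
    def __init__(self):
        self._sensors = {}

    def register(self, name, read_fn):
        self._sensors[name] = read_fn

    def unregister(self, name):
        self._sensors.pop(name, None)

    def read_all(self):
        return {name: read() for name, read in self._sensors.items()}

registry = BiosensorRegistry()
registry.register("heart_rate_bpm", lambda: 72)   # stubbed Bluetooth reads
registry.register("spo2_percent", lambda: 98)
print(registry.read_all())
```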

  17. The Effort to Reduce a Muscle Fatigue Through Gymnastics Relaxation and Ergonomic Approach for Computer Users in Central Building State University of Medan

    NASA Astrophysics Data System (ADS)

    Gultom, Syamsul; Darma Sitepu, Indra; Hasibuan, Nurman

    2018-03-01

Fatigue due to long and continuous computer usage can lead to dominant fatigue problems associated with decreased performance and work motivation. Specific targets were achieved in the first phase of this research: (1) complaints were identified among workers using computers, using the Bourdon Wiersma test kit, and (2) an appropriate relaxation and work-posture design was drafted as a solution to reduce muscle fatigue in computer-based workers. The type of research used in this study is the research and development method, which aims to produce new products or refine existing ones. The final products are a prototype back-holder, a monitor filter, and a relaxation exercise routine, together with a manual book on how to perform the exercise while in front of the computer, to lower the fatigue level of computer users in Unimed's Administration Center. In the first phase, observations and interviews were conducted and the fatigue level of employees using computers at Unimed's Administration Center was identified with the Bourdon Wiersma test, with the following results: (1) the average speed of respondents in BAUK, BAAK and BAPSI after working, with an interpreted speed value of 8.4 (WS 13), was in the good-enough category; (2) the average accuracy of respondents in BAUK, BAAK and BAPSI after working, with an interpreted accuracy value of 5.5 (WS 8), was in the doubtful category, showing that computer users at the Unimed Administration Center experienced significant tiredness; and (3) the average consistency of the fatigue measurements of computer users at Unimed's Administration Center after working, with an interpreted consistency value of 5.5 (WS 8), was also in the doubtful category, which means that computer users at the Unimed Administration Center suffered extreme fatigue. In phase II, based on the results of the first phase, the researchers offer solutions: the prototype back-holder, the monitor filter, and a properly designed relaxation exercise to reduce the fatigue level. Furthermore, to maximize the benefit of the exercise, a manual book will be given to employees who regularly work in front of computers at Unimed's Administration Center.

  18. GEOdetic Data assimilation and EStimation of references for climate change InvEstigation. An overall presentation of the French GEODESIE project

    NASA Astrophysics Data System (ADS)

    Nahmani, S.; Coulot, D.; Biancale, R.; Bizouard, C.; Bonnefond, P.; Bouquillon, S.; Collilieux, X.; Deleflie, F.; Garayt, B.; Lambert, S. B.; Laurent-Varin, S.; Marty, J. C.; Mercier, F.; Metivier, L.; Meyssignac, B.; Pollet, A.; Rebischung, P.; Reinquin, F.; Richard, J. Y.; Tertre, F.; Woppelmann, G.

    2017-12-01

Many major indicators of climate change are monitored with space observations. This monitoring is highly dependent on references that only geodesy can provide. The current accuracy of these references is not sufficient to fully support the challenges that the constantly evolving Earth system gives rise to, and can consequently limit the accuracy of these indicators. Thus, in the framework of the GGOS, stringent requirements have been set for the International Terrestrial Reference Frame (ITRF) for the next decade: an accuracy at the level of 1 mm and a stability at the level of 0.1 mm/yr. This means an improvement of the current quality of the ITRF by a factor of 5-10. Improving the quality of the geodetic references is an issue which requires a thorough reassessment of the methodologies involved. The most relevant and promising method to improve this quality is the direct combination of the space-geodetic measurements used to compute the official references of the IERS. The GEODESIE project aims at (i) determining highly accurate, global and consistent references and (ii) providing the geophysical and climate research communities with these references, for a better estimation of geocentric sea level rise, ice mass balance and ongoing climate changes. Time series of sea levels computed from altimetric data and tide gauge records with these references will also be provided. The geodetic references will be essential bases for Earth observation and monitoring to support the challenges of the century. The geocentric time series of sea levels will make it possible to better apprehend (i) the drivers of the global mean sea level rise and of regional variations of sea level and (ii) the contribution of the global climate change induced by anthropogenic greenhouse gas emissions to these drivers. All the results and the computation and quality-assessment reports will be available at geodesie_anr.ign.fr. This project, supported by the French Agence Nationale de la Recherche (ANR) for the period 2017-2020, will be an unprecedented opportunity to provide the French Groupe de Recherche de Géodésie Spatiale (GRGS) with complete simulation and data processing capabilities, to prepare for the future arrival of space missions such as the European Geodetic Reference Antenna in SPace (E-GRASP), and to contribute significantly to the GGOS with accurate references.

  19. Application research of Ganglia in Hadoop monitoring and management

    NASA Astrophysics Data System (ADS)

    Li, Gang; Ding, Jing; Zhou, Lixia; Yang, Yi; Liu, Lei; Wang, Xiaolei

    2017-03-01

Hadoop has many applications in the fields of big data and cloud computing. The storage and application test bench for the seismic network at the Earthquake Administration of Tianjin runs on a Hadoop system, which is operated and monitored using the open-source software Ganglia. This paper reviews the functions of Ganglia, its installation and configuration process, and the effectiveness of operating and monitoring a Hadoop system with it. It also briefly introduces the approach of monitoring the Hadoop system with the Nagios software and its effect. This work is valuable for industry monitoring systems on cloud computing platforms.
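By default a Ganglia gmond daemon serves an XML dump of its cluster metrics on a TCP port (8649 in stock configurations), which makes a minimal metric poll easy to sketch; the host name below is hypothetical and the parsing assumes the standard HOST/METRIC attribute layout.

```python
import socket
import xml.etree.ElementTree as ET

def poll_gmond(host="hadoop-master", port=8649):
    """Read the XML metric dump a gmond daemon serves on its TCP port."""
    chunks = []
    with socket.create_connection((host, port), timeout=5) as sock:
        while data := sock.recv(4096):
            chunks.append(data)
    root = ET.fromstring(b"".join(chunks))
    # Map (host, metric) pairs to their reported values.
    return {(h.get("NAME"), m.get("NAME")): m.get("VAL")
            for h in root.iter("HOST") for m in h.iter("METRIC")}
```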

  20. Effects of land use on water quality and transport of selected constituents in streams in Mecklenburg County, North Carolina, 1994–98

    USGS Publications Warehouse

    Ferrell, Gloria M.

    2001-01-01

    Transport rates for total solids, total nitrogen, total phosphorus, biochemical oxygen demand, chromium, copper, lead, nickel, and zinc during 1994–98 were computed for six stormwater-monitoring sites in Mecklenburg County, North Carolina. These six stormwater-monitoring sites were operated by the Mecklenburg County Department of Environmental Protection, in cooperation with the City of Charlotte, and are located near the mouths of major streams. Constituent transport at the six study sites generally was dominated by nonpoint sources, except for nitrogen and phosphorus at two sites located downstream from the outfalls of major municipal wastewater-treatment plants.To relate land use to constituent transport, regression equations to predict constituent yield were developed by using water-quality data from a previous study of nine stormwater-monitoring sites on small streams in Mecklenburg County. The drainage basins of these nine stormwater sites have relatively homogeneous land-use characteristics compared to the six study sites. Mean annual construction activity, based on building permit files, was estimated for all stormwater-monitoring sites and included as an explanatory variable in the regression equations. These regression equations were used to predict constituent yield for the six study sites. Predicted yields generally were in agreement with computed yields. In addition, yields were predicted by using regression equations derived from a national urban water-quality database. Yields predicted from the regional regression equations generally were about an order of magnitude lower than computed yields.Regression analysis indicated that construction activity was a major contributor to transport of the constituents evaluated in this study except for total nitrogen and biochemical oxygen demand. Transport of total nitrogen and biochemical oxygen demand was dominated by point-source contributions. The two study basins that had the largest amounts of construction activity also had the highest total solids yields (1,300 and 1,500 tons per square mile per year). The highest total phosphorus yields (3.2 and 1.7 tons per square mile per year) attributable to nonpoint sources also occurred in these basins. Concentrations of chromium, copper, lead, nickel, and zinc were positively correlated with total solids concentrations at most of the study sites (Pearson product-moment correlation >0.50). The site having the highest median concentrations of chromium, copper, and nickel also was the site having the highest computed yield for total solids.

  1. Monitoring diver kinematics with dielectric elastomer sensors

    NASA Astrophysics Data System (ADS)

    Walker, Christopher R.; Anderson, Iain A.

    2017-04-01

    Diving, initially motivated for food purposes, is crucial to the oil and gas industry, search and rescue, and is even done recreationally by millions of people. There is a growing need however, to monitor the health and activity of divers. The Divers Alert Network has reported on average 90 fatalities per year since 1980. Furthermore an estimated 1000 divers require recompression treatment for dive-related injuries every year. One means of monitoring diver activity is to integrate strain sensors into a wetsuit. This would provide kinematic information on the diver potentially improving buoyancy control assessment, providing a platform for gesture communication, detecting panic attacks and monitoring diver fatigue. To explore diver kinematic monitoring we have coupled dielectric elastomer sensors to a wetsuit worn by the pilot of a human-powered wet submarine. This provided a unique platform to test the performance and accuracy of dielectric elastomer strain sensors in an underwater application. The aim of this study was to assess the ability of strain sensors to monitor the kinematics of a diver. This study was in collaboration with the University of Auckland's human-powered submarine team, Team Taniwha. The pilot, completely encapsulated in a hull, pedals to propel the submarine forward. Therefore this study focused on leg motion as that is the primary motion of the submarine pilot. Four carbon-filled silicone dielectric elastomer sensors were fabricated and coupled to the pilot's wetsuit. The first two sensors were attached over the knee joints, with the remaining two attached between the pelvis and thigh. The goal was to accurately measure leg joint angles thereby determining the position of each leg relative to the hip. A floating data acquisition unit monitored the sensors and transmitted data packets to a nearby computer for real-time processing. A GoPro Hero 4 silver edition was used to capture the experiments and provide a means of post-validation. The ability of the sensors to measure joint angles was assessed by examining GoPro footage in the image processing software, ImageJ. This paper applies dielectric elastomer sensor technology to monitoring the leg motion of a diver. The experimental set-up and results are presented and discussed.
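Dielectric elastomer sensors are typically read out as a capacitance. For an ideal incompressible membrane under uniaxial stretch, the plate area grows and the thickness shrinks such that capacitance scales roughly linearly with the stretch ratio (C ≈ λC0), giving a simple route from readout to joint angle. The sketch below rests on that idealization; the linear angle calibration is hypothetical, not the study's fit.

```python
def stretch_from_capacitance(c, c0):
    """Uniaxial stretch ratio of an ideal incompressible DE sensor,
    assuming C = lambda * C0 (an idealization, not the paper's model)."""
    return c / c0

def knee_angle_deg(stretch, gain=120.0, offset=0.0):
    """Hypothetical linear calibration from sensor stretch to joint angle."""
    return offset + gain * (stretch - 1.0)

# Example: a 25% capacitance rise maps to ~30 degrees of knee flexion
# under these assumed calibration constants.
print(knee_angle_deg(stretch_from_capacitance(1.25e-9, 1.0e-9)))
```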

  2. Intensity Thresholds on Raw Acceleration Data: Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) Approaches

    PubMed Central

    Bakrania, Kishan; Yates, Thomas; Rowlands, Alex V.; Esliger, Dale W.; Bunnewell, Sarah; Sanders, James; Davies, Melanie; Khunti, Kamlesh; Edwardson, Charlotte L.

    2016-01-01

    Objectives (1) To develop and internally-validate Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) thresholds for separating sedentary behaviours from common light-intensity physical activities using raw acceleration data collected from both hip- and wrist-worn tri-axial accelerometers; and (2) to compare and evaluate the performances between the ENMO and MAD metrics. Methods Thirty-three adults [mean age (standard deviation (SD)) = 27.4 (5.9) years; mean BMI (SD) = 23.9 (3.7) kg/m2; 20 females (60.6%)] wore four accelerometers; an ActiGraph GT3X+ and a GENEActiv on the right hip; and an ActiGraph GT3X+ and a GENEActiv on the non-dominant wrist. Under laboratory-conditions, participants performed 16 different activities (11 sedentary behaviours and 5 light-intensity physical activities) for 5 minutes each. ENMO and MAD were computed from the raw acceleration data, and logistic regression and receiver-operating-characteristic (ROC) analyses were implemented to derive thresholds for activity discrimination. Areas under ROC curves (AUROC) were calculated to summarise performances and thresholds were assessed via executing leave-one-out-cross-validations. Results For both hip and wrist monitor placements, in comparison to the ActiGraph GT3X+ monitors, the ENMO and MAD values derived from the GENEActiv devices were observed to be slightly higher, particularly for the lower-intensity activities. Monitor-specific hip and wrist ENMO and MAD thresholds showed excellent ability for separating sedentary behaviours from motion-based light-intensity physical activities (in general, AUROCs >0.95), with validation indicating robustness. However, poor classification was experienced when attempting to isolate standing still from sedentary behaviours (in general, AUROCs <0.65). The ENMO and MAD metrics tended to perform similarly across activities and accelerometer brands. Conclusions Researchers can utilise these robust monitor-specific hip and wrist ENMO and MAD thresholds, in order to accurately separate sedentary behaviours from common motion-based light-intensity physical activities. However, caution should be taken if isolating sedentary behaviours from standing is of particular interest. PMID:27706241
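Both metrics have simple per-epoch definitions over raw tri-axial samples expressed in g units: ENMO is the Euclidean norm of each sample minus 1 g, with negative values clipped to zero, and MAD is the mean absolute deviation of the norm around its epoch mean. A minimal sketch:

```python
import numpy as np

def enmo(xyz):
    """Euclidean Norm Minus One, averaged over an epoch of raw samples
    (xyz in g units, shape (n, 3)); negative values are clipped to zero."""
    r = np.linalg.norm(xyz, axis=1)
    return np.maximum(r - 1.0, 0.0).mean()

def mad(xyz):
    """Mean Amplitude Deviation of the vector magnitude over an epoch."""
    r = np.linalg.norm(xyz, axis=1)
    return np.abs(r - r.mean()).mean()

# Synthetic near-still epoch: ~1 g on the z axis plus small noise.
epoch = np.random.normal([0.0, 0.0, 1.0], 0.02, size=(500, 3))
print(enmo(epoch), mad(epoch))
```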

  3. Intensity Thresholds on Raw Acceleration Data: Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) Approaches.

    PubMed

    Bakrania, Kishan; Yates, Thomas; Rowlands, Alex V; Esliger, Dale W; Bunnewell, Sarah; Sanders, James; Davies, Melanie; Khunti, Kamlesh; Edwardson, Charlotte L

    2016-01-01

    (1) To develop and internally-validate Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) thresholds for separating sedentary behaviours from common light-intensity physical activities using raw acceleration data collected from both hip- and wrist-worn tri-axial accelerometers; and (2) to compare and evaluate the performances between the ENMO and MAD metrics. Thirty-three adults [mean age (standard deviation (SD)) = 27.4 (5.9) years; mean BMI (SD) = 23.9 (3.7) kg/m2; 20 females (60.6%)] wore four accelerometers; an ActiGraph GT3X+ and a GENEActiv on the right hip; and an ActiGraph GT3X+ and a GENEActiv on the non-dominant wrist. Under laboratory-conditions, participants performed 16 different activities (11 sedentary behaviours and 5 light-intensity physical activities) for 5 minutes each. ENMO and MAD were computed from the raw acceleration data, and logistic regression and receiver-operating-characteristic (ROC) analyses were implemented to derive thresholds for activity discrimination. Areas under ROC curves (AUROC) were calculated to summarise performances and thresholds were assessed via executing leave-one-out-cross-validations. For both hip and wrist monitor placements, in comparison to the ActiGraph GT3X+ monitors, the ENMO and MAD values derived from the GENEActiv devices were observed to be slightly higher, particularly for the lower-intensity activities. Monitor-specific hip and wrist ENMO and MAD thresholds showed excellent ability for separating sedentary behaviours from motion-based light-intensity physical activities (in general, AUROCs >0.95), with validation indicating robustness. However, poor classification was experienced when attempting to isolate standing still from sedentary behaviours (in general, AUROCs <0.65). The ENMO and MAD metrics tended to perform similarly across activities and accelerometer brands. Researchers can utilise these robust monitor-specific hip and wrist ENMO and MAD thresholds, in order to accurately separate sedentary behaviours from common motion-based light-intensity physical activities. However, caution should be taken if isolating sedentary behaviours from standing is of particular interest.

  4. Network Monitoring and Fault Detection on the University of Illinois at Urbana-Champaign Campus Computer Network.

    ERIC Educational Resources Information Center

    Sng, Dennis Cheng-Hong

    The University of Illinois at Urbana-Champaign (UIUC) has a large campus computer network serving a community of about 20,000 users. With such a large network, it is inevitable that there are a wide variety of technologies co-existing in a multi-vendor environment. Effective network monitoring tools can help monitor traffic and link usage, as well…

  5. Development of a real-time bridge structural monitoring and warning system: a case study in Thailand

    NASA Astrophysics Data System (ADS)

    Khemapech, I.; Sansrimahachai, W.; Toachoodee, M.

    2017-04-01

Engineering structures are a physical foundation of societal and civil development and are required to support the growth of a nation; they also affect the quality of life and safety of its citizens. In addition to dead load (the structure's own weight) and live load, structural members are significantly affected by disasters and the environment. Proper inspection and damage detection are thus crucial during both regular operation and unsafe events. An Enhanced Structural Health Monitoring System Using Stream Processing and Artificial Neural Network Techniques (SPANNeT) has been developed and is described in this paper. SPANNeT applies a wireless sensor network, real-time data stream processing, and an artificial neural network based upon the measured bending strains. Its major contributions include effective, accurate and energy-aware data communication and damage detection for engineering structures. Strain thresholds for launching several warning levels have been defined according to computer simulation results and the AASHTO (American Association of State Highway and Transportation Officials) LRFD (Load and Resistance Factor Design) Bridge Design specifications. SPANNeT has been tested and evaluated by means of computer-based simulation and on site. According to the measurements, the observed maximum values are 25 to 30 microstrains during normal operation. The given protocol provided at least 90% data communication reliability. SPANNeT is capable of efficient real-time data reporting, monitoring and warning conforming to the predefined thresholds, which can be adjusted according to user requirements and structural engineering characteristics.
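The multi-level warning logic, which compares measured bending strains against thresholds derived from simulation and the AASHTO LRFD specifications, reduces to a threshold ladder. The numbers below are placeholders, not SPANNeT's calibrated values.

```python
# Placeholder thresholds in microstrain; SPANNeT derives its real values
# from simulation results and the AASHTO LRFD design specifications.
WARNING_LEVELS = [(30.0, "normal"), (60.0, "caution"), (100.0, "alert")]

def warning_level(microstrain):
    for threshold, label in WARNING_LEVELS:
        if abs(microstrain) <= threshold:
            return label
    return "critical"

print(warning_level(27.0))   # within the ~25-30 microstrain normal range
```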

  6. Statistical power for detecting trends with applications to seabird monitoring

    USGS Publications Warehouse

    Hatch, Shyla A.

    2003-01-01

    Power analysis is helpful in defining goals for ecological monitoring and evaluating the performance of ongoing efforts. I examined detection standards proposed for population monitoring of seabirds using two programs (MONITOR and TRENDS) specially designed for power analysis of trend data. Neither program models within- and among-years components of variance explicitly and independently, thus an error term that incorporates both components is an essential input. Residual variation in seabird counts consisted of day-to-day variation within years and unexplained variation among years in approximately equal parts. The appropriate measure of error for power analysis is the standard error of estimation (S.E.est) from a regression of annual means against year. Replicate counts within years are helpful in minimizing S.E.est but should not be treated as independent samples for estimating power to detect trends. Other issues include a choice of assumptions about variance structure and selection of an exponential or linear model of population change. Seabird count data are characterized by strong correlations between S.D. and mean, thus a constant CV model is appropriate for power calculations. Time series were fit about equally well with exponential or linear models, but log transformation ensures equal variances over time, a basic assumption of regression analysis. Using sample data from seabird monitoring in Alaska, I computed the number of years required (with annual censusing) to detect trends of -1.4% per year (50% decline in 50 years) and -2.7% per year (50% decline in 25 years). At α = 0.05 and a desired power of 0.9, estimated study intervals ranged from 11 to 69 years depending on species, trend, software, and study design. Power to detect a negative trend of 6.7% per year (50% decline in 10 years) is suggested as an alternative standard for seabird monitoring that achieves a reasonable match between statistical and biological significance.
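
    The recommended error term is easy to reproduce: regress the log-transformed annual means against year and take the residual standard error. A minimal sketch with simulated counts (not the Alaska data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(2000, 2012)
    counts = 1000.0 * np.exp(-0.014 * (years - years[0]))   # simulated -1.4%/yr trend
    counts *= np.exp(rng.normal(0.0, 0.1, years.size))      # among-year noise

    y = np.log(counts)                      # log transform equalises variance over time
    slope, intercept = np.polyfit(years, y, 1)
    residuals = y - (slope * years + intercept)
    se_est = np.sqrt(np.sum(residuals**2) / (years.size - 2))   # S.E. of estimation
    print(f"trend = {100 * (np.exp(slope) - 1):.2f}%/yr, S.E.est = {se_est:.3f}")
    ```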

  7. Step Detection and Activity Recognition Accuracy of Seven Physical Activity Monitors

    PubMed Central

    Storm, Fabio A.; Heller, Ben W.; Mazzà, Claudia

    2015-01-01

    The aim of this study was to compare the seven following commercially available activity monitors in terms of step count detection accuracy: Movemonitor (Mc Roberts), Up (Jawbone), One (Fitbit), ActivPAL (PAL Technologies Ltd.), Nike+ Fuelband (Nike Inc.), Tractivity (Kineteks Corp.) and Sensewear Armband Mini (Bodymedia). Sixteen healthy adults consented to take part in the study. The experimental protocol included walking along an indoor straight walkway, descending and ascending 24 steps, free outdoor walking and free indoor walking. These tasks were repeated at three self-selected walking speeds. Angular velocity signals collected at both shanks using two wireless inertial measurement units (OPAL, ADPM Inc) were used as a reference for the step count, computed using previously validated algorithms. Step detection accuracy was assessed using the mean absolute percentage error computed for each sensor. The Movemonitor and the ActivPAL were also tested within a nine-minute activity recognition protocol, during which the participants performed a set of complex tasks. Posture classifications were obtained from the two monitors and expressed as a percentage of the total task duration. The Movemonitor, One, ActivPAL, Nike+ Fuelband and Sensewear Armband Mini underestimated the number of steps in all the observed walking speeds, whereas the Tractivity significantly overestimated step count. The Movemonitor was the best performing sensor, with an error lower than 2% at all speeds and the smallest error obtained in the outdoor walking. The activity recognition protocol showed that the Movemonitor performed best in the walking recognition, but had difficulty in discriminating between standing and sitting. Results of this study can be used to inform choice of a monitor for specific applications. PMID:25789630
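
    The accuracy measure is straightforward to reproduce. A brief sketch of the mean absolute percentage error of a monitor's step counts against the reference counts, using hypothetical trial values:

    ```python
    import numpy as np

    def mape(measured, reference):
        """Mean absolute percentage error of a monitor's counts vs the reference."""
        measured = np.asarray(measured, dtype=float)
        reference = np.asarray(reference, dtype=float)
        return 100.0 * np.mean(np.abs(measured - reference) / reference)

    reference_steps = [512, 498, 505]    # reference counts per trial (hypothetical)
    monitor_steps = [500, 490, 512]      # one monitor's counts for the same trials
    print(f"MAPE = {mape(monitor_steps, reference_steps):.2f}%")
    ```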

  8. Step detection and activity recognition accuracy of seven physical activity monitors.

    PubMed

    Storm, Fabio A; Heller, Ben W; Mazzà, Claudia

    2015-01-01

    The aim of this study was to compare the seven following commercially available activity monitors in terms of step count detection accuracy: Movemonitor (Mc Roberts), Up (Jawbone), One (Fitbit), ActivPAL (PAL Technologies Ltd.), Nike+ Fuelband (Nike Inc.), Tractivity (Kineteks Corp.) and Sensewear Armband Mini (Bodymedia). Sixteen healthy adults consented to take part in the study. The experimental protocol included walking along an indoor straight walkway, descending and ascending 24 steps, free outdoor walking and free indoor walking. These tasks were repeated at three self-selected walking speeds. Angular velocity signals collected at both shanks using two wireless inertial measurement units (OPAL, ADPM Inc) were used as a reference for the step count, computed using previously validated algorithms. Step detection accuracy was assessed using the mean absolute percentage error computed for each sensor. The Movemonitor and the ActivPAL were also tested within a nine-minute activity recognition protocol, during which the participants performed a set of complex tasks. Posture classifications were obtained from the two monitors and expressed as a percentage of the total task duration. The Movemonitor, One, ActivPAL, Nike+ Fuelband and Sensewear Armband Mini underestimated the number of steps in all the observed walking speeds, whereas the Tractivity significantly overestimated step count. The Movemonitor was the best performing sensor, with an error lower than 2% at all speeds and the smallest error obtained in the outdoor walking. The activity recognition protocol showed that the Movemonitor performed best in the walking recognition, but had difficulty in discriminating between standing and sitting. Results of this study can be used to inform choice of a monitor for specific applications.

  9. Monitoring system including an electronic sensor platform and an interrogation transceiver

    DOEpatents

    Kinzel, Robert L.; Sheets, Larry R.

    2003-09-23

    A wireless monitoring system suitable for a wide range of remote data collection applications. The system includes at least one Electronic Sensor Platform (ESP), an Interrogator Transceiver (IT) and a general purpose host computer. The ESP functions as a remote data collector from a number of digital and analog sensors located therein. The host computer provides for data logging, testing, demonstration, installation checkout, and troubleshooting of the system. The IT relays signals between the host computer and one or more ESPs. The IT and host computer may be powered by a common power supply, and each ESP is individually powered by a battery. This monitoring system has an extremely low power consumption which allows remote operation of the ESP for long periods; provides authenticated message traffic over a wireless network; utilizes state-of-health and tamper sensors to ensure that the ESP is secure and undamaged; has robust housing of the ESP suitable for use in radiation environments; and is low in cost. With one base station (host computer and interrogator transceiver), multiple ESPs may be controlled at a single monitoring site.

  10. Electronics Environmental Benefits Calculator

    EPA Pesticide Factsheets

    The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase, use and disposal of electronics. The EEBC estimates the environmental and economic benefits of: Purchasing Electronic Product Environmental Assessment Tool (EPEAT)-registered products; Enabling power management features on computers and monitors above default percentages; Extending the life of equipment beyond baseline values; Reusing computers, monitors and cell phones; and Recycling computers, monitors, cell phones and loads of mixed electronic products. The EEBC may be downloaded as a Microsoft Excel spreadsheet. See https://www.federalelectronicschallenge.net/resources/bencalc.htm for more details.

  11. Models for interrupted monitoring of a stochastic process

    NASA Technical Reports Server (NTRS)

    Palmer, E.

    1977-01-01

    As computers are added to the cockpit, the pilot's job is changing from one of manually flying the aircraft to one of supervising computers which are doing navigation, guidance and energy management calculations as well as automatically flying the aircraft. In this supervisory role the pilot must divide his attention between monitoring the aircraft's performance and giving commands to the computer. Normative strategies are developed for tasks where the pilot must interrupt his monitoring of a stochastic process in order to attend to other duties. Results are given as to how characteristics of the stochastic process and the other tasks affect the optimal strategies.

  12. Launch Processing System. [for Space Shuttle

    NASA Technical Reports Server (NTRS)

    Byrne, F.; Doolittle, G. V.; Hockenberger, R. W.

    1976-01-01

    This paper presents a functional description of the Launch Processing System, which provides automatic ground checkout and control of the Space Shuttle launch site and airborne systems, with emphasis placed on the Checkout, Control, and Monitor Subsystem. Hardware and software modular design concepts for the distributed computer system are reviewed relative to performing system tests, launch operations control, and status monitoring during ground operations. The communication network design, which uses a Common Data Buffer interface to all computers to allow computer-to-computer communication, is discussed in detail.

  13. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types, and both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification and a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
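
    The paper's modified genetic algorithm is not reproduced here, but the shape of the discrete multi-location, multi-type search can be conveyed with a toy random search over candidate designs scored by a placeholder data-worth function; the location set, sensor-type names and utility below are all assumptions for illustration.

    ```python
    import itertools
    import random

    locations = range(20)                        # candidate monitoring points (assumed)
    sensor_types = ("head", "concentration")     # two hypothetical sensor types
    n_sensors = 3

    def data_worth(design):
        """Placeholder utility: reward spread-out locations and mixed sensor types."""
        locs = [loc for loc, _ in design]
        return (max(locs) - min(locs)) + 5 * len({typ for _, typ in design})

    random.seed(4)
    candidates = list(itertools.product(locations, sensor_types))
    best_design, best_score = None, float("-inf")
    for _ in range(2000):                        # sample archive and GA operators omitted
        design = tuple(random.sample(candidates, n_sensors))
        score = data_worth(design)
        if score > best_score:
            best_design, best_score = design, score
    print(best_design, best_score)
    ```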

  14. Space weather monitoring by ground-based means carried out in Polar Geophysical Center at Arctic and Antarctic Research Institute

    NASA Astrophysics Data System (ADS)

    Janzhura, Alexander

    Real-time information on geophysical processes in the polar regions is very important for Space Weather monitoring by ground-based means. Modern communication systems and computer technology make it possible to collect and process data from remote sites without significant delays. New acquisition equipment, based on microprocessor modules and reliable in harsh climatic conditions, was deployed at the Roshydromet networks of geophysical observations in the Arctic and is being deployed at observatories in the Antarctic. A contemporary system for on-line collection and transmission of geophysical data from the Arctic and Antarctic stations to AARI has been realized, and the Polar Geophysical Center (PGC) arranged at AARI ensures near-real-time processing and analysis of geophysical information from 11 stations in the Arctic and 5 stations in the Antarctic. Space weather monitoring by ground-based means is one of the main tasks of the Polar Geophysical Center. As studies by Troshichev and Janzhura [2012] showed, the PC index characterizing polar cap magnetic activity is an adequate indicator of the solar wind energy that enters the magnetosphere and of the energy accumulating in the magnetosphere. A great advantage of the PC index over other methods based on satellite data is the permanent on-line availability of information about magnetic activity in both the northern and southern polar caps. A special procedure agreed between the Arctic and Antarctic Research Institute (AARI) and the Space Institute of the Danish Technical University (DTUSpace) ensures calculation of the unified PC index in quasi-real time from magnetic data of the Thule and Vostok stations (see public site: http://pc-index.org). A method for estimating the AL and Dst indices (as indicators of the state of the disturbed magnetosphere) from foregoing PC indices has been developed and tested at the Polar Geophysical Center. It is demonstrated that the PC index can be successfully used to monitor the state of the magnetosphere (space weather monitoring) and the readiness of the magnetosphere to produce a substorm or storm (space weather nowcasting).

  15. [Economic efficiency of computer monitoring of health].

    PubMed

    Il'icheva, N P; Stazhadze, L L

    2001-01-01

    Presents a method of computer monitoring of health based on the use of modern information technologies in public health. The method helps organize the preventive activities of an outpatient clinic at a high level and substantially decreases losses of time and money. The efficiency of such preventive measures, together with the increasing number of computer and Internet users, suggests that such methods are promising and that further studies in this field are needed.

  16. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  17. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  18. Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets

    PubMed Central

    Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  19. Detection of Failure in Asynchronous Motor Using Soft Computing Method

    NASA Astrophysics Data System (ADS)

    Vinoth Kumar, K.; Sony, Kevin; Achenkunju John, Alan; Kuriakose, Anto; John, Ano P.

    2018-04-01

    This paper investigates stator winding short-circuit failures of the asynchronous motor and their effects on the motor current spectrum. A fuzzy logic approach, i.e., a model-based technique, can help to detect asynchronous motor failures. Fuzzy logic resembles human reasoning in that it enables linguistic inferences to be drawn from vague data. A dynamic model of the asynchronous motor is developed with a fuzzy logic classifier to investigate stator inter-turn failure and open-phase failure. A hardware implementation was carried out with LabVIEW for the on-line monitoring of faults.

  20. Development of a tree classifier for discrimination of surface mine activity from Landsat digital data

    NASA Technical Reports Server (NTRS)

    Solomon, J. L.; Miller, W. F.; Quattrochi, D. A.

    1979-01-01

    In a cooperative project with the Geological Survey of Alabama, the Mississippi State Remote Sensing Applications Program has developed a single purpose, decision-tree classifier using band-ratioing techniques to discriminate various stages of surface mining activity. The tree classifier has four levels and employs only two channels in classification at each level. An accurate computation of the amount of disturbed land resulting from the mining activity can be made as a product of the classification output. The utilization of Landsat data provides a cost-efficient, rapid, and accurate means of monitoring surface mining activities.

  1. Metals processing control by counting molten metal droplets

    DOEpatents

    Schlienger, Eric; Robertson, Joanna M.; Melgaard, David; Shelmidine, Gregory J.; Van Den Avyle, James A.

    2000-01-01

    Apparatus and method for controlling metals processing (e.g., ESR) by melting a metal ingot and counting molten metal droplets during melting. An approximate amount of metal in each droplet is determined, and a melt rate is computed therefrom. Impedance of the melting circuit is monitored, such as by calculating the root-mean-square voltage and current of the circuit and dividing the calculated voltage by the calculated current. The impedance signal is analyzed for a trace characteristic of the formation of a molten metal droplet, such as by examining skew rate, curvature, or a higher moment.
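
    The impedance-monitoring step can be sketched directly from the description: compute the root-mean-square voltage and current over a sampling window and divide. A minimal example with simulated waveforms (the droplet-signature analysis of skew rate, curvature or higher moments is not shown):

    ```python
    import numpy as np

    t = np.linspace(0.0, 0.1, 1000)
    voltage = 40.0 * np.sin(2 * np.pi * 60 * t)      # simulated melt-circuit voltage, V
    current = 8000.0 * np.sin(2 * np.pi * 60 * t)    # simulated melt-circuit current, A

    v_rms = np.sqrt(np.mean(voltage**2))             # root-mean-square voltage
    i_rms = np.sqrt(np.mean(current**2))             # root-mean-square current
    impedance = v_rms / i_rms                        # ohms: RMS voltage over RMS current
    print(f"Z = {impedance * 1e3:.2f} milliohm")
    ```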

  2. Real-time failure control (SAFD)

    NASA Technical Reports Server (NTRS)

    Panossian, Hagop V.; Kemp, Victoria R.; Eckerling, Sherry J.

    1990-01-01

    The Real Time Failure Control program involves development of a failure detection algorithm, referred to as the System for Failure and Anomaly Detection (SAFD), for the Space Shuttle Main Engine (SSME). This failure detection approach is signal-based: it entails monitoring SSME measurement signals against predetermined and computed mean values and standard deviations. Twenty-four engine measurements are included in the algorithm, and provisions are made to add more parameters if needed. Six major sections of research are presented: (1) SAFD algorithm development; (2) SAFD simulations; (3) Digital Transient Model failure simulation; (4) closed-loop simulation; (5) current limitations of SAFD; and (6) planned enhancements.
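
    The signal-based detection idea reduces to a band check: flag samples that leave a band of k standard deviations around a predetermined mean. The channel statistics and injected fault below are illustrative, not actual SSME parameters:

    ```python
    import numpy as np

    def out_of_band(samples, mean, std, k=3.0):
        """Indices of samples outside mean +/- k * std."""
        samples = np.asarray(samples, dtype=float)
        return np.flatnonzero(np.abs(samples - mean) > k * std)

    nominal_mean, nominal_std = 3000.0, 25.0     # made-up channel statistics
    rng = np.random.default_rng(2)
    signal = nominal_mean + rng.normal(0.0, 10.0, 100)
    signal[60:] += 200.0                         # injected anomaly from sample 60 on
    flagged = out_of_band(signal, nominal_mean, nominal_std)
    print("first flagged sample:", flagged[0])
    ```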

  3. Structural health monitoring (vibration) as a tool for identifying structural alterations of the lumbar spine: a twin control study.

    PubMed

    Kawchuk, Gregory N; Hartvigsen, Jan; Edgecombe, Tiffany; Prasad, Narasimha; van Dieen, Jaap H

    2016-03-11

    Structural health monitoring (SHM) is an engineering technique used to identify mechanical abnormalities not readily apparent through other means. Recently, SHM has been adapted for use in biological systems, but its invasive nature limits its clinical application. As such, the purpose of this project was to determine if a non-invasive form of SHM could identify structural alterations in the spines of living human subjects. The lumbar spines of 10 twin pairs were visualized by magnetic resonance imaging and then assessed by a blinded radiologist to determine whether twin pairs were structurally concordant or discordant. Vibration was then applied to each subject's spine and the resulting response was recorded from sensors overlying the lumbar spinous processes. The peak frequency, the area under the curve and the root mean square were computed from the frequency response function of each sensor. Statistical analysis demonstrated that in twins whose structural appearance was discordant, peak frequency differed significantly between twin pairs, while in concordant twins no outcomes differed significantly. From these results, we conclude that structural changes within the spine can alter its vibration response. As such, further investigation of SHM to identify spinal abnormalities in larger human populations is warranted.
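
    The three outcomes can be computed from each sensor's frequency response function in a few lines. A sketch using a synthetic single-mode magnitude spectrum rather than real vibration data:

    ```python
    import numpy as np

    freqs = np.linspace(0.0, 100.0, 512)    # Hz
    # Synthetic single-mode frequency response magnitude (resonance near 30 Hz).
    frf = 1.0 / np.sqrt((1 - (freqs / 30.0)**2)**2 + (0.1 * freqs / 30.0)**2)

    peak_frequency = freqs[np.argmax(frf)]      # frequency of the largest response
    area_under_curve = np.trapz(frf, freqs)     # integral of the FRF magnitude
    rms = np.sqrt(np.mean(frf**2))              # root mean square of the curve
    print(f"peak = {peak_frequency:.1f} Hz, AUC = {area_under_curve:.1f}, RMS = {rms:.2f}")
    ```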

  4. Active Cyber Defense: Enhancing National Cyber Defense

    DTIC Science & Technology

    2011-12-01

    ...the Information Warfare Monitor (IWM) discovered that GhostNet had infected 1,295 computers in 103 countries. As many as thirty percent of these... By monitoring the computers in Dharamsala and at various Tibetan missions, IWM was able to determine the IP addresses of the servers hosting Gh0st

  5. 15. NBS TOP SIDE CONTROL ROOM. THE SUIT SYSTEMS CONSOLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. NBS TOP SIDE CONTROL ROOM. THE SUIT SYSTEMS CONSOLE IS USED TO CONTROL AIR FLOW AND WATER FLOW TO THE UNDERWATER SPACE SUIT DURING THE TEST. THE SUIT SYSTEMS ENGINEER MONITORS AIR FLOW ON THE PANEL TO THE LEFT, AND SUIT DATA ON THE COMPUTER MONITOR JUST SLIGHTLY TO HIS LEFT. WATER FLOW IS MONITORED ON THE PANEL JUST SLIGHTLY TO HIS RIGHT AND TEST VIDEO TO HIS FAR RIGHT. THE DECK CHIEF MONITORS THE DIVER'S DIVE TIMES ON THE COMPUTER IN THE UPPER RIGHT. THE DECK CHIEF LOGS THEM IN AS THEY ENTER THE WATER, AND LOGS THEM OUT AS THEY EXIT THE WATER. THE COMPUTER CALCULATES TOTAL DIVE TIME. - Marshall Space Flight Center, Neutral Buoyancy Simulator Facility, Rideout Road, Huntsville, Madison County, AL

  6. Dance-the-Music: an educational platform for the modeling, recognition and audiovisual monitoring of dance steps using spatiotemporal motion templates

    NASA Astrophysics Data System (ADS)

    Maes, Pieter-Jan; Amelynck, Denis; Leman, Marc

    2012-12-01

    In this article, a computational platform is presented, entitled "Dance-the-Music", that can be used in a dance educational context to explore and learn the basics of dance steps. By introducing a method based on spatiotemporal motion templates, the platform facilitates the training of basic step models from sequentially repeated dance figures performed by a dance teacher. Movements are captured with an optical motion capture system. The teacher's models can be visualized from a first-person perspective to instruct students how to perform the specific dance steps in the correct manner. Moreover, recognition algorithms based on a template matching method can determine the quality of a student's performance in real time by means of multimodal monitoring techniques. The results of an evaluation study suggest that Dance-the-Music is effective in helping dance students master the basics of dance figures.

  7. Defect Detection in Arc-Welding Processes by Means of the Line-to-Continuum Method and Feature Selection.

    PubMed

    Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M

    2009-01-01

    Plasma optical spectroscopy is widely employed in on-line welding diagnostics. The determination of the plasma electron temperature, which is typically selected as the output monitoring parameter, implies the identification of the atomic emission lines. As a consequence, additional processing stages are required, with a direct impact on the real-time performance of the technique. The line-to-continuum method is a feasible alternative spectroscopic approach and is particularly interesting in terms of its computational efficiency. However, the monitoring signal highly depends on the chosen emission line. In this paper, a feature selection methodology is proposed to resolve the uncertainty in selecting the optimum spectral band, which allows the line-to-continuum method to be employed for on-line welding diagnostics. Field tests have been conducted to demonstrate the feasibility of the solution.

  8. Dynamic design of ecological monitoring networks for non-Gaussian spatio-temporal data

    USGS Publications Warehouse

    Wikle, C.K.; Royle, J. Andrew

    2005-01-01

    Many ecological processes exhibit spatial structure that changes over time in a coherent, dynamical fashion. This dynamical component is often ignored in the design of spatial monitoring networks. Furthermore, ecological variables related to processes such as habitat are often non-Gaussian (e.g. Poisson or log-normal). We demonstrate that a simulation-based design approach can be used in settings where the data distribution is from a spatio-temporal exponential family. The key random component in the conditional mean function from this distribution is then a spatio-temporal dynamic process. Given the computational burden of estimating the expected utility of various designs in this setting, we utilize an extended Kalman filter approximation to facilitate implementation. The approach is motivated by, and demonstrated on, the problem of selecting sampling locations to estimate July brood counts in the prairie pothole region of the U.S.

  9. Computer program analyzes and monitors electrical power systems (POSIMO)

    NASA Technical Reports Server (NTRS)

    Jaeger, K.

    1972-01-01

    Requirements to monitor and/or simulate electric power distribution, power balance, and charge budget are discussed. A computer program to analyze the power system and generate a set of characteristic power system data is described. The application of the program to status indicators denoting different exclusive conditions is presented.

  10. A daily living activity remote monitoring system for solitary elderly people.

    PubMed

    Maki, Hiromichi; Ogawa, Hidekuni; Matsuoka, Shingo; Yonezawa, Yoshiharu; Caldwell, W Morton

    2011-01-01

    A daily living activity remote monitoring system has been developed for supporting solitary elderly people. The monitoring system consists of a tri-axis accelerometer, six low-power active filters, a low-power 8-bit microcontroller (MC), a 1 GB SD memory card (SDMC) and a 2.4 GHz low transmitting power mobile phone (PHS). The tri-axis accelerometer attached to the subject's chest can simultaneously measure dynamic and static acceleration forces produced by heart sound, respiration, posture and behavior. The heart rate, respiration rate, activity, posture and behavior are detected from the dynamic and static acceleration forces. These data are stored on the SDMC, and the MC sends them to the server computer every hour. The server computer stores the data and produces a graphic chart from them. When the caregiver calls the server computer from his/her mobile phone, the server computer sends the graphical chart via the PHS, and the caregiver's mobile phone displays the chart graphically.

  11. Apparatus and method for monitoring the intensities of charged particle beams

    DOEpatents

    Varma, Matesh N.; Baum, John W.

    1982-11-02

    Charged particle beam monitoring means (40) are disposed in the path of a charged particle beam (44) in an experimental device (10). The monitoring means comprise a beam monitoring component (42) which is operable to prevent passage of a portion of beam (44), while concomitantly permitting passage of another portion thereof (46) for incidence in an experimental chamber (18), and providing a signal (I.sub.m) indicative of the intensity of the beam portion which is not passed. Calibration means (36) are disposed in the experimental chamber in the path of the said another beam portion and are operable to provide a signal (I.sub.f) indicative of the intensity thereof. Means (41 and 43) are provided to determine the ratio (R) between said signals whereby, after suitable calibration, the calibration means may be removed from the experimental chamber and the intensity of the said another beam portion determined by monitoring of the monitoring means signal, per se.

  12. Notebook computer use with different monitor tilt angle: effects on posture, muscle activity and discomfort of neck pain users.

    PubMed

    Chiou, Wen-Ko; Chou, Wei-Ying; Chen, Bi-Hui

    2012-01-01

    This study aimed to evaluate the posture, muscle activities, and self-reported discomfort of notebook computer users with neck pain under three monitor tilt conditions: 100°, 115°, and 130°. Six subjects were recruited to complete typing tasks. Results showed a trend toward forward head posture when the monitor was set at 100°, and significantly less neck and shoulder discomfort when the monitor was set at 130°. These results suggest that notebook users with neck pain set their monitor tilt angle at 130°.

  13. Improving the management of warfarin in aged-care facilities utilising innovative technology: a proof-of-concept study.

    PubMed

    Bereznicki, Luke R E; Jackson, Shane L; Kromdijk, Wiete; Gee, Peter; Fitzmaurice, Kimbra; Bereznicki, Bonnie J; Peterson, Gregory M

    2014-02-01

    In aged-care facilities (ACFs), monitoring of warfarin can be logistically challenging and control of the International Normalised Ratio (INR) is often suboptimal. We aimed to determine whether an integrated information and communications technology system and the use of point-of-care (POC) monitors by nursing staff could improve the INR control of aged-care facility residents who take warfarin. Nursing staff identified residents who were prescribed warfarin in participating ACFs. A computer program (MedePOC) was developed to store and transmit INR results from the ACFs to general practitioners (GPs) for dosage adjustment. Nursing staff received training in the use of the CoaguChek XS point-of-care INR monitor and the MedePOC software. Following a run-in phase, eligible patients were monitored weekly for up to 12 weeks. The primary outcome was the change in the time in therapeutic range (TTR) in the intervention phase compared to the TTR in the 12 months preceding the study. All GPs, nursing staff and patients were surveyed for their experiences and opinions of the project. Twenty-four patients and 19 GPs completed the trial across six ACFs. The mean TTR for all patients improved non-significantly from 58.9% to 60.6% (P=0.79) and the proportion of INR tests in range improved non-significantly from 57.1% to 64.1% (P=0.21). The mean TTR improved in 14 patients (58%), and in these patients the mean absolute improvement in TTR was 23.1%. A post hoc analysis of the INR data using modified therapeutic INR ranges to reflect the dosage adjustment practices of GPs suggested that the intervention did lead to improved INR control. The MedePOC program and POC monitoring were well received by nursing staff. Weekly POC INR monitoring conducted in ACFs and electronic communication of the results and warfarin doses resulted in non-significant improvements in INR control in a small cohort of elderly residents. Further research involving modification to the communication strategy and a longer follow-up period is warranted to investigate whether this strategy can improve INR control and clinical outcomes in this vulnerable population. © 2013 The Authors. IJPP © 2013 Royal Pharmaceutical Society.

  14. Rainfall estimation for real time flood monitoring using geostationary meteorological satellite data

    NASA Astrophysics Data System (ADS)

    Veerakachen, Watcharee; Raksapatcharawong, Mongkol

    2015-09-01

    Rainfall estimation by geostationary meteorological satellite data provides good spatial and temporal resolutions. This is advantageous for real time flood monitoring and warning systems. However, a rainfall estimation algorithm developed in one region needs to be adjusted for another climatic region. This work proposes computationally efficient rainfall estimation algorithms based on an Infrared Threshold Rainfall (ITR) method calibrated with regional ground truth. Hourly rain gauge data collected from 70 stations around the Chao-Phraya river basin were used for calibration and validation of the algorithms. The algorithm inputs were derived from FY-2E satellite observations consisting of infrared and water vapor imagery. The results were compared with the Global Satellite Mapping of Precipitation (GSMaP) near real time product (GSMaP_NRT) using the probability of detection (POD), root mean square error (RMSE) and linear correlation coefficient (CC) as performance indices. Comparison with the GSMaP_NRT product for real time monitoring purposes shows that hourly rain estimates from the proposed algorithm with the error adjustment technique (ITR_EA) offer higher POD and approximately the same RMSE and CC, with less data latency.
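
    The three performance indices are standard and simple to reproduce. A short sketch with an assumed rain/no-rain threshold and made-up gauge and satellite values:

    ```python
    import numpy as np

    def verify(estimate, observed, rain_threshold=0.1):
        """Probability of detection, RMSE and linear correlation vs gauges."""
        estimate = np.asarray(estimate, dtype=float)
        observed = np.asarray(observed, dtype=float)
        hits = np.sum((estimate >= rain_threshold) & (observed >= rain_threshold))
        misses = np.sum((estimate < rain_threshold) & (observed >= rain_threshold))
        pod = hits / (hits + misses)
        rmse = np.sqrt(np.mean((estimate - observed)**2))
        cc = np.corrcoef(estimate, observed)[0, 1]
        return pod, rmse, cc

    gauge = [0.0, 1.2, 3.4, 0.5, 0.0, 2.2]       # hourly gauge rainfall, mm
    satellite = [0.1, 0.9, 2.8, 0.0, 0.2, 2.5]   # corresponding estimates, mm
    pod, rmse, cc = verify(satellite, gauge)
    print(f"POD = {pod:.2f}, RMSE = {rmse:.2f} mm, CC = {cc:.2f}")
    ```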

  15. Adaptive responses of the cardiovascular system to prolonged spaceflight conditions: assessment with Holter monitoring

    NASA Technical Reports Server (NTRS)

    Baevsky, R. M.; Bennett, B. S.; Bungo, M. W.; Charles, J. B.; Goldberger, A. L.; Nikulina, G. A.

    1997-01-01

    This article presents selected findings obtained with Holter monitoring from two crew members of the expedition, performed during a 175-day space mission on board the orbital space station "MIR." Using mathematical processing of daily cardiointerval files, 5-minute sections of records were analyzed consecutively. Then the average daily values of indices, the average values per eight-hour period (morning, evening, night) and the mean values per hour were computed. The results of the analysis showed that prolonged exposure of man to microgravity conditions leads to important functional alterations in human neuroautonomic regulatory mechanisms. Both crew members had a significant increase of heart rate, a rise of the stress index, and a decrease in the power of the spectrum in the range of respiratory sinus arrhythmia. These marked signs of activation of the sympathetic section of the vegetative nervous system showed individual variations. The analysis of the daily collection of cardiointervals with Holter monitoring allows us to understand and forecast the functional capabilities of the human organism under a variety of stress conditions associated with acute and chronic microgravity exposure.

  16. A pilot study of human response to general aviation aircraft noise

    NASA Technical Reports Server (NTRS)

    Stearns, J.; Brown, R.; Neiswander, P.

    1983-01-01

    A pilot study, conducted to evaluate procedures for measuring the noise impact and community response to general aviation aircraft around Torrance Municipal Airport, a typical large GA airport, employed Torrance Airport's computer-based aircraft noise monitoring system, which includes nine permanent monitor stations surrounding the airport. Some 18 residences near these monitor stations were equipped with digital noise level recorders to measure indoor noise levels. Residents were instructed to fill out annoyance diaries for periods of 5-6 days, logging the time of each annoying aircraft overflight noise event and judging its degree of annoyance on a seven-point scale. Among the noise metrics studied, the differential between the outdoor maximum A-weighted noise level of the aircraft and the outdoor background level showed the best correlation with annoyance; this correlation was clearly seen only at high noise levels, and was only slightly better than that using outdoor aircraft noise level alone. The results indicate that, on a national basis, a telephone survey coupled with outdoor noise measurements would provide an efficient and practical means of assessing the noise impact of general aviation aircraft.

  17. Evaluation of low-cost computer monitors for the detection of cervical spine injuries in the emergency room: an observer confidence-based study.

    PubMed

    Brem, M H; Böhner, C; Brenning, A; Gelse, K; Radkow, T; Blanke, M; Schlechtweg, P M; Neumann, G; Wu, I Y; Bautz, W; Hennig, F F; Richter, H

    2006-11-01

    To compare the diagnostic value of low-cost computer monitors and a Picture Archiving and Communication System (PACS) workstation for the evaluation of cervical spine fractures in the emergency room. Two groups of readers blinded to the diagnoses (2 radiologists and 3 orthopaedic surgeons) independently assessed digital radiographs of the cervical spine (anterior-posterior, oblique and trans-oral-dens views). The radiographs of 57 patients who arrived consecutively to the emergency room in 2004 with clinical suspicion of a cervical spine injury were evaluated. The diagnostic values of these radiographs were scored on a 3-point scale (1 = diagnosis not possible/bad image quality, 2 = diagnosis uncertain, 3 = clear diagnosis of fracture or no fracture) on a PACS workstation and on two different liquid crystal display (LCD) personal computer monitors. The images were randomised to avoid memory effects. We used logistic mixed-effects models to determine the possible effects of monitor type on the evaluation of x-ray images. To determine the overall effects of monitor type, this variable was used as a fixed effect, and the image number and reader group (radiologist or orthopaedic surgeon) were used as random effects on display quality. Group-specific effects were examined, with the reader group and additional fixed effects as terms. A significance level of 0.05 was established for assessing the contribution of each fixed effect to the model. Overall, the diagnostic score did not differ significantly between standard personal computer monitors and the PACS workstation (both p values were 0.78). Low-cost LCD personal computer monitors may be useful in establishing a diagnosis of cervical spine fractures in the emergency room.

  18. System for autonomous monitoring of bioagents

    DOEpatents

    Langlois, Richard G.; Milanovich, Fred P.; Colston, Jr, Billy W.; Brown, Steve B.; Masquelier, Don A.; Mariella, Jr., Raymond P.; Venkateswaran, Kodomudi

    2015-06-09

    An autonomous monitoring system for monitoring for bioagents. A collector gathers the air, water, soil, or substance being monitored. A sample preparation means for preparing a sample is operatively connected to the collector. A detector for detecting the bioagents in the sample is operatively connected to the sample preparation means. One embodiment of the present invention includes confirmation means for confirming the bioagents in the sample.

  19. Dual compile strategy for parallel heterogeneous execution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Tyler Barratt; Perry, James Thomas

    2012-06-01

    The purpose of the Dual Compile Strategy is to increase our trust in the Compute Engine during its execution of instructions. This is accomplished by introducing a heterogeneous Monitor Engine that checks the execution of the Compute Engine. This leads to the production of a second and custom set of instructions designed for monitoring the execution of the Compute Engine at runtime. This use of multiple engines differs from redundancy in that one engine is working on the application while the other engine is monitoring and checking in parallel instead of both applications (and engines) performing the same work at the same time.

  20. Factors leading to the computer vision syndrome: an issue at the contemporary workplace.

    PubMed

    Izquierdo, Juan C; García, Maribel; Buxó, Carmen; Izquierdo, Natalio J

    2007-01-01

    Vision and eye related problems are common among computer users, and have been collectively called the Computer Vision Syndrome (CVS). An observational study was done to identify the risk factors leading to the CVS. Twenty-eight participants answered a validated questionnaire and had their workstations examined. The questionnaire evaluated personal, environmental and ergonomic factors, and the physiologic response of computer users. The distance from the eye to the computer's monitor (A), the computer monitor's height (B), and the visual axis height (C) were measured. The difference between B and C was calculated and labeled D. Angles of gaze to the computer monitor were calculated using the formula: angle = arctan(D/A). Angles were divided into two groups: participants with angles of gaze ranging from 0 degrees to 13.9 degrees were included in Group 1; and participants gazing at angles larger than 14 degrees were included in Group 2. Statistical analysis of the evaluated variables was made. Computer users in both groups used more tear supplements (as part of the syndrome) than expected. This association was statistically significant (p < 0.10). Participants in Group 1 reported more pain than participants in Group 2. Associations between the CVS and other personal or ergonomic variables were not statistically significant. Our findings show that the most important factor leading to the syndrome is the angle of gaze at the computer monitor. Pain in computer users is diminished when gazing downwards at angles of 14 degrees or more. The CVS remains an underestimated and poorly understood issue at the workplace. The general public, health professionals, the government, and private industries need to be educated about the CVS.
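
    The gaze-angle formula is a one-liner; a sketch with hypothetical measurements, including the 14-degree grouping used in the analysis:

    ```python
    import math

    def gaze_angle_deg(D, A):
        """Angle of gaze at the monitor: angle = arctan(D / A), in degrees."""
        return math.degrees(math.atan(D / A))

    # e.g., visual axis 20 cm above the monitor centre at 60 cm viewing distance
    angle = gaze_angle_deg(D=20.0, A=60.0)
    group = 1 if angle < 14.0 else 2             # grouping used in the study
    print(f"angle = {angle:.1f} degrees -> Group {group}")
    ```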

  1. Factors leading to the Computer Vision Syndrome: an issue at the contemporary workplace.

    PubMed

    Izquierdo, Juan C; García, Maribel; Buxó, Carmen; Izquierdo, Natalio J

    2004-01-01

    Vision and eye related problems are common among computer users, and have been collectively called the Computer Vision Syndrome (CVS). An observational study was done to identify the risk factors leading to the CVS. Twenty-eight participants answered a validated questionnaire and had their workstations examined. The questionnaire evaluated personal, environmental and ergonomic factors, and the physiologic response of computer users. The distance from the eye to the computer's monitor (A), the computer monitor's height (B), and the visual axis height (C) were measured. The difference between B and C was calculated and labeled D. Angles of gaze to the computer monitor were calculated using the formula: angle = arctan(D/A). Angles were divided into two groups: participants with angles of gaze ranging from 0 degrees to 13.9 degrees were included in Group 1; and participants gazing at angles larger than 14 degrees were included in Group 2. Statistical analysis of the evaluated variables was made. Computer users in both groups used more tear supplements (as part of the syndrome) than expected. This association was statistically significant (p<0.10). Participants in Group 1 reported more pain than participants in Group 2. Associations between the CVS and other personal or ergonomic variables were not statistically significant. Our findings show that the most important factor leading to the syndrome is the angle of gaze at the computer monitor. Pain in computer users is diminished when gazing downwards at angles of 14 degrees or more. The CVS remains an underestimated and poorly understood issue at the workplace. The general public, health professionals, the government, and private industries need to be educated about the CVS.

  2. Grid site availability evaluation and monitoring at CMS

    DOE PAGES

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  3. Grid site availability evaluation and monitoring at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  4. Grid site availability evaluation and monitoring at CMS

    NASA Astrophysics Data System (ADS)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  5. View southeast of computer controlled energy monitoring system. System replaced ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View southeast of computer controlled energy monitoring system. System replaced strip chart recorders and other instruments under the direct observation of the load dispatcher. - Thirtieth Street Station, Load Dispatch Center, Thirtieth & Market Streets, Railroad Station, Amtrak (formerly Pennsylvania Railroad Station), Philadelphia, Philadelphia County, PA

  6. Brain-Congruent Instruction: Does the Computer Make It Feasible?

    ERIC Educational Resources Information Center

    Stewart, William J.

    1984-01-01

    Based on the premise that computers could translate brain research findings into classroom practice, this article presents discoveries concerning human brain development, organization, and operation, and describes brain activity monitoring devices, brain function and structure variables, and a procedure for monitoring and analyzing brain activity…

  7. The Macintosh Lab Monitor, Numbers 1-4.

    ERIC Educational Resources Information Center

    Wanderman, Richard; And Others

    1987-01-01

    Four issues of the "Macintosh Lab Monitor" document the Computer-Aided Writing Project at the Forman School (Connecticut) which is a college preparatory school for bright dyslexic adolescents. The project uses Macintosh computers to teach outlining, writing, organizational and thinking skills. Sample articles have the following titles:…

  8. Computational modeling of radiofrequency ablation: evaluation on ex vivo data using ultrasound monitoring

    NASA Astrophysics Data System (ADS)

    Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.

    2017-03-01

    Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the prediction of the RFA difficult. A monitoring tool that can be personalized for a given patient during the intervention would help achieve complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in the ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded using thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations were performed on two cadaveric bovine livers; we achieve an error of 2.2 °C on average between the computed temperatures and the thermistor readings, and errors of 1.4 °C and 2.7 °C on average between the computed temperatures and those monitored by US at two different time points during the ablation (t = 240 s and t = 900 s).

  9. The effect of monitor raster latency on VEPs, ERPs and Brain-Computer Interface performance.

    PubMed

    Nagel, Sebastian; Dreher, Werner; Rosenstiel, Wolfgang; Spüler, Martin

    2018-02-01

    Visual neuroscience experiments and Brain-Computer Interface (BCI) control often require strict timings on a millisecond scale. As most experiments are performed using a personal computer (PC), the latencies that are introduced by the setup should be taken into account and corrected. Because a standard computer monitor uses rastering to update each line of the image sequentially, it introduces a raster latency that depends on the position on the monitor and on the refresh rate. We technically measured the raster latencies of different monitors and present the effects on visual evoked potentials (VEPs) and event-related potentials (ERPs). Additionally, we present a method for correcting the monitor raster latency and analyzed the performance difference of a code-modulated VEP BCI speller with and without latency correction. There are currently no other methods validating the effects of monitor raster latency on VEPs and ERPs. The timings of VEPs and ERPs are directly affected by the raster latency. Furthermore, correcting the raster latency resulted in a significant reduction of the target prediction error from 7.98% to 4.61%, and also in a more reliable classification of targets by significantly increasing the distance between the most probable and the second most probable target by 18.23%. The monitor raster latency affects the timings of VEPs and ERPs, and correcting it resulted in a significant error reduction of 42.23%. It is recommended to correct the raster latency for increased BCI performance and methodological correctness. Copyright © 2017 Elsevier B.V. All rights reserved.
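
    The position dependence follows from the scan-out itself: with a top-to-bottom raster at refresh rate f, a stimulus drawn a fraction r of the way down the screen appears roughly r/f seconds after the frame starts. A hedged sketch that ignores blanking intervals and panel response time:

    ```python
    def raster_latency_ms(row, total_rows, refresh_hz):
        """Approximate scan-out delay for a stimulus at the given row, in ms."""
        return (row / total_rows) * 1000.0 / refresh_hz

    # Top, middle and bottom of a 1080-line panel at 60 Hz.
    for row in (0, 540, 1079):
        print(f"row {row}: ~{raster_latency_ms(row, 1080, 60):.1f} ms")
    ```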

  10. Wireless boundary monitor system and method

    DOEpatents

    Haynes, H.D.; Ayers, C.W.

    1997-12-09

    A wireless boundary monitor system used to monitor the integrity of a boundary surrounding an area uses at least two housings having at least one transmitting means for emitting ultrasonic pressure waves to a medium. Each of the housings has a plurality of receiving means for sensing the pressure waves in the medium. The transmitting means and the receiving means of each housing are aimable and communicably linked. At least one of the housings is equipped with a local alarm means for emitting a first alarm indication whereby, when the pressure waves propagating from a transmitting means to a receiving means are sufficiently blocked by an object a local alarm means or a remote alarm means or a combination thereof emit respective alarm indications. The system may be reset either manually or automatically. This wireless boundary monitor system has useful applications in both indoor and outdoor environments. 4 figs.

  11. Wireless boundary monitor system and method

    DOEpatents

    Haynes, Howard D.; Ayers, Curtis W.

    1997-01-01

    A wireless boundary monitor system used to monitor the integrity of a boundary surrounding an area uses at least two housings having at least one transmitting means for emitting ultrasonic pressure waves to a medium. Each of the housings has a plurality of receiving means for sensing the pressure waves in the medium. The transmitting means and the receiving means of each housing are aimable and communicably linked. At least one of the housings is equipped with a local alarm means for emitting a first alarm indication whereby, when the pressure waves propagating from a transmitting means to a receiving means are sufficiently blocked by an object, a local alarm means or a remote alarm means, or a combination thereof, emits respective alarm indications. The system may be reset either manually or automatically. This wireless boundary monitor system has useful applications in both indoor and outdoor environments.

  12. Computer-generated graphical presentations: use of multimedia to enhance communication.

    PubMed

    Marks, L S; Penson, D F; Maller, J J; Nielsen, R T; deKernion, J B

    1997-01-01

    Personal computers may be used to create, store, and deliver graphical presentations. With computer-generated combinations of the five media (text, images, sound, video, and animation)--that is, multimedia presentations--the effectiveness of message delivery can be greatly increased. The basic tools are (1) a personal computer; (2) presentation software; and (3) a projector to enlarge the monitor images for audience viewing. Use of this new method has grown rapidly in the business-conference world, but has yet to gain widespread acceptance at medical meetings. We review herein the rationale for multimedia presentations in medicine (vis-à-vis traditional slide shows) as an improved means for increasing audience attention, comprehension, and retention. The evolution of multimedia is traced from earliest times to the present. The steps involved in making a multimedia presentation are summarized, emphasizing advances in technology that bring the new method within practical reach of busy physicians. Specific attention is given to software, digital image processing, storage devices, and delivery methods. Our development of a urology multimedia presentation--delivered May 4, 1996, before the Society for Urology and Engineering and now Internet-accessible at http://www.usrf.org--was the impetus for this work.

  13. Visual based laser speckle pattern recognition method for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Park, Kyeongtaek; Torbol, Marco

    2017-04-01

    This study performed the system identification of a target structure by analyzing the laser speckle pattern recorded by a camera. The laser speckle pattern is generated by the diffuse reflection of the laser beam on a rough surface of the target structure. The camera, equipped with a red filter, records the scattered speckle particles of the laser light in real time, and the raw speckle image of pixel data is fed to the graphics processing unit (GPU) in the system. The algorithm for laser speckle contrast analysis (LASCA) computes the laser speckle contrast images and the laser speckle flow images. The k-means clustering algorithm is used to classify the pixels in each frame, and the clusters' centroids, which function as virtual sensors, track the displacement between different frames in the time domain. The fast Fourier transform (FFT) and the frequency domain decomposition (FDD) compute the modal properties of the structure: natural frequencies and damping ratios. This study takes advantage of the large-scale computational capability of the GPU. The algorithm is written in the Compute Unified Device Architecture (CUDA C), which allows the processing of speckle images in real time.
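
    The LASCA step reduces, per pixel, to the ratio of local standard deviation to local mean of the raw speckle image. A minimal NumPy/SciPy sketch of that contrast map follows; the window size and the synthetic input are illustrative, and the paper's real-time CUDA implementation is not reproduced:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_contrast(img: np.ndarray, win: int = 7) -> np.ndarray:
            """Local speckle contrast K = sigma / mean over a win x win window."""
            img = img.astype(np.float64)
            mean = uniform_filter(img, win)
            mean_sq = uniform_filter(img * img, win)
            var = np.clip(mean_sq - mean * mean, 0.0, None)   # guard against round-off
            return np.sqrt(var) / np.maximum(mean, 1e-12)

        rng = np.random.default_rng(0)
        frame = rng.exponential(scale=100.0, size=(256, 256))  # fully developed static speckle
        K = speckle_contrast(frame)
        print(K.mean())   # ~1 for static speckle; flow or vibration blurs it toward 0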

  14. Aircraft Engine-Monitoring System And Display

    NASA Technical Reports Server (NTRS)

    Abbott, Terence S.; Person, Lee H., Jr.

    1992-01-01

    Proposed Engine Health Monitoring System and Display (EHMSD) provides enhanced means for pilot to control and monitor performances of engines. Processes raw sensor data into information meaningful to pilot. Provides graphical information about performance capabilities, current performance, and operational conditions in components or subsystems of engines. Provides means to control engine thrust directly and innovative means to monitor performance of engine system rapidly and reliably. Features reduce pilot workload and increase operational safety.

  15. Efficiency Evaluation of Handling of Geologic-Geophysical Information by Means of Computer Systems

    NASA Astrophysics Data System (ADS)

    Nuriyahmetova, S. M.; Demyanova, O. V.; Zabirova, L. M.; Gataullin, I. I.; Fathutdinova, O. A.; Kaptelinina, E. A.

    2018-05-01

    Development of oil and gas resources under difficult geological, geographical, and economic conditions requires considerable financial outlay; careful justification of planned activities and the application of the most promising directions and modern technologies are therefore necessary from the standpoint of cost efficiency. Ensuring high precision in regional and local forecasts and in the modeling of hydrocarbon reservoirs requires analyzing huge arrays of spatially distributed information that changes constantly. Solving this task requires modern remote methods for investigating prospective oil-and-gas territories, the combined use of remote, environmentally nondestructive geologic-geophysical data and space-based Earth-sounding methods, and the most advanced technologies for their handling. In this article, the authors review the experience of Russian and foreign companies in handling geologic-geophysical information by means of computer systems. They conclude that multidimensional analysis of the geologic-geophysical information space and effective planning and monitoring of exploration work require broad use of geoinformation technologies, one of the most promising directions for achieving high profitability in the oil and gas industry.

  16. Image authentication by means of fragile CGH watermarking

    NASA Astrophysics Data System (ADS)

    Schirripa Spagnolo, Giuseppe; Simonetti, Carla; Cozzella, Lorenzo

    2005-09-01

    In this paper we propose a fragile marking system based on Computer Generated Hologram coding techniques, which is able to detect malicious tampering while tolerating some incidental distortions. A fragile watermark is a mark that is readily altered or destroyed when the host image is modified through a linear or nonlinear transformation. A fragile watermark monitors the integrity of the content of the image, but not its numerical representation. The watermark is therefore designed so that integrity is proven if the content of the image has not been tampered with. Since digital images can be altered or manipulated with ease, the ability to detect changes to digital images is very important for many applications such as news reporting, medical archiving, or legal usages. The proposed technique can be applied to color images as well as to grayscale ones. Using Computer Generated Hologram watermarking, the embedded mark can easily be recovered by means of a Fourier transform. Because of this, a tampered host image could be re-watermarked with the same holographic pattern. To avoid this possibility we have introduced an encryption method using asymmetric cryptography. The proposed schema is based on the knowledge of the original mark from the Authentication

  17. Determination of leakage areas in nuclear piping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keim, E.

    1997-04-01

    For the design and operation of nuclear power plants the Leak-Before-Break (LBB) behavior of a piping component has to be shown. This means that the length of a crack resulting in a leak is smaller than the critical crack length and that the leak is safely detectable by a suitable monitoring system. The LBB concept of Siemens/KWU is based on computer codes, developed by Siemens/KWU, for the evaluation of critical crack lengths, crack openings, leakage areas, and leakage rates. Experience with the leak rate program is described elsewhere, while this paper deals with the computation of crack openings and leakage areas of longitudinal and circumferential cracks by means of fracture mechanics. The leakage areas are determined by the integration of the crack openings along the crack front, considering plasticity and geometrical effects. They are evaluated with respect to minimum values for the design of leak detection systems, and maximum values for controlling jet and reaction forces. By means of fracture mechanics, LBB for subcritical cracks has to be shown, and the calculation of leakage areas is the basis for quantitatively determining the discharge rate of leaking subcritical through-wall cracks. The analytical approach and its validation will be presented for two examples of complex structures. The first is a pipe branch containing a circumferential crack and the second is a pipe bend with a longitudinal crack.

  18. Automated Instructional Monitors for Complex Operational Tasks. Final Report.

    ERIC Educational Resources Information Center

    Feurzeig, Wallace

    A computer-based instructional system is described which incorporates diagnosis of students' difficulties in acquiring complex concepts and skills. A computer automatically generated a simulated display. It then monitored and analyzed a student's work in the performance of assigned training tasks. Two major tasks were studied. The first,…

  19. JAVA CLASSES FOR NONPROCEDURAL VARIOGRAM MONITORING. JOURNAL OF COMPUTERS AND GEOSCIENCE

    EPA Science Inventory

    NRMRL-ADA-00229 Faulkner*, B.P. Java Classes for Nonprocedural Variogram Monitoring. Journal of Computers and Geosciences ( Elsevier Science, Ltd.) 28:387-397 (2002). EPA/600/J-02/235. A set of Java classes was written for variogram modeling to support research for US EPA's Reg...

  20. [Continuous glucose monitoring using the Glucoday system in children and adolescents with type 1 diabetes].

    PubMed

    López Gutiérrez, Sònia; Pavía Sesma, Carlos

    2010-10-01

    At present, continuous glucose monitoring is a recommended method for detecting glycemia fluctuations in patients with diabetes, owing to the close correlation between interstitial glucose and capillary glycemia. The objective of this project was to obtain a glucose register over a 48-hour period in a group of type 1 diabetes patients with irregular metabolic control, to evaluate the effectiveness of the treatment employed, and to assess compliance with the therapeutic norms established for each patient. The authors also planned to test the usefulness of a graphical register for diabetes education. After applying an anesthetic cream to the lateral umbilical zone, a subcutaneous catheter connected to a GlucoDay (Menarini Diagnostics) glucose sensor is inserted; by continuously sampling the interstitial fluid, the sensor registers a value every three minutes for as long as the connection is maintained. After a maximum of 48 hours, the data are transferred to a computer and, by means of the corresponding software, a graphic register is produced for each patient. Informed consent was obtained from each patient. The study group comprised 23 patients diagnosed with type 1 diabetes; eight (two girls and six boys) were prepubertal, while 15 (six girls and nine boys) were adolescents. Two of the prepubertal patients had pathological antecedents, celiac disease in one case and thyroid disease in the other. Two of the pubertal patients had a history of chronic lymphocytic thyroiditis under hormone-replacement treatment. Individual analysis of each case allowed health professionals to detect several facts: glycemic objectives are difficult to meet in this group of adolescents with diabetes under the insulin treatment in place, and postprandial hyperglycemia was detected that did not appear in the capillary glycemia measurements carried out by habitual methods. The observations also revealed a lack of compliance with the agreed schedules, and dietary transgressions were detected. Continuous glucose monitoring makes it possible to obtain a graphic that includes the incidents that occurred, facilitates discussing the detected errors with adolescent patients, and permits proposing therapeutic modifications based on concrete, real data. Continuous glucose monitoring promises to be a useful tool for educating patients about diabetes.

  1. Covariation of depressive mood and spontaneous physical activity in major depressive disorder: toward continuous monitoring of depressive mood.

    PubMed

    Kim, Jinhyuk; Nakamura, Toru; Kikuchi, Hiroe; Yoshiuchi, Kazuhiro; Sasaki, Tsukasa; Yamamoto, Yoshiharu

    2015-07-01

    The objective evaluation of depressive mood is considered to be useful for the diagnosis and treatment of depressive disorders. Thus, we investigated psychobehavioral correlates, particularly the statistical associations between momentary depressive mood and behavioral dynamics measured objectively, in patients with major depressive disorder (MDD) and healthy subjects. Patients with MDD (n = 14) and healthy subjects (n = 43) wore a watch-type computer device and rated their momentary symptoms using ecological momentary assessment. Spontaneous physical activity in daily life, referred to as locomotor activity, was also continuously measured by an activity monitor built into the device. A multilevel modeling approach was used to model the associations between changes in depressive mood scores and the local statistics of locomotor activity simultaneously measured. We further examined the cross validity of such associations across groups. The statistical model established indicated that worsening of the depressive mood was associated with the increased intermittency of locomotor activity, as characterized by a lower mean and higher skewness. The model was cross validated across groups, suggesting that the same psychobehavioral correlates are shared by both healthy subjects and patients, although the latter had significantly higher mean levels of depressive mood scores. Our findings suggest the presence of robust as well as common associations between momentary depressive mood and behavioral dynamics in healthy individuals and patients with depression, which may lead to the continuous monitoring of the pathogenic processes (from healthy states) and pathological states of MDD.
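
    The local statistics entering such a model can be computed very simply. A sketch of windowed mean and skewness of a locomotor count series (the window length and the synthetic data are illustrative, not the study's settings):

        import numpy as np
        from scipy.stats import skew

        def local_activity_stats(activity: np.ndarray, win: int):
            """Mean and skewness of locomotor counts in non-overlapping windows."""
            n = len(activity) // win
            chunks = activity[: n * win].reshape(n, win)
            return chunks.mean(axis=1), skew(chunks, axis=1)

        rng = np.random.default_rng(1)
        counts = rng.gamma(shape=2.0, scale=30.0, size=24 * 60)  # one day of 1-min counts
        means, skews = local_activity_stats(counts, win=60)      # hourly local statistics
        print(means[:3], skews[:3])   # a lower mean and higher skewness mark intermittency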

  2. Real-Time Mapping alert system; characteristics and capabilities

    USGS Publications Warehouse

    Torres, L.A.; Lambert, S.C.; Liebermann, T.D.

    1995-01-01

    The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field sampling sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. The current alert status at monitoring sites within a state or region is of critical importance during floods, hurricanes, and other extreme hydrologic events. This report describes the characteristics and capabilities of a series of computer programs for real-time mapping of hydrologic data. The software provides interactive graphics display and query of hydrologic information from the network in a real-time, map-based, menu-driven environment.

  3. Computer assisted operations in Petroleum Development Oman (PDO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Hinai, S.H.; Mutimer, K.

    1995-10-01

    Petroleum Development Oman (PDO) currently produces some 750,000 bopd and 900,000 bwpd from some 74 fields in a large geographical area and diverse operating conditions. A key corporate objective is to reduce operating costs by exploiting productivity gains from proven technology. Automation is seen as a means of managing the rapid growth of the well population and production facilities. The overall objective is to improve field management through continuous monitoring of wells and facilities and dissemination of data throughout the whole organization. A major upgrade of PDO's field Supervisory Control and Data Acquisition (SCADA) system is complete, providing a platform to exploit new initiatives, particularly for production optimization of artificial lift systems and automatic well testing using multi-selector valves, Coriolis flow meter measurements, and a multi-component (oil, gas, water) flowmeter. The paper describes PDO's experience, including the benefits and challenges which have to be managed when developing Computer Assisted Operations (CAO).

  4. Cellular Level Brain Imaging in Behaving Mammals: An Engineering Approach

    PubMed Central

    Hamel, Elizabeth J.O.; Grewe, Benjamin F.; Parker, Jones G.; Schnitzer, Mark J.

    2017-01-01

    Fluorescence imaging offers expanding capabilities for recording neural dynamics in behaving mammals, including the means to monitor hundreds of cells targeted by genetic type or connectivity, track cells over weeks, densely sample neurons within local microcircuits, study cells too inactive to isolate in extracellular electrical recordings, and visualize activity in dendrites, axons, or dendritic spines. We discuss recent progress and future directions for imaging in behaving mammals from a systems engineering perspective, which seeks holistic consideration of fluorescent indicators, optical instrumentation, and computational analyses. Today, genetically encoded indicators of neural Ca2+ dynamics are widely used, and those of trans-membrane voltage are rapidly improving. Two complementary imaging paradigms involve conventional microscopes for studying head-restrained animals and head-mounted miniature microscopes for imaging in freely behaving animals. Overall, the field has attained sufficient sophistication that increased cooperation between those designing new indicators, light sources, microscopes, and computational analyses would greatly benefit future progress. PMID:25856491

  5. Improved finite-difference computation of the van der Waals force: One-dimensional case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinto, Fabrizio

    2009-10-15

    We present an improved demonstration of the calculation of Casimir forces in one-dimensional systems based on the recently proposed numerical imaginary frequency Green's function computation approach. The dispersion force on two thick lossy dielectric slabs separated by an empty gap and placed within a perfectly conducting cavity is obtained from the Green's function of the modified Helmholtz equation by means of an ordinary finite-difference method. In order to demonstrate the possibility to develop algorithms to explore complex geometries in two and three dimensions to higher order in the mesh spacing, we generalize existing classical electromagnetism algebraic methods to generate the difference equations for dielectric boundaries not coinciding with any grid points. Diagnostic tests are presented to monitor the accuracy of our implementation of the method and follow-up applications in higher dimensions are introduced.

  6. Real-time seismic monitoring and functionality assessment of a building

    USGS Publications Warehouse

    Celebi, M.

    2005-01-01

    This paper presents recent developments and approaches (using GPS technology and real-time double-integration) to obtain displacements and, in turn, drift ratios, in real-time or near real-time to meet the needs of the engineering and user community in seismic monitoring and assessing the functionality and damage condition of structures. Drift ratios computed in near real-time allow technical assessment of the damage condition of a building. Relevant parameters, such as the type of connections and story structural characteristics (including geometry) are used in computing drifts corresponding to several pre-selected threshold stages of damage. Thus, drift ratios determined from real-time monitoring can be compared to pre-computed threshold drift ratios. The approaches described herein can be used for performance evaluation of structures and can be considered as building health-monitoring applications.

  7. 2D Sub-Pixel Disparity Measurement Using QPEC / Medicis

    NASA Astrophysics Data System (ADS)

    Cournet, M.; Giros, A.; Dumas, L.; Delvit, J. M.; Greslou, D.; Languille, F.; Blanchet, G.; May, S.; Michel, J.

    2016-06-01

    In the frame of its earth observation missions, CNES created a library called QPEC, and one of its launchers, called Medicis. QPEC / Medicis is a sub-pixel two-dimensional stereo matching algorithm that works on an image pair. This tool is a block matching algorithm, which means that it is based on a local method; moreover, it does not regularize the results found. It proposes several matching costs, such as the Zero-mean Normalised Cross-Correlation or statistical measures (the Mutual Information being one of them), and different match validation flags. QPEC / Medicis is able to compute a two-dimensional dense disparity map with sub-pixel precision. Hence, it is more versatile than the disparity estimation methods found in the computer vision literature, which often assume an epipolar geometry. CNES uses Medicis, among other applications, during the in-orbit image quality commissioning of earth observation satellites. For instance the Pléiades-HR 1A & 1B and the Sentinel-2 geometric calibrations are based on this block matching algorithm. Over the years, it has become a common tool in ground segments for in-flight monitoring purposes. For these two kinds of applications, the two-dimensional search and the local sub-pixel measure without regularization can be essential. This tool is also used to generate digital elevation models automatically, a use for which it was not initially intended. This paper deals with the QPEC / Medicis algorithm. It also presents some of its CNES applications (in-orbit commissioning, in-flight monitoring or digital elevation model generation). Medicis software is also distributed outside CNES. This paper finally describes some of these external applications using Medicis, such as ground displacement measurement, or intra-oral scanning in the dental domain.
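
    The matching cost at the core of such a block matcher is compact to state. Below is a generic sketch of the zero-mean normalized cross-correlation between two patches, with the usual parabola fit that recovers a sub-pixel offset from costs at three neighboring integer shifts; this is a textbook formulation, not CNES code:

        import numpy as np

        def zncc(a: np.ndarray, b: np.ndarray) -> float:
            """Zero-mean normalized cross-correlation of two equal-size patches."""
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return float((a * b).sum() / denom) if denom > 0.0 else 0.0

        def subpixel_peak(c_minus: float, c_0: float, c_plus: float) -> float:
            """Parabolic fit through costs at shifts -1, 0, +1; offset in (-0.5, 0.5)."""
            denom = c_minus - 2.0 * c_0 + c_plus
            return 0.0 if denom == 0.0 else 0.5 * (c_minus - c_plus) / denom

        # Demo: correlate a template against shifted windows of a smooth signal.
        rng = np.random.default_rng(2)
        x = np.linspace(0.0, 6.0 * np.pi, 200)
        signal = np.sin(x) + 0.05 * rng.normal(size=x.size)
        template = signal[50:110]
        costs = {s: zncc(template, signal[50 + s:110 + s]) for s in (-1, 0, 1)}
        print(costs[0], subpixel_peak(costs[-1], costs[0], costs[1]))  # peak near shift 0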

  8. MO-FG-202-08: Real-Time Monte Carlo-Based Treatment Dose Reconstruction and Monitoring for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z; Shi, F; Gu, X

    2016-06-15

    Purpose: This proof-of-concept study is to develop a real-time Monte Carlo (MC) based treatment-dose reconstruction and monitoring system for radiotherapy, especially for treatments with complicated delivery, to catch treatment delivery errors at the earliest possible opportunity and interrupt the treatment only when an unacceptable dosimetric deviation from expectation occurs. Methods: First, an offline scheme is launched to pre-calculate the expected dose from the treatment plan, used as ground truth for real-time monitoring later. Then an online scheme with three concurrent threads is launched during treatment delivery, to reconstruct and monitor the patient dose in a temporally resolved fashion in real time. Thread T1 acquires machine status every 20 ms to calculate and accumulate a fluence map (FM). Once our accumulation threshold is reached, T1 transfers the FM to T2 for dose reconstruction and starts to accumulate a new FM. A GPU-based MC dose calculation is performed on T2 when the MC dose engine is ready and a new FM is available. The reconstructed instantaneous dose is directed to T3 for dose accumulation and real-time visualization. Multiple dose metrics (e.g., maximum and mean dose for targets and organs) are calculated from the currently accumulated dose and compared with the pre-calculated expected values. Once the discrepancies go beyond our tolerance, an error message will be sent to interrupt the treatment delivery. Results: A VMAT head-and-neck patient case was used to test the performance of our system. Real-time machine status acquisition was simulated here. The differences between the actual dose metrics and the expected ones were 0.06%-0.36%, indicating an accurate delivery. A ∼10 Hz frequency of dose reconstruction and monitoring was achieved, with 287.94 s of online computation time compared to 287.84 s of treatment delivery time. Conclusion: Our study has demonstrated the feasibility of computing a dose distribution in a temporally resolved fashion in real time and quantitatively and dosimetrically monitoring the treatment delivery.
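
    A minimal sketch of the three-thread producer/consumer layout described above, with queues standing in for shared memory and trivial placeholders for the machine-status reader, the MC dose engine, and the dose-metric check (all names, numbers, and tolerances here are hypothetical):

        import queue
        import threading
        import numpy as np

        fm_q, dose_q = queue.Queue(), queue.Queue()
        EXPECTED_MEAN, TOLERANCE = 1.0, 0.05     # hypothetical expected metric and tolerance
        STOP = object()                          # end-of-delivery sentinel

        def t1_acquire(n_maps, n_beamlets):
            """Poll 'machine status' and accumulate fluence; hand off completed maps."""
            rng = np.random.default_rng(3)
            for _ in range(n_maps):
                fm_q.put(rng.random(n_beamlets))   # stand-in for an accumulated FM
            fm_q.put(STOP)

        def t2_dose():
            """Stand-in for the GPU MC dose engine: here just a scaling of the FM."""
            while (fm := fm_q.get()) is not STOP:
                dose_q.put(2.0 * fm)               # pretend dose = 2 x fluence
            dose_q.put(STOP)

        def t3_monitor():
            """Accumulate dose and compare a running metric with its expected value."""
            total, n = 0.0, 0
            while (dose := dose_q.get()) is not STOP:
                total, n = total + dose.mean(), n + 1
                if abs(total / n - EXPECTED_MEAN) > TOLERANCE:
                    print(f"map {n}: deviation beyond tolerance -- would interrupt delivery")

        threads = [threading.Thread(target=t1_acquire, args=(10, 1000)),
                   threading.Thread(target=t2_dose),
                   threading.Thread(target=t3_monitor)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()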

  9. The Role of Parents and Related Factors on Adolescent Computer Use

    PubMed Central

    Epstein, Jennifer A.

    2012-01-01

    Background Research has suggested the importance of parents in their adolescents' computer activity. Spending too much time on the computer for recreational purposes in particular has been found to be related to areas of public health concern in children/adolescents, including obesity and substance use. Design and Methods The goal of the research was to determine the association between recreational computer use and potentially linked factors (parental monitoring, social influences to use computers including parents, age of first computer use, self-control, and particular internet activities). Participants (aged 13-17 years and residing in the United States) were recruited via the Internet to complete an anonymous survey online using a survey tool. The target sample of 200 participants who completed the survey was achieved. The sample's average age was 16 years, and 63% were girls. Results A set of regressions was run with recreational computer use as the dependent variable. Conclusions Less parental monitoring, younger age at first computer use, listening to or downloading music from the internet more frequently, using the internet for educational purposes less frequently, and parents' use of the computer for pleasure were related to spending a greater percentage of time on non-school computer use. These findings suggest the importance of parental monitoring and parental computer use on their children's own computer use, and the influence of some internet activities on adolescent computer use. Finally, programs aimed at parents to help them increase the age at which their children start using computers and learn how to place limits on recreational computer use are needed. PMID:25170449

  10. Comparison of energy expenditure in adolescents when playing new generation and sedentary computer games: cross sectional study

    PubMed Central

    2007-01-01

    Objective To compare the energy expenditure of adolescents when playing sedentary and new generation active computer games. Design Cross sectional comparison of four computer games. Setting Research laboratories. Participants Six boys and five girls aged 13-15 years. Procedure Participants were fitted with a monitoring device validated to predict energy expenditure. They played four computer games for 15 minutes each. One of the games was sedentary (XBOX 360) and the other three were active (Wii Sports). Main outcome measure Predicted energy expenditure, compared using repeated measures analysis of variance. Results Mean (standard deviation) predicted energy expenditure when playing Wii Sports bowling (190.6 (22.2) kJ/kg/min), tennis (202.5 (31.5) kJ/kg/min), and boxing (198.1 (33.9) kJ/kg/min) was significantly greater than when playing sedentary games (125.5 (13.7) kJ/kg/min) (P<0.001). Predicted energy expenditure was at least 65.1 (95% confidence interval 47.3 to 82.9) kJ/kg/min greater when playing active rather than sedentary games. Conclusions Playing new generation active computer games uses significantly more energy than playing sedentary computer games but not as much energy as playing the sport itself. The energy used when playing active Wii Sports games was not of high enough intensity to contribute towards the recommended daily amount of exercise in children. PMID:18156227

  11. Comparison of energy expenditure in adolescents when playing new generation and sedentary computer games: cross sectional study.

    PubMed

    Graves, Lee; Stratton, Gareth; Ridgers, N D; Cable, N T

    2007-12-22

    To compare the energy expenditure of adolescents when playing sedentary and new generation active computer games. Cross sectional comparison of four computer games. Research laboratories. Six boys and five girls aged 13-15 years. Participants were fitted with a monitoring device validated to predict energy expenditure. They played four computer games for 15 minutes each. One of the games was sedentary (XBOX 360) and the other three were active (Wii Sports). Predicted energy expenditure, compared using repeated measures analysis of variance. Mean (standard deviation) predicted energy expenditure when playing Wii Sports bowling (190.6 (22.2) kJ/kg/min), tennis (202.5 (31.5) kJ/kg/min), and boxing (198.1 (33.9) kJ/kg/min) was significantly greater than when playing sedentary games (125.5 (13.7) kJ/kg/min) (P<0.001). Predicted energy expenditure was at least 65.1 (95% confidence interval 47.3 to 82.9) kJ/kg/min greater when playing active rather than sedentary games. Playing new generation active computer games uses significantly more energy than playing sedentary computer games but not as much energy as playing the sport itself. The energy used when playing active Wii Sports games was not of high enough intensity to contribute towards the recommended daily amount of exercise in children.

  12. Adaptive runtime for a multiprocessing API

    DOEpatents

    Antao, Samuel F.; Bertolli, Carlo; Eichenberger, Alexandre E.; O'Brien, John K.

    2016-11-15

    A computer-implemented method includes selecting a runtime for executing a program. The runtime includes a first combination of feature implementations, where each feature implementation implements a feature of an application programming interface (API). Execution of the program is monitored, and the execution uses the runtime. Monitor data is generated based on the monitoring. A second combination of feature implementations are selected, by a computer processor, where the selection is based at least in part on the monitor data. The runtime is modified by activating the second combination of feature implementations to replace the first combination of feature implementations.
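
    A toy sketch of the claimed mechanism: execute with one implementation of an API feature, monitor the execution, and activate an alternative implementation when the monitor data favors it. The two implementations and the timing-based policy are illustrative only, not the patented runtime.

        import time

        def reduce_serial(xs):                  # feature implementation 1
            total = 0
            for x in xs:
                total += x
            return total

        def reduce_builtin(xs):                 # feature implementation 2
            return sum(xs)

        class AdaptiveRuntime:
            """Activates whichever feature implementation the monitor data favors."""

            def __init__(self, impls):
                self.impls = impls
                self.active = impls[0]          # first combination of implementations

            def call(self, xs):
                t0 = time.perf_counter()
                result = self.active(xs)
                self._adapt(xs, time.perf_counter() - t0)   # monitor data drives selection
                return result

            def _adapt(self, xs, elapsed):
                # Naive policy (illustrative): re-time every alternative on each call
                # and activate the fastest; a real runtime would sample sparingly.
                best, best_t = self.active, elapsed
                for impl in self.impls:
                    t0 = time.perf_counter()
                    impl(xs)
                    t = time.perf_counter() - t0
                    if t < best_t:
                        best, best_t = impl, t
                self.active = best

        rt = AdaptiveRuntime([reduce_serial, reduce_builtin])
        rt.call(list(range(100_000)))
        print(rt.active.__name__)               # typically 'reduce_builtin'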

  13. Adaptive runtime for a multiprocessing API

    DOEpatents

    Antao, Samuel F.; Bertolli, Carlo; Eichenberger, Alexandre E.; O'Brien, John K.

    2016-10-11

    A computer-implemented method includes selecting a runtime for executing a program. The runtime includes a first combination of feature implementations, where each feature implementation implements a feature of an application programming interface (API). Execution of the program is monitored, and the execution uses the runtime. Monitor data is generated based on the monitoring. A second combination of feature implementations are selected, by a computer processor, where the selection is based at least in part on the monitor data. The runtime is modified by activating the second combination of feature implementations to replace the first combination of feature implementations.

  14. Experimental evaluations of wearable ECG monitor.

    PubMed

    Ha, Kiryong; Kim, Youngsung; Jung, Junyoung; Lee, Jeunwoo

    2008-01-01

    The healthcare industry is changing with the ubiquitous computing environment, and wearable ECG measurement is one of the most popular approaches in this industry. Reliability and performance of healthcare devices are fundamental issues for widespread adoption, and the interdisciplinary nature of wearable ECG monitors makes them more difficult to address. In this paper, we propose evaluation criteria that consider characteristics of both ECG measurement and ubiquitous computing. With our wearable ECG monitors, various levels of experimental analysis are performed based on this evaluation strategy.

  15. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  16. Monitoring Dosimetric Impact of Weight Loss With Kilovoltage (KV) Cone Beam CT (CBCT) During Parotid-Sparing IMRT and Concurrent Chemotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Kean Fatt, E-mail: hokeanfatt@hotmail.com; Marchant, Tom; Moore, Chris

    2012-03-01

    Purpose: Parotid-sparing head-and-neck intensity-modulated radiotherapy (IMRT) can reduce long-term xerostomia. However, patients frequently experience weight loss and tumor shrinkage during treatment. We evaluate the use of kilovoltage (kV) cone beam computed tomography (CBCT) for dose monitoring and examine if the dosimetric impact of such changes on the parotid and critical neural structures warrants replanning during treatment. Methods and materials: Ten patients with locally advanced oropharyngeal cancer were treated with contralateral parotid-sparing IMRT concurrently with platinum-based chemotherapy. Mean doses of 65 Gy and 54 Gy were delivered to clinical target volume (CTV)1 and CTV2, respectively, in 30 daily fractions. CBCT was prospectively acquired weekly. Each CBCT was coregistered with the planned isocenter. The spinal cord, brainstem, parotids, larynx, and oral cavity were outlined on each CBCT. Dose distributions were recalculated on the CBCT after correcting the gray scale to provide accurate Hounsfield calibration, using the original IMRT plan configuration. Results: Planned contralateral parotid mean doses were not significantly different from those delivered during treatment (p > 0.1). Ipsilateral and contralateral parotids showed a mean reduction in volume of 29.7% and 28.4%, respectively. There was no significant difference between planned and delivered maximum dose to the brainstem (p = 0.6) or spinal cord (p = 0.2), or mean dose to the larynx (p = 0.5) and oral cavity (p = 0.8). End-of-treatment mean weight loss was 7.5 kg (8.8% of baseline weight). Despite a ≥10% weight loss in 5 patients, there was no significant dosimetric change affecting the contralateral parotid and neural structures. Conclusions: Although patient weight loss and parotid volume shrinkage were observed, overall there was no significant excess dose to the organs at risk. No replanning was felt necessary for this patient cohort, but a larger patient sample will be investigated to further confirm these results. Nevertheless, kilovoltage CBCT is a valuable tool for patient setup verification and monitoring of dosimetric variation during radiotherapy.

  17. Assessing the performance of regional landslide early warning models: the EDuMaP method

    NASA Astrophysics Data System (ADS)

    Calvello, M.; Piciullo, L.

    2016-01-01

    A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from available landslides and warnings databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
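
    A deliberately simplified sketch of the second step, the duration matrix: each warning event carries a class and a time interval, each landslide event a class and a time, and cell (i, j) accumulates the duration of class-i warnings during which class-j landslides occurred. The class definitions and pairing rules of the published method are richer than this reduction.

        from dataclasses import dataclass

        @dataclass
        class WarningEvent:
            level: int      # warning class, e.g. 0 = none ... 3 = high
            start: float    # hours
            end: float      # hours

        @dataclass
        class LandslideEvent:
            grade: int      # landslide-event class
            time: float     # hours

        def duration_matrix(warnings, landslides, n_w, n_l):
            """d[i][j]: total hours of class-i warnings during which class-j landslides occurred."""
            d = [[0.0] * n_l for _ in range(n_w)]
            for w in warnings:
                hit = {ls.grade for ls in landslides if w.start <= ls.time < w.end}
                for j in hit:
                    d[w.level][j] += w.end - w.start
            return d

        warnings = [WarningEvent(2, 0.0, 6.0), WarningEvent(0, 6.0, 18.0), WarningEvent(3, 18.0, 24.0)]
        slides = [LandslideEvent(1, 2.5), LandslideEvent(2, 20.0)]
        for row in duration_matrix(warnings, slides, n_w=4, n_l=3):
            print(row)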

  18. Fault-tolerant battery system employing intra-battery network architecture

    DOEpatents

    Hagen, Ronald A.; Chen, Kenneth W.; Comte, Christophe; Knudson, Orlin B.; Rouillard, Jean

    2000-01-01

    A distributed energy storing system employing a communications network is disclosed. A distributed battery system includes a number of energy storing modules, each of which includes a processor and communications interface. In a network mode of operation, a battery computer communicates with each of the module processors over an intra-battery network and cooperates with individual module processors to coordinate module monitoring and control operations. The battery computer monitors a number of battery and module conditions, including the potential and current state of the battery and individual modules, and the conditions of the battery's thermal management system. An over-discharge protection system, equalization adjustment system, and communications system are also controlled by the battery computer. The battery computer logs and reports various status data on battery level conditions which may be reported to a separate system platform computer. A module transitions to a stand-alone mode of operation if the module detects an absence of communication connectivity with the battery computer. A module which operates in a stand-alone mode performs various monitoring and control functions locally within the module to ensure safe and continued operation.

  19. Automating ATLAS Computing Operations using the Site Status Board

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Borrego Iglesias, C.; Campana, S.; Di Girolamo, A.; Dzhunov, I.; Espinal Curull, X.; Gayazov, S.; Magradze, E.; Nowotka, M. M.; Rinaldi, L.; Saiz, P.; Schovancova, J.; Stewart, G. A.; Wright, M.

    2012-12-01

    The automation of operations is essential to reduce manpower costs and improve the reliability of the system. The Site Status Board (SSB) is a framework which allows Virtual Organizations to monitor their computing activities at distributed sites and to evaluate site performance. The ATLAS experiment intensively uses the SSB for the distributed computing shifts, for estimating data processing and data transfer efficiencies at a particular site, and for implementing automatic exclusion of sites from computing activities, in case of potential problems. The ATLAS SSB provides a real-time aggregated monitoring view and keeps the history of the monitoring metrics. Based on this history, usability of a site from the perspective of ATLAS is calculated. The paper will describe how the SSB is integrated in the ATLAS operations and computing infrastructure and will cover implementation details of the ATLAS SSB sensors and alarm system, based on the information in the SSB. It will demonstrate the positive impact of the use of the SSB on the overall performance of ATLAS computing activities and will overview future plans.

  20. Thermal and orbital analysis of Earth monitoring Sun-synchronous space experiments

    NASA Technical Reports Server (NTRS)

    Killough, Brian D.

    1990-01-01

    The fundamentals of an Earth monitoring Sun-synchronous orbit are presented. A Sun-synchronous Orbit Analysis Program (SOAP) was developed to calculate orbital parameters for an entire year. The output from this program provides the required input data for the TRASYS thermal radiation computer code, which in turn computes the infrared, solar and Earth albedo heat fluxes incident on a space experiment. Direct incident heat fluxes can be used as input to a generalized thermal analyzer program to size radiators and predict instrument operating temperatures. The SOAP computer code and its application to the thermal analysis methodology presented should prove useful to the thermal engineer during the design phases of Earth monitoring Sun-synchronous space experiments.
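
    The orbital side of such an analysis hinges on one well-known relation: Sun-synchrony requires the J2 nodal-precession rate to match the Sun's apparent mean motion (about 0.9856 deg/day), which fixes the inclination for a given altitude. A sketch for circular orbits using standard textbook constants (this is not the SOAP code itself):

        import math

        MU = 3.986004418e14     # Earth's gravitational parameter [m^3/s^2]
        R_E = 6378137.0         # Earth equatorial radius [m]
        J2 = 1.08263e-3         # Earth oblateness coefficient
        OMEGA_SS = 2.0 * math.pi / (365.2422 * 86400.0)  # required node rate [rad/s]

        def sun_sync_inclination_deg(altitude_m: float) -> float:
            """Inclination of a circular Sun-synchronous orbit at a given altitude."""
            a = R_E + altitude_m
            n = math.sqrt(MU / a**3)                     # mean motion [rad/s]
            cos_i = -OMEGA_SS / (1.5 * n * J2 * (R_E / a) ** 2)
            return math.degrees(math.acos(cos_i))

        print(f"{sun_sync_inclination_deg(700e3):.2f} deg")   # ~98.2 deg at 700 km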

  1. Checkpoint triggering in a computer system

    DOEpatents

    Cher, Chen-Yong

    2016-09-06

    According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
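
    A toy sketch of the claimed control flow: periodically read a task metric, derive a threshold from the metric's history, and create a checkpoint when the value crosses it. The metric, the threshold rule, and the checkpoint payload are placeholders, not the patented method.

        import random

        def read_monitor() -> float:
            """Stand-in for reading a task metric (e.g. a correctable-error rate)."""
            return random.expovariate(1.0)

        def create_checkpoint(step: int, state: dict) -> None:
            print(f"checkpoint created at step {step}: {state}")

        def run_task(n_steps: int, read_interval: int) -> None:
            state, history = {"progress": 0}, []
            for step in range(1, n_steps + 1):
                state["progress"] = step                      # the task's real work
                if step % read_interval == 0:                 # time to read the monitor?
                    value = read_monitor()
                    # Threshold derived from the metric's observed history (simplified).
                    if history and value > 3.0 * sum(history) / len(history):
                        create_checkpoint(step, dict(state))  # enables restart from here
                    history.append(value)

        random.seed(4)
        run_task(n_steps=200, read_interval=10)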

  2. Inductive System Health Monitoring

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    2004-01-01

    The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real-time monitoring. IMS uses nominal data sets collected either directly from the system or from simulations to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. IMS is able to monitor the system by comparing real-time operational data with these classes. We present a description of the learning and monitoring method used by IMS and summarize some recent IMS results.
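
    A compact sketch of the general pattern IMS instantiates: cluster archived nominal data into classes, set a distance threshold from the training data, and flag live vectors that fall too far from every class. scikit-learn's KMeans stands in here for the actual IMS learning step, and all data are synthetic.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        # Archived nominal data: two correlated "sensor" channels (synthetic).
        nominal = rng.normal(loc=[10.0, 50.0], scale=[0.5, 2.0], size=(2000, 2))

        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(nominal)
        train_dist = km.transform(nominal).min(axis=1)     # distance to nearest nominal class
        threshold = np.percentile(train_dist, 99.5)        # tolerate rare nominal outliers

        def is_anomalous(sample: np.ndarray) -> bool:
            """Flag a live sensor vector that matches no nominal class."""
            return float(km.transform(sample.reshape(1, -1)).min()) > threshold

        print(is_anomalous(np.array([10.1, 49.0])))   # False: near archived behavior
        print(is_anomalous(np.array([14.0, 70.0])))   # True: unlike any nominal class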

  3. Building a transnational biosurveillance network using semantic web technologies: requirements, design, and preliminary evaluation.

    PubMed

    Teodoro, Douglas; Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-05-29

    Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1 × 10² seconds). Clinical pertinence assessment showed that resistance trends automatically calculated by ARTEMIS had a strong positive correlation with the European Antimicrobial Resistance Surveillance Network (EARS-Net) (ρ = .86, P < .001) and the Sentinel Surveillance of Antibiotic Resistance in Switzerland (SEARCH) (ρ = .84, P < .001) systems. Furthermore, mean resistance rates extracted by ARTEMIS were not significantly different from those of either EARS-Net (∆ = ±0.130; 95% confidence interval -0 to 0.030; P < .001) or SEARCH (∆ = ±0.042; 95% confidence interval -0.004 to 0.028; P = .004). We introduce a distributed monitoring architecture that can be used to build transnational antimicrobial resistance surveillance networks. Results indicated that the Semantic Web-based approach provided an efficient and reliable solution for development of eHealth architectures that enable online antimicrobial resistance monitoring from heterogeneous data sources. In future, we expect that more health care institutions can join the ARTEMIS network so that it can provide a large European and wider biosurveillance network that can be used to detect emerging bacterial resistance in a multinational context and support public health actions.

  4. Building a Transnational Biosurveillance Network Using Semantic Web Technologies: Requirements, Design, and Preliminary Evaluation

    PubMed Central

    Pasche, Emilie; Gobeill, Julien; Emonet, Stéphane; Ruch, Patrick; Lovis, Christian

    2012-01-01

    Background Antimicrobial resistance has reached globally alarming levels and is becoming a major public health threat. Lack of efficacious antimicrobial resistance surveillance systems was identified as one of the causes of increasing resistance, due to the lag time between new resistances and alerts to care providers. Several initiatives to track drug resistance evolution have been developed. However, no effective real-time and source-independent antimicrobial resistance monitoring system is available publicly. Objective To design and implement an architecture that can provide real-time and source-independent antimicrobial resistance monitoring to support transnational resistance surveillance. In particular, we investigated the use of a Semantic Web-based model to foster integration and interoperability of interinstitutional and cross-border microbiology laboratory databases. Methods Following the agile software development methodology, we derived the main requirements needed for effective antimicrobial resistance monitoring, from which we proposed a decentralized monitoring architecture based on the Semantic Web stack. The architecture uses an ontology-driven approach to promote the integration of a network of sentinel hospitals or laboratories. Local databases are wrapped into semantic data repositories that automatically expose local computing-formalized laboratory information in the Web. A central source mediator, based on local reasoning, coordinates the access to the semantic end points. On the user side, a user-friendly Web interface provides access and graphical visualization to the integrated views. Results We designed and implemented the online Antimicrobial Resistance Trend Monitoring System (ARTEMIS) in a pilot network of seven European health care institutions sharing 70+ million triples of information about drug resistance and consumption. Evaluation of the computing performance of the mediator demonstrated that, on average, query response time was a few seconds (mean 4.3, SD 0.1×10² seconds). Clinical pertinence assessment showed that resistance trends automatically calculated by ARTEMIS had a strong positive correlation with the European Antimicrobial Resistance Surveillance Network (EARS-Net) (ρ = .86, P < .001) and the Sentinel Surveillance of Antibiotic Resistance in Switzerland (SEARCH) (ρ = .84, P < .001) systems. Furthermore, mean resistance rates extracted by ARTEMIS were not significantly different from those of either EARS-Net (∆ = ±0.130; 95% confidence interval –0 to 0.030; P < .001) or SEARCH (∆ = ±0.042; 95% confidence interval –0.004 to 0.028; P = .004). Conclusions We introduce a distributed monitoring architecture that can be used to build transnational antimicrobial resistance surveillance networks. Results indicated that the Semantic Web-based approach provided an efficient and reliable solution for development of eHealth architectures that enable online antimicrobial resistance monitoring from heterogeneous data sources. In future, we expect that more health care institutions can join the ARTEMIS network so that it can provide a large European and wider biosurveillance network that can be used to detect emerging bacterial resistance in a multinational context and support public health actions. PMID:22642960

  5. Can we measure beauty? Computational evaluation of coral reef aesthetics

    PubMed Central

    Guibert, Marine; Foerschner, Anja; Co, Tim; Calhoun, Sandi; George, Emma; Hatay, Mark; Dinsdale, Elizabeth; Sandin, Stuart A.; Smith, Jennifer E.; Vermeij, Mark J.A.; Felts, Ben; Dustan, Phillip; Salamon, Peter; Rohwer, Forest

    2015-01-01

    The natural beauty of coral reefs attracts millions of tourists worldwide, resulting in substantial revenues for the adjoining economies. Although their visual appearance is a pivotal factor attracting humans to coral reefs, current monitoring protocols exclusively target biogeochemical parameters, neglecting changes in their aesthetic appearance. Here we introduce a standardized computational approach to assess coral reef environments based on 109 visual features designed to evaluate the aesthetic appearance of art. The main feature groups include color intensity and diversity of the image, relative size, color, and distribution of discernable objects within the image, and texture. Specific coral reef aesthetic values combining all 109 features were calibrated against an established biogeochemical assessment (NCEAS) using machine learning algorithms. These values were generated for ∼2,100 random photographic images collected from 9 coral reef locations exposed to varying levels of anthropogenic influence across 2 ocean systems. Aesthetic values proved accurate predictors of the NCEAS scores (root mean square error < 5 for N ≥ 3) and significantly correlated to microbial abundance at each site. This shows that mathematical approaches designed to assess the aesthetic appearance of photographic images can be used as an inexpensive monitoring tool for coral reef ecosystems. It further suggests that human perception of aesthetics is not purely subjective but influenced by inherent reactions towards measurable visual cues. By quantifying aesthetic features of coral reef systems this method provides a cost-efficient monitoring tool that targets one of the most important socioeconomic values of coral reefs directly tied to revenue for its local population. PMID:26587350

  6. Measuring phenological variability from satellite imagery

    USGS Publications Warehouse

    Reed, Bradley C.; Brown, Jesslyn F.; Vanderzee, D.; Loveland, Thomas R.; Merchant, James W.; Ohlen, Donald O.

    1994-01-01

    Vegetation phenological phenomena are closely related to seasonal dynamics of the lower atmosphere and are therefore important elements in global models and vegetation monitoring. Normalized difference vegetation index (NDVI) data derived from the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer (AVHRR) satellite sensor offer a means of efficiently and objectively evaluating phenological characteristics over large areas. Twelve metrics linked to key phenological events were computed based on time-series NDVI data collected from 1989 to 1992 over the conterminous United States. These measures include the onset of greenness, time of peak NDVI, maximum NDVI, rate of greenup, rate of senescence, and integrated NDVI. Measures of central tendency and variability of the measures were computed and analyzed for various land cover types. Results from the analysis showed strong coincidence between the satellite-derived metrics and predicted phenological characteristics. In particular, the metrics identified interannual variability of spring wheat in North Dakota, characterized the phenology of four types of grasslands, and established the phenological consistency of deciduous and coniferous forests. These results have implications for large-area land cover mapping and monitoring. The utility of remotely sensed data as input to vegetation mapping is demonstrated by showing the distinct phenology of several land cover types. More stable information contained in ancillary data should be incorporated into the mapping process, particularly in areas with high phenological variability. In a regional or global monitoring system, an increase in variability in a region may serve as a signal to perform more detailed land cover analysis with higher resolution imagery.
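
    A simplified sketch of three of the twelve metrics: onset of greenness as the first crossing of a fraction of the seasonal amplitude, time and value of peak NDVI, and integrated NDVI above the seasonal baseline. The threshold rule is a common simplification; the paper's own metric definitions differ in detail.

        import numpy as np

        def phenology_metrics(ndvi: np.ndarray, frac: float = 0.2):
            """Onset index, peak index, peak value, and integrated NDVI of one season."""
            base, peak = float(ndvi.min()), float(ndvi.max())
            onset_level = base + frac * (peak - base)
            onset = int(np.argmax(ndvi >= onset_level))      # first threshold crossing
            t_peak = int(np.argmax(ndvi))
            integrated = float(np.sum(ndvi - base))          # discrete integral above baseline
            return onset, t_peak, peak, integrated

        # Synthetic single-season curve of 26 biweekly composites.
        t = np.arange(26)
        ndvi = 0.15 + 0.55 * np.exp(-0.5 * ((t - 13) / 4.0) ** 2)
        print(phenology_metrics(ndvi))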

  7. Application of ubiquitous computing in personal health monitoring systems.

    PubMed

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    A possibility to significantly reduce the costs of public health systems is to increasingly use information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.

  8. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  9. Computer simulation of a space SAR using a range-sequential processor for soil moisture mapping

    NASA Technical Reports Server (NTRS)

    Fujita, M.; Ulaby, F. (Principal Investigator)

    1982-01-01

    The ability of a spaceborne synthetic aperture radar (SAR) to detect soil moisture was evaluated by means of a computer simulation technique. The computer simulation package includes coherent processing of the SAR data using a range-sequential processor, which can be set up through hardware implementations, thereby reducing the amount of telemetry involved. With such a processing approach, it is possible to monitor the earth's surface on a continuous basis, since data storage requirements can be easily met through the use of currently available technology. The development of the simulation package is described, followed by an examination of the application of the technique to actual environments. The results indicate that in estimating soil moisture content with a four-look processor, the difference between the assumed and estimated values of soil moisture is within ±20% of field capacity for 62% of the pixels for agricultural terrain and for 53% of the pixels for hilly terrain. The estimation accuracy for soil moisture may be improved by reducing the effect of fading through non-coherent averaging.
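
    The closing remark about non-coherent averaging can be illustrated directly: averaging N independent looks of exponentially distributed single-look intensity reduces the relative spread (coefficient of variation) of the fading by roughly sqrt(N). A sketch under those standard speckle-model assumptions, not tied to the paper's simulation package:

```python
import numpy as np

def multilook(intensity_looks):
    """Non-coherent averaging of independent looks of the same scene.

    intensity_looks: array of shape (n_looks, n_pixels) holding
    single-look SAR intensity values.
    """
    return intensity_looks.mean(axis=0)

rng = np.random.default_rng(0)
# Standard speckle model: single-look intensity is exponentially
# distributed with mean equal to the true backscatter.
looks = rng.exponential(scale=1.0, size=(4, 100_000))

single_cv = looks[0].std() / looks[0].mean()   # ~1.0 for one look
four = multilook(looks)
four_cv = four.std() / four.mean()             # ~0.5 for four looks
print(round(single_cv, 2), round(four_cv, 2))
```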

  10. A machine learning approach to improve contactless heart rate monitoring using a webcam.

    PubMed

    Monkaresi, Hamed; Calvo, Rafael A; Yan, Hong

    2014-07-01

    Unobtrusive, contactless recordings of physiological signals are very important for many health and human-computer interaction applications. Most current systems require sensors which intrusively touch the user's skin. Recent advances in contact-free measurement of physiological signals open the door to many new types of applications. This technology promises to measure heart rate (HR) and respiration using video only. The effectiveness of this technology, its limitations, and ways of overcoming them deserve particular attention. In this paper, we evaluate this technique for measuring HR in a controlled situation, in a naturalistic computer interaction session, and in an exercise situation. For comparison, HR was measured simultaneously using an electrocardiography device during all sessions. The results replicated the published results in controlled situations, but showed that the technique cannot yet be considered a valid measure of HR in naturalistic human-computer interaction. We propose a machine learning approach to improve the accuracy of HR detection in naturalistic measurements. The results demonstrate that the root mean squared error is reduced from 43.76 to 3.64 beats/min using the proposed method.
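
    As context for what the machine learning layer improves on, a baseline spectral estimate of HR from a camera-derived trace might look as follows (a sketch only: the face-region extraction and the paper's learning stage are omitted, and the trace here is synthetic):

```python
import numpy as np

def estimate_hr_bpm(trace, fs, lo_hz=0.75, hi_hz=4.0):
    """Estimate heart rate from a mean-colour (rPPG-style) trace.

    trace: 1-D array, e.g. mean green-channel value of a face region per
    video frame; fs: frame rate in Hz. Searches 0.75-4 Hz (45-240 bpm).
    """
    x = trace - trace.mean()
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Synthetic 30 s trace at 30 frames/s with a 1.2 Hz (72 bpm) pulse.
fs = 30.0
t = np.arange(0, 30, 1.0 / fs)
trace = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)
print(estimate_hr_bpm(trace, fs))   # close to 72
```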

  11. A cloud-based information repository for bridge monitoring applications

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Zhang, Yilan; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2016-04-01

    This paper describes an information repository to support bridge monitoring applications on a cloud computing platform. Bridge monitoring, with instrumentation of sensors in particular, collects significant amount of data. In addition to sensor data, a wide variety of information such as bridge geometry, analysis model and sensor description need to be stored. Data management plays an important role to facilitate data utilization and data sharing. While bridge information modeling (BrIM) technologies and standards have been proposed and they provide a means to enable integration and facilitate interoperability, current BrIM standards support mostly the information about bridge geometry. In this study, we extend the BrIM schema to include analysis models and sensor information. Specifically, using the OpenBrIM standards as the base, we draw on CSI Bridge, a commercial software widely used for bridge analysis and design, and SensorML, a standard schema for sensor definition, to define the data entities necessary for bridge monitoring applications. NoSQL database systems are employed for data repository. Cloud service infrastructure is deployed to enhance scalability, flexibility and accessibility of the data management system. The data model and systems are tested using the bridge model and the sensor data collected at the Telegraph Road Bridge, Monroe, Michigan.

  12. Wireless Monitoring of Liver Hemodynamics In Vivo

    DOE PAGES

    Akl, Tony J.; Wilson, Mark A.; Ericson, M. Nance; ...

    2014-07-14

    Liver transplants have their highest failure rate in the first two weeks following surgery. Currently, there are no devices for continuous, real-time monitoring of the graft. Here, we present a continuous perfusion and oxygen consumption monitor based on photoplethysmography. The sensor is battery operated and communicates wirelessly with a data acquisition computer, which offers the possibility of implantation provided sufficient miniaturization. In two in vivo porcine studies, the sensor tracked perfusion changes in hepatic tissue during vascular occlusions with a root mean square error (RMSE) of 0.125 mL/min/g of tissue. We show the possibility of using the pulsatile wave to measure the arterial oxygen saturation, similar to pulse oximetry. This signal is used as feedback to extract the venous oxygen saturation from the DC levels. Arterial and venous oxygen saturation changes were measured with an RMSE of 2.19% and 1.39%, respectively, when no vascular occlusions were induced. The error increased to 2.82% and 3.83% when vascular occlusions were induced during hypoxia. These errors are similar to the resolution of the oximetry catheter used as a reference. This work is the first realization of a wireless perfusion and oxygenation sensor for continuous monitoring of hepatic perfusion and oxygenation changes.

  13. Prenatal Remote Monitoring of Women With Gestational Hypertensive Diseases: Cost Analysis

    PubMed Central

    Vandenberk, Thijs; Smeets, Christophe JP; De Cannière, Hélène; Vonck, Sharona; Claessens, Jade; Heyrman, Yenthel; Vandijck, Dominique; Storms, Valerie; Thijs, Inge M; Grieten, Lars; Gyselaers, Wilfried

    2018-01-01

    Background Remote monitoring in obstetrics is relatively new; some studies have shown its effectiveness for both mother and child. However, few studies have evaluated the economic impact compared to conventional care, and no cost analysis of a remote monitoring prenatal follow-up program for women diagnosed with gestational hypertensive diseases (GHD) has been published. Objective The aim of this study was to assess the costs of remote monitoring versus conventional care relative to reported benefits. Methods Patient data from the Pregnancy Remote Monitoring (PREMOM) study were used. Health care costs were calculated from patient-specific hospital bills of Ziekenhuis Oost-Limburg (Genk, Belgium) in 2015. Cost comparison was made from three perspectives: the Belgian national health care system (HCS), the National Institution for Insurance of Disease and Disability (RIZIV), and costs for individual patients. The calculations were made for four major domains: prenatal follow-up, prenatal admission to the hospital, maternal and neonatal care at and after delivery, and total amount of costs. A simulation exercise estimated how much could be claimed from RIZIV to fund the remote monitoring service. Results A total of 140 pregnancies were included, of which 43 received remote monitoring (30.7%) and 97 received conventional care (69.3%). From the three perspectives, there were no differences in costs for prenatal follow-up. Compared to conventional care, remote monitoring patients had 34.51% lower HCS and 41.72% lower RIZIV costs for laboratory test results (HCS: mean €0.00 [SD €55.34] vs mean €38.28 [SD €44.08], P<.001; RIZIV: mean €21.09 [SD €27.94] vs mean €36.19 [SD €41.36], P<.001) and a reduction of 47.16% in HCS and 48.19% in RIZIV costs for neonatal care (HCS: mean €989.66 [SD €3020.22] vs mean €1872.92 [SD €5058.31], P<.001; RIZIV: mean €872.97 [SD €2761.64] vs mean €1684.86 [SD €4702.20], P<.001). HCS costs for medication were 1.92% lower in remote monitoring than conventional care (mean €209.22 [SD €213.32] vs mean €231.32 [SD €67.09], P=.02), but were 0.69% higher for RIZIV (mean €122.60 [SD €92.02] vs mean €121.78 [SD €20.77], P<.001). Overall HCS costs for remote monitoring were mean €4233.31 (SD €3463.31) per person and mean €4973.69 (SD €5219.00) per person for conventional care (P=.82), a reduction of €740.38 (14.89%) per person, with savings mainly for RIZIV of €848.97 per person (23.18%; mean €2797.42 [SD €2905.18] vs mean €3646.39 [SD €4878.47], P=.19). Even when an additional fee of €525.07 per month per pregnant woman is charged to fund the remote monitoring service, remote monitoring remains acceptable in cost for the HCS, RIZIV, and individual patients. Conclusions In the current organization of Belgian health care, a remote monitoring prenatal follow-up of women with GHD is cost saving for the global health care system, mainly via savings for the insurance institution RIZIV. PMID:29581094

  14. Prenatal Remote Monitoring of Women With Gestational Hypertensive Diseases: Cost Analysis.

    PubMed

    Lanssens, Dorien; Vandenberk, Thijs; Smeets, Christophe Jp; De Cannière, Hélène; Vonck, Sharona; Claessens, Jade; Heyrman, Yenthel; Vandijck, Dominique; Storms, Valerie; Thijs, Inge M; Grieten, Lars; Gyselaers, Wilfried

    2018-03-26

    Remote monitoring in obstetrics is relatively new; some studies have shown its effectiveness for both mother and child. However, few studies have evaluated the economic impact compared to conventional care, and no cost analysis of a remote monitoring prenatal follow-up program for women diagnosed with gestational hypertensive diseases (GHD) has been published. The aim of this study was to assess the costs of remote monitoring versus conventional care relative to reported benefits. Patient data from the Pregnancy Remote Monitoring (PREMOM) study were used. Health care costs were calculated from patient-specific hospital bills of Ziekenhuis Oost-Limburg (Genk, Belgium) in 2015. Cost comparison was made from three perspectives: the Belgian national health care system (HCS), the National Institution for Insurance of Disease and Disability (RIZIV), and costs for individual patients. The calculations were made for four major domains: prenatal follow-up, prenatal admission to the hospital, maternal and neonatal care at and after delivery, and total amount of costs. A simulation exercise estimated how much could be claimed from RIZIV to fund the remote monitoring service. A total of 140 pregnancies were included, of which 43 received remote monitoring (30.7%) and 97 received conventional care (69.3%). From the three perspectives, there were no differences in costs for prenatal follow-up. Compared to conventional care, remote monitoring patients had 34.51% lower HCS and 41.72% lower RIZIV costs for laboratory test results (HCS: mean €0.00 [SD €55.34] vs mean €38.28 [SD €44.08], P<.001; RIZIV: mean €21.09 [SD €27.94] vs mean €36.19 [SD €41.36], P<.001) and a reduction of 47.16% in HCS and 48.19% in RIZIV costs for neonatal care (HCS: mean €989.66 [SD €3020.22] vs mean €1872.92 [SD €5058.31], P<.001; RIZIV: mean €872.97 [SD €2761.64] vs mean €1684.86 [SD €4702.20], P<.001). HCS costs for medication were 1.92% lower in remote monitoring than conventional care (mean €209.22 [SD €213.32] vs mean €231.32 [SD €67.09], P=.02), but were 0.69% higher for RIZIV (mean €122.60 [SD €92.02] vs mean €121.78 [SD €20.77], P<.001). Overall HCS costs for remote monitoring were mean €4233.31 (SD €3463.31) per person and mean €4973.69 (SD €5219.00) per person for conventional care (P=.82), a reduction of €740.38 (14.89%) per person, with savings mainly for RIZIV of €848.97 per person (23.18%; mean €2797.42 [SD €2905.18] vs mean €3646.39 [SD €4878.47], P=.19). Even when an additional fee of €525.07 per month per pregnant woman is charged to fund the remote monitoring service, remote monitoring remains acceptable in cost for the HCS, RIZIV, and individual patients. In the current organization of Belgian health care, a remote monitoring prenatal follow-up of women with GHD is cost saving for the global health care system, mainly via savings for the insurance institution RIZIV. ©Dorien Lanssens, Thijs Vandenberk, Christophe JP Smeets, Hélène De Cannière, Sharona Vonck, Jade Claessens, Yenthel Heyrman, Dominique Vandijck, Valerie Storms, Inge M Thijs, Lars Grieten, Wilfried Gyselaers. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.03.2018.

  15. Low-complexity R-peak detection in ECG signals: a preliminary step towards ambulatory fetal monitoring.

    PubMed

    Rooijakkers, Michiel; Rabotti, Chiara; Bennebroek, Martijn; van Meerbergen, Jef; Mischi, Massimo

    2011-01-01

    Non-invasive fetal health monitoring during pregnancy has become increasingly important. Recent advances in signal processing technology have enabled fetal monitoring during pregnancy using abdominal ECG recordings. Ubiquitous ambulatory monitoring for continuous fetal health measurement is, however, still unfeasible due to the computational complexity of noise-robust solutions. In this paper an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity while increasing R-peak detection quality compared to existing R-peak detection schemes. Validation of the algorithm is performed on two manually annotated datasets, the MIT/BIH Arrhythmia database and an in-house abdominal database. Both R-peak detection quality and computational complexity are compared to state-of-the-art algorithms as described in the literature. With a detection error rate of 0.22% and 0.12% on the MIT/BIH Arrhythmia and in-house databases, respectively, the quality of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
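
    The flavour of a low-complexity R-peak detector can be conveyed with a generic sketch (not the paper's algorithm: a squared-derivative feature, a simple fixed threshold, and a refractory period, in the spirit of classic Pan-Tompkins-style schemes):

```python
import numpy as np

def detect_r_peaks(ecg, fs, refractory_s=0.25):
    """Generic low-complexity R-peak detector (illustrative sketch).

    ecg: 1-D array of ECG samples; fs: sampling rate in Hz.
    The squared first difference emphasises the steep QRS slopes; a
    refractory period suppresses double detections of the same beat.
    """
    d = np.diff(ecg)
    feature = d * d                          # squared slope
    threshold = 4.0 * feature.mean()         # crude adaptive threshold
    min_gap = int(refractory_s * fs)         # ignore peaks closer than this
    peaks, last = [], -min_gap
    for i, v in enumerate(feature):
        if v > threshold and i - last >= min_gap:
            # refine to the local ECG maximum in a small window
            w0 = max(0, i - min_gap // 2)
            w1 = min(len(ecg), i + min_gap // 2)
            peaks.append(w0 + int(np.argmax(ecg[w0:w1])))
            last = i
    return np.array(peaks)

# Synthetic spike train: one sharp "R wave" per second for 10 s at 250 Hz.
fs = 250
x = np.zeros(10 * fs)
x[::fs] = 1.0
print(detect_r_peaks(x, fs))   # indices near 0, 250, 500, ...
```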

  16. Exploiting analytics techniques in CMS computing monitoring

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Kuznetsov, V.; Magini, N.; Repečka, A.; Vaandering, E.

    2017-10-01

    The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining efforts on all this information have rarely been undertaken, but are of crucial importance for a better understanding of how CMS achieved successful operations, and for reaching an adequate and adaptive modelling of CMS operations, in order to allow detailed optimizations and eventually a prediction of system behaviours. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote to disk at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced by the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.
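
    The MapReduce pattern behind such applications is simple to sketch in plain Python (the dataset names and log records below are invented for illustration; the real system runs MapReduce jobs on the CERN Hadoop cluster):

```python
from collections import Counter
from functools import reduce

# Hypothetical access-log records: (dataset_name, site) pairs.
logs = [
    ("/ZMM/Run2016/AOD", "T2_US_MIT"),
    ("/ZMM/Run2016/AOD", "T2_DE_DESY"),
    ("/TTJets/Run2016/MINIAOD", "T2_US_MIT"),
    ("/ZMM/Run2016/AOD", "T1_IT_CNAF"),
]

def mapper(record):
    dataset, _site = record
    return Counter({dataset: 1})      # emit (dataset, 1)

def reducer(a, b):
    return a + b                      # sum the counts per dataset

popularity = reduce(reducer, map(mapper, logs), Counter())
print(popularity.most_common())       # most requested datasets first
```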

  17. Computed tomography correlates with improvement with ivacaftor in cystic fibrosis patients with G551D mutation.

    PubMed

    Sheikh, Shahid I; Long, Frederick R; McCoy, Karen S; Johnson, Terri; Ryan-Wenger, Nancy A; Hayes, Don

    2015-01-01

    Ivacaftor corrects the cystic fibrosis transmembrane conductance regulator (CFTR) gating defect associated with the G551D mutation and is quickly becoming an important treatment in patients with cystic fibrosis (CF) due to this genetic mutation. A single-center study was performed in CF patients receiving ivacaftor to evaluate the usefulness of high resolution computed tomography (HRCT) of the chest as a way to gauge response to ivacaftor therapy. Ten patients with CF were followed for at least one year before and after starting ivacaftor. At the time of enrollment, mean age was 20.9 ± 10.8 (range 10-44) years. There were significant improvements from baseline to 6 months in mean %FVC (93 ± 16 to 99 ± 16) and %FEV1 (79 ± 26 to 87 ± 28), but values reverted to baseline at one year. Mean sweat chloride levels decreased significantly from baseline to one year. Mean weight and BMI improved at 6 months; weight continued to improve, with stabilization of BMI, at one year. Chest HRCT showed significant improvement at one year in mean modified Brody scores for bronchiectasis, mucous plugging, airway wall thickness, and total Brody scores. Elevated bronchiectasis and airway wall thickness scores correlated significantly with lower %FEV1, while higher airway wall thickness and mucus plugging scores correlated with more pulmonary exacerbations requiring intravenous and oral antibiotics, respectively. Based on our findings, HRCT imaging is a useful tool in monitoring response to ivacaftor therapy that corrects the gating defect associated with the G551D-CFTR mutation. Copyright © 2014 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  18. HOME - An application of fault-tolerant techniques and system self-testing. [independent computer for helicopter flight control command monitoring

    NASA Technical Reports Server (NTRS)

    Holden, D. G.

    1975-01-01

    Hard Over Monitoring Equipment (HOME) has been designed to complement and enhance the flight safety of a flight research helicopter. HOME is an independent, highly reliable, and fail-safe special purpose computer that monitors the flight control commands issued by the flight control computer of the helicopter. In particular, HOME detects the issuance of a hazardous hard-over command on any of the four flight control axes and transfers control of the helicopter to the flight safety pilot. The design of HOME incorporates reliability and fail-safe enhancement features such as triple modular redundancy, majority logic voting, fail-safe dual circuits, independent status monitors, in-flight self-test, and a built-in preflight exerciser. The HOME design and operation are described, with special emphasis on the reliability and fail-safe aspects of the design.
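
    The majority-logic voting at the core of such triple-modular-redundant designs reduces to a few lines; a sketch of the idea (illustrative only, not the HOME circuitry itself):

```python
def tmr_vote(a, b, c):
    """Majority vote over three redundant channel outputs.

    Returns (value, ok): the value agreed by at least two channels, or
    (None, False) when all three disagree, so a monitor can fail safe
    rather than pass through a potentially hazardous command.
    """
    if a == b or a == c:
        return a, True
    if b == c:
        return b, True
    return None, False    # no majority: take the fail-safe path

print(tmr_vote(1, 1, 0))   # (1, True)  -- a single channel fault is masked
print(tmr_vote(1, 2, 3))   # (None, False) -- total disagreement, fail safe
```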

  19. A remote access ecg monitoring system - biomed 2009.

    PubMed

    Ogawa, Hidekuni; Yonezawa, Yoshiharu; Maki, Hiromichi; Iwamoto, Junichi; Hahn, Allen W; Caldwell, W Morton

    2009-01-01

    We have developed a remotely accessible telemedicine system for monitoring a patient's electrocardiogram (ECG). The system consists of an ECG recorder mounted on chest electrodes and a physician's laptop personal computer. The ECG recorder is built around a variable-gain instrumentation amplifier, a low-power 8-bit single-chip microcomputer, two 128 KB EEPROMs, and a 2.4 GHz low-transmit-power mobile telephone. When the physician wants to monitor the patient's ECG, he/she calls directly from the laptop PC to the ECG recorder's phone, and the recorder sends the ECG to the computer. The electrode-mounted recorder continuously samples the ECG. Additionally, when the patient feels heart discomfort, he/she pushes a data transmission switch on the recorder, and the recorder sends the ECG waveforms recorded during the two minutes before and the two minutes after the switch is pressed. The physician can display and monitor the data on the computer's liquid crystal display.
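
    The two-minutes-before/after capture is a classic pre/post-trigger ring buffer; a sketch of the idea (hypothetical class, not the recorder's actual firmware):

```python
from collections import deque

class EventRecorder:
    """Pre/post-trigger capture: keep the last `pre_s` seconds of samples
    in a ring buffer; on a trigger, save them plus the next `post_s`
    seconds of incoming samples."""

    def __init__(self, fs, pre_s=120, post_s=120):
        self.fs, self.post_s = fs, post_s
        self.ring = deque(maxlen=fs * pre_s)  # oldest samples drop off
        self.post_remaining = 0
        self.capture = None                   # completed capture, if any

    def push(self, sample):
        self.ring.append(sample)
        if self.post_remaining > 0:
            self._pending.append(sample)
            self.post_remaining -= 1
            if self.post_remaining == 0:
                self.capture = self._pending  # four minutes, ready to send

    def trigger(self):                        # patient presses the switch
        self._pending = list(self.ring)       # the two minutes of history
        self.post_remaining = self.fs * self.post_s

# Usage: rec = EventRecorder(fs=200); call rec.push(s) for every sample,
# rec.trigger() on the switch press; rec.capture holds the waveform
# once the post-trigger period has elapsed.
```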

  20. Deep space telecommunications, navigation, and information management - Support of the Space Exploration Initiative

    NASA Technical Reports Server (NTRS)

    Hall, Justin R.; Hastrup, Rolf C.

    1990-01-01

    The principal challenges in providing effective deep space navigation, telecommunications, and information management architectures and designs for Mars exploration support are presented. The fundamental objectives are to provide the mission with the means to monitor and control mission elements; obtain science, navigation, and engineering data; compute state vectors and navigate; and move these data efficiently and automatically between mission nodes for timely analysis and decision making. New requirements are summarized, and related issues and challenges, including robust connectivity for manned and robotic links, are identified. Enabling strategies are discussed, and candidate architectures and driving technologies are described.

  1. Biochips Containing Arrays of Carbon-Nanotube Electrodes

    NASA Technical Reports Server (NTRS)

    Li, Jun; Meyyappan, M.; Koehne, Jessica; Cassell, Alan; Chen, Hua

    2008-01-01

    Biochips containing arrays of nanoelectrodes based on multiwalled carbon nanotubes (MWCNTs) are being developed as a means of ultrasensitive electrochemical detection of specific deoxyribonucleic acid (DNA) and messenger ribonucleic acid (mRNA) biomarkers for purposes of medical diagnosis and bioenvironmental monitoring. In mass production, these biochips could be relatively inexpensive (hence, disposable). The biochips would be integrated with computer-controlled microfluidic and microelectronic devices in automated hand-held and bench-top instruments that could be used to perform rapid in vitro genetic analyses with simplified sample preparation. Carbon nanotubes are attractive for use as nanoelectrodes for detection of biomolecules because of their nanoscale dimensions and their chemical properties.

  2. NASA's Aviation Safety and Modeling Project

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Statler, Irving C.

    2006-01-01

    The Aviation Safety Monitoring and Modeling (ASMM) Project of NASA's Aviation Safety Program is cultivating sources of data and developing automated computer hardware and software to facilitate efficient, comprehensive, and accurate analyses of the data collected from large, heterogeneous databases throughout the national aviation system. The ASMM addresses the need to increase safety by enabling the identification and correction of predisposing conditions that could lead to accidents or to incidents that pose aviation risks. A major component of the ASMM Project is the Aviation Performance Measuring System (APMS), which is developing the next generation of software tools for analyzing and interpreting flight data.

  3. Study on Information Security and e-Trust in Spanish households

    NASA Astrophysics Data System (ADS)

    Aguado, José

    The study on Information Security and e-Trust in Spanish households has been conducted by INTECO (The National Institute of Communication Technologies) through the Information Security Observatory. It is a study of security incidents and users' trust in the Internet, measuring the frequency of individual risk episodes in a wide sample of users monitored online on a monthly basis, and combining quantitative incidence data (monthly scans of home computers) with qualitative perception data (quarterly surveys). The study draws on data from more than 3,000 households with Internet connections, spread across the whole country.

  4. Deep space telecommunications, navigation, and information management - Support of the Space Exploration Initiative

    NASA Astrophysics Data System (ADS)

    Hall, Justin R.; Hastrup, Rolf C.

    1990-10-01

    The principal challenges in providing effective deep space navigation, telecommunications, and information management architectures and designs for Mars exploration support are presented. The fundamental objectives are to provide the mission with the means to monitor and control mission elements; obtain science, navigation, and engineering data; compute state vectors and navigate; and move these data efficiently and automatically between mission nodes for timely analysis and decision making. New requirements are summarized, and related issues and challenges, including robust connectivity for manned and robotic links, are identified. Enabling strategies are discussed, and candidate architectures and driving technologies are described.

  5. SCUBA divers as oceanographic samplers: The potential of dive computers to augment aquatic temperature monitoring

    PubMed Central

    Wright, Serena; Hull, Tom; Sivyer, David B.; Pearce, David; Pinnegar, John K.; Sayer, Martin D. J.; Mogg, Andrew O. M.; Azzopardi, Elaine; Gontarek, Steve; Hyder, Kieran

    2016-01-01

    Monitoring temperature of aquatic waters is of great importance, with modelled, satellite and in-situ data providing invaluable insights into long-term environmental change. However, there is often a lack of depth-resolved temperature measurements. Recreational dive computers routinely record temperature and depth, so could provide an alternate and highly novel source of oceanographic information to fill this data gap. In this study, a citizen science approach was used to obtain over 7,000 scuba diver temperature profiles. The accuracy, offset and lag of temperature records was assessed by comparing dive computers with scientific conductivity-temperature-depth instruments and existing surface temperature data. Our results show that, with processing, dive computers can provide a useful and novel tool with which to augment existing monitoring systems all over the globe, but especially in under-sampled or highly changeable coastal environments. PMID:27445104

  6. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the approach achieves 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
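
    A simplified stand-in for the change-rate computation (assuming co-registered, count-normalized baseline and follow-up volumes; the paper's pipeline includes registration and normalization steps omitted here):

```python
import numpy as np

def change_rate_map(baseline, followup, eps=1e-6):
    """Voxel-wise relative change between two co-registered SPECT volumes,
    in percent of the baseline value (illustrative CRM stand-in)."""
    baseline = np.asarray(baseline, dtype=float)
    followup = np.asarray(followup, dtype=float)
    return 100.0 * (followup - baseline) / (baseline + eps)

base = np.array([[10.0, 10.0], [20.0, 20.0]])
post = np.array([[12.0, 10.0], [16.0, 24.0]])
print(change_rate_map(base, post))   # +20%, 0%, -20%, +20%
```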

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maxwell, Don E; Ezell, Matthew A; Becklehimer, Jeff

    While sites generally have systems in place to monitor the health of Cray computers themselves, the cooling systems are often ignored until a computer failure requires investigation into the source of the failure. The Liebert XDP units used to cool the Cray XE/XK models, as well as the Cray proprietary cooling system used for the Cray XC30 models, provide data useful for health monitoring. Unfortunately, this valuable information is often available only to custom solutions not accessible by a center-wide monitoring system, or is simply ignored entirely. In this paper, methods and tools used to harvest the available monitoring data are discussed, and the implementation needed to integrate the data into a center-wide monitoring system at the Oak Ridge National Laboratory is provided.

  8. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  9. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
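
    One of the low-cost monitoring styles described, checking the quality of a deployed approximate kernel by occasionally re-running the precise version, can be sketched as follows (the approximate kernel and the sampling rate are invented for illustration; the thesis tools target OCaml and instrumented binaries, not this Python form):

```python
import random

def approx_sum(xs):
    """A deliberately approximate kernel: sums every other element and
    rescales (hypothetical stand-in for an energy-saving approximation)."""
    return 2.0 * sum(xs[::2])

def monitored(xs, kernel, exact, sample_rate=0.05):
    """Low-overhead online quality monitor: occasionally re-run the
    precise version on the same input and record the relative error, so
    quality can be tracked without paying the exact cost on every call."""
    y = kernel(xs)
    if random.random() < sample_rate:
        truth = exact(xs)
        rel_err = abs(y - truth) / (abs(truth) or 1.0)
        print(f"sampled quality check: relative error = {rel_err:.3%}")
    return y

data = [random.random() for _ in range(1000)]
for _ in range(50):
    monitored(data, approx_sum, sum)
```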

  10. Monitor Tone Generates Stress in Computer and VDT Operators: A Preliminary Study.

    ERIC Educational Resources Information Center

    Dow, Caroline; Covert, Douglas C.

    A near-ultrasonic pure tone of 15,570 Hertz generated by flyback transformers in computer and video display terminal (VDT) monitors may cause severe non-specific irritation or stress disease in operators. Women hear higher frequency sounds than men and are twice as sensitive to "too loud" noise. Pure tones at high frequencies are more…

  11. Technical Adequacy of Growth Estimates from a Computer Adaptive Test: Implications for Progress Monitoring

    ERIC Educational Resources Information Center

    Van Norman, Ethan R.; Nelson, Peter M.; Parker, David C.

    2017-01-01

    Computer adaptive tests (CATs) hold promise to monitor student progress within multitiered systems of support. However, the relationship between how long and how often data are collected and the technical adequacy of growth estimates from CATs has not been explored. Given CAT administration times, it is important to identify optimal data…

  12. Biosensor Technologies for Augmented Brain-Computer Interfaces in the Next Decades

    DTIC Science & Technology

    2012-05-13

    Keywords: augmented brain–computer interface (ABCI); biosensor; cognitive-state monitoring; electroencephalogram (EEG); human brain imaging. Manuscript received November 28, 2011; accepted December 20, … Techniques covered include functional magnetic resonance imaging (fMRI) [1], positron emission tomography (PET) [2], electroencephalograms (EEGs), and optical brain imaging techniques…

  13. An Investigation of Learner-Control Variables in Vocabulary Learning Using Traditional Instruction and Two Forms of Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    1988-01-01

    Investigates college students' ability to monitor learner-controlled vocabulary instruction when performed in traditional workbook-like tasks and in two different computer-based formats: video game and text game exercises. Suggests that developmental reading students are unable to monitor their own vocabulary development accurately. (MM)

  14. A Computational Pipeline to Improve Clinical Alarms Using a Parallel Computing Infrastructure

    ERIC Educational Resources Information Center

    Nguyen, Andrew V.

    2013-01-01

    Physicians, nurses, and other clinical staff rely on alarms from various bedside monitors and sensors to alert them when there is a change in the patient's clinical status, typically when urgent intervention is necessary. These alarms are usually embedded directly within the sensor or monitor and lack the context of the patient's medical history and…

  15. Monitoring SLAC High Performance UNIX Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC

    2005-12-15

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.

  16. HyperCard Monitor System.

    ERIC Educational Resources Information Center

    Harris, Julian; Maurer, Hermann

    An investigation into high level event monitoring within the scope of a well-known multimedia application, HyperCard--a program on the Macintosh computer--is carried out. A monitoring system is defined as a system which automatically monitors usage of some activity and gathers statistics based on what it has observed. Monitor systems can give the…

  17. Reliability and validity of a brief questionnaire to assess television viewing and computer use by middle school children.

    PubMed

    Schmitz, Kathryn H; Harnack, Lisa; Fulton, Janet E; Jacobs, David R; Gao, Shujun; Lytle, Leslie A; Van Coevering, Pam

    2004-11-01

    Sedentary behaviors, like television viewing, are positively associated with overweight among young people. To monitor national health objectives for sedentary behaviors in young adolescents, this project developed and assessed the reliability and validity of a brief questionnaire to measure weekly television viewing, usual television viewing, and computer use by middle school children. Reliability and validity of the Youth Risk Behavior Survey (YRBS) question on weekday television viewing also were examined. A brief, five-item television and computer use questionnaire was completed twice, one week apart, by 245 middle school children. To concurrently assess validity, students also completed television and computer use logs for seven days. Among all students, Spearman correlations for test-retest reliability for television viewing and computer use ranged from 0.55 to 0.68. Spearman correlations between the first questionnaire and the seven-day log produced the following results: YRBS question for weekday television viewing (0.46), weekend television viewing (0.37), average television viewing over the week (0.47), and computer use (0.39). Method-comparison analysis showed a mean difference (hours/week) between answers to questionnaire items and the log of -0.04 (1.70 standard deviation [SD]) hours for weekday television, -0.21 (2.54 SD) for weekend television, -0.09 (1.75 SD) for average television over the week, and 0.68 (1.26 SD) for computer use. The YRBS weekday television viewing question, and the newly developed questions to assess weekend television viewing, average television viewing, and computer use, produced adequate reliability and validity for surveillance of middle school students.

  18. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    PubMed

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  19. Monitors.

    ERIC Educational Resources Information Center

    Powell, David

    1984-01-01

    Provides guidelines for selecting a monitor to suit specific applications, explains the process by which graphics images are produced on a CRT monitor, and describes four types of flat-panel displays being used in the newest lap-sized portable computers. A comparison chart provides prices and specifications for over 80 monitors. (MBR)

  20. Persons with multiple disabilities select environmental stimuli through a smile response monitored via camera-based technology.

    PubMed

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Lang, Russell; Didden, Robert; Bosco, Andrea

    2011-01-01

    To assess whether two persons with multiple disabilities could use smile expressions and new camera-based microswitch technology to select environmental stimuli. Within each session, a computer system provided samples/reminders of preferred and non-preferred stimuli. The camera-based microswitch determined whether the participants produced smile expressions in relation to those samples. If they did, stimuli matching the specific samples to which they responded were presented for 20 seconds. The smile expression could be used profitably by the participants, who selected on average approximately 70% and 75%, respectively, of the preferred stimulus opportunities made available by the environment while avoiding almost all of the non-preferred stimulus opportunities. Smile expressions (a) might be an effective and rapid means of selecting preferred stimulation and (b) might develop into cognitively more elaborate forms of responding through the learning experience (i.e. their consistent association with positive/reinforcing consequences).

  1. [Sedentary behaviour of 13-year-olds and its association with selected health behaviours, parenting practices and body mass].

    PubMed

    Jodkowska, Maria; Tabak, Izabela; Oblacińska, Anna; Stalmach, Magdalena

    2013-01-01

    1. To estimate the time spent in sedentary behaviour (watching TV, using the computer, doing homework). 2. To assess the link between the total time spent watching TV, using the computer and doing homework and dietary habits, physical activity, parental practices and body mass. A cross-sectional study was conducted in Poland in 2008 among 13-year-olds (n=600). They self-reported their time spent on TV viewing, computer use and homework. Their dietary behaviours, physical activity (MVPA) and parenting practices were also self-reported. Height and weight were measured by school nurses. Descriptive statistics and correlation were used in the analysis. The mean time spent watching television on school days was 2.3 hours for girls and 2.2 for boys. Boys spent significantly more time using the computer than girls (1.8 and 1.5 hours, respectively), while girls spent longer doing homework (1.7 and 1.3 hours, respectively). Mean screen time was about 4 hours on school days and about 6 hours at weekends, and was significantly longer for boys on weekdays. Screen time was positively associated with consumption of sweets, chips, soft drinks and fast food and with eating meals while watching TV, and negatively associated with regularity of meals and parental supervision. There was no correlation between screen time and physical activity or body mass. Sedentary behaviours and physical activity are not competing behaviours among Polish teenagers, but their association with unhealthy dietary patterns may contribute to the development of obesity. Good parental practices, with supervision by both mother and father, seem to be crucial for limiting children's screen time. Parents should become aware that monitoring their children's lifestyle is a crucial element of health education for the prevention of lifestyle diseases. This is a task for both healthcare workers and educational staff.

  2. Selected micrometeorological and soil-moisture data at Amargosa Desert Research Site, an arid site near Beatty, Nye County, Nevada, 1998-2000

    USGS Publications Warehouse

    Johnson, Michael J.; Mayers, Charles J.; Andraski, Brian J.

    2002-01-01

    Selected micrometeorological and soil-moisture data were collected at the Amargosa Desert Research Site adjacent to a low-level radioactive waste and hazardous chemical waste facility near Beatty, Nev., 1998-2000. Data were collected in support of ongoing research studies to improve the understanding of hydrologic and contaminant-transport processes in arid environments. Micrometeorological data include precipitation, air temperature, solar radiation, net radiation, relative humidity, ambient vapor pressure, wind speed and direction, barometric pressure, soil temperature, and soil-heat flux. All micrometeorological data were collected using a 10-second sampling interval by data loggers that output daily mean, maximum, and minimum values, and hourly mean values. For precipitation, data output consisted of daily, hourly, and 5-minute totals. Soil-moisture data included periodic measurements of soil-water content at nine neutron-probe access tubes with measurable depths ranging from 5.25 to 29.75 meters. The computer data files included in this report contain the complete micrometeorological and soil-moisture data sets. The computer data consists of seven files with about 14 megabytes of information. The seven files are in tabular format: (1) one file lists daily mean, maximum, and minimum micrometeorological data and daily total precipitation; (2) three files list hourly mean micrometeorological data and hourly precipitation for each year (1998-2000); (3) one file lists 5-minute precipitation data; (4) one file lists mean soil-water content by date and depth at four experimental sites; and (5) one file lists soil-water content by date and depth for each neutron-probe access tube. This report highlights selected data contained in the computer data files using figures, tables, and brief discussions. Instrumentation used for data collection also is described. Water-content profiles are shown to demonstrate variability of water content with depth. Time-series data are plotted to illustrate temporal variations in micrometeorological and soil-water content data. Substantial precipitation at the end of an El Niño cycle in early 1998 resulted in measurable water penetration to a depth of 1.25 meters at one of the four experimental soil-monitoring sites.

  3. Monitoring Obstructive Sleep Apnea by means of a real-time mobile system based on the automatic extraction of sets of rules through Differential Evolution.

    PubMed

    Sannino, Giovanna; De Falco, Ivanoe; De Pietro, Giuseppe

    2014-06-01

    Real-time Obstructive Sleep Apnea (OSA) episode detection and monitoring are important for society in terms of improving the health of the general population and reducing mortality and healthcare costs. Currently, to diagnose OSA, patients undergo polysomnography (PSG), a complicated and invasive test performed in a specialized center, involving many sensors and wires. Each patient is thus required to stay in the same position throughout one night, restricting their movements. This paper proposes an easy, cheap, and portable approach for the monitoring of patients with OSA, which collects single-channel ElectroCardioGram (ECG) data only. It is easy from the patient's point of view because only one wearable sensor is required, the patient is not restricted to keeping the same position all night long, and the detection and monitoring can be carried out anywhere through a mobile device. Our approach is based on the automatic extraction, from a database containing information about the monitored patient, of explicit knowledge in the form of a set of IF…THEN rules containing typical parameters derived from Heart Rate Variability (HRV) analysis. The extraction is carried out off-line by means of a Differential Evolution algorithm. This set of rules can then be exploited in the real-time mobile monitoring system developed at our laboratory: the ECG data is gathered by a wearable sensor and sent to a mobile device, where it is processed in real time. HRV-related parameters are computed from this data and, if their values activate any of the rules describing the occurrence of OSA, an alarm is automatically produced. This approach has been tested on a well-known literature database of OSA patients. The numerical results show its effectiveness in terms of accuracy, sensitivity, and specificity, and the achieved sets of rules evidence the user-friendliness of the approach. Furthermore, the method is compared against other well-known classifiers, and its discrimination ability is shown to be higher. Copyright © 2014 Elsevier Inc. All rights reserved.
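
    The form of the extracted knowledge is easy to picture; below is a hand-written stand-in with invented thresholds (the paper's actual rules and their HRV parameters are produced automatically by Differential Evolution):

```python
def classify_minute(hrv):
    """Apply hand-written IF...THEN rules to per-minute HRV features.

    hrv: dict with e.g. mean normal-to-normal interval 'mean_nn' (s),
    its standard deviation 'sdnn' (s), and the low/high-frequency
    power ratio 'lf_hf'. Thresholds here are purely illustrative.
    """
    if hrv["lf_hf"] > 2.5 and hrv["sdnn"] > 0.10:
        return "apnea"
    if hrv["mean_nn"] > 1.1 and hrv["lf_hf"] > 2.0:
        return "apnea"
    return "normal"

print(classify_minute({"mean_nn": 0.85, "sdnn": 0.04, "lf_hf": 1.2}))  # normal
print(classify_minute({"mean_nn": 0.90, "sdnn": 0.12, "lf_hf": 3.1}))  # apnea
```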

  4. Raspberry Pi in-situ network monitoring system of groundwater flow and temperature integrated with OpenGeoSys

    NASA Astrophysics Data System (ADS)

    Park, Chan-Hee; Lee, Cholwoo

    2016-04-01

    The Raspberry Pi series consists of low-cost, smaller-than-credit-card-sized computers to which various operating systems, such as Linux and recently even Windows 10, have been ported. Thanks to mass production and rapid technology development, the price of the various sensors that can be attached to a Raspberry Pi has been dropping at an increasing speed. The device can therefore be an economical choice as a small portable computer for monitoring temporal hydrogeological data in the field. In this study, we present a Raspberry Pi system that measures the flow rate and temperature of groundwater on site, stores the data in a MySQL database, and produces interactive figures and tables, such as Google Charts online or Bokeh plots offline, for further monitoring and analysis. Since all the data can be monitored over the Internet, any computer or mobile device serves as a convenient monitoring tool. The measured data are further integrated with OpenGeoSys, a hydrogeological model that has also been ported to the Raspberry Pi series. This enables on-site hydrogeological modeling fed by temporal sensor data to meet various needs.
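
    A minimal sketch of the sensing-to-database loop (assumptions: a placeholder sensor driver, the PyMySQL client, and invented credentials and table layout; the paper does not publish its schema):

```python
import time
import pymysql   # pure-Python MySQL client, runs fine on a Raspberry Pi

def read_flow_and_temperature():
    # Placeholder: replace with the actual driver for the attached sensors.
    return 1.0, 12.5   # flow (L/min), temperature (deg C)

conn = pymysql.connect(host="localhost", user="pi",
                       password="secret", database="groundwater")
with conn.cursor() as cur:
    cur.execute("""CREATE TABLE IF NOT EXISTS readings (
                     ts TIMESTAMP, flow_lpm DOUBLE, temp_c DOUBLE)""")
conn.commit()

while True:
    flow, temp = read_flow_and_temperature()
    with conn.cursor() as cur:
        cur.execute("INSERT INTO readings VALUES (NOW(), %s, %s)",
                    (flow, temp))
    conn.commit()
    time.sleep(60)   # one sample per minute
```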

  5. Guidelines and standard procedures for continuous water-quality monitors: Station operation, record computation, and data reporting

    USGS Publications Warehouse

    Wagner, Richard J.; Boulger, Robert W.; Oblinger, Carolyn J.; Smith, Brett A.

    2006-01-01

    The U.S. Geological Survey uses continuous water-quality monitors to assess the quality of the Nation's surface water. A common monitoring-system configuration for water-quality data collection is the four-parameter monitoring system, which collects temperature, specific conductance, dissolved oxygen, and pH data. Such systems also can be configured to measure other properties, such as turbidity or fluorescence. Data from sensors can be used in conjunction with chemical analyses of samples to estimate chemical loads. The sensors that are used to measure water-quality field parameters require careful field observation, cleaning, and calibration procedures, as well as thorough procedures for the computation and publication of final records. This report provides guidelines for site- and monitor-selection considerations; sensor inspection and calibration methods; field procedures; data evaluation, correction, and computation; and record-review and data-reporting processes, which supersede the guidelines presented previously in U.S. Geological Survey Water-Resources Investigations Report WRIR 00-4252. These procedures have evolved over the past three decades, and the process continues to evolve with newer technologies.

  6. Computational metrology: enabling full-lot high-density fingerprint information without adding wafer metrology budget, and driving improved monitoring and process control

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Sok; Hyun, Min-Sung; Ju, Jae-Wuk; Kim, Young-Sik; Lambregts, Cees; van Rhee, Peter; Kim, Johan; McNamara, Elliott; Tel, Wim; Böcker, Paul; Oh, Nang-Lyeom; Lee, Jun-Hyung

    2018-03-01

    Computational metrology has been proposed as the way forward to meet the need for increased metrology density, resulting from extended correction capabilities, without adding actual metrology budget. By exploiting TWINSCAN-based metrology information, dense overlay fingerprints can be computed for every wafer. This extended metrology dataset enables new use cases, such as monitoring and control based on fingerprints for every wafer of the lot. This paper gives a detailed description of the approach, discusses the accuracy of the computed fingerprints, and shows results obtained in a DRAM HVM manufacturing environment. An outlook for improvements and extensions is also shared.

  7. Ubiquitous computing in sports: A review and analysis.

    PubMed

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade, and this development has propagated into applied sport science and everyday life. This work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques, and performs a detailed analysis of new technological developments. Sensors for position and motion detection, as well as for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend: the development of smart and intelligent systems for a wide range of applications, from model-based posture recognition to context-awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed, and selected tools for monitoring rule compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis will in future shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and the standardisation of measurement and transmission protocols.

  8. Dynamic virtual machine allocation policy in cloud computing complying with service level agreement using CloudSim

    NASA Astrophysics Data System (ADS)

    Aneri, Parikh; Sumathy, S.

    2017-11-01

    Cloud computing provides services over the Internet, supplying application resources and data to users on demand. Cloud computing is based on a consumer-provider model: the provider offers resources that consumers access in order to build applications according to their needs. A cloud data center is a large pool of shared resources for cloud users to access. Virtualization is the heart of the cloud computing model: it provides virtual machines configured per application, and applications are free to choose their own configuration. On the one hand there is a huge number of resources, and on the other hand a huge number of requests that must be served effectively. Resource allocation and scheduling policies therefore play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy based on the Hungarian algorithm, combined with a monitor component. The monitor component helps to increase cloud resource utilization by tracking the state of the Hungarian algorithm and altering it using artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
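
    For the assignment step itself, a standard Hungarian-algorithm solver can be used; a sketch with an invented VM-to-host cost matrix (the paper's cost model and its CloudSim integration are not reproduced here):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: rows are incoming VM requests, columns are
# hosts; each entry estimates the load imbalance caused by a placement.
cost = np.array([
    [4.0, 2.0, 8.0],
    [4.0, 3.0, 7.0],
    [3.0, 1.0, 6.0],
])

# The Hungarian algorithm finds the one-to-one assignment of VMs to
# hosts that minimizes the total cost.
rows, cols = linear_sum_assignment(cost)
for vm, host in zip(rows, cols):
    print(f"VM {vm} -> host {host} (cost {cost[vm, host]})")
print("total cost:", cost[rows, cols].sum())
```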

  9. Use of a continuous glucose sensor in an extracorporeal life support circuit.

    PubMed

    Steil, Garry M; Alexander, Jamin; Papas, Alexandra; Langer, Monica; Modi, Biren P; Piper, Hannah; Jaksic, Tom; Gottlieb, Rebecca; Agus, Michael S D

    2011-01-01

    Standard care for infants on extracorporeal life support (ECLS) relies on intermittent measurement of blood glucose (BG); however, this can lead to significant changes in BG that go unrecognized for several hours. The present study was designed to assess the performance and clinical applicability of a subcutaneous glucose sensor technology modified for use as a blood-contacting sensor within the ECLS circuit. Twelve children, aged 3 years or less, requiring ECLS support were studied. Three continuous glucose sensors (Medtronic MiniMed) were inserted into hubs placed in line with the ECLS circuit. Blood glucose was assessed with a laboratory analyzer (BG_LAB; Bayer Rapidlab 860) approximately every 5 h (mean 4.9 ± 3.3 h), with more frequent samples obtained with a bedside monitor (HemoCue) as needed. Sensor current (I_SIG) was transmitted to a laptop computer and retrospectively calibrated using BG_LAB. Sensor performance was assessed by mean absolute relative difference (MARD), linear regression slope and intercept, and correlation, all with BG_LAB as reference. BG_LAB averaged 107.6 ± 36.4 mg/dl (mean ± standard deviation), ranging from 58 to 366 mg/dl. The MARD was 11.4%, with linear regression slope (0.86 ± 0.030) and intercept (9.0 ± 3.2 mg/dl) different from 1 and 0, respectively (p < .05), and correlation (r² = 0.76; p < .001). The system was not associated with any adverse events, and placement and removal of sensors in the hubs was easily accomplished. Instances in which more frequent BG values were obtained using the bedside HemoCue monitor (BG_HEMO) showed the sensor to respond rapidly to changes. We conclude that continuous sensors can be adapted for use in an ECLS circuit with accuracy similar to or better than that achieved with the subcutaneous site. Continuous glucose monitoring in this population can rapidly detect changes in BG that would not otherwise be observed. Further studies will be needed to assess the benefit of continuous glucose monitoring in this population. © 2010 Diabetes Technology Society.
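
    The reported agreement statistics are straightforward to compute; a sketch with invented paired readings (mg/dl):

```python
import numpy as np

def sensor_agreement(sensor, reference):
    """Agreement statistics of the kind used in the study: mean absolute
    relative difference (MARD, %), linear regression slope/intercept,
    and r^2, with the laboratory analyzer as reference."""
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    mard = 100.0 * np.mean(np.abs(sensor - reference) / reference)
    slope, intercept = np.polyfit(reference, sensor, 1)
    r2 = np.corrcoef(reference, sensor)[0, 1] ** 2
    return mard, slope, intercept, r2

ref = np.array([60, 90, 120, 180, 240, 300], dtype=float)   # invented
sen = np.array([55, 95, 110, 170, 255, 280], dtype=float)   # invented
print(sensor_agreement(sen, ref))
```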

  10. Using cross correlations of turbulent flow-induced ambient vibrations to estimate the structural impulse response. Application to structural health monitoring.

    PubMed

    Sabra, Karim G; Winkel, Eric S; Bourgoyne, Dwayne A; Elbing, Brian R; Ceccio, Steve L; Perlin, Marc; Dowling, David R

    2007-04-01

    It has been demonstrated theoretically and experimentally that an estimate of the impulse response (or Green's function) between two receivers can be obtained from the cross correlation of diffuse wave fields at these two receivers in various environments and frequency ranges: ultrasonics, civil engineering, underwater acoustics, and seismology. This result provides a means for structural monitoring using ambient structure-borne noise only, without the use of active sources. This paper presents experimental results obtained from flow-induced random vibration data recorded by pairs of accelerometers mounted within a flat plate or hydrofoil in the test section of the U.S. Navy's William B. Morgan Large Cavitation Channel. The experiments were conducted at high Reynolds number (Re > 50 million) with the primary excitation source being turbulent boundary layer pressure fluctuations on the upper and lower surfaces of the plate or foil. Identical deterministic time signatures emerge from the noise cross-correlation function computed via robust and simple processing of noise measured on different days by a pair of passive sensors. These time signatures are used to determine and/or monitor the structural response of the test models from a few hundred to a few thousand Hertz.
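
    The core computation is a long-time average of the cross correlation between two sensors, from which the deterministic signature emerges. A minimal sketch with synthetic noise standing in for the flow-induced vibrations follows; the sampling rate and delay are assumptions.

        # Sketch: estimating a repeatable time signature from the cross
        # correlation of ambient noise at two sensors (synthetic data).
        import numpy as np

        fs = 1000                                   # Hz, assumed
        rng = np.random.default_rng(0)
        a = rng.standard_normal(60 * fs)            # 60 s of noise at sensor A
        b = np.roll(a, 25) + 0.5 * rng.standard_normal(60 * fs)  # delayed + noise

        max_lag = 100
        lags = np.arange(-max_lag, max_lag + 1)
        core = a[max_lag:-max_lag]
        xcorr = np.array([np.dot(core, b[max_lag + l : len(b) - max_lag + l])
                          for l in lags])
        print("peak at lag", lags[np.argmax(xcorr)], "samples")  # ~25 here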

  11. A strip chart recorder pattern recognition tool kit for Shuttle operations

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.

    1993-01-01

    During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems, such as electrical power; guidance, navigation, and control; and propulsion, by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies, coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations, make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture; to evaluate the performance of several pattern recognition techniques, including those based on cross-correlation, neural networks, and statistical methods; to understand the interplay between an advanced automation application and human controllers so as to enhance utility; and to identify capabilities needed in a more general-purpose tool kit.

  12. Monitoring Linked Epidemics: The Case of Tuberculosis and HIV

    PubMed Central

    Sánchez, María S.; Lloyd-Smith, James O.; Getz, Wayne M.

    2010-01-01

    Background The tight epidemiological coupling between HIV and its associated opportunistic infections leads to challenges and opportunities for disease surveillance. Methodology/Principal Findings We review efforts of WHO and collaborating agencies to track and fight the TB/HIV co-epidemic, and discuss modeling—via mathematical, statistical, and computational approaches—as a means to identify disease indicators designed to integrate data from linked diseases in order to characterize how co-epidemics change in time and space. We present R(TB/HIV), an index comparing changes in TB incidence relative to HIV prevalence, and use it to identify those sub-Saharan African countries with outlier TB/HIV dynamics. R(TB/HIV) can also be used to predict epidemiological trends, investigate the coherency of reported trends, and cross-check the anticipated impact of public health interventions. Identifying the cause(s) responsible for anomalous R(TB/HIV) values can reveal information crucial to the management of public health. Conclusions/Significance We frame our suggestions for integrating and analyzing co-epidemic data within the context of global disease monitoring. Used routinely, joint disease indicators such as R(TB/HIV) could greatly enhance the monitoring and evaluation of public health programs. PMID:20098716

  13. Job monitoring on DIRAC for Belle II distributed computing

    NASA Astrophysics Data System (ADS)

    Kato, Yuji; Hayasaka, Kiyoshi; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo

    2015-12-01

    We developed a monitoring system for Belle II distributed computing, which consists of active and passive methods. In this paper we describe the passive monitoring system, in which information stored in the DIRAC database is processed and visualized. We divide the DIRAC workload management flow into steps and store characteristic variables that indicate issues. These variables are chosen carefully based on our operational experience and then visualized. As a result, we are able to detect issues effectively. Finally, we discuss future development toward automated log analysis, notification of issues, and disabling of problematic sites.

  14. Real-Time Monitoring of Scada Based Control System for Filling Process

    NASA Astrophysics Data System (ADS)

    Soe, Aung Kyaw; Myint, Aung Naing; Latt, Maung Maung; Theingi

    2008-10-01

    This paper presents a design for real-time monitoring of a filling system using Supervisory Control and Data Acquisition (SCADA). The production process is monitored in real time using Visual Basic .NET programming under Visual Studio 2005, without dedicated SCADA software. The software integrators are programmed to obtain the required information for the configuration screens. The components are simulated on the computer screen, with a parallel port connecting the computer to the filling devices. Programs for real-time simulation of the filling process from the pure drinking water industry are provided.

  15. Method and system for redundancy management of distributed and recoverable digital control system

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2012-01-01

    A method and system for redundancy management is provided for a distributed and recoverable digital control system. The method uses unique redundancy management techniques to achieve recovery and restoration of redundant elements to full operation in an asynchronous environment. The system includes a first computing unit comprising a pair of redundant computational lanes for generating redundant control commands. One or more internal monitors detect data errors in the control commands, and provide a recovery trigger to the first computing unit. A second redundant computing unit provides the same features as the first computing unit. A first actuator control unit is configured to provide blending and monitoring of the control commands from the first and second computing units, and to provide a recovery trigger to each of the first and second computing units. A second actuator control unit provides the same features as the first actuator control unit.

  16. Computer mouse movement patterns: A potential marker of mild cognitive impairment.

    PubMed

    Seelye, Adriana; Hagler, Stuart; Mattek, Nora; Howieson, Diane B; Wild, Katherine; Dodge, Hiroko H; Kaye, Jeffrey A

    2015-12-01

    Subtle changes in cognitively demanding activities occur in MCI but are difficult to assess with conventional methods. In an exploratory study, we examined whether patterns of computer mouse movements obtained from routine home computer use discriminated between older adults with and without MCI. Participants were 42 cognitively intact older adults and 20 older adults with MCI enrolled in a longitudinal study of in-home monitoring technologies. Mouse pointer movement variables were computed during one week of routine home computer use using algorithms that identified and characterized mouse movements within each computer use session. MCI was associated with making significantly fewer total mouse moves (p < .01), and with making mouse movements that were more variable, less efficient, and with longer pauses between movements (p < .05). Mouse movement measures were significantly associated with several cognitive domains (p's < .01-.05). Remotely monitored computer mouse movement patterns are a potential early marker of real-world cognitive changes in MCI.
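
    Two features of the kind described above, movement efficiency and inter-movement pauses, can be sketched as follows; the feature definitions and the sample data are illustrative assumptions, not the study's algorithms.

        # Sketch: two illustrative mouse-movement features (path efficiency
        # and pause durations). Definitions here are assumptions, not the
        # authors' published feature set.
        import numpy as np

        # (t, x, y) samples from one mouse movement, hypothetical
        t = np.array([0.00, 0.05, 0.10, 0.40, 0.45, 0.50])
        x = np.array([0.0, 3.0, 6.0, 6.0, 9.0, 12.0])
        y = np.array([0.0, 1.0, 2.0, 2.0, 2.5, 3.0])

        steps = np.hypot(np.diff(x), np.diff(y))
        path_len = steps.sum()
        straight = np.hypot(x[-1] - x[0], y[-1] - y[0])
        efficiency = straight / path_len          # 1.0 = perfectly direct

        gaps = np.diff(t)
        pauses = gaps[gaps > 0.2]                 # >200 ms without movement
        print(f"efficiency = {efficiency:.2f}, pauses = {pauses}")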

  17. Monitoring for Human Papillomavirus Vaccine Impact Among Gay, Bisexual, and Other Men Who Have Sex With Men—United States, 2012–2014

    PubMed Central

    Meites, Elissa; Gorbach, Pamina M.; Gratzer, Beau; Panicker, Gitika; Steinau, Martin; Collins, Tom; Parrish, Adam; Randel, Cody; McGrath, Mark; Carrasco, Steven; Moore, Janell; Zaidi, Akbar; Braxton, Jim; Kerndt, Peter R.; Unger, Elizabeth R.; Crosby, Richard A.; Markowitz, Lauri E.

    2016-01-01

    Background Gay, bisexual, and other men who have sex with men (MSM) are at high risk for human papillomavirus (HPV) infection; vaccination is recommended for US males, including MSM through age 26 years. We assessed evidence of HPV among vaccine-eligible MSM and transgender women to monitor vaccine impact. Methods During 2012–2014, MSM aged 18–26 years at select clinics completed a computer-assisted self-interview regarding sexual behavior, human immunodeficiency virus (HIV) status, and vaccinations. Self-collected anal swab and oral rinse specimens were tested for HPV DNA (37 types) by L1 consensus polymerase chain reaction; serum was tested for HPV antibodies (4 types) by a multiplexed virus-like particle–based immunoglobulin G direct enzyme-linked immunosorbent assay. Results Among 922 vaccine-eligible participants, the mean age was 23 years, and the mean number of lifetime sex partners was 37. Among 834 without HIV infection, any anal HPV was detected in 69.4% and any oral HPV in 8.4%, yet only 8.5% had evidence of exposure to all quadrivalent vaccine types. In multivariate analysis, HPV prevalence varied significantly (P < .05) by HIV status, sexual orientation, and lifetime number of sex partners, but not by race/ethnicity. Discussion Most young MSM lacked evidence of current or past infection with all vaccine-type HPV types, suggesting that they could benefit from vaccination. The impact of vaccination among MSM may be assessed by monitoring HPV prevalence, including in self-collected specimens. PMID:27296847

  18. Stroke patients and their attitudes toward mHealth monitoring to support blood pressure control and medication adherence.

    PubMed

    Jenkins, Carolyn; Burkett, Nina-Sarena; Ovbiagele, Bruce; Mueller, Martina; Patel, Sachin; Brunner-Jackson, Brenda; Saulson, Raelle; Treiber, Frank

    2016-05-01

    Mobile health, or mHealth, has increasingly been signaled as an effective means to expedite communication and improve medical regimen adherence, especially for patients with chronic health conditions such as stroke. However, there is a lack of data on the attitudes of stroke patients toward mHealth. Such information will aid in identifying key indicators for the feasibility and optimal implementation of mHealth to prevent and/or decrease rates of secondary stroke. Our objective was to ascertain stroke patients' attitudes toward using mobile phone enabled blood pressure (BP) monitoring and medication adherence, and to identify factors that modulate these attitudes. Sixty stroke patients received a brief demonstration of mHealth devices to assist with BP control and medication adherence, followed by a survey to evaluate willingness to use this technology. The 60 participants had a mean age of 57 years, were 43.3% male, and were 53.3% White. With respect to telecommunication prevalence, 93.3% owned a cellular device and 25% owned a smartphone. About 70% owned a working computer. Regarding attitudes, 85% felt comfortable with a doctor or nurse using mHealth technologies to monitor personal health information, 78.3% believed mHealth would help remind them to follow doctor's directions, and 83.3% were confident that technology could effectively be used to communicate with health care providers for medical needs. Mobile device use is high among stroke patients, and they are amenable to mHealth for communication and for assistance in adhering to their medical regimens. More research is needed to explore the usefulness of this technology in larger stroke populations.

  19. GPS Monitor Station Upgrade Program at the Naval Research Laboratory

    NASA Technical Reports Server (NTRS)

    Galysh, Ivan J.; Craig, Dwin M.

    1996-01-01

    One of the measurements made by the Global Positioning System (GPS) monitor stations is the continuous pseudo-range of all passing GPS satellites. The pseudo-range contains GPS and monitor station clock errors as well as GPS satellite navigation errors. Currently, the time at a GPS monitor station is obtained from the GPS constellation and has an inherent inaccuracy as a result. Improved timing accuracy at the GPS monitoring stations will improve GPS performance. The US Naval Research Laboratory (NRL) is developing hardware and software for the GPS monitor station upgrade program to improve the monitor station clock accuracy. This upgrade will allow a method, independent of the GPS satellite constellation, of measuring and correcting monitor station time to US Naval Observatory (USNO) time. The hardware consists of a high-performance atomic cesium frequency standard (CFS) and a computer, which is used to ensemble the CFS with the two CFSs currently located at the monitor station by use of a dual-mixer system. The dual-mixer system achieves phase measurements between the high-performance CFS and the existing monitor station CFSs to within 400 femtoseconds. Time transfer between USNO and a given monitor station is achieved via a two-way satellite time transfer modem. The computer at the monitor station disciplines the CFS based on a comparison of one pulse per second sent from the master site at USNO. The monitor station computer is also used to perform housekeeping functions, as well as recording the health status of all three CFSs. This information is sent to the USNO through the time transfer modem. Laboratory time synchronization results in the sub-nanosecond range have been observed, along with the ability to maintain the monitor station CFS frequency to within 3.0 × 10^-14 of the master site at USNO.

  20. Universal Batch Steganalysis

    DTIC Science & Technology

    2014-06-01

    in large-scale datasets such as might be obtained by monitoring a corporate network or social network. Identifying guilty actors, rather than payload-carrying objects, is entirely novel in steganalysis... implementation using Compute Unified Device Architecture (CUDA) on NVIDIA graphics cards. The key to good performance is to combine computations so that

  1. Item-Specific Adaptation and the Conflict-Monitoring Hypothesis: A Computational Model

    ERIC Educational Resources Information Center

    Blais, Chris; Robidoux, Serje; Risko, Evan F.; Besner, Derek

    2007-01-01

    Comments on articles by Botvinick et al. and Jacob et al. M. M. Botvinick, T. S. Braver, D. M. Barch, C. S. Carter, and J. D. Cohen (2001) implemented their conflict-monitoring hypothesis of cognitive control in a series of computational models. The authors of the current article first demonstrate that M. M. Botvinick et al.'s (2001)…

  2. Curriculum-Based Measurement: Developing a Computer-Based Assessment Instrument for Monitoring Student Reading Progress on Multiple Indicators

    ERIC Educational Resources Information Center

    Forster, Natalie; Souvignier, Elmar

    2011-01-01

    The purpose of this study was to examine the technical adequacy of a computer-based assessment instrument which is based on hierarchical models of text comprehension for monitoring student reading progress following the Curriculum-Based Measurement (CBM) approach. At intervals of two weeks, 120 third-grade students finished eight CBM tests. To…

  3. Exploiting Analytics Techniques in CMS Computing Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonacorsi, D.; Kuznetsov, V.; Magini, N.

    The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining efforts across all this information have rarely been undertaken, but they are of crucial importance for a better understanding of how CMS achieved successful operations, and for reaching an adequate and adaptive modelling of CMS operations that allows detailed optimizations and eventually a prediction of system behaviours. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced with the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.
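
    The flavor of such a MapReduce aggregation, counting accesses per dataset, can be sketched in a few lines; the record format is hypothetical, and this pure-Python stand-in is not the CMS Hadoop code.

        # Sketch: MapReduce-style aggregation of dataset-access records.
        # Record fields are hypothetical; a real job would run on Hadoop.
        from collections import Counter
        from functools import reduce

        records = [
            {"dataset": "/A/RAW", "site": "T1_US"},
            {"dataset": "/B/AOD", "site": "T2_DE"},
            {"dataset": "/A/RAW", "site": "T2_IT"},
        ]

        def mapper(rec):                 # emit (key, 1) pairs
            return [(rec["dataset"], 1)]

        def reducer(acc, pair):          # sum counts per key
            acc[pair[0]] += pair[1]
            return acc

        pairs = [p for rec in records for p in mapper(rec)]
        counts = reduce(reducer, pairs, Counter())
        print(counts.most_common())      # e.g. [('/A/RAW', 2), ('/B/AOD', 1)]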

  4. Is Technology-Mediated Parental Monitoring Related to Adolescent Substance Use?

    PubMed

    Rudi, Jessie; Dworkin, Jodi

    2018-01-03

    Prevention researchers have identified parental monitoring, which leads to parental knowledge, as a protective factor against adolescent substance use. In today's digital society, parental monitoring can occur through technology-mediated communication methods such as text messaging, email, and social networking sites. The current study aimed to identify patterns, or clusters, of in-person and technology-mediated monitoring behaviors, and to examine differences between the patterns (clusters) in adolescent substance use. Cross-sectional survey data were collected from 289 parents of adolescents using Facebook and Amazon Mechanical Turk (MTurk). Cluster analyses were computed to identify patterns of in-person and technology-mediated monitoring behaviors, and chi-square analyses were computed to examine differences in substance use between the identified clusters. Three monitoring clusters were identified: a moderate in-person and moderate technology-mediated monitoring cluster (moderate-moderate), a high in-person and high technology-mediated monitoring cluster (high-high), and a high in-person and low technology-mediated monitoring cluster (high-low). A higher frequency of technology-mediated parental monitoring was not associated with lower levels of substance use. Results show that higher levels of technology-mediated parental monitoring may not be associated with lower adolescent substance use.

  5. A microbased shared virtual world prototype

    NASA Technical Reports Server (NTRS)

    Pitts, Gerald; Robinson, Mark; Strange, Steve

    1993-01-01

    Virtual reality (VR) allows sensory immersion in, and interaction with, a computer-generated environment. The user adopts a physical interface with the computer, through input/output devices such as a head-mounted display, data glove, mouse, keyboard, or monitor, to experience an alternate universe. What this means is that the computer generates an environment which, in its ultimate extension, becomes indistinguishable from the real world. 'Imagine a wraparound television with three-dimensional programs, including three-dimensional sound, and solid objects that you can pick up and manipulate, even feel with your fingers and hands.... 'Imagine that you are the creator as well as the consumer of your artificial experience, with the power to use a gesture or word to remold the world you see and hear and feel. That part is not fiction... three-dimensional computer graphics, input/output devices, computer models that constitute a VR system make it possible, today, to immerse yourself in an artificial world and to reach in and reshape it.' The goal of our research was to propose a feasibility experiment in the construction of a networked virtual reality system, making use of current personal computer (PC) technology. The prototype was built using the Borland C compiler, running on an IBM 486 33 MHz and a 386 33 MHz. Each game is currently represented as an IPX client on a non-dedicated Novell server. We initially posed two questions: (1) Is there a need for networked virtual reality? (2) In what ways can the technology be made available to the most people possible?

  6. A Wireless Monitoring Sub-nA Resolution Test Platform for Nanostructure Sensors

    PubMed Central

    Jang, Chi Woong; Byun, Young Tae; Lee, Taikjin; Woo, Deok Ha; Lee, Seok; Jhon, Young Min

    2013-01-01

    We have constructed a wireless monitoring test platform with a sub-nA resolution signal amplification/processing circuit (SAPC) and a wireless communication network to test the real-time remote monitoring of the signals from carbon nanotube (CNT) sensors. The operation characteristics of the CNT sensors can also be measured via the I(SD)-V(SD) curve with the SAPC. The SAPC signals are transmitted to a personal computer by Bluetooth communication and the signals from the computer are transmitted to smart phones by Wi-Fi communication, in such a way that the signals from the sensors can be remotely monitored through a web browser. Successful remote monitoring of signals from a CNT sensor was achieved with the wireless monitoring test platform for detection of 0.15% methanol vapor with 0.5 nA resolution and a 7 Hz sampling rate. PMID:23783735

  7. 40 CFR 1042.110 - Recording reductant use and other diagnostic functions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) The onboard computer log must record in nonvolatile computer memory all incidents of engine operation... such operation in nonvolatile computer memory. You are not required to monitor NOX concentrations...

  8. Subtlenoise: sonification of distributed computing operations

    NASA Astrophysics Data System (ADS)

    Love, P. A.

    2015-12-01

    The operation of distributed computing systems requires comprehensive monitoring to ensure reliability and robustness. There are two components found in most monitoring systems: one being visually rich time-series graphs and the other being notification systems for alerting operators under certain pre-defined conditions. In this paper the sonification of monitoring messages is explored using an architecture that fits easily within existing infrastructures based on mature open-source technologies such as ZeroMQ, Logstash, and SuperCollider (a synth engine). Message attributes are mapped onto audio attributes based on a broad classification of the message (continuous or discrete metrics) while keeping the audio stream subtle in nature. The benefits of audio rendering are described in the context of distributed computing operations and may provide a less intrusive way to understand the operational health of these systems.
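
    A rough sketch of the message-to-audio mapping layer is shown below, publishing monitoring events over ZeroMQ with attributes mapped to pitch and amplitude; the field names and the pitch mapping are assumptions for illustration, not the Subtlenoise implementation.

        # Sketch: publishing monitoring events over ZeroMQ with message
        # attributes mapped to audio attributes. Field names and mapping
        # are assumed; a synth engine would subscribe and render the audio.
        import json
        import zmq

        PITCH = {"job_done": 440.0, "job_failed": 220.0, "transfer": 330.0}  # Hz

        ctx = zmq.Context()
        pub = ctx.socket(zmq.PUB)
        pub.bind("tcp://*:5556")

        def emit(event_class, value):
            """Map a monitoring message onto audio attributes and publish."""
            msg = {
                "pitch": PITCH.get(event_class, 262.0),   # default: middle C
                "amplitude": min(1.0, value / 100.0),     # scale metric to volume
            }
            pub.send_string(json.dumps(msg))

        emit("job_done", 42)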

  9. Potential of mean force for ion pairs in non-aqueous solvents. Comparison of polarizable and non-polarizable MD simulations

    NASA Astrophysics Data System (ADS)

    Odinokov, A. V.; Leontyev, I. V.; Basilevsky, M. V.; Petrov, N. Ch.

    2011-01-01

    Potentials of mean force (PMF) are calculated for two model ion pairs in two non-aqueous solvents. Standard non-polarizable molecular dynamics simulation (NPMD) and approximate polarizable simulation (PMD) are implemented and compared as tools for monitoring PMF profiles. For the polar solvent (dimethylsulfoxide, DMSO) the PMF generated in terms of the NPMD reproduces fairly well the refined PMD-PMF profile. For the non-polar solvent (benzene) the conventional NPMD computation proves to be deficient. The validity of the correction found in terms of the approximate PMD approach is verified by its comparison with the result of the explicit PMD computation in benzene. The shapes of the PMF profiles in DMSO and in benzene are quite different. In DMSO, owing to dielectric screening, the PMF presents a flat plot with a shallow minimum positioned in the vicinity of the van der Waals contact of the ion pair. For the benzene case, the observed minimum proves to be unexpectedly deep, which manifests the formation of a tightly bound contact ion pair. This remarkable effect arises owing to the strong electrostatic interaction that is incompletely screened by a non-polar medium. The PMFs for the binary benzene/DMSO mixtures display intermediate behaviour depending on the DMSO content.

  10. An Optimal Parameterization Framework for Infrasonic Tomography of the Stratospheric Winds Using Non-Local Sources

    DOE PAGES

    Blom, Philip Stephen; Marcillo, Omar Eduardo

    2016-12-05

    A method is developed to apply acoustic tomography to a localized network of infrasound arrays, with the intention of monitoring the atmospheric state in the region around the network using non-local sources, without requiring knowledge of the precise source location or the non-local atmospheric state. Closely spaced arrays provide a means to estimate phase velocities of signals that can place limiting bounds on certain characteristics of the atmosphere. Larger spacing between such clusters provides a means to estimate celerity from propagation times along multiple unique stratospherically or thermospherically ducted propagation paths and to compute more precise estimates of the atmospheric state. In order to avoid the commonly encountered complex, multimodal distributions for parametric atmosphere descriptions, and to maximize the computational efficiency of the method, an optimal parametrization framework is constructed. This framework identifies the ideal combination of parameters for tomography studies in specific regions of the atmosphere, and statistical model selection analysis shows that high-quality corrections to the middle atmosphere winds can be obtained using as few as three parameters. Lastly, comparison of the resulting estimates for synthetic data sets shows qualitative agreement between the middle atmosphere winds and those estimated from infrasonic travel-time observations.

  11. Imaging Breathing Rate in the CO2 Absorption Band.

    PubMed

    Fei, Jin; Zhu, Zhen; Pavlidis, Ioannis

    2005-01-01

    Following up on our previous work, we have developed one more non-contact method to measure human breathing rate. We have retrofitted our Mid-Wave Infra-Red (MWIR) imaging system with a narrow band-pass filter in the CO2 absorption band (4.3 µm). This improves the contrast between the foreground (i.e., expired air) and the background (e.g., a wall). Based on the radiation information within the breath flow region, we obtain the mean dynamic thermal signal. This signal is quasi-periodic due to the interleaving of high and low intensities corresponding to expirations and inspirations, respectively. We sample the signal at a constant rate and then determine the breathing frequency through Fourier analysis. We have performed experiments on 9 subjects at distances ranging from 6-8 ft. We compared the breathing rate computed by our novel method with ground-truth measurements obtained via a traditional contact device (PowerLab/4SP from ADInstruments with an abdominal transducer). The results show high correlation between the two modalities. For the first time, we report a Fourier-based breathing rate computation method on a MWIR signal in the CO2 absorption band. The method opens the way for desktop, unobtrusive monitoring of an important vital sign, that is, breathing rate. It may find widespread applications in preventive medicine as well as sustained physiological monitoring of subjects suffering from chronic ailments.
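
    The Fourier step described above reduces to locating the dominant frequency of the mean thermal signal within a plausible breathing band; a sketch over a synthetic signal (an assumption, standing in for MWIR data) follows.

        # Sketch: Fourier-based breathing-rate estimation from a mean
        # thermal signal. The signal is synthetic (0.25 Hz = 15 breaths/min
        # plus noise), not MWIR data; the frame rate is assumed.
        import numpy as np

        fs = 30.0                                   # frames per second
        t = np.arange(0, 60, 1 / fs)                # 60 s of samples
        signal = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)

        band = (freqs > 0.1) & (freqs < 1.0)        # plausible breathing band
        f_breath = freqs[band][np.argmax(spectrum[band])]
        print(f"breathing rate ~ {f_breath * 60:.1f} breaths/min")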

  12. Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications

    NASA Astrophysics Data System (ADS)

    Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L. H.

    2015-12-01

    In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The "muon generator" produces muons with zenith angles in the range 0-90° and energies in the range 1-100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance-Rejection and Metropolis-Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1-60 GeV and zenith angles 0-90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic-polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed "muon generator" is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.
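
    A minimal acceptance-rejection sampler of the kind the generator employs is sketched below; the target density is a simple power-law stand-in, not the Smith and Duller (1959) spectrum used by the paper's generator.

        # Sketch: acceptance-rejection sampling of a 1-D energy spectrum.
        # The target pdf is a power-law stand-in, NOT the Smith-Duller
        # muon spectrum; bounds and seed are arbitrary.
        import numpy as np

        rng = np.random.default_rng(1)

        def target_pdf(e):               # stand-in spectrum on 1-100 GeV
            return e ** -2.7

        E_MIN, E_MAX = 1.0, 100.0
        M = target_pdf(E_MIN)            # pdf is decreasing, so max is at E_MIN

        def sample(n):
            out = []
            while len(out) < n:
                e = rng.uniform(E_MIN, E_MAX)        # propose uniformly
                if rng.uniform(0.0, M) < target_pdf(e):
                    out.append(e)                    # accept w.p. pdf(e)/M
            return np.array(out)

        energies = sample(1000)
        print("mean energy (GeV):", energies.mean())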

  13. Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Bailey, David (Technical Monitor)

    1996-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.

  14. Monitoring benthic algal communities: A comparison of targeted and coefficient sampling methods

    USGS Publications Warehouse

    Edwards, Matthew S.; Tinker, M. Tim

    2009-01-01

    Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; the Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.

  15. Low-complexity R-peak detection for ambulatory fetal monitoring.

    PubMed

    Rooijakkers, Michael J; Rabotti, Chiara; Oei, S Guid; Mischi, Massimo

    2012-07-01

    Non-invasive fetal health monitoring during pregnancy is becoming increasingly important because of the increasing number of high-risk pregnancies. Despite recent advances in signal-processing technology, which have enabled fetal monitoring during pregnancy using abdominal electrocardiogram (ECG) recordings, ubiquitous fetal health monitoring is still unfeasible due to the computational complexity of noise-robust solutions. In this paper, an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed, as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity, without reducing the R-peak detection performance compared to the existing R-peak detection schemes. Validation of the algorithm is performed on three manually annotated datasets. With a detection error rate of 0.23%, 1.32% and 9.42% on the MIT/BIH Arrhythmia and in-house maternal and fetal databases, respectively, the detection rate of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
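
    For orientation, a generic low-complexity R-peak detector based on thresholded peak picking can be sketched as follows; this illustrates the task only and is not the paper's optimized algorithm. The file name and sampling rate are hypothetical.

        # Sketch: generic R-peak detection by thresholded peak picking.
        # This is a textbook illustration, not the paper's algorithm.
        import numpy as np
        from scipy.signal import find_peaks

        fs = 250                                   # Hz, assumed sampling rate
        ecg = np.load("ecg.npy")                   # hypothetical 1-D ECG array

        # R peaks: tall and at least 250 ms apart (heart rate <= 240 bpm)
        peaks, _ = find_peaks(
            ecg,
            height=0.6 * np.max(ecg),              # crude amplitude threshold
            distance=int(0.25 * fs),               # refractory period
        )
        rr = np.diff(peaks) / fs                   # RR intervals in seconds
        print(f"{peaks.size} beats, mean HR = {60 / rr.mean():.0f} bpm")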

  16. An Automated Recording Method in Clinical Consultation to Rate the Limp in Lower Limb Osteoarthritis

    PubMed Central

    Barrois, R.; Gregory, Th.; Oudre, L.; Moreau, Th.; Truong, Ch.; Aram Pulini, A.; Vienne, A.; Labourdette, Ch.; Vayatis, N.; Buffat, S.; Yelnik, A.; de Waele, C.; Laporte, S.; Vidal, P. P.; Ricard, D.

    2016-01-01

    For diagnosis and follow-up, it is important to be able to quantify limp in an objective and precise way adapted to daily clinical consultation. The purpose of this exploratory study was to determine whether an inertial sensor-based method could provide simple features that correlate with the severity of lower limb osteoarthritis evaluated by the WOMAC index, without the use of step detection in the signal processing. Forty-eight patients with lower limb osteoarthritis formed two severity groups separated by the median of the WOMAC index (G1, G2). Twelve asymptomatic age-matched control subjects formed the control group (G0). Subjects were asked to walk straight 10 meters forward and 10 meters back at self-selected walking speeds with inertial measurement units (IMUs) (3-D accelerometers, 3-D gyroscopes and 3-D magnetometers) attached to the head, the lower back (L3-L4) and both feet. Sixty parameters were computed, corresponding to the mean and the root mean square (RMS) of the signals recorded by the various sensors (head, lower back and feet), along the various axes, in the various frames. Parameters were defined as discriminating when they showed statistical differences between the three groups. In total, four parameters were found to be discriminating: the mean and RMS of the norm of the acceleration in the horizontal plane for the contralateral and ipsilateral foot in the doctor's office frame. No discriminating parameter was found on the head or the lower back. No discriminating parameter was found in the sensor-linked frames. This study showed that two IMUs placed on both feet and a step-detection-free signal processing method could be an objective and quantitative complement to the clinical examination of the physician in everyday practice. Our method provides new automatically computed parameters that could be used for the comprehension of lower limb osteoarthritis. It may not only be used in medical consultation to score patients but also to monitor the evolution of their clinical syndrome during and after rehabilitation. Finally, it paves the way for the quantification of gait in other fields such as neurology and for monitoring gait at a patient's home. PMID:27776168

  17. Noninvasive and Painless Urine Glucose Detection by Using Computer-based Polarimeter

    NASA Astrophysics Data System (ADS)

    Sutrisno; Laksono, Y. A.; Hidayat, N.

    2017-05-01

    Diabetes kills millions of people worldwide each year, challenging researchers to contribute to early diagnosis and so help ensure a healthy life. The glucose testing devices in common use are the glucose meter and the urine glucose test strip. The glucose meter ordinarily requires blood taken from the patient's finger. The test strip uses the patient's urine but reports an unspecific glucose level, since it only indicates ranges of glucose concentration. Instead of detecting glucose in blood or relying on this non-specific technique, a noninvasive and painless technique that detects the glucose level accurately would provide a more feasible approach to diabetes diagnosis. Noninvasive and painless urine glucose monitoring by means of a computer-based polarimeter is presented in this paper. The instrument consists of a power source, a sample box, a light sensor, a polarizer, an analyzer, an analog-to-digital converter (ADC), and a computer. The urine glucose concentration was evaluated from the curve of the change in detected optical rotation angle versus output potential recorded by the computer-based polarimeter. Statistical analyses by means of Gaussian fitting and linear regression were applied to investigate the rotation angle and the urine glucose concentration, respectively. In our experiment, the urine glucose level measured by glucose test strips was 100 mg/dl for the normal patient and 500 mg/dl for the diabetic patient; the polarimeter read more precise values for the same patients, namely 50.61 mg/dl and 502.41 mg/dl, respectively. In other words, the results showed that our polarimeter was able to measure the urine glucose level quantitatively and more accurately than urine glucose test strips. Hence, this computer-based polarimeter could be used as a noninvasive, painless alternative for early detection of urine glucose.

  18. A method for achieving an order-of-magnitude increase in the temporal resolution of a standard CRT computer monitor.

    PubMed

    Fiesta, Matthew P; Eagleman, David M

    2008-09-15

    As the frequency of a flickering light is increased, the perception of flicker is replaced by the perception of steady light at what is known as the critical flicker fusion threshold (CFFT). This threshold provides a useful measure of the brain's information processing speed, and has been used in medicine for over a century both for diagnostic and drug efficacy studies. However, the hardware for presenting the stimulus has not advanced to take advantage of computers, largely because the refresh rates of typical monitors are too slow to provide fine-grained changes in the alternation rate of a visual stimulus. For example, a cathode ray tube (CRT) computer monitor running at 100 Hz will render a new frame every 10 ms, thus restricting the period of a flickering stimulus to multiples of 20 ms. These multiples provide a temporal resolution far too low to make precise threshold measurements, since typical CFFT values are in the neighborhood of 35 ms. We describe here a simple and novel technique to enable alternating images at several closely-spaced periods on a standard monitor. The key to our technique is to programmatically control the video card to dynamically reset the refresh rate of the monitor. Different refresh rates allow slightly different frame durations; this can be leveraged to vastly increase the resolution of stimulus presentation times. This simple technique opens new inroads for experiments on computers that require more finely-spaced temporal resolution than a monitor at a single, fixed refresh rate can allow.
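
    The arithmetic behind the technique is easy to verify: at refresh rate f, a one-frame-light/one-frame-dark alternation has period 2/f, so sweeping the refresh rate yields finely spaced flicker periods. The refresh rates in the sketch below are plausible examples, not the paper's list.

        # Sketch: flicker periods achievable by changing the CRT refresh
        # rate. One frame light + one frame dark gives a period of 2/f.
        # The refresh rates listed are examples, not the paper's set.
        for f in (85, 90, 95, 100, 110, 120):      # Hz
            frame_ms = 1000.0 / f
            period_ms = 2 * frame_ms
            print(f"{f:3d} Hz -> frame {frame_ms:5.2f} ms, "
                  f"flicker period {period_ms:5.2f} ms")
        # At a fixed 100 Hz, only multiples of 20 ms are possible; sweeping
        # the refresh rate fills in periods between ~16.7 ms and ~23.5 ms.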

  19. Serial Network Flow Monitor

    NASA Technical Reports Server (NTRS)

    Robinson, Julie A.; Tate-Brown, Judy M.

    2009-01-01

    Using a commercial software CD and minimal up-mass, SNFM monitors the Payload local area network (LAN) to analyze and troubleshoot LAN data traffic. Validating LAN traffic models may allow for faster and more reliable computer networks to sustain systems and science on future space missions. Research Summary: This experiment studies the function of the computer network onboard the ISS. On-orbit packet statistics are captured and used to validate ground-based medium rate data link models and enhance the way that the local area network (LAN) is monitored. This information will allow monitoring and improvement of the data transfer capabilities of on-orbit computer networks. The Serial Network Flow Monitor (SNFM) experiment attempts to characterize the network equivalent of traffic jams on board the ISS. The SNFM team is able to specifically target historical problem areas, including the SAMS (Space Acceleration Measurement System) communication issues, data transmissions from the ISS to the ground teams, and multiple users on the network at the same time. By looking at how various users interact with each other on the network, conflicts can be identified and work can begin on solutions. SNFM is comprised of a commercial off-the-shelf software package that monitors packet traffic through the payload Ethernet LANs on board the ISS.

  20. Monitoring Start of Season in Alaska

    NASA Astrophysics Data System (ADS)

    Robin, J.; Dubayah, R.; Sparrow, E.; Levine, E.

    2006-12-01

    In biomes that have distinct winter seasons, start of spring phenological events, specifically timing of budburst and green-up of leaves, coincides with transpiration. Seasons leave annual signatures that reflect the dynamic nature of the hydrologic cycle and link the different spheres of the Earth system. This paper evaluates whether continuity between AVHRR and MODIS normalized difference vegetation index (NDVI) is achievable for monitoring land surface phenology, specifically start of season (SOS), in Alaska. Additionally, two thresholds, one based on NDVI and the other on accumulated growing degree-days (GDD), are compared to determine which most accurately predicts SOS for Fairbanks. Ratio of maximum greenness at SOS was computed from biweekly AVHRR and MODIS composites for 2001 through 2004 for Anchorage and Fairbanks regions. SOS dates were determined from annual green-up observations made by GLOBE students. Results showed that different processing as well as spectral characteristics of each sensor restrict continuity between the two datasets. MODIS values were consistently higher and had less inter-annual variability during the height of the growing season than corresponding AVHRR values. Furthermore, a threshold of 131-175 accumulated GDD was a better predictor of SOS for Fairbanks than a NDVI threshold applied to AVHRR and MODIS datasets. The NDVI threshold was developed from biweekly AVHRR composites from 1982 through 2004 and corresponding annual green-up observations at University of Alaska-Fairbanks (UAF). The GDD threshold was developed from 20+ years of historic daily mean air temperature data and the same green-up observations. SOS dates computed with the GDD threshold most closely resembled actual green-up dates observed by GLOBE students and UAF researchers. Overall, biweekly composites and effects of clouds, snow, and conifers limit the ability of NDVI to monitor phenological changes in Alaska.
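
    The GDD threshold can be read concretely as follows: accumulate daily growing degree-days and declare SOS when the running sum enters the 131-175 window. In the sketch below, the base temperature and the temperature series are assumptions, since the abstract does not state them.

        # Sketch: predicting start of season (SOS) from accumulated growing
        # degree-days (GDD). Base temperature and temperatures are assumed;
        # the abstract gives only the 131-175 accumulated-GDD window.
        T_BASE = 5.0            # deg C, assumed base temperature
        GDD_THRESHOLD = 131.0   # lower edge of the reported window

        daily_mean_temp = [2.0, 4.5, 7.0, 9.5, 12.0, 14.0, 13.0, 15.5] * 10

        gdd = 0.0
        for day, temp in enumerate(daily_mean_temp, start=1):
            gdd += max(0.0, temp - T_BASE)      # only days above the base count
            if gdd >= GDD_THRESHOLD:
                print(f"SOS predicted on day {day} (accumulated GDD = {gdd:.0f})")
                break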

  1. Development of a High Precision Displacement Measurement System by Fusing a Low Cost RTK-GPS Sensor and a Force Feedback Accelerometer for Infrastructure Monitoring.

    PubMed

    Koo, Gunhee; Kim, Kiyoung; Chung, Jun Yeon; Choi, Jaemook; Kwon, Nam-Yeol; Kang, Doo-Young; Sohn, Hoon

    2017-11-28

    A displacement measurement system fusing a low cost real-time kinematic global positioning system (RTK-GPS) receiver and a force feedback accelerometer is proposed for infrastructure monitoring. The proposed system is composed of a sensor module, a base module and a computation module. The sensor module consists of a RTK-GPS rover and a force feedback accelerometer, and is installed on a target structure like conventional RTK-GPS sensors. The base module is placed on rigid ground away from the target structure, similar to conventional RTK-GPS bases, and transmits observation messages to the sensor module. Then, the initial acceleration, velocity and displacement responses measured by the sensor module are transmitted to the computation module located at a central monitoring facility. Finally, high precision and high sampling rate displacement, velocity, and acceleration are estimated by fusing the acceleration from the accelerometer, the velocity from the GPS rover, and the displacement from RTK-GPS. Note that the proposed displacement measurement system can measure 3-axis acceleration and velocity as well as displacement in real time. In terms of displacement, the proposed measurement system can estimate dynamic and pseudo-static displacement with a root-mean-square error of 2 mm and a sampling rate of up to 100 Hz. The performance of the proposed system was validated under sinusoidal, random and steady-state vibrations. Field tests were performed on the Yeongjong Grand Bridge and Yi Sun-sin Bridge in Korea, and the Xihoumen Bridge in China to compare the performance of the proposed system with a commercial RTK-GPS sensor and other data fusion techniques.
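
    A textbook way to fuse a low-frequency displacement sensor with double-integrated acceleration is a complementary filter: low-pass the GPS displacement, high-pass the accelerometer-derived displacement, and sum. The sketch below illustrates that scheme with synthetic data; it is not the authors' fusion algorithm, and the crossover frequency is an assumption.

        # Sketch: complementary-filter fusion of a noisy displacement sensor
        # (RTK-GPS-like) and double-integrated acceleration (synthetic data).
        import numpy as np
        from scipy.signal import butter, detrend, filtfilt

        fs = 100.0                                  # Hz, fused output rate
        t = np.arange(0.0, 20.0, 1.0 / fs)
        true_disp = 0.01 * np.sin(2 * np.pi * 1.0 * t)            # 10 mm at 1 Hz

        accel = -0.01 * (2 * np.pi) ** 2 * np.sin(2 * np.pi * t)  # exact accel
        gps_disp = true_disp + 0.005 * np.random.randn(t.size)    # noisy GPS

        # double-integrate acceleration; detrend removes the linear drift
        # caused by the unknown initial velocity
        acc_disp = detrend(np.cumsum(np.cumsum(accel)) / fs ** 2)

        fc = 0.2                       # crossover frequency (Hz), assumed
        b_lo, a_lo = butter(2, fc / (fs / 2), "low")
        b_hi, a_hi = butter(2, fc / (fs / 2), "high")
        fused = filtfilt(b_lo, a_lo, gps_disp) + filtfilt(b_hi, a_hi, acc_disp)

        rmse = np.sqrt(np.mean((fused - true_disp) ** 2))
        print(f"fused displacement RMSE = {rmse * 1000:.2f} mm")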

  2. HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation

    NASA Astrophysics Data System (ADS)

    Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.

    2006-03-01

    As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals who are living or deceased. HIPAA requires security services supporting the following implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. Among the controls proposed in the HIPAA Security Standards, the focus here is on audit trails. Audit trails can be used for surveillance purposes, to detect when interesting events might be happening that warrant further investigation, or they can be used forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running on each PACS component computer, and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. RIS-integrated PACS managers can now monitor and control the entire RIS-integrated PACS operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation, and gives preliminary results obtained with this monitoring system on a clinical RIS-integrated PACS.

  3. Automated validation of a computer operating system

    NASA Technical Reports Server (NTRS)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.

  4. Automatic online and real-time tumour motion monitoring during stereotactic liver treatments on a conventional linac by combined optical and sparse monoscopic imaging with kilovoltage x-rays (COSMIK)

    NASA Astrophysics Data System (ADS)

    Bertholet, Jenny; Toftegaard, Jakob; Hansen, Rune; Worm, Esben S.; Wan, Hanlin; Parikh, Parag J.; Weber, Britta; Høyer, Morten; Poulsen, Per R.

    2018-03-01

    The purpose of this study was to develop, validate and clinically demonstrate fully automatic tumour motion monitoring on a conventional linear accelerator by combined optical and sparse monoscopic imaging with kilovoltage x-rays (COSMIK). COSMIK combines auto-segmentation of implanted fiducial markers in cone-beam computed tomography (CBCT) projections and intra-treatment kV images with simultaneous streaming of an external motion signal. A pre-treatment CBCT is acquired with simultaneous recording of the motion of an external marker block on the abdomen. The 3-dimensional (3D) marker motion during the CBCT is estimated from the auto-segmented positions in the projections and used to optimize an external correlation model (ECM) of internal motion as a function of external motion. During treatment, the ECM estimates the internal motion from the external motion at 20 Hz. KV images are acquired every 3 s, auto-segmented, and used to update the ECM for baseline shifts between internal and external motion. The COSMIK method was validated using Calypso-recorded internal tumour motion with simultaneous camera-recorded external motion for 15 liver stereotactic body radiotherapy (SBRT) patients. The validation included phantom experiments and simulations thereof for 12 fractions, and further simulations for 42 fractions. The simulations compared the accuracy of COSMIK with ECM-based monitoring without model updates and with model updates based on stereoscopic imaging as well as continuous kilovoltage intrafraction monitoring (KIM) at 10 Hz without an external signal. Clinical real-time tumour motion monitoring with COSMIK was performed offline for 14 liver SBRT patients (41 fractions) and online for one patient (two fractions). The mean 3D root-mean-square error for the four monitoring methods was 1.61 mm (COSMIK), 2.31 mm (ECM without updates), 1.49 mm (ECM with stereoscopic updates) and 0.75 mm (KIM). COSMIK is the first combined kV/optical real-time motion monitoring method used clinically online on a conventional accelerator. COSMIK delivers a lower imaging dose than KIM and is additionally applicable when the kV imager cannot be deployed, such as during non-coplanar fields.

  5. The Use of an Online Learning and Teaching System for Monitoring Computer Aided Design Student Participation and Predicting Student Success

    ERIC Educational Resources Information Center

    Akhtar, S.; Warburton, S.; Xu, W.

    2017-01-01

    In this paper we report on the use of a purpose built Computer Support Collaborative learning environment designed to support lab-based CAD teaching through the monitoring of student participation and identified predictors of success. This was carried out by analysing data from the interactive learning system and correlating student behaviour with…

  6. Integrating a Hand Held computer and Stethoscope into a Fetal Monitor

    PubMed Central

    Ahmad Soltani, Mitra

    2009-01-01

    This article presents procedures for modifying a hand held computer or personal digital assistant (PDA) into a versatile device functioning as an electronic stethoscope for fetal monitoring. Along with functioning as an electronic stethoscope, a PDA can provide a useful information source for a medical trainee. Feedback from medical students, residents and interns suggests the device is well accepted by medical trainees. PMID:20165517

  7. Discussion of "Computational Electrocardiography: Revisiting Holter ECG Monitoring".

    PubMed

    Baumgartner, Christian; Caiani, Enrico G; Dickhaus, Hartmut; Kulikowski, Casimir A; Schiecke, Karin; van Bemmel, Jan H; Witte, Herbert

    2016-08-05

    This article is part of a For-Discussion-Section of Methods of Information in Medicine about the paper "Computational Electrocardiography: Revisiting Holter ECG Monitoring" written by Thomas M. Deserno and Nikolaus Marx. It is introduced by an editorial. This article contains the combined commentaries invited to independently comment on the paper of Deserno and Marx. In subsequent issues the discussion can continue through letters to the editor.

  8. Wireless recording systems: from noninvasive EEG-NIRS to invasive EEG devices.

    PubMed

    Sawan, Mohamad; Salam, Muhammad T; Le Lan, Jérôme; Kassab, Amal; Gelinas, Sébastien; Vannasing, Phetsamone; Lesage, Frédéric; Lassonde, Maryse; Nguyen, Dang K

    2013-04-01

    In this paper, we present the design and implementation of a wireless wearable electronic system dedicated to remote data recording for brain monitoring. The reported wireless recording system is used for (a) simultaneous near-infrared spectrometry (NIRS) and scalp electroencephalography (EEG) for noninvasive monitoring and (b) intracerebral EEG (icEEG) for invasive monitoring. Bluetooth and dual radio links were introduced for these recordings. The Bluetooth-based device was embedded in a noninvasive multichannel EEG-NIRS system for easy portability and long-term monitoring. On the other hand, the 32-channel implantable recording device offers 24-bit resolution, tunable features, and a sampling frequency up to 2 kHz per channel. The analog front-end preamplifier presents a low input-referred noise of 5 μVrms and a signal-to-noise ratio of 112 dB. The communication link is implemented using a dual-band radio frequency transceiver offering a half-duplex 800 kb/s data rate, 16.5 mW power consumption, and a post-correction bit-error rate (BER) below 10^-10. The designed system can be accessed and controlled by a computer with a user-friendly graphical interface. The proposed wireless implantable recording device was tested in vitro using real icEEG signals from two patients with refractory epilepsy. The wirelessly recorded signals were compared to the original signals recorded over a wired connection, and the measured normalized root-mean-square deviation was under 2%.

  9. Stress Prediction for Distributed Structural Health Monitoring Using Existing Measurements and Pattern Recognition.

    PubMed

    Lu, Wei; Teng, Jun; Zhou, Qiushi; Peng, Qiexin

    2018-02-01

    The stress in structural steel members is the most useful and directly measurable physical quantity for evaluating structural safety in structural health monitoring, and it is also an important index for evaluating the stress distribution and force condition of structures during the construction and service phases. Thus, it is common to set stress as a measure in steel structure monitoring. Considering the economy and the importance of the structural members, only a limited number of sensors can be placed, which means that it is impossible to obtain the stresses of all members directly using sensors. This study aims to develop a stress response prediction method for locations with insufficient sensors, using measurements from a limited number of sensors and pattern recognition. The detailed improvements are: (1) a distributed computing process is proposed, in which the same pattern is recognized by several subsets of measurements; and (2) pattern recognition using a subset of measurements is carried out by considering the optimal number of sensors and the number of fusion patterns. The validity and feasibility of the proposed method are verified using two examples: a finite-element simulation of a single-layer shell-like steel structure, and the structural health monitoring of the space steel roof of Shenzhen Bay Stadium; for the latter, the anti-noise performance of the method is verified using stress measurements from a real-world project.

  10. 48 CFR 52.227-14 - Rights in Data-General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software. Computer software—(1) Means (i) Computer programs that comprise a series of instructions, rules... or computer software documentation. Computer software documentation means owner's manuals, user's... medium, that explain the capabilities of the computer software or provide instructions for using the...

  11. Design of Remote Monitoring System of Irrigation based on GSM and ZigBee Technology

    NASA Astrophysics Data System (ADS)

    Xiao xi, Zheng; Fang, Zhao; Shuaifei, Shao

    2018-03-01

    To address the low level of irrigation automation and the waste of water resources, a remote monitoring system for farmland irrigation based on GSM communication technology and ZigBee technology was designed. The system is composed of sensors, a GSM communication module, a ZigBee module, a host computer, valves and so on. The system opens and closes the pump and the electromagnetic valve as needed, and transmits the monitoring information to the host computer or the user's mobile phone through the GSM communication network. Experiments show that the system has low power consumption and a friendly human-machine interface, and is convenient and simple to use. It can monitor the agricultural environment remotely, control the related irrigation equipment at any time and place, and thus meet the needs of remote monitoring of farmland irrigation.
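
    As an illustration of the control logic described above, the following Python sketch shows a threshold-based irrigation loop. The driver functions (read_soil_moisture, set_pump, send_sms) are hypothetical placeholders for the ZigBee sensor and GSM modem interfaces, and the thresholds are invented.

        import time

        MOISTURE_LOW, MOISTURE_HIGH = 30.0, 60.0      # invented thresholds, % moisture

        def read_soil_moisture():                     # placeholder: ZigBee sensor query
            return 25.0

        def set_pump(on):                             # placeholder: pump/valve actuation
            print("pump", "ON" if on else "OFF")

        def send_sms(text):                           # placeholder: GSM modem
            print("SMS:", text)

        def control_loop(cycles=1, poll_seconds=600):
            pump_on = False
            for _ in range(cycles):                   # a deployment would loop forever
                moisture = read_soil_moisture()
                if moisture < MOISTURE_LOW and not pump_on:
                    pump_on = True
                    set_pump(True)
                    send_sms("Irrigation started, moisture=%.1f%%" % moisture)
                elif moisture > MOISTURE_HIGH and pump_on:
                    pump_on = False
                    set_pump(False)
                    send_sms("Irrigation stopped, moisture=%.1f%%" % moisture)
                if cycles > 1:
                    time.sleep(poll_seconds)

        control_loop()                                # -> pump ON, SMS notification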

  12. Connecticut permanent long-term bridge monitoring network, volume 3 : monitoring of a multi-steel girder composite bridge - I-91 SB over the Mattabesset River in Cromwell (bridge #3078).

    DOT National Transportation Integrated Search

    2014-08-01

    This report describes the instrumentation and data acquisition for a multi-girder, composite steel bridge in Connecticut. The computer-based remote monitoring system was developed to collect information on the girder bending strains. The monitoring...

  13. Classic electrocardiogram-based and mobile technology derived approaches to heart rate variability are not equivalent.

    PubMed

    Guzik, Przemyslaw; Piekos, Caroline; Pierog, Olivia; Fenech, Naiman; Krauze, Tomasz; Piskorski, Jaroslaw; Wykretowicz, Andrzej

    2018-05-01

    We compared a classic ECG-derived approach with a mobile approach to heart rate variability (HRV) measurement. Twenty-nine young adult healthy volunteers underwent simultaneous recording of heart rate using an ECG and a chest heart rate monitor at supine rest, during mental stress and during active standing. The mean RR interval, the Standard Deviation of Normal-to-Normal (SDNN) RR intervals, and the Root Mean Square of the Successive Differences (RMSSD) between RR intervals were computed in 168 pairs of 5-minute epochs by in-house software on a PC (sinus beats only) and by the mobile application "ELITEHRV" on a smartphone (no beat type identification). ECG analysis showed that 33.9% of the recordings contained at least one non-sinus beat or artefact; the mobile app did not report this. The mean RR intervals were significantly longer (p = 0.0378), while SDNN (p = 0.0001) and RMSSD (p = 0.0199) were smaller for the mobile approach. Measures of identical HRV parameters by ECG-based and mobile approaches are not equivalent. Copyright © 2018 Elsevier B.V. All rights reserved.
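
    For reference, the time-domain HRV measures named above have standard definitions. The following minimal Python sketch (not the study's in-house software) computes them from a series of RR intervals in milliseconds, assuming the series has already been cleaned to sinus beats only -- the step the abstract notes the mobile app skipped.

        import numpy as np

        def hrv_time_domain(rr_ms):
            """Standard time-domain HRV measures from RR intervals (ms)."""
            rr = np.asarray(rr_ms, dtype=float)
            diffs = np.diff(rr)
            return {
                "mean_rr": rr.mean(),                 # mean NN interval
                "sdnn": rr.std(ddof=1),               # SD of NN intervals
                "rmssd": np.sqrt(np.mean(diffs**2)),  # RMS of successive differences
            }

        rr = [812, 790, 805, 821, 798, 830, 815]      # invented epoch excerpt
        print({k: round(v, 1) for k, v in hrv_time_domain(rr).items()})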

  14. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  15. Development of a Pediatric Visual Field Test

    PubMed Central

    Miranda, Marco A.; Henson, David B.; Fenerty, Cecilia; Biswas, Susmito; Aslam, Tariq

    2016-01-01

    Purpose We describe a pediatric visual field (VF) test based on a computer game where software and hardware combine to provide an enjoyable test experience. Methods The test software consists of a platform-based computer game presented to the central VF. A storyline was created around the game, as was a structure surrounding the computer monitor, to enhance patients' experience. The patient is asked to help the central character collect magic coins (stimuli). To collect these coins a series of obstacles needs to be overcome. The test was presented on a Sony PVM-2541A monitor calibrated from a central midpoint with a Minolta CS-100 photometer placed at 50 cm. Measurements were performed at 15 locations on the screen and the contrast calculated. Retinal sensitivity was determined by modulating stimulus size. To test the feasibility of the novel approach, 20 patients (4–16 years old) with no history of VF defects were recruited. Results For the 14 subjects completing the study, 31 ± 15 data points were collected on 1 eye of each patient. Mean background luminance and stimulus contrast were 9.9 ± 0.3 cd/m2 and 27.9 ± 0.1 dB, respectively. Sensitivity values obtained were similar to those of an adult population, but variability was considerably higher (8.3 ± 9.0 dB). Conclusions Preliminary data show the feasibility of a game-based VF test for pediatric use. Although the test was well accepted by the target population, test variability remained very high. Translational Relevance Traditional VF tests are not well tolerated by children. This study describes a child-friendly approach to testing visual fields in the targeted population. PMID:27980876

  16. Rattlesnake Mountain Observatory (46.4° N, 119.6° W) Multispectral Optical Depth Measurements: 1979-1994 (NDP-053)

    DOE Data Explorer

    Larson, Nels R. [Pacific Northwest Laboratory (PNNL), Richland, WA (USA); Michalsky, Joseph J. [Atmospheric Sciences Research Center, Albany, NY (USA); LeBaron, Brock A. [Utah Bureau of Air Quality, Salt Lake City, Utah (USA)

    2012-01-01

    Surface measurements of solar irradiance were made by a multipurpose computer-controlled scanning photometer at the Rattlesnake Mountain Observatory in eastern Washington. The observatory is located at 46.4° N, 119.6° W at an elevation of 1088 m above mean sea level. The photometer measures the attenuation of direct solar radiation at different wavelengths using 12 filters. Five of these filters (i.e., at 428 nm, 486 nm, 535 nm, 785 nm, and 1010 nm, with respective half-power widths of 2, 2, 3, 18, and 28 nm) are suitable for monitoring variations in the total optical depth of the atmosphere.

  17. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    NASA Astrophysics Data System (ADS)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  18. A Home Computer Primer.

    ERIC Educational Resources Information Center

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  19. Translator program converts computer printout into braille language

    NASA Technical Reports Server (NTRS)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six-dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.

  20. High-Fidelity Piezoelectric Audio Device

    NASA Technical Reports Server (NTRS)

    Woodward, Stanley E.; Fox, Robert L.; Bryant, Robert G.

    2003-01-01

    ModalMax is a very innovative means of harnessing the vibration of a piezoelectric actuator to produce an energy-efficient, low-profile device with high-bandwidth, high-fidelity audio response. The piezoelectric audio device outperforms many commercially available speakers made using speaker cones. The piezoelectric device weighs substantially less (4 g) than speaker cones, which use magnets (10 g). ModalMax devices have extreme fabrication simplicity: the entire audio device is fabricated by lamination. The simplicity of the design lends itself to lower cost. The piezoelectric audio device can be used without its acoustic chambers, resulting in a very low thickness of 0.023 in. (0.58 mm). The piezoelectric audio device can be completely encapsulated, which makes it very attractive for use in wet environments. Encapsulation does not significantly alter the audio response. Its small size (see Figure 1) makes it applicable to many consumer electronic products, such as pagers, portable radios, headphones, laptop computers, computer monitors, toys, and electronic games. The audio device can also be used in automobile or aircraft sound systems.

  1. A CAMAC based real-time noise analysis system for nuclear reactors

    NASA Astrophysics Data System (ADS)

    Ciftcioglu, Özer

    1987-05-01

    A CAMAC-based real-time noise analysis system was designed for the TRIGA MARK II nuclear reactor at the Institute for Nuclear Energy, Istanbul. The input analog signals obtained from the radiation detectors are introduced to the system through the CAMAC interface. The signals, converted into digital form, are processed by a PDP-11 computer. The fast data processing, based on auto- and cross-power spectral density computations, is carried out by means of assembly-written FFT algorithms in real time, and the spectra obtained are displayed on a CAMAC-driven display system as an additional monitoring device. The system has the advantage of being software programmable and controlled by a CAMAC system, so that it is operated under program control for reactor surveillance, anomaly detection and diagnosis. The system can also be used for the identification of nonstationary operational characteristics of the reactor in the long term by comparing the noise power spectra with corresponding reference noise patterns prepared in advance.
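
    The core computation described above, auto- and cross-power spectral density estimation, can be sketched with SciPy's Welch estimators as a modern stand-in for the assembly-written FFT routines. In the minimal example below, the two simulated detector channels share an invented common 25 Hz component.

        import numpy as np
        from scipy.signal import csd, welch

        fs = 1000.0                                   # sampling rate, Hz (illustrative)
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)
        common = np.sin(2 * np.pi * 25 * t)           # shared 25 Hz component
        x = common + 0.5 * rng.standard_normal(t.size)
        y = 0.8 * common + 0.5 * rng.standard_normal(t.size)

        f, Pxx = welch(x, fs=fs, nperseg=1024)        # auto power spectral density
        _, Pxy = csd(x, y, fs=fs, nperseg=1024)       # cross power spectral density
        print(round(f[np.argmax(Pxx)], 1))            # peak near 25 Hz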

  2. Hey Buddy can you spare a DNA? New surveillance technologies and the growth of mandatory volunteerism in collecting personal information.

    PubMed

    Marx, Gary T

    2007-01-01

    The new social surveillance can be defined as scrutiny through the use of technical means to extract or create personal or group data, whether from individuals or contexts. Examples include: video cameras; computer matching, profiling and data mining; work, computer and electronic location monitoring; biometrics; DNA analysis; drug tests; brain scans for lie detection; various forms of imaging to reveal what is behind walls and enclosures. There are two problems with the new surveillance technologies. One is that they don't work and the other is that they work too well. If the first, they fail to prevent disasters, bring miscarriages of justice, and waste resources. If the second, they can further inequality and invidious social categorization; they chill liberty. These twin threats are part of the enduring paradox of democratic government that must be strong enough to maintain reasonable order, but not so strong as to become undemocratic.

  3. 48 CFR 352.227-14 - Rights in Data-Exceptional Circumstances.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....] Computer database or database means a collection of recorded information in a form capable of, and for the... databases or computer software documentation. Computer software documentation means owner's manuals, user's... nature (including computer databases and computer software documentation). This term does not include...

  4. The research and application of green computer room environmental monitoring system based on internet of things technology

    NASA Astrophysics Data System (ADS)

    Wei, Wang; Chongchao, Pan; Yikai, Liang; Gang, Li

    2017-11-01

    With the rapid development of information technology, data centers are growing quickly and the energy consumption of computer rooms is rising with them; air-conditioning cooling accounts for a large proportion of that consumption. How to apply new technology to reduce the energy consumption of the computer room has therefore become an important topic in energy-saving research. This paper studies Internet of Things technology and designs a green computer room environmental monitoring system. The system gathers real-time environmental data through wireless sensor network technology and presents them as a three-dimensional visualization. The environment monitor provides views of computer room assets, temperature distribution, humidity distribution, the microenvironment, and so on. According to the condition of the microenvironment, the air volume, temperature, and humidity parameters of the air conditioning can be adjusted for individual equipment cabinets to achieve precise cooling, reducing the energy consumption of the air conditioning and, in turn, the overall energy consumption of the computer room. The system was deployed in the computer center of Weihai; after a year of testing and operation it achieved a good energy-saving effect, verifying the effectiveness of this project for computer room energy conservation.

  5. Development and evaluation of an office ergonomic risk checklist: ROSA--rapid office strain assessment.

    PubMed

    Sonne, Michael; Villalta, Dino L; Andrews, David M

    2012-01-01

    The Rapid Office Strain Assessment (ROSA) was designed to quickly quantify risks associated with computer work and to establish an action level for change based on reports of worker discomfort. Computer use risk factors were identified in previous research and standards on office design for the chair, monitor, telephone, keyboard and mouse. The risk factors were diagrammed and coded as increasing scores from 1 to 3. ROSA final scores ranged in magnitude from 1 to 10, with each successive score representing an increased presence of risk factors. Total body discomfort and ROSA final scores for 72 office workstations were significantly correlated (R = 0.384). ROSA final scores exhibited high inter- and intra-observer reliability (ICCs of 0.88 and 0.91, respectively). Mean discomfort increased with increasing ROSA scores, with a significant difference occurring between scores of 3 and 5 (out of 10). A ROSA final score of 5 might therefore be useful as an action level indicating when immediate change is necessary. ROSA proved to be an effective and reliable method for identifying computer use risk factors related to discomfort. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  6. Recent advances to obtain real - Time displacements for engineering applications

    USGS Publications Warehouse

    Celebi, M.

    2005-01-01

    This paper presents recent developments and approaches (using GPS technology and real-time double-integration) to obtain displacements and, in turn, drift ratios, in real-time or near real-time to meet the needs of the engineering and user community in seismic monitoring and assessing the functionality and damage condition of structures. Drift ratios computed in near real-time allow technical assessment of the damage condition of a building. Relevant parameters, such as the type of connections and story structural characteristics (including geometry) are used in computing drifts corresponding to several pre-selected threshold stages of damage. Thus, drift ratios determined from real-time monitoring can be compared to pre-computed threshold drift ratios. The approaches described herein can be used for performance evaluation of structures and can be considered as building health-monitoring applications.

  7. Monitoring means for combustion engine electric storage battery means

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, G. K.; Rautiola, R. E.; Taylor, R. E.

    Disclosed, in combination, are a combustion engine, an electric storage battery, an electrically powered starter motor for at times driving the engine in order to start the engine, and an electrical system monitor; the electrical system monitor has a first monitoring portion which senses the actual voltage across the battery and a second monitoring portion which monitors the current through the battery; an electrical switch controls associated circuitry and is actuatable into open or closed conditions; whenever the first monitoring portion senses a preselected magnitude of the actual voltage across the battery or the second monitoring portion senses a preselected magnitude of the current flow through the battery, the electrical switch is actuated.

  8. Continuous Seismic Threshold Monitoring

    DTIC Science & Technology

    1992-05-31

    Continuous threshold monitoring is a technique for using a seismic network to monitor a geographical area continuously in time. The method provides...area. Two approaches are presented. Site-specific monitoring: By focusing a seismic network on a specific target site, continuous threshold monitoring...recorded events at the site. We define the threshold trace for the network as the continuous time trace of computed upper magnitude limits of seismic

  9. 14 CFR 1214.801 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... customer's pro rata share of Shuttle services and used to compute the Shuttle charge factor. Means of... compute the customer's pro rata share of each element's services and used to compute the element charge... element charge factor. Parameters used in computation of the customer's flight price. Means of computing...

  10. 14 CFR 1214.801 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... customer's pro rata share of Shuttle services and used to compute the Shuttle charge factor. Means of... compute the customer's pro rata share of each element's services and used to compute the element charge... element charge factor. Parameters used in computation of the customer's flight price. Means of computing...

  11. 14 CFR 1214.801 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... customer's pro rata share of Shuttle services and used to compute the Shuttle charge factor. Means of... compute the customer's pro rata share of each element's services and used to compute the element charge... element charge factor. Parameters used in computation of the customer's flight price. Means of computing...

  12. 14 CFR 1214.801 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... customer's pro rata share of Shuttle services and used to compute the Shuttle charge factor. Means of... compute the customer's pro rata share of each element's services and used to compute the element charge... element charge factor. Parameters used in computation of the customer's flight price. Means of computing...

  13. 14 CFR § 1214.801 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... customer's pro rata share of Shuttle services and used to compute the Shuttle charge factor. Means of... compute the customer's pro rata share of each element's services and used to compute the element charge... element charge factor. Parameters used in computation of the customer's flight price. Means of computing...

  14. Color in Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Steinberg, Esther R.

    Color monitors are in wide use in computer systems. Thus, it is important to understand how to apply color effectively in computer assisted instruction (CAI) and computer based training (CBT). Color can enhance learning, but it does not automatically do so. Indiscriminate application of color can mislead a student and thereby even interfere with…

  15. A computational analysis of different endograft designs for Zone 0 aortic arch repair.

    PubMed

    van Bakel, Theodorus M; Arthurs, Christopher J; van Herwaarden, Joost A; Moll, Frans L; Eagle, Kim A; Patel, Himanshu J; Trimarchi, Santi; Figueroa, C Alberto

    2018-03-15

    Aortic arch repair remains a major surgical challenge. Multiple manufacturers are developing branched endografts for Zone 0 endovascular repair, extending the armamentarium for minimally invasive treatment of aortic arch pathologies. We hypothesize that the design of the Zone 0 endograft has a significant impact on the postoperative haemodynamic performance, particularly in the cervical arteries. The goal of our study was to compare the postoperative haemodynamic performance of different Zone 0 endograft designs. Patient-specific, clinically validated, computational fluid dynamics simulations were performed in a 71-year-old woman with a 6.5-cm saccular aortic arch aneurysm. Additionally, 4 endovascular repair scenarios using different endograft designs were created. Haemodynamic performance was evaluated by calculation of postoperative changes in blood flow and platelet activation potential (PLAP) in the cervical arteries. Preoperative cervical blood flow and mean PLAP were 1080 ml/min and 151.75, respectively. Cervical blood flow decreased and PLAP increased following endovascular repair in all scenarios. Endografts with 2 antegrade inner branches performed better compared to single-branch endografts. Scenario 3 performed the worst with a decrease in the total cervical blood flow of 4.8%, a decrease in the left hemisphere flow of 6.7% and an increase in the mean PLAP of 74.3%. Endograft design has a significant impact on haemodynamic performance following Zone 0 endovascular repair, potentially affecting cerebral blood flow during follow-up. Our results demonstrate the use of computational modelling for virtual testing of therapeutic interventions and underline the need to monitor the long-term outcomes in this cohort of patients.

  16. Computer program for analysis of hemodynamic response to head-up tilt test

    NASA Astrophysics Data System (ADS)

    Świątek, Eliza; Cybulski, Gerard; Koźluk, Edward; Piątkowska, Agnieszka; Niewiadomski, Wiktor

    2014-11-01

    The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes of the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving average parameters. It allows one to construct the graph of means for any range, and the Poincaré plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum value immediately after the change of position, and the times of their occurrence. It is possible to correct the automatically detected points manually. For the RR interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save calculated values to an XLS file with a name specified by the user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.
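
    Two of the operations described above, moving-average smoothing and automatic detection of the pre-tilt baseline and post-tilt extrema, are easy to sketch. The following Python fragment is a minimal illustration, not the MATLAB application itself; the signal and tilt index are invented.

        import numpy as np

        def moving_average(x, window=5):
            """Centered moving average used to smooth the waveform."""
            return np.convolve(x, np.ones(window) / window, mode="same")

        def post_tilt_extrema(signal, tilt_index):
            """Pre-tilt baseline mean plus post-tilt min/max and their indices."""
            before, after = signal[:tilt_index], signal[tilt_index:]
            i_min, i_max = int(np.argmin(after)), int(np.argmax(after))
            return {
                "baseline_mean": float(np.mean(before)),
                "min": (float(after[i_min]), tilt_index + i_min),
                "max": (float(after[i_max]), tilt_index + i_max),
            }

        hr = np.array([70, 71, 69, 70, 72, 85, 90, 88, 80, 78], dtype=float)
        print(post_tilt_extrema(moving_average(hr, 3), tilt_index=5))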

  17. Monitoring land degradation in southern Tunisia: A test of LANDSAT imagery and digital data

    NASA Technical Reports Server (NTRS)

    Hellden, U.; Stern, M.

    1980-01-01

    The possible use of LANDSAT imagery and digital data for monitoring desertification indicators in Tunisia was studied. Field data were sampled in Tunisia to estimate mapping accuracy in maps generated through interpretation of LANDSAT false color composites and processing of LANDSAT computer compatible tapes, respectively. Temporal change studies were carried out through geometric registration of computer-classified windows from 1972 to classified data from 1979. Indications of land degradation were noted in some areas. No important differences in results were found between the interpretation approach and the computer processing approach.

  18. TMS communications software. Volume 1: Computer interfaces

    NASA Technical Reports Server (NTRS)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  19. USRA/RIACS

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1992-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

  20. Get SUNREL | Buildings | NREL

    Science.gov Websites

    ; means (a) copies of the computer program commonly known as SUNREL, and all of the contents of files accordance with the Documentation. 1.2 "Computer" means an electronic device that accepts computer." 1.4 "Licensee" means the Individual Licensee. 1.5 "Licensed Single Site"

  1. Load monitoring of aerospace structures utilizing micro-electro-mechanical systems for static and quasi-static loading conditions

    NASA Astrophysics Data System (ADS)

    Martinez, M.; Rocha, B.; Li, M.; Shi, G.; Beltempo, A.; Rutledge, R.; Yanishevsky, M.

    2012-11-01

    The National Research Council Canada (NRC) has worked on the development of structural health monitoring (SHM) test platforms for assessing the performance of sensor systems for load monitoring applications. The first SHM platform consists of a 5.5 m cantilever aluminum beam that provides an optimal scenario for evaluating the ability of a load monitoring system to measure bending, torsion and shear loads. The second SHM platform contains an added level of structural complexity, consisting of aluminum skins with bonded/riveted stringers, typical of an aircraft lower wing structure. These two load monitoring platforms are well characterized and documented, providing loading conditions similar to those encountered during service. In this study, a micro-electro-mechanical system (MEMS) for acquiring data from triads of gyroscopes, accelerometers and magnetometers is described. The system was used to compute changes in angles at discrete stations along the platforms. The angles obtained from the MEMS were used to compute a second-, third- or fourth-order polynomial surface from which displacements at every point could be computed. The use of a new Kalman filter was evaluated for angle estimation, from which displacements in the structure were computed. The outputs of the newly developed algorithms were then compared to the displacements obtained from the linear variable displacement transducers connected to the platforms. The displacement curves were subsequently post-processed, either analytically or with the help of a finite element model of the structure, to estimate strains and loads. The estimated strains were compared with baseline strain gauge instrumentation installed on the platforms. This new approach to load monitoring was able to provide accurate estimates of applied strains and shear loads.
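
    The slope-integration idea can be illustrated in one dimension under a small-angle assumption: fit a polynomial to the measured rotation angles (the slope dw/dx of the deflection) and integrate it analytically, taking zero displacement at the cantilever root. A minimal Python sketch, not the NRC implementation:

        import numpy as np

        def displacement_from_angles(x_stations, angles_rad, order=3):
            """Fit a polynomial to measured rotation angles (slope dw/dx for
            small angles) and integrate it analytically to get the deflection
            w(x), taking w = 0 at the cantilever root."""
            slope = np.polynomial.Polynomial.fit(x_stations, angles_rad, order)
            w = slope.integ()            # antiderivative of the slope
            return w - w(0.0)            # enforce zero displacement at the root

        # Toy cantilever: true deflection w(x) = c*x^2, so slope = 2*c*x
        x = np.linspace(0.0, 5.5, 8)     # sensor stations along a 5.5 m beam
        c = 1e-3
        w = displacement_from_angles(x, 2 * c * x, order=2)
        print(np.allclose(w(x), c * x**2))   # True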

  2. Monitoring by Use of Clusters of Sensor-Data Vectors

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    2007-01-01

    The inductive monitoring system (IMS) is a system of computer hardware and software for automated monitoring of the performance, operational condition, physical integrity, and other aspects of the health of a complex engineering system (e.g., an industrial process line or a spacecraft). The input to the IMS consists of streams of digitized readings from sensors in the monitored system. The IMS determines the type and amount of any deviation of the monitored system from a nominal or normal ("healthy") condition on the basis of a comparison between (1) vectors constructed from the incoming sensor data and (2) corresponding vectors in a database of nominal or normal behavior. The term "inductive" reflects the use of a process reminiscent of traditional mathematical induction to learn about normal operation and build the nominal-condition database. The IMS offers two major advantages over prior computational monitoring systems: The computational burden of the IMS is significantly smaller, and there is no need for abnormal-condition sensor data for training the IMS to recognize abnormal conditions. The figure schematically depicts the relationships among the computational processes effected by the IMS. Training sensor data are gathered during normal operation of the monitored system, during detailed computational simulation of its operation, or both. The training data are formed into vectors that are used to generate the database. The vectors in the database are clustered into regions that represent normal or nominal operation. Once the database has been generated, the IMS compares the vectors of incoming sensor data with vectors representative of the clusters. The monitored system is deemed to be operating normally or abnormally, depending on whether the vector of incoming sensor data is or is not, respectively, sufficiently close to one of the clusters. For this purpose, a distance between two vectors is calculated by a suitable metric (e.g., Euclidean distance) and "sufficiently close" signifies lying at a distance less than a specified threshold value. It must be emphasized that although the IMS is intended to detect off-nominal or abnormal performance or health, it is not necessarily capable of performing a thorough or detailed diagnosis. Limited diagnostic information may be available under some circumstances. For example, the distance of a vector of incoming sensor data from the nearest cluster could serve as an indication of the severity of a malfunction. The identity of the nearest cluster may be a clue as to the identity of the malfunctioning component or subsystem. It is possible to decrease the IMS computation time by use of a combination of cluster-indexing and -retrieval methods. For example, in one method, the distances between each cluster and two or more reference vectors can be used for the purpose of indexing and retrieval. The clusters are sorted into a list according to these distance values, typically in ascending order of distance. When a set of input data arrives and is to be tested, the data are first arranged as an ordered set (that is, a vector). The distances from the input vector to the reference points are computed. The search of clusters from the list can then be limited to those clusters lying within a certain distance range from the input vector; the computation time is reduced by not searching the clusters at a greater distance.
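
    A minimal sketch of this monitoring logic, using scikit-learn's k-means as a stand-in for the IMS clustering step (the actual IMS algorithm differs): cluster nominal training vectors, derive a distance threshold from the training data, and flag any incoming vector that falls too far from every cluster. All data below are invented.

        import numpy as np
        from sklearn.cluster import KMeans

        # Train on nominal-operation sensor vectors only (no failure data needed)
        rng = np.random.default_rng(1)
        nominal = np.vstack([
            rng.normal([20.0, 1.5], 0.2, size=(200, 2)),   # two operating modes
            rng.normal([35.0, 0.8], 0.2, size=(200, 2)),
        ])
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(nominal)

        # Threshold: largest nominal distance to its nearest center, padded 10%
        d_train = np.min(np.linalg.norm(
            nominal[:, None, :] - km.cluster_centers_[None, :, :], axis=2), axis=1)
        threshold = 1.1 * d_train.max()

        def check(sample):
            d = np.min(np.linalg.norm(km.cluster_centers_ - sample, axis=1))
            return "nominal" if d <= threshold else "anomalous"

        print(check(np.array([20.1, 1.4])))   # close to a cluster -> nominal
        print(check(np.array([27.0, 3.0])))   # far from both      -> anomalous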

  3. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  4. Earth physicist describes US nuclear test monitoring system

    NASA Astrophysics Data System (ADS)

    1986-01-01

    The U.S. capabilities to monitor underground nuclear weapons tests in the USSR were examined. American methods used in monitoring underground nuclear tests are enumerated, as are the U.S. technical means of monitoring Soviet nuclear weapons testing and whether it is possible to conduct tests that could not be detected by these means. The worldwide seismic station network in 55 countries available to the U.S. for seismic detection and measurement of underground nuclear explosions is outlined, along with the systems of seismic research observatories in 15 countries and seismic grouping stations in 12 countries, including the advanced computerized data processing capabilities of these facilities. The level of capability of the U.S. seismic system for monitoring nuclear tests is assessed, and other, nonseismic means of monitoring, such as hydroacoustic methods and recording of effects in the atmosphere, ionosphere, and the Earth's magnetic field, are discussed.

  5. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running Python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations, where data require homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.

  6. Electronic fetal monitoring: a Canadian survey.

    PubMed Central

    Davies, B L; Niday, P A; Nimrod, C A; Drake, E R; Sprague, A E; Trépanier, M J

    1993-01-01

    OBJECTIVES: To determine the current status of electronic fetal monitoring (EFM) in Canadian teaching and nonteaching hospitals, to review the medical and nursing standards of practice for EFM and to determine the availability of EFM educational programs. DESIGN: National survey in 1989. PARTICIPANTS: The directors of nursing at the 737 hospitals providing obstetric care were sent a questionnaire and asked to have it completed by the most appropriate staff member. The response rate was 80.5% (593/737); 44 hospitals did not have deliveries in 1988 and were excluded. The remaining hospitals varied in size from 8 to 1800 (mean 162.1) beds and had 1 to 7500 (mean 617.1) births in 1988; 18.8% were teaching hospitals. RESULTS: Of the 549 hospitals 419 (76.3%) reported having at least 1 monitor (range 1 to 30; mean 2.6); the mean number of monitors per hospital was higher in the teaching hospitals than in the nonteaching hospitals (6.2 v. 1.7). Manitoba had the lowest mean number of monitors per hospital (1.1) and Ontario the highest (3.7). In 71.8% of the hospitals with monitors almost all of the obstetric patients were monitored at some point during labour. However, 21.6% of the hospitals with monitors had no policy on EFM practice. The availability of EFM educational programs for physicians and nurses varied according to hospital size, type and region. CONCLUSIONS: Most Canadian hospitals providing obstetric services have electronic fetal monitors and use them frequently. Although substantial research has questioned the benefits of EFM, further definitive research is required. In the meantime, a national committee should be established to develop multidisciplinary guidelines for intrapartum fetal assessment. PMID:8485677

  7. Real-time in vivo rectal wall dosimetry using plastic scintillation detectors for patients with prostate cancer

    NASA Astrophysics Data System (ADS)

    Wootton, Landon; Kudchadker, Rajat; Lee, Andrew; Beddar, Sam

    2014-02-01

    We designed and constructed an in vivo dosimetry system using plastic scintillation detectors (PSDs) to monitor dose to the rectal wall in patients undergoing intensity-modulated radiation therapy for prostate cancer. Five patients were enrolled in an Institutional Review Board-approved protocol for twice weekly in vivo dose monitoring with our system, resulting in a total of 142 in vivo dose measurements. PSDs were attached to the surface of endorectal balloons used for prostate immobilization to place the PSDs in contact with the rectal wall. Absorbed dose was measured in real time and the total measured dose was compared with the dose calculated by the treatment planning system on the daily computed tomographic image dataset. The mean difference between measured and calculated doses for the entire patient population was -0.4% (standard deviation 2.8%). The mean difference between daily measured and calculated doses for each patient ranged from -3.3% to 3.3% (standard deviation ranged from 5.6% to 7.1% for four patients and was 14.0% for the last, for whom optimal positioning of the detector was difficult owing to the patient's large size). Patients tolerated the detectors well and the treatment workflow was not compromised. Overall, PSDs performed well as in vivo dosimeters, providing excellent accuracy, real-time measurement and reusability.

  8. Intracranial EEG fluctuates over months after implanting electrodes in human brain

    NASA Astrophysics Data System (ADS)

    Ung, Hoameng; Baldassano, Steven N.; Bink, Hank; Krieger, Abba M.; Williams, Shawniqua; Vitale, Flavia; Wu, Chengyuan; Freestone, Dean; Nurse, Ewan; Leyde, Kent; Davis, Kathryn A.; Cook, Mark; Litt, Brian

    2017-10-01

    Objective. Implanting subdural and penetrating electrodes in the brain causes acute trauma and inflammation that affect intracranial electroencephalographic (iEEG) recordings. This behavior and its potential impact on clinical decision-making and algorithms for implanted devices have not been assessed in detail. In this study we aim to characterize the temporal and spatial variability of continuous, prolonged human iEEG recordings. Approach. Intracranial electroencephalography from 15 patients with drug-refractory epilepsy, each implanted with 16 subdural electrodes and continuously monitored for an average of 18 months, was included in this study. Time and spectral domain features were computed each day for each channel for the duration of each patient’s recording. Metrics to capture post-implantation feature changes and inflexion points were computed on group and individual levels. A linear mixed model was used to characterize transient group-level changes in feature values post-implantation and independent linear models were used to describe individual variability. Main results. A significant decline in features important to seizure detection and prediction algorithms (mean line length, energy, and half-wave), as well as mean power in the Berger and high gamma bands, was observed in many patients over 100 d following implantation. In addition, spatial variability across electrodes declines post-implantation following a similar timeframe. All selected features decreased by 14-50% in the initial 75 d of recording on the group level, and at least one feature demonstrated this pattern in 13 of the 15 patients. Our findings indicate that iEEG signal features demonstrate increased variability following implantation, most notably in the weeks immediately post-implant. Significance. These findings suggest that conclusions drawn from iEEG, both clinically and for research, should account for spatiotemporal signal variability and that properly assessing the iEEG in patients, depending upon the application, may require extended monitoring.
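
    Two of the features named above have compact standard definitions, and band power can be estimated with Welch's method. A minimal Python sketch for a single channel, with an invented test signal and illustrative parameters (not the study's pipeline):

        import numpy as np
        from scipy.signal import welch

        def line_length(x):
            """Sum of absolute successive differences."""
            return float(np.sum(np.abs(np.diff(x))))

        def energy(x):
            """Sum of squared samples."""
            return float(np.sum(np.square(x)))

        def mean_band_power(x, fs, lo, hi):
            """Mean power spectral density in [lo, hi] Hz via Welch's method."""
            f, pxx = welch(x, fs=fs, nperseg=min(len(x), 1024))
            band = (f >= lo) & (f <= hi)
            return float(np.mean(pxx[band]))

        fs = 500.0                                    # illustrative sampling rate
        t = np.arange(0, 2, 1 / fs)
        x = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(2).standard_normal(t.size)
        print(line_length(x), energy(x), mean_band_power(x, fs, 8, 13))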

  9. A framework for cognitive monitoring using computer game interactions.

    PubMed

    Jimison, Holly B; Pavel, Misha; Bissell, Payton; McKanna, James

    2007-01-01

    Many countries are faced with a rapidly increasing economic and social challenge of caring for their elderly population. Cognitive issues are at the forefront of the list of concerns. People over the age of 75 are at risk for medically related cognitive decline and confusion, and the early detection of cognitive problems would allow for more effective clinical intervention. However, standard cognitive assessments are not diagnostically sensitive and are performed infrequently. To address these issues, we have developed a set of adaptive computer games to monitor cognitive performance in a home environment. Assessment algorithms for various aspects of cognition are embedded in the games. The monitoring of these metrics allows us to detect within subject trends over time, providing a method for the early detection of cognitive decline. In addition, the real-time information on cognitive state is used to adapt the user interface to the needs of the individual user. In this paper we describe the software architecture and methodology for monitoring cognitive performance using data from natural computer interactions in a home setting.

  10. Development of sea ice monitoring with aerial remote sensing technology

    NASA Astrophysics Data System (ADS)

    Jiang, Xuhui; Han, Lei; Dong, Liang; Cui, Lulu; Bie, Jun; Fan, Xuewei

    2014-11-01

    In the northern China Sea district, sea ice disasters are serious every winter, bringing many adverse effects to shipping, offshore oil exploitation, and coastal engineering. In recent years, with the changing global climate, the sea ice situation has become more critical. The monitoring of sea ice plays a very important role in keeping human life and property safe and in supporting marine scientific research. The main methods of monitoring sea ice are shore observation, icebreaker monitoring, satellite remote sensing, and aerial remote sensing. In shore observation, marine station staff use dedicated equipment to monitor the sea ice. In icebreaker monitoring, workers test the properties of the sea ice, such as density, salinity and mechanical properties. In satellite remote sensing, MODIS and NOAA data are processed to produce sea ice charts. In aerial remote sensing, visual observation and airborne remote sensors are used to monitor sea ice. Aerial remote sensing is an important means of sea ice monitoring because of its strong maneuverability, wide coverage, and high resolution. In this paper, several sea ice monitoring methods using aerial remote sensing technology are discussed.

  11. Estimating generalized skew of the log-Pearson Type III distribution for annual peak floods in Illinois

    USGS Publications Warehouse

    Oberg, Kevin A.; Mades, Dean M.

    1987-01-01

    Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map. (Peters-PTT)
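
    All four techniques build on the station skew: the sample skew coefficient of the base-10 logarithms of the annual peaks, with the usual small-sample correction factor n/((n-1)(n-2)). A minimal Python sketch with invented peak-flow values:

        import numpy as np

        def station_skew(peaks):
            """Sample skew coefficient of log10 annual peak flows, with the
            small-sample correction n/((n-1)(n-2))."""
            x = np.log10(np.asarray(peaks, dtype=float))
            n = x.size
            s = x.std(ddof=1)
            return n * np.sum((x - x.mean())**3) / ((n - 1) * (n - 2) * s**3)

        peaks = [12000, 8500, 15200, 9800, 22000, 7600, 13400, 11000, 18700, 9900]
        print(round(station_skew(peaks), 2))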

  12. Real-World Neuroimaging Technologies

    DTIC Science & Technology

    2013-05-10

    The system enables long-term wear of up to 10 consecutive hours of operation. The system's wireless technologies, light weight (200 g), and dry sensor ... brain activity in real-world scenarios. INDEX TERMS: behavioral science, biomarkers, body sensor networks, brain-computer interaction, brain-computer interfaces, data acquisition, electroencephalography monitoring, translational...

  13. Aircraft Alerting Systems Standardization Study. Phase IV. Accident Implications on Systems Design.

    DTIC Science & Technology

    1982-06-01

    ...computing and processing to assimilate and process status information using... provided with capabilities in computing and processing, sensing, interfacing, and controlling and displaying. Computing and Processing - Algorithms... An alerting system to perform a flight status monitor function would require additional sensing, computing and processing, interfacing, and controlling...

  14. Monitoring for Human Papillomavirus Vaccine Impact Among Gay, Bisexual, and Other Men Who Have Sex With Men-United States, 2012-2014.

    PubMed

    Meites, Elissa; Gorbach, Pamina M; Gratzer, Beau; Panicker, Gitika; Steinau, Martin; Collins, Tom; Parrish, Adam; Randel, Cody; McGrath, Mark; Carrasco, Steven; Moore, Janell; Zaidi, Akbar; Braxton, Jim; Kerndt, Peter R; Unger, Elizabeth R; Crosby, Richard A; Markowitz, Lauri E

    2016-09-01

    Gay, bisexual, and other men who have sex with men (MSM) are at high risk for human papillomavirus (HPV) infection; vaccination is recommended for US males, including MSM through age 26 years. We assessed evidence of HPV among vaccine-eligible MSM and transgender women to monitor vaccine impact. During 2012-2014, MSM aged 18-26 years at select clinics completed a computer-assisted self-interview regarding sexual behavior, human immunodeficiency virus (HIV) status, and vaccinations. Self-collected anal swab and oral rinse specimens were tested for HPV DNA (37 types) by L1 consensus polymerase chain reaction; serum was tested for HPV antibodies (4 types) by a multiplexed virus-like particle-based immunoglobulin G direct enzyme-linked immunosorbent assay. Among 922 vaccine-eligible participants, the mean age was 23 years, and the mean number of lifetime sex partners was 37. Among 834 without HIV infection, any anal HPV was detected in 69.4% and any oral HPV in 8.4%, yet only 8.5% had evidence of exposure to all quadrivalent vaccine types. In multivariate analysis, HPV prevalence varied significantly (P < .05) by HIV status, sexual orientation, and lifetime number of sex partners, but not by race/ethnicity. Most young MSM lacked evidence of current or past infection with all vaccine-type HPV types, suggesting that they could benefit from vaccination. The impact of vaccination among MSM may be assessed by monitoring HPV prevalence, including in self-collected specimens. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  15. Stroke patients and their attitudes toward mHealth monitoring to support blood pressure control and medication adherence

    PubMed Central

    Burkett, Nina-Sarena; Ovbiagele, Bruce; Mueller, Martina; Patel, Sachin; Brunner-Jackson, Brenda; Saulson, Raelle; Treiber, Frank

    2016-01-01

    Background Mobile health, or mHealth, has increasingly been signaled as an effective means to expedite communication and improve medical regimen adherence, especially for patients with chronic health conditions such as stroke. However, there is a lack of data on attitudes of stroke patients toward mHealth. Such information will aid in identifying key indicators for feasibility and optimal implementation of mHealth to prevent and/or decrease rates of secondary stroke. Our objective was to ascertain stroke patients’ attitudes toward using mobile phone enabled blood pressure (BP) monitoring and medication adherence and identify factors that modulate these attitudes. Methods Sixty stroke patients received a brief demonstration of mHealth devices to assist with BP control and medication adherence and a survey to evaluate willingness to use this technology. Results The 60 participants had a mean age of 57 years, were 43.3% male, and 53.3% were White. With respect to telecommunication prevalence, 93.3% owned a cellular device and 25% owned a smartphone. About 70% owned a working computer. Regarding attitudes, 85% felt comfortable with a doctor or nurse using mHealth technologies to monitor personal health information, 78.3% believed mHealth would help remind them to follow doctor’s directions, and 83.3% were confident that technology could effectively be used to communicate with health care providers for medical needs. Conclusions Mobile device use is high in stroke patients and they are amenable to mHealth for communication and assistance in adhering to their medical regimens. More research is needed to explore usefulness of this technology in larger stroke populations. PMID:27347490

  16. Beat-to-Beat Blood Pressure Monitor

    NASA Technical Reports Server (NTRS)

    Lee, Yong Jin

    2012-01-01

    This device provides non-invasive beat-to-beat blood pressure measurements and can be worn over the upper arm for prolonged durations. Phase and waveform analyses are performed on filtered proximal and distal photoplethysmographic (PPG) waveforms obtained from the brachial artery. The phase analysis is used primarily for the computation of the mean arterial pressure, while the waveform analysis is used primarily to obtain the pulse pressure. Real-time compliance estimate is used to refine both the mean arterial and pulse pressures to provide the beat-to-beat blood pressure measurement. This wearable physiological monitor can be used to continuously observe the beat-to-beat blood pressure (B3P). It can be used to monitor the effect of prolonged exposures to reduced gravitational environments and the effectiveness of various countermeasures. A number of researchers have used pulse wave velocity (PWV) of blood in the arteries to infer the beat-to-beat blood pressure. There has been documentation of relative success, but a device that is able to provide the required accuracy and repeatability has not yet been developed. It has been demonstrated that an accurate and repeatable blood pressure measurement can be obtained by measuring the phase change (e.g., phase velocity), amplitude change, and distortion of the PPG waveforms along the brachial artery. The approach is based on comparing the full PPG waveform between two points along the artery rather than measuring the time-of-flight. Minimizing the measurement separation and confining the measurement area to a single, well-defined artery allows the waveform to retain the general shape between the two measurement points. This allows signal processing of waveforms to determine the phase and amplitude changes.
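
    One way to realize the waveform-comparison idea is to estimate the transit delay between the proximal and distal PPG signals from the peak of their cross-correlation. The following Python sketch uses a synthetic pulse and is illustrative only, not the device's algorithm.

        import numpy as np

        def ppg_delay(proximal, distal, fs):
            """Transit delay (s) between two PPG waveforms, from the peak of
            their full cross-correlation."""
            p = proximal - np.mean(proximal)
            d = distal - np.mean(distal)
            xcorr = np.correlate(d, p, mode="full")
            lag = int(np.argmax(xcorr)) - (len(p) - 1)   # samples distal lags proximal
            return lag / fs

        fs = 1000.0
        t = np.arange(0, 2, 1 / fs)
        pulse = np.sin(2 * np.pi * 1.2 * t)**2           # crude 72-bpm pulse shape
        distal = np.roll(pulse, 15)                      # distal delayed by 15 samples
        print(ppg_delay(pulse, distal, fs))              # 0.015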

  17. The ``Leakage Current Sentinel'': A novel plug-in socket device for online biomedical equipment electrical safety surveillance

    NASA Astrophysics Data System (ADS)

    Cappa, Paolo; Marinozzi, Franco; Sciuto, Salvatore Andrea

    2000-07-01

    The Leakage Current Sentinel (LCS) has been designed and implemented for the detection of hazardous situations caused by dangerous earth leakage current values in intensive care units and operating theaters. The device, designed and manufactured in full compliance with high-risk environment requirements, is able to monitor the earth leakage current online and detect ground wire faults. Operation utilizes a microammeter with an overall sensitivity of 2.5×10⁴ V/A. In order to assure the reliability of the device in providing alarm signals, the simultaneous presence of absorbed power current is monitored by means of another ammeter with decreased sensitivity (3.0 V/A). The measured root mean square current values are compared with reference values in order to send signals to NAND and OR complementary metal-oxide-semiconductor gates to enable audible and visible alarms according to the possible hazardous cases examined in the article. The final LCS packaging was shaped as a wall socket adapter for common electromedical device power cord plugs, with particular attention to minimizing its dimensions and to providing analog voltage outputs for both measured leakage and power currents, in order to allow automatic data acquisition and computerized hazardous situation management. Finally, a personal-computer-based automatic measuring system has been configured to simultaneously monitor several LCSs installed in the same intensive care unit room and, as a consequence, to distinguish different hazardous scenarios and provide an adequate alert to the clinical personnel, whose final decision is still required. The test results confirm the effectiveness and reliability of the LCS in giving an alert in case of anomalous leakage current values, either in case of a ground fault or in case of a dangerous leakage current.

  18. Early Performance and Safety of the Micra Transcatheter Pacemaker in Pigs.

    PubMed

    Bonner, Matthew; Eggen, Michael; Haddad, Tarek; Sheldon, Todd; Williams, Eric

    2015-11-01

    The Micra® Transcatheter Pacing System (TPS; Medtronic Inc., Minneapolis, MN, USA) is a miniaturized single-chamber pacemaker system that is delivered via catheter through the femoral vein. In this study, the electrical performance of the TPS was compared with that of a traditional leaded pacemaker, and the safety profiles of the two systems were compared by thorough monitoring for a number of adverse events. The TPS was implanted in the right ventricular apex of 10 Yucatan mini pigs; a Medtronic single-lead pacemaker (SLP) was implanted in the right ventricular apex of another 10 pigs and connected to a traditional pacemaker. The electrical performance of all devices was monitored for 12 weeks. The safety profile of each system was characterized using x-ray, computed tomography, ultrasound, blood work, and necropsy to monitor for a variety of adverse events. At implant, the mean pacing thresholds were 0.58 ± 0.17 V @ 0.24 ms and 0.75 ± 0.29 V @ 0.21 ms for the TPS and the SLP, respectively. After 12 weeks, mean thresholds were 0.94 ± 0.46 V and 1.85 ± 0.75 V (P < 0.0001). There were two pulmonary emboli, both small and past the tertiary branch, with one in each arm, and two infections, one in each arm. There were no dislodgements (macro or micro), tissue injuries, tamponade, or valve injuries. Overall, despite the 10-fold size reduction of the Micra TPS, it appears to perform similarly to, and have a similar safety profile as, a traditional pacemaker system. © 2015 Medtronic PLC. Pacing and Clinical Electrophysiology published by Wiley Periodicals, Inc.

  19. The application of simple metrics in the assessment of glycaemic variability.

    PubMed

    Monnier, L; Colette, C; Owens, D R

    2018-03-06

    The assessment of glycaemic variability (GV) remains a subject of debate, with many indices proposed to represent either short-term GV (acute glucose fluctuations) or long-term GV (variations of HbA1c). For the assessment of short-term within-day GV, the coefficient of variation for glucose (%CV), defined as the standard deviation divided by the 24-h mean glucose concentration, is easy to compute; a threshold of 36%, recently adopted by the international consensus on use of continuous glucose monitoring, separates stable from labile glycaemic states. More complex metrics such as the Low Blood Glucose Index (LBGI) and High Blood Glucose Index (HBGI) allow the risks of hypoglycaemic and hyperglycaemic episodes, respectively, to be assessed, although their application in clinical practice is limited by the need for more complex computation. This also applies to other indices of short-term intraday GV, including the mean amplitude of glycemic excursions (MAGE), Schlichtkrull's M-value and CONGA. GV is important clinically because exaggerated glucose fluctuations are associated with an enhanced risk of adverse cardiovascular outcomes, due primarily to hypoglycaemia. In contrast, there is at present no compelling evidence that elevated short-term GV is an independent risk factor for microvascular complications of diabetes. Concerning long-term GV, numerous studies support its association with an enhanced risk of cardiovascular events; however, this association raises the question of whether the impact of long-term variability is simply the consequence of repeated exposure to short-term GV or to ambient chronic hyperglycaemia. The renewed emphasis on glucose monitoring with the introduction of continuous glucose monitoring technologies can benefit from the introduction and application of simple metrics for describing GV along with supporting recommendations. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
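
    The %CV metric is simple enough to state in a few lines of Python; the sketch below applies the 36% consensus threshold quoted above (the sample-standard-deviation choice is an assumption).

      import numpy as np

      def percent_cv(glucose):
          # %CV = standard deviation / 24-h mean glucose, as a percentage.
          g = np.asarray(glucose, dtype=float)
          cv = 100.0 * g.std(ddof=1) / g.mean()
          return cv, ("labile" if cv > 36.0 else "stable")

      # e.g. percent_cv([90, 150, 60, 200, 110, 85]) -> (~44.0, "labile")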

  20. An on-line reactivity and power monitor for a TRIGA reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binney, Stephen E.; Bakir, Alia J.

    1988-07-01

    As the personal computer (PC) exerts an ever greater influence on modern technology, it is reasonable to expect that PCs would at some point be used to interface with TRIGA reactors. A personal computer with a special interface board has been used to monitor key parameters during operation of the Oregon State University TRIGA Reactor (OSTR). A description of the apparatus used and sample results are included.

  1. Classification Models for Pulmonary Function using Motion Analysis from Phone Sensors.

    PubMed

    Cheng, Qian; Juen, Joshua; Bellam, Shashi; Fulara, Nicholas; Close, Deanna; Silverstein, Jonathan C; Schatz, Bruce

    2016-01-01

    Smartphones are ubiquitous, but it is unknown which physiological functions can be monitored at clinical quality. Pulmonary function is a standard measure of health status for cardiopulmonary patients. We have shown that phone sensors can accurately measure walking patterns. Here we show that improved classification models can accurately measure pulmonary function, with the sole inputs being sensor data from carried phones. Twenty-four cardiopulmonary patients performed six minute walk tests in pulmonary rehabilitation at a regional hospital. They carried smartphones running custom software that recorded phone motion. For every patient, every ten-second interval was computed correctly: the trained model reproduced GOLD level 1/2/3, a standard categorization of pulmonary function as measured by spirometry. These results are encouraging for field trials with passive monitors always running in the background. We expect patients can simply carry their phones during daily living, while supporting automatic computation of pulmonary function for health monitoring.
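
    A hedged sketch of the kind of classification pipeline implied above; the feature set, the classifier family, and the synthetic stand-in data are assumptions for illustration, since the paper's own model is not reproduced here (Python, scikit-learn).

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      # Stand-ins for per-interval gait features (e.g., cadence, step-time
      # variability, vertical-acceleration RMS, estimated walk speed).
      X = rng.normal(size=(240, 4))
      y = rng.integers(1, 4, size=240)        # GOLD levels 1/2/3 as labels

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print(cross_val_score(clf, X, y, cv=5).mean())   # chance-level on noise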

  2. Data Auditor: Analyzing Data Quality Using Pattern Tableaux

    NASA Astrophysics Data System (ADS)

    Srivastava, Divesh

    Monitoring databases maintain configuration and measurement tables about computer systems, such as networks and computing clusters, and serve important business functions, such as troubleshooting customer problems, analyzing equipment failures, planning system upgrades, etc. These databases are prone to many data quality issues: configuration tables may be incorrect due to data entry errors, while measurement tables may be affected by incorrect, missing, duplicate and delayed polls. We describe Data Auditor, a tool for analyzing data quality and exploring data semantics of monitoring databases. Given a user-supplied constraint, such as a boolean predicate expected to be satisfied by every tuple, a functional dependency, or an inclusion dependency, Data Auditor computes "pattern tableaux", which are concise summaries of subsets of the data that satisfy or fail the constraint. We discuss the architecture of Data Auditor, including the supported types of constraints and the tableau generation mechanism. We also show the utility of our approach on an operational network monitoring database.
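
    For the boolean-predicate case, a pattern tableau can be approximated with a small grouping computation; the helper below is an illustrative reading of the idea, not Data Auditor's implementation (Python, pandas; the column names in the usage comment are hypothetical).

      import pandas as pd

      def tableau(df, attrs, predicate, min_support=0.05):
          # For each value pattern over `attrs`, report what fraction of the
          # table it covers (support) and how often its rows satisfy the
          # constraint (confidence).
          ok = df.apply(predicate, axis=1)
          out = (df.assign(_ok=ok)
                   .groupby(attrs)["_ok"]
                   .agg(support="size", confidence="mean")
                   .reset_index())
          out["support"] /= len(df)
          return out[out["support"] >= min_support]

      # e.g. tableau(polls, ["router", "interface"],
      #              lambda r: r["rx_bytes"] >= 0)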

  3. 40 CFR 141.211 - Special notice for repeated failure to conduct monitoring of the source water for Cryptosporidium...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... classification or mean Cryptosporidium level must contain the following language: We are required to monitor the... or mean Cryptosporidium level. 141.211 Section 141.211 Protection of Environment ENVIRONMENTAL... Cryptosporidium level. (a) When is the special notice for repeated failure to monitor to be given? The owner or...

  4. The meaning of computers to a group of men who are homeless.

    PubMed

    Miller, Kathleen Swenson; Bunch-Harrison, Stacey; Brumbaugh, Brett; Kutty, Rekha Sankaran; FitzGerald, Kathleen

    2005-01-01

    The purpose of this pilot study was to explore the experience with computers and the meaning of computers to a group of homeless men living in a long-term shelter. This descriptive exploratory study used semistructured interviews with seven men who had been given access to computers and had participated in individually tailored occupation based interventions through a Work Readiness Program. Three themes emerged from analyzing the interviews: access to computers, computers as a bridge to life-skill development, and changed self-perceptions as a result of connecting to technology. Because they lacked computer knowledge and feared failure, the majority of study participants had not sought out computers available through public access. The need for access to computers, the potential use of computers as a medium for intervention, and the meaning of computers to these men who represent the digital divide are described in this study.

  5. Remote maintenance monitoring system

    NASA Technical Reports Server (NTRS)

    Simpkins, Lorenz G. (Inventor); Owens, Richard C. (Inventor); Rochette, Donn A. (Inventor)

    1992-01-01

    A remote maintenance monitoring system retrofits to a given hardware device with a sensor implant which gathers and captures failure data from the hardware device, without interfering with its operation. Failure data is continuously obtained from predetermined critical points within the hardware device, and is analyzed with a diagnostic expert system, which isolates failure origin to a particular component within the hardware device. For example, monitoring of a computer-based device may include monitoring of parity error data therefrom, as well as monitoring power supply fluctuations therein, so that parity error and power supply anomaly data may be used to trace the failure origin to a particular plane or power supply within the computer-based device. A plurality of sensor implants may be retrofit to corresponding plural devices comprising a distributed large-scale system. Transparent interface of the sensors to the devices precludes operative interference with the distributed network. Retrofit capability of the sensors permits monitoring of even older devices having no built-in testing technology. Continuous real time monitoring of a distributed network of such devices, coupled with diagnostic expert system analysis thereof, permits capture and analysis of even intermittent failures, thereby facilitating maintenance of the monitored large-scale system.

  6. Long-Term Structural Health Monitoring System for a High-Speed Railway Bridge Structure.

    PubMed

    Ding, You-Liang; Wang, Gao-Xin; Sun, Peng; Wu, Lai-Yi; Yue, Qing

    2015-01-01

    Nanjing Dashengguan Bridge, which serves as the shared corridor across the Yangtze River for both the Beijing-Shanghai high-speed railway and the Shanghai-Wuhan-Chengdu railway, is the world's first six-track high-speed railway bridge, with the longest span of any such bridge. In order to ensure safety and detect performance deterioration during the bridge's long-term service, a Structural Health Monitoring (SHM) system has been implemented on the bridge through the application of modern techniques in sensing, testing, computing, and network communication. The SHM system includes various sensors as well as corresponding data acquisition and transmission equipment for automatic data collection. Furthermore, an evaluation system of structural safety has been developed for real-time condition assessment of the bridge. Mathematical correlation models describing the overall structural behavior of the bridge can be obtained with the support of the health monitoring system, including cross-correlation models for accelerations, correlation models between temperature and static strains of the steel truss arch, and correlation models between temperature and longitudinal displacements of piers. Some evaluation results using the mean value control chart based on these correlation models are presented in this paper to show the effectiveness of the SHM system in detecting the bridge's abnormal behaviors under varying environmental conditions such as high-speed trains and environmental temperature.
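
    A mean value control chart of the kind mentioned above can be sketched as follows: fit a correlation model on healthy-state baseline data, then flag new samples whose residuals leave a three-sigma band. The synthetic numbers and the three-sigma choice are assumptions for illustration (Python).

      import numpy as np

      rng = np.random.default_rng(1)
      temp = rng.uniform(5, 35, 500)                  # deg C, baseline period
      strain = 2.1 * temp + rng.normal(0, 1.5, 500)   # microstrain (synthetic)

      # Correlation model fitted on the healthy baseline, then 3-sigma limits.
      a, b = np.polyfit(temp, strain, 1)
      resid = strain - (a * temp + b)
      lcl = resid.mean() - 3 * resid.std()
      ucl = resid.mean() + 3 * resid.std()

      def abnormal(t_new, s_new):
          # Flag a new temperature/strain sample whose residual leaves the band.
          r = s_new - (a * t_new + b)
          return r < lcl or r > ucl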

  7. Delamination detection using methods of computational intelligence

    NASA Astrophysics Data System (ADS)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates, so such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates is presented in this paper that relies on methods based on computational intelligence. Typical changes in the observed vibration characteristics (i.e., changes in natural frequencies) are used as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying analysis tool turns out to be computationally expensive. A surrogate-assisted optimization approach is hence introduced to keep the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme, while an improved rate of convergence is achieved using a memetic algorithm. However, building ANN surrogate models usually requires large training datasets; k-means clustering is effectively employed to reduce the size of the datasets. The ANN is also used via inverse modeling to determine the location and size of delaminations from changes in measured natural frequencies. The results clearly highlight the efficiency and the robustness of the approach.
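
    The dataset-reduction step can be illustrated briefly: cluster the candidate samples and train the surrogate only on the points nearest each centroid. The sketch below substitutes scikit-learn's L2-regularized MLP for Bayesian regularization and uses random stand-in data, so it is an analogy rather than the paper's setup (Python).

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 4))   # stand-in: natural-frequency shifts
      y = rng.normal(size=5000)        # stand-in: delamination magnitude

      # K-means picks a compact, representative training subset.
      km = KMeans(n_clusters=400, n_init=10, random_state=0).fit(X)
      idx = [int(np.argmin(np.linalg.norm(X - c, axis=1)))
             for c in km.cluster_centers_]

      surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), alpha=1e-3,
                               max_iter=2000, random_state=0)
      surrogate.fit(X[idx], y[idx])    # cheap stand-in for FE evaluations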

  8. Eyes of Things.

    PubMed

    Deniz, Oscar; Vallez, Noelia; Espinosa-Aranda, Jose L; Rico-Saavedra, Jose M; Parra-Patino, Javier; Bueno, Gloria; Moloney, David; Dehghani, Alireza; Dunne, Aubrey; Pagani, Alain; Krauss, Stephan; Reiser, Ruben; Waeny, Martin; Sorci, Matteo; Llewellynn, Tim; Fedorczak, Christian; Larmoire, Thierry; Herbst, Marco; Seirafi, Andre; Seirafi, Kasra

    2017-05-21

    Embedded systems control and monitor a great deal of our reality. While some "classic" features are intrinsically necessary, such as low power consumption, rugged operating ranges, fast response and low cost, these systems have evolved in the last few years to emphasize connectivity functions, thus contributing to the Internet of Things paradigm. A myriad of sensing/computing devices are being attached to everyday objects, each able to send and receive data and to act as a unique node in the Internet. Apart from the obvious necessity to process at least some data at the edge (to increase security and reduce power consumption and latency), a major breakthrough will arguably come when such devices are endowed with some level of autonomous "intelligence". Intelligent computing aims to solve problems for which no efficient exact algorithm can exist or for which we cannot conceive an exact algorithm. Central to such intelligence is Computer Vision (CV), i.e., extracting meaning from images and video. While not everything needs CV, visual information is the richest source of information about the real world: people, places and things. The possibilities of embedded CV are endless if we consider new applications and technologies, such as deep learning, drones, home robotics, intelligent surveillance, intelligent toys, wearable cameras, etc. This paper describes the Eyes of Things (EoT) platform, a versatile computer vision platform tackling those challenges and opportunities.

  9. Eyes of Things

    PubMed Central

    Deniz, Oscar; Vallez, Noelia; Espinosa-Aranda, Jose L.; Rico-Saavedra, Jose M.; Parra-Patino, Javier; Bueno, Gloria; Moloney, David; Dehghani, Alireza; Dunne, Aubrey; Pagani, Alain; Krauss, Stephan; Reiser, Ruben; Waeny, Martin; Sorci, Matteo; Llewellynn, Tim; Fedorczak, Christian; Larmoire, Thierry; Herbst, Marco; Seirafi, Andre; Seirafi, Kasra

    2017-01-01

    Embedded systems control and monitor a great deal of our reality. While some “classic” features are intrinsically necessary, such as low power consumption, rugged operating ranges, fast response and low cost, these systems have evolved in the last few years to emphasize connectivity functions, thus contributing to the Internet of Things paradigm. A myriad of sensing/computing devices are being attached to everyday objects, each able to send and receive data and to act as a unique node in the Internet. Apart from the obvious necessity to process at least some data at the edge (to increase security and reduce power consumption and latency), a major breakthrough will arguably come when such devices are endowed with some level of autonomous “intelligence”. Intelligent computing aims to solve problems for which no efficient exact algorithm can exist or for which we cannot conceive an exact algorithm. Central to such intelligence is Computer Vision (CV), i.e., extracting meaning from images and video. While not everything needs CV, visual information is the richest source of information about the real world: people, places and things. The possibilities of embedded CV are endless if we consider new applications and technologies, such as deep learning, drones, home robotics, intelligent surveillance, intelligent toys, wearable cameras, etc. This paper describes the Eyes of Things (EoT) platform, a versatile computer vision platform tackling those challenges and opportunities. PMID:28531141

  10. Small-Animal SPECT/CT of the Progression and Recovery of Rat Liver Fibrosis by Using an Integrin αvβ3-targeting Radiotracer.

    PubMed

    Yu, Xinhe; Wu, Yue; Liu, Hao; Gao, Liquan; Sun, Xianlei; Zhang, Chenran; Shi, Jiyun; Zhao, Huiyun; Jia, Bing; Liu, Zhaofei; Wang, Fan

    2016-05-01

    To assess the potential utility of an integrin αvβ3-targeting radiotracer, technetium 99m-PEG4-E[PEG4-cyclo(arginine-glycine-aspartic acid-D-phenylalanine-lysine)]2 ((99m)Tc-3PRGD2), for single photon emission computed tomography (SPECT)/computed tomography (CT) monitoring of the progression and prognosis of liver fibrosis in a rat model. All animal experiments were performed following the protocol approved by the institutional animal care and use committee. (99m)Tc-3PRGD2 was prepared, and longitudinal SPECT/CT was performed to monitor the progression (n = 8) and recovery (n = 5) of liver fibrosis induced in a rat model by means of thioacetamide (TAA) administration. The mean liver-to-background radioactivity per unit volume ratio was analyzed for comparisons between the TAA and control (saline) groups at different stages of liver fibrosis. Data were compared by using Student t and Mann-Whitney tests, and results of SPECT/CT were compared with those of ex vivo biodistribution analysis (n = 5). Accumulation of (99m)Tc-3PRGD2 in the liver increased in proportion to the progression of fibrosis and TAA exposure time; accumulation levels were significantly different between the TAA and control groups as early as week 4 of TAA administration (liver-to-background ratio: 32.30 ± 3.39 vs 19.01 ± 3.31; P = .0002). Ex vivo immunofluorescence staining demonstrated positive expression of integrin αvβ3 on the activated hepatic stellate cells, and the integrin αvβ3 levels in the liver corresponded to the results of SPECT/CT (R(2) = 0.75, P < .0001). (99m)Tc-3PRGD2 uptake in the fibrotic liver decreased after antifibrotic therapy with interferon α2b compared with that in the control group (relative liver-to-background ratio: 0.45 ± 0.05 vs 1.01 ± 0.05; P < .0001) or spontaneous recovery (relative liver-to-background ratio: 0.56 ± 0.06 vs 1.01 ± 0.05; P < .0001). (99m)Tc-3PRGD2 SPECT/CT was successfully used to monitor the progression and recovery of liver fibrosis and shows potential for noninvasive diagnosis of early stage liver fibrosis. © RSNA, 2015. Online supplemental material is available for this article.

  11. RIACS/USRA

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1993-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.

  12. Listening to music with personal listening devices: monitoring the noise dose using a smartphone application.

    PubMed

    Kaplan-Neeman, Ricky; Muchnik, Chava; Amir, Noam

    2017-06-01

    To monitor listening habits with personal listening devices (PLDs) using a smartphone application, and to compare actual listening habits with self-report data. Two stages: a self-report listening habits questionnaire, followed by real-time monitoring of listening habits through a smartphone application. Overall, 117 participants aged 18-34 years (mean 25.5 years) completed the questionnaire; of these, 40 participants (mean age 25.2 years) had their listening habits monitored over two weeks. The main questionnaire findings indicated that most participants reported listening 4-7 days a week, for at least 30 min, at high listening levels with volume control settings at 75-100%. Monitored data showed that actual listening ranged from 1.5 to 6.5 days per week, with a mean continuous listening time of 1.56 h and a mean volume control setting of 7.39 (on a scale of 1-15). Eight participants (22%) exceeded the 100% noise dose at least once during the monitoring period, and one participant (2.7%) exceeded the 100% weekly noise dose. Correlations between actual measurements and self-report data were low to moderate. The results confirm the feasibility of monitoring listening habits with a smartphone application and underscore the need for such a tool to enable safe listening behaviour.
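
    The noise dose referred to above is commonly computed from exposure level and duration; a sketch using NIOSH-style parameters (85 dBA criterion, 3 dB exchange rate) is shown below. These parameter choices are assumptions, since the study's exact dosimetry is not restated here (Python).

      def allowed_hours(level_dba, criterion=85.0, exchange=3.0):
          # Permissible daily exposure time at a given level (8-h criterion).
          return 8.0 / (2.0 ** ((level_dba - criterion) / exchange))

      def daily_dose(exposures):
          # exposures: iterable of (level_dBA, hours); 100.0 = full daily dose.
          return 100.0 * sum(h / allowed_hours(level) for level, h in exposures)

      # e.g. two hours at 94 dBA alone already gives
      # daily_dose([(94, 2)]) == 200.0   (percent of the allowed daily dose)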

  13. An Intelligent CAI Monitor and Generative Tutor. Interim Report.

    ERIC Educational Resources Information Center

    Koffman, Elliot B.; And Others

    Design techniques for generative computer-assisted-instructional (CAI) systems are described in this report. These are systems capable of generating problems for students and of deriving and monitoring solutions; problem difficulty, instructional pace, and depth of monitoring are all individually tailored and parts of the solution algorithms can…

  14. Solar Wind Monitor--A School Geophysics Project

    ERIC Educational Resources Information Center

    Robinson, Ian

    2018-01-01

    Described is an established geophysics project to construct a solar wind monitor based on a nT-resolution fluxgate magnetometer. Low-cost and appropriate from school to university level, it incorporates elements of astrophysics, geophysics, electronics, programming, computer networking and signal processing. The system monitors the earth's field in…

  15. Dynamic data filtering system and method

    DOEpatents

    Bickford, Randall L; Palnitkar, Rahul M

    2014-04-29

    A computer-implemented dynamic data filtering system and method that selectively chooses operating data of a monitored asset to modify or expand the learned scope of an empirical model of the asset's normal operation, while simultaneously rejecting operating data indicative of excessive degradation or impending failure of the asset. The selectively chosen data are then used to adaptively recalibrate the empirical model so that it more accurately tracks aging changes or operating-condition changes of the monitored asset.
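
    One simple reading of this filtering idea is a residual band: small residuals are allowed to recalibrate the model, while large ones are withheld as possible degradation. The band widths below are illustrative placeholders, not the patented method (Python).

      import numpy as np

      def filter_for_recalibration(observed, predicted,
                                   k_adapt=2.0, k_fault=4.0):
          # Split samples by residual size: `adapt` marks data safe to learn
          # from; `fault` marks data that must not be absorbed as "normal".
          resid = np.abs(np.asarray(observed) - np.asarray(predicted))
          sigma = resid.std() + 1e-12
          adapt = resid <= k_adapt * sigma
          fault = resid >= k_fault * sigma
          return adapt, fault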

  16. Structure, function, and behaviour of computational models in systems biology

    PubMed Central

    2013-01-01

    Background Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such “bio-models” necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. Results We present a conceptual framework – the meaning facets – which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model’s components (structure), the meaning of the model’s intended use (function), and the meaning of the model’s dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. Conclusions The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research. PMID:23721297

  17. Simple, inexpensive computerized rodent activity meters.

    PubMed

    Horton, R M; Karachunski, P I; Kellermann, S A; Conti-Fine, B M

    1995-10-01

    We describe two approaches for using obsolescent computers, either an IBM PC XT or an Apple Macintosh Plus, to accurately quantify spontaneous rodent activity, as revealed by continuous monitoring of the spontaneous usage of running activity wheels. Because such computers can commonly be obtained at little or no expense, and other commonly available materials and inexpensive parts can be used, these meters can be built quite economically. Construction of these meters requires no specialized electronics expertise, and their software requirements are simple. The computer interfaces are potentially of general interest, as they could also be used for monitoring a variety of events in a research setting.

  18. Online production validation in a HEP environment

    NASA Astrophysics Data System (ADS)

    Harenberg, T.; Kuhl, T.; Lang, N.; Mättig, P.; Sandhoff, M.; Schwanenberger, C.; Volkmer, F.

    2017-03-01

    In high energy physics (HEP) event simulations, petabytes of data are processed and stored, requiring millions of CPU-years. This enormous demand for computing resources is handled by centers distributed worldwide, which form part of the LHC computing grid. The consumption of such a large amount of resources demands efficient production of simulations and early detection of potential errors. In this article we present a new monitoring framework for grid environments, which polls a measure of data quality during job execution. This online monitoring facilitates the early detection of configuration errors (especially in simulation parameters), and may thus contribute to significant savings in computing resources.

  19. Angelcare mobile system: homecare patient monitoring using bluetooth and GPRS.

    PubMed

    Ribeiro, Anna G D; Maitelli, Andre L; Valentim, Ricardo A M; Brandao, Glaucio B; Guerreiro, Ana M G

    2010-01-01

    Rapid technological progress has brought new paradigms to computing, and with them many benefits to society. The paradigm of ubiquitous computing applies computing to people's daily lives without being noticed, combining several existing technologies such as wireless communications and sensors. Many of these benefits have reached the medical area, bringing new methods for surgery, appointments and examinations. This work presents telemedicine software that adds the idea of ubiquity to the medical area, innovating the relation between doctor and patient, and it brings security and confidence to a patient being monitored in homecare.

  20. Design of Remote GPRS-based Gas Data Monitoring System

    NASA Astrophysics Data System (ADS)

    Yan, Xiyue; Yang, Jianhua; Lu, Wei

    2018-01-01

    In order to solve the problem of remote data transmission from gas flowmeters and to enable unattended operation on site, an unattended remote monitoring system for gas data based on GPRS is designed in this paper. The slave computer of this system uses an embedded microprocessor to read data from the gas flowmeter over the RS-232 bus and transfers it to the host computer through a DTU. On the host computer, a VB program dynamically binds the Winsock control to receive and parse the data. Using dynamic data exchange, the Kingview configuration software provides history trend curves, real-time trend curves, alarms, printing, web browsing and other functions.
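
    On the slave side, reading a flowmeter record over a serial line might look like the sketch below; the port name, baud rate, and line-oriented framing are assumptions, since the paper does not specify them (Python, pyserial).

      import serial  # pyserial

      with serial.Serial("/dev/ttyS0", baudrate=9600, timeout=1.0) as port:
          raw = port.readline()                        # one flowmeter record
          reading = raw.decode("ascii", errors="replace").strip()
          print(reading)   # the real system would forward this to the DTU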

  1. Statistical Model Applied to NetFlow for Network Intrusion Detection

    NASA Astrophysics Data System (ADS)

    Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.

    Computers and network services are now a guaranteed presence in many places. This growth has been accompanied by a growth in illicit events, and computer and network security has therefore become an essential point in any computing environment. Many methodologies have been created to identify these events; however, with the increase in users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a timeframe suited to the application.
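
    A minimal statistical rule of the kind the methodology suggests: flag polling intervals whose NetFlow record count deviates strongly from a trailing-window mean. The window length and z-score threshold are illustrative assumptions (Python).

      import numpy as np

      def anomalies(flow_counts, window=24, z_max=3.0):
          # Returns (interval index, z-score) pairs that breach the threshold.
          x = np.asarray(flow_counts, dtype=float)
          hits = []
          for i in range(window, len(x)):
              hist = x[i - window:i]
              z = (x[i] - hist.mean()) / (hist.std() + 1e-9)
              if abs(z) > z_max:
                  hits.append((i, float(z)))
          return hits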

  2. 48 CFR 970.5227-1 - Rights in data-facilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Computer data bases, as used in this clause, means a collection of data in a form capable of, and for the purpose of, being stored in, processed, and operated on by a computer. The term does not include computer software. (2) Computer software, as used in this clause, means (i) computer programs which are data...

  3. 48 CFR 970.5227-1 - Rights in data-facilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) Computer data bases, as used in this clause, means a collection of data in a form capable of, and for the purpose of, being stored in, processed, and operated on by a computer. The term does not include computer software. (2) Computer software, as used in this clause, means (i) computer programs which are data...

  4. 48 CFR 970.5227-1 - Rights in data-facilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) Computer data bases, as used in this clause, means a collection of data in a form capable of, and for the purpose of, being stored in, processed, and operated on by a computer. The term does not include computer software. (2) Computer software, as used in this clause, means (i) computer programs which are data...

  5. 48 CFR 970.5227-1 - Rights in data-facilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Computer data bases, as used in this clause, means a collection of data in a form capable of, and for the purpose of, being stored in, processed, and operated on by a computer. The term does not include computer software. (2) Computer software, as used in this clause, means (i) computer programs which are data...

  6. Rotating Desk for Collaboration by Two Computer Programmers

    NASA Technical Reports Server (NTRS)

    Riley, John Thomas

    2005-01-01

    A special-purpose desk has been designed to facilitate collaboration by two computer programmers sharing one desktop computer or computer terminal. The impetus for the design is a trend toward what is known in the software industry as extreme programming, an approach intended to ensure high quality without sacrificing the quantity of computer code produced. Programmers working in pairs is a major feature of extreme programming. The present desk design minimizes the stress of the collaborative work environment. It supports both quality and work flow by making it unnecessary for programmers to get in each other's way. The desk (see figure) includes a rotating platform that supports a computer video monitor, keyboard, and mouse. The desk enables one programmer to work on the keyboard for any amount of time and then the other programmer to take over without breaking the train of thought. The rotating platform is supported by a turntable bearing that, in turn, is supported by a weighted base. The platform contains weights to improve its balance. The base includes a stand for a computer, and is shaped and dimensioned to provide adequate foot clearance for both users. The platform includes an adjustable stand for the monitor, a surface for the keyboard and mouse, and spaces for work papers, drinks, and snacks. The heights of the monitor, keyboard, and mouse are set to minimize stress. The platform can be rotated through an angle of 40° to give either user a straight-on view of the monitor and full access to the keyboard and mouse. Magnetic latches keep the platform preferentially at either of the two extremes of rotation. To switch between users, one simply grabs the edge of the platform and pulls it around; the magnetic latch is easily released, allowing the platform to rotate freely to the position of the other user.

  7. Blood Glucose Monitoring Devices

    MedlinePlus

    ... of interferences, ability to transmit data to a computer, cost of the meter, cost of the test ... Performance: FDA expands indication for continuous glucose monitoring system, first to replace fingerstick testing for diabetes treatment ...

  8. Oxygen monitor for semi-closed rebreathers: design and use for estimating metabolic oxygen consumption

    NASA Astrophysics Data System (ADS)

    Clarke, John R.; Southerland, David

    1999-07-01

    Semi-closed circuit underwater breathing apparatus (UBA) provide a constant flow of mixed gas containing oxygen and nitrogen or helium to a diver. However, as a diver's work rate and metabolic oxygen consumption vary, the oxygen percentage within the UBA can change dramatically. Hence, even a resting diver can become hyperoxic and be at risk for oxygen-induced seizures; conversely, a hard-working diver can become hypoxic and lose consciousness. Unfortunately, current semi-closed UBA do not contain oxygen monitors. We describe a simple oxygen monitoring system designed and prototyped at the Navy Experimental Diving Unit. The main monitor components include a PIC microcontroller, analog-to-digital converter, bicolor LED, and oxygen sensor. The LED, affixed to the diver's mask, is steady green if the oxygen partial pressure is within pre-defined acceptable limits. A more advanced monitor with a depth sensor and additional computational circuitry could be used to estimate metabolic oxygen consumption (VO2). The computational algorithm uses the oxygen partial pressure and the diver's depth to compute VO2 from the steady-state solution of the differential equation describing oxygen concentrations within the UBA. Consequently, dive transients induce errors in the VO2 estimate. To evaluate these errors, we used a computer simulation of semi-closed circuit UBA dives to generate transient-rich data as input to the estimation algorithm. A step change in simulated VO2 elicits a monoexponential change in the estimated VO2 with a time constant of 5 to 10 minutes. Methods for predicting error and providing a probable error indication to the diver are presented.
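
    As an illustration of the steady-state idea: for a constant-mass-flow semi-closed loop, an inert-gas balance gives VO2 = Qs(Fs - Fc)/(1 - Fc), where Qs and Fs are the supply flow rate and oxygen fraction, and Fc is the loop oxygen fraction inferred from the measured PO2 and the ambient pressure at depth. The sketch below encodes that relation; it is a simplified derivation under stated assumptions, not the NEDU algorithm (Python).

      def vo2_steady_state(q_supply_lpm, f_supply, po2_ata, depth_msw):
          # Loop O2 fraction from measured PO2 and ambient pressure at depth
          # (seawater: roughly one extra atmosphere per 10 msw).
          p_amb_ata = 1.0 + depth_msw / 10.0
          f_loop = po2_ata / p_amb_ata
          # Inert-gas balance Qs*(1-Fs) = Qvent*(1-Fc) eliminates the vent flow.
          return q_supply_lpm * (f_supply - f_loop) / (1.0 - f_loop)

      # e.g. 12 L/min of 40% O2 with loop PO2 = 1.0 ata at 20 msw (Fc = 1/3)
      # gives vo2_steady_state(12, 0.40, 1.0, 20) ≈ 1.2 L/min.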

  9. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    PubMed

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    Existing multiple physiological parameter real-time monitoring systems have problems such as insufficient server capacity for physiological data storage and analysis, so that data consistency cannot be guaranteed, and poor real-time performance, among other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameters with clustered back-end data storage and processing based on cloud computing. Our studies introduced batch processing for longitudinal analysis of patients' historical data, and addressed the resource virtualization of the IaaS layer of the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of data streams at the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result is real-time transmission, storage, and analysis of large amounts of physiological information. Simulation test results showed that the remote multiple physiological parameter monitoring system based on the cloud platform has obvious advantages in processing time and load balancing over the traditional server model. This architecture solves the problems of long turnaround time, poor real-time analysis performance, and lack of extensibility that exist in traditional remote medical services, providing technical support for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring with multiple physiological parameter wireless monitoring.

  10. Clinical significance of sleep bruxism on several occlusal and functional parameters.

    PubMed

    Ommerborn, Michelle A; Giraki, Maria; Schneider, Christine; Fuck, Lars Michael; Zimmer, Stefan; Franz, Matthias; Raab, Wolfgang Hans-michael; Schaefer, Ralf

    2010-10-01

    The aim of this study was to evaluate the association between various functional and occlusal parameters and sleep bruxism. Thirty-nine (39) sleep bruxism patients and 30 controls participated in this investigation. The assessment of sleep bruxism was performed using the Bruxcore Bruxism-Monitoring Device (BBMD) combined with a new computer-based analyzing method. Sixteen functional and/or occlusal parameters were recorded. The results demonstrated only one significant group difference, in the length of the slide from centric occlusion to maximum intercuspation: a mean slide of 0.95 mm in the sleep bruxism group versus 0.42 mm in the control group (Mann-Whitney U test; p < 0.003). The results suggest that this slightly pronounced slide could be of clinical importance in the development of increased wear facets in patients with current sleep bruxism activity. Following further evaluation including polysomnographic recordings, the BBMD combined with this new analyzing technique seems to be a clinically feasible instrument that allows the practitioner to quantify abrasion over a short period.

  11. Visual communication interface for severe physically disabled patients

    NASA Astrophysics Data System (ADS)

    Savino, M. J.; Fernández, E. A.

    2007-11-01

    In recent years, several interfaces have been developed to allow communication by patients suffering serious physical disabilities. In this work, a computer-based communication interface is presented. It was designed for patients who can use neither their hands nor their voice, but who can communicate through their eyes. The system monitors eye movements by means of a webcam; an Artificial Neural Network then identifies, from the eye positions, the position on the screen the user is looking at. In this way the user can control a virtual keyboard on a screen that allows him to write and browse the system, and enables him to send e-mails and SMS, activate video/music programs, and control environmental devices. A patient was simulated to evaluate the versatility of the system; its operation was satisfactory and allowed the system's potential to be assessed. The system can be built from low-cost elements that are easily found on the market.

  12. Continuous monitoring of large civil structures using a digital fiber optic motion sensor system

    NASA Astrophysics Data System (ADS)

    Hodge, Malcolm H.; Kausel, Theodore C., Jr.

    1998-03-01

    There is no single attribute which can always predict structural deterioration. Accordingly, we have developed a scheme for monitoring a wide range of incipient deterioration parameters, all based on a single motion sensor concept. In this presentation, we describe how an intrinsically low-power-consumption fiber optic harness can be permanently deployed to poll an array of optical sensors. The function and design of these simple, durable, and naturally digital sensors is described, along with the manner in which they have been configured to collect information on changes in the most important structural aspects. The SIMS system, directed primarily towards bridges, is designed to interrogate each sensor up to five thousand times per second for the life of the structure and to report sensor data back to a remote computer base for current and long-term analysis. By suitably modifying the actuation of this very precise motion sensor, SIMS is able to track bridge deck deflection and vibration, expansion joint travel, concrete and rebar corrosion, pothole development, pier scour and tilt. Other sensors will track bolt clamp load, cable tension, and metal fatigue. All of these data are received within microseconds, which means that appropriate computer algorithms can correlate one sensor with other sensors in real time. This internal verification feature automatically enhances confidence in the system's predictive ability and alerts the user to any anomalous behavior.

  13. Utilization of thermoluminescent dosimetry in total skin electron beam radiotherapy of mycosis fungoides.

    PubMed

    Antolak, J A; Cundiff, J H; Ha, C S

    1998-01-01

    The purpose of this report is to discuss the utilization of thermoluminescent dosimetry (TLD) in total skin electron beam (TSEB) radiotherapy to: (a) compare patient dose distributions for similar techniques on different machines, (b) confirm beam calibration and monitor unit calculations, (c) provide data for making clinical decisions, and (d) study reasons for variations in individual dose readings. We report dosimetric results for 72 cases of mycosis fungoides, using similar irradiation techniques on two different linear accelerators. All patients were treated using a modified Stanford 6-field technique. In vivo TLD was done on all patients, and the data for all patients treated on both machines were collected into a database for analysis. Means and standard deviations (SDs) were computed for all locations. Scatter plots of doses vs. height, weight, and obesity index were generated, and correlation coefficients with these variables were computed. The TLD results show that our current TSEB implementation is dosimetrically equivalent to the previous implementation, and that our beam calibration technique and monitor unit calculation are accurate. Correlations with obesity index were significant at several sites. Individual TLD results allow us to customize the boost treatment for each patient, in addition to revealing patient positioning problems and/or systematic variations in dose caused by patient variability. The data agree well with previously published TLD results for similar TSEB techniques. TLD is an important part of the treatment planning and quality assurance programs for TSEB, and routine use of TLD measurements for TSEB is recommended.

  14. A model-based approach to monitor complex road-vehicle interactions through first principles

    NASA Astrophysics Data System (ADS)

    Chakravarty, T.; Srinivasarengan, K.; Roy, S.; Bilal, S.; Balamuralidhar, P.

    2013-02-01

    The increasing availability of portable computing devices and their interaction with physical systems call for compact models and simulations to understand and characterize such interactions. For instance, monitoring a road's grade using an accelerometer mounted inside a moving ground vehicle is an emerging trend in city administration. The focus has largely been on developing algorithms to extract meaning from such data, but experimentation alone cannot provide an exhaustive analysis of all scenarios and their characteristics. We propose an approach to modeling these interactions between physical systems and gadgets through first principles, in a compact manner, focusing on a limited number of interactions. We derive a model of a vehicle's interaction with a pothole on a road, a specific case, but allow for selectable car parameters such as natural damped frequency and tire size, thus generalizing it. Different road profiles are also created to represent rough roads with sharp irregularities. These act as excitation to the moving vehicle, and the interaction is computed to determine the vertical/lateral vibration of the system (i.e., the vehicle with sensors) using joint time-frequency signal analysis methods. The simulation is compared with experimental data for validation. We show how simulation of such models can reveal different characteristics of the interaction through analysis of the frequency spectrum. It is envisioned that the proposed models will be enriched further as large sets of real-life data are captured and appropriate sensitivity analysis is done.
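
    A compact first-principles stand-in for this road-vehicle interaction is a quarter-car-style oscillator driven by a pothole profile; the mass, stiffness, damping, and pothole geometry below are invented for illustration, and damping is applied to body velocity only as a simplification (Python, SciPy).

      import numpy as np
      from scipy.integrate import odeint

      m, k, c = 400.0, 2.0e4, 1.5e3     # kg, N/m, N*s/m (illustrative)
      v = 10.0                          # vehicle speed, m/s

      def road(x):
          # Rectangular pothole, 5 cm deep, spanning 10.0-10.5 m along the road.
          return -0.05 if 10.0 <= x <= 10.5 else 0.0

      def deriv(state, t):
          z, zdot = state
          r = road(v * t)               # road height under the wheel
          return [zdot, (-k * (z - r) - c * zdot) / m]

      t = np.linspace(0.0, 3.0, 3000)
      z = odeint(deriv, [0.0, 0.0], t)[:, 0]   # body vertical displacement
      # A spectrogram of z (joint time-frequency analysis) would show the
      # ring-down at the natural damped frequency after the pothole.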

  15. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  16. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  17. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  18. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  19. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  20. The Electronic Supervisor: New Technology, New Tensions.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    Computer technology has made it possible for employers to collect and analyze management information about employees' work performance and equipment use. There are three main tools for supervising office activities. Computer-based (electronic) monitoring systems automatically record statistics about the work of employees using computer or…
