Sample records for based record analysis

  1. [Application of the computer-based respiratory sound analysis system based on Mel-frequency cepstral coefficient and dynamic time warping in healthy children].

    PubMed

    Yan, W Y; Li, L; Yang, Y G; Lin, X L; Wu, J Z

    2016-08-01

    We designed a computer-based respiratory sound analysis system to identify pediatric normal lung sounds, and we sought to verify its validity. First, we downloaded standard lung sounds from a network database (website: http://www.easyauscultation.com/lung-sounds-reference-guide) and recorded 3 samples of abnormal lung sounds (rhonchi, wheeze and crackles) from three patients of the Department of Pediatrics, the First Affiliated Hospital of Xiamen University. We regarded such lung sounds as "reference lung sounds". The "test lung sounds" were recorded from 29 children from the Kindergarten of Xiamen University. We recorded lung sounds with a portable electronic stethoscope, and valid lung sounds were selected by manual identification. We introduced Mel-frequency cepstral coefficients (MFCC) to extract lung sound features and dynamic time warping (DTW) for signal classification. We had 39 standard lung sounds and recorded 58 test lung sounds. The computer-based respiratory sound analysis system performed 58 lung-sound recognitions, with 52 correct and 6 incorrect identifications; accuracy was 89.7%. Based on MFCC and DTW, our computer-based respiratory sound analysis system can effectively identify healthy lung sounds of children (accuracy of 89.7%), demonstrating the reliability of the lung sound analysis system.
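    The MFCC-plus-DTW matching described above can be sketched as follows. This is a minimal illustration, not the authors' system: it assumes MFCC feature sequences have already been extracted (e.g., one feature vector per frame), and the `dtw_distance`/`classify` names and toy sequences are hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences.

    a, b: arrays of shape (n_frames, n_features), e.g. per-frame MFCC vectors.
    Returns the accumulated cost of the optimal alignment path.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(test_seq, references):
    """Nearest-reference classification: return the label of the
    reference template with the smallest DTW distance."""
    return min(references, key=lambda label: dtw_distance(test_seq, references[label]))
```

    A test recording would then be labeled by comparing it against each reference template and taking the closest match, which is how a DTW-based recognizer typically reaches an accuracy figure like the 89.7% reported.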

  2. Identification of fidgety movements and prediction of CP by the use of computer-based video analysis is more accurate when based on two video recordings.

    PubMed

    Adde, Lars; Helbostad, Jorunn; Jensenius, Alexander R; Langaas, Mette; Støen, Ragnhild

    2013-08-01

    This study evaluates the role of postterm age at assessment and the use of one or two video recordings for the detection of fidgety movements (FMs) and prediction of cerebral palsy (CP) using computer vision software. Recordings between 9 and 17 weeks postterm age from 52 preterm and term infants (24 boys, 28 girls; 26 born preterm) were used. Recordings were analyzed using computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analysis. Sensitivities, specificities, and area under the curve were estimated for the first and second recording, or a mean of both. FMs were classified based on the Prechtl approach of general movement assessment. CP status was reported at 2 years. Nine children developed CP, and all of their recordings showed absent FMs. The mean variability of the centroid of motion (CSD) from two recordings was more accurate than using only one recording, and identified all children who were diagnosed with CP at 2 years. Age at assessment did not influence the detection of FMs or prediction of CP. The accuracy of computer vision techniques in identifying FMs and predicting CP based on two recordings should be confirmed in future studies.

  3. [Multi-channel in vivo recording techniques: signal processing of action potentials and local field potentials].

    PubMed

    Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian

    2014-06-25

    Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFP) simultaneously. One of the key points of the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We introduce data processing approaches for action potentials and LFP based on the original data collected through a multi-channel recording system. Action potential signals are high-frequency signals, hence a high sampling rate of 40 kHz is normally chosen for recording. Based on waveforms of extracellularly recorded action potentials, tetrode technology combined with principal component analysis can be used to discriminate spiking signals from spatially distinct neurons, in order to obtain accurate single-neuron spiking activity. LFPs are low-frequency signals (below 300 Hz), hence a sampling rate of 1 kHz is used for LFPs. Digital filtering is required for LFP analysis to isolate different frequency oscillations, including theta oscillation (4-12 Hz), which is dominant in active exploration and rapid-eye-movement (REM) sleep; gamma oscillation (30-80 Hz), which accompanies theta oscillation during cognitive processing; and high-frequency ripple oscillation (100-250 Hz) in awake immobility and the slow-wave-sleep (SWS) state in the rodent hippocampus. For the obtained signals, common data post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
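    The band isolation described above (theta, gamma, ripple) can be sketched with a simple FFT-masking filter. This is a minimal illustration on a synthetic LFP, not the digital filters used in practice (which would typically be zero-phase IIR/FIR filters); the `band_filter` name and the test signal are hypothetical.

```python
import numpy as np

def band_filter(x, fs, f_lo, f_hi):
    """Zero out FFT bins outside [f_lo, f_hi] Hz and invert the transform."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

fs = 1000.0                                   # LFP sampling rate, 1 kHz
t = np.arange(0, 2.0, 1.0 / fs)
# synthetic LFP: an 8 Hz theta component plus a 60 Hz gamma component
lfp = np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
theta = band_filter(lfp, fs, 4.0, 12.0)       # theta band (4-12 Hz)
gamma = band_filter(lfp, fs, 30.0, 80.0)      # gamma band (30-80 Hz)
```

    With integer numbers of cycles in the window, each filtered trace recovers its corresponding component almost exactly.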

  4. SLAMMER: Seismic LAndslide Movement Modeled using Earthquake Records

    USGS Publications Warehouse

    Jibson, Randall W.; Rathje, Ellen M.; Jibson, Matthew W.; Lee, Yong W.

    2013-01-01

    This program is designed to facilitate conducting sliding-block analysis (also called permanent-deformation analysis) of slopes in order to estimate slope behavior during earthquakes. The program allows selection from among more than 2,100 strong-motion records from 28 earthquakes and allows users to add their own records to the collection. Any number of earthquake records can be selected using a search interface that selects records based on desired properties. Sliding-block analyses, using any combination of rigid-block (Newmark), decoupled, and fully coupled methods, are then conducted on the selected group of records, and results are compiled in both graphical and tabular form. Simplified methods for conducting each type of analysis are also included.
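    The rigid-block (Newmark) method mentioned above can be sketched as a simple time-stepping integration: the block accelerates only while ground acceleration exceeds the slope's critical acceleration, and the excess velocity is integrated into permanent displacement. This is a one-directional-sliding simplification on a synthetic pulse, not SLAMMER's implementation; the `newmark_displacement` name and inputs are hypothetical.

```python
import numpy as np

def newmark_displacement(accel_g, dt, ac_g, g=9.81):
    """Rigid-block (Newmark) sliding displacement, one-directional sliding.

    accel_g : ground acceleration time series in units of g
    dt      : time step (s)
    ac_g    : critical (yield) acceleration of the slope in g
    Returns cumulative block displacement in meters.
    """
    v = 0.0   # relative block velocity (m/s)
    d = 0.0   # cumulative displacement (m)
    for a in accel_g:
        if v > 0.0 or a > ac_g:
            # block slides: net acceleration is driving minus resisting
            v += (a - ac_g) * g * dt
            v = max(v, 0.0)   # block cannot slide uphill in this simple model
            d += v * dt
    return d
```

    Shaking below the critical acceleration produces no displacement; stronger records produce larger permanent displacements, which is the quantity the sliding-block analysis reports.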

  5. Installation-Restoration Program. Preliminary assessment; records search for the 155th Tactical Reconnaissance Group, Nebraska Air National Guard, Lincoln Municipal Airport, Lincoln, Nebraska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-11-01

    The Hazardous Materials Technical Center (HMTC) was retained in May 1986 to conduct the Installation-Restoration Program (IRP) Preliminary Assessment (PA) - Records Search for the 155th Tactical Reconnaissance Group (TRG), Nebraska Air National Guard, Lincoln Municipal Airport, Lincoln, Nebraska (hereinafter referred to as the Base). The Records Search included: an onsite visit including interviews with 19 Base personnel conducted by HMTC personnel on 21-23 May 1986; the acquisition and analysis of pertinent information and records on hazardous materials use and hazardous-waste generation and disposal at the Base; the acquisition and analysis of available geologic, hydrologic, meteorologic, and environmental data from pertinent Federal, State, and local agencies; and the identification of sites on the Base that may be potentially contaminated with hazardous materials/hazardous wastes (HM/HW).

  6. A method for environmental acoustic analysis improvement based on individual evaluation of common sources in urban areas.

    PubMed

    López-Pacheco, María G; Sánchez-Fernández, Luis P; Molina-Lozano, Herón

    2014-01-15

    Noise levels of common sources such as vehicles, whistles, sirens, car horns and crowd sounds are mixed in urban soundscapes. Nowadays, environmental acoustic analysis is performed on mixture signals recorded by monitoring systems. These mixed signals hinder the individual analysis that is useful for taking actions to reduce and control environmental noise. This paper aims at separating individual noise sources from recorded mixtures in order to evaluate the noise level of each estimated source. A method based on blind deconvolution and blind source separation in the wavelet domain is proposed. This approach provides a basis for improving results obtained in the monitoring and analysis of common noise sources in urban areas. The method is validated through experiments based on knowledge of the predominant noise sources in urban soundscapes. Actual recordings of common noise sources are used to acquire mixture signals with a microphone array in semi-controlled environments. The developed method demonstrated substantial performance improvements in the identification, analysis and evaluation of common urban sources. © 2013 Elsevier B.V. All rights reserved.

  7. Instructional analysis of lecture video recordings and its application for quality improvement of medical lectures.

    PubMed

    Baek, Sunyong; Im, Sun Ju; Lee, Sun Hee; Kam, Beesung; Yune, So Joung; Lee, Sang Soo; Lee, Jung A; Lee, Yuna; Lee, Sang Yeoup

    2011-12-01

    The lecture is a technique for delivering knowledge and information cost-effectively to large medical classes in medical education. The aim of this study was to analyze teaching quality, based on triangle analysis of video recordings of medical lectures, to strengthen teaching competency in medical school. The subjects of this study were 13 medical professors who taught 1st- and 2nd-year medical students and agreed to a triangle analysis of video recordings of their lectures. We first performed triangle analysis, which consisted of a professional analysis of video recordings, self-assessment by teaching professors, and feedback from students, and the data were crosschecked by five school consultants for reliability and consistency. Most of the distress that teachers experienced during the lecture occurred in uniform teaching environments, such as larger lecture classes. Larger lectures that primarily used PowerPoint as a medium to deliver information resulted in poor interaction with students. Other distressing factors in the lecture were personal characteristics and a lack of strategic faculty development. Triangle analysis of video recordings of medical lectures gives teachers an opportunity and motivation to improve teaching quality. Faculty development and various improvement strategies, based on this analysis, are expected to help teachers succeed as effective, efficient, and attractive lecturers while improving the quality of larger lecture classes.

  8. Validation of a Smartphone Image-Based Dietary Assessment Method for Pregnant Women

    PubMed Central

    Ashman, Amy M.; Collins, Clare E.; Brown, Leanne J.; Rae, Kym M.; Rollo, Megan E.

    2017-01-01

    Image-based dietary records could lower participant burden associated with traditional prospective methods of dietary assessment. They have been used in children, adolescents and adults, but have not been evaluated in pregnant women. The current study evaluated the relative validity of the DietBytes image-based dietary assessment method for assessing energy and nutrient intakes. Pregnant women collected image-based dietary records (via a smartphone application) of all food, drinks and supplements consumed over three non-consecutive days. Intakes from the image-based method were compared to intakes collected from three 24-h recalls, taken on random days, once per week, in the weeks following the image-based record. Data were analyzed using nutrient analysis software. Agreement between methods was ascertained using Pearson correlations and Bland-Altman plots. Twenty-five women (27 recruited, one withdrew, one incomplete), median age 29 years, 15 primiparas, eight Aboriginal Australians, completed image-based records for analysis. Significant correlations between the two methods were observed for energy, macronutrients and fiber (r = 0.58–0.84, all p < 0.05), and for micronutrients both including (r = 0.47–0.94, all p < 0.05) and excluding (r = 0.40–0.85, all p < 0.05) supplements in the analysis. Bland-Altman plots confirmed acceptable agreement with no systematic bias. The DietBytes method demonstrated acceptable relative validity for assessment of nutrient intakes of pregnant women. PMID:28106758
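    The Bland-Altman agreement check mentioned above reduces to the mean difference between paired measurements and its 95% limits of agreement. A minimal sketch with hypothetical paired intakes (not the study's data) follows; the `bland_altman` name is an assumption.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for two paired measurement methods.

    Returns (bias, lower_loa, upper_loa): the mean difference and the
    95% limits of agreement (bias +/- 1.96 * SD of the differences).
    """
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)   # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

    A plot of the differences against the pairwise means, with these three horizontal lines, is the familiar Bland-Altman plot; "no systematic bias" corresponds to a bias near zero with most points inside the limits.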

  9. The Use of Continuous Wavelet Transform Based on the Fast Fourier Transform in the Analysis of Multi-channel Electrogastrography Recordings.

    PubMed

    Komorowski, Dariusz; Pietraszek, Stanislaw

    2016-01-01

    This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of observation (analysis). Wavelet analysis allows long time windows to be used where more precise low-frequency information is needed, and shorter ones where high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method yields results for relatively long EGG records in a fairly short time, much faster than the classical methods based on running spectrum analysis (RSA). In this study the authors demonstrate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are illustrated with an analysis of four-channel EGG recordings performed for a non-caloric meal.
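    The FFT-based CWT idea above can be sketched as follows: for each analysis frequency, multiply the signal's spectrum by the spectrum of a wavelet and inverse-transform, exploiting circular convolution. This sketch uses a Morlet wavelet and a synthetic signal at a typical EGG dominant frequency (~3 cycles/min, i.e., 0.05 Hz); it is an assumption about the general technique, not the authors' exact implementation.

```python
import numpy as np

def cwt_fft(x, fs, freqs, w0=6.0):
    """Continuous wavelet transform via the FFT (circular convolution).

    For each analysis frequency, the signal spectrum is multiplied by the
    spectrum of a Morlet wavelet and inverse-transformed, which is much
    faster than direct time-domain convolution for long records.
    """
    n = len(x)
    X = np.fft.fft(x)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / fs)   # angular frequencies
    out = np.empty((len(freqs), n), dtype=complex)
    for k, f in enumerate(freqs):
        scale = w0 / (2 * np.pi * f)                    # Morlet scale for frequency f
        # Frequency-domain Morlet wavelet (analytic: positive frequencies only)
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (scale * omega - w0) ** 2)
        psi_hat[omega < 0] = 0.0
        out[k] = np.fft.ifft(X * psi_hat)
    return out

fs = 1.0                                  # 1 Hz sampling, typical for EGG analysis
t = np.arange(1024)
egg = np.sin(2 * np.pi * 0.05 * t)        # ~3 cycles/min dominant component
coefs = cwt_fft(egg, fs, [0.02, 0.05, 0.1])
```

    The row of `coefs` with the largest average magnitude indicates the dominant frequency, which is how DF-style parameters can be read off the scalogram.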

  10. Implementation of a microprocessor-based visual-evoked cortical potential recording and analysis system.

    PubMed

    Wilson, A; Fram, D; Sistar, J

    1981-06-01

    An Imsai 8080 microcomputer is being used to simultaneously generate a color graphics stimulus display and to record visual-evoked cortical potentials. A brief description of the hardware and software developed for this system is presented. Data storage and analysis techniques are also discussed.

  11. 75 FR 2514 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... supervision; workforce study and analysis; manpower requirements studies; emergency loan program and training curricula planning and research. ROUTINE USES OF RECORDS MAINTAINED IN THE SYSTEM, INCLUDING CATEGORIES OF... appropriate. Records relating to adverse actions, grievances, excluding EEO complaints and performance-based...

  12. BioAcoustica: a free and open repository and analysis platform for bioacoustics

    PubMed Central

    Baker, Edward; Price, Ben W.; Rycroft, S. D.; Smith, Vincent S.

    2015-01-01

    We describe an online open repository and analysis platform, BioAcoustica (http://bio.acousti.ca), for recordings of wildlife sounds. Recordings can be annotated using a crowdsourced approach, allowing voice introductions and sections with extraneous noise to be removed from analyses. This system is based on the Scratchpads virtual research environment, the BioVeL portal and the Taverna workflow management tool, which allows for analysis of recordings using a grid computing service. At present the analyses include spectrograms, oscillograms and dominant frequency analysis. Further analyses can be integrated to meet the needs of specific researchers or projects. Researchers can upload and annotate their recordings to supplement traditional publication. Database URL: http://bio.acousti.ca PMID:26055102
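    A dominant-frequency analysis like the one listed above can be reduced to locating the largest peak in a recording's power spectrum. This is a generic sketch on a synthetic tone, not BioAcoustica's pipeline; the `dominant_frequency` name is an assumption.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest peak in the power spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0                    # ignore the DC component
    return freqs[np.argmax(spectrum)]

fs = 8000.0
t = np.arange(8000) / fs                 # one second of audio
call = np.sin(2 * np.pi * 440 * t)       # synthetic 440 Hz "call"
```

    In practice this would be applied to annotated, noise-trimmed sections of a wildlife recording rather than to the whole file.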

  13. Localizing wushu players on a platform based on a video recording

    NASA Astrophysics Data System (ADS)

    Peczek, Piotr M.; Zabołotny, Wojciech M.

    2017-08-01

    This article describes the development of a method to localize an athlete on a platform during a sports performance, based on a static video recording. The sport considered for this method is wushu, a martial art; however, any other discipline could be used. Requirements are specified, and two image-processing algorithms are described. The next part presents an experiment based on recordings from the Pan American Wushu Championship. The steps of the algorithm are shown on those recordings, and the results are evaluated manually. The last part of the article concludes whether the algorithm is applicable and what improvements would have to be implemented to use it during sports competitions as well as for offline analysis.

  14. Non-Contact Analysis of the Adsorptive Ink Capacity of Nano Silica Pigments on a Printing Coating Base

    PubMed Central

    Jiang, Bo; Huang, Yu Dong

    2014-01-01

    Near infrared spectra combined with partial least squares were proposed as a means of non-contact analysis of the adsorptive ink capacity of recording coating materials in ink jet printing. First, the recording coating materials were prepared based on nano silica pigments. 80 samples of the recording coating materials were selected to develop the calibration of adsorptive ink capacity against ink adsorption (g/m2). The model developed predicted samples in the validation set with r2 = 0.80 and SEP = 1.108; analytical results showed that near infrared spectra had significant potential for predicting the adsorptive ink capacity of the recording coating. The influence of factors such as recording coating thickness, the mass ratio of silica to binder (polyvinyl alcohol), and the solution concentration on the adsorptive ink capacity was studied. With the help of the near infrared spectra, the adsorptive ink capacity of a recording coating material can be rapidly controlled. PMID:25329464

  15. Non-contact analysis of the adsorptive ink capacity of nano silica pigments on a printing coating base.

    PubMed

    Jiang, Bo; Huang, Yu Dong

    2014-01-01

    Near infrared spectra combined with partial least squares were proposed as a means of non-contact analysis of the adsorptive ink capacity of recording coating materials in ink jet printing. First, the recording coating materials were prepared based on nano silica pigments. 80 samples of the recording coating materials were selected to develop the calibration of adsorptive ink capacity against ink adsorption (g/m2). The model developed predicted samples in the validation set with r2 = 0.80 and SEP = 1.108; analytical results showed that near infrared spectra had significant potential for predicting the adsorptive ink capacity of the recording coating. The influence of factors such as recording coating thickness, the mass ratio of silica to binder (polyvinyl alcohol), and the solution concentration on the adsorptive ink capacity was studied. With the help of the near infrared spectra, the adsorptive ink capacity of a recording coating material can be rapidly controlled.

  16. Analysis of Service Records Management Systems for Rescue and Retention of Cultural Resource Documents

    DTIC Science & Technology

    2009-06-01

    this information was not migrated to the new database. The responsible offices were told to destroy the old cards, and thus, vast amounts of... then necessary to examine the online service-specific records management systems, namely the Army Records Information Management System (ARIMS), the Air Force Records Information Management System (AFRIMS), and the Navy Records Management System. Each system

  17. The analysis and forecasting of male cycling time trial records established within England and Wales.

    PubMed

    Dyer, Bryce; Hassani, Hossein; Shadi, Mehran

    2016-01-01

    The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length and using time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition record progressions was undertaken to illustrate their progression. Future forecasts are then projected through use of the Singular Spectrum Analysis technique. This method has not been applied to sport-based time series data before. All six records have seen a progressive improvement and are non-linear in nature. Five records saw their highest level of record change during the 1950-1969 period. Whilst the frequency of new records has generally declined since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecasted projections in the short to medium term with a high level of fit to the time series data.
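    The core of Singular Spectrum Analysis can be sketched in a few lines: embed the series in a trajectory (Hankel) matrix, take its SVD, keep the leading components, and map back by diagonal averaging. This is a basic decomposition/reconstruction sketch, not the forecasting procedure the study used; the `ssa_reconstruct` name and window choice are assumptions.

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Reconstruct a series from its leading SSA components.

    Embeds the series in a trajectory (Hankel) matrix, takes its SVD, keeps
    the first n_components, and maps back by diagonal averaging.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # window x k
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # Diagonal averaging (Hankelization) back to a 1-D series
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += approx[:, j]
        counts[j:j + window] += 1.0
    return recon / counts
```

    Forecasting then extends the reconstructed components forward; a purely linear record progression, for example, is captured exactly by the first two components.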

  18. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    PubMed

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice on the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the infant's amount of motion (Qmean, QSD) and the variability of its spatial center of motion (CSD). In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p<0.05). This study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
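    The ICC(1,1) and ICC(3,1) statistics used above come from standard ANOVA mean squares (Shrout-Fleiss forms for single measurements). A minimal sketch for a subjects-by-sessions matrix follows; the `icc` name and the toy data are assumptions, not the study's dataset.

```python
import numpy as np

def icc(ratings):
    """ICC(1,1) and ICC(3,1) for a subjects x sessions matrix.

    ICC(1,1): one-way random effects, single measurement.
    ICC(3,1): two-way mixed effects, consistency, single measurement.
    """
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand = y.mean()
    subj_means = y.mean(axis=1)
    sess_means = y.mean(axis=0)
    # Mean squares from one- and two-way ANOVA decompositions
    ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((y - subj_means[:, None]) ** 2) / (n * (k - 1))
    ss_error = np.sum((y - subj_means[:, None] - sess_means[None, :] + grand) ** 2)
    ms_error = ss_error / ((n - 1) * (k - 1))
    icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc_3_1 = (ms_between - ms_error) / (ms_between + (k - 1) * ms_error)
    return icc_1_1, icc_3_1
```

    Perfectly repeatable measurements give both coefficients a value of 1.0; session-to-session noise pulls them toward 0.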

  19. The construction and periodicity analysis of natural disaster database of Alxa area based on Chinese local records

    NASA Astrophysics Data System (ADS)

    Yan, Zheng; Mingzhong, Tian; Hengli, Wang

    2010-05-01

    Chinese hand-written local records originated in the first century. Generally, these local records cover the geography, historical evolution, customs, education, products, people, and historical sites, as well as the writings, of an area. Through such endeavors, the information on China's natural history has had almost no "dark ages" over the 5000 years of its civilization. A compilation of all meaningful historical data on natural disasters that occurred in Alxa, Inner Mongolia, the second largest desert area in China, is used here to construct a 500-year high-resolution database. The database is divided into subsets according to the type of natural disaster, such as sand-dust storms, drought events, and cold waves. By applying trend, correlation, wavelet, and spectral analysis to these data, we can statistically estimate the periodicity of different natural disasters, detect and quantify similarities and patterns among the periodicities of these records, and finally take these results in aggregate to find a strong and coherent cyclicity through the last 500 years that serves as the driving mechanism of these geological hazards. Based on the periodicity obtained from the above analysis, the paper discusses the possibility of forecasting natural disasters from historical records and suitable measures to reduce disaster losses. Keywords: Chinese local records; Alxa; natural disasters; database; periodicity analysis

  20. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis

    PubMed Central

    Greensmith, David J.

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. PMID:24125908

  1. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    PubMed

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.

  2. Probabilistic characterization of sleep architecture: home based study on healthy volunteers.

    PubMed

    Garcia-Molina, Gary; Vissapragada, Sreeram; Mahadevan, Anandi; Goodpaster, Robert; Riedner, Brady; Bellesi, Michele; Tononi, Giulio

    2016-08-01

    The quantification of sleep architecture has high clinical value for diagnostic purposes. While the clinical standard to assess sleep architecture is in-lab polysomnography, higher ecological validity can be obtained with multiple sleep recordings at home. In this paper, we use a dataset composed of fifty sleep EEG recordings at home (10 per study participant for five participants) to analyze sleep stage transition dynamics using Markov chain based modeling. The duration of continuous sleep-stage bouts is also analyzed statistically to identify the speed of transition between sleep stages. This analysis identified two types of NREM states characterized by fast and slow exit rates, which from the EEG analysis appear to correspond to shallow and deep sleep respectively.
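    The Markov-chain modeling described above starts from an empirical transition-probability matrix estimated from a scored stage sequence. A minimal sketch follows; the simplified three-stage alphabet and the `transition_matrix` name are assumptions (the study distinguishes further NREM sub-states).

```python
import numpy as np

STAGES = ["Wake", "NREM", "REM"]   # simplified stage alphabet (assumption)

def transition_matrix(stage_sequence):
    """Row-stochastic matrix of empirical sleep-stage transition probabilities."""
    idx = {s: i for i, s in enumerate(STAGES)}
    counts = np.zeros((len(STAGES), len(STAGES)))
    for a, b in zip(stage_sequence, stage_sequence[1:]):
        counts[idx[a], idx[b]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0      # avoid dividing by zero for unseen stages
    return counts / row_sums
```

    The diagonal entries relate directly to bout durations: a stage whose self-transition probability is low has a fast exit rate and short bouts, which is the distinction drawn between the two NREM types.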

  3. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  4. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
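    The action-density description above amounts to counting recorded events per time bin. A minimal sketch on hypothetical event onset times follows; the `action_density` name and bin width are assumptions, not part of the FIT-System.

```python
import numpy as np

def action_density(event_times, duration, bin_width):
    """Events per time bin: a simple action-density profile of a recording.

    event_times: onset times (s) of recorded events (observation codes).
    """
    edges = np.arange(0.0, duration + bin_width, bin_width)
    counts, _ = np.histogram(event_times, bins=edges)
    return counts / bin_width          # events per second in each bin
```

    Plotting this profile over an anesthesia procedure would reveal the peaks of overlapping, concurrent actions the authors describe.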

  5. Hilbert-Huang transform analysis of dynamic and earthquake motion recordings

    USGS Publications Warehouse

    Zhang, R.R.; Ma, S.; Safak, E.; Hartzell, S.

    2003-01-01

    This study examines the rationale of Hilbert-Huang transform (HHT) for analyzing dynamic and earthquake motion recordings in studies of seismology and engineering. In particular, this paper first provides the fundamentals of the HHT method, which consist of the empirical mode decomposition (EMD) and the Hilbert spectral analysis. It then uses the HHT to analyze recordings of hypothetical and real wave motion, the results of which are compared with the results obtained by the Fourier data processing technique. The analysis of the two recordings indicates that the HHT method is able to extract some motion characteristics useful in studies of seismology and engineering, which might not be exposed effectively and efficiently by Fourier data processing technique. Specifically, the study indicates that the decomposed components in EMD of HHT, namely, the intrinsic mode function (IMF) components, contain observable, physical information inherent to the original data. It also shows that the grouped IMF components, namely, the EMD-based low- and high-frequency components, can faithfully capture low-frequency pulse-like as well as high-frequency wave signals. Finally, the study illustrates that the HHT-based Hilbert spectra are able to reveal the temporal-frequency energy distribution for motion recordings precisely and clearly.
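    The Hilbert spectral analysis step of the HHT can be sketched independently of EMD: form the analytic signal via the FFT and differentiate its unwrapped phase to get instantaneous frequency. This is a generic sketch on a synthetic tone, not the paper's processing chain; the function names are assumptions.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (a discrete Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0            # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0                # Nyquist bin kept once for even n
    return np.fft.ifft(X * h)          # negative frequencies are zeroed

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the analytic signal's phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.diff(phase) * fs / (2 * np.pi)
```

    Applied to each intrinsic mode function from EMD, this phase-derivative view is what lets the Hilbert spectrum resolve time-varying frequency content that a fixed-window Fourier analysis smears out.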

  6. (Re)Discovering Retrospective Miscue Analysis: An Action Research Exploration Using Recorded Readings to Improve Third-Grade Students' Reading Fluency

    ERIC Educational Resources Information Center

    Born, Melissa; Curtis, Reagan

    2013-01-01

    An action research project was undertaken focused on integrating recorded readings and Retrospective Miscue Analysis (RMA) into center-based instructional time in a third-grade classroom. Initial DIBELS test results were used to select 6 struggling readers, all of whom showed improved fluency in response to our instructional interventions. The…

  7. Determination of baseline periods of record for selected streamflow-gaging stations in and near Oklahoma for use in modeling applications

    USGS Publications Warehouse

    Esralew, Rachel A.

    2010-01-01

    Historical streamflow data from a least-altered period of record can be used to calibrate modeling applications that characterize least-altered flow and predict the effects of proposed streamflow alteration. This information can be used to enhance water-resources planning. A baseline period of record was determined for selected streamflow-gaging stations for use as a calibration dataset for modeling applications. The baseline period of record was defined as a period that is least altered by anthropogenic activity and has sufficient streamflow record length to represent extreme climate variability. Streamflow data from 171 stations in and near Oklahoma with a minimum of 10 complete water years of daily streamflow record through water year 2007 and drainage areas of less than 2,500 square miles were considered for use in the baseline period analysis. The first step in determining the least-altered period of record was to evaluate station information by using previous publications, historical station record notes, and information gathered from oral and written communication with hydrographers familiar with selected stations. The second step was to identify stations that had substantial effects from upstream regulation by evaluating the location and extent of dams in the drainage basin. The third step consisted of (a) visual analysis of annual hydrographs for selected stations with 20 or more years of streamflow record, (b) analysis of covariance of double-mass curves, and (c) Kendall's tau trend analysis to detect statistically significant trends in base flow, runoff, total flow, and base-flow index related to anthropogenic activity for selected stations with 15 or more years of streamflow record. A preliminary least-altered period of record for each stream was identified by removing the period of streamflow record when streams were substantially affected by anthropogenic activity. 
After streamflow record was removed from designation as a least-altered period, stations that did not have at least 10 years of remaining continuous streamflow record were considered to have an insufficient baseline period for modeling applications. An optimum minimum period of record was determined for each of the least-altered periods for each station to ensure a streamflow record length sufficient to provide a representative sample of annual climate variability. An optimum minimum period of 10 years or more was evaluated by analyzing the variability of annual precipitation for selected 5-, 10-, 15-, 25-, and 35-year periods for each of 20 climate divisions that contained stations used in the baseline period analysis. The distribution of annual precipitation for each consecutive overlapping 5-year period was compared to the period 1925-2007 by using a Wilcoxon rank-sum test. The least-altered period of record for stations was also compared to the period 1925-2007 by using a Wilcoxon rank-sum test. The results of this analysis were used to determine how many years of annual precipitation data were needed for the selected period to be statistically similar to the distribution of annual precipitation data for a long-term period, 1925-2007. Optimum minimum periods ranged from 10 to 35 years and varied by climate division. A final baseline period was determined for 111 stations that had a baseline period of at least 10 years of continuous streamflow record after the record-elimination process. A suitable baseline period of record for use in modeling applications could not be identified for 58 of the initial 171 stations because of substantial anthropogenic alteration of the stream or drainage basin, and for 2 stations because the least-altered period of record was not representative of annual climate variability. The baseline period for each station was rated "excellent", "good", "fair", "poor", or "no baseline period." 
This rating was based on a qualitative evaluation of t
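
    The record-length screening described above can be sketched with a rank-sum test: a candidate period is kept only if its annual-precipitation distribution is statistically similar to the long-term reference. The precipitation series, significance level, and window lengths below are hypothetical stand-ins, not the report's data.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
years = np.arange(1925, 2008)                     # the 1925-2007 reference window
precip = rng.normal(36.0, 6.0, size=years.size)   # hypothetical annual precipitation (inches)

def representative(candidate, reference, alpha=0.05):
    # "similar" = fail to reject the Wilcoxon rank-sum null at the chosen level;
    # note the candidate overlaps the reference, mirroring the report's comparison
    return ranksums(candidate, reference).pvalue >= alpha

# screen candidate windows of increasing length ending in water year 2007
ok = {n: representative(precip[-n:], precip) for n in (5, 10, 15, 25, 35)}
```

    The smallest window length for which the test consistently fails to reject would play the role of the optimum minimum period for that climate division.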

  8. Opto-mechatronics issues in solid immersion lens based near-field recording

    NASA Astrophysics Data System (ADS)

    Park, No-Cheol; Yoon, Yong-Joong; Lee, Yong-Hyun; Kim, Joong-Gon; Kim, Wan-Chin; Choi, Hyun; Lim, Seungho; Yang, Tae-Man; Choi, Moon-Ho; Yang, Hyunseok; Rhim, Yoon-Chul; Park, Young-Pil

    2007-06-01

    We analyzed the effects of an external shock on a collision problem in solid immersion lens (SIL) based near-field recording (NFR) through a shock response analysis and proposed a possible solution to this problem by adopting a protector and a safety mode. With this proposed method, collision between the SIL and the media can be avoided. We also showed a possible solution to the contamination problem in SIL-based NFR through a numerical air-flow analysis, and introduced possible solid immersion lens designs to increase the fabrication and assembly tolerances of an optical head with a replicated lens. Potentially, these research results could advance NFR technology toward a commercial product.

  9. How many records should be used in ASCE/SEI-7 ground motion scaling procedure?

    USGS Publications Warehouse

    Reyes, Juan C.; Kalkan, Erol

    2012-01-01

    U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of “true” structural responses. Based on elastic-perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. As compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.
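
    The amplitude-scaling idea can be sketched as follows. This is a simplified illustration, not the full ASCE/SEI-7 procedure (which also prescribes the period range relative to the fundamental period and how the target spectrum is built): choose the smallest single factor so that the average scaled spectrum never falls below the design spectrum. The spectra below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
periods = np.linspace(0.2, 3.0, 50)          # period range of interest (s)
design = 0.9 / np.maximum(periods, 0.6)      # hypothetical design spectrum (g)
# hypothetical 5%-damped response spectra for a set of 7 records
records = np.array([design * rng.uniform(0.5, 1.5, size=periods.size)
                    for _ in range(7)])

def group_scale_factor(records, design):
    # smallest single factor making the average scaled spectrum
    # >= the design spectrum at every period in the range
    mean_spec = records.mean(axis=0)
    return float(np.max(design / mean_spec))

f = group_scale_factor(records, design)
scaled_mean = f * records.mean(axis=0)       # touches the design spectrum at one period
```

    With more records in the set, the average spectrum is smoother, the controlling factor is smaller, and response estimates scatter less, which is the effect the study quantifies.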

  10. Frequency analysis of electroencephalogram recorded from a bottlenose dolphin (Tursiops truncatus) with a novel method during transportation by truck

    PubMed Central

    Tamura, Shinichi; Okada, Yasunori; Morimoto, Shigeru; Ohta, Mitsuaki; Uchida, Naoyuki

    2010-01-01

    In order to obtain information regarding the correlation between an electroencephalogram (EEG) and the state of a dolphin, we developed a noninvasive method for recording the EEG of a bottlenose dolphin (Tursiops truncatus) and a method for extracting the true EEG from the recorded EEG (R-EEG), both based on human EEG recording methods, and then carried out frequency analysis during transportation by truck. The frequencies detected in the EEG of the dolphin during apparent wakefulness were conveniently divided into three bands (5–15, 15–25, and 25–40 Hz) based on spectrum profiles. Analyses of the relationship between power ratio and movement of the dolphin revealed that when the dolphin was quiet, power was evenly distributed among the three bands. These results suggested that the EEG of a dolphin could be detected accurately by this method, and that frequency analysis of the detected EEG seems to provide useful information for understanding the central nerve activity of these animals. PMID:20429047
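
    A three-band power-ratio computation of this kind can be sketched with a Welch periodogram. The sampling rate and the synthetic signal below are assumptions for illustration; the paper's preprocessing and artifact handling are not reproduced.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / fs)               # 30 s of synthetic "EEG"
eeg = (np.sin(2 * np.pi * 10 * t)          # strong 10 Hz rhythm
       + 0.5 * np.sin(2 * np.pi * 20 * t)  # weaker 20 Hz rhythm
       + 0.25 * rng.standard_normal(t.size))

bands = {"5-15 Hz": (5, 15), "15-25 Hz": (15, 25), "25-40 Hz": (25, 40)}
f, psd = welch(eeg, fs=fs, nperseg=1024)   # averaged periodogram

def band_power(lo, hi):
    m = (f >= lo) & (f < hi)
    return float(np.sum(psd[m]) * (f[1] - f[0]))   # rectangle-rule band integral

total = sum(band_power(lo, hi) for lo, hi in bands.values())
ratios = {name: band_power(lo, hi) / total for name, (lo, hi) in bands.items()}
```

    An even spread of `ratios` across the three bands would correspond to the "quiet" state described in the abstract, whereas the synthetic signal here is deliberately dominated by the lowest band.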

  11. Toward a Reconstruction of the Atlantic Multidecadal Oscillation Using Shell-based Records from Coastal Northern Norway

    NASA Astrophysics Data System (ADS)

    Mette, M.; Wanamaker, A. D.; Carroll, M.; Ambrose, W. G., Jr.; Retelle, M.

    2016-02-01

    North Atlantic sea surface temperatures over the past 150 years have exhibited multidecadal variability, switching between relatively warm and cool periods, described by the Atlantic Multidecadal Oscillation (AMO). The influence, persistence, and causes of the AMO, however, are debated because instrumental records of North Atlantic sea surface temperatures capture only two cycles of this 60- to 80-year mode. Thus far, AMO reconstructions have been largely based on terrestrial archives despite the fact that the AMO is an oceanic mode. Proxy records from the marine realm are therefore necessary to better understand the behavior of the AMO over recent centuries. We present continuous, annual shell-based records of oxygen isotopes and growth from the long-lived marine bivalve Arctica islandica from coastal northern Norway (71 °N) spanning 1900-2012 that strongly relate to the instrumental AMO record (r = -0.59, p < 0.01). We performed calibration/verification analysis in order to assess the potential for these records to contribute to AMO reconstructions. We also compare our record with other proxy reconstructions of AMO variability over the past century. Our results show that extending shell-based records to past centuries will provide valuable information about AMO variability.

  12. The Vaiont Slide. A Geotechnical Analysis Based on New Geologic Observations of the Failure Surface. Volume 2. Appendices A through G

    DTIC Science & Technology

    1985-06-01

    Appendix B: static slope analysis method used for the Vaiont Slide (analyses by D. L. Anderson). Daily precipitation records at Erto for the years 1960, 1961, 1962, and 1963 are given in Tables A1, A2, A3, and A4, respectively; these were supplied through the courtesy of E.N.E.L.

  13. A wind proxy based on migrating dunes at the Baltic coast: statistical analysis of the link between wind conditions and sand movement

    NASA Astrophysics Data System (ADS)

    Bierstedt, Svenja E.; Hünicke, Birgit; Zorita, Eduardo; Ludwig, Juliane

    2017-07-01

    We statistically analyse the relationship between the structure of migrating dunes in the southern Baltic and the driving wind conditions over the past 26 years, with the long-term aim of using migrating dunes as a proxy for past wind conditions at an interannual resolution. The present analysis is based on the dune record derived from geo-radar measurements by Ludwig et al. (2017). The dune system is located at the Baltic Sea coast of Poland and is migrating from west to east along the coast. The dunes present layers of different thicknesses that can be assigned absolute dates at interannual timescales and related to seasonal wind conditions. To statistically analyse this record and calibrate it as a wind proxy, we used a gridded regional meteorological reanalysis data set (coastDat2) covering recent decades. The identified link between the dune annual layers and wind conditions was additionally supported by the co-variability between dune layers and observed sea level variations in the southern Baltic Sea. We include precipitation and temperature in our analysis, in addition to wind, to learn more about the dependency among these three atmospheric factors and their common influence on the dune system. We set up a statistical linear model based on the correlation between the frequency of days with specific wind conditions in a given season and dune migration velocities derived for that season. To some extent, the dune records can be seen as analogous to tree-ring width records, and hence we use a proxy validation method usually applied in dendrochronology when the observational record is short: cross-validation with the leave-one-out method. The revealed correlations between the wind record from the reanalysis and the wind record derived from the dune structure are in the range of 0.28 to 0.63, yielding statistical validation skill similar to that of dendroclimatological records.
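
    The leave-one-out validation borrowed from dendrochronology can be sketched as follows. The wind-frequency and migration-velocity series below are synthetic stand-ins for the coastDat2 and geo-radar data, and the single-predictor linear model is an assumption for the demo.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 26                                    # one value per year of the dune record
x = rng.uniform(10, 60, n)                # days per season with strong winds (hypothetical)
y = 0.05 * x + rng.normal(0.0, 0.2, n)    # dune migration velocity (hypothetical)

def loo_predictions(x, y):
    # refit the least-squares line n times, each time predicting the held-out season
    preds = np.empty_like(y)
    for i in range(len(x)):
        keep = np.arange(len(x)) != i
        slope, intercept = np.polyfit(x[keep], y[keep], 1)
        preds[i] = slope * x[i] + intercept
    return preds

preds = loo_predictions(x, y)
r_cv = float(np.corrcoef(preds, y)[0, 1])   # cross-validated skill
```

    The cross-validated correlation `r_cv` is the analogue of the 0.28-0.63 validation skill range reported in the abstract; holding each season out guards against the overfitting that a short record invites.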

  14. Guidelines for Assessment of Gait and Reference Values for Spatiotemporal Gait Parameters in Older Adults: The Biomathics and Canadian Gait Consortiums Initiative

    PubMed Central

    Beauchet, Olivier; Allali, Gilles; Sekhon, Harmehr; Verghese, Joe; Guilain, Sylvie; Steinmetz, Jean-Paul; Kressig, Reto W.; Barden, John M.; Szturm, Tony; Launay, Cyrille P.; Grenier, Sébastien; Bherer, Louis; Liu-Ambrose, Teresa; Chester, Vicky L.; Callisaya, Michele L.; Srikanth, Velandai; Léonard, Guillaume; De Cock, Anne-Marie; Sawa, Ryuichi; Duque, Gustavo; Camicioli, Richard; Helbostad, Jorunn L.

    2017-01-01

    Background: Gait disorders, a highly prevalent condition in older adults, are associated with several adverse health consequences. Gait analysis allows qualitative and quantitative assessments of gait that improve the understanding of mechanisms of gait disorders and the choice of interventions. This manuscript aims (1) to give consensus guidance for clinical and spatiotemporal gait analysis based on the recorded footfalls in older adults aged 65 years and over, and (2) to provide reference values for spatiotemporal gait parameters based on the recorded footfalls in healthy older adults free of cognitive impairment and multi-morbidities. Methods: International experts working in a network of two consortiums (i.e., Biomathics and the Canadian Gait Consortium) participated in this initiative. First, they identified items of standardized information following the usual procedure of formulation of consensus findings. Second, they merged databases including spatiotemporal gait assessments with the GAITRite® system and clinical information from the “Gait, cOgnitiOn & Decline” (GOOD) initiative and the Generation 100 (Gen 100) study. Only healthy participants aged 65 and older, free of cognitive impairment and multi-morbidities (i.e., ≤3 therapeutics taken daily), were selected. Age, sex, body mass index, mean values, and coefficients of variation (CoV) of gait parameters were used for the analyses. Results: A standardized systematic assessment of three categories of items (demographics, clinical information, and gait characteristics, i.e., clinical and spatiotemporal gait analysis based on the recorded footfalls) was selected for the proposed guidelines. Two complementary sets of items were distinguished: a minimal data set and a full data set. In addition, a total of 954 participants (mean age 72.8 ± 4.8 years, 45.8% women) were recruited to establish the reference values. 
Performance of spatiotemporal gait parameters based on the recorded footfalls declined with increasing age (mean values and CoV) and demonstrated sex differences (mean values). Conclusions: Based on an international multicenter collaboration, we propose consensus guidelines for gait assessment and spatiotemporal gait analysis based on the recorded footfalls, and reference values for healthy older adults. PMID:28824393
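
    The variability measure reported for each gait parameter, the coefficient of variation (CoV), is simply the sample standard deviation expressed as a percentage of the mean. The stride times below are invented for illustration.

```python
import numpy as np

# hypothetical stride times (s) measured from recorded footfalls for one walk
stride_times = np.array([1.04, 1.07, 1.02, 1.06, 1.05, 1.03, 1.08, 1.04])

mean_stride = float(stride_times.mean())
# coefficient of variation (%), the gait-variability measure used for the reference values
cov_pct = float(100.0 * stride_times.std(ddof=1) / mean_stride)
```

    Reporting both the mean and the CoV, as the guidelines recommend, separates how fast or long a person's steps are from how consistent they are.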

  15. A Novel Quantitative Method for Diabetic Cardiac Autonomic Neuropathy Assessment in Type 1 Diabetic Mice

    PubMed Central

    Yang, Bufan; Posada-Quintero, Hugo F.; Siu, Kin L.; Rolle, Marsha; Brink, Peter; Birzgalis, Aija; Moore, Leon C.

    2014-01-01

    In this work, we used a sensitive and noninvasive computational method to assess diabetic cardiovascular autonomic neuropathy (DCAN) from pulse oximeter (photoplethysmographic; PPG) recordings from mice. The method, which could be easily applied to humans, is based on principal dynamic mode (PDM) analysis of heart rate variability (HRV). Unlike the power spectral density, PDM has been shown to be able to separately identify the activities of the parasympathetic and sympathetic nervous systems without pharmacological intervention. HRV parameters were measured by processing PPG signals from conscious 1.5- to 5-month-old C57/BL6 control mice and from Akita mice, a model of insulin-dependent type 1 diabetes, and compared with gold-standard Western blot and immunohistochemical analyses. The PDM results indicate significant cardiac autonomic impairment in the diabetic mice in comparison to the controls. When tail-cuff PPG recordings were collected and analyzed starting from 1.5 months of age in both C57/BL6 controls and Akita mice, onset of DCAN was seen at 3 months in the Akita mice and persisted up to the termination of the recording at 5 months. Western blot and immunohistochemical analyses of autonomic nerve proteins also showed a reduction in nerve density in Akita mice at 3 and 4 months as compared to the control mice, corroborating the PPG-based HRV analysis via the PDM approach. In contrast, traditional HRV analysis (based on either the power spectral density or time-domain measures) failed to detect the nerve rarefaction. PMID:25097056

  16. A Real-Time Recording Model of Key Indicators for Energy Consumption and Carbon Emissions of Sustainable Buildings

    PubMed Central

    Wu, Weiwei; Yang, Huanjia; Chew, David; Hou, Yanhong; Li, Qiming

    2014-01-01

    Buildings' sustainability is one of the crucial parts of achieving urban sustainability. Applied to buildings, life-cycle assessment encompasses the analysis and assessment of the environmental effects of building materials, components and assemblies throughout the entire life of the building: construction, use and demolition. Estimating carbon emissions is essential for an accurate and reasonable life-cycle assessment. Addressing the need for more research into integrating real-time, automatic recording of key indicators for more accurate calculation and comparison, this paper designs a real-time recording model of the crucial indicators for calculating and estimating the energy use and carbon emissions of buildings, built on a Radio Frequency Identification (RFID) system. The architecture of the RFID-based carbon emission recording/tracking system, which comprises four functional layers (a data record layer, a data collection/update layer, a data aggregation layer, and a data sharing/backup layer), is presented. Each of these layers is formed by RFID or network devices and sub-systems that operate at a specific level. Finally, a proof-of-concept system is developed to illustrate the implementation of the proposed architecture and demonstrate the feasibility of the design. This study provides a technical solution for real-time recording of building carbon emissions and is thus of significance for improving urban sustainability. PMID:24831109
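
    The layered split can be sketched as data types and functions. The tag IDs, material names, and embodied-carbon figures below are hypothetical, and only the first three layers are illustrated (the sharing/backup layer is omitted).

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TagReading:
    """Data record layer: one RFID tag read for a building component."""
    tag_id: str
    material: str
    embodied_co2_kg: float   # hypothetical embodied-carbon figure

@dataclass
class CollectionPoint:
    """Data collection/update layer: a reader gateway buffering reads."""
    readings: List[TagReading] = field(default_factory=list)
    def record(self, r: TagReading) -> None:
        self.readings.append(r)

def aggregate(points: List[CollectionPoint]) -> Dict[str, float]:
    """Data aggregation layer: total embodied carbon per material."""
    totals: Dict[str, float] = {}
    for p in points:
        for r in p.readings:
            totals[r.material] = totals.get(r.material, 0.0) + r.embodied_co2_kg
    return totals

gate = CollectionPoint()
gate.record(TagReading("E200-01", "steel", 1.85))
gate.record(TagReading("E200-02", "concrete", 0.45))
gate.record(TagReading("E200-03", "steel", 2.10))
totals = aggregate([gate])
```

    Keeping each layer behind a small interface like this is what lets the record, collection, and aggregation stages run on different RFID and network devices, as the architecture describes.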

  17. A real-time recording model of key indicators for energy consumption and carbon emissions of sustainable buildings.

    PubMed

    Wu, Weiwei; Yang, Huanjia; Chew, David; Hou, Yanhong; Li, Qiming

    2014-05-14

    Buildings' sustainability is one of the crucial parts of achieving urban sustainability. Applied to buildings, life-cycle assessment encompasses the analysis and assessment of the environmental effects of building materials, components and assemblies throughout the entire life of the building: construction, use and demolition. Estimating carbon emissions is essential for an accurate and reasonable life-cycle assessment. Addressing the need for more research into integrating real-time, automatic recording of key indicators for more accurate calculation and comparison, this paper designs a real-time recording model of the crucial indicators for calculating and estimating the energy use and carbon emissions of buildings, built on a Radio Frequency Identification (RFID) system. The architecture of the RFID-based carbon emission recording/tracking system, which comprises four functional layers (a data record layer, a data collection/update layer, a data aggregation layer, and a data sharing/backup layer), is presented. Each of these layers is formed by RFID or network devices and sub-systems that operate at a specific level. Finally, a proof-of-concept system is developed to illustrate the implementation of the proposed architecture and demonstrate the feasibility of the design. This study provides a technical solution for real-time recording of building carbon emissions and is thus of significance for improving urban sustainability.

  18. A microcomputer interface for a digital audio processor-based data recording system.

    PubMed

    Croxton, T L; Stump, S J; Armstrong, W M

    1987-10-01

    An inexpensive interface is described that performs direct transfer of digitized data from the digital audio processor and video cassette recorder based data acquisition system designed by Bezanilla (1985, Biophys. J., 47:437-441) to an IBM PC/XT microcomputer. The FORTRAN callable software that drives this interface is capable of controlling the video cassette recorder and starting data collection immediately after recognition of a segment of previously collected data. This permits piecewise analysis of long intervals of data that would otherwise exceed the memory capability of the microcomputer.

  19. A microcomputer interface for a digital audio processor-based data recording system.

    PubMed Central

    Croxton, T L; Stump, S J; Armstrong, W M

    1987-01-01

    An inexpensive interface is described that performs direct transfer of digitized data from the digital audio processor and video cassette recorder based data acquisition system designed by Bezanilla (1985, Biophys. J., 47:437-441) to an IBM PC/XT microcomputer. The FORTRAN callable software that drives this interface is capable of controlling the video cassette recorder and starting data collection immediately after recognition of a segment of previously collected data. This permits piecewise analysis of long intervals of data that would otherwise exceed the memory capability of the microcomputer. PMID:3676444

  20. Time frequency analysis for automated sleep stage identification in fullterm and preterm neonates.

    PubMed

    Fraiwan, Luay; Lweesy, Khaldon; Khasawneh, Natheer; Fraiwan, Mohammad; Wenz, Heinrich; Dickhaus, Hartmut

    2011-08-01

    This work presents a new methodology for automated sleep stage identification in neonates based on the time-frequency distribution of a single electroencephalogram (EEG) recording and artificial neural networks (ANN). Wigner-Ville distribution (WVD), Hilbert-Huang spectrum (HHS) and continuous wavelet transform (CWT) time-frequency distributions were used to represent the EEG signal, from which features were extracted using time-frequency entropy. The classification of features was done using a feed-forward back-propagation ANN. The system was trained and tested using data taken from neonates of post-conceptual age of 40 weeks, both preterm (14 recordings) and fullterm (15 recordings). The identification of sleep stages was successfully implemented, and the classification based on the WVD outperformed the approaches based on CWT and HHS. The accuracy and kappa coefficient were found to be 0.84 and 0.65, respectively, for the fullterm neonates' recordings and 0.74 and 0.50, respectively, for the preterm neonates' recordings.
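
    A time-frequency entropy feature of the kind described can be sketched with an ordinary short-time spectrogram; note the paper uses WVD, HHS, and CWT distributions rather than this substitute, and the "EEG" below is synthetic.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 64                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(3)
t = np.arange(0, 60, 1 / fs)
# synthetic single-channel "EEG": a dominant 6 Hz rhythm plus light noise
eeg = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)       # broadband comparison signal

def tf_entropy_per_column(sig):
    # normalized Shannon entropy of each spectrogram column:
    # near 0 for a narrow-band rhythm, near 1 for a flat (white) spectrum
    f, seg_t, S = spectrogram(sig, fs=fs, nperseg=256)
    P = S / S.sum(axis=0, keepdims=True)
    H = -np.sum(np.where(P > 0, P * np.log(P), 0.0), axis=0)
    return H / np.log(S.shape[0])

h_eeg = tf_entropy_per_column(eeg)
h_noise = tf_entropy_per_column(noise)
```

    The per-epoch entropy sequence (`h_eeg`) is the sort of feature vector a feed-forward ANN classifier would consume; organized, rhythm-dominated epochs score lower than noisy ones.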

  1. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizure, triggers a self-organized brain state characterized by both order and maximal complexity.
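
    The relative wavelet energies and the (total) wavelet entropy can be sketched with a hand-rolled Haar decomposition. The review does not prescribe this particular wavelet, and the signals below are synthetic; the point is only that an ordered rhythm concentrates energy in few bands (low entropy) while a disordered signal spreads it (high entropy).

```python
import numpy as np

def haar_decompose(x, levels):
    # plain orthonormal Haar DWT written out by hand (a sketch, not PyWavelets)
    details, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = (a[0::2] + a[1::2]) / np.sqrt(2.0), (a[0::2] - a[1::2]) / np.sqrt(2.0)
        details.append(d)
    return details, a

def wavelet_entropy(x, levels=5):
    details, approx = haar_decompose(x, levels)
    energy = np.array([np.sum(d ** 2) for d in details] + [np.sum(approx ** 2)])
    p = energy / energy.sum()              # relative wavelet energies per band
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())   # total wavelet entropy

fs = 256
t = np.arange(0, 4, 1.0 / fs)              # 1024 samples, divisible by 2**5
rng = np.random.default_rng(5)
h_rhythm = wavelet_entropy(np.sin(2 * np.pi * 8 * t))     # ordered activity
h_noise = wavelet_entropy(rng.standard_normal(t.size))    # disordered activity
```

    The diminishing entropy reported during recruitment corresponds to energy collapsing into the band of the recruiting rhythm, exactly the `h_rhythm < h_noise` contrast above.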

  2. Decision Support System for Medical Care Quality Assessment Based on Health Records Analysis in Russia.

    PubMed

    Taranik, Maksim; Kopanitsa, Georgy

    2017-01-01

    The paper presents a decision support system for healthcare providers. The system allows healthcare providers to detect and decrease nonconformities in health records and to forecast the sum of insurance payments, taking nonconformities into account. Its components are the ISO 13606 standard, fuzzy logic, and the case-based reasoning concept. Implementation of the system increased insurance payments for the healthcare provider by 10%.

  3. Low-flow characteristics for selected streams in Indiana

    USGS Publications Warehouse

    Fowler, Kathleen K.; Wilson, John T.

    2015-01-01

    The management and availability of Indiana’s water resources increase in importance every year. Specifically, information on low-flow characteristics of streams is essential to State water-management agencies. These agencies need low-flow information when working with issues related to irrigation, municipal and industrial water supplies, fish and wildlife protection, and the dilution of waste. Industrial, municipal, and other facilities must obtain National Pollutant Discharge Elimination System (NPDES) permits if their discharges go directly to surface waters. The Indiana Department of Environmental Management (IDEM) requires low-flow statistics in order to administer the NPDES permit program. Low-flow-frequency characteristics were computed for 272 continuous-record stations. The information includes low-flow-frequency analysis, flow-duration analysis, and harmonic mean for the continuous-record stations. For those stations affected by some form of regulation, low-flow frequency curves are based on the longest period of homogeneous record under current conditions. Low-flow-frequency values and harmonic mean flow (if sufficient data were available) were estimated for the 166 partial-record stations. Partial-record stations are ungaged sites where streamflow measurements were made at base flow.
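
    The harmonic mean flow reported for these stations can be computed as below. The zero-flow adjustment shown (excluding zero days from the reciprocal sum and scaling by the fraction of nonzero days) is a common convention for dilution statistics and should be checked against the agency's exact definition; the daily flows are invented.

```python
import numpy as np

def harmonic_mean_flow(daily_q):
    """Harmonic mean flow with a common zero-flow convention (assumption:
    zero days excluded from the reciprocal sum, result scaled by the
    fraction of nonzero days)."""
    q = np.asarray(daily_q, dtype=float)
    nz = q[q > 0]
    if nz.size == 0:
        return 0.0
    return (nz.size / q.size) * nz.size / np.sum(1.0 / nz)

# hypothetical daily flows (cubic feet per second), including zero-flow days
flows = np.array([12.0, 8.0, 5.0, 0.0, 3.0, 7.0, 20.0, 0.0])
qhm = harmonic_mean_flow(flows)
```

    The harmonic mean weights low-flow days heavily (it is never larger than the arithmetic mean), which is why it is the statistic of choice for waste-dilution permitting.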

  4. Feasibility of a web-based system for police crash report review and information recording.

    DOT National Transportation Integrated Search

    2016-04-01

    Police crash reports include useful additional information that is not available in crash summary records. : This information may include police sketches and narratives and is often needed for detailed site-specific : safety analysis. In addition, so...

  5. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
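
    The pattern-extraction step, principal component analysis of the purchase matrix, can be sketched via a singular value decomposition. The matrix below is random, not the AutoMealRecord data, and the six menu categories and five retained components are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(11)
# hypothetical purchase-frequency matrix: 899 employees x 6 menu categories
X = rng.poisson(3.0, size=(899, 6)).astype(float)

# principal component analysis via SVD of the column-centered matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)     # variance share of each component
scores = Xc @ Vt[:5].T                  # each employee's score on 5 "dietary patterns"
```

    In the study, each row of `Vt` plays the role of one dietary pattern (e.g. "Japanese noodles"), and the per-employee `scores` are the preference variables fed into the multiple linear regression on BMI.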

  6. Training-free compressed sensing for wireless neural recording using analysis model and group weighted ℓ1-minimization

    NASA Astrophysics Data System (ADS)

    Sun, Biao; Zhao, Wenfeng; Zhu, Xinshan

    2017-06-01

    Objective. Data compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and compressed sensing (CS) theory has successfully demonstrated its potential in neural recording applications. In this paper, an analytical, training-free CS recovery method, termed group weighted analysis ℓ1-minimization (GWALM), is proposed for wireless neural recording. Approach. The GWALM method consists of three parts: (1) the analysis model is adopted to enforce sparsity of the neural signals, therefore overcoming the drawbacks of conventional synthesis models and enhancing the recovery performance. (2) A multi-fractional-order difference matrix is constructed as the analysis operator, thus avoiding the dictionary learning procedure and reducing the need for previously acquired data and computational complexities. (3) By exploiting the statistical properties of the analysis coefficients, a group weighting approach is developed to enhance the performance of analysis ℓ1-minimization. Main results. Experimental results on synthetic and real datasets reveal that the proposed approach outperforms state-of-the-art CS-based methods in terms of both spike recovery quality and classification accuracy. Significance. The energy and area efficiency of the GWALM make it an ideal candidate for resource-constrained, large-scale wireless neural recording applications. The training-free feature of the GWALM further improves its robustness to spike shape variation, thus making it more practical for long-term wireless neural recording.

  7. Training-free compressed sensing for wireless neural recording using analysis model and group weighted ℓ1-minimization.

    PubMed

    Sun, Biao; Zhao, Wenfeng; Zhu, Xinshan

    2017-06-01

    Data compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and compressed sensing (CS) theory has successfully demonstrated its potential in neural recording applications. In this paper, an analytical, training-free CS recovery method, termed group weighted analysis ℓ1-minimization (GWALM), is proposed for wireless neural recording. The GWALM method consists of three parts: (1) the analysis model is adopted to enforce sparsity of the neural signals, therefore overcoming the drawbacks of conventional synthesis models and enhancing the recovery performance. (2) A multi-fractional-order difference matrix is constructed as the analysis operator, thus avoiding the dictionary learning procedure and reducing the need for previously acquired data and computational complexities. (3) By exploiting the statistical properties of the analysis coefficients, a group weighting approach is developed to enhance the performance of analysis ℓ1-minimization. Experimental results on synthetic and real datasets reveal that the proposed approach outperforms state-of-the-art CS-based methods in terms of both spike recovery quality and classification accuracy. The energy and area efficiency of the GWALM make it an ideal candidate for resource-constrained, large-scale wireless neural recording applications. The training-free feature of the GWALM further improves its robustness to spike shape variation, thus making it more practical for long-term wireless neural recording.

  8. Assessing a novel polymer-wick based electrode for EEG neurophysiological research.

    PubMed

    Pasion, Rita; Paiva, Tiago O; Pedrosa, Paulo; Gaspar, Hugo; Vasconcelos, Beatriz; Martins, Ana C; Amaral, Maria H; Nóbrega, João M; Páscoa, Ricardo; Fonseca, Carlos; Barbosa, Fernando

    2016-07-15

    The EEG technique has decades of valid applications in clinical and experimental neurophysiology. EEG equipment and data analysis methods have seen remarkable developments, but the skin-to-electrode signal transfer remains a challenge for EEG recording. A novel quasi-dry system - the polymer wick-based electrode - was developed to overcome the limitations of conventional dry and wet silver/silver-chloride (Ag/AgCl) electrodes for EEG recording. Nine participants completed an auditory oddball protocol with simultaneous EEG acquisition using both the conventional Ag/AgCl and the wick electrodes. The wick system successfully recorded the expected P300 modulation. Standard ERP analysis, residual random noise analysis, and single-trial analysis of the P300 wave were performed in order to compare the signal acquired by the two electrode types. The novel wick electrode performed similarly to the conventional Ag/AgCl electrodes and appears to be a reliable alternative for EEG research, representing a promising halfway point between wet and dry electrodes. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Imaging Analysis of Near-Field Recording Technique for Observation of Biological Specimens

    NASA Astrophysics Data System (ADS)

    Moriguchi, Chihiro; Ohta, Akihiro; Egami, Chikara; Kawata, Yoshimasa; Terakawa, Susumu; Tsuchimori, Masaaki; Watanabe, Osamu

    2006-07-01

    We present an analysis of the imaging properties of a near-field recording technique, in comparison with simulation results. In the system, the optical field distributions localized near the specimens are recorded as surface topographic distributions on a photosensitive film. It is possible to observe both soft and moving specimens, because the system does not require a scanning probe to obtain the observed image. The imaging properties are evaluated using the fine structures of a paramecium, and we demonstrate that it is possible to observe minute differences in refractive index.

  10. An Analysis of Tower (Ground) Controller - Pilot Voice Communications

    DOT National Transportation Integrated Search

    1995-11-01

    This report is based on an analysis of over 48 hours of pilot-controller communications recorded from the ground-control : frequency at twelve air traffic control towers. The analysis examined the complexity of controller instructions, that : is, how...

  11. 76 FR 82283 - Privacy Act of 1974; Systems of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-30

    ..., excluding EEO complaints and performance-based actions, except SF-50s, will be retained for seven years... counseling; administration and personnel supervision; workforce study and analysis; manpower requirements studies; emergency loan program; and training curricula planning and research. Routine uses of records...

  12. 78 FR 55703 - Privacy Act of 1974; Systems of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-11

    ... Maritime Commission, or from the FMC's Web site at FMC Systems of Records Based on Privacy Act Issuances..., and/or research material used to support the final position classification. Authority for maintenance... the system through analysis, research, corroboration, field investigation, reporting, and referral...

  13. Content-based fused off-axis object illumination direct-to-digital holography

    DOEpatents

    Price, Jeffery R.

    2006-05-02

    Systems and methods are described for content-based fused off-axis illumination direct-to-digital holography. A method includes calculating an illumination angle with respect to an optical axis defined by a focusing lens as a function of data representing a Fourier analyzed spatially heterodyne hologram; reflecting a reference beam from a reference mirror at a non-normal angle; reflecting an object beam from an object, the object beam incident upon the object at the illumination angle; focusing the reference beam and the object beam at a focal plane of a digital recorder to form the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis; and digitally recording the content-based off-axis illuminated spatially heterodyne hologram including spatially heterodyne fringes for Fourier analysis.

  14. Active matrix-based collection of airborne analytes: an analyte recording chip providing exposure history and finger print.

    PubMed

    Fang, Jun; Park, Se-Chul; Schlag, Leslie; Stauden, Thomas; Pezoldt, Jörg; Jacobs, Heiko O

    2014-12-03

    In the field of sensors that target the detection of airborne analytes, corona/lens-based collection provides a new path to achieving high sensitivity. An active-matrix-based analyte collection approach, referred to as an "airborne analyte memory chip/recorder", is demonstrated, which takes and stores airborne analytes in a matrix to provide an exposure history for off-site analysis. © 2014 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Motion based parsing for video from observational psychology

    NASA Astrophysics Data System (ADS)

    Kokaram, Anil; Doyle, Erika; Lennon, Daire; Joyeux, Laurent; Fuller, Ray

    2006-01-01

    In psychology it is common to conduct studies involving the observation of humans undertaking some task. The sessions are typically recorded on video and used for subjective visual analysis. The subjective analysis is tedious and time consuming, not only because much useless video material is recorded but also because subjective measures of human behaviour are not necessarily repeatable. This paper presents tools using content-based video analysis that allow automated parsing of video from one such study involving dyslexia. The tools rely on implicit measures of human motion that can be generalised to other applications in the domain of human observation. Results comparing quantitative assessment of human motion with subjective assessment are also presented, illustrating that the system is a useful scientific tool.

  16. A Bit-Encoding Based New Data Structure for Time and Memory Efficient Handling of Spike Times in an Electrophysiological Setup.

    PubMed

    Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin

    2018-04-01

    Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating these developments we here set out to construct a scalable software architecture that would enable near-future massive parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide, in real time (< 25 ms), feedback based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it also has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the bit-encoding proposed has the additional advantage of allowing an extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future Brain Machine Interface requirements.
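    The bit-encoding idea above can be illustrated with a toy sketch: one bit per (neuron, time-bin), so spike counting and spatiotemporal pattern queries reduce to fast integer bit operations. The class below is a minimal illustration under that assumption, not the paper's actual storage or communication format.

```python
class SpikeBitBuffer:
    """Toy bit-encoding of spike times: each neuron holds one integer
    used as a bit field, where bit t is set iff a spike fell in time
    bin t of the shared time base. (Illustrative layout only.)"""

    def __init__(self, n_neurons):
        self.bits = [0] * n_neurons

    def record(self, neuron, time_bin):
        """Mark a spike for `neuron` in `time_bin`."""
        self.bits[neuron] |= 1 << time_bin

    def spike_count(self, neuron):
        """Total spikes for one neuron (population count of its bits)."""
        return bin(self.bits[neuron]).count("1")

    def coincident(self, a, b):
        """Number of time bins where neurons a and b both spiked:
        a single bitwise AND instead of a per-bin comparison loop."""
        return bin(self.bits[a] & self.bits[b]).count("1")
```

    The bitwise AND in `coincident` is what makes spatiotemporal spike-pattern analysis fast: one machine word compares up to 64 time bins at once.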

  17. Computer-based video analysis identifies infants with absence of fidgety movements.

    PubMed

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background: Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods: Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post-term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results: Of 241 recordings from 150 infants, 48 (24.1%) were classified with absence of FMs or sporadic FMs using the GMA. The variation of the spatial center of motion (CSD) during a recording was significantly lower in infants with normal (0.320; 95% confidence interval (CI) 0.309, 0.330) vs. absent or sporadic (0.380; 95% CI 0.361, 0.398) FMs (P<0.001). A triage model with CSD thresholds chosen for a sensitivity of 90% and a specificity of 80% gave a 40% referral rate for GMA. Conclusion: Quantitative video analysis during the FMs' period can be used to triage infants at high risk of CP to early intervention or observational GMA.
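    A quantity resembling the variability of the spatial center of motion can be computed directly from frame differences, as sketched below. The published pipeline's exact preprocessing and normalization are not specified here, so treat this as a hedged sketch: the threshold, the per-axis averaging, and the function names are my own choices.

```python
from statistics import pstdev

def motion_centroids(frames, thresh=10):
    """Centroid of changed pixels between subsequent frames.
    frames: list of 2-D grayscale lists (rows of pixel values).
    Returns one (x, y) centroid per consecutive frame pair."""
    cents = []
    for prev, cur in zip(frames, frames[1:]):
        xs, ys, n = 0.0, 0.0, 0
        for y, (rp, rc) in enumerate(zip(prev, cur)):
            for x, (p, c) in enumerate(zip(rp, rc)):
                if abs(c - p) > thresh:      # pixel counted as "moved"
                    xs += x; ys += y; n += 1
        if n:
            cents.append((xs / n, ys / n))
    return cents

def centroid_variability(cents):
    """CSD-like measure: spread of the motion centroid over the
    recording (mean of per-axis population standard deviations)."""
    if len(cents) < 2:
        return 0.0
    return (pstdev(c[0] for c in cents) + pstdev(c[1] for c in cents)) / 2
```

    A recording in which motion stays in one place yields a low value; motion wandering over the frame yields a high one, matching the direction of the group difference reported above.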

  18. Development of clinical contents model markup language for electronic health records.

    PubMed

    Yun, Ji-Hyun; Ahn, Sun-Ju; Kim, Yoon

    2012-09-01

    To develop a dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Based on analysis of the structure and characteristics of CCM in the clinical domain, we manually designed an extensible markup language (XML)-based CCM markup language (CCML) schema. CCML faithfully reflects CCM in both syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed by computer systems and used in a technology-neutral way. CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied to existing electronic health record systems.
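    Because CCML is plain XML, the "no dedicated parser" claim can be demonstrated with any standard XML library. The fragment below is hypothetical: the element and attribute names are invented for illustration and are not drawn from the actual CCML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical CCML-style fragment; names are illustrative only.
ccml = """\
<ccm id="BloodPressure" version="1.0">
  <entity name="SystolicPressure">
    <value type="PQ" unit="mmHg"/>
    <qualifier name="BodyPosition" binding="SittingStanding"/>
  </entity>
</ccm>"""

# Parsed with Python's standard library -- no dedicated parser needed.
root = ET.fromstring(ccml)
print(root.get("id"))                       # the CCM identifier
for q in root.iter("qualifier"):
    print(q.get("name"))                    # each qualifier name
```

    Any XML-aware EHR component could consume such a document the same way, which is the sense in which the format is technology-neutral.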

  19. Modes of rationality in nursing documentation: biology, biography and the 'voice of nursing'.

    PubMed

    Hyde, Abbey; Treacy, Margaret P; Scott, P Anne; Butler, Michelle; Drennan, Jonathan; Irving, Kate; Byrne, Anne; MacNeela, Padraig; Hanrahan, Marian

    2005-06-01

    This article is based on a discourse analysis of the complete nursing records of 45 patients, and concerns the modes of rationality that mediated the text-based accounts of patient care that nurses recorded. The analysis draws on the work of the critical theorist Jurgen Habermas, who conceptualised rationality in the context of modernity according to two types: purposive rationality based on an instrumental logic, and value rationality based on ethical considerations and moral reasoning. Our analysis revealed that purposive rationality dominated the content of nursing documentation, as evidenced by a particularly bio-centric and modernist construction of the workings of the body within the texts. There was little reference in the documentation to central themes of contemporary nursing discourses, such as notions of partnership, autonomy, and self-determination, which are associated with value rationality. Drawing on Habermas, we argue that this nursing documentation depicted the colonisation of the sociocultural lifeworld by the bio-technocratic system. Where nurses recorded disagreements that patients had with medical regimes, the central struggle inherent in the project of modernity became transparent: the tension between the rational and instrumental control of people through scientific regulation and the autonomy of the subject. The article concludes by problematising communicative action within the context of nursing practice.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    WESTRICH, HENRY; WILSON, ANDREW; STANTON, ERIC

    LDRDView is a software tool for visualizing a collection of textual records and exploring relationships between them for the purpose of gaining new insights about the submitted information. By evaluating the content of the records and assigning coordinates to each based on its similarity to others, LDRDView graphically displays a corpus of records either as a landscape of hills and valleys or as a graph of nodes and links. A suite of data analysis tools facilitates in-depth exploration of the corpus as a whole and the content of each individual record.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Mario E.

    An area in earthquake risk reduction that needs urgent examination is the selection of earthquake records for nonlinear dynamic analysis of structures. An often-mentioned shortcoming of nonlinear dynamic analyses of structures is that their results are limited to the type of records used as input data. This paper proposes a procedure for selecting earthquake records for nonlinear dynamic analysis of structures. The procedure uses a seismic damage index evaluated from the hysteretic energy dissipated by a single-degree-of-freedom (SDOF) system representing a multi-degree-of-freedom structure responding to an earthquake record, and from the plastic work capacity of the system at collapse. The type of structural system is considered using simple parameters. The proposed method is based on the evaluation of the damage index for a suite of earthquake records and a selected type of structural system. A set of 10 strong ground motion records is analyzed to show an application of the proposed procedure for selecting earthquake records for structural design.

  2. Dynamic photoelasticity by TDI imaging

    NASA Astrophysics Data System (ADS)

    Asundi, Anand K.; Sajan, M. R.

    2001-06-01

    High-speed photographic systems such as the image rotation camera, the Cranz-Schardin camera and the drum camera are typically used for recording and visualizing dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film, requiring time-consuming and tedious wet processing. Digital cameras are, to a certain extent, replacing conventional cameras in static experiments. Recently, there has been much interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the time delay and integration (TDI) mode for digitally recording dynamic photoelastic stress patterns. Applications to strobe and streak photoelastic pattern recording, as well as system limitations, are explained in the paper.

  3. Authenticity examination of compressed audio recordings using detection of multiple compression and encoders' identification.

    PubMed

    Korycki, Rafal

    2014-05-01

    Since the appearance of digital audio recordings, audio authentication has become increasingly difficult. The currently available technologies and free editing software allow a forger to cut or paste any single word without audible artifacts. Nowadays, the only method for digital audio files commonly approved by forensic experts is the ENF criterion. It consists of analyzing fluctuations of the mains frequency induced in the electronic circuits of recording devices; its effectiveness is therefore strictly dependent on the presence of the mains signal in the recording, which is a rare occurrence. Recently, much attention has been paid to authenticity analysis of compressed multimedia files, and several solutions have been proposed for the detection of double compression in both digital video and digital audio. This paper addresses the problem of tampering detection in compressed audio files and discusses new methods that can be used for authenticity analysis of digital recordings. The presented approaches consist of evaluating statistical features extracted from the MDCT coefficients, as well as other parameters that may be obtained from compressed audio files. The calculated feature vectors are used to train selected machine learning algorithms. The detection of multiple compression covers tampering activities as well as identification of traces of montage in digital audio recordings. To enhance the methods' robustness, an encoder identification algorithm based on analysis of inherent compression parameters was developed and applied. The effectiveness of the tampering detection algorithms is tested on a predefined large music database consisting of nearly one million compressed audio files. The influence of the compression algorithms' parameters on classification performance is discussed, based on the results of the current study. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Blind source separation for ambulatory sleep recording

    PubMed Central

    Porée, Fabienne; Kachenoura, Amar; Gauvrit, Hervé; Morvan, Catherine; Carrault, Guy; Senhadji, Lotfi

    2006-01-01

    This paper deals with the conception of a new system for sleep staging in ambulatory conditions. Sleep recording is performed by means of five electrodes: two temporal, two frontal and a reference. This configuration avoids the chin area, to enhance the quality of the muscular signal, and the hair region, for patient convenience. The EEG, EMG and EOG signals are separated using the independent component analysis (ICA) approach. The system is compared to a standard sleep analysis system using polysomnographic recordings of 14 patients. An overall concordance of 67.2% was achieved between the two systems. Based on the validation results and the computational efficiency, we recommend the clinical use of the proposed system in a commercial sleep analysis platform. PMID:16617618

  5. On the invariance of EEG-based signatures of individuality with application in biometric identification.

    PubMed

    Yunqi Wang; Najafizadeh, Laleh

    2016-08-01

    One of the main challenges in EEG-based biometric systems is to extract reliable signatures of individuality from recorded EEG data that are also invariant over time. In this paper, we investigate the invariance of features extracted from the spatial distribution of the spectral power of EEG data corresponding to 2-second eyes-closed resting-state (ECRS) recordings, in different scenarios. Eyes-closed resting-state EEG signals from 4 healthy adults were recorded in two sessions separated by at least one week. The performance in terms of correct recognition rate (CRR) is examined when the training and testing datasets are chosen from the same recording session, and when they are chosen from different sessions. It is shown that a CRR of 92% can be achieved with the proposed features when the training and testing datasets are taken from different sessions. To reduce the number of recording channels, principal component analysis (PCA) is also employed to identify the channels that carry the most discriminatory information across individuals. High CRR is obtained based on data from channels mostly covering the occipital region. The results suggest that features based on the spatial distribution of the spectral power of short (e.g. 2-second) ECRS recordings have great potential for EEG-based biometric identification systems.

  6. Rhythmic patterning in Malaysian and Singapore English.

    PubMed

    Tan, Rachel Siew Kuang; Low, Ee-Ling

    2014-06-01

    Previous work on the rhythm of Malaysian English has been based on impressionistic observations. This paper utilizes acoustic analysis to measure the rhythmic patterns of Malaysian English. Recordings of the read speech and spontaneous speech of 10 Malaysian English speakers were analyzed and compared with recordings of an equivalent sample of Singaporean English speakers. Analysis was done using two rhythmic indexes, the PVI and VarcoV. It was found that although the rhythm of the read speech of the Singaporean speakers was syllable-based, as described by previous studies, the rhythm of the Malaysian speakers was even more syllable-based. Analysis of the syllables in specific utterances showed that Malaysian speakers did not reduce vowels as much as Singaporean speakers did. Results for the spontaneous speech confirmed the findings for the read speech; that is, the same rhythmic patterning was found which normally triggers vowel reductions.
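    The two rhythm indexes mentioned are standard in this literature: the normalized PVI averages the duration difference between successive vowel intervals, scaled by each pair's mean, while VarcoV is the coefficient of variation of vowel durations. A minimal sketch follows (durations in ms; I assume the common definitions match the paper's usage):

```python
from statistics import pstdev, mean

def npvi(durations):
    """Normalised Pairwise Variability Index over successive interval
    durations; higher values indicate more stress-timed rhythm."""
    terms = [abs(a - b) / ((a + b) / 2)
             for a, b in zip(durations, durations[1:])]
    return 100 * sum(terms) / len(terms)

def varco(durations):
    """VarcoV: standard deviation of vowel durations, normalised by
    the mean and scaled by 100."""
    return 100 * pstdev(durations) / mean(durations)
```

    Perfectly even vowel durations (syllable-timed speech) give 0 on both indexes; alternating long and short vowels (vowel reduction, stress-timed speech) push both up.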

  7. Cognition, Corpora, and Computing: Triangulating Research in Usage-Based Language Learning

    ERIC Educational Resources Information Center

    Ellis, Nick C.

    2017-01-01

    Usage-based approaches explore how we learn language from our experience of language. Related research thus involves the analysis of the usage from which learners learn and of learner usage as it develops. This program involves considerable data recording, transcription, and analysis, using a variety of corpus and computational techniques, many of…

  8. The Behavior of Preschool Handicapped Children and Their Interactions with Model Children: An Update.

    ERIC Educational Resources Information Center

    Montemurro, Theodore J.

    The behavior patterns of 6 handicapped children and 14 nonhandicapped children were recorded during participation in a model developmental-interactive based curriculum for preschool children. Interactions were recorded using the Coping Analysis Schedule for Educational Settings. Among findings were the following: the consistently high occurrence…

  9. 14 CFR 152.315 - Reporting on accrual basis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Except as provided in paragraph (b) of this section each sponsor or planning agency shall submit all financial reports on an accrual basis. (b) If records are not maintained on an accrual basis by a sponsor or planning agency, reports may be based on an analysis of records or best estimates. ...

  10. "It's like texting at the dinner table": A qualitative analysis of the impact of electronic health records on patient-physician interaction in hospitals.

    PubMed

    Pelland, Kimberly D; Baier, Rosa R; Gardner, Rebekah L

    2017-06-30

    Background: Electronic health records (EHRs) may reduce medical errors and improve care, but can complicate clinical encounters. Objective: To describe hospital-based physicians' perceptions of the impact of EHRs on patient-physician interactions, and to contrast these findings against office-based physicians' perceptions. Methods: We performed a qualitative analysis of comments submitted in response to the 2014 Rhode Island Health Information Technology Survey. Office- and hospital-based physicians licensed in Rhode Island, in active practice, and located in Rhode Island or neighboring states completed the survey about their electronic health record use. The survey's response rate was 68.3%, and 2,236 (87.1%) respondents had EHRs. Among survey respondents, 27.3% of hospital-based and 37.8% of office-based physicians with EHRs responded to the question about patient interaction. Five main themes emerged for hospital-based physicians, with respondents generally perceiving EHRs as negatively altering patient interactions. We noted the same five themes among office-based physicians, but the rank order of the top two responses differed by setting: hospital-based physicians commented most frequently that they spend less time with patients because they have to spend more time on computers; office-based physicians commented most frequently on EHRs worsening the quality of their interactions and relationships with patients. In our analysis of a large sample of physicians, hospital-based physicians generally perceived EHRs as negatively altering patient interactions, although they emphasized different reasons than their office-based counterparts. These findings add to the prior literature, which focuses on outpatient physicians, and can shape interventions to improve how EHRs are used in inpatient settings.

  11. An alternative approach to characterize nonlinear site effects

    USGS Publications Warehouse

    Zhang, R.R.; Hartzell, S.; Liang, J.; Hu, Y.

    2005-01-01

    This paper examines the rationale of a method of nonstationary processing and analysis, referred to as the Hilbert-Huang transform (HHT), for its application to a recording-based approach in quantifying influences of soil nonlinearity in site response. In particular, this paper first summarizes symptoms of soil nonlinearity shown in earthquake recordings, reviews the Fourier-based approach to characterizing nonlinearity, and offers justifications for the HHT in addressing nonlinearity issues. This study then uses the HHT method to analyze synthetic data and recordings from the 1964 Niigata and 2001 Nisqually earthquakes. In doing so, the HHT-based site response is defined as the ratio of marginal Hilbert amplitude spectra, alternative to the Fourier-based response that is the ratio of Fourier amplitude spectra. With the Fourier-based approach in studies of site response as a reference, this study shows that the alternative HHT-based approach is effective in characterizing soil nonlinearity and nonlinear site response.

  12. Speech watermarking: an approach for the forensic analysis of digital telephonic recordings.

    PubMed

    Faundez-Zanuy, Marcos; Lucena-Molina, Jose J; Hagmüller, Martin

    2010-07-01

    In this article, the authors discuss the problem of forensic authentication of digital audio recordings. Although forensic audio has been addressed in several articles, the existing approaches focus on analog magnetic recordings, which are less prevalent because of the large number of digital recorders available on the market (optical, solid state, hard disks, etc.). An approach based on digital signal processing is presented, consisting of spread spectrum techniques for speech watermarking. This approach has the advantage that authentication is based on the signal itself rather than the recording format, so it is valid for the usual recording devices in police-controlled telephone intercepts. In addition, the proposal allows for the embedding of relevant information such as the recording date and time and all the relevant data (this is not always possible with classical systems). The experimental results reveal that the speech watermarking procedure does not interfere in a significant way with subsequent forensic speaker identification.
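    A toy version of the spread-spectrum embedding idea: add a key-seeded ±1 pseudo-noise sequence, scaled well below the host signal, and recover the hidden bit by correlating with the same sequence. Real speech watermarking adds psychoacoustic shaping, synchronization, and multi-bit payloads; this sketch only shows the correlation principle, and all names and parameters are illustrative.

```python
import random

def pn_sequence(length, key):
    """Key-seeded +/-1 pseudo-noise sequence (key = shared secret)."""
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(length)]

def embed(signal, bit, key, alpha=0.01):
    """Add the PN sequence, sign-modulated by one payload bit and
    scaled by alpha so the watermark stays perceptually small."""
    pn = pn_sequence(len(signal), key)
    s = 1.0 if bit else -1.0
    return [x + alpha * s * p for x, p in zip(signal, pn)]

def detect(signal, key):
    """Correlate with the PN sequence; the correlation sign recovers
    the bit because host and PN sequence are nearly uncorrelated."""
    pn = pn_sequence(len(signal), key)
    return sum(x * p for x, p in zip(signal, pn)) > 0
```

    Without the key the watermark looks like low-level noise, which is what ties the authentication to the signal itself rather than to the recording format.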

  13. Analysis of PVA/AA based photopolymers at the zero spatial frequency limit using interferometric methods.

    PubMed

    Gallego, Sergi; Márquez, Andrés; Méndez, David; Ortuño, Manuel; Neipp, Cristian; Fernández, Elena; Pascual, Inmaculada; Beléndez, Augusto

    2008-05-10

    One of the problems associated with photopolymers as optical recording media is thickness variation during the recording process. Different values of shrinkage or swelling are reported in the literature for photopolymers, and these variations depend on the spatial frequencies of the gratings stored in the material. Thickness variations can be measured using different methods: studying the deviation from the Bragg angle for nonslanted gratings, using a MicroXAM S/N 8038 interferometer, or by thermomechanical analysis experiments. In a previous paper, we began the characterization of the properties of a polyvinyl alcohol/acrylamide-based photopolymer at the lowest end of recorded spatial frequencies. In this work, we continue by analyzing the thickness variations of these materials using a reflection interferometer. With this technique we are able to obtain the variations of the layer's refractive index and, therefore, a direct estimate of the polymer refractive index.

  14. Vocalisation sound pattern identification in young broiler chickens.

    PubMed

    Fontana, I; Tullo, E; Scrase, A; Butterworth, A

    2016-09-01

    In this study, we describe the monitoring of young broiler chicken vocalisation, with sound recorded and assessed at regular intervals throughout the life of the birds from day 1 to day 38, with a focus on the first week of life. We assess whether there are recognisable, and even predictable, vocalisation patterns based on frequency and sound spectrum analysis that can be observed in birds at different ages and stages of growth within the relatively short life of the birds in commercial broiler production cycles. The experimental trials were carried out on a farm where the broilers were reared indoors, with audio recording procedures carried out over 38 days. The recordings were made using two microphones connected to a digital recorder, and the sonic data were collected without disturbance of the animals beyond that created by the routine activities of the farmer. Digital files of 1 h duration were cut into short files of 10 min duration, and these sound recordings were analysed and labelled using audio analysis software. Analysis of these short sound files showed that the key vocalisation frequencies and patterns changed in relation to the increasing age and weight of the broilers. Statistical analysis showed a significant correlation (P<0.001) between the frequency of vocalisation and the age of the birds. Based on the identification of specific frequencies of the sounds emitted, in relation to age and weight, we propose that audio monitoring and comparison with 'anticipated' sound patterns has the potential to be used to evaluate the status of farmed broiler chickens.

  15. Recording human cortical population spikes non-invasively--An EEG tutorial.

    PubMed

    Waterstraat, Gunnar; Fedele, Tommaso; Burghoff, Martin; Scheer, Hans-Jürgen; Curio, Gabriel

    2015-07-30

    Non-invasively recorded somatosensory high-frequency oscillations (sHFOs) evoked by electric nerve stimulation are markers of human cortical population spikes. Previously, their analysis was based on massive averaging of EEG responses. Advanced neurotechnology and optimized off-line analysis can enhance the signal-to-noise ratio of sHFOs, eventually enabling single-trial analysis. The rationale for developing dedicated low-noise EEG technology for sHFOs is unfolded. Detailed recording procedures and tailored analysis principles are explained step-by-step. Source codes in Matlab and Python are provided as supplementary material online. Combining synergistic hardware and analysis improvements, evoked sHFOs at around 600 Hz ('σ-bursts') can be studied in single-trials. Additionally, optimized spatial filters increase the signal-to-noise ratio of components at about 1 kHz ('κ-bursts') enabling their detection in non-invasive surface EEG. sHFOs offer a unique possibility to record evoked human cortical population spikes non-invasively. The experimental approaches and algorithms presented here enable also non-specialized EEG laboratories to combine measurements of conventional low-frequency EEG with the analysis of concomitant cortical population spike responses. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Simultaneous surface and depth neural activity recording with graphene transistor-based dual-modality probes.

    PubMed

    Du, Mingde; Xu, Xianchen; Yang, Long; Guo, Yichuan; Guan, Shouliang; Shi, Jidong; Wang, Jinfen; Fang, Ying

    2018-05-15

    Subdural surface and penetrating depth probes are widely applied to record neural activities from the cortical surface and intracortical locations of the brain, respectively. Simultaneous surface and depth neural activity recording is essential to understand the linkage between the two modalities. Here, we develop flexible dual-modality neural probes based on graphene transistors. The neural probes exhibit stable electrical performance even under 90° bending because of the excellent mechanical properties of graphene, and thus allow multi-site recording from the subdural surface of rat cortex. In addition, finite element analysis was carried out to investigate the mechanical interactions between probe and cortex tissue during intracortical implantation. Based on the simulation results, a sharp tip angle of π/6 was chosen to facilitate tissue penetration of the neural probes. Accordingly, the graphene transistor-based dual-modality neural probes have been successfully applied for simultaneous surface and depth recording of epileptiform activity of rat brain in vivo. Our results show that graphene transistor-based dual-modality neural probes can serve as a facile and versatile tool to study tempo-spatial patterns of neural activities. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Automatic detection of sleep macrostructure based on a sensorized T-shirt.

    PubMed

    Bianchi, Anna M; Mendez, Martin O

    2010-01-01

    In the present work we apply a fully automatic procedure to the analysis of signals coming from a sensorized T-shirt, worn during the night, for sleep evaluation. The goodness and reliability of the signals recorded through the T-shirt were previously tested, while the employed algorithms for feature extraction and sleep classification were previously developed on standard ECG recordings, and the obtained classification was compared to the standard clinical practice based on polysomnography (PSG). In the present work we combined T-shirt recordings and automatic classification and could obtain reliable sleep profiles, i.e., classification of sleep into WAKE, REM (rapid eye movement) and NREM stages, based on heart rate variability (HRV), respiration and movement signals.

  18. Development of Clinical Contents Model Markup Language for Electronic Health Records

    PubMed Central

    Yun, Ji-Hyun; Kim, Yoon

    2012-01-01

    Objectives To develop dedicated markup language for clinical contents models (CCM) to facilitate the active use of CCM in electronic health record systems. Methods Based on analysis of the structure and characteristics of CCM in the clinical domain, we designed extensible markup language (XML) based CCM markup language (CCML) schema manually. Results CCML faithfully reflects CCM in both the syntactic and semantic aspects. As this language is based on XML, it can be expressed and processed in computer systems and can be used in a technology-neutral way. Conclusions CCML has the following strengths: it is machine-readable and highly human-readable, it does not require a dedicated parser, and it can be applied for existing electronic health record systems. PMID:23115739

  19. Low-cost digital dynamic visualization system

    NASA Astrophysics Data System (ADS)

    Asundi, Anand K.; Sajan, M. R.

    1995-05-01

    High-speed photographic systems like the image rotation camera, the Cranz-Schardin camera and the drum camera are typically used for recording and visualization of dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film recording systems requiring time-consuming and tedious wet processing of the films. Currently, digital cameras are replacing, to a certain extent, conventional cameras for static experiments. Recently, there has been a lot of interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration (TDI) mode for digitally recording dynamic scenes. Applications in solid as well as fluid impact problems are presented.

  20. A Geographic-Information-Systems-Based Approach to Analysis of Characteristics Predicting Student Persistence and Graduation

    ERIC Educational Resources Information Center

    Ousley, Chris

    2010-01-01

    This study sought to provide empirical evidence regarding the use of spatial analysis in enrollment management to predict persistence and graduation. The research utilized data from the 2000 U.S. Census and applicant records from The University of Arizona to study the spatial distributions of enrollments. Based on the initial results, stepwise…

  1. Electronic Health Record Implementation: A SWOT Analysis.

    PubMed

    Shahmoradi, Leila; Darrudi, Alireza; Arji, Goli; Farzaneh Nejad, Ahmadreza

    2017-10-01

    The Electronic Health Record (EHR) is one of the most important achievements of information technology in the healthcare domain, and if deployed effectively, it can yield substantial benefits. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis of electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member workforce from hospitals affiliated with Tehran University of Medical Sciences (TUMS). The data were collected by using a self-structured questionnaire and analyzed by SPSS software. Based on the results, the highest priority in the strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructure was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics were the significant opportunities of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records together with physicians' and other clinical staff's resistance to the use of electronic health records. Several organizational, technical and resource elements contribute to the successful adoption of electronic health records, and consideration of these factors is essential for EHR implementation.

  2. Evaluation of thermograph data for California streams

    USGS Publications Warehouse

    Limerinos, J.T.

    1978-01-01

    Statistical analysis of water-temperature data from California streams indicates that, for most purposes, long-term operation of thermographs (automatic water-temperature recording instruments) does not provide a more useful record than either short-term operation of such instruments or periodic measurements. Harmonic analyses were made of thermograph records 5 to 14 years in length from 82 stations. More than 80 percent of the annual variation in water temperature is explained by the harmonic function for 77 of the 82 stations. Harmonic coefficients based on 8 years of thermograph record at 12 stations varied only slightly from coefficients computed using two equally split 4-year records. At five stations where both thermograph and periodic (10 to 23 measurements per year) data were collected concurrently, harmonic coefficients for periodic data were defined nearly as well as those for thermograph data. Results of this analysis indicate that, except where detailed surveillance of water temperatures is required or where there is a chance of temporal change, thermograph operations can be reduced substantially without affecting the usefulness of temperature records.
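    The harmonic function referred to above is typically a sinusoid with an annual period fitted to the temperature record; because the model is linear in its sine and cosine coefficients, it can be fitted by ordinary least squares. A minimal sketch on synthetic data (not the USGS analysis itself):

```python
import numpy as np

def fit_annual_harmonic(day_of_year, temps):
    """Least-squares fit of T(t) = M + a*sin(wt) + b*cos(wt), w = 2*pi/365."""
    w = 2 * np.pi / 365.0
    t = np.asarray(day_of_year, dtype=float)
    A = np.column_stack([np.ones_like(t), np.sin(w * t), np.cos(w * t)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(temps, dtype=float), rcond=None)
    M, a, b = coeffs
    amplitude = np.hypot(a, b)   # combined sin/cos amplitude
    return M, amplitude

# Synthetic record: mean 12 degC, annual swing of 8 degC, sampled every 10 days
# (roughly the cadence of periodic measurements discussed in the abstract).
days = np.arange(0, 365, 10)
temps = 12.0 + 8.0 * np.sin(2 * np.pi * days / 365.0 + 0.5)
M, amp = fit_annual_harmonic(days, temps)
```

With only ~37 periodic samples the fit recovers the mean and annual amplitude, which is the sense in which sparse periodic data can define the harmonic coefficients nearly as well as a continuous thermograph record.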

  3. Algorithm for automatic analysis of electro-oculographic data

    PubMed Central

    2013-01-01

    Background Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. Methods The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. Results The algorithm achieved 93% detection sensitivity for blinks with 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. Conclusion The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement. PMID:24160372

  4. Algorithm for automatic analysis of electro-oculographic data.

    PubMed

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement.
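    The amplitude-threshold detection step described in this abstract can be sketched in a few lines: flag the samples where the signal first crosses a threshold. The trace, threshold and event shapes below are invented for illustration and do not reproduce the published algorithm's auto-calibration:

```python
import numpy as np

def detect_events(eog, threshold):
    """Return start indices of excursions whose amplitude crosses threshold."""
    above = np.abs(eog) > threshold
    # An event starts where the signal first rises above the threshold.
    starts = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return starts

# Toy EOG trace: flat baseline with two blink-like pulses (values in microvolts).
fs = 100                      # Hz, assumed sampling rate
trace = np.zeros(fs * 4)
trace[100:120] = 250.0        # first "blink"
trace[300:320] = 300.0        # second "blink"

events = detect_events(trace, threshold=100.0)
```

In the published method the threshold itself is estimated automatically from signal features rather than fixed as here.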

  5. Inter-trial alignment of EEG data and phase-locking

    NASA Astrophysics Data System (ADS)

    Testorf, M. E.; Horak, P.; Connolly, A.; Holmes, G. L.; Jobst, B. C.

    2015-09-01

    Neuro-scientific studies are often aimed at imaging brain activity which is time-locked to external stimuli. This provides the possibility to use statistical methods to extract even weak signal components which occur with each stimulus. For electroencephalographic recordings this concept is limited by inevitable time jitter, which cannot be controlled in all cases. Our study is based on a cross-correlation analysis that aligns trials based on the recorded data. This is demonstrated both with simulated signals and with clinical EEG data which were recorded intracranially. Special attention is given to the evaluation of the time-frequency resolved phase-locking across multiple trials.
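    Cross-correlation alignment of jittered trials can be sketched as: find the lag that maximizes the cross-correlation with a template, then shift the trial by that lag. A synthetic illustration under invented waveforms, not the authors' implementation:

```python
import numpy as np

def align_to_template(trial, template):
    """Shift (in samples) that best aligns `trial` to `template`."""
    xcorr = np.correlate(trial, template, mode="full")
    return xcorr.argmax() - (len(template) - 1)

# Simulated evoked response with a known jitter of +15 samples.
t = np.arange(500)
template = np.exp(-((t - 200) ** 2) / 200.0)   # Gaussian "evoked" waveform
trial = np.roll(template, 15)                  # jittered single trial

lag = align_to_template(trial, template)
aligned = np.roll(trial, -lag)                 # undo the estimated jitter
```

Averaging such realigned trials, instead of raw ones, is what preserves weak time-locked components in the presence of jitter.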

  6. Computer vision-based diameter maps to study fluoroscopic recordings of small intestinal motility from conscious experimental animals.

    PubMed

    Ramírez, I; Pantrigo, J J; Montemayor, A S; López-Pérez, A E; Martín-Fontelles, M I; Brookes, S J H; Abalo, R

    2017-08-01

    When available, fluoroscopic recordings are a relatively cheap, non-invasive and technically straightforward way to study gastrointestinal motility. Spatiotemporal maps have been used to characterize motility of intestinal preparations in vitro, or in anesthetized animals in vivo. Here, a new automated computer-based method was used to construct spatiotemporal motility maps from fluoroscopic recordings obtained in conscious rats. Conscious, non-fasted, adult, male Wistar rats (n=8) received intragastric administration of barium contrast, and 1-2 hours later, when several loops of the small intestine were well-defined, a 2-minute fluoroscopic recording was obtained. Spatiotemporal diameter maps (Dmaps) were automatically calculated from the recordings. Three recordings were also manually analyzed for comparison. Frequency analysis was performed in order to calculate relevant motility parameters. In each conscious rat, a stable recording (17-20 seconds) was analyzed. The Dmaps manually and automatically obtained from the same recording were comparable, but the automated process was faster and provided higher resolution. Two frequencies of motor activity dominated; lower-frequency contractions (15.2±0.9 cpm) had an amplitude approximately five times greater than higher-frequency events (32.8±0.7 cpm). The automated method developed here needed little investigator input, provided high-resolution results with short computing times, and automatically compensated for breathing and other small movements, allowing recordings to be made without anesthesia. Although slow and/or infrequent events could not be detected in the short recording periods analyzed to date (17-20 seconds), this novel system enhances the analysis of in vivo motility in conscious animals. © 2017 John Wiley & Sons Ltd.
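    The frequency analysis of a diameter map can be illustrated on a single diameter time series: subtract the mean and locate the largest FFT peak, expressed in cycles per minute. The trace below is synthetic and the frame rate is an assumption, not a value from the study:

```python
import numpy as np

def dominant_cpm(diameters, fs):
    """Dominant contraction frequency of a diameter trace, in cycles per minute."""
    spectrum = np.abs(np.fft.rfft(diameters - diameters.mean()))
    freqs = np.fft.rfftfreq(len(diameters), d=1.0 / fs)
    return freqs[spectrum.argmax()] * 60.0

fs = 30                               # assumed video frame rate (frames/s)
t = np.arange(0, 20, 1.0 / fs)        # a 20 s stable segment, as in the study
# Toy diameter trace: 15 cpm contractions riding on a 3 mm baseline.
diam = 3.0 + 0.5 * np.sin(2 * np.pi * (15.0 / 60.0) * t)

cpm = dominant_cpm(diam, fs)
```

A full Dmap analysis would apply this per intestinal position; the 17-20 s window also explains why very slow events fall below the resolvable frequency range.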

  7. Pyrotechnic shock measurement and data analysis requirements

    NASA Technical Reports Server (NTRS)

    Albers, L.

    1975-01-01

    A study of laboratory measurement and analysis of pyrotechnic shock prompted by a discrepancy in preliminary Mariner Jupiter/Saturn shock test data is reported. It is shown that before generating shock response plots from any recorded pyrotechnic event, a complete review of each instrumentation and analysis system must be made. In addition, the frequency response capability of the tape recorder used should be as high as possible; the discrepancies in the above data were due to inadequate frequency response in the FM tape recorders. The slew rate of all conditioning amplifiers and input converters must be high enough to prevent signal distortion at maximum input voltage; amplifier ranges should be selected so that the input pulse is approximately 50% of full scale; the Bessel response type should be chosen for digital shock analysis if antialiasing filters are employed; and transducer selection must consider maximum acceleration limit, mounted resonance frequency, flat clean mounting surfaces, base bending sensitivity, and proper torque.

  8. Advanced IR System For Supersonic Boundary Layer Transition Flight Experiment

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a preferred method for investigating transition in flight: it is global and non-intrusive, and it can also be used to visualize and characterize other fluid-mechanic phenomena such as shock impingement and separation. The F-15 based system was updated with a new camera and digital video recorder to support high Reynolds number transition tests. Digital recording improves image quality and analysis capability, allows accurate quantitative (temperature) measurements, and, through greater enhancement via image processing, allows analysis of smaller-scale phenomena.

  9. Refining locations of the 2005 Mukacheve, West Ukraine, earthquakes based on similarity of their waveforms

    NASA Astrophysics Data System (ADS)

    Gnyp, Andriy

    2009-06-01

    Based on the results of application of correlation analysis to records of the 2005 Mukacheve group of recurrent events and their subsequent relocation relative to the reference event of 7 July 2005, a conclusion has been drawn that all the events most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for variation of seismic velocities beneath their locations as compared to the travel-time tables used in the study. From a methodological standpoint, the potential and usefulness of correlation analysis of seismic records for a more detailed study of seismic processes, tectonics and geodynamics of the Carpathian region have been demonstrated.

  10. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open-source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command-line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.

  11. Acid-base equilibrium in aqueous solutions of 1,3-dimethylbarbituric acid as studied by 13C NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Gryff-Keller, A.; Kraska-Dziadecka, A.

    2011-12-01

    13C NMR spectra of 1,3-dimethylbarbituric acid in aqueous solutions of various acidities and for various solute concentrations have been recorded and interpreted. The spectra recorded at pH = 2 and below contain the signals of the neutral solute molecule exclusively, while the ones recorded at pH = 7 and above contain only the signals of the appropriate anion, which has been confirmed by theoretical GIAO-DFT calculations. The signals in the spectra recorded for solutions of pH < 7 show dynamic broadenings. The lineshape analysis of these signals has provided information on the kinetics of the processes running in the dynamic acid-base equilibrium. The kinetic data determined this way have been used to clarify the mechanisms of these processes. The numerical analysis has shown that under the investigated conditions deprotonation of the neutral solute molecules proceeds not only via a simple transfer of the C-H proton to water molecules but also through a process with participation of the barbiturate anions. Moreover, the importance of tautomerism, or association, or both these phenomena for the kinetics of the acid-base transformations in the investigated system has been shown. Qualitatively similar changes of 13C NMR spectra with the solution pH variation have been observed for the parent barbituric acid.

  12. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    PubMed

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, Event-based Text-mining of Health Electronic Records achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated Event-based Text-mining of Health Electronic Records' ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.

  13. An optogenetics- and imaging-assisted simultaneous multiple patch-clamp recording system for decoding complex neural circuits

    PubMed Central

    Wang, Guangfu; Wyskiel, Daniel R; Yang, Weiguo; Wang, Yiqing; Milbern, Lana C; Lalanne, Txomin; Jiang, Xiaolong; Shen, Ying; Sun, Qian-Quan; Zhu, J Julius

    2015-01-01

    Deciphering neuronal circuitry is central to understanding brain function and dysfunction, yet it remains a daunting task. To facilitate the dissection of neuronal circuits, a process requiring functional analysis of synaptic connections and morphological identification of interconnected neurons, we present here a method for stable simultaneous octuple patch-clamp recordings. This method allows physiological analysis of synaptic interconnections among 4–8 simultaneously recorded neurons and/or 10–30 sequentially recorded neurons, and it allows anatomical identification of >85% of recorded interneurons and >99% of recorded principal neurons. We describe how to apply the method to rodent tissue slices; however, it can be used on other model organisms. We also describe the latest refinements and optimizations of mechanics, electronics, optics and software programs that are central to the realization of a combined single- and two-photon microscopy–based, optogenetics- and imaging-assisted, stable, simultaneous quadruple–viguple patch-clamp recording system. Setting up the system, from the beginning of instrument assembly and software installation to full operation, can be completed in 3–4 d. PMID:25654757

  14. Access control and privilege management in electronic health record: a systematic literature review.

    PubMed

    Jayabalan, Manoj; O'Daniel, Thomas

    2016-12-01

    This study presents a systematic literature review of access control for electronic health record systems to protect patient's privacy. Articles from 2006 to 2016 were extracted from the ACM Digital Library, IEEE Xplore Digital Library, Science Direct, MEDLINE, and MetaPress using broad eligibility criteria, and chosen for inclusion based on analysis of ISO22600. Cryptographic standards and methods were left outside the scope of this review. Three broad classes of models are being actively investigated and developed: access control for electronic health records, access control for interoperability, and access control for risk analysis. Traditional role-based access control models are extended with spatial, temporal, probabilistic, dynamic, and semantic aspects to capture contextual information and provide granular access control. Maintenance of audit trails and facilities for overriding normal roles to allow full access in emergency cases are common features. Access privilege frameworks utilizing ontology-based knowledge representation for defining the rules have attracted considerable interest, due to the higher level of abstraction that makes it possible to model domain knowledge and validate access requests efficiently.

  15. HEDEA: A Python Tool for Extracting and Analysing Semi-structured Information from Medical Records

    PubMed Central

    Aggarwal, Anshul; Garhwal, Sunita

    2018-01-01

    Objectives One of the most important functions for a medical practitioner while treating a patient is to study the patient's complete medical history by going through all records, from test results to doctor's notes. With the increasing use of technology in medicine, these records are mostly digital, alleviating the problem of looking through a stack of papers, which are easily misplaced, but some of these are in an unstructured form. Large parts of clinical reports are in written text form and are tedious to use directly without appropriate pre-processing. In medical research, such health records may be a good, convenient source of medical data; however, lack of structure means that the data is unfit for statistical evaluation. In this paper, we introduce a system to extract, store, retrieve, and analyse information from health records, with a focus on the Indian healthcare scene. Methods A Python-based tool, Healthcare Data Extraction and Analysis (HEDEA), has been designed to extract structured information from various medical records using a regular expression-based approach. Results The HEDEA system is working, covering a large set of formats, to extract and analyse health information. Conclusions This tool can be used to generate analysis report and charts using the central database. This information is only provided after prior approval has been received from the patient for medical research purposes. PMID:29770248

  16. HEDEA: A Python Tool for Extracting and Analysing Semi-structured Information from Medical Records.

    PubMed

    Aggarwal, Anshul; Garhwal, Sunita; Kumar, Ajay

    2018-04-01

    One of the most important functions for a medical practitioner while treating a patient is to study the patient's complete medical history by going through all records, from test results to doctor's notes. With the increasing use of technology in medicine, these records are mostly digital, alleviating the problem of looking through a stack of papers, which are easily misplaced, but some of these are in an unstructured form. Large parts of clinical reports are in written text form and are tedious to use directly without appropriate pre-processing. In medical research, such health records may be a good, convenient source of medical data; however, lack of structure means that the data is unfit for statistical evaluation. In this paper, we introduce a system to extract, store, retrieve, and analyse information from health records, with a focus on the Indian healthcare scene. A Python-based tool, Healthcare Data Extraction and Analysis (HEDEA), has been designed to extract structured information from various medical records using a regular expression-based approach. The HEDEA system is working, covering a large set of formats, to extract and analyse health information. This tool can be used to generate analysis report and charts using the central database. This information is only provided after prior approval has been received from the patient for medical research purposes.
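    A regular-expression-based extraction step of the kind HEDEA performs can be sketched as follows. The record format, field names and pattern below are hypothetical; the abstract does not disclose HEDEA's actual patterns:

```python
import re

# Hypothetical semi-structured lab-report lines, invented for illustration.
RECORD = """Patient: A. Kumar
Hemoglobin: 13.2 g/dL
Fasting Glucose: 104 mg/dL"""

# One "name: value unit" pattern per line; free-text lines simply do not match.
TEST_LINE = re.compile(
    r"^(?P<name>[A-Za-z ]+):\s*(?P<value>[\d.]+)\s*(?P<unit>[a-zA-Z/]+)$",
    re.MULTILINE)

def extract_results(text):
    """Map each test name to a (value, unit) pair."""
    return {m["name"].strip(): (float(m["value"]), m["unit"])
            for m in TEST_LINE.finditer(text)}

results = extract_results(RECORD)
```

Structuring values this way is what makes the extracted data amenable to the statistical evaluation the abstract describes.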

  17. Merging Dietary Assessment with the Adolescent Lifestyle

    PubMed Central

    Schap, TusaRebecca E; Zhu, Fengqing M; Delp, Edward J; Boushey, Carol J

    2013-01-01

    The use of image-based dietary assessment methods shows promise for improving dietary self-report among children. The Technology Assisted Dietary Assessment (TADA) food record application is a self-administered food record specifically designed to address the burden and human error associated with conventional methods of dietary assessment. Users take images of foods and beverages at all eating occasions using a mobile telephone or mobile device with an integrated camera (e.g., Apple iPhone, Google Nexus One, Apple iPod Touch). Once the images are taken, they are transferred to a back-end server for automated analysis. The first step in this process, image analysis (i.e., segmentation, feature extraction, and classification), allows for automated food identification. Portion size estimation is also automated via segmentation and geometric shape template modeling. The results of the automated food identification and volume estimation can be indexed with the Food and Nutrient Database for Dietary Studies (FNDDS) to provide a detailed diet analysis for use in epidemiologic or intervention studies. Data collected during controlled feeding studies in a camp-like setting have allowed for formative evaluation and validation of the TADA food record application. This review summarizes the system design and the evidence-based development of image-based methods for dietary assessment among children. PMID:23489518

  18. Reconstruction of human brain spontaneous activity based on frequency-pattern analysis of magnetoencephalography data

    PubMed Central

    Llinás, Rodolfo R.; Ustinin, Mikhail N.; Rykunov, Stanislav D.; Boyko, Anna I.; Sychev, Vyacheslav V.; Walton, Kerry D.; Rabello, Guilherme M.; Garcia, John

    2015-01-01

    A new method for the analysis and localization of brain activity has been developed, based on multichannel magnetic field recordings, over minutes, superimposed on the MRI of the individual. Here, a high-resolution Fourier transform is obtained over the entire recording period, leading to a detailed multi-frequency spectrum. Further analysis implements a total decomposition of the frequency components into functionally invariant entities, each having an invariant field pattern localizable in recording space. The method, addressed as functional tomography, makes it possible to find the distribution of magnetic field sources in space. Here, the method is applied to the analysis of simulated data, to oscillating signals activating a physical current-dipole phantom, and to recordings of spontaneous brain activity in 10 healthy adults. In the analysis of simulated data, 61 dipoles are localized with 0.7 mm precision. Concerning the physical phantom, the method is able to localize three simultaneously activated current dipoles with 1 mm precision. A spatial resolution of 3 mm was attained when localizing spontaneous alpha-rhythm activity in 10 healthy adults, where the alpha peak was specified for each subject individually. Co-registration of the functional tomograms with each subject's head MRI localized alpha-range activity to the occipital and/or posterior parietal brain region. This is the first application of this new functional tomography to human brain activity. The method successfully provides an overall view of brain electrical activity, a detailed spectral description and, combined with MRI, the localization of sources in anatomical brain space. PMID:26528119

  19. Time-Varying Networks of Inter-Ictal Discharging Reveal Epileptogenic Zone.

    PubMed

    Zhang, Luyan; Liang, Yi; Li, Fali; Sun, Hongbin; Peng, Wenjing; Du, Peishan; Si, Yajing; Song, Limeng; Yu, Liang; Xu, Peng

    2017-01-01

    Synchronous neuronal discharging may cause an epileptic seizure. Currently, most of the studies conducted to investigate the mechanism of epilepsy are based on EEGs or functional magnetic resonance imaging (fMRI) recorded during the ictal discharging or the resting state, and few studies have probed into the dynamic patterns during the inter-ictal discharging that are much easier to record in clinical applications. Here, we propose a time-varying network analysis based on the adaptive directed transfer function to uncover the dynamic brain network patterns during the inter-ictal discharging. In addition, an algorithm based on the time-varying outflow of information derived from the network analysis is developed to detect the epileptogenic zone. The analysis performed revealed the time-varying network patterns during different stages of inter-ictal discharging; the epileptogenic zone was activated prior to the discharge onset and then worked as the source propagating the activity to other brain regions. Consistency between the epileptogenic zones detected by our proposed approach and the actual epileptogenic zones proved that time-varying network analysis can not only reveal the underlying neural mechanism of epilepsy, but also function as a useful tool for detecting the epileptogenic zone based on EEGs in the inter-ictal discharging.

  20. Validation of PC-based Sound Card with Biopac for Digitalization of ECG Recording in Short-term HRV Analysis.

    PubMed

    Maheshkumar, K; Dilara, K; Maruthy, K N; Sundareswaren, L

    2016-07-01

    Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation of heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to compare (validate) HRV computed from temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) against the Biopac MP36 module. Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, participated in the present study. Following the standard protocol, 5-min ECG was recorded after 10 min of supine rest, simultaneously by the portable simple analog amplifier with PC-based sound card and by the Biopac module, with surface electrodes in the lead II position. All the ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in Kubios software. Short-term HRV indices in both the time and frequency domains were used. The unpaired Student's t-test and Pearson correlation coefficient test were used for the analysis in the R statistical software. No statistically significant differences were observed when comparing the values analyzed by means of the two devices for HRV. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the time- and frequency-domain values obtained by the two devices. On the basis of the results of the present study, we suggest that the calculation of HRV values in the time and frequency domains from RR series obtained with the PC-based sound card is probably as reliable as that obtained with the gold-standard Biopac MP36.
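
    As a minimal sketch of the time-domain analysis (the RR series below are invented; SDNN and RMSSD are the standard indices such software computes), the device comparison reduces to computing indices from each RR series and correlating them:

```python
import numpy as np

def sdnn(rr):
    """Standard deviation of RR intervals (ms)."""
    return float(np.std(rr, ddof=1))

def rmssd(rr):
    """Root mean square of successive RR differences (ms)."""
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

# Toy RR series (ms) from the two hypothetical devices; nearly identical,
# as the sound-card and Biopac recordings were found to be in the study.
rr_soundcard = np.array([812.0, 798.0, 805.0, 821.0, 809.0, 795.0])
rr_biopac    = rr_soundcard + np.array([0.4, -0.3, 0.2, -0.1, 0.3, -0.2])

r = np.corrcoef(rr_soundcard, rr_biopac)[0, 1]   # Pearson correlation
print(round(sdnn(rr_soundcard), 2), round(rmssd(rr_soundcard), 2))
```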

  1. Standards for data acquisition and software-based analysis of in vivo electroencephalography recordings from animals. A TASK1-WG5 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S

    2017-11-01

    Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  2. The Introduction and Refinement of the Assessment of Digitally Recorded Audio Presentations

    ERIC Educational Resources Information Center

    Sinclair, Stefanie

    2016-01-01

    This case study critically evaluates benefits and challenges of a form of assessment included in a final year undergraduate Religious Studies Open University module, which combines a written essay task with a digital audio recording of a short oral presentation. Based on the analysis of student and tutor feedback and sample assignments, this study…

  3. Non-contact cardiac pulse rate estimation based on web-camera

    NASA Astrophysics Data System (ADS)

    Wang, Yingzhi; Han, Tailin

    2015-12-01

    In this paper, we introduce a new methodology for non-contact cardiac pulse rate estimation based on imaging photoplethysmography (iPPG) and blind source separation. This novel approach can be applied to color video recordings of the human face and is based on automatic face tracking along with blind source separation of the color channels into three RGB components. First, the data extracted from the color video undergo pre-processing steps such as normalization and sphering. Independent Component Analysis (ICA) with the JADE algorithm, followed by spectral analysis, is then used to estimate the cardiac pulse rate. With Bland-Altman and correlation analysis, we compared the cardiac pulse rate extracted from videos recorded by a basic webcam to that from a commercial pulse oximetry sensor and achieved high accuracy and correlation. The root mean square error of the estimates is 2.06 bpm, which indicates that the algorithm can realize non-contact measurement of cardiac pulse rate.
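
    A much-simplified sketch of the estimation chain (the study uses ICA with the JADE algorithm on three channels; here only the normalization, detrending, and spectral-peak stages are shown, on a single synthetic channel with invented frame rate and heart rate):

```python
import numpy as np

fs = 30.0                      # webcam frame rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)   # 30 s of "video"
true_hr = 72.0                 # beats per minute, for the synthetic trace

# Synthetic green-channel trace: slow illumination drift + cardiac ripple + noise
rng = np.random.default_rng(1)
green = 0.5 * t + 0.05 * np.sin(2 * np.pi * (true_hr / 60.0) * t) \
        + 0.01 * rng.standard_normal(t.size)

# Normalization (as in the paper's pre-processing): zero mean, unit variance
x = (green - green.mean()) / green.std()
x = x - np.polyval(np.polyfit(t, x, 1), t)   # remove the linear drift

# Spectral peak inside a plausible cardiac band (0.75-4 Hz, i.e. 45-240 bpm)
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, 1 / fs)
band = (freqs >= 0.75) & (freqs <= 4.0)
est_hr = 60.0 * freqs[band][spec[band].argmax()]
print(round(est_hr, 1))        # → 72.0
```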

  4. Improvement of Epicentral Direction Estimation by P-wave Polarization Analysis

    NASA Astrophysics Data System (ADS)

    Oshima, Mitsutaka

    2016-04-01

    Polarization analysis has been used to analyze the polarization characteristics of waves and has been developed in various fields, for example, electromagnetics, optics, and seismology. In seismology, polarization analysis is used to discriminate seismic phases or to enhance a specific phase (e.g., Flinn, 1965) [1], by taking advantage of the difference in polarization characteristics of seismic phases. In earthquake early warning, polarization analysis is used to estimate the epicentral direction from a single station, based on the polarization direction of the P-wave portion of seismic records (e.g., Smart and Sproules (1981) [2]; Noda et al. (2012) [3]). Therefore, improvement of the Estimation of Epicentral Direction by Polarization Analysis (EEDPA) directly enhances the accuracy and promptness of earthquake early warning. In this study, the author tried to improve EEDPA using seismic records of events that occurred around Japan from 2003 to 2013. The author selected events satisfying the following conditions: MJMA 6.5 or larger (JMA: Japan Meteorological Agency), and seismic records available at no fewer than 3 stations within 300 km epicentral distance. Seismic records obtained at stations with no information on seismometer orientation were excluded, so that a precise and quantitative evaluation of the accuracy of EEDPA was possible. In the analysis, polarization was calculated by the method of Vidale (1986) [4], which extended the method proposed by Montalbetti and Kanasewich (1970) [5] to use the analytic signal. As a result of the analysis, the author found that the accuracy of EEDPA improves by about 15% if velocity records, rather than displacement records, are used, contrary to the author's expectation. The use of velocity records reduces the CPU time spent integrating seismic records and improves the promptness of EEDPA, although this analysis is still rough and further scrutiny is essential. 
So far, the author has used seismic records obtained by simply integrating acceleration records, with no filtering applied. Further study on the optimal type of filter and its application frequency band is necessary. The results of the aforementioned study will be shown in the poster presentation. [1] Flinn, E. A. (1965). Signal analysis using rectilinearity and direction of particle motion. Proceedings of the IEEE, 53(12), 1874-1876. [2] Smart, E., & Sproules, H. (1981). Regional phase processors (No. SDAC-TR-81-1). Teledyne Geotech, Alexandria, VA: Seismic Data Analysis Center. [3] Noda, S., Yamamoto, S., Sato, S., Iwata, N., Korenaga, M., & Ashiya, K. (2012). Improvement of back-azimuth estimation in real-time by using a single station record. Earth, Planets and Space, 64(3), 305-308. [4] Vidale, J. E. (1986). Complex polarization analysis of particle motion. Bulletin of the Seismological Society of America, 76(5), 1393-1405. [5] Montalbetti, J. F., & Kanasewich, E. R. (1970). Enhancement of teleseismic body phases with a polarization filter. Geophysical Journal International, 21(2), 119-129.
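
    The single-station estimation step can be sketched via eigenanalysis of the three-component covariance matrix, in the spirit of the Montalbetti-Kanasewich polarization filter (a simplified sketch without Vidale's analytic-signal extension; the synthetic traces and the 60-degree back-azimuth are invented):

```python
import numpy as np

def back_azimuth(z, north, east):
    """Estimate back-azimuth (degrees from north) from a P-wave window.
    z, north, east: vertical, north, east velocity traces of equal length."""
    m = np.cov(np.vstack([z, north, east]))   # 3x3 covariance matrix
    w, v = np.linalg.eigh(m)
    p = v[:, np.argmax(w)]                    # principal polarization direction
    if p[0] < 0:                              # resolve the eigenvector sign
        p = -p                                #  using the vertical component
    az = np.degrees(np.arctan2(p[2], p[1]))   # east over north
    return az % 360.0

# Synthetic P-wave: rectilinear motion arriving from back-azimuth 60 degrees
t = np.linspace(0, 1, 200)
pulse = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)
baz_true = 60.0
z = 0.8 * pulse
north = 0.6 * np.cos(np.radians(baz_true)) * pulse
east = 0.6 * np.sin(np.radians(baz_true)) * pulse
print(round(back_azimuth(z, north, east), 1))   # → 60.0
```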

  5. Evaluation of modal pushover-based scaling of one component of ground motion: Tall buildings

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2012-01-01

    Nonlinear response history analysis (RHA) is now increasingly used for performance-based seismic design of tall buildings. Required for nonlinear RHAs is a set of ground motions selected and scaled appropriately so that analysis results would be accurate (unbiased) and efficient (having relatively small dispersion). This paper evaluates accuracy and efficiency of recently developed modal pushover–based scaling (MPS) method to scale ground motions for tall buildings. The procedure presented explicitly considers structural strength and is based on the standard intensity measure (IM) of spectral acceleration in a form convenient for evaluating existing structures or proposed designs for new structures. Based on results presented for two actual buildings (19 and 52 stories, respectively), it is demonstrated that the MPS procedure provided a highly accurate estimate of the engineering demand parameters (EDPs), accompanied by significantly reduced record-to-record variability of the responses. In addition, the MPS procedure is shown to be superior to the scaling procedure specified in the ASCE/SEI 7-05 document.

  6. Theoretical analysis of intracortical microelectrode recordings

    NASA Astrophysics Data System (ADS)

    Lempka, Scott F.; Johnson, Matthew D.; Moffitt, Michael A.; Otto, Kevin J.; Kipke, Daryl R.; McIntyre, Cameron C.

    2011-08-01

    Advanced fabrication techniques have now made it possible to produce microelectrode arrays for recording the electrical activity of a large number of neurons in the intact brain for both clinical and basic science applications. However, the long-term recording performance desired for these applications is hindered by a number of factors that lead to device failure or a poor signal-to-noise ratio (SNR). The goal of this study was to identify factors that can affect recording quality using theoretical analysis of intracortical microelectrode recordings of single-unit activity. Extracellular microelectrode recordings were simulated with a detailed multi-compartment cable model of a pyramidal neuron coupled to a finite-element volume conductor head model containing an implanted recording microelectrode. Recording noise sources were also incorporated into the overall modeling infrastructure. The analyses of this study would be very difficult to perform experimentally; however, our model-based approach enabled a systematic investigation of the effects of a large number of variables on recording quality. Our results demonstrate that recording amplitude and noise are relatively independent of microelectrode size, but instead are primarily affected by the selected recording bandwidth, impedance of the electrode-tissue interface and the density and firing rates of neurons surrounding the recording electrode. This study provides the theoretical groundwork that allows for the design of the microelectrode and recording electronics such that the SNR is maximized. Such advances could help enable the long-term functionality required for chronic neural recording applications.

  7. Theoretical analysis of intracortical microelectrode recordings

    PubMed Central

    Lempka, Scott F; Johnson, Matthew D; Moffitt, Michael A; Otto, Kevin J; Kipke, Daryl R; McIntyre, Cameron C

    2011-01-01

    Advanced fabrication techniques have now made it possible to produce microelectrode arrays for recording the electrical activity of a large number of neurons in the intact brain for both clinical and basic science applications. However, the long-term recording performance desired for these applications is hindered by a number of factors that lead to device failure or a poor signal-to-noise ratio (SNR). The goal of this study was to identify factors that can affect recording quality using theoretical analysis of intracortical microelectrode recordings of single-unit activity. Extracellular microelectrode recordings were simulated with a detailed multi-compartment cable model of a pyramidal neuron coupled to a finite element volume conductor head model containing an implanted recording microelectrode. Recording noise sources were also incorporated into the overall modeling infrastructure. The analyses of this study would be very difficult to perform experimentally; however, our model-based approach enabled a systematic investigation of the effects of a large number of variables on recording quality. Our results demonstrate that recording amplitude and noise are relatively independent of microelectrode size, but instead are primarily affected by the selected recording bandwidth, impedance of the electrode-tissue interface, and the density and firing rates of neurons surrounding the recording electrode. This study provides the theoretical groundwork that allows for the design of the microelectrode and recording electronics such that the SNR is maximized. Such advances could help enable the long-term functionality required for chronic neural recording applications. PMID:21775783

  8. Training Aids for Online Instruction: An Analysis.

    ERIC Educational Resources Information Center

    Guy, Robin Frederick

    This paper describes a number of different types of training aids currently employed in online training: non-interactive audiovisual presentations; interactive computer-based aids; partially interactive aids based on recorded searches; print-based materials; and kits. The advantages and disadvantages of each type of aid are noted, and a table…

  9. A simple computer-based measurement and analysis system of pulmonary auscultation sounds.

    PubMed

    Polat, Hüseyin; Güler, Inan

    2004-12-01

    Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study, a computer-based system has been designed for easy measurement and analysis of lung sounds using the software package DasyLAB. The designed system offers the following features: it can digitally record lung sounds captured with an electronic stethoscope plugged into the sound card of a portable computer, display the lung sound waveform for each auscultation site, save the lung sound in ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display the time-expanded waveform, compute the Fast Fourier Transform (FFT), and display the power spectrum and spectrogram.
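
    The FFT and power-spectrum stage of such a system can be sketched in a few lines (DasyLAB specifics are not reproduced; the synthetic "lung sound" and sampling rate are invented for illustration):

```python
import numpy as np

fs = 8000                                  # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)
# Synthetic "lung sound": broadband breath noise plus a 400 Hz wheeze-like tone
rng = np.random.default_rng(2)
sound = 0.2 * rng.standard_normal(t.size) + np.sin(2 * np.pi * 400 * t)

# Power spectrum via FFT, as the described system computes
window = np.hanning(sound.size)            # taper to reduce spectral leakage
spec = np.abs(np.fft.rfft(sound * window)) ** 2
freqs = np.fft.rfftfreq(sound.size, 1 / fs)
peak_hz = freqs[spec.argmax()]
print(int(peak_hz))                        # → 400
```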

  10. Variability of road traffic noise recorded by stationary monitoring stations

    NASA Astrophysics Data System (ADS)

    Bąkowski, Andrzej; Radziszewski, Leszek

    2017-11-01

    The paper presents the analysis results of the equivalent sound level recorded by two road traffic noise monitoring stations. The stations were located in Kielce (an example of a medium-sized town in Poland) on the roads out of the town in the directions of Kraków and Warszawa. The measurements were carried out by stationary stations monitoring the noise and traffic of motor vehicles. RMS values based on the A-weighted sound level were recorded every 1 s in a buffer, and the results were registered every 1 min over the investigation period. The registered data were the basis for calculating the equivalent sound level for three time intervals: from 6:00 to 18:00, from 18:00 to 22:00, and from 22:00 to 6:00. The analysis included the values of the equivalent sound level recorded on different days of the week, split into 24-h periods, nights, days, and evenings. The data analysed included recordings from 2013. The coefficient of variation and the positional coefficient of variation were proposed for comparative analysis of the scatter in the obtained data. The investigations indicated that the recorded data varied depending on the traffic route. The differences concerned the values of the coefficients of variation of the equivalent sound levels.
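
    The equivalent (energy-average) sound level underlying such calculations follows Leq = 10 log10(mean(10^(L/10))). A small sketch with invented 1-min samples, including the coefficient of variation proposed for comparing scatter:

```python
import numpy as np

def leq(levels_db):
    """Equivalent continuous sound level from equal-duration A-weighted samples."""
    return 10.0 * np.log10(np.mean(10.0 ** (np.asarray(levels_db) / 10.0)))

# Hypothetical 1-min LAeq samples (dB) from a monitoring-station buffer
day = [65.0, 67.0, 66.0, 70.0]      # excerpt from the 6:00-18:00 interval
night = [55.0, 54.0, 53.0, 56.0]    # excerpt from the 22:00-6:00 interval
print(round(leq(day), 1), round(leq(night), 1))

# Coefficient of variation (percent), for comparing scatter between stations
cv = 100.0 * np.std(day) / np.mean(day)
```

Note that Leq is dominated by the loudest samples (the 70 dB minute pulls the daytime Leq well above the arithmetic mean of the levels), which is why energy averaging rather than simple averaging is used.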

  11. The Sensitivity of Flood Frequency Analysis to Record Length in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Hu, L.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    In flood frequency analysis (FFA), sufficiently long data series are important for obtaining reliable results. Relative to the return periods of interest, at-site FFA usually requires large data sets. Generally, the precision of at-site estimators and the time-sampling errors are associated with the length of the gauged record. In this work, we quantify the differences arising from various record lengths. We apply the generalized extreme value (GEV) and Log-Pearson type III (LP3) distributions, two traditional methods, to annual maximum streamflows to undertake FFA, and propose quantitative measures, the relative differences in the median and the interquartile range (IQR), to compare flood frequency estimates across record lengths at 350 selected USGS gauges in the contiguous United States, each with more than 70 years of record. We also group those gauges into regions based on the hydrologic unit map and discuss the geographic effects. The results indicate that a long record avoids imposing an upper limit on the degree of sophistication, and that working with longer records may lead to more accurate results than working with shorter ones. The influence of the hydrologic units of the watershed boundary dataset on those gauges is also presented. The California region is the most sensitive to record length, while gauges in the east show steady performance.
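
    One way to probe record-length sensitivity, sketched here with scipy.stats.genextreme on a synthetic annual-maximum series (all parameter values are invented, and scipy's shape convention differs in sign from the usual GEV ξ), is to compare return levels fitted to the full record and to a truncated subset:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic 70-year annual-maximum flow record (m^3/s), GEV-distributed
true_c, true_loc, true_scale = -0.1, 500.0, 120.0
record = stats.genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                              size=70, random_state=rng)

def return_level(series, T=100):
    """T-year return level from a GEV fit to annual maxima."""
    c, loc, scale = stats.genextreme.fit(series)
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# Contrast a long record with a short subset, as in the sensitivity analysis
q_long = return_level(record)        # all 70 years
q_short = return_level(record[:30])  # first 30 years only
print(q_long > 0 and q_short > 0)    # → True
```

The spread of `q_short` across different 30-year subsets, relative to `q_long`, is the kind of record-length effect the study quantifies with median and IQR differences.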

  12. Automated Assessment of Child Vocalization Development Using LENA.

    PubMed

    Richards, Jeffrey A; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-07-12

    To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Assessment was based on full-day audio recordings collected in a child's unrestricted, natural language environment. AVA estimates were derived using automatic speech recognition modeling techniques to categorize and quantify the sounds in child vocalizations (e.g., protophones and phonemes). These were expressed as phone and biphone frequencies, reduced to principal components, and inputted to age-based multiple linear regression models to predict independently collected criterion-expressive language scores. From these models, we generated vocal development AVA estimates as age-standardized scores and development age estimates. AVA estimates demonstrated strong statistical reliability and validity when compared with standard criterion expressive language assessments. Automated analysis of child vocalizations extracted from full-day recordings in natural settings offers a novel and efficient means to assess children's expressive vocal development. More research remains to identify specific mechanisms of operation.
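
    The modeling chain described, phone-frequency features reduced to principal components and fed into a multiple linear regression, can be sketched with synthetic data (all dimensions, features, and scores are invented; this is not the LENA implementation):

```python
import numpy as np

rng = np.random.default_rng(4)
n_children, n_phones = 120, 40

# Synthetic phone-frequency features and a criterion expressive-language score
X = rng.standard_normal((n_children, n_phones))
true_w = rng.standard_normal(n_phones)
score = X @ true_w + 0.1 * rng.standard_normal(n_children)

# Step 1: reduce phone frequencies to principal components (via SVD)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:10].T                     # keep the first 10 components

# Step 2: multiple linear regression of criterion scores on the components
A = np.column_stack([pcs, np.ones(n_children)])
coef, *_ = np.linalg.lstsq(A, score, rcond=None)
pred = A @ coef
r = np.corrcoef(pred, score)[0, 1]
print(r > 0)                             # → True
```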

  13. Analysis on the Correlation of Traffic Flow in Hainan Province Based on Baidu Search

    NASA Astrophysics Data System (ADS)

    Chen, Caixia; Shi, Chun

    2018-03-01

    Internet search data records user’s search attention and consumer demand, providing necessary database for the Hainan traffic flow model. Based on Baidu Index, with Hainan traffic flow as example, this paper conduct both qualitative and quantitative analysis on the relationship between search keyword from Baidu Index and actual Hainan tourist traffic flow, and build multiple regression model by SPSS.

  14. Do's and don'ts in Fourier analysis of steady-state potentials.

    PubMed

    Bach, M; Meigen, T

    1999-01-01

    Fourier analysis is a powerful tool in signal analysis that can be very fruitfully applied to steady-state evoked potentials (flicker ERG, pattern ERG, VEP, etc.). However, there are some inherent assumptions in the underlying discrete Fourier transform (DFT) that are not necessarily fulfilled in typical electrophysiological recording and analysis conditions. Furthermore, engineering software packages may be ill-suited and/or may not fully exploit the information of steady-state recordings. Specifically: * In the case of steady-state stimulation we know more about the stimulus than in standard textbook situations (exact frequency, phase stability), so 'windowing' and calculation of the 'periodogram' are not necessary. * It is mandatory to choose an integer relationship between sampling rate and frame rate when employing a raster-based CRT stimulator. * The analysis interval must comprise an exact integer number (e.g., 10) of stimulus periods. * The choice of the number of stimulus periods per analysis interval requires a wise compromise: a high number increases the frequency resolution, but makes artifact removal difficult; a low number 'spills' noise into the response frequency. * There is no need to feel tied to a power-of-two number of data points as required by standard FFT; 'resampling' is an easy and efficient alternative. * Proper estimates of noise-corrected Fourier magnitude and statistical significance can be calculated that take into account the non-linear superposition of signal and noise. These aspects are developed in an intuitive approach with examples using both simulations and recordings. Proper use of Fourier analysis of our electrophysiological records will reduce recording time and/or increase the reliability of physiological or pathological interpretations.
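
    The integer-period rule can be demonstrated with a short numpy sketch (stimulus frequency and sampling rate are invented): when the analysis interval holds an exact integer number of stimulus periods, all of the response power falls into a single DFT bin; half a period extra spills it across neighbouring bins.

```python
import numpy as np

fs = 1000.0        # sampling rate (Hz); values are illustrative
f_stim = 8.0       # steady-state stimulus frequency (Hz)

def stim_bin_fraction(n_periods):
    """Fraction of total spectral power in the stimulus-frequency bin."""
    t = np.arange(0, n_periods / f_stim, 1 / fs)
    x = np.sin(2 * np.pi * f_stim * t)     # noise-free steady-state response
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    k = int(np.argmin(np.abs(freqs - f_stim)))
    return float(spec[k] / spec.sum())

# Exact integer number of periods: all response power lands in one bin
print(round(stim_bin_fraction(10), 3))    # → 1.0
# Half a period extra: energy leaks across neighbouring bins
print(stim_bin_fraction(10.5) < 0.9)      # → True
```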

  15. The Monitoring Erosion of Agricultural Land and spatial database of erosion events

    NASA Astrophysics Data System (ADS)

    Kapicka, Jiri; Zizala, Daniel

    2013-04-01

    In 2011, the Monitoring of Erosion of Agricultural Land programme was established in the Czech Republic as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of information about erosion events on agricultural land and to evaluate them. The main idea is the creation of a spatial database that will serve as a source of data and information for evaluating and modeling the erosion process, for proposing preventive measures, and for measures to reduce the negative impacts of erosion events. The subject of monitoring is the manifestations of water erosion, wind erosion, and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as a tool for keeping and browsing information about monitored events. SLO employees carry out the record keeping. RISWC is the specialist institute in the project: it maintains the spatial database, runs the website, manages the record keeping of events, analyzes the causes of events, and performs statistical evaluations of the recorded events and proposed measures. Records are inserted into the database through the user interface of the website, which includes a map server. The website is built on the PostgreSQL database technology with the PostGIS extension and UMN MapServer. Each record in the database is spatially localized by a drawing and contains descriptive information about the character of the event (date, description of the situation, etc.); information about land cover and the crops grown is also recorded. Part of the database is photodocumentation taken during field reconnaissance, which is performed within two days of notification of an event. Another part of the database is precipitation information from accessible precipitation gauges. The website allows simple spatial analyses such as area calculation, slope calculation, percentage representation of GAEC, etc. 
The database structure was designed on the basis of an analysis of the inputs needed by mathematical models. Mathematical models are used for detailed analysis of chosen erosion events, including soil analysis. By the end of 2012, the database contained 135 events. The content of the database continues to grow, giving rise to an extensive source of data usable for testing mathematical models.

  16. In Pursuit of Reciprocity: Researchers, Teachers, and School Reformers Engaged in Collaborative Analysis of Video Records

    ERIC Educational Resources Information Center

    Curry, Marnie W.

    2012-01-01

    In the ideal, reciprocity in qualitative inquiry occurs when there is give-and-take between researchers and the researched; however, the demands of the academy and resource constraints often make the pursuit of reciprocity difficult. Drawing on two video-based, qualitative studies in which researchers utilized video records as resources to enhance…

  17. Holocene Changes in the Distribution and Abundance of Oaks in California

    Treesearch

    Roger Byrne; Eric Edlund; Scott Mensing

    1991-01-01

    Our knowledge of the long-term history of oaks is primarily based on biogeographical analysis of anomalous distribution patterns and paleobotanical macrofossil evidence. Neither of these provide a continuous record of change. In this paper, we present fossil pollen evidence which records significant changes in oak abundance over the last 10,000 years. Between 10,000-5,...

  18. Available number of multiplexed holograms based on signal-to-noise ratio analysis in reflection-type holographic memory using three-dimensional speckle-shift multiplexing.

    PubMed

    Nishizaki, Tatsuya; Matoba, Osamu; Nitta, Kouichi

    2014-09-01

    The recording properties of three-dimensional speckle-shift multiplexing in reflection-type holographic memory are analyzed numerically. Three-dimensional recording can increase the number of multiplexed holograms by suppressing the cross-talk noise from adjacent holograms through depth-direction multiplexing rather than in-plane multiplexing. Numerical results indicate that the number of multiplexed holograms in three-layer recording can be 1.44 times that of single-layer recording when the acceptable signal-to-noise ratio is set to 2, NA=0.43, and the thickness of the recording medium is 0.5 mm.

  19. Analysis of the micro-fracture vibration process of a rock landslide based on records of the Three Gorges Telemetric Seismic Network

    NASA Astrophysics Data System (ADS)

    WANG, Q.

    2017-12-01

    We used the finite-element analysis software GeoStudio to establish a vibration analysis model of the Qianjiangping landslide, located in the Three Gorges Reservoir area. In the QUAKE/W module, we chose appropriate values of the dynamic elastic modulus and Poisson's ratio for the soil layer and rock stratum. For loading, we selected waveform data recorded by the Three Gorges Telemetric Seismic Network as the input ground motion, including five rupture events recorded at Lujiashan seismic station. In the dynamic simulation, we focused mainly on the sliding process as the earthquake records were applied. The simulation results show that the Qianjiangping landslide was not only affected by its own static forces but also experienced a dynamic process of micro fracture-creep-slip rupture-creep-slip. This provides a new approach to assessing the feasibility of early warning for rock landslides in future research.

  20. Ceramic-based microelectrode arrays: recording surface characteristics and topographical analysis

    PubMed Central

    Talauliker, Pooja M.; Price, David A.; Burmeister, Jason J.; Nagari, Silpa; Quintero, Jorge E.; Pomerleau, Francois; Huettl, Peter; Hastings, J. Todd; Gerhardt, Greg A.

    2011-01-01

    Amperometric measurements using microelectrode arrays (MEAs) provide spatially and temporally resolved measures of neuromolecules in the central nervous system of rats, mice and non-human primates. Multi-site MEAs can be mass fabricated on ceramic (Al2O3) substrate using photolithographic methods, imparting a high level of precision and reproducibility in a rigid but durable recording device. Although the functional capabilities of MEAs have been previously documented for both anesthetized and freely-moving paradigms, the performance enabling intrinsic physical properties of the MEA device have not heretofore been presented. In these studies, spectral analysis confirmed that the MEA recording sites were primarily composed of elemental platinum (Pt°). In keeping with the precision of the photolithographic process, scanning electron microscopy revealed that the Pt recording sites have unique microwell geometries post-fabrication. Atomic force microscopy demonstrated that the recording surfaces have nanoscale irregularities in the form of elevations and depressions, which contribute to increased current per unit area that exceeds previously reported microelectrode designs. The ceramic substrate on the back face of the MEA was characterized by low nanoscale texture and the ceramic sides consisted of an extended network of ridges and cavities. Thus, individual recording sites have a unique Pt° composition and surface profile that has not been previously observed for Pt-based microelectrodes. These features likely impact the physical chemistry of the device, which may influence adhesion of biological molecules and tissue as well as electrochemical recording performance post-implantation. This study is a necessary step towards understanding and extending the performance abilities of MEAs in vivo. PMID:21513736

  1. CARDfile data base representativeness, Phase 1 : general characteristics including populations, vehicles, roads, and fatal accidents

    DOT National Transportation Integrated Search

    1988-08-01

    This report details the results of an analysis performed to evaluate the : representativeness of the Crash Avoidance Research accident data base : (CARDfile). The accident records for 1983 and 1984 from six states (Indiana, : Maryland, Michigan, Penn...

  2. CARDfile Data Base Representatives - Phase I: General Characteristics including Populations, Vehicles, Roads, and Fatal Accidents

    DOT National Transportation Integrated Search

    1985-12-01

    This report details the results of an analysis performed to evaluate the representativeness of the Crash Avoidance Research accident data base (CARDfile). The accident records for 1983 and 1984 from six states (Indiana, Maryland, Michigan, Pennsylvan...

  3. Improved Overpressure Recording and Modeling for Near-Surface Explosion Forensics

    NASA Astrophysics Data System (ADS)

    Kim, K.; Schnurr, J.; Garces, M. A.; Rodgers, A. J.

    2017-12-01

    The accurate recording and analysis of air-blast acoustic waveforms is a key component of the forensic analysis of explosive events. Smartphone apps can enhance traditional technologies by providing scalable, cost-effective, ubiquitous sensor solutions for monitoring blasts, undeclared activities, and inaccessible facilities. During a series of near-surface chemical high-explosive tests, iPhone 6 devices running the RedVox infrasound recorder app were co-located with high-fidelity Hyperion overpressure sensors, allowing for direct comparison of the resolution and frequency content of the devices. Data from the traditional sensors are used to characterize blast signatures and to determine the relative amplitude and phase responses of the iPhone microphones. A Wiener-filter-based source deconvolution method is applied, using a parameterized source function estimated from traditional overpressure sensor data, to estimate the system responses. In addition, progress on a new parameterized air-blast model is presented. The model is based on the analysis of a large set of overpressure waveforms from several surface-explosion test series. An appropriate functional form, with parameters determined empirically from modern air-blast and acoustic data, will allow for better parameterization of signals and improved characterization of explosive sources.
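
    The Wiener-filter deconvolution step can be sketched as follows (a minimal sketch with synthetic data: the blast-like source pulse, the system taps, and the regularization constant are invented for illustration, not taken from the study): given a recording y and a parameterized source estimate x, the system response is estimated in the frequency domain as H = Y X* / (|X|^2 + λ).

```python
import numpy as np

rng = np.random.default_rng(5)
n, dt = 1024, 1e-3                       # 1 kHz sampling; values illustrative
t = np.arange(n) * dt

# Parameterized blast-like source estimate (decaying triangular pulse, assumed)
x = np.where(t < 0.1, (1 - t / 0.1) * np.exp(-t / 0.03), 0.0)

# "Recorded" overpressure: source through an unknown system response + noise
h_true = np.zeros(n)
h_true[0], h_true[5], h_true[12] = 1.0, 0.4, 0.15
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h_true)))
y += 0.001 * rng.standard_normal(n)

# Wiener deconvolution: H = Y X* / (|X|^2 + lambda)
X, Y = np.fft.fft(x), np.fft.fft(y)
lam = 1e-4 * np.max(np.abs(X)) ** 2      # regularization, set by hand
h_est = np.real(np.fft.ifft(Y * np.conj(X) / (np.abs(X) ** 2 + lam)))
print(int(np.argmax(np.abs(h_est))))     # leading tap of the response at lag 0
```

The regularization term λ suppresses noise amplification at frequencies where the source spectrum is weak, at the cost of slightly smoothing the recovered response.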

  4. [Analysis on regularity of prescriptions in "a guide to clinical practice with medical record" for diarrhoea based on traditional Chinese medicine inheritance support system].

    PubMed

    He, Lan-Juan; Zhu, Xiang-Dong

    2016-06-01

    To analyze the regularities of prescriptions in "a guide to clinical practice with medical record" (Ye Tianshi) for diarrhoea based on the traditional Chinese medicine inheritance support system (V2.5), and provide a reference for further research and development of new traditional Chinese medicines in treating diarrhoea. The traditional Chinese medicine inheritance support system was used to build a prescription database of Chinese medicines for diarrhoea. The software's integrated data mining methods were used to analyze the prescriptions according to "four natures", "five flavors" and "meridians" in the database, yielding frequency statistics, syndrome distribution, prescription regularities and new prescription analysis. An analysis of 94 prescriptions for diarrhoea was used to determine the frequencies of medicines in prescriptions and commonly used medicine pairs and combinations, and to derive 13 new prescriptions. This study indicated that the prescriptions for diarrhoea in "a guide to clinical practice with medical record" are mostly aimed at eliminating dampness and tonifying deficiency, with neutral drug property and sweet, bitter or hot flavor, reflecting the treatment principle of "activating spleen-energy and resolving dampness". Copyright © by the Chinese Pharmaceutical Association.
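    The frequency-statistics step (single-medicine frequencies and commonly used medicine pairs) can be sketched with a simple co-occurrence count. The prescriptions and medicine names below are invented placeholders, not data from the 94 prescriptions analysed in the study.

```python
from collections import Counter
from itertools import combinations

def medicine_frequencies(prescriptions):
    """Count single-medicine and medicine-pair frequencies across prescriptions."""
    singles, pairs = Counter(), Counter()
    for herbs in prescriptions:
        unique = sorted(set(herbs))          # ignore duplicates within one prescription
        singles.update(unique)
        pairs.update(combinations(unique, 2))
    return singles, pairs

# Hypothetical prescriptions (placeholders, not the study's data)
rx = [
    ["fuling", "baizhu", "gancao"],
    ["fuling", "baizhu", "chenpi"],
    ["fuling", "gancao", "chenpi"],
]
singles, pairs = medicine_frequencies(rx)
```

    Sorting each prescription before pairing makes pair keys order-independent, so ("baizhu", "fuling") and ("fuling", "baizhu") are counted as the same pair.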

  5. Acoustic Propagation Studies For Sperm Whale Phonation Analysis During LADC Experiments

    NASA Astrophysics Data System (ADS)

    Sidorovskaia, Natalia A.; Ioup, George E.; Ioup, Juliette W.; Caruthers, Jerald W.

    2004-11-01

    The Littoral Acoustic Demonstration Center (LADC) conducted a series of passive acoustic experiments in the Northern Gulf of Mexico and the Ligurian Sea in 2001 and 2002. Environmental and acoustic moorings were deployed in areas of large concentrations of marine mammals (mainly sperm whales). Recordings and analysis of whale phonations are among the objectives of the project. Each mooring had a single autonomously recording hydrophone (Environmental Acoustic Recording System (EARS)) obtained from the U.S. Naval Oceanographic Office after modification to record signals up to 5,859 Hz in the Gulf of Mexico and up to 12,500 Hz in the Ligurian Sea. Self-recording environmental sensors, attached to the moorings, and concurrent environmental ship surveys provided the environmental data for the experiments. The results of acoustic simulations of long-range propagation of the broad-band (500-6,000 Hz) phonation pulses from a hypothetical whale location to the recording hydrophone in the experimental environments are presented. The utilization of the simulation results for an interpretation of the spectral features observed in whale clicks and for the development of tracking algorithms from single-hydrophone recordings based on the identification of direct, surface-reflected, and bottom-reflected arrivals is discussed. [Research supported by ONR.]

  6. Techniques for extracting single-trial activity patterns from large-scale neural recordings

    PubMed Central

    Churchland, Mark M; Yu, Byron M; Sahani, Maneesh; Shenoy, Krishna V

    2008-01-01

    Large, chronically implanted arrays of microelectrodes are an increasingly common tool for recording from primate cortex, and can provide extracellular recordings from many (order of 100) neurons. While the desire for cortically based motor prostheses has helped drive their development, such arrays also offer great potential to advance basic neuroscience research. Here we discuss the utility of array recording for the study of neural dynamics. Neural activity often has dynamics beyond those driven directly by the stimulus. While governed by those dynamics, neural responses may nevertheless unfold differently for nominally identical trials, rendering many traditional analysis methods ineffective. We review recent studies, some employing simultaneous recording, some not, indicating that such variability is indeed present both during movement generation and during the preceding premotor computations. In such cases, large-scale simultaneous recordings have the potential to provide an unprecedented view of neural dynamics at the level of single trials. However, this enterprise will depend not only on techniques for simultaneous recording, but also on the use and further development of analysis techniques that can appropriately reduce the dimensionality of the data and allow visualization of single-trial neural behavior. PMID:18093826

  7. Explosion Source Similarity Analysis via SVD

    NASA Astrophysics Data System (ADS)

    Yedlin, Matthew; Ben Horin, Yochai; Margrave, Gary

    2016-04-01

    An important seismological ingredient for establishing a regional seismic nuclear discriminant is the similarity analysis of a sequence of explosion sources. To investigate source similarity, we are fortunate to have access to a sequence of 1805 three-component recordings of quarry blasts, shot from March 2002 to January 2015. The centroid of these blasts has an estimated location of 36.3E and 29.9N. All blasts were detonated by JPMC (Jordan Phosphate Mines Co.). All data were recorded at the Israeli NDC, HFRI, located at 30.03N and 35.03E. Data were first winnowed based on the distribution of maximum amplitudes in the neighborhood of the P-wave arrival. The winnowed data were then detrended using the algorithm of Cleveland et al. (1990). The detrended data were bandpass filtered between 0.1 and 12 Hz using an eighth-order Butterworth filter. Finally, data were sorted based on maximum trace amplitude. Two similarity analysis approaches were used. First, for each component, the entire suite of traces was decomposed into its eigenvector representation by employing singular value decomposition (SVD). The data were then reconstructed using 10 percent of the singular values, with the resulting enhancement of the S-wave and surface wave arrivals. The results of this first method are then compared to the second analysis method, based on the eigenface decomposition analysis of Turk and Pentland (1991). While both methods yield similar results in enhancement of data arrivals and reduction of data redundancy, more analysis is required to calibrate the recorded data to charge size, a quantity that was not available for the current study. References: Cleveland, R. B., Cleveland, W. S., McRae, J. E., and Terpenning, I., STL: A seasonal-trend decomposition procedure based on loess, Journal of Official Statistics, 6, No. 1, 3-73, 1990. Turk, M. and Pentland, A., Eigenfaces for recognition, Journal of Cognitive Neuroscience, 3(1), 71-86, 1991.
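    The SVD reconstruction step, keeping 10 percent of the singular values, can be sketched as follows. This is a minimal numpy illustration on synthetic traces, not the HFRI recordings.

```python
import numpy as np

def svd_reconstruct(traces, keep_fraction=0.10):
    """Low-rank reconstruction of a suite of traces, keeping only the
    strongest fraction of singular values (10 percent in the study)."""
    U, s, Vt = np.linalg.svd(traces, full_matrices=False)
    k = max(1, int(round(keep_fraction * len(s))))
    s_trunc = np.zeros_like(s)
    s_trunc[:k] = s[:k]               # discard the weakest components
    return (U * s_trunc) @ Vt

# Toy suite: 20 noisy copies of one waveform share a single dominant component
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 5 * t)
rng = np.random.default_rng(1)
suite = clean + 0.2 * rng.standard_normal((20, 200))
denoised = svd_reconstruct(suite)
```

    Because coherent arrivals concentrate in a few singular values while incoherent noise spreads across all of them, the truncated reconstruction enhances the common signal, which is the mechanism behind the S-wave and surface-wave enhancement reported above.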

  8. Design of microcontroller-based EMG and the analysis of EMG signals.

    PubMed

    Güler, Nihal Fatma; Hardalaç, Firat

    2002-04-01

    In this work, a microcontroller-based EMG device was designed and tested on 40 patients. While the patients were at rest, fast Fourier transform (FFT) analysis was applied to EMG signals recorded from the right-leg peroneal region. Histograms were constructed from the results of the FFT analysis. The analysis shows that the amplitude of the fibrillation potentials of the muscle fibers measured from the peroneal region of 30 patients is low and their duration is short, indicating degenerated motor nerves; the remaining 10 patients were found to be healthy.
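    The FFT analysis step can be sketched in a few lines of numpy. The signal below is an invented EMG-like trace (a 50 Hz component plus noise), not data from the study, and the sampling rate is an assumption.

```python
import numpy as np

def emg_spectrum(signal, fs):
    """Amplitude spectrum of an EMG trace via the FFT."""
    signal = signal - np.mean(signal)            # remove any DC offset
    amps = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs, amps

# Toy EMG-like trace: a 50 Hz component plus background noise, 1 s at 1 kHz
fs = 1000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(2)
sig = 0.5 * np.sin(2 * np.pi * 50 * t) + 0.05 * rng.standard_normal(t.size)
freqs, amps = emg_spectrum(sig, fs)
peak_hz = freqs[np.argmax(amps)]                 # dominant frequency
```

    Histograms of such amplitude spectra across patients would then give the amplitude and duration summaries the study reports.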

  9. Comparison of a spatio-temporal speleothem-based reconstruction of late Holocene climate variability to the timing of cultural developments

    NASA Astrophysics Data System (ADS)

    Deininger, Michael; Lippold, Jörg; Abele, Florian; McDermott, Frank

    2016-04-01

    Speleothems are considered a valuable continental climate archive. Their δ18O records provide information on past changes in the atmospheric circulation accompanied by changes in surface air temperature and precipitation. Over the last decades, European speleothem studies have assembled a European speleothem network (comprising numerous speleothem δ18O records) that now allows past climate variability to be pictured not only in time but also in space. In particular, these studies investigated the climate variability of the last 4.5 ka, which allows the speleothem-based palaeoclimate reconstruction to be compared with the timing of the rise and fall of ancient civilisations in this period, including the Dark Ages. Here we evaluate a compilation of 10 speleothem δ18O records covering the last 4.5 ka using a Monte Carlo based Principal Component Analysis (MC-PCA) that accounts for uncertainties in individual speleothem age models and for the different and varying temporal resolutions of each speleothem δ18O record. Our MC-PCA approach not only identifies temporally coherent changes in the δ18O records, i.e. the common signal across all investigated records, but also facilitates their depiction and evaluation in space. The speleothem δ18O records span almost the entire European continent, from its western margin to northern Turkey and from northern Italy to Norway. For the MC-PCA, the 4.5 ka are divided into eight 1 ka long time windows, each overlapping the subsequent window by 500 years, to allow a comparison of the spatio-temporal evolution of the common signal. For every time window we derive a common mode of climate variability of all speleothem δ18O records as well as its spatial extent. This allows us to compare the rise and fall of ancient civilisations, such as the Hittite and the Roman Empire, with our reconstructed spatio-temporal record.
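    The core of a Monte Carlo PCA of this kind can be sketched as below. This is a strong simplification of the study's method: age uncertainty is modeled as a single systematic offset per record per draw (real age-model ensembles are more structured), and the two sinusoidal "records" are invented, not speleothem data.

```python
import numpy as np

def mc_pca_pc1(ages, values, n_iter=50, age_sigma=50.0, seed=0):
    """Monte Carlo PCA sketch: perturb each record's age model, interpolate
    onto a common timeline, and average the leading principal component.

    ages, values : lists of 1-D arrays, one (age, value) pair per record
    age_sigma    : assumed 1-sigma age uncertainty in years, applied here as a
                   single systematic offset per record per draw (a simplification)
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(max(a[0] for a in ages), min(a[-1] for a in ages), 200)
    runs = []
    for _ in range(n_iter):
        cols = [np.interp(t, a + rng.normal(0.0, age_sigma), v)
                for a, v in zip(ages, values)]
        X = np.array(cols).T                 # rows = times, columns = records
        X = X - X.mean(axis=0)
        # PCA via eigendecomposition of the record-by-record covariance matrix
        w, V = np.linalg.eigh(np.cov(X, rowvar=False))
        pc1 = X @ V[:, -1]                   # leading component (largest eigenvalue)
        if pc1 @ X.sum(axis=1) < 0:          # resolve the PCA sign ambiguity
            pc1 = -pc1
        runs.append(pc1)
    return t, np.mean(runs, axis=0)

# Toy network: two records sampling the same sinusoidal "climate" signal
ages = [np.linspace(0, 4500, 90), np.linspace(0, 4500, 120)]
values = [np.sin(2 * np.pi * a / 2000.0) for a in ages]
t_common, pc1 = mc_pca_pc1(ages, values)
```

    Averaging the leading component over many perturbed age models is what makes the recovered common signal robust to dating uncertainty.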

  10. Merging dietary assessment with the adolescent lifestyle.

    PubMed

    Schap, T E; Zhu, F; Delp, E J; Boushey, C J

    2014-01-01

    The use of image-based dietary assessment methods shows promise for improving dietary self-report among children. The Technology Assisted Dietary Assessment (TADA) food record application is a self-administered food record specifically designed to address the burden and human error associated with conventional methods of dietary assessment. Users would take images of foods and beverages at all eating occasions using a mobile telephone or mobile device with an integrated camera [e.g. Apple iPhone, Apple iPod Touch (Apple Inc., Cupertino, CA, USA); Nexus One (Google, Mountain View, CA, USA)]. Once the images are taken, the images are transferred to a back-end server for automated analysis. The first step in this process is image analysis (i.e. segmentation, feature extraction and classification), which allows for automated food identification. Portion size estimation is also automated via segmentation and geometric shape template modeling. The results of the automated food identification and volume estimation can be indexed with the Food and Nutrient Database for Dietary Studies to provide a detailed diet analysis for use in epidemiological or intervention studies. Data collected during controlled feeding studies in a camp-like setting have allowed for formative evaluation and validation of the TADA food record application. This review summarises the system design and the evidence-based development of image-based methods for dietary assessment among children. © 2013 The Authors Journal of Human Nutrition and Dietetics © 2013 The British Dietetic Association Ltd.

  11. Automated processing of the single-lead electrocardiogram for the detection of obstructive sleep apnoea.

    PubMed

    de Chazal, Philip; Heneghan, Conor; Sheridan, Elaine; Reilly, Richard; Nolan, Philip; O'Malley, Mark

    2003-06-01

    A method for the automatic processing of the electrocardiogram (ECG) for the detection of obstructive apnoea is presented. The method screens nighttime single-lead ECG recordings for the presence of major sleep apnoea and provides a minute-by-minute analysis of disordered breathing. A large independently validated database of 70 ECG recordings acquired from normal subjects and subjects with obstructive and mixed sleep apnoea, each of approximately eight hours in duration, was used throughout the study. Thirty-five of these recordings were used for training and 35 retained for independent testing. A wide variety of features based on heartbeat intervals and an ECG-derived respiratory signal were considered. Classifiers based on linear and quadratic discriminants were compared. Feature selection and regularization of classifier parameters were used to optimize classifier performance. Results show that the normal recordings could be separated from the apnoea recordings with a 100% success rate, and that a minute-by-minute classification accuracy of over 90% is achievable.

  12. Two Methods of Automatic Evaluation of Speech Signal Enhancement Recorded in the Open-Air MRI Environment

    NASA Astrophysics Data System (ADS)

    Přibil, Jiří; Přibilová, Anna; Frollo, Ivan

    2017-12-01

    The paper focuses on two methods for evaluating the success of speech-signal enhancement for speech recorded in an open-air magnetic resonance imager during phonation for 3D human vocal tract modeling. The first approach provides a comparison based on statistical analysis with ANOVA and hypothesis tests. The second method is based on classification by Gaussian mixture models (GMM). The experiments confirmed that the proposed ANOVA and GMM classifiers for automatic evaluation of speech quality are functional and produce results fully comparable with the standard evaluation based on the listening-test method.
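    The GMM-based classification idea can be sketched with a minimal likelihood classifier. For brevity this sketch fits a single Gaussian per class (a one-component GMM) to an invented 1-D quality feature; the paper's models and features are richer than this.

```python
import math

def fit_gaussian(samples):
    """Fit a single Gaussian (a one-component GMM) to 1-D feature samples."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

def log_likelihood(x, mu, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def classify(x, models):
    """Assign x to the class whose fitted Gaussian scores it most likely."""
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

# Invented 1-D quality features for enhanced vs. unenhanced recordings
models = {
    "enhanced": fit_gaussian([0.9, 1.1, 1.0, 0.8, 1.2]),
    "noisy":    fit_gaussian([2.9, 3.2, 3.0, 2.8, 3.1]),
}
label = classify(1.05, models)
```

    A full GMM would mix several such Gaussians per class and fit them with expectation-maximization; the decision rule (pick the class with the highest likelihood) is the same.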

  13. Analyses Reveal Record-Shattering Global Warm Temperatures in 2015

    NASA Image and Video Library

    2016-01-20

    2015 was the warmest year since modern record-keeping began in 1880, according to a new analysis by NASA’s Goddard Institute for Space Studies. The record-breaking year continues a long-term warming trend — 15 of the 16 warmest years on record have now occurred since 2001. Credits: Scientific Visualization Studio/Goddard Space Flight Center Details: Earth’s 2015 surface temperatures were the warmest since modern record keeping began in 1880, according to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA). Globally-averaged temperatures in 2015 shattered the previous mark set in 2014 by 0.23 degrees Fahrenheit (0.13 Celsius). Only once before, in 1998, has the new record been greater than the old record by this much. The 2015 temperatures continue a long-term warming trend, according to analyses by scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York (GISTEMP). NOAA scientists agreed with the finding that 2015 was the warmest year on record based on separate, independent analyses of the data. Because weather station locations and measurements change over time, there is some uncertainty in the individual values in the GISTEMP index. Taking this into account, NASA analysis estimates 2015 was the warmest year with 94 percent certainty.

  14. Reading the medical record. I. Analysis of physicians' ways of reading the medical record.

    PubMed

    Nygren, E; Henriksson, P

    1992-01-01

    Physicians were interviewed about their routines in everyday use of the medical record. From the interviews, we conclude that the medical record is a well-functioning working instrument for the experienced physician. Using the medical record as a basis for decision making involves interpretation of the format, layout and other textual features of the typewritten data. Interpretation of these features provides effective guidance in the process of searching, reading and assessing the relevance of different items of information in the record. This skill seems to be an integral part of diagnostic expertise, and it plays an important role in decision making based on the large amount of information about a patient that the medical record exhibits to the reader. This finding has implications for the design of user interfaces for reading computerized medical records.

  15. Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?

    PubMed

    Strathie, Ailsa; Walker, Guy H

    2016-03-01

    A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
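    The four link-analysis metrics named above (number of links, network density, diameter, and sociometric status) can be sketched on a small undirected link set. This is an illustrative stand-in, not the study's data: the driver-action names are invented, and "sociometric status" is computed here as degree normalized by n-1, one common definition among several.

```python
from collections import deque

def link_metrics(links):
    """Number of links, density, diameter, and a simple sociometric status
    for an undirected, connected network given as (a, b) link pairs."""
    nodes = sorted({a for a, b in links} | {b for a, b in links})
    n = len(nodes)
    adj = {v: set() for v in nodes}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)

    def bfs_depths(src):
        # Shortest-path lengths from src by breadth-first search
        depth = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in depth:
                    depth[w] = depth[u] + 1
                    q.append(w)
        return depth

    diameter = max(max(bfs_depths(v).values()) for v in nodes)
    density = 2 * len(links) / (n * (n - 1))       # fraction of possible links
    status = {v: len(adj[v]) / (n - 1) for v in nodes}
    return {"links": len(links), "density": density,
            "diameter": diameter, "status": status}

# Toy journey: transitions between hypothetical driver actions
m = link_metrics([("power", "brake"), ("brake", "horn"),
                  ("power", "horn"), ("horn", "AWS_ack")])
```

    Comparing these metrics across journeys or drivers is what would surface the behavioral differences the study uses as leading indicators.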

  16. Java Programs for Using Newmark's Method and Simplified Decoupled Analysis to Model Slope Performance During Earthquakes

    USGS Publications Warehouse

    Jibson, Randall W.; Jibson, Matthew W.

    2003-01-01

    Landslides typically cause a large proportion of earthquake damage, and the ability to predict slope performance during earthquakes is important for many types of seismic-hazard analysis and for the design of engineered slopes. Newmark's method for modeling a landslide as a rigid-plastic block sliding on an inclined plane provides a useful method for predicting approximate landslide displacements. Newmark's method estimates the displacement of a potential landslide block as it is subjected to earthquake shaking from a specific strong-motion record (earthquake acceleration-time history). A modification of Newmark's method, decoupled analysis, allows modeling landslides that are not assumed to be rigid blocks. This open-file report is available on CD-ROM and contains Java programs intended to facilitate performing both rigorous and simplified Newmark sliding-block analysis and a simplified model of decoupled analysis. For rigorous analysis, 2160 strong-motion records from 29 earthquakes are included along with a search interface for selecting records based on a wide variety of record properties. Utilities are available that allow users to add their own records to the program and use them for conducting Newmark analyses. Also included is a document containing detailed information about how to use Newmark's method to model dynamic slope performance. This program will run on any platform that supports the Java Runtime Environment (JRE) version 1.3, including Windows, Mac OSX, Linux, Solaris, etc. A minimum of 64 MB of available RAM is needed, and the fully installed program requires 400 MB of disk space.
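    The rigid-block Newmark analysis that the report's Java programs implement can be sketched as a simple time-stepping integration. This is a minimal one-directional (downslope-only) sketch with an invented input pulse, not one of the report's 2160 strong-motion records, and it omits the decoupled-analysis refinements.

```python
import numpy as np

def newmark_displacement(accel_g, dt, ac_g):
    """Rigid-block Newmark analysis: integrate the part of the ground
    acceleration exceeding the critical acceleration into permanent slip.

    accel_g : ground acceleration time history, in g
    dt      : time step, seconds
    ac_g    : critical (yield) acceleration of the block, in g
    Returns downslope displacement in meters.
    """
    g = 9.80665
    v = 0.0      # relative (sliding) velocity, m/s
    d = 0.0      # accumulated displacement, m
    for a in accel_g:
        if v > 0.0 or a > ac_g:
            v += (a - ac_g) * g * dt      # block accelerates relative to slope
            v = max(v, 0.0)               # sliding stops; it never reverses
            d += v * dt
    return d

# Toy record: a one-second 0.3 g pulse against a 0.1 g critical acceleration
dt = 0.01
t = np.arange(0, 2, dt)
accel = 0.3 * np.sin(2 * np.pi * 1.0 * t) * (t < 1.0)
slip = newmark_displacement(accel, dt, ac_g=0.1)
```

    Displacement accumulates only while the shaking exceeds the critical acceleration, which is why raising the critical acceleration (a stronger slope) sharply reduces the predicted slip.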

  17. Identifying past fire regimes throughout the Holocene in Ireland using new and established methods of charcoal analysis

    NASA Astrophysics Data System (ADS)

    Hawthorne, Donna; Mitchell, Fraser J. G.

    2016-04-01

    Globally, in recent years there has been an increase in the scale, intensity and level of destruction caused by wildfires. This can be seen in Ireland, where significant changes in vegetation, land use, agriculture and policy have promoted an increase in fires in the Irish landscape. This study looks at wildfire throughout the Holocene and draws on lacustrine charcoal records from seven study sites spread across Ireland to reconstruct the past fire regimes recorded at each site. This work utilises new and accepted methods of fire history reconstruction to provide a recommended analytical procedure for statistical charcoal analysis. Digital charcoal counting was used, and fire regime reconstructions were carried out via the CharAnalysis programme. To verify this record, new techniques are employed: an Ensemble-Member strategy to remove the subjectivity associated with parameter selection, a Signal-to-Noise Index to determine whether the charcoal record is appropriate for peak detection, and a charcoal-peak screening procedure to validate the identified fire events based on bootstrapped samples. This analysis represents the first study of its kind in Ireland, examining the past record of fire on a multi-site and paleoecological timescale, and will provide a baseline level of data which can be built on in the future, when the frequency and intensity of fire are predicted to increase.

  18. Use of streamflow data to estimate base flow/ground-water recharge for Wisconsin

    USGS Publications Warehouse

    Gebert, W.A.; Radloff, M.J.; Considine, E.J.; Kennedy, J.L.

    2007-01-01

    The average annual base flow/recharge was determined for streamflow-gaging stations throughout Wisconsin by base-flow separation. A map of the State was prepared that shows the average annual base flow for the period 1970-99 for watersheds at 118 gaging stations. Trend analysis was performed on 22 of the 118 streamflow-gaging stations that had long-term records and unregulated flow and that provided areal coverage of the State. The analysis found that a statistically significant increasing trend was occurring for watersheds where the primary land use was agriculture. Most gaging stations where the land cover was forest had no significant trend. A method to estimate the average annual base flow at ungaged sites was developed by multiple-regression analysis using basin characteristics. The equation with the lowest standard error of estimate, 9.5%, has drainage area, soil infiltration and base-flow factor as independent variables. To determine the average annual base flow for smaller watersheds, estimates were made at low-flow partial-record stations in 3 of the 12 major river basins in Wisconsin. Regression equations were developed for each of the three major river basins using basin characteristics. Drainage area, soil infiltration, basin storage and base-flow factor were the independent variables in the regression equations with the lowest standard error of estimate. The standard error of estimate ranged from 17% to 52% for the three river basins. © 2007 American Water Resources Association.
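    Base-flow separation splits a streamflow hydrograph into quickflow (storm runoff) and base flow. The sketch below uses the Lyne-Hollick one-parameter recursive digital filter, a common stand-in; the report itself uses USGS separation methods, and the hydrograph values here are invented.

```python
def baseflow_separation(q, alpha=0.925):
    """Lyne-Hollick one-parameter digital filter for base-flow separation.

    q     : daily mean streamflow values
    alpha : filter parameter (0.925 is a commonly used default)
    Returns the base-flow series (same length as q).
    """
    quick = [0.0]                       # filtered quickflow component
    for i in range(1, len(q)):
        f = alpha * quick[-1] + 0.5 * (1 + alpha) * (q[i] - q[i - 1])
        quick.append(max(f, 0.0))       # quickflow cannot be negative
    # Base flow is the remainder, constrained between zero and total flow
    return [min(max(qi - f, 0.0), qi) for qi, f in zip(q, quick)]

# Toy hydrograph: steady base flow with one storm peak
q = [10.0, 10.0, 10.0, 10.0, 10.0, 40.0, 80.0, 50.0,
     30.0, 20.0, 14.0, 11.0, 10.0, 10.0]
base = baseflow_separation(q)
bfi = sum(base) / sum(q)                # base-flow index
```

    Averaging the separated base flow over many years of record gives the average annual base flow/recharge values mapped in the report.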

  19. [Web-based electronic patient record as an instrument for quality assurance within an integrated care concept].

    PubMed

    Händel, A; Jünemann, A G M; Prokosch, H-U; Beyer, A; Ganslandt, T; Grolik, R; Klein, A; Mrosek, A; Michelson, G; Kruse, F E

    2009-03-01

    A prerequisite for integrated care programmes is the implementation of a communication network meeting quality-assurance standards. Against this background, the main objective of the integrated care project between the University Eye Hospital Erlangen and the health insurance company AOK Bayern was to evaluate the potential and the acceptance of a web-based electronic patient record in the context of cataract and retinal surgery. Standardised modules for capturing pre-, intra- and post-operative data on the basis of clinical pathway guidelines for cataract and retinal surgery have been developed. There are 6 data sets recorded per patient (1 pre-operative, 1 operative, 4-6 post-operative). For data collection, a web-based communication system (Soarian Integrated Care) was chosen which meets high requirements in data security and is easy to handle. This teleconsultation system and the embedded electronic patient record are independent of the software used by the respective offices and hospitals. Data transmission and storage were carried out in real time. At present, 101 private ophthalmologists are taking part in the IGV contract with the University Eye Hospital Erlangen, corresponding to 52% of all private ophthalmologists in the region. During the period from January 1st 2006 to December 31st 2006, 1844 patients were entered. Complete documentation was achieved in 1390 (75%) of all surgical procedures. For evaluation of these data, a multidimensional report and analysis tool (Cognos) was used. The deviation from target refraction, one quality indicator, was on average 0.09 diopter. The web-based patient record used in this project was highly accepted by the private ophthalmologists, although there are still general concerns about exchanging medical data via the internet. Nevertheless, the web-based patient record is an essential tool for functional integration between ambulatory and inpatient health-care units. In addition to the telemedicine functions of the system, we exported the data to a data warehouse system in order to provide a flexible and powerful tool for quality-assurance analysis and reporting.

  20. Spatial-temporal discriminant analysis for ERP-based brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2013-03-01

    Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERP) in brain-computer interfaces (BCI). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may reduce the system's practicability and cause users' resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of the LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes by finding two projection matrices from the spatial and temporal dimensions collaboratively, which effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated with dataset II of BCI Competition III and with a dataset recorded in our own experiments, and compared to state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance with few training samples shows that the STDA effectively reduces the system calibration time and improves the classification accuracy, thereby enhancing the practicability of ERP-based BCI.
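    The baseline LDA classifier that STDA extends can be sketched as a shrinkage-regularized Fisher discriminant. This is a numpy-only sketch on invented Gaussian features, not the BCI Competition data, and it does not include STDA's spatial-temporal projection matrices.

```python
import numpy as np

def train_lda(X0, X1, reg=1e-3):
    """Fisher linear discriminant with shrinkage regularization.

    X0, X1 : (trials x features) arrays for nontarget / target classes
    reg    : shrinkage added to the pooled covariance; this kind of
             regularization is what matters when training trials are few
    """
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Xc = np.vstack([X0 - mu0, X1 - mu1])
    cov = Xc.T @ Xc / len(Xc) + reg * np.eye(X0.shape[1])
    w = np.linalg.solve(cov, mu1 - mu0)      # discriminant direction
    b = -0.5 * w @ (mu0 + mu1)               # threshold midway between class means
    return w, b

def predict(X, w, b):
    return (X @ w + b > 0).astype(int)       # 1 = target class

# Toy two-class features: targets have a shifted mean in every dimension
rng = np.random.default_rng(3)
X0 = rng.standard_normal((100, 8))
X1 = rng.standard_normal((100, 8)) + 1.0
w, b = train_lda(X0, X1)
acc = 0.5 * (predict(X0, w, b) == 0).mean() + 0.5 * (predict(X1, w, b) == 1).mean()
```

    STDA's contribution is to replace the single flattened feature vector with separate spatial and temporal projections, shrinking the number of parameters this covariance estimate must support.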

  1. A functional video-based anthropometric measuring system

    NASA Technical Reports Server (NTRS)

    Nixon, J. H.; Cater, J. P.

    1982-01-01

    A high-speed anthropometric three-dimensional measurement system using the Selcom Selspot motion tracking instrument for visual data acquisition is discussed. A three-dimensional scanning system was created which collects video, audio, and performance data on a single standard video cassette recorder. Recording rates of 1 megabit per second for periods of up to two hours are possible with the system design. A high-speed, off-the-shelf motion analysis system was used to collect the optical information. The video recording adapter (VRA) is interfaced to the Selspot data acquisition system.

  2. DeepIED: An epileptic discharge detector for EEG-fMRI based on deep learning.

    PubMed

    Hao, Yongfu; Khoo, Hui Ming; von Ellenrieder, Nicolas; Zazubovits, Natalja; Gotman, Jean

    2018-01-01

    Presurgical evaluation that can precisely delineate the epileptogenic zone (EZ) is an important step toward successful surgical resection treatment of refractory epilepsy patients. The noninvasive EEG-fMRI recording technique combined with general linear model (GLM) analysis is considered an important tool for estimating the EZ. However, the manual marking of interictal epileptic discharges (IEDs) needed in this analysis is challenging and time-consuming because the quality of the EEG recorded inside the scanner is greatly deteriorated compared to the usual EEG obtained outside the scanner. This is one of the main impediments to the widespread use of EEG-fMRI in epilepsy. We propose a deep-learning-based semi-automatic IED detector that finds candidate IEDs in the EEG recorded inside the scanner that resemble sample IEDs marked in the EEG recorded outside the scanner. The manual marking burden is greatly reduced, as the expert need only edit the candidate IEDs. The model is trained on data from 30 patients. Validation of IED detection accuracy on another 37 consecutive patients shows our method can improve the median sensitivity from 50.0% for the previously proposed template-based method to 84.2%, at a false positive rate of 5 events/min. Reproducibility validation on 15 patients evaluates whether our method produces hemodynamic response maps similar to those obtained from the manually marked ground truth. We explore the concordance between the maximum hemodynamic response and the intracerebral-EEG-defined EZ and find that both methods produce a similar percentage of concordance (76.9%, 10 out of 13 patients; the electrode was absent from the maximum hemodynamic response in two patients). This tool will make EEG-fMRI analysis more practical for clinical usage.

  3. Use of lecture recordings in dental education: assessment of status quo and recommendations.

    PubMed

    Horvath, Zsuzsa; O'Donnell, Jean A; Johnson, Lynn A; Karimbux, Nadeem Y; Shuler, Charles F; Spallek, Heiko

    2013-11-01

    This research project was part of a planned initiative at the University of Pittsburgh School of Dental Medicine to incorporate lecture recordings as standard educational support technologies. The goals of an institutional survey were 1) to gather current data about how dental educators across the United States and Canada use lecture recordings; 2) to determine dental educators' perceived value and outcomes of using lecture recordings; and 3) to develop recommendations based on #1 and #2 for the dental education community. Of the sixty-six North American dental schools at the time of the study, forty-five schools responded to the survey, for a 68 percent response rate. Of the respondents, twenty-eight schools were found to currently conduct lecture recording; these comprised the study sample. This study focused on the dental schools' past experiences with lecture recording; thus, those not currently engaged in lecture recording were excluded from further analysis. The survey questions covered a wide range of topics, such as the scope of the lecture recording, logistics, instructional design considerations, outcomes related to student learning, evaluation and reception, barriers to lecture recording, and issues related to copyright and intellectual property. The literature review and results from the survey showed that no common guidelines for best practice were available regarding lecture recordings in dental education. The article concludes with some preliminary recommendations based on this study.

  4. Prediction of Recidivism in Juvenile Offenders Based on Discriminant Analysis.

    ERIC Educational Resources Information Center

    Proefrock, David W.

    The recent development of strong statistical techniques has made accurate predictions of recidivism possible. To investigate the utility of discriminant analysis methodology in making predictions of recidivism in juvenile offenders, the court records of 271 male and female juvenile offenders, aged 12-16, were reviewed. A cross validation group…

  5. The Language of Negotiation Strategy.

    ERIC Educational Resources Information Center

    Lampi, Mirjaliisa

    Two Finnish courses in business English were designed to teach students language that will enable them to negotiate effectively in English. The instructional approach is based on an analysis of recorded negotiations carried out in British subsidiaries of Finnish companies. The analysis identified a group of linguistic variables through which, it…

  6. Space station systems analysis study. Part 3: Documentation. Volume 5: Cost and schedule data

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Cost estimates for the space station systems analysis were recorded. Space construction base costs and characteristics were cited, as well as mission hardware costs and characteristics. Also delineated were cost ground rules, the program schedule, and a detailed cost estimate and funding distribution.

  7. Data warehousing as a basis for web-based documentation of data mining and analysis.

    PubMed

    Karlsson, J; Eklund, P; Hallgren, C G; Sjödin, J G

    1999-01-01

    In this paper we present a case study of data warehousing intended to support data mining and analysis. We also describe a prototype for data retrieval. Further, we discuss some technical issues related to a particular choice of patient record environment.

  8. Daily home-based spirometry during withdrawal of inhaled corticosteroid in severe to very severe chronic obstructive pulmonary disease

    PubMed Central

    Rodriguez-Roisin, Roberto; Tetzlaff, Kay; Watz, Henrik; Wouters, Emiel FM; Disse, Bernd; Finnigan, Helen; Magnussen, Helgo; Calverley, Peter MA

    2016-01-01

    The WISDOM study (NCT00975195) reported a change in lung function following withdrawal of fluticasone propionate in patients with severe to very severe COPD treated with tiotropium and salmeterol. However, little is known about the validity of home-based spirometry measurements of lung function in COPD. Therefore, as part of this study, following suitable training, patients recorded daily home-based spirometry measurements in addition to undergoing periodic in-clinic spirometric testing throughout the study duration. We subsequently determined the validity of home-based spirometry for detecting changes in lung function by comparing in-clinic and home-based forced expiratory volume in 1 second in patients who underwent stepwise fluticasone propionate withdrawal over 12 weeks versus patients remaining on fluticasone propionate for 52 weeks. Bland–Altman analysis of these data confirmed good agreement between in-clinic and home-based measurements, both across all visits and at the individual visits at study weeks 6, 12, 18, and 52. There was a measurable difference between the forced expiratory volume in 1 second values recorded at home and in the clinic (mean difference of −0.05 L), which may be due to suboptimal patient effort in performing unsupervised recordings. However, this difference remained consistent over time. Overall, these data demonstrate that home-based and in-clinic spirometric measurements were equally valid and reliable for assessing lung function in patients with COPD, and suggest that home-based spirometry may be a useful tool to facilitate analysis of changes in lung function on a day-to-day basis. PMID:27578972
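    The agreement analysis described above can be sketched as follows: Bland–Altman analysis computes the mean difference (bias) and 95% limits of agreement between paired measurements from two methods. The FEV1 values below are illustrative, not study data:

```python
import numpy as np

def bland_altman(clinic, home):
    """Mean difference (bias) and 95% limits of agreement between two
    measurement methods (here, in-clinic vs home FEV1 in litres)."""
    clinic, home = np.asarray(clinic, float), np.asarray(home, float)
    diff = home - clinic
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# illustrative paired FEV1 values (litres), not study data
clinic = [1.10, 0.95, 1.32, 1.05, 0.88, 1.20]
home   = [1.04, 0.92, 1.25, 1.01, 0.85, 1.14]
bias, (lo, hi) = bland_altman(clinic, home)
print(round(bias, 3))   # negative bias: home readings slightly lower
```

A consistently negative bias that stays within the limits of agreement over time is what the study reports for unsupervised home recordings.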

  9. The Impact of Video Review on Supervisory Conferencing

    ERIC Educational Resources Information Center

    Baecher, Laura; McCormack, Bede

    2015-01-01

    This study investigated how video-based observation may alter the nature of post-observation talk between supervisors and teacher candidates. Audio-recorded post-observation conversations were coded using a conversation analysis framework and interpreted through the lens of interactional sociology. Findings suggest that video-based observations…

  10. Filtration of human EEG recordings from physiological artifacts with empirical mode method

    NASA Astrophysics Data System (ADS)

    Grubov, Vadim V.; Runnova, Anastasiya E.; Khramova, Marina V.

    2017-03-01

    In this paper we propose a new method for dealing with noise and physiological artifacts in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (the Hilbert-Huang transform). We treat noise and physiological artifacts in the EEG as specific oscillatory patterns that cause problems during EEG analysis and can be detected with the help of additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The algorithm comprises the following steps: empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of those modes, and reconstruction of the EEG signal from the remaining modes. We test the method by filtering eye-movement artifacts from experimental human EEG signals and show its high efficiency.
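    The selection-and-reconstruction steps can be sketched as below, assuming the IMFs have already been computed by an EMD implementation (e.g. a package such as PyEMD), and assuming — as one possible criterion, not necessarily the paper's — that artifact modes are flagged by their correlation with a simultaneously recorded reference channel such as the EOG. The two synthetic "modes" are illustrative:

```python
import numpy as np

def remove_artifact_modes(imfs, reference, thresh=0.5):
    """Given the IMFs of an EEG channel (assumed precomputed by an EMD
    package) and a simultaneously recorded reference artifact signal
    (EOG/ECG/EMG), drop the modes strongly correlated with the
    reference and reconstruct the EEG from the remaining modes."""
    keep = []
    for imf in imfs:
        r = np.corrcoef(imf, reference)[0, 1]
        if abs(r) < thresh:          # mode not dominated by the artifact
            keep.append(imf)
    return np.sum(keep, axis=0)

# toy demonstration with two synthetic "modes"
t = np.linspace(0, 1, 500)
brain = np.sin(2 * np.pi * 10 * t)        # 10 Hz EEG-like mode
eog   = np.sin(2 * np.pi * 1 * t)         # slow eye-movement artifact
cleaned = remove_artifact_modes([brain, eog], reference=eog)
print(np.allclose(cleaned, brain))  # True: artifact mode removed
```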

  11. Contributions of the NOAA Hollings Undergraduate Scholarship Program to the Geosciences Pipeline

    NASA Astrophysics Data System (ADS)

    Kaplan, M.

    2016-12-01

    Since 2005, the NOAA Ernest F. Hollings Undergraduate Scholarship Program has provided tuition support and paid summer internship opportunities at NOAA to exceptional students majoring in the geosciences. The purpose of the scholarship program is to train students in NOAA mission fields. Multiple methods were used to track the career trajectories of Hollings alumni, including mining LinkedIn data, conducting an impact analysis based on a professionally developed web-based evaluation survey, and maintaining a web-based alumni update system. At least one postgraduate record was found for 80% of Hollings Scholarship alumni. Of the alumni reached, more than 75% continued on to graduate school in a NOAA mission field, and 86% of those graduate degrees were in a NOAA mission field or other STEM field. More than 60% of alumni had at least one professional record, with the most alumni working in private industry, followed by nongovernmental organizations and federal, state, and local government.

  12. Video coding for next-generation surveillance systems

    NASA Astrophysics Data System (ADS)

    Klasen, Lena M.; Fahlander, Olov

    1997-02-01

    Video is used as the recording medium in surveillance systems and, increasingly, by the Swedish Police Force. Methods for analyzing video using an image processing system have recently been introduced at the Swedish National Laboratory of Forensic Science, and new methods are the focus of a research project at the Image Coding Group of Linkoping University. The accuracy of these forensic investigations often depends on the quality of the video recordings, and one of the major problems when analyzing videos from crime scenes is the poor quality of the recordings. Enhancing poor image quality might add manipulative or subjective effects and does not seem to be the right way to obtain reliable analysis results. The surveillance systems in use today are mainly based on video techniques, VHS or S-VHS, and the weakest link is the video cassette recorder (VCR). Multiplexers that select one of many camera outputs for recording are another problem, as they often filter the video signal, and recording is limited to only one of the available cameras connected to the VCR. A way around the problem of poor recording is to record all camera outputs digitally and simultaneously. It is also very important to build such a system bearing in mind that image processing analysis methods are becoming more important as a complement to the human eye. Using one or more cameras yields a large amount of data, and the need for data compression is more than obvious. Crime scenes often involve persons or moving objects, and the available coding techniques are more or less useful. Our goal is to propose a possible system that represents the best compromise with respect to what needs to be recorded, movements in the recorded scene, loss of information, resolution, etc., to secure efficient recording of the crime and enable forensic analysis. The preventive effect of having a well-functioning surveillance system and well-established image analysis methods is not to be neglected. Aspects of this next generation of digital surveillance systems are discussed in this paper.

  13. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey

    PubMed Central

    Rau, Hsiao-Hsien; Chen, Kang-Hua

    2017-01-01

    Background Empowering personal health records (PHRs) supports a basic human right and fosters awareness of, and intention for, health promotion. As health care delivery shifts toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical record (EMR) repository for consumers. However, it is not yet a PHR, and to date there is no platform that lets patients manage their own PHRs. Objective This study creates a vision of a value-added platform for personal health data analysis that lets consumers manage their health records based on the contents of "My health bank." It aimed to examine consumer expectations regarding PHRs using the importance-performance analysis, to identify the key success factors and important aspects, and to offer suggestions for future development. Methods This is a cross-sectional study conducted in Taiwan. Web-based invitations to participate were distributed through Facebook. Respondents were asked to watch an introductory movie about PHRs before filling in the questionnaire. The questionnaire covered 2 aspects, (1) system functions and (2) system design, security, and privacy, with 12 and 7 questions, respectively, each rated on a 5-point Likert scale ranging from 1 ("strongly disagree") to 5 ("strongly agree"). The questionnaire data were then analyzed with IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. Results This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), and held a university degree (228 of 350 participants, 65.1%). Most were students (195 of 350 participants, 56.6%) with a monthly income of less than NT $30,000 (230 of 350 participants, 65.7%), lived in northern Taiwan (236 of 350 participants, 67.4%), and reported good self-identified health status (171 of 350 participants, 48.9%). The importance-performance analysis showed the following: (1) instead of complex functions, people simply want a platform that lets them integrate and manage their medical visit, health examination, and life behavior records; (2) they do not care whether their PHR is shared with others; and (3) most participants consider the system security design unimportant, yet they are also unsatisfied with the current security design. Conclusions Overall, the issues receiving the most user attention were the system functions, circulation, integrity, ease of use, and continuity of the PHRs, along with data security and privacy protection. PMID:28450273
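    The importance-performance analysis described above classifies each item into one of four quadrants relative to the grand means of importance and performance. A minimal sketch; the item names and ratings below are hypothetical, not the study's actual questionnaire data:

```python
import numpy as np

# Each item: (mean importance, mean performance) on a 5-point scale.
# Item names and values are hypothetical illustrations.
items = {
    "integrate medical visit records": (4.5, 3.0),
    "share PHR with others":           (2.8, 3.4),
    "security design":                 (3.0, 2.5),
    "ease of use":                     (4.2, 4.0),
}
imp  = np.mean([v[0] for v in items.values()])   # grand mean importance
perf = np.mean([v[1] for v in items.values()])   # grand mean performance

def quadrant(i, p):
    """Classic IPA quadrant labels relative to the grand means."""
    if i >= imp and p < perf:  return "concentrate here"
    if i >= imp and p >= perf: return "keep up the good work"
    if i < imp and p < perf:   return "low priority"
    return "possible overkill"

for name, (i, p) in items.items():
    print(name, "->", quadrant(i, p))
```

Items that are highly important but underperforming ("concentrate here") are the key success factors the method is designed to surface.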

  14. 76 FR 76103 - Privacy Act; Notice of Proposed Rulemaking: State-78, Risk Analysis and Management Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-06

    ... Rulemaking: State-78, Risk Analysis and Management Records SUMMARY: Notice is hereby given that the... portions of the Risk Analysis and Management (RAM) Records, State-78, system of records contain criminal...) * * * (2) * * * Risk Analysis and Management Records, STATE-78. * * * * * (b) * * * (1) * * * Risk Analysis...

  15. Relationship between Porcine Sperm Motility and Sperm Enzymatic Activity using Paper-based Devices

    NASA Astrophysics Data System (ADS)

    Matsuura, Koji; Huang, Han-Wei; Chen, Ming-Cheng; Chen, Yu; Cheng, Chao-Min

    2017-04-01

    Mammalian sperm motility has traditionally been analyzed to determine fertility using computer-assisted semen analysis (CASA) systems. To develop low-cost and robust male fertility diagnostics, we created a paper-based MTT assay and used it to estimate motile sperm concentration. While porcine sperm motility was inhibited using inhibitors of sperm enzymes related to mitochondrial activity and glycolysis, we simultaneously recorded sperm motility and enzymatic reactivity using a portable motility analysis system (iSperm) and the paper-based MTT assay, respectively. For the paper-based MTT assay, we calculated the area mean value signal intensity (AMV) to evaluate enzymatic reactivity. Both sperm motility and AMV decreased following treatment with iodoacetamide (IODO) and 3-bromopyruvic acid (3BP), both of which are inhibitors of glycolytic enzymes including glyceraldehyde-3-phosphate dehydrogenase (GAPDH). We found a correlation between motility recorded with iSperm and AMV from our paper-based assay (P < 0.05), suggesting that a sperm-related enzymatic reaction is involved in sperm motility. Under this protocol, MTT reduction was coupled with catalysis by GAPDH and was promoted by electron transfer from NADH. Based on this inhibitor study, sperm motility can be estimated using our paper-based MTT assay.
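    The reported motility-AMV correlation is a standard Pearson analysis on paired measurements. A minimal sketch with hypothetical paired values (not the study's data); the t statistic is what a significance test of the correlation would be based on:

```python
import numpy as np

# Hypothetical paired measurements: percent motile sperm from the
# portable analyzer vs AMV signal intensity from the paper-based assay.
motility = np.array([80, 65, 50, 40, 30, 20], float)
amv      = np.array([0.90, 0.80, 0.62, 0.55, 0.42, 0.30])

r = np.corrcoef(motility, amv)[0, 1]          # Pearson correlation
n = len(motility)
t_stat = r * np.sqrt((n - 2) / (1 - r**2))    # basis for a significance test
print(round(r, 3))   # strong positive correlation
```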

  16. Electronic patient record use during ward rounds: a qualitative study of interaction between medical staff.

    PubMed

    Morrison, Cecily; Jones, Matthew; Blackwell, Alan; Vuylsteke, Alain

    2008-01-01

    Electronic patient records are becoming more common in critical care. As their design and implementation are optimized for single users rather than for groups, we aimed to understand the differences in interaction between members of a multidisciplinary team during ward rounds using an electronic, as opposed to paper, patient medical record. A qualitative study of morning ward rounds of an intensive care unit that triangulates data from video-based interaction analysis, observation, and interviews. Our analysis demonstrates several difficulties the ward round team faced when interacting with each other using the electronic record compared with the paper one. The physical setup of the technology may impede the consultant's ability to lead the ward round and may prevent other clinical staff from contributing to discussions. We discuss technical and social solutions for minimizing the impact of introducing an electronic patient record, emphasizing the need to balance both. We note that awareness of the effects of technology can enable ward-round teams to adapt their formations and information sources to facilitate multidisciplinary communication during the ward round.

  17. Electronic patient record use during ward rounds: a qualitative study of interaction between medical staff

    PubMed Central

    Morrison, Cecily; Jones, Matthew; Blackwell, Alan; Vuylsteke, Alain

    2008-01-01

    Introduction Electronic patient records are becoming more common in critical care. As their design and implementation are optimized for single users rather than for groups, we aimed to understand the differences in interaction between members of a multidisciplinary team during ward rounds using an electronic, as opposed to paper, patient medical record. Methods A qualitative study of morning ward rounds of an intensive care unit that triangulates data from video-based interaction analysis, observation, and interviews. Results Our analysis demonstrates several difficulties the ward round team faced when interacting with each other using the electronic record compared with the paper one. The physical setup of the technology may impede the consultant's ability to lead the ward round and may prevent other clinical staff from contributing to discussions. Conclusions We discuss technical and social solutions for minimizing the impact of introducing an electronic patient record, emphasizing the need to balance both. We note that awareness of the effects of technology can enable ward-round teams to adapt their formations and information sources to facilitate multidisciplinary communication during the ward round. PMID:19025662

  18. Phase 2 : evaluation of the national crash experience : comparison of CARDfile national motor vehicle accident projections with projections from NASS

    DOT National Transportation Integrated Search

    1987-07-01

    This report details the results of an analysis that compared the Crash Avoidance Research Data Base (CARDfile) with the National Accident Sampling System (NASS). CARDfile combines, in one data base, the police accident records for three years (...

  19. Estimating 1970-99 average annual groundwater recharge in Wisconsin using streamflow data

    USGS Publications Warehouse

    Gebert, Warren A.; Walker, John F.; Kennedy, James L.

    2011-01-01

    Average annual recharge in Wisconsin for the period 1970-99 was estimated using streamflow data from U.S. Geological Survey continuous-record streamflow-gaging stations and partial-record sites. Partial-record sites have discharge measurements collected during low-flow conditions. The average annual base flow of a stream divided by its drainage area is a good approximation of the recharge rate; therefore, once average annual base flow is determined, recharge can be calculated. Estimates of recharge for nearly 72 percent of the surface area of the State are provided. The results illustrate substantial spatial variability of recharge across the State, ranging from less than 1 inch to more than 12 inches per year. The average basin size for partial-record sites (50 square miles) was less than that for the gaging stations (305 square miles). Including results for smaller basins reveals a spatial variability that would otherwise be smoothed out using only estimates for larger basins. An error analysis indicates that the techniques used provide base flow estimates with standard errors ranging from 5.4 to 14 percent.
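    The base-flow-to-recharge conversion above is a unit calculation: base flow divided by drainage area gives a water depth per unit time. With base flow in cubic feet per second and area in square miles, 1 cfs/mi^2 corresponds to about 13.57 inches per year. The basin numbers below are illustrative:

```python
# Base-flow-derived recharge: average annual base flow / drainage area.
SECONDS_PER_YEAR = 365 * 24 * 3600          # 31,536,000 s
SQFT_PER_SQMI = 5280 ** 2                   # 27,878,400 ft^2

def recharge_in_per_yr(base_flow_cfs, area_mi2):
    """Recharge depth in inches/year from base flow (cfs) and area (mi^2)."""
    depth_ft = base_flow_cfs * SECONDS_PER_YEAR / (area_mi2 * SQFT_PER_SQMI)
    return depth_ft * 12.0                  # feet -> inches

print(round(recharge_in_per_yr(1.0, 1.0), 2))    # 13.57 in/yr per cfs/mi^2
print(round(recharge_in_per_yr(25.0, 50.0), 1))  # illustrative 50 mi^2 basin
```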

  20. Fundamentals of Medicare Patient Safety Surveillance: Intent, Relevance, and Transparency

    DTIC Science & Technology

    2005-01-01

    context of a patient safety surveillance system, the medical record barrier is actually twofold, presenting two challenges—indexing and analysis...preventable adverse events in elderly patients : population based review of medical records. BMJ 2000;320(7237):741–4. 3. Hayward R, Hofer T...Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients - results of the Harvard Medical Practice

  1. ECG-based gating in ultra high field cardiovascular magnetic resonance using an independent component analysis approach.

    PubMed

    Krug, Johannes W; Rose, Georg; Clifford, Gari D; Oster, Julien

    2013-11-19

    In Cardiovascular Magnetic Resonance (CMR), the synchronization of image acquisition with heart motion is performed in clinical practice by processing the electrocardiogram (ECG). The ECG-based synchronization is well established for MR scanners with magnetic fields up to 3 T. However, this technique is prone to errors in ultra high field environments, e.g. in 7 T MR scanners as used in research applications. The high magnetic fields cause severe magnetohydrodynamic (MHD) effects which disturb the ECG signal. Image synchronization is thus less reliable and yields artefacts in CMR images. A strategy based on Independent Component Analysis (ICA) was pursued in this work to enhance the ECG contribution and attenuate the MHD effect. ICA was applied to 12-lead ECG signals recorded inside a 7 T MR scanner. An automatic source identification procedure was proposed to identify an independent component (IC) dominated by the ECG signal. The identified IC was then used for detecting the R-peaks. The presented ICA-based method was compared to other R-peak detection methods using 1) the raw ECG signal, 2) the raw vectorcardiogram (VCG), 3) the state-of-the-art gating technique based on the VCG, 4) an updated version of the VCG-based approach and 5) the ICA of the VCG. ECG signals from eight volunteers were recorded inside the MR scanner. Recordings with an overall length of 87 min accounting for 5457 QRS complexes were available for the analysis. The records were divided into a training and a test dataset. In terms of R-peak detection within the test dataset, the proposed ICA-based algorithm achieved a detection performance with an average sensitivity (Se) of 99.2%, a positive predictive value (+P) of 99.1%, with an average trigger delay and jitter of 5.8 ms and 5.0 ms, respectively. 
    Long-term stability of the demixing matrix was shown based on two measurements of the same subject separated by one year, for which an averaged detection performance of Se = 99.4% and +P = 99.7% was achieved. Compared to the state-of-the-art VCG-based gating technique at 7 T, the proposed method increased the sensitivity and positive predictive value within the test dataset by 27.1% and 42.7%, respectively. The presented ICA-based method allows the estimation and identification of an IC dominated by the ECG signal. R-peak detection based on this IC outperforms the state-of-the-art VCG-based technique in a 7 T MR scanner environment.
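    The two core steps, selecting the ECG-dominated component and detecting R-peaks on it, can be sketched as below. The periodicity-based selection criterion and the simple threshold detector are illustrative stand-ins for the paper's automatic source identification procedure, and the signals, sampling rate, and thresholds are synthetic assumptions:

```python
import numpy as np

fs = 250  # Hz, assumed sampling rate (not stated in the abstract)

def periodicity_score(x, fs, rr_lo=0.4, rr_hi=1.5):
    """Peak of the normalized autocorrelation at plausible RR-interval lags."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / ac[0]
    return ac[int(rr_lo * fs):int(rr_hi * fs)].max()

def detect_rpeaks(x, fs, refractory=0.25):
    """Threshold detector with a refractory period (a deliberately
    simple stand-in for the detectors compared in the paper)."""
    thresh = x.mean() + 2 * x.std()
    peaks, last = [], -10**9
    for i in range(1, len(x) - 1):
        if x[i] > thresh and x[i] >= x[i - 1] and x[i] >= x[i + 1]:
            if (i - last) / fs > refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)

# synthetic "independent components": broadband noise vs an ECG-like
# spike train at 60 beats/min
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
ecg_like = np.zeros_like(t)
ecg_like[fs // 2::fs] = 1.0                   # one spike per second
components = [rng.normal(0, 0.3, len(t)),
              ecg_like + 0.02 * rng.normal(size=len(t))]
best = max(components, key=lambda c: periodicity_score(c, fs))
print(len(detect_rpeaks(best, fs)))           # 10 R-peaks in 10 s
```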

  2. Understanding high magnitude flood risk: evidence from the past

    NASA Astrophysics Data System (ADS)

    MacDonald, N.

    2009-04-01

    The average length of gauged river flow records in the UK is ~25 years, which presents a problem in determining flood risk for high-magnitude flood events. Severe floods have been recorded in many UK catchments during the past 10 years, increasing the uncertainty in conventional flood risk estimates based on river flow records. Current uncertainty in flood risk has implications for society (insurance costs), individuals (personal vulnerability) and water resource managers (flood/drought risk). An alternative approach is required which can improve current understanding of the flood frequency/magnitude relationship. Historical documentary accounts are now recognised as a valuable resource when considering the flood frequency/magnitude relationship, but little consideration has been given to the temporal and spatial distribution of these records. Building on previous research based on British rivers (urban centre): Ouse (York), Trent (Nottingham), Tay (Perth), Severn (Shrewsbury), Dee (Chester), Great Ouse (Cambridge), Sussex Ouse (Lewes), Thames (Oxford), Tweed (Kelso) and Tyne (Hexham), this work considers the spatial and temporal distribution of historical flooding. The selected sites provide a network covering many of the largest river catchments in Britain, based on urban centres with long detailed documentary flood histories. The chronologies offer an opportunity to assess long-term patterns of flooding, indirectly determining periods of climatic variability and potentially increased geomorphic activity. This research represents the first coherent large scale analysis undertaken of historical multi-catchment flood chronologies, providing an unparalleled network of sites, permitting analysis of the spatial and temporal distribution of historical flood patterns on a national scale.

  3. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features.

    PubMed

    Chudáček, V; Spilka, J; Janků, P; Koucký, M; Lhotská, L; Huptych, M

    2011-08-01

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on macroscopic morphological features and has so far avoided adopting advances from the heart rate variability (HRV) research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time- and frequency-domain features, are investigated and assessed based on their statistical significance in the task of classifying the FHR into the three FIGO classes. We assess the features on a large data set (552 records) and, unlike other published papers, we use three-class expert evaluation of the records instead of pH values. We conclude the paper by presenting the best uncorrelated features and their individual ranks of importance according to a meta-analysis of three different ranking methods. The number of accelerations and decelerations, the interval index, as well as Lempel-Ziv complexity and Higuchi's fractal dimension are among the top five features.
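    A meta-analysis of several ranking methods can be as simple as averaging per-method ranks. A minimal sketch; the feature names echo the abstract, but the rank values and the mean-rank aggregation rule are illustrative assumptions:

```python
import numpy as np

# Hypothetical ranks (1 = most important) from three ranking methods.
features = ["accelerations", "decelerations", "interval_index",
            "lempel_ziv", "higuchi_fd", "baseline_mean"]
ranks = {
    "method_A": [1, 2, 3, 4, 5, 6],
    "method_B": [2, 1, 4, 3, 6, 5],
    "method_C": [1, 3, 2, 5, 4, 6],
}
mean_rank = np.mean(list(ranks.values()), axis=0)   # average rank per feature
order = [features[i] for i in np.argsort(mean_rank)]
print(order[0])   # feature with the best (lowest) mean rank
```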

  4. Changes in cloud properties over East Asia deduced from the CLARA-A2 satellite data record

    NASA Astrophysics Data System (ADS)

    Benas, Nikos; Fokke Meirink, Jan; Hollmann, Rainer; Karlsson, Karl-Göran; Stengel, Martin

    2017-04-01

    Studies of cloud properties and processes, and their role in the Earth's changing climate, have advanced during the past decades. A significant part of this advance was enabled by satellite measurements, which offer global and continuous monitoring. Recently, a new satellite-based cloud data record was released: the CM SAF cLoud, Albedo and surface RAdiation dataset from AVHRR data, second edition (CLARA-A2), which includes high-resolution cloud macro- and micro-physical properties derived from the AVHRR instruments on board the NOAA and MetOp polar orbiters. Based on this data record, an analysis of cloud property changes over East Asia during the 12-year period 2004-2015 was performed. Significant changes were found in both optical and geometric cloud properties, including increases in cloud liquid water path and cloud top height. The cloud droplet number concentration (CDNC) was studied specifically in order to gain further insight into possible connections between aerosol and cloud processes. To this end, aerosol and cloud observations from MODIS, covering the same area and period, were included in the analysis.

  5. Benthic amphipods (Amphipoda: Gammaridea and Corophiidea) from the Mexican southeast sector of the Gulf of Mexico: checklist, new records and zoogeographic comments.

    PubMed

    Paz-Ríos, Carlos E; Ardisson, Pedro-Luis

    2013-01-01

    The southeast region of the Gulf of Mexico is considered to be biologically important, because it is a connection and transition zone between the Caribbean and the Gulf of Mexico, harboring great marine biodiversity. Nevertheless, benthic amphipods have been poorly studied in the Mexican southeast sector of the Gulf of Mexico, with few studies listing species. The aim of this study is to provide an updated checklist of species for the Mexican southeast sector (based on a literature review and records from the present study) as well as a brief zoogeographical analysis of the Gulf of Mexico amphipod fauna, putting it in context with the fauna of the tropical western Atlantic. Fifty-five species were listed for the Mexican southeast sector; 36 of them showed a geographical extension to the Yucatan continental shelf, representing 23 new records for the Mexican southeast sector, nine for the southeast region, and four for the Gulf of Mexico. Based on the zoogeographical analysis, there is support for applying the Carolinian and Caribbean zoogeographic provinces to amphipods in the Gulf of Mexico.

  6. Urban Noise Recorded by Stationary Monitoring Stations

    NASA Astrophysics Data System (ADS)

    Bąkowski, Andrzej; Radziszewski, Leszek; Dekýš, Vladimir

    2017-10-01

    The paper presents the analysis results of the equivalent sound level recorded by two road traffic noise monitoring stations. The stations were located in Kielce (an example of a medium-sized town in Poland) on the roads leading out of town toward Łódź and Lublin. The measurements were carried out by stationary stations monitoring noise and motor vehicle traffic. RMS values based on the A-weighted sound level were recorded every 1 s in a buffer, and the results were registered every 1 min over the investigation period. The registered data were the basis for calculating the equivalent sound level for three time intervals: from 6:00 to 18:00, from 18:00 to 22:00, and from 22:00 to 6:00. The analysis included the values of the equivalent sound level recorded for different days of the week, split into 24-h periods, nights, days, and evenings. The data analysed were recordings from 2013. The agreement of the distribution of the variable under analysis with a normal distribution was evaluated. It was demonstrated that in most cases (for both roads) there was sufficient evidence to reject the null hypothesis at the significance level of 0.05. Compared with the Łódź road data, the Lublin road data included more cases for which the null hypothesis could not be rejected. Uncertainties of the equivalent sound level measurements were compared within the periods under analysis. The standard deviation, coefficient of variation, positional coefficient of variation, and quartile deviation were proposed for a comparative analysis of the scatter in the obtained data. The investigations indicated that the recorded data varied depending on the traffic route and time interval. The differences concerned the values of the uncertainties and the coefficients of variation of the equivalent sound levels.
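    The equivalent sound level computed from 1-s A-weighted samples is an energy average, Leq = 10·log10(mean(10^(L/10))), not an arithmetic mean of decibel values. A minimal sketch with illustrative samples (not measured data):

```python
import numpy as np

def leq(levels_db):
    """Equivalent continuous sound level from 1-s A-weighted samples:
    Leq = 10*log10(mean(10^(L/10)))."""
    levels = np.asarray(levels_db, float)
    return 10 * np.log10(np.mean(10 ** (levels / 10)))

# illustrative 1-s samples (dBA), not measured data
day = [65, 70, 68, 72, 66]
print(round(leq(day), 1))
# energy averaging: a single loud second dominates the result
print(round(leq([50, 50, 50, 90]), 1))   # ≈ 84.0 dBA, far above the mean of 60
```

This is why a few loud vehicle passages can dominate an interval's Leq even when most seconds are quiet.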

  7. [Electronic versus paper-based patient records: a cost-benefit analysis].

    PubMed

    Neubauer, A S; Priglinger, S; Ehrt, O

    2001-11-01

    The aim of this study is to compare the costs and benefits of electronic, paperless patient records with those of conventional paper-based charts. Costs and benefits of planned electronic patient records are calculated for a university eye hospital with 140 beds. Benefit is determined by the direct costs saved by electronic records. In the example shown, the additional benefits of electronic patient records, as far as they can be quantified, total 192,000 DM per year. The costs of the necessary investments are 234,000 DM per year when using linear depreciation over 4 years, giving additional annual costs for electronic patient records of 42,000 DM. Different scenarios were analyzed. Increasing the depreciation period to 6 years reduces the deficit to only approximately 9,000 DM. Increased wages reduce the deficit further, while the deficit increases with a loss of functions of the electronic patient record. However, several benefits of electronic records regarding research, teaching, quality control, and better data access cannot easily be quantified and would greatly increase the benefit-to-cost ratio. Only part of the advantages of electronic patient records can easily be quantified in terms of directly saved costs. The small cost deficit calculated in this example is overcompensated by several benefits that can only be enumerated qualitatively due to problems of quantification.

  8. High ink absorption performance of inkjet printing based on SiO2@Al13 core-shell composites

    NASA Astrophysics Data System (ADS)

    Chen, YiFan; Jiang, Bo; Liu, Li; Du, Yunzhe; Zhang, Tong; Zhao, LiWei; Huang, YuDong

    2018-04-01

    The continuing growth of the inkjet market makes high-performance inkjet printing media increasingly necessary. A composite material based on a core-shell structure has been developed and applied to prepare an inkjet printing layer. In this contribution, ink printing record layers based on a SiO2@Al13 core-shell composite were elaborated. The prepared core-shell composite materials were characterized by X-ray photoelectron spectroscopy (XPS), zeta potential, X-ray diffraction (XRD), and scanning electron microscopy (SEM). The results proved the presence of electrostatic adsorption between SiO2 and Al13 molecules with the formation of a well-dispersed system. In addition, based on the adsorption and liquid permeability analysis, the SiO2@Al13 ink printing record layer achieved a relatively high ink uptake (2.5 gmm-1) and permeability (87%). The smoothness and glossiness of the SiO2@Al13 record layers were higher than those of SiO2 record layers. The core-shell structure facilitated the dispersion of the silica, thereby improving its ink absorption performance and yielding clear printed images. Thus, the proposed procedure based on the SiO2@Al13 core-shell structure of dye particles could be applied as a promising strategy for inkjet printing.

  9. Independent component analysis for cochlear implant artifacts attenuation from electrically evoked auditory steady-state response measurements

    NASA Astrophysics Data System (ADS)

    Deprez, Hanne; Gransier, Robin; Hofmann, Michael; van Wieringen, Astrid; Wouters, Jan; Moonen, Marc

    2018-02-01

    Objective. Electrically evoked auditory steady-state responses (EASSRs) are potentially useful for objective cochlear implant (CI) fitting and follow-up of the auditory maturation in infants and children with a CI. EASSRs are recorded in the electro-encephalogram (EEG) in response to electrical stimulation with continuous pulse trains, and are distorted by significant CI artifacts related to this electrical stimulation. The aim of this study is to evaluate a CI artifacts attenuation method based on independent component analysis (ICA) for three EASSR datasets. Approach. ICA has often been used to remove CI artifacts from the EEG to record transient auditory responses, such as cortical evoked auditory potentials. Independent components (ICs) corresponding to CI artifacts are then often manually identified. In this study, an ICA based CI artifacts attenuation method was developed and evaluated for EASSR measurements with varying CI artifacts and EASSR characteristics. Artifactual ICs were automatically identified based on their spectrum. Main results. For 40 Hz amplitude modulation (AM) stimulation at comfort level, in high SNR recordings, ICA succeeded in removing CI artifacts from all recording channels, without distorting the EASSR. For lower SNR recordings, with 40 Hz AM stimulation at lower levels, or 90 Hz AM stimulation, ICA either distorted the EASSR or could not remove all CI artifacts in most subjects, except for two of the seven subjects tested with low level 40 Hz AM stimulation. Noise levels were reduced after ICA was applied, and up to 29 ICs were rejected, suggesting poor ICA separation quality. Significance. We hypothesize that ICA is capable of separating CI artifacts and EASSR in case the contralateral hemisphere is EASSR dominated. For small EASSRs or large CI artifact amplitudes, ICA separation quality is insufficient to ensure complete CI artifacts attenuation without EASSR distortion.
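The automatic, spectrum-based rejection of artifactual independent components described above can be sketched as follows. This is not the authors' implementation: the three-channel synthetic mixture, the pulse-train surrogate for the CI artifact, and the 500 Hz spectral threshold are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

fs = 8000
t = np.arange(0, 1, 1 / fs)
response = np.sin(2 * np.pi * 40 * t)            # 40 Hz EASSR-like source
artifact = np.sign(np.sin(2 * np.pi * 900 * t))  # pulse-train-like CI artifact
background = np.sin(2 * np.pi * 10 * t)          # slow background activity
A = np.array([[1.0, 0.8, 0.3], [0.6, 1.0, 0.5], [0.4, 0.7, 1.0]])
X = np.c_[response, artifact, background] @ A.T  # three synthetic "EEG channels"

ica = FastICA(n_components=3, random_state=0, max_iter=1000)
S = ica.fit_transform(X)                         # independent components

# Flag ICs whose spectral power concentrates above an assumed 500 Hz cut-off,
# i.e. far above the response frequencies but covering the pulse-train energy.
freqs = np.fft.rfftfreq(len(t), 1 / fs)
power = np.abs(np.fft.rfft(S, axis=0)) ** 2
artifactual = power[freqs > 500].sum(axis=0) / power.sum(axis=0) > 0.5

S_clean = S.copy()
S_clean[:, artifactual] = 0                      # zero out artifact components
X_clean = ica.inverse_transform(S_clean)         # reconstruct cleaned channels
```

In the real recordings the separation quality depends on the relative size of EASSR and artifact, which is exactly the limitation the study reports.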

  10. A Way to Understand Inpatients Based on the Electronic Medical Records in the Big Data Environment

    PubMed Central

    2017-01-01

    In recent decades, information technology in healthcare, such as the Electronic Medical Record (EMR) system, has the potential to improve the service quality and cost efficiency of hospitals. The continuous use of EMR systems has generated a great amount of data. However, hospitals tend to use these data to report their operational efficiency rather than to understand their patients. Based on a dataset of inpatients' medical records from a Chinese general public hospital, this study applies a configuration analysis from a managerial perspective and explains inpatient management in a different way. Four inpatient configurations (valued patients, managed patients, normal patients, and potential patients) are identified by measures of length of stay and total hospital cost. The implications of these findings are discussed. PMID:28280506
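A configuration analysis of this kind can be sketched in a few lines. The cut-offs (median splits) and the mapping of the four labels onto the quadrants are our assumptions for illustration; the paper's exact criteria are not given in the abstract.

```python
import statistics

def classify(patients):
    """Assign each inpatient record to one of four configurations using
    median splits on length of stay (LOS, days) and total hospital cost.
    Label-to-quadrant mapping is an assumption, not the paper's."""
    los_cut = statistics.median(p["los"] for p in patients)
    cost_cut = statistics.median(p["cost"] for p in patients)
    labels = {
        (False, True): "valued patients",     # short stay, high cost
        (True, True): "managed patients",     # long stay, high cost
        (False, False): "normal patients",    # short stay, low cost
        (True, False): "potential patients",  # long stay, low cost
    }
    return [labels[(p["los"] > los_cut, p["cost"] > cost_cut)] for p in patients]

patients = [
    {"los": 2, "cost": 100}, {"los": 10, "cost": 100},
    {"los": 2, "cost": 900}, {"los": 10, "cost": 900},
]
groups = classify(patients)
```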

  11. Telemetry data storage systems technology for the Space Station Freedom era

    NASA Technical Reports Server (NTRS)

    Dalton, John T.

    1989-01-01

    This paper examines the requirements and functions of telemetry-data recording and storage systems, and the data-storage-system technology projected for the Space Station, with particular attention given to the Space Optical Disk Recorder, an on-board storage subsystem based on 160-gigabit erasable optical disk units, each capable of operating at 300 Mbit/s. Consideration is also given to storage systems for ground transport recording, which include systems for data capture, buffering, processing, and delivery on the ground. These can be categorized as first-in first-out storage, fast random-access storage, and slow-access storage with staging. Based on projected mission manifests and data rates, worst-case requirements were developed for these three storage architecture functions. The results of the analysis are presented.

  12. Head movement compensation in real-time magnetoencephalographic recordings.

    PubMed

    Little, Graham; Boe, Shaun; Bardouille, Timothy

    2014-01-01

    Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement. Effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real-time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess validity of the rtSE technique. The rtSE method allowed for accurate estimation of current source activity at the source level in real-time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps:
    • Data acquisition
    • Head position estimation
    • Source localization
    • Real-time source estimation
    This work explains the technical details and validates each of these steps.

  13. Nanowire-Based Electrode for Acute In Vivo Neural Recordings in the Brain

    PubMed Central

    Suyatin, Dmitry B.; Wallman, Lars; Thelin, Jonas; Prinz, Christelle N.; Jörntell, Henrik; Samuelson, Lars; Montelius, Lars; Schouenborg, Jens

    2013-01-01

    We present an electrode, based on structurally controlled nanowires, as a first step towards developing a useful nanostructured device for neurophysiological measurements in vivo. The sensing part of the electrode is made of a metal film deposited on top of an array of epitaxially grown gallium phosphide nanowires. We achieved the first functional testing of the nanowire-based electrode by performing acute in vivo recordings in the rat cerebral cortex and withstanding multiple brain implantations. Due to the controllable geometry of the nanowires, this type of electrode can be used as a model system for further analysis of the functional properties of nanostructured neuronal interfaces in vivo. PMID:23431387

  14. DIGIMEN, optical mass memory investigations, volume 2

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The DIGIMEM phase of the Optical Mass Memory Investigation Program addressed problems related to the analysis, design, and implementation of a direct digital optical recorder/reproducer. Effort was placed on developing an operational archival mass storage system to support one or more key NASA missions. The primary activity of the DIGIMEM program phase was the design, fabrication, and test and evaluation of a breadboard digital optical recorder/reproducer. Starting with technology and subsystems perfected during the HOLOMEM program phase, a fully operational optical spot recording breadboard that met or exceeded all program goals was evaluated. A thorough evaluation of several high-resolution electrophotographic recording films was performed, and a preliminary database management/end-user requirements survey was completed.

  15. Adjusted peak-flow frequency estimates for selected streamflow-gaging stations in or near Montana based on data through water year 2011: Chapter D in Montana StreamStats

    USGS Publications Warehouse

    Sando, Steven K.; Sando, Roy; McCarthy, Peter M.; Dutton, DeAnn M.

    2016-04-05

    The climatic conditions of the specific time period during which peak-flow data were collected at a given streamflow-gaging station (hereinafter referred to as gaging station) can substantially affect how well the peak-flow frequency (hereinafter referred to as frequency) results represent long-term hydrologic conditions. Differences in the timing of the periods of record can result in substantial inconsistencies in frequency estimates for hydrologically similar gaging stations. Potential for inconsistency increases with decreasing peak-flow record length. The representativeness of the frequency estimates for a short-term gaging station can be adjusted by various methods including weighting the at-site results in association with frequency estimates from regional regression equations (RREs) by using the Weighted Independent Estimates (WIE) program. Also, for gaging stations that cannot be adjusted by using the WIE program because of regulation or drainage areas too large for application of RREs, frequency estimates might be improved by using record extension procedures, including a mixed-station analysis using the maintenance of variance type I (MOVE.1) procedure. The U.S. Geological Survey, in cooperation with the Montana Department of Transportation and the Montana Department of Natural Resources and Conservation, completed a study to provide adjusted frequency estimates for selected gaging stations through water year 2011. The purpose of Chapter D of this Scientific Investigations Report is to present adjusted frequency estimates for 504 selected streamflow-gaging stations in or near Montana based on data through water year 2011. Estimates of peak-flow magnitudes for the 66.7-, 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities are reported.
These annual exceedance probabilities correspond to the 1.5-, 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals, respectively. The at-site frequency estimates were adjusted by weighting with frequency estimates from RREs using the WIE program for 438 selected gaging stations in Montana. These 438 selected gaging stations (1) had periods of record less than or equal to 40 years, (2) represented unregulated or minor regulation conditions, and (3) had drainage areas less than about 2,750 square miles. The weighted-average frequency estimates obtained by weighting with RREs generally are considered to provide improved frequency estimates. In some cases, there are substantial differences among the at-site frequency estimates, the regression-equation frequency estimates, and the weighted-average frequency estimates. In these cases, thoughtful consideration should be applied when selecting the appropriate frequency estimate. Some factors that might be considered when selecting the appropriate frequency estimate include (1) whether the specific gaging station has peak-flow characteristics that distinguish it from most other gaging stations used in developing the RREs for the hydrologic region; and (2) the length of the peak-flow record and the general climatic characteristics during the period when the peak-flow data were collected. For critical structure-design applications, a conservative approach would be to select the higher of the at-site frequency estimate and the weighted-average frequency estimate. The mixed-station MOVE.1 procedure generally was applied in cases where three or more gaging stations were located on the same large river and some of the gaging stations could not be adjusted using the weighted-average method because of regulation or drainage areas too large for application of RREs.
The mixed-station MOVE.1 procedure was applied to 66 selected gaging stations on 19 large rivers. The general approach for using mixed-station record extension procedures to adjust at-site frequencies involved (1) determining appropriate base periods for the gaging stations on the large rivers, (2) synthesizing peak-flow data for the gaging stations with incomplete peak-flow records during the base periods by using the mixed-station MOVE.1 procedure, and (3) conducting frequency analysis on the combined recorded and synthesized peak-flow data for each gaging station. Frequency estimates for the combined recorded and synthesized datasets for 66 gaging stations with incomplete peak-flow records during the base periods are presented. The uncertainties in the mixed-station record extension results are difficult to directly quantify; thus, it is important to understand the intended use of the estimated frequencies based on analysis of the combined recorded and synthesized datasets. The estimated frequencies are considered general estimates of frequency relations among gaging stations on the same stream channel that might be expected if the gaging stations had been gaged during the same long-term base period. However, because the mixed-station record extension procedures involve secondary statistical analysis with accompanying errors, the uncertainty of the frequency estimates is larger than would be obtained by collecting systematic records for the same number of years in the base period.
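The WIE-style weighting described above combines the at-site and regression estimates with weights inversely proportional to their variances, conventionally in log space. A hedged sketch under that assumption, with made-up discharge values and variances:

```python
import math

def weighted_estimate(q_site, var_site, q_reg, var_reg):
    """Combine an at-site and a regression flood-quantile estimate by
    inverse-variance weighting of their base-10 logarithms.  The estimate
    with the smaller variance receives the larger weight."""
    log_w = (math.log10(q_site) * var_reg + math.log10(q_reg) * var_site) \
            / (var_site + var_reg)
    return 10 ** log_w

# Hypothetical 1-percent AEP estimates (cfs) and their log-space variances.
q = weighted_estimate(q_site=12_000, var_site=0.04, q_reg=9_000, var_reg=0.02)
```

With the smaller variance on the regression estimate, the weighted result falls between the two inputs but closer to the regression value.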

  16. Gold rush - A swarm dynamics in games

    NASA Astrophysics Data System (ADS)

    Zelinka, Ivan; Bukacek, Michal

    2017-07-01

    This paper is focused on swarm intelligence techniques and their practical use in computer games. The aim is to show how swarm dynamics can be generated by a multiplayer game, then recorded, analyzed and eventually controlled. In this paper we also discuss the possibility of using swarm intelligence instead of game players. Based on our previous experiments, two games using swarm algorithms are mentioned briefly here: the strategy game StarCraft: Brood War, and TicTacToe, in which the SOMA algorithm has also taken the role of a player against a human player. The open research reported here has shown the potential benefit of swarm computation in the field of strategy games, and of player strategies based on recorded and analyzed swarm behavior. We propose a new game called Gold Rush as an experimental environment for human or artificial swarm behavior and its subsequent analysis.

  17. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    PubMed

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  18. Unobtrusive Biometric System Based on Electroencephalogram Analysis

    NASA Astrophysics Data System (ADS)

    Riera, A.; Soria-Frisch, A.; Caparrini, M.; Grau, C.; Ruffini, G.

    2007-12-01

    Features extracted from electroencephalogram (EEG) recordings have proved to be unique enough between subjects for biometric applications. We show here that biometrics based on these recordings offers a novel way to robustly authenticate or identify subjects. In this paper, we present a rapid and unobtrusive authentication method that only uses 2 frontal electrodes referenced to another one placed at the ear lobe. Moreover, the system makes use of a multistage fusion architecture, which is shown to improve system performance. The performance analysis of the system presented in this paper stems from an experiment with 51 subjects and 36 intruders, where an equal error rate (EER) of 3.4% is obtained, that is, a true acceptance rate (TAR) of 96.6% and a false acceptance rate (FAR) of 3.4%. The obtained performance measures improve on the results of similar systems presented in earlier work.
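The reported EER is the operating point at which the false acceptance and false rejection rates coincide. A generic way (not the authors' code) to estimate it from genuine and impostor match-score samples:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Sweep a decision threshold over all observed scores and return the
    rate at the point where FAR and FRR are closest to equal."""
    thresholds = np.sort(np.unique(np.concatenate([genuine, impostor])))
    best_gap, best_eer = np.inf, None
    for t in thresholds:
        far = np.mean(impostor >= t)  # impostors wrongly accepted
        frr = np.mean(genuine < t)    # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2
    return best_eer

# Synthetic, well-separated score distributions for illustration only.
rng = np.random.default_rng(0)
genuine = rng.normal(2.0, 1.0, 500)    # higher scores for true users
impostor = rng.normal(-2.0, 1.0, 500)
eer = equal_error_rate(genuine, impostor)
```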

  19. New clinical insights for transiently evoked otoacoustic emission protocols.

    PubMed

    Hatzopoulos, Stavros; Grzanka, Antoni; Martini, Alessandro; Konopka, Wieslaw

    2009-08-01

    The objective of the study was to optimize the area of time-frequency (TF) analysis and then investigate any stable patterns in the TF structure of otoacoustic emissions in a population of 152 healthy adults sampled over one year. TEOAE recordings were collected from 302 ears in subjects presenting normal hearing and normal impedance values. The responses were analyzed by the Wigner-Ville distribution (WVD). The TF region of analysis was optimized by examining the energy content of various rectangular and triangular TF regions. The TEOAE components from the initial recordings and from recordings 12 months later were compared in the optimized TF region. The best region for TF analysis was identified with base point 1 at 2.24 ms and 2466 Hz, base point 2 at 6.72 ms and 2466 Hz, and the top point at 2.24 ms and 5250 Hz. Correlation indices from the optimized TF region were higher than the traditional indices in the selected time window, and the differences were statistically significant. An analysis of the TF data within a 12-month period indicated an 85% TEOAE component similarity in 90% of the tested subjects.

  20. Unobtrusive integration of data management with fMRI analysis.

    PubMed

    Poliakov, Andrew V; Hertzenberg, Xenia; Moore, Eider B; Corina, David P; Ojemann, George A; Brinkley, James F

    2007-01-01

    This note describes a software utility, called X-batch, which addresses two pressing issues typically faced by functional magnetic resonance imaging (fMRI) neuroimaging laboratories: (1) analysis automation and (2) data management. The first issue is addressed by providing a simple batch mode processing tool for the popular SPM software package (http://www.fil.ion.ucl.ac.uk/spm/; Wellcome Department of Imaging Neuroscience, London, UK). The second is addressed by transparently recording metadata describing all aspects of the batch job (e.g., subject demographics, analysis parameters, locations and names of created files, date and time of analysis, and so on). These metadata are recorded as instances of an extended version of the Protégé-based Experiment Lab Book ontology created by the Dartmouth fMRI Data Center. The resulting instantiated ontology provides a detailed record of all fMRI analyses performed, and as such can be part of larger systems for neuroimaging data management, sharing, and visualization. The X-batch system is in use in our own fMRI research, and is available for download at http://X-batch.sourceforge.net/.

  1. GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.

    PubMed

    Sogo, Hiroyuki

    2013-09-01

    Eye movement analysis is an effective method for research on visual perception and cognition. However, recordings of eye movements present practical difficulties related to the cost of the recording devices and the programming of device controls for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments are reported on performance tests of GazeParser. These showed that the means and standard deviations for errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on participant. In gap/overlap tasks and antisaccade tasks, the latency and amplitude of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results showed that the GazeParser demonstrates adequate performance for use in psychological experiments.

  2. [Application of case-based learning in clinical internship teaching of conservative dentistry and endodontics].

    PubMed

    Liu, Sheng-bo; Peng, Bin; Song, Ya-ling; Xu, Qing-an

    2013-12-01

    To investigate the educational effect of the case-based learning (CBL) pattern on clinical internship in conservative dentistry and endodontics. Forty-one undergraduates were randomly assigned to a CBL group or a traditional teaching group. After a clinical internship in the department of conservative dentistry and endodontics for 11 weeks, each student in the 2 groups underwent comprehensive examinations including medical record writing, case analysis, academic knowledge, professional skills, and the ability to win the trust of patients. The scores were compared between the 2 groups using the SPSS 13.0 software package. There was no significant difference between the 2 groups with regard to the scores for academic knowledge and professional skills (P>0.05). However, the results for medical record writing, case analysis, and the ability to win the trust of patients showed significant differences between the 2 groups (P<0.05). Proper application of CBL in clinical internship in conservative dentistry and endodontics helps improve students' clinical thinking, comprehensive analysis, and adaptability to different patients.

  3. Economic and statistical analysis of time limitations for spotting fluids and fishing operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, P.S.; Brinkmann, P.E.; Taneja, P.K.

    1984-05-01

    This paper reviews the statistics of "spotting fluids" used to free stuck drill pipe as well as the economics and statistics of drill string fishing operations. Data were taken from Mobil Oil Exploration and Producing Southeast Inc.'s (MOEPSI) records from 1970-1981. Only those events which occur after a drill string becomes stuck are discussed. The data collected were categorized as Directional Wells and Straight Wells. Bar diagrams are presented to show the Success Ratio vs. Soaking Time for each of the two categories. An analysis was made to identify the elapsed time limit for placing the spotting fluid for maximum probability of success. Also determined were the statistical minimum and maximum soaking times. For determining the time limit for fishing operations, the following criteria were used: 1. The Risked "Economic Breakeven Analysis" concept was developed based on the work of Harrison. 2. Statistical Probability of Success based on MOEPSI's records from 1970-1981.

  4. Artifacts and noise removal in electrocardiograms using independent component analysis.

    PubMed

    Chawla, M P S; Verma, H K; Kumar, Vinod

    2008-09-26

    Independent component analysis (ICA) is a novel technique capable of separating independent components from complex electrocardiogram (ECG) signals. The purpose of this analysis is to evaluate the effectiveness of ICA in removing artifacts and noise from ECG recordings. ICA is applied to remove artifacts and noise in ECG segments of either an individual CSE database file or all files. The reconstructed ECGs are compared with the original ECG signal. For the four special cases discussed, the R-peak magnitudes of the CSE database ECG waveforms before and after applying ICA are also found. The results show that in most of the cases the percentage error in reconstruction is very small, and that there is a significant improvement in signal quality, i.e., SNR. All the ECG recording cases dealt with showed an improved ECG appearance after the use of ICA. This establishes the efficacy of ICA in eliminating noise and artifacts from electrocardiograms.

  5. A Brief Tool to Assess Image-Based Dietary Records and Guide Nutrition Counselling Among Pregnant Women: An Evaluation

    PubMed Central

    Ashman, Amy M; Collins, Clare E; Brown, Leanne J; Rae, Kym M

    2016-01-01

    Background Dietitians ideally should provide personally tailored nutrition advice to pregnant women. Provision is hampered by a lack of appropriate tools for nutrition assessment and counselling in practice settings. Smartphone technology, through the use of image-based dietary records, can address limitations of traditional methods of recording dietary intake. Feedback on these records can then be provided by the dietitian via smartphone. Efficacy and validity of these methods requires examination. Objective The aims of the Australian Diet Bytes and Baby Bumps study, which used image-based dietary records and a purpose-built brief Selected Nutrient and Diet Quality (SNaQ) tool to provide tailored nutrition advice to pregnant women, were to assess relative validity of the SNaQ tool for analyzing dietary intake compared with nutrient analysis software, to describe the nutritional intake adequacy of pregnant participants, and to assess acceptability of dietary feedback via smartphone. Methods Eligible women used a smartphone app to record everything they consumed over 3 nonconsecutive days. Records consisted of an image of the food or drink item placed next to a fiducial marker, with a voice or text description, or both, providing additional detail. We used the SNaQ tool to analyze participants’ intake of daily food group servings and selected key micronutrients for pregnancy relative to Australian guideline recommendations. A visual reference guide consisting of images of foods and drinks in standard serving sizes assisted the dietitian with quantification. Feedback on participants’ diets was provided via 2 methods: (1) a short video summary sent to participants’ smartphones, and (2) a follow-up telephone consultation with a dietitian. Agreement between dietary intake assessment using the SNaQ tool and nutrient analysis software was evaluated using Spearman rank correlation and Cohen kappa. 
Results We enrolled 27 women (median age 28.8 years, 8 Indigenous Australians, 15 primiparas), of whom 25 completed the image-based dietary record. Median intakes of grains, vegetables, fruit, meat, and dairy were below recommendations. Median (interquartile range) intake of energy-dense, nutrient-poor foods was 3.5 (2.4-3.9) servings/day and exceeded recommendations (0-2.5 servings/day). Positive correlations between the SNaQ tool and nutrient analysis software were observed for energy (ρ=.898, P<.001) and all selected micronutrients (iron, calcium, zinc, folate, and iodine, ρ range .510-.955, all P<.05), both with and without vitamin and mineral supplements included in the analysis. Cohen kappa showed moderate to substantial agreement for selected micronutrients when supplements were included (kappa range .488-.803, all P ≤.001) and for calcium, iodine, and zinc when excluded (kappa range .554-.632, all P<.001). A total of 17 women reported changing their diet as a result of the personalized nutrition advice. Conclusions The SNaQ tool demonstrated acceptable validity for assessing adequacy of key pregnancy nutrient intakes and preliminary evidence of utility to support dietitians in providing women with personalized advice to optimize nutrition during pregnancy. PMID:27815234
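The two agreement statistics used in the evaluation above can be computed with standard libraries. The intake values and adequacy codes below are invented stand-ins, not study data:

```python
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Spearman rank correlation for continuous nutrient intakes
# (hypothetical iron intakes, mg/day, from the two assessment methods).
snaq_iron = [8.2, 11.5, 6.0, 14.3, 9.9, 7.1]       # SNaQ tool
software_iron = [7.9, 12.1, 6.4, 13.8, 10.5, 7.0]  # nutrient analysis software
rho, p_value = spearmanr(snaq_iron, software_iron)

# Cohen kappa for categorical agreement (1 = meets recommended intake).
snaq_adequate = [0, 1, 0, 1, 1, 0]
software_adequate = [0, 1, 0, 1, 0, 0]
kappa = cohen_kappa_score(snaq_adequate, software_adequate)
```

Spearman compares the rank orderings of the two methods, while kappa measures categorical agreement corrected for chance, which is why the study reports both.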

  6. Freddie Mercury-acoustic analysis of speaking fundamental frequency, vibrato, and subharmonics.

    PubMed

    Herbst, Christian T; Hertegard, Stellan; Zangger-Borch, Daniel; Lindestad, Per-Åke

    2017-04-01

    Freddie Mercury was one of the twentieth century's best-known singers of commercial contemporary music. This study presents an acoustical analysis of his voice production and singing style, based on perceptual and quantitative analysis of publicly available sound recordings. Analysis of six interviews revealed a median speaking fundamental frequency of 117.3 Hz, which is typically found for a baritone voice. Analysis of voice tracks isolated from full band recordings suggested that the singing voice range was 37 semitones within the pitch range of F#2 (about 92.2 Hz) to G5 (about 784 Hz). Evidence for higher phonations up to a fundamental frequency of 1,347 Hz was not deemed reliable. Analysis of 240 sustained notes from 21 a cappella recordings revealed a surprisingly high mean fundamental frequency modulation rate (vibrato) of 7.0 Hz, reaching the range of vocal tremor. Quantitative analysis utilizing a newly introduced parameter to assess the regularity of vocal vibrato corroborated its perceptually irregular nature, suggesting that vibrato (ir)regularity is a distinctive feature of the singing voice. Imitation of subharmonic phonation samples by a professional rock singer, documented by endoscopic high-speed video at 4,132 frames per second, revealed a 3:1 frequency-locked vibratory pattern of vocal folds and ventricular folds.
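One simple way to estimate a vibrato rate like the 7.0 Hz reported above is to locate the dominant modulation frequency in the detrended fundamental-frequency track of a sustained note. This is a generic sketch on a synthetic F0 track, not the authors' method:

```python
import numpy as np

fs_track = 200.0                     # F0 samples per second (assumed)
t = np.arange(0, 2.0, 1 / fs_track)  # 2 s sustained note

# Synthetic F0 track: 440 Hz note with a 7 Hz vibrato of 8 Hz extent.
f0 = 440.0 + 8.0 * np.sin(2 * np.pi * 7.0 * t)

# Remove the mean (the note's pitch) and find the dominant modulation
# frequency in the remaining oscillation.
spectrum = np.abs(np.fft.rfft(f0 - f0.mean()))
freqs = np.fft.rfftfreq(len(f0), 1 / fs_track)
vibrato_rate = freqs[spectrum.argmax()]
```

On real recordings the F0 track would first have to be extracted with a pitch tracker, and the modulation is irregular, which is what the study's regularity parameter quantifies.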

  7. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.

  8. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
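A PCA ensemble classifier in the spirit described above can be sketched with scikit-learn. This is not the authors' exact configuration: the EEG epochs are synthetic stand-ins, and the choice of base classifier and ensemble size is illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for P300 data: 200 epochs x 64 features
# (e.g. channels x time samples flattened), binary target/non-target labels.
rng = np.random.default_rng(0)
n_epochs, n_features = 200, 64
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, 2, n_epochs)
X[y == 1, :8] += 1.0  # crude "P300" amplitude offset on a few features

# PCA for dimensionality reduction, then an ensemble of linear classifiers
# trained on bootstrap resamples with majority voting.
model = make_pipeline(
    PCA(n_components=10),
    BaggingClassifier(LogisticRegression(max_iter=200),
                      n_estimators=15, random_state=0),
)
model.fit(X, y)
accuracy = model.score(X, y)  # training accuracy on the synthetic data
```

In practice the accuracy figures in the abstract come from cross-validation and offline testing on the Emotiv recordings, not training-set scores.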

  9. Protocol to describe the analysis of text-based communication in medical records for patients discharged from intensive care to hospital ward.

    PubMed

    Parsons Leigh, Jeanna; Brown, Kyla; Buchner, Denise; Stelfox, Henry T

    2016-07-08

    Effective communication during hospital transitions of patient care is fundamental to ensuring patient safety and continuity of quality care. This study will describe text-based communication included in patient medical records before, during and after patient transfer from the intensive care unit (ICU) to a hospital ward (n=10 days) by documenting (1) the structure and focus of physician progress notes within and between medical specialties, (2) the organisation of subjective and objective information, including the location and accessibility of patient data and whether/how this changes during the hospital stay and (3) missing, illegible and erroneous information. This study is part of a larger mixed methods prospective observational study of ICU to hospital ward transfer practices in 10 ICUs across Canada. Medical records will be collected and photocopied for each consenting patient for a period of up to 10 consecutive days, including the final 2 days in the ICU, the day of transfer and the first 7 days on the ward (n=10 days). Textual analysis of medical record data will be completed by 2 independent reviewers to describe communication between stakeholders involved in ICU transfer. Research ethics board approval has been obtained at all study sites, including the coordinating study centre (which covers 4 Calgary-based sites; UofC REB 13-0021) and 6 additional study sites (UofA Pro00050646; UBC PHC Hi4-01667; Sunnybrook 336-2014; QCH 20140345-01H; Sherbrooke 14-172; Laval 2015-2171). Findings from this study will inform the development of an evidence-based tool that will be used to systematically analyse the series of notes in a patient's medical record. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  10. Multifractal analysis of real and imaginary movements: EEG study

    NASA Astrophysics Data System (ADS)

    Pavlov, Alexey N.; Maksimenko, Vladimir A.; Runnova, Anastasiya E.; Khramova, Marina V.; Pisarchik, Alexander N.

    2018-04-01

    We study the ability of wavelet-based multifractal analysis to recognize the specific dynamics of electrical brain activity associated with real and imaginary movements. Based on the singularity spectra, we analyze electroencephalograms (EEGs) acquired from untrained humans (operators) during imagined hand movements, and show that the related EEG patterns can be distinguished from recordings made during real movements or from the background electrical brain activity. We discuss how this recognition depends on the selected brain region.

  11. Retrieval of the thickness and refractive index dispersion of parallel plate from a single interferogram recorded in both spectral and angular domains

    NASA Astrophysics Data System (ADS)

    Dong, Jingtao; Lu, Rongsheng

    2018-04-01

    The principle of retrieving the thickness and refractive index dispersion of a parallel glass plate based on single-interferogram recording and phase analysis is reported. With the parallel plate illuminated by a convergent light sheet, the transmitted light interfering in both the spectral and angular domains is recorded. The phase recovered from the single interferogram by Fourier analysis is used to retrieve the thickness and refractive index dispersion without periodic ambiguity. Experimental results for an optical substrate standard show that the accuracy of the refractive index dispersion is less than 2.5 × 10⁻⁵ and the relative uncertainty of the thickness is 6 × 10⁻⁵ (3σ). The method is confirmed to be robust against intensity noise, indicating its capability for stable and accurate measurement.
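    The Fourier phase-recovery step can be illustrated on a toy one-dimensional fringe pattern (synthetic data only; the paper's combined spectral/angular recording geometry is not modeled): the positive-frequency sideband is isolated in the Fourier domain, and the unwrapped argument of the resulting analytic signal yields the phase.

```python
import numpy as np

# Simulated 1-D interferogram: cosine fringes on a carrier frequency with
# a slowly varying quadratic test phase (a stand-in for the recorded data).
N = 1024
x = np.arange(N)
carrier = 0.08                                       # cycles per sample
phi = 0.002 * (x - N / 2) ** 2 * (2 * np.pi / N)     # known test phase
I = 1.0 + 0.8 * np.cos(2 * np.pi * carrier * x + phi)

# Fourier-transform method: window the record, keep only the
# positive-frequency sideband, and take the argument of the analytic signal.
F = np.fft.fft(I * np.hanning(N))
freqs = np.fft.fftfreq(N)
F[(freqs < carrier / 2) | (freqs > 2 * carrier)] = 0.0
analytic = np.fft.ifft(F)

recovered = np.unwrap(np.angle(analytic)) - 2 * np.pi * carrier * x
recovered -= recovered[N // 2] - phi[N // 2]         # remove constant offset

err = np.max(np.abs(recovered[100:-100] - phi[100:-100]))
print(f"max interior phase error: {err:.4f} rad")
```

Away from the windowed edges, the recovered phase tracks the known test phase closely, which is why such methods avoid the periodic ambiguity of fringe counting.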

  12. [EEG-correlates of pilots' functional condition in simulated flight dynamics].

    PubMed

    Kiroy, V N; Aslanyan, E V; Bakhtin, O M; Minyaeva, N R; Lazurenko, D M

    2015-01-01

    The spectral characteristics of the EEG recorded from two professional pilots in a TU-154 aircraft simulator were analyzed across flight phases, including takeoff, landing and horizontal flight (in particular under difficult conditions). The EEG was recorded continuously from 15 electrodes with a frequency band of 0.1-70 Hz. The recordings were evaluated using analysis of variance and discriminant analysis. The statistical significance of the identified differences, and of the main factors and their interactions, was evaluated using the Greenhouse-Geisser correction. It was shown that the spectral characteristics of the EEG are highly informative features of the pilots' state, reflecting the different flight phases. The high validity of the differences, including individual characteristics, indicates their non-random nature and the possibility of constructing a system for monitoring pilots' state during all phases of flight based on EEG features.

  13. Tipping point analysis of atmospheric oxygen concentration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.

    2015-03-15

    We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics and perform projections under possible future scenarios that could lead to oxygen deficiency in the atmosphere. The analysis is based on a statistical physics framework with stochastic modelling, in which we represent the observed data as a composition of deterministic and stochastic components estimated from the observations using Bayesian and wavelet techniques.
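    The deterministic/stochastic decomposition and a standard tipping-point indicator (rising lag-1 autocorrelation, i.e. critical slowing down) can be sketched as follows. The AR(1) test series, moving-average detrending, and window sizes are illustrative stand-ins for the Bayesian and wavelet techniques named in the abstract:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic record: a slow deterministic decline plus AR(1) noise whose
# autocorrelation rises over time (a classic early-warning signature).
n = 2000
t = np.arange(n)
trend = -0.001 * t
ar = np.linspace(0.2, 0.9, n)            # increasing "memory"
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = ar[i] * noise[i - 1] + rng.normal(scale=0.1)
series = trend + noise

# Decompose: estimate the deterministic component with a moving average,
# then track lag-1 autocorrelation of the residual in sliding windows.
win = 200
det = np.convolve(series, np.ones(win) / win, mode="same")
resid = series - det

def lag1(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

early = lag1(resid[200:600])
late = lag1(resid[-600:-200])
print(f"lag-1 autocorrelation early={early:.2f} late={late:.2f}")
```

A sustained rise of the residual's lag-1 autocorrelation toward 1 is the signal such analyses read as approach to a tipping point.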

  14. Quantum-dot based nanothermometry in optical plasmonic recording media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maestro, Laura Martinez (Centre for Micro-Photonics, Faculty of Science, Engineering and Technology, Swinburne University of Technology, Hawthorn, Victoria 3122); Zhang, Qiming

    2014-11-03

    We report on the direct experimental determination of the temperature increment caused by laser irradiation in an optical recording medium consisting of a polymeric film in which gold nanorods have been incorporated. The incorporation of CdSe quantum dots in the recording medium allowed single-beam thermal reading of the on-focus temperature from a simple analysis of the two-photon excited fluorescence of the quantum dots. Experimental results have been compared with numerical simulations, revealing excellent agreement and opening a promising avenue for further understanding and optimization of optical writing processes and media.

  15. Warmest Global Temperature on Record on This Week @NASA – January 20, 2017

    NASA Image and Video Library

    2017-01-20

    NASA and the National Oceanic and Atmospheric Administration (NOAA) announced on Jan. 18, that global surface temperatures in 2016 were the warmest since modern record keeping began in 1880. The finding was based on results of independent analyses by both agencies. According to analysis by scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York, 2016 is the third year in a row to set a new record for global average surface temperatures, further demonstrating a long-term warming trend. Also, Cygnus Cargo Module Arrives at KSC, Up in 30 Seconds, and Remembering Gene Cernan.

  16. [Two cases of personal identification from dental information].

    PubMed

    Yamaguchi, T; Yamada, Y; Ohtani, S; Kogure, T; Nagao, M; Takatori, T; Ohira, H; Yamamoto, I; Watanabe, A

    1997-08-01

    We describe two cases in which unknown bodies were positively identified from dental information and biochemical examination of tooth material. In one case, a charred body was positively identified with little effort by comparing antemortem dental records (dental chart and dental X-ray film) with postmortem data. In the other case, although the unknown individual had received dental treatment, the police were unable to obtain the victim's antemortem dental records. We then conducted biochemical analysis of the teeth, enabling personal identification using DNA analysis and age estimation based on aspartic acid racemization. The mutations found in the mtDNA sequence and the genotypes of HLA-DQ alpha, HPRTB and the ABO blood group, together with the estimated age, supported kinship between the unknown individual and his mother. The maternally inherited mtDNA data were of great importance in this case, since it was possible to obtain DNA from the mother. Dental identification is one of the most accurate methods of personal identification if suitable antemortem records are available. In the absence of such records, biochemical analysis of teeth also makes it possible to increase the probability of correct personal identification.

  17. Design and reliability analysis of high-speed and continuous data recording system based on disk array

    NASA Astrophysics Data System (ADS)

    Jiang, Changlong; Ma, Cheng; He, Ning; Zhang, Xugang; Wang, Chongyang; Jia, Huibo

    2002-12-01

    In many real-time applications, sustained high-speed data recording is required. This paper proposes a high-speed, sustained data recording system based on complex-RAID 3+0. The system consists of an Array Controller Module (ACM), String Controller Modules (SCMs) and a Main Controller Module (MCM). The ACM, implemented in an FPGA chip, splits the high-speed incoming data stream into several lower-speed streams and synchronously generates one parity-code stream; it can also inversely recover the original data stream while reading. The SCMs record the lower-speed streams from the ACM onto SCSI disk drives. In the SCM, dual-page buffering is adopted to implement the speed-matching function and satisfy the need for sustained recording. The MCM monitors the whole system and controls the ACM and SCMs to realize the data striping, reconstruction and recovery functions. A method for determining the system scale is presented. Finally, two new schemes, Floating Parity Group (FPG) and full 2D-Parity Group (full 2D-PG), are proposed to improve system reliability and are compared with the Traditional Parity Group (TPG). With its high reliability, this recording system can be used conveniently in many areas of data recording, storage, playback and remote backup.
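    The striping-plus-parity idea behind the ACM can be sketched at byte level. This is a simplified single-parity model of RAID-3-style striping, not the paper's complex-RAID 3+0 implementation; stream count and data are arbitrary:

```python
# Split one incoming stream into k lower-rate streams plus an XOR parity
# stream, then rebuild the original stream after losing any single stream.

def split_with_parity(data: bytes, k: int):
    pad = (-len(data)) % k
    data += b"\x00" * pad
    streams = [bytearray(data[i::k]) for i in range(k)]
    parity = bytearray(len(streams[0]))
    for s in streams:
        for j, b in enumerate(s):
            parity[j] ^= b
    return streams, parity, pad

def rebuild(streams, parity, lost: int):
    # XOR of parity with all surviving streams recovers the lost one.
    fixed = bytearray(len(parity))
    for j in range(len(parity)):
        v = parity[j]
        for i, s in enumerate(streams):
            if i != lost:
                v ^= s[j]
        fixed[j] = v
    return fixed

def merge(streams, pad: int) -> bytes:
    k = len(streams)
    out = bytearray(sum(len(s) for s in streams))
    for i, s in enumerate(streams):
        out[i::k] = s
    return bytes(out[:len(out) - pad] if pad else out)

data = bytes(range(100)) * 3
streams, parity, pad = split_with_parity(data, 4)
streams[2] = bytearray(len(parity))              # simulate a failed disk
streams[2] = rebuild(streams, parity, lost=2)    # recover it from parity
print(merge(streams, pad) == data)
```

The same XOR relation is what lets a RAID-3 array keep serving reads while one member disk is down.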

  18. Ion diffusion may introduce spurious current sources in current-source density (CSD) analysis.

    PubMed

    Halnes, Geir; Mäki-Marttunen, Tuomo; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T

    2017-07-01

    Current-source density (CSD) analysis is a well-established method for analyzing recorded local field potentials (LFPs), that is, the low-frequency part of extracellular potentials. Standard CSD theory is based on the assumption that all extracellular currents are purely ohmic, and thus neglects the possible impact from ionic diffusion on recorded potentials. However, it has previously been shown that in physiological conditions with large ion-concentration gradients, diffusive currents can evoke slow shifts in extracellular potentials. Using computer simulations, we here show that diffusion-evoked potential shifts can introduce errors in standard CSD analysis, and can lead to prediction of spurious current sources. Further, we here show that the diffusion-evoked prediction errors can be removed by using an improved CSD estimator which accounts for concentration-dependent effects. NEW & NOTEWORTHY Standard CSD analysis does not account for ionic diffusion. Using biophysically realistic computer simulations, we show that unaccounted-for diffusive currents can lead to the prediction of spurious current sources. This finding may be of strong interest for in vivo electrophysiologists doing extracellular recordings in general, and CSD analysis in particular. Copyright © 2017 the American Physiological Society.
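    For reference, the standard ohmic CSD estimator that the paper argues can be misled by diffusion is just a discrete second spatial derivative of the potential along the probe, CSD(z) ≈ -σ (φ(z+h) - 2φ(z) + φ(z-h)) / h². A minimal sketch with a toy potential profile (the conductivity and spacing values are merely typical, not from the paper):

```python
import numpy as np

# Standard one-dimensional CSD estimate under the purely ohmic assumption.
sigma = 0.3                      # extracellular conductivity (S/m), typical
h = 100e-6                       # inter-electrode spacing (m)

z = np.arange(16) * h            # laminar electrode depths
phi = 2.0 * z**2                 # toy LFP profile with constant curvature

csd = -sigma * (phi[2:] - 2.0 * phi[1:-1] + phi[:-2]) / h**2
print(csd)                       # constant, since the curvature is constant
```

Any diffusion-evoked shift in φ that is not generated by transmembrane currents enters this formula exactly like a real source, which is how the spurious sources described above arise.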

  19. A Regional Stable Carbon Isotope Dendro-Climatology from the South African Summer Rainfall Area.

    PubMed

    Woodborne, Stephan; Gandiwa, Patience; Hall, Grant; Patrut, Adrian; Finch, Jemma

    2016-01-01

    Carbon isotope analysis of four baobab (Adansonia digitata L.) trees from the Pafuri region of South Africa yielded a 1000-year proxy rainfall record. The Pafuri record age model was based on 17 radiocarbon dates, cross correlation of the climate record, and ring structures that were presumed to be annual for two of the trees. Here we present the analysis of five additional baobabs from the Mapungubwe region, approximately 200 km west of Pafuri. The Mapungubwe chronology demonstrates that ring structures are not necessarily annually formed, and accordingly the Pafuri chronology is revised. Changes in intrinsic water-use efficiency indicate an active response by the trees to elevated atmospheric CO2, but this has little effect on the environmental signal. The revised Pafuri record and the new Mapungubwe record correlate significantly with local rainfall. Both records confirm that the Medieval Warm Period was substantially wetter than present, and that the Little Ice Age was the driest period in the last 1000 years. Although Mapungubwe is generally drier than Pafuri, both regions experienced elevated rainfall peaking between AD 1570 and AD 1620, after which dry conditions persisted in the Mapungubwe area until about AD 1840. Differences between the two records correlate with Agulhas Current sea-surface temperature variations, suggesting east/west displacement of the temperate tropical trough system as an underlying mechanism. The Pafuri and Mapungubwe records are combined to provide a regional climate proxy record for the northern summer rainfall area of southern Africa.

  1. DOAS-based total column ozone retrieval from Phaethon system

    NASA Astrophysics Data System (ADS)

    Gkertsi, F.; Bais, A. F.; Kouremeti, N.; Drosoglou, Th; Fountoulakis, I.; Fragkos, K.

    2018-05-01

    This study introduces the measurement of the total ozone column using Differential Optical Absorption Spectroscopy (DOAS) analysis of direct-sun spectra recorded by the Phaethon system. The methodology is based on the analysis of spectra relative to a reference spectrum recorded by the same instrument. The slant column density of ozone associated with the reference spectrum is derived by Langley extrapolation. Total ozone data derived by Phaethon over two years in Thessaloniki are compared with those of a collocated, well-maintained and calibrated Brewer spectrophotometer. When the retrieval of total ozone is based on the absorption cross sections of Paur and Bass (1984) at 228 K, Phaethon shows an average overestimation of 1.85 ± 1.86%. Taking into account the effect of the day-to-day variability of stratospheric temperature on the total ozone derived by both systems, the bias is reduced to 0.94 ± 1.26%. The sensitivity of the total ozone retrieval to temperature changes is larger for Phaethon than for the Brewer.
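    The Langley extrapolation step can be sketched with synthetic numbers: DOAS yields differential slant columns relative to the reference spectrum, dSCD = VCD·m - SCD_ref, so under the assumption of a constant vertical column during the measurement period, a regression against airmass m recovers both the column and the reference spectrum's slant column (all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic DOAS output: differential slant columns relative to an
# unknown reference spectrum, dSCD = VCD * airmass - SCD_ref + noise.
true_vcd = 300.0                 # vertical column, Dobson units (toy value)
true_scd_ref = 450.0             # ozone slant column of the reference (DU)
airmass = np.linspace(1.2, 4.0, 60)
dscd = true_vcd * airmass - true_scd_ref + rng.normal(scale=2.0, size=60)

# Langley-style extrapolation: slope gives the vertical column, and the
# intercept at airmass -> 0 gives minus the reference slant column.
vcd_est, neg_ref = np.polyfit(airmass, dscd, 1)
print(f"VCD ~ {vcd_est:.1f} DU, reference SCD ~ {-neg_ref:.1f} DU")
```

Once SCD_ref is known, every subsequent spectrum can be converted to an absolute total ozone column.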

  2. A High Resolution Late Holocene Paleo-atmospheric Co2 Reconstruction From Stomatal Frequency Analysis of Conifer Needles

    NASA Astrophysics Data System (ADS)

    Kouwenberg, L. L. R.; Kurschner, W. M.; Wagner, F.; Visscher, H.

    An inverse relation between stomatal frequency in the leaves of many plant taxa and atmospheric CO2 concentration has been repeatedly demonstrated. Response curves based on this species-specific relation are increasingly used to reconstruct paleo-CO2 levels from stomatal frequency analysis of fossil leaves. Atmospheric CO2 records of this type have been produced for a large part of geological history, from the Paleozoic to the Holocene. Quaternary glaciochemical records from Antarctica and Greenland suggest that CO2 concentration and temperature are strongly linked; in general, CO2 appears to lag temperature change. However, in order to assess this relation, high-resolution records with a precise chronology are needed. During the Holocene, several century-scale climatic fluctuations took place, such as the 8.2 kyr event and the Little Ice Age. Linking these temperature fluctuations to paleo-CO2 concentrations in glaciochemical records can be difficult, because the resolution of ice cores is generally low and the ice-gas age difference complicates accurate dating. An excellent alternative tool for high-resolution Holocene CO2 reconstructions is provided by stomatal frequency analysis of leaves from Holocene peat and lake sediments. In this study, it is demonstrated that the western hemlock (Tsuga heterophylla) also adjusts its stomatal frequency to the historical CO2 rise. After careful proxy validation, a high-resolution paleo-atmospheric CO2 record over the last 2000 years was reconstructed from subfossil Tsuga heterophylla needles from Mount Rainier (Washington, USA). Chronology is provided by a suite of AMS carbon isotope dates and the presence of tephra layers from nearby Mt. St. Helens. The record reproduces CO2 levels around 280 ppmv for the Little Ice Age and the CO2 rise to 365 ppmv over the last 150 years. A prominent feature is a marked rise in CO2 at AD 350, gradually declining over the following centuries.
The CO2 record will be discussed in terms of its relation to local volcanic CO2 production, paleoclimate data and changes in the terrestrial and marine carbon sources and sinks.

  3. A conceptual framework for managing clinical processes.

    PubMed

    Buffone, G J; Moreau, D

    1997-01-01

    Reengineering of the health care delivery system is underway, as is the transformation of the processes and methods used for recording information describing patient care (i.e., the development of a computer-based record). This report describes the use of object-oriented analysis and design to develop and implement clinical process reengineering as well as the organization of clinical data. In addition, the facility of the proposed framework for implementing workflow computing is discussed.

  4. Use of Anecdotal Occurrence Data in Species Distribution Models: An Example Based on the White-Nosed Coati (Nasua narica) in the American Southwest

    PubMed Central

    Frey, Jennifer K.; Lewis, Jeremy C.; Guy, Rachel K.; Stuart, James N.

    2013-01-01

    Simple Summary We evaluated the influence of occurrence records with different reliability on predicted distribution of a unique, rare mammal in the American Southwest, the white-nosed coati (Nasua narica). We concluded that occurrence datasets that include anecdotal records can be used to infer species distributions, providing such data are used only for easily-identifiable species and based on robust modeling methods such as maximum entropy. Use of a reliability rating system is critical for using anecdotal data. Abstract Species distributions are usually inferred from occurrence records. However, these records are prone to errors in spatial precision and reliability. Although influence of spatial errors has been fairly well studied, there is little information on impacts of poor reliability. Reliability of an occurrence record can be influenced by characteristics of the species, conditions during the observation, and observer’s knowledge. Some studies have advocated use of anecdotal data, while others have advocated more stringent evidentiary standards such as only accepting records verified by physical evidence, at least for rare or elusive species. Our goal was to evaluate the influence of occurrence records with different reliability on species distribution models (SDMs) of a unique mammal, the white-nosed coati (Nasua narica) in the American Southwest. We compared SDMs developed using maximum entropy analysis of combined bioclimatic and biophysical variables and based on seven subsets of occurrence records that varied in reliability and spatial precision. We found that the predicted distribution of the coati based on datasets that included anecdotal occurrence records were similar to those based on datasets that only included physical evidence. 
Coati distribution in the American Southwest was predicted to occur in southwestern New Mexico and southeastern Arizona and was defined primarily by evenness of climate and Madrean woodland and chaparral land-cover types. Coati distribution patterns in this region suggest a good model for understanding the biogeographic structure of range margins. We concluded that occurrence datasets that include anecdotal records can be used to infer species distributions, providing such data are used only for easily-identifiable species and based on robust modeling methods such as maximum entropy. Use of a reliability rating system is critical for using anecdotal data. PMID:26487405

  5. Characterization and Pathogenicity of Alternaria vanuatuensis, a New Record from Allium Plants in Korea and China.

    PubMed

    Li, Mei Jia; Deng, Jian Xin; Paul, Narayan Chandra; Lee, Hyang Burm; Yu, Seung Hun

    2014-12-01

    Alternaria from different Allium plants was characterized by multilocus sequence analysis. Based on sequences of the β-tubulin (BT2b), the Alternaria allergen a1 (Alt a1), and the RNA polymerase II second largest subunit (RPB2) genes and phylogenetic data analysis, isolates were divided into two groups. The two groups were identical to representative isolates of A. porri (EGS48-147) and A. vanuatuensis (EGS45-018). The conidial characteristics and pathogenicity of A. vanuatuensis also well supported the molecular characteristics. This is the first record of A. vanuatuensis E. G. Simmons & C. F. Hill from Korea and China.

  6. Dynamic Attack Tree Tool for Risk Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, Karl

    2012-03-13

    DATT enables interactive visualization, qualitative analysis and recording of cyber and other forms of risk. It facilitates dynamic risk-based approaches (as opposed to static compliance-based) to security and risk management in general. DATT allows decision makers to consistently prioritize risk mitigation strategies and quickly see where attention is most needed across the enterprise.

  7. AFLP-based genetic diversity assessment of commercially important tea germplasm in India.

    PubMed

    Sharma, R K; Negi, M S; Sharma, S; Bhardwaj, P; Kumar, R; Bhattachrya, E; Tripathi, S B; Vijayan, D; Baruah, A R; Das, S C; Bera, B; Rajkumar, R; Thomas, J; Sud, R K; Muraleedharan, N; Hazarika, M; Lakshmikumaran, M; Raina, S N; Ahuja, P S

    2010-08-01

    India has a large repository of important tea accessions and, therefore, plays a major role in improving production and quality of tea across the world. Using seven AFLP primer combinations, we analyzed 123 commercially important tea accessions representing major populations in India. The overall genetic similarity recorded was 51%. No significant differences were recorded in average genetic similarity among tea populations cultivated in various geographic regions (northwest 0.60, northeast and south both 0.59). UPGMA cluster analysis grouped the tea accessions according to geographic locations, with a bias toward China or Assam/Cambod types. Cluster analysis results were congruent with principal component analysis. Further, analysis of molecular variance detected a high level of genetic variation (85%) within and limited genetic variation (15%) among the populations, suggesting their origin from a similar genetic pool.
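    Pairwise genetic similarity from binary AFLP band profiles is commonly computed with the Dice coefficient, 2a/(2a+b+c), before UPGMA clustering; a minimal sketch with made-up presence/absence matrices (the 51% similarity figure above comes from the real accessions, not from this toy):

```python
import numpy as np

# AFLP profiles as presence/absence band vectors (hypothetical accessions).
profiles = np.array([
    [1, 1, 0, 1, 0, 1, 1, 0],   # accession A
    [1, 1, 0, 1, 1, 1, 0, 0],   # accession B
    [0, 1, 1, 0, 1, 0, 1, 1],   # accession C
], dtype=bool)

def dice(x, y):
    # 2 * shared bands / (bands in x + bands in y)
    return 2.0 * np.sum(x & y) / (x.sum() + y.sum())

n = len(profiles)
S = np.ones((n, n))
for i in range(n):
    for j in range(i + 1, n):
        S[i, j] = S[j, i] = dice(profiles[i], profiles[j])
print(np.round(S, 2))
```

The resulting similarity matrix is the input that UPGMA and principal component analysis would operate on.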

  8. A digital advocate? Reactions of rural people who experience homelessness to the idea of recording clinical encounters.

    PubMed

    Grande, Stuart W; Castaldo, Mary Ganger; Carpenter-Song, Elizabeth; Griesemer, Ida; Elwyn, Glyn

    2017-08-01

    Are the benefits of recording clinical encounters shared across different groups, or do they vary based on social position? Studies show that educated patients record their clinical visits to enhance their experience, but very little is known about recording benefits among "hard-to-reach" populations. To examine the reactions of homeless people to the idea of using a smartphone to record their own clinical encounter, either covertly or with permission from their physician. We conducted semi-structured interviews with individuals at a temporary housing shelter in Northern New England. A thematic analysis identified themes that were iteratively refined into representative groups. Eighteen (18) interviews were conducted, 12 with women and six with men. Initial reactions to clinical recordings were positive (11 of 18). A majority (17 of 18) were willing to use recordings in future visits. A thematic analysis characterized data in two ways: (i) by providing reliable evidence for review, they functioned as an advocacy measure for patients; (ii) by promoting transparency and levelling social distance, this technology modified clinical relationships. Recordings permitted the sharing of data with others, providing tangible proof of behaviour and refuting misconceptions. Asking permission to record appeared to modify relationships and level perceived social distance with clinicians. We found that while many rural, disadvantaged individuals felt marginalized by the wide social distance between themselves and their clinicians, recording technology may serve as an advocate by holding both patients and doctors accountable and by permitting the burden of clinical proof to be shared. © 2016 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  9. A Sparsity-Based Approach to 3D Binaural Sound Synthesis Using Time-Frequency Array Processing

    NASA Astrophysics Data System (ADS)

    Cobos, Maximo; Lopez, Jose J.; Spors, Sascha

    2010-12-01

    Localization of sounds in physical space plays a very important role in multiple audio-related disciplines, such as music, telecommunications, and audiovisual productions. Binaural recording is the most commonly used method to provide an immersive sound experience by means of headphone reproduction. However, it requires a very specific recording setup using high-fidelity microphones mounted in a dummy head. In this paper, we present a novel processing framework for binaural sound recording and reproduction that avoids the use of dummy heads, which is especially suitable for immersive teleconferencing applications. The method is based on a time-frequency analysis of the spatial properties of the sound picked up by a simple tetrahedral microphone array, assuming source sparseness. The experiments carried out using simulations and a real-time prototype confirm the validity of the proposed approach.

  10. Design, fabrication and skin-electrode contact analysis of polymer microneedle-based ECG electrodes

    NASA Astrophysics Data System (ADS)

    O'Mahony, Conor; Grygoryev, Konstantin; Ciarlone, Antonio; Giannoni, Giuseppe; Kenthao, Anan; Galvin, Paul

    2016-08-01

    Microneedle-based ‘dry’ electrodes have immense potential for use in diagnostic procedures such as electrocardiography (ECG) analysis, as they eliminate several of the drawbacks associated with the conventional ‘wet’ electrodes currently used for physiological signal recording. To be commercially successful in such a competitive market, it is essential that dry electrodes are manufacturable in high volumes and at low cost. In addition, the topographical nature of these emerging devices means that electrode performance is likely to be highly dependent on the quality of the skin-electrode contact. This paper presents a low-cost, wafer-level micromoulding technology for the fabrication of polymeric ECG electrodes that use microneedle structures to make a direct electrical contact to the body. The double-sided moulding process can be used to eliminate post-process via creation and wafer dicing steps. In addition, measurement techniques have been developed to characterize the skin-electrode contact force. We perform the first analysis of signal-to-noise ratio dependency on contact force, and show that although microneedle-based electrodes can outperform conventional gel electrodes, the quality of ECG recordings is significantly dependent on temporal and mechanical aspects of the skin-electrode interface.

  11. An Analysis of 1-Year Impacts of Youth Transition Demonstration Projects

    ERIC Educational Resources Information Center

    Fraker, Thomas M.; Luecking, Richard G.; Mamun, Arif A.; Martinez, John M.; Reed, Deborah S.; Wittenburg, David C.

    2016-01-01

    This article examines the impacts of the Youth Transition Demonstration, an initiative of the Social Security Administration (SSA) to improve employment outcomes for youth with disabilities. Based on a random assignment design, the analysis uses data from a 1-year follow-up survey and SSA administrative records for 5,203 youth in six research…

  12. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  13. Spectral analysis, vibrational assignments, NBO analysis, NMR, UV-Vis, hyperpolarizability analysis of 2-aminofluorene by density functional theory.

    PubMed

    Jone Pradeepa, S; Sundaraganesan, N

    2014-05-05

    In the present investigation, a combined experimental and theoretical study of the molecular structure, vibrational analysis and NBO analysis of 2-aminofluorene is reported. The FT-IR spectrum was recorded in the range 4000-400 cm(-1), and the FT-Raman spectrum in the range 4000-50 cm(-1). The molecular geometry, vibrational spectra and natural bond orbital (NBO) analysis of 2-aminofluorene were calculated using Density Functional Theory (DFT) with the B3LYP/6-31G(d,p) model chemistry. (13)C and (1)H NMR chemical shifts of 2-aminofluorene were calculated using the GIAO method. The computed vibrational and NMR spectra were compared with the experimental results. The total energy distribution (TED) was derived to clarify the contribution of each vibrational mode to its respective wavenumber. The experimental UV-Vis spectrum was recorded in the region 400-200 nm and correlated with spectra simulated using a suitably solvated B3LYP/6-31G(d,p) model. The HOMO-LUMO energies were computed with the time-dependent DFT approach. The nonlinearity of the title compound was confirmed by hyperpolarizability analysis. The Molecular Electrostatic Potential (MEP) was also investigated theoretically. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. The analysis of morphometric data on Rocky Mountain wolves and arctic wolves using statistical methods

    NASA Astrophysics Data System (ADS)

    Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Hamzah, Nor Shamsidah Amir; Nor, Maria Elena; Ahmad, Noor’ani; Azia Hazida Mohamad Azmi, Nur; Latip, Muhammad Faez Ab; Hilmi Azman, Ahmad

    2018-04-01

    Morphometrics is the quantitative analysis of the shape and size of specimens. Morphometric analyses are commonly used to study the fossil record, the shape and size of specimens, and related questions. The aim of this study was to find the differences between Rocky Mountain wolves and arctic wolves based on gender. The sample was secondary data comprising seven independent variables and two dependent variables. Statistical modelling was used in the analysis, namely analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). The results showed significant differences between arctic wolves and Rocky Mountain wolves based on the independent factors and gender.
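
    The ANOVA step used in the study can be sketched from first principles. A minimal sketch: the skull-length measurements below are hypothetical stand-ins, not the study's morphometric data.

```python
def one_way_anova_f(groups):
    """Compute the one-way ANOVA F statistic for a list of samples."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (weighted by group size)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical skull-length measurements (cm) for the two subspecies
rocky_mountain = [24.1, 25.3, 23.8, 24.9, 25.0]
arctic = [22.0, 21.5, 22.8, 21.9, 22.3]
f_stat = one_way_anova_f([rocky_mountain, arctic])
```

    A large F relative to the F distribution's critical value indicates a group difference; MANOVA extends the same idea to several dependent variables at once.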

  15. Ground-based assessment of the bias and long-term stability of fourteen limb and occultation ozone profile data records.

    PubMed

    Hubert, D; Lambert, J-C; Verhoelst, T; Granville, J; Keppens, A; Baray, J-L; Cortesi, U; Degenstein, D A; Froidevaux, L; Godin-Beekmann, S; Hoppel, K W; Kyrölä, E; Leblanc, T; Lichtenberg, G; McElroy, C T; Murtagh, D; Nakane, H; Querel, R; Russell, J M; Salvador, J; Smit, H G J; Stebel, K; Steinbrecht, W; Strawbridge, K B; Stübi, R; Swart, D P J; Taha, G; Thompson, A M; Urban, J; van Gijsel, J A E; von der Gathen, P; Walker, K A; Wolfram, E; Zawodny, J M

    2016-01-01

    The ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of fourteen limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias, and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20-40 km the satellite ozone measurement biases are smaller than ±5%, the short-term variabilities are less than 5-12% and the drifts are at most ±5% decade⁻¹ (or even ±3% decade⁻¹ for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause where natural variability and low ozone abundances impede a more precise analysis. In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10% and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY), and significant long-term drifts (SCIAMACHY, OSIRIS, HALOE, and possibly GOMOS and SMR as well). 
Furthermore, we reflected on the repercussions of our findings for the construction, analysis and interpretation of merged data records. Most notably, the discrepancies between several recent ozone profile trend assessments can be mostly explained by instrumental drift. This clearly demonstrates the need for systematic comprehensive multi-instrument comparison analyses.
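
    The drift figures quoted above (in % per decade) are, in essence, the slope of a linear fit to the satellite-minus-ground relative differences over time. A minimal sketch, using a hypothetical comparison series with a built-in 0.3 %/yr drift (not the study's data):

```python
def drift_percent_per_decade(times_yr, rel_diff_pct):
    """Least-squares slope of relative differences (%) vs. time (decimal years),
    scaled from % per year to % per decade."""
    n = len(times_yr)
    mean_t = sum(times_yr) / n
    mean_d = sum(rel_diff_pct) / n
    cov = sum((t - mean_t) * (d - mean_d) for t, d in zip(times_yr, rel_diff_pct))
    var = sum((t - mean_t) ** 2 for t in times_yr)
    return 10.0 * cov / var   # % per decade

# Hypothetical comparison series: a constant 1% bias plus a 0.3 %/yr drift
times = [2000 + 0.5 * i for i in range(20)]
diffs = [1.0 + 0.3 * (t - 2000) for t in times]
drift = drift_percent_per_decade(times, diffs)
```

    The bias is the mean of the relative differences; the drift is this fitted slope, which for the series above recovers 3% per decade.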

  16. Ground-Based Assessment of the Bias and Long-Term Stability of Fourteen Limb and Occultation Ozone Profile Data Records

    NASA Technical Reports Server (NTRS)

    Hubert, D.; Lambert, J.-C.; Verhoelst, T.; Granville, J.; Keppens, A.; Baray, J.-L.; Cortesi, U.; Degenstein, D. A.; Froidevaux, L.; Godin-Beekmann, S.; et al.

    2016-01-01

    The ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of fourteen limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias, and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20-40 kilometers the satellite ozone measurement biases are smaller than plus or minus 5 percent, the short-term variabilities are less than 5-12 percent and the drifts are at most plus or minus 5 percent per decade (or even plus or minus 3 percent per decade for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause where natural variability and low ozone abundances impede a more precise analysis. 
In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10 percent and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY), and significant long-term drifts (SCIAMACHY, OSIRIS, HALOE, and possibly GOMOS and SMR as well). Furthermore, we reflected on the repercussions of our findings for the construction, analysis and interpretation of merged data records. Most notably, the discrepancies between several recent ozone profile trend assessments can be mostly explained by instrumental drift. This clearly demonstrates the need for systematic comprehensive multi-instrument comparison analyses.

  17. Ground-based assessment of the bias and long-term stability of fourteen limb and occultation ozone profile data records

    PubMed Central

    Hubert, D.; Lambert, J.-C.; Verhoelst, T.; Granville, J.; Keppens, A.; Baray, J.-L.; Cortesi, U.; Degenstein, D. A.; Froidevaux, L.; Godin-Beekmann, S.; Hoppel, K. W.; Kyrölä, E.; Leblanc, T.; Lichtenberg, G.; McElroy, C. T.; Murtagh, D.; Nakane, H.; Querel, R.; Russell, J. M.; Salvador, J.; Smit, H. G. J.; Stebel, K.; Steinbrecht, W.; Strawbridge, K. B.; Stübi, R.; Swart, D. P. J.; Taha, G.; Thompson, A. M.; Urban, J.; van Gijsel, J. A. E.; von der Gathen, P.; Walker, K. A.; Wolfram, E.; Zawodny, J. M.

    2018-01-01

    The ozone profile records of a large number of limb and occultation satellite instruments are widely used to address several key questions in ozone research. Further progress in some domains depends on a more detailed understanding of these data sets, especially of their long-term stability and their mutual consistency. To this end, we made a systematic assessment of fourteen limb and occultation sounders that, together, provide more than three decades of global ozone profile measurements. In particular, we considered the latest operational Level-2 records by SAGE II, SAGE III, HALOE, UARS MLS, Aura MLS, POAM II, POAM III, OSIRIS, SMR, GOMOS, MIPAS, SCIAMACHY, ACE-FTS and MAESTRO. Central to our work is a consistent and robust analysis of the comparisons against the ground-based ozonesonde and stratospheric ozone lidar networks. It allowed us to investigate, from the troposphere up to the stratopause, the following main aspects of satellite data quality: long-term stability, overall bias, and short-term variability, together with their dependence on geophysical parameters and profile representation. In addition, it permitted us to quantify the overall consistency between the ozone profilers. Generally, we found that between 20–40 km the satellite ozone measurement biases are smaller than ±5 %, the short-term variabilities are less than 5–12% and the drifts are at most ±5% decade−1 (or even ±3 % decade−1 for a few records). The agreement with ground-based data degrades somewhat towards the stratopause and especially towards the tropopause where natural variability and low ozone abundances impede a more precise analysis. In part of the stratosphere a few records deviate from the preceding general conclusions; we identified biases of 10% and more (POAM II and SCIAMACHY), markedly higher single-profile variability (SMR and SCIAMACHY), and significant long-term drifts (SCIAMACHY, OSIRIS, HALOE, and possibly GOMOS and SMR as well). 
Furthermore, we reflected on the repercussions of our findings for the construction, analysis and interpretation of merged data records. Most notably, the discrepancies between several recent ozone profile trend assessments can be mostly explained by instrumental drift. This clearly demonstrates the need for systematic comprehensive multi-instrument comparison analyses. PMID:29743958

  18. Analysis of Current Thyroid Function Testing Practices

    DTIC Science & Technology

    2017-10-18

    electronic medical record (EMR). TFTs of interest were: TSH, FT4, thyroid panel (TSH + FT4), FT3, total thyroxine (T4), and total triiodothyronine (T3). These were also categorized based on the presence or absence of hypothyroidism.

  19. Visual perception-based criminal identification: a query-based approach

    NASA Astrophysics Data System (ADS)

    Singh, Avinash Kumar; Nandi, G. C.

    2017-01-01

    The visual perception of an eyewitness plays a vital role in criminal identification. It helps law enforcement authorities search for a particular criminal in their records. Searching criminal records manually has been reported to take too much time to yield accurate results. We propose a query-based approach which minimises computational cost and reduces the search space. A symbolic database was created to perform a stringent analysis on 150 public faces (Bollywood celebrities and Indian cricketers) and 90 local faces (our data set). Expert knowledge was captured to encapsulate every criminal's anatomical and facial attributes in the form of a symbolic representation. A fast query-based searching strategy was implemented using a dynamic decision-tree data structure which allows four levels of decomposition to fetch the relevant criminal records. Two types of case studies, viewed and forensic sketches, were considered to evaluate the strength of the proposed approach. We derived 1200 views of the entire population by taking 80 participants as eyewitnesses. The system demonstrates an accuracy of 98.6% for test case I and 97.8% for test case II. The experimental results also reduce the search space to the 30 most relevant records.
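
    The paper's dynamic decision tree is not specified here, but the idea of a four-level symbolic decomposition over facial attributes can be sketched as a nested index. The attribute names and records below are hypothetical, not the authors' schema.

```python
# Hypothetical four-level symbolic index: face shape -> eyes -> nose -> hair
def build_index(records, keys=("face_shape", "eyes", "nose", "hair")):
    """Nest records under their symbolic attribute values, one level per key."""
    index = {}
    for rec in records:
        node = index
        for k in keys[:-1]:
            node = node.setdefault(rec[k], {})
        node.setdefault(rec[keys[-1]], []).append(rec["id"])
    return index

def query(index, face_shape, eyes, nose, hair):
    """Descend four levels; an eyewitness description fetches only matching IDs."""
    try:
        return index[face_shape][eyes][nose][hair]
    except KeyError:
        return []

records = [
    {"id": 1, "face_shape": "oval", "eyes": "narrow", "nose": "long", "hair": "curly"},
    {"id": 2, "face_shape": "oval", "eyes": "narrow", "nose": "long", "hair": "straight"},
    {"id": 3, "face_shape": "round", "eyes": "wide", "nose": "short", "hair": "curly"},
]
index = build_index(records)
matches = query(index, "oval", "narrow", "long", "curly")
```

    Each level of the descent prunes the remaining candidates, which is how such a decomposition reduces the search space compared with a linear scan of all records.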

  20. Development of an Individualized and Group Instructional Program Based on Financial Management for Adult/Young Farmers in Vocational Agriculture Programs in Missouri. Final Report.

    ERIC Educational Resources Information Center

    Nolting, Greg; And Others

    A study was conducted to develop competency-based curriculum materials and a computer-based analysis system for farm business records to assist local vocational agriculture teachers of adult/young farmers in their group and individualized instructional programs. A list of thirty-five competencies in financial management were validated using…

  1. Characteristics of the most intense lightning storm ever recorded at the CN Tower

    NASA Astrophysics Data System (ADS)

    Hussein, A. M.; Kazazi, S.; Anwar, M.; Yusouf, M.; Liatos, P.

    2017-02-01

    Lightning strikes to the CN Tower have been optically observed since 1978. In 1990, five independent systems started to operate to simultaneously record parameters of lightning strikes to the tower, including the time derivative of the current, the associated electric and magnetic fields, and the channel optical characteristics. On August 24, 2011, during an unusually severe lightning storm, video records showed that the CN Tower was struck with 52 lightning flashes within 84 min and 6.9 s. Thus, this storm produced, on average, a flash to the tower every 99 s. However, the CN Tower lightning current derivative measurement system only recorded 32 flashes, which were perfectly time-matched with 32 of the 52 video-recorded flashes. It is found that the current derivative measurement system recorded every video-recorded flash that contained at least one return stroke. Based on the analysis of video records, it is noted that each of the storm's 52 flashes contains an initial-stage current, proving that all flashes were upward initiated. This unique CN Tower storm - the most intense ever recorded at the tower - is here thoroughly analyzed, based on video and current records. The inter-flash time within the storm is found to vary between 10.6 s and 274 s, with an overall average of 98 s. It is also found that the inter-flash time between successive non-return-stroke flashes is on average 64% longer than that for successive flashes containing return strokes. Statistical analysis of video and current data clearly reveals that the time duration of flashes containing initial-stage currents and return strokes is on average 27% longer than that of flashes that only have initial-stage currents. Furthermore, it is important to note that the time duration of the initial-stage current in flashes containing no return strokes is on average 76% longer than that in flashes containing return strokes. 
Therefore, it is possible to conclude that if the time duration of the initial-stage current in a flash is long enough, resulting in large charge transfer, then there is less probability of having return strokes following it. The 32 current-recorded flashes contain a total of 156 return strokes, with an average multiplicity of 4.875. It is worth mentioning that during one decade, 1992-2001, the CN Tower current derivative measurement system only recorded 478 return strokes, demonstrating that the number of return strokes recorded at the tower within about 84 min is close to one third of those recorded at the tower during one decade. This finding clearly shows the great value and rarity of the presented extensive lightning current derivative data. Only one of the 32 current-recorded flashes is proved to be positive with a single return stroke. Based on current records, out of a total of 124 inter-stroke time intervals, 94% are found to be within 200 ms, with an overall inter-stroke time average of 68.1 ms. The maximum inter-stroke time recorded during this storm is 726.3 ms, the longest ever recorded at the CN Tower.
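
    The inter-flash and inter-stroke statistics above reduce to successive differences of event arrival times. A minimal sketch with hypothetical timestamps (not the storm's actual flash times):

```python
def inter_event_times(timestamps_s):
    """Successive differences of sorted event times, in seconds."""
    ts = sorted(timestamps_s)
    return [b - a for a, b in zip(ts, ts[1:])]

# Hypothetical flash arrival times (s) within a storm
flashes = [0.0, 95.0, 210.0, 298.0, 400.0]
gaps = inter_event_times(flashes)
mean_gap = sum(gaps) / len(gaps)
```

    Applied to the 52 video-recorded flashes, this is the computation behind the reported 10.6-274 s range and 98 s average inter-flash time.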

  2. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms-derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis-were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among the MSPE measures. The CG-based MSPE measures failed in on-line DoA monitoring in the multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, the MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of unconsciousness. MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes. 
CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
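
    The building blocks compared in the study can be sketched as follows: normalized Shannon permutation entropy plus the coarse-graining (CG) and moving-average (MA) decompositions. The embedding order, normalization, and sample signal are illustrative assumptions, not the authors' exact implementation.

```python
import math

def permutation_entropy(x, order=3):
    """Shannon permutation entropy of series x, normalized to [0, 1]."""
    counts = {}
    for i in range(len(x) - order + 1):
        # Ordinal pattern: the argsort of each embedded window
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))

def coarse_grain(x, scale):
    """CG procedure: averages over non-overlapping windows (shortens the series)."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def moving_average(x, scale):
    """MA procedure: averages over overlapping windows (preserves more samples)."""
    return [sum(x[i:i + scale]) / scale for i in range(len(x) - scale + 1)]

# Hypothetical EEG-like samples
signal = [0.2, 0.9, 0.4, 0.7, 0.1, 0.8, 0.3, 0.6, 0.5, 1.0, 0.0, 0.65]
pe_cg = permutation_entropy(coarse_grain(signal, 2))
pe_ma = permutation_entropy(moving_average(signal, 2))
```

    The contrast matters for short on-line EEG windows: CG leaves only len(x)/scale points at high scales, while MA keeps nearly the full length, which is consistent with the CG measures failing at high decomposition scales.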

  3. An assessment of occupation and industry data from death certificates and hospital medical records for population-based cancer surveillance.

    PubMed Central

    Swanson, G M; Schwartz, A G; Burrows, R W

    1984-01-01

    This study analyzed 30,194 incident cases and 4,301 death certificates for completeness of occupational reporting. Analysis of data accuracy was based upon a comparison of more than 2,000 death certificates with incident abstracts and 352 death certificates with interview data. Death certificates had a higher proportion with occupation (94.3%) and industry (93.4%) reported than did incident abstracts of hospital medical records (39.0% and 63.5%, respectively). Compared with occupational history data obtained by interview, 76.1% of the death certificates were exact matches for usual occupation and industry. PMID:6711720

  4. Importance-Performance Analysis of Personal Health Records in Taiwan: A Web-Based Survey.

    PubMed

    Rau, Hsiao-Hsien; Wu, Yi-Syuan; Chu, Chi-Ming; Wang, Fu-Chung; Hsu, Min-Huei; Chang, Chi-Wen; Chen, Kang-Hua; Lee, Yen-Liang; Kao, Senyeong; Chiu, Yu-Lung; Wen, Hsyien-Chia; Fuad, Anis; Hsu, Chien-Yeh; Chiu, Hung-Wen

    2017-04-27

    Empowering personal health records (PHRs) supports a basic human right and builds awareness of, and intention for, health promotion. As health care delivery shifts toward patient-centered services, PHRs become an indispensable platform for consumers and providers. Recently, the government introduced "My health bank," a Web-based electronic medical records (EMR) repository for consumers. However, it is not yet a PHR; to date, there is no platform that lets patients manage their own PHRs. This study creates a vision of a value-added platform for personal health data analysis and management based on the contents of "My health bank." This study aimed to examine consumer expectations regarding PHRs using the importance-performance analysis. The purpose was to explore consumer perceptions of this type of platform, identify its key success factors and important aspects using the importance-performance analysis, and offer suggestions for future development based on it. This is a cross-sectional study conducted in Taiwan. A Web-based invitation to participate was distributed through Facebook. Respondents were asked to watch an introductory movie about PHRs before filling in the questionnaire. The questionnaire focused on 2 aspects: (1) system functions (12 questions), and (2) system design, security, and privacy (7 questions). It used a 5-point Likert scale ranging from 1 ("disagree strongly") to 5 ("agree strongly"). Afterwards, the questionnaire data were analyzed using IBM SPSS Statistics 21 for descriptive statistics and the importance-performance analysis. This research received 350 valid questionnaires. Most respondents were female (219 of 350 participants, 62.6%), 21-30 years old (238 of 350 participants, 68.0%), with a university degree (228 of 350 participants, 65.1%). 
They were still students (195 of 350 participants, 56.6%), with a monthly income of less than NT $30,000 (230 of 350 participants, 65.7%), living in northern Taiwan (236 of 350 participants, 67.4%), and with a good self-identified health status (171 of 350 participants, 48.9%). After performing the importance-performance analysis, we found the following: (1) instead of complex functions, people just want a platform that lets them integrate and manage their medical visit, health examination, and life behavior records; (2) they do not care whether their PHR is shared with others; and (3) most participants think the system security design is not important, yet they are also not satisfied with the current security design. Overall, the issues receiving the most user attention were the system functions, circulation, integrity, ease of use, and continuity of the PHRs, data security, and privacy protection. ©Hsiao-Hsien Rau, Yi-Syuan Wu, Chi-Ming Chu, Fu-Chung Wang, Min-Huei Hsu, Chi-Wen Chang, Kang-Hua Chen, Yen-Liang Lee, Senyeong Kao, Yu-Lung Chiu, Hsyien-Chia Wen, Anis Fuad, Chien-Yeh Hsu, Hung-Wen Chiu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 27.04.2017.
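
    Importance-performance analysis itself is a simple quadrant classification of each item against the mean importance and mean performance scores. A sketch with hypothetical item scores on a 5-point scale (not the survey's actual means):

```python
def ipa_quadrant(importance, performance, imp_mean, perf_mean):
    """Classic IPA grid: split items at the mean importance and performance."""
    if importance >= imp_mean and performance < perf_mean:
        return "concentrate here"
    if importance >= imp_mean and performance >= perf_mean:
        return "keep up the good work"
    if importance < imp_mean and performance < perf_mean:
        return "low priority"
    return "possible overkill"

# Hypothetical (importance, performance) scores per item
items = {"data security": (4.6, 3.1), "record integration": (4.4, 4.2),
         "sharing with others": (2.8, 3.9)}
imp_mean = sum(i for i, _ in items.values()) / len(items)
perf_mean = sum(p for _, p in items.values()) / len(items)
labels = {name: ipa_quadrant(i, p, imp_mean, perf_mean)
          for name, (i, p) in items.items()}
```

    Items landing in the high-importance/low-performance quadrant ("concentrate here") are the candidates for development priority, which is the kind of reading the study draws for security design.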

  5. m2-ABKS: Attribute-Based Multi-Keyword Search over Encrypted Personal Health Records in Multi-Owner Setting.

    PubMed

    Miao, Yinbin; Ma, Jianfeng; Liu, Ximeng; Wei, Fushan; Liu, Zhiquan; Wang, Xu An

    2016-11-01

    Online personal health record (PHR) systems are increasingly inclined to shift data storage and search operations to a cloud server so as to enjoy elastic resources and lessen the computational burden of storage. As multiple patients' data are stored in the cloud server simultaneously, it is a challenge to guarantee the confidentiality of PHR data while allowing data users to search encrypted data in an efficient and privacy-preserving way. To this end, we design a secure cryptographic primitive called attribute-based multi-keyword search over encrypted personal health records in a multi-owner setting, which supports both fine-grained access control and multi-keyword search via Ciphertext-Policy Attribute-Based Encryption. Formal security analysis proves our scheme is selectively secure against chosen-keyword attacks. As a further contribution, we conduct empirical experiments over a real-world dataset to show its feasibility and practicality in a broad range of actual scenarios without incurring additional computational burden.

  6. Novel four-sided neural probe fabricated by a thermal lamination process of polymer films.

    PubMed

    Shin, Soowon; Kim, Jae-Hyun; Jeong, Joonsoo; Gwon, Tae Mok; Lee, Seung-Hee; Kim, Sung June

    2017-02-15

    Ideally, neural probes should have channels with a three-dimensional (3-D) configuration to record the activities of 3-D neural circuits. Many types of 3-D neural probes have been developed; however, most of them were designed as an array of multiple shanks with electrodes located along one side of the shanks. We developed a novel liquid crystal polymer (LCP)-based neural probe with four-sided electrodes. This probe has electrodes on four sides of the shank, i.e., the front, back and two sidewalls. To generate the proposed configuration of the electrodes, we used a thermal lamination process involving LCP films and laser micromachining. The proposed novel four-sided neural probe was used to successfully perform in vivo multichannel neural recording in the mouse primary somatosensory cortex. The multichannel neural recording showed that the proposed four-sided neural probe can record spiking activities from a more diverse neuronal population than single-sided probes. This was confirmed by a pairwise Pearson correlation coefficient (Pearson's r) analysis and a cross-correlation analysis. The developed four-sided neural probe can be used to record various signals from a complex neural network. Copyright © 2016 Elsevier B.V. All rights reserved.
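
    The pairwise Pearson's r analysis used to compare channel activity follows directly from the definition of the correlation coefficient. A minimal sketch; the binned spike counts below are hypothetical, not the study's recordings.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical binned spike counts from two electrode channels
chan_a = [3, 5, 2, 8, 6]
chan_b = [2, 6, 1, 9, 5]
r = pearson_r(chan_a, chan_b)
```

    Low pairwise r between channels on different faces of the shank would indicate that they sample distinct neuronal populations, which is the diversity argument made above.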

  7. Long-term recording and automatic analysis of cough using filtered acoustic signals and movements on static charge sensitive bed.

    PubMed

    Salmi, T; Sovijärvi, A R; Brander, P; Piirilä, P

    1988-11-01

    Reliable long-term assessment of cough is necessary in many clinical and scientific settings. A new method for long-term recording and automatic analysis of cough is presented. The method is based on simultaneous recording of two independent signals: high-pass filtered cough sounds and cough-induced fast movements of the body. The acoustic signals are recorded with a dynamic microphone in the acoustic focus of a glass fiber paraboloid mirror. Body movements are recorded with a static charge-sensitive bed located under an ordinary plastic foam mattress. The patient can be studied lying or sitting with no transducers or electrodes attached. A microcomputer is used for sampling of signals, detection of cough, statistical analyses, and on-line printing of results. The method was validated in seven adult patients with a total of 809 spontaneous cough events, using clinical observation as a reference. The sensitivity of the method to detect cough was 99.0 percent, and the positive predictivity was 98.1 percent. The system ignored speaking and snoring. The method provides a convenient means of reliable long-term follow-up of cough in clinical work and research.
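
    The reported sensitivity (99.0 percent) and positive predictivity (98.1 percent) follow from the usual detection counts. The split of the 809 observed events into true and false detections below is illustrative, not taken from the paper.

```python
def sensitivity(tp, fn):
    """Fraction of true cough events the system detected."""
    return tp / (tp + fn)

def positive_predictivity(tp, fp):
    """Fraction of the system's detections that were real cough events."""
    return tp / (tp + fp)

# Illustrative counts: 801 detected of 809 true events, 15 false detections
tp, fn, fp = 801, 8, 15
sens = sensitivity(tp, fn)
ppv = positive_predictivity(tp, fp)
```

    Note that sensitivity is judged against the clinically observed events (missed coughs count against it), while positive predictivity is judged against the system's own detections (false alarms count against it).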

  8. Low-flow analysis and selected flow statistics representative of 1930-2002 for streamflow-gaging stations in or near West Virginia

    USGS Publications Warehouse

    Wiley, Jeffrey B.

    2006-01-01

    Five time periods between 1930 and 2002 are identified as having distinct patterns of annual minimum daily mean flows (minimum flows). Average minimum flows increased around 1970 at many streamflow-gaging stations in West Virginia. Before 1930, however, there might have been a period of minimum flows greater than any period identified between 1930 and 2002. The effects of climate variability are probably the principal causes of the differences among the five time periods. Comparisons of selected streamflow statistics are made between values computed for the five identified time periods and values computed for the 1930-2002 interval for 15 streamflow-gaging stations. The average difference between statistics computed for the five time periods and the 1930-2002 interval decreases with increasing magnitude of the low-flow statistic. The greatest individual-station absolute difference was 582.5 percent greater for the 7-day 10-year low flow computed for 1970-1979 compared to the value computed for 1930-2002. The hydrologically based low flows indicate approximately equal or smaller absolute differences than the biologically based low flows. The average 1-day 3-year biologically based low flow (1B3) and 4-day 3-year biologically based low flow (4B3) are less than the average 1-day 10-year hydrologically based low flow (1Q10) and 7-day 10-year hydrologically based low flow (7Q10), respectively, and range between 28.5 percent less and 13.6 percent greater. Seasonally, the average difference between low-flow statistics computed for the five time periods and 1930-2002 is not consistent between magnitudes of low-flow statistics, and the greatest difference is for the summer (July 1-September 30) and fall (October 1-December 31) for the same time period as the greatest difference determined in the annual analysis. 
The greatest average difference between 1B3 and 4B3 compared to 1Q10 and 7Q10, respectively, is in the spring (April 1-June 30), ranging between 11.6 and 102.3 percent greater. Statistics computed for the individual station's record period may not represent the statistics computed for the period 1930 to 2002 because (1) station records are available predominantly after about 1970 when minimum flows were greater than the average between 1930 and 2002 and (2) some short-term station records are mostly during dry periods, whereas others are mostly during wet periods. A criterion-based sampling of the individual station's record periods at stations was taken to reduce the effects of statistics computed for the entire record periods not representing the statistics computed for 1930-2002. The criterion used to sample the entire record periods is based on a comparison between the regional minimum flows and the minimum flows at the stations. Criterion-based sampling of the available record periods was superior to record-extension techniques for this study because more stations were selected and areal distribution of stations was more widespread. Principal component and correlation analyses of the minimum flows at 20 stations in or near West Virginia identify three regions of the State encompassing stations with similar patterns of minimum flows: the Lower Appalachian Plateaus, the Upper Appalachian Plateaus, and the Eastern Panhandle. All record periods of 10 years or greater between 1930 and 2002 where the average of the regional minimum flows are nearly equal to the average for 1930-2002 are determined as representative of 1930-2002. Selected statistics are presented for the longest representative record period that matches the record period for 77 stations in West Virginia and 40 stations near West Virginia. These statistics can be used to develop equations for estimating flow in ungaged stream locations.
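
    The n-day low-flow statistics discussed above start from the annual minimum n-day mean flow. A minimal sketch of that first step; the 10-year recurrence values (e.g., 7Q10) additionally require fitting the annual minima to a frequency distribution, which is omitted here, and the daily flows below are hypothetical.

```python
def min_n_day_mean(daily_flows, n=7):
    """Annual minimum n-day mean flow: the lowest average over any
    n consecutive daily mean flows in the record."""
    means = [sum(daily_flows[i:i + n]) / n
             for i in range(len(daily_flows) - n + 1)]
    return min(means)

# Hypothetical daily mean flows (cubic feet per second) for part of a year
flows = [10, 9, 8, 7, 7, 7, 7, 7, 7, 7, 8, 9, 10]
low7 = min_n_day_mean(flows, 7)
```

    Collecting this value for each year of record gives the annual series whose 10-year recurrence quantile is the 7Q10; the 1-day statistics (1Q10, 1B3) use n = 1.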

  9. Peak horizontal acceleration and velocity from strong-motion records including records from the 1979 imperial valley, California, earthquake

    USGS Publications Warehouse

    Joyner, William B.; Boore, David M.

    1981-01-01

    We have taken advantage of the recent increase in strong-motion data at close distances to derive new attenuation relations for peak horizontal acceleration and velocity. This new analysis uses a magnitude-independent shape, based on geometrical spreading and anelastic attenuation, for the attenuation curve. An innovation in technique is introduced that decouples the determination of the distance dependence of the data from the magnitude dependence.
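
    The magnitude-independent attenuation shape described above combines a geometrical-spreading term with an anelastic attenuation term. A sketch of that functional form under common conventions; the coefficient values passed in below are illustrative placeholders, not the paper's fitted values.

```python
import math

def peak_accel_log10(magnitude, dist_km, a, b, c, h):
    """Attenuation shape: log10(A) = a + b*M - log10(r) + c*r,
    with r = sqrt(d^2 + h^2). The -log10(r) term models geometrical
    spreading; the c*r term (c < 0) models anelastic attenuation;
    h is a fictitious depth that keeps r finite at zero distance."""
    r = math.sqrt(dist_km ** 2 + h ** 2)
    return a + b * magnitude - math.log10(r) + c * r

# Illustrative (not fitted) coefficients for a magnitude-6.5 event at 20 km
log_a = peak_accel_log10(magnitude=6.5, dist_km=20.0,
                         a=-1.0, b=0.25, c=-0.0026, h=7.3)
pga_g = 10 ** log_a
```

    Because the shape is fixed and magnitude enters only through the b*M term, the distance dependence can be estimated from the data independently of the magnitude dependence, which is the decoupling the abstract describes.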

  10. Development of an "Alert Framework" Based on the Practices in the Medical Front.

    PubMed

    Sakata, Takuya; Araki, Kenji; Yamazaki, Tomoyoshi; Kawano, Koichi; Maeda, Minoru; Kushima, Muneo; Araki, Sanae

    2018-05-09

    At the University of Miyazaki Hospital (UMH), we have accumulated and semantically structured a vast amount of medical information since the activation of the electronic health record system approximately 10 years ago. With this medical information, we have decided to develop an alert system for aiding in medical treatment. The purpose of this investigation is not only to integrate an alert framework into the electronic health record system, but also to formulate a method for modeling this knowledge. A trial alert framework was developed for staff in various occupational categories at the UMH. Based on findings from subsequent interviews, a more detailed and upgraded alert framework was constructed, resulting in the final model. The resulting alert framework comprises four major items, and analysis of the medical practices from the trial model identified four major risk patterns that trigger the alert. Furthermore, the current alert framework contains detailed definitions which are easily substituted into the database, leading to easy implementation within the electronic health records.

  11. Identifying regional moisture source during the last deglaciation through analysis of central Texas speleothems

    NASA Astrophysics Data System (ADS)

    James, C.; Charlton, T.; Banner, J.; Koleszar, A. M.; James, E. W.; Breecker, D.; Miller, N. R.; Edwards, R. L.

    2016-12-01

    Precipitation in the drought-prone Southwest US is principally sourced from the Gulf of Mexico (GoM) and/or Pacific Ocean, each with distinctive δ18O compositions. Temporal changes in moisture source in this region have been reconstructed from speleothem proxies for the last deglaciation (~19-11 ka). In this study we focus on how moisture sources in Texas speleothem records varied in response to abrupt deglacial warming/cooling events, namely the Bølling-Allerød (BA) interstadial (14.7-14.1 ka) and the Younger Dryas stadial (12.9-11.5 ka). A recent speleothem oxygen isotope record (CWN-4) in a stalagmite from Cave Without A Name near Boerne, Texas, suggests that both the magnitude and timing of changes in δ18O values during the deglaciation closely match δ18O values from a seawater proxy record based on GoM foraminifera (Feng et al., 2014; Flower et al., 2004). A distinct negative δ18O excursion of -3‰ is observed in both records at the onset of the BA, with a subsequent 3‰ increase occurring at the end of the BA. Based on these similarities, regional moisture was hypothesized to be predominantly sourced from the GoM. In order to test the hypothesis that the CWN-4 record captures a regional climate signal, as opposed to recording local variations in in-cave processes, we analyze multiple central Texas speleothem δ18O records. We present oxygen isotope records of a flowstone (IC-2) from a cave 130 km WSW of CWN and a stalagmite (McN-1) from a cave 140 km NE of CWN, with both records spanning the onset of the deglaciation through the Holocene. The IC-2 δ18O record is consistent with that of CWN-4; however, the age model relies on several dates with high uncertainties related to elevated concentrations of detritus containing common Th. The McN-1 record is anticipated to have a more robust age model as it appears to contain less detrital material, based on low 232Th concentrations (< 0.009 ppb) in the growth layers that have been selected for U-series analysis.
Preliminary δ18O values for the pre-BA period of McN-1 cover a similar range of δ18O values (-4.5 to -2.5‰) compared with the pre-BA portions of the CWN-4 and IC-2 records. These results are consistent with the regional climate hypothesis, which will be tested further by building a complete, high-resolution oxygen isotope time series for McN-1.

  12. Reference-Free Removal of EEG-fMRI Ballistocardiogram Artifacts with Harmonic Regression

    PubMed Central

    Krishnaswamy, Pavitra; Bonmassar, Giorgio; Poulsen, Catherine; Pierce, Eric T; Purdon, Patrick L.; Brown, Emery N.

    2016-01-01

    Combining electroencephalogram (EEG) recording and functional magnetic resonance imaging (fMRI) offers the potential for imaging brain activity with high spatial and temporal resolution. This potential remains limited by the significant ballistocardiogram (BCG) artifacts induced in the EEG by cardiac pulsation-related head movement within the magnetic field. We model the BCG artifact using a harmonic basis, pose the artifact removal problem as a local harmonic regression analysis, and develop an efficient maximum likelihood algorithm to estimate and remove BCG artifacts. Our analysis paradigm accounts for time-frequency overlap between the BCG artifacts and neurophysiologic EEG signals, and tracks the spatiotemporal variations in both the artifact and the signal. We evaluate performance on: simulated oscillatory and evoked responses constructed with realistic artifacts; actual anesthesia-induced oscillatory recordings; and actual visual evoked potential recordings. In each case, the local harmonic regression analysis effectively removes the BCG artifacts, and recovers the neurophysiologic EEG signals. We further show that our algorithm outperforms commonly used reference-based and component analysis techniques, particularly in low SNR conditions, the presence of significant time-frequency overlap between the artifact and the signal, and/or large spatiotemporal variations in the BCG. Because our algorithm does not require reference signals and has low computational complexity, it offers a practical tool for removing BCG artifacts from EEG data recorded in combination with fMRI. PMID:26151100
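The core idea of harmonic regression, modeling the BCG artifact as a sum of harmonics of the cardiac fundamental and subtracting the fitted series, can be sketched as below. Within a short window spanning whole cycles, where the harmonic regressors are near-orthogonal, the least-squares amplitudes reduce to simple projections; this toy version omits the authors' maximum-likelihood estimation and spatiotemporal tracking:

```python
import math

def remove_harmonics(y, fs, f0, n_harmonics=3):
    """Estimate and subtract a harmonic series at the cardiac
    fundamental f0 (Hz) from signal y sampled at fs (Hz).

    Over a window containing whole cycles, the harmonics are
    orthogonal and the least-squares amplitudes are the projections
    computed below.  This is a toy version of the local harmonic
    regression idea, not the authors' full algorithm."""
    n = len(y)
    artifact = [0.0] * n
    for k in range(1, n_harmonics + 1):
        w = 2.0 * math.pi * k * f0 / fs          # harmonic k, rad/sample
        a = 2.0 / n * sum(y[i] * math.cos(w * i) for i in range(n))
        b = 2.0 / n * sum(y[i] * math.sin(w * i) for i in range(n))
        for i in range(n):
            artifact[i] += a * math.cos(w * i) + b * math.sin(w * i)
    return [y[i] - artifact[i] for i in range(n)]
```

Because the fit is local to each window, the method can track slow variation in artifact amplitude and phase without needing any reference channel, which is what makes the approach reference-free.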

  13. Openness of patients' reporting with use of electronic records: psychiatric clinicians' views

    PubMed Central

    Blackford, Jennifer Urbano; Rosenbloom, S Trent; Seidel, Sandra; Clayton, Ellen Wright; Dilts, David M; Finder, Stuart G

    2010-01-01

    Objectives Improvements in electronic health record (EHR) system development will require an understanding of psychiatric clinicians' views on EHR system acceptability, including effects on psychotherapy communications, data-recording behaviors, data accessibility versus security and privacy, data quality and clarity, communications with medical colleagues, and stigma. Design Multidisciplinary development of a survey instrument targeting psychiatric clinicians who recently switched to EHR system use, focus group testing, data analysis, and data reliability testing. Measurements Survey of 120 university-based, outpatient mental health clinicians, with 56 (47%) responding, conducted 18 months after transition from a paper to an EHR system. Results Factor analysis gave nine item groupings that overlapped strongly with five a priori domains. Respondents both praised and criticized the EHR system. A strong majority (81%) felt that open therapeutic communications were preserved. Regarding data quality, content, and privacy, clinicians (63%) were less willing to record highly confidential information and disagreed (83%) with including their own psychiatric records among routinely accessed EHR systems. Limitations Single time point; single academic medical center clinic setting; modest sample size; lack of prior instrument validation; survey conducted in 2005. Conclusions In an academic medical center clinic, the presence of electronic records was not seen as a dramatic impediment to therapeutic communications. Concerns regarding privacy and data security were significant, and may contribute to reluctance to adopt electronic records in other settings. Further study of clinicians' views and use patterns may be helpful in guiding development and deployment of electronic records systems. PMID:20064802

  14. Organizational learning in the implementation and adoption of national electronic health records: case studies of two hospitals participating in the National Programme for Information Technology in England.

    PubMed

    Takian, Amirhossein; Sheikh, Aziz; Barber, Nicholas

    2014-09-01

    To explore the role of organizational learning in enabling implementation and supporting adoption of electronic health record systems in two English hospitals. In the course of conducting our prospective and sociotechnical evaluation of the implementation and adoption of electronic health records into 12 "early adopter" hospitals across England, we identified two hospitals implementing virtually identical versions of the same "off-the-shelf" software (Millennium) within a comparable timeframe. We undertook a longitudinal qualitative case-study-based analysis of these two hospitals (referred to hereafter as Alpha and Omega) and their implementation experiences. Data included the following: 63 in-depth interviews with various groups of internal and external stakeholders; 41 hours of on-site observation; and content analysis of 218 documents of various types. Analysis was both inductive and deductive, the latter being informed by the "sociotechnical changing" theoretical perspective. Although Alpha and Omega shared a number of contextual similarities, our evaluation revealed fundamental differences between the hospitals in their visions of the electronic health record and in their implementation strategies, which resulted in distinct local consequences of electronic health record implementation and affected adoption. During our evaluation, neither hospital saw the hoped-for organizational benefits from the introduction of the electronic health record, such as the speeding-up of tasks. Nonetheless, the Millennium software proved easier to use at Omega. Interorganizational learning was at the heart of this difference. Despite the turbulent overall national "roll out" of electronic health record systems into the English hospitals, considerable opportunities for organizational learning were offered by the sequential delivery of the electronic health record software into "early adopter" hospitals. We argue that understanding the process of organizational learning and its enabling factors has the potential to support national electronic health record implementation efforts. © The Author(s) 2013.

  15. Medication regularity of pulmonary fibrosis treatment by contemporary traditional Chinese medicine experts based on data mining.

    PubMed

    Zhang, Suxian; Wu, Hao; Liu, Jie; Gu, Huihui; Li, Xiujuan; Zhang, Tiansong

    2018-03-01

    Treatment of pulmonary fibrosis by traditional Chinese medicine (TCM) has accumulated substantial experience. Our interest is in exploring the medication regularity of contemporary Chinese medical specialists treating pulmonary fibrosis. Through a literature search, medical records from TCM experts who treat pulmonary fibrosis, published in Chinese and English medical journals, were selected for this study. A database of these records was established after analysing them. After data cleaning, the medication rules in the TCM medical records for the treatment of pulmonary fibrosis were explored using data mining technologies such as frequency analysis, association rule analysis, and link analysis. A total of 124 medical records from 60 doctors were selected in this study; 263 types of medicinals were used a total of 5,455 times; the herbs that were used more than 30 times comprised 53 species and were used a total of 3,681 times. Using cluster analysis of the main medicinals, the medicinals were divided into qi-tonifying, yin-tonifying, blood-activating, phlegm-resolving, cough-suppressing, panting-calming, and other major medicinal categories, ten in total. According to the set conditions, a total of 62 drug compatibility rules were obtained, involving mainly qi-tonifying, yin-tonifying, blood-activating, phlegm-resolving, qi-descending, and panting-calming medicinals, as well as other medicinals used in combination. The results of data mining are consistent with clinical practice, and it is feasible to explore medication rules for the treatment of pulmonary fibrosis from TCM medical records by data mining.
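Frequency analysis and association rule analysis of this kind can be sketched as pairwise support/confidence rules over per-record herb lists. The herb names, thresholds, and rule format below are invented for illustration; the study used dedicated data-mining software over its 124 records:

```python
from itertools import combinations

def mine_pairs(records, min_support=0.3, min_confidence=0.6):
    """Tiny association-rule sketch over per-record item lists:
    count item and pair frequencies, then emit rules A -> B scored
    by support (pair frequency / number of records) and confidence
    (pair frequency / antecedent frequency).  Illustrative only."""
    n = len(records)
    counts = {}
    for rec in records:
        items = set(rec)
        for item in items:
            counts[frozenset([item])] = counts.get(frozenset([item]), 0) + 1
        for pair in combinations(sorted(items), 2):
            counts[frozenset(pair)] = counts.get(frozenset(pair), 0) + 1
    rules = []
    for key, c in counts.items():
        if len(key) == 2 and c / n >= min_support:
            a, b = sorted(key)
            for x, y in ((a, b), (b, a)):
                conf = c / counts[frozenset([x])]
                if conf >= min_confidence:
                    rules.append((x, y, round(c / n, 2), round(conf, 2)))
    return rules
```

Each emitted tuple reads "when x appears in a prescription, y tends to appear too", with its support and confidence, which is the shape of a drug-compatibility rule.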

  16. A spatial database of wildfires in the United States, 1992-2011

    NASA Astrophysics Data System (ADS)

    Short, K. C.

    2013-07-01

    The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). However, there are multiple federal, state, and local entities with wildfire protection and reporting responsibilities in the US, and no single, unified system of wildfire record-keeping exists. To conduct even the most rudimentary interagency analyses of wildfire numbers and area burned from the authoritative systems of record, one must harvest records from dozens of disparate databases with inconsistent information content. The onus is then on the user to check for and purge redundant records of the same fire (i.e. multijurisdictional incidents with responses reported by several agencies or departments) after pooling data from different sources. Here we describe our efforts to acquire, standardize, error-check, compile, scrub, and evaluate the completeness of US federal, state, and local wildfire records from 1992-2011 for the national, interagency Fire Program Analysis (FPA) application. The resulting FPA Fire-occurrence Database (FPA FOD) includes nearly 1.6 million records from the 20 yr period, with values for at least the following core data elements: location at least as precise as a Public Land Survey System section (2.6 km2 grid), discovery date, and final fire size. The FPA FOD is publicly available from the Research Data Archive of the US Department of Agriculture, Forest Service (doi:10.2737/RDS-2013-0009). While necessarily incomplete in some aspects, the database is intended to facilitate fairly high-resolution geospatial analysis of US wildfire activity over the past two decades, based on available information from the authoritative systems of record.
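The redundancy purge described above, flagging records of the same fire pooled from different agency databases, can be sketched by keying each record on an approximate location cell, discovery date, and final size. The key fields and bin width here are invented for illustration; the actual FPA FOD processing used more elaborate checks:

```python
def purge_redundant(records):
    """Keep one record per apparent fire: records pooled from several
    agency databases are treated as duplicates when they fall in the
    same ~1.6 km latitude/longitude bin and share a discovery date
    and final size.  Key fields and bin width are invented for
    illustration, not the FPA FOD's actual de-duplication logic."""
    seen, kept = set(), []
    for rec in records:  # rec: (lat, lon, discovery_date, size_acres, agency)
        lat, lon, date, size, _agency = rec
        # 0.0145 degrees of latitude is roughly 1.6 km (crude for longitude)
        key = (round(lat / 0.0145), round(lon / 0.0145), date, size)
        if key not in seen:
            seen.add(key)
            kept.append(rec)
    return kept
```

The point of the sketch is that after pooling, the same multijurisdictional incident reported by two agencies collapses to a single record, while genuinely distinct fires survive.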

  17. A spatial database of wildfires in the United States, 1992-2011

    NASA Astrophysics Data System (ADS)

    Short, K. C.

    2014-01-01

    The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). However, there are multiple federal, state, and local entities with wildfire protection and reporting responsibilities in the US, and no single, unified system of wildfire record keeping exists. To conduct even the most rudimentary interagency analyses of wildfire numbers and area burned from the authoritative systems of record, one must harvest records from dozens of disparate databases with inconsistent information content. The onus is then on the user to check for and purge redundant records of the same fire (i.e., multijurisdictional incidents with responses reported by several agencies or departments) after pooling data from different sources. Here we describe our efforts to acquire, standardize, error-check, compile, scrub, and evaluate the completeness of US federal, state, and local wildfire records from 1992-2011 for the national, interagency Fire Program Analysis (FPA) application. The resulting FPA Fire-Occurrence Database (FPA FOD) includes nearly 1.6 million records from the 20 yr period, with values for at least the following core data elements: location, at least as precise as a Public Land Survey System section (2.6 km2 grid), discovery date, and final fire size. The FPA FOD is publicly available from the Research Data Archive of the US Department of Agriculture, Forest Service (doi:10.2737/RDS-2013-0009). While necessarily incomplete in some aspects, the database is intended to facilitate fairly high-resolution geospatial analysis of US wildfire activity over the past two decades, based on available information from the authoritative systems of record.

  18. Vocalisations of Killer Whales (Orcinus orca) in the Bremer Canyon, Western Australia.

    PubMed

    Wellard, Rebecca; Erbe, Christine; Fouda, Leila; Blewitt, Michelle

    2015-01-01

    To date, there has been no dedicated study in Australian waters on the acoustics of killer whales. Hence no information has been published on the sounds produced by killer whales from this region. Here we present the first acoustical analysis of recordings collected off the Western Australian coast. Underwater sounds produced by Australian killer whales were recorded during the months of February and March 2014 and 2015 in the Bremer Canyon in Western Australia. Vocalisations recorded included echolocation clicks, burst-pulse sounds and whistles. A total of 28 hours and 29 minutes were recorded and analysed, with 2376 killer whale calls (whistles and burst-pulse sounds) detected. Recordings of poor quality or low signal-to-noise ratio were excluded from analysis, resulting in 142 whistles and burst-pulse vocalisations suitable for analysis and categorisation. These were grouped based on their spectrographic features into nine Bremer Canyon (BC) "call types". The frequency of the fundamental contours of all call types ranged from 600 Hz to 29 kHz. Calls ranged from 0.05 to 11.3 seconds in duration. Biosonar clicks were also recorded, but not studied further. Surface behaviours noted during acoustic recordings were categorised as either travelling or social behaviour. A detailed description of the acoustic characteristics is necessary for species acoustic identification and for the development of passive acoustic tools for population monitoring, including assessments of population status, habitat usage, migration patterns, behaviour and acoustic ecology. This study provides the first quantitative assessment and report on the acoustic features of killer whale vocalisations in Australian waters, and presents an opportunity to further investigate this little-known population.

  19. Vocalisations of Killer Whales (Orcinus orca) in the Bremer Canyon, Western Australia

    PubMed Central

    Wellard, Rebecca; Erbe, Christine; Fouda, Leila; Blewitt, Michelle

    2015-01-01

    To date, there has been no dedicated study in Australian waters on the acoustics of killer whales. Hence no information has been published on the sounds produced by killer whales from this region. Here we present the first acoustical analysis of recordings collected off the Western Australian coast. Underwater sounds produced by Australian killer whales were recorded during the months of February and March 2014 and 2015 in the Bremer Canyon in Western Australia. Vocalisations recorded included echolocation clicks, burst-pulse sounds and whistles. A total of 28 hours and 29 minutes were recorded and analysed, with 2376 killer whale calls (whistles and burst-pulse sounds) detected. Recordings of poor quality or low signal-to-noise ratio were excluded from analysis, resulting in 142 whistles and burst-pulse vocalisations suitable for analysis and categorisation. These were grouped based on their spectrographic features into nine Bremer Canyon (BC) “call types”. The frequency of the fundamental contours of all call types ranged from 600 Hz to 29 kHz. Calls ranged from 0.05 to 11.3 seconds in duration. Biosonar clicks were also recorded, but not studied further. Surface behaviours noted during acoustic recordings were categorised as either travelling or social behaviour. A detailed description of the acoustic characteristics is necessary for species acoustic identification and for the development of passive acoustic tools for population monitoring, including assessments of population status, habitat usage, migration patterns, behaviour and acoustic ecology. This study provides the first quantitative assessment and report on the acoustic features of killer whale vocalisations in Australian waters, and presents an opportunity to further investigate this little-known population. PMID:26352429

  20. Auditing The Completeness and Legibility of Computerized Radiological Request Forms.

    PubMed

    Al Muallem, Yahya; Al Dogether, Majed; Househ, Mowafa; Saddik, Basema

    2017-11-04

    Certain Saudi healthcare organizations transfer outpatients to medical imaging departments for radiological examinations in a manual process that relies on the use of paper-based forms. With the increased implementation of electronic medical records in Saudi hospitals, little is known about the completeness and legibility of information captured in electronic-based medical imaging forms. The purpose of this study is to audit the completeness and legibility of medical imaging paper-based forms in comparison with electronic-based medical imaging forms. As a secondary objective, we also examined the number of errors found on the forms. An observational retrospective cross-sectional study was used to audit the completeness and legibility of both paper and electronic forms collected between March 1 and May 15, 2015. The study measured the association among categorical variables using Chi-Square analysis. The results of this investigation show a significant association between form completion and type of record (i.e., paper vs. electronic), where electronic-based systems were found to be more complete than paper-based records. Electronic-based records were also found to improve form legibility, promote user adherence in completing the forms, and minimize entry errors. In conclusion, electronic-based medical imaging forms are more complete and legible than paper-based forms. Future studies should evaluate other hospitals, compare both legibility and completeness of electronic-based medical imaging forms, and conduct usability evaluation studies with users to explore the impacts of system design on both completeness and legibility of electronic forms in general, but more specifically electronic-based medical imaging forms.
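The Chi-Square association test used here can be sketched for a 2x2 contingency table, with rows for record type (paper vs. electronic) and columns for form completeness. The counts in the example are invented:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table,
    e.g. rows = record type (paper / electronic), columns = form
    complete / incomplete.  Pure-Python sketch; invented counts."""
    (a, b), (c, d) = table
    n = a + b + c + d
    observed = [[a, b], [c, d]]
    # Expected count in each cell: row total * column total / grand total
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))
```

For instance, `chi_square_2x2([[20, 5], [10, 15]])` gives about 8.33, which exceeds the 3.84 critical value for one degree of freedom at p = 0.05, so such a table would indicate a significant association between record type and completeness.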

  1. Operational problems of Haniwa net as a form of social capital: interdependence between human networks of physicians and information networks.

    PubMed

    Maeda, Minoru; Araki, Sanae; Suzuki, Muneou; Umemoto, Katsuhiro; Kai, Yukiko; Araki, Kenji

    2012-10-01

    In August 2009, Miyazaki Health and Welfare Network (Haniwa Net, hereafter referred to as "the Net"), centrally led by University of Miyazaki Hospital (UMH), adopted a center hospital-based system offering a unilateral linkage that enables the viewing of UMH's medical records through a web-based browser (electronic medical records (EMR)). By the end of December 2010, the network had developed into a system of 79 collaborating physicians from within the prefecture. Beginning in August 2010, physicians in 12 medical institutions were visited and asked to speak freely on the operational issues concerning the Net. Recordings and written accounts were coded using the text analysis software MAXQDA 10 to understand the actual state of operations. Analysis using Kendall's rank correlation confirmed that the interdependency between human networks and information networks is significant. At the same time, although negative opinions concerning the functions of the Net were somewhat conspicuous, the results showed a correlation among requests and proposals for operational improvements of the Net, clearly indicating the need for a more user-friendly system and a better viewer.
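Kendall's rank correlation, the statistic used here to test the interdependence between the human and information networks, can be sketched as a concordant/discordant pair count. This minimal version assumes no tied ranks:

```python
def kendall_tau(x, y):
    """Kendall's rank correlation coefficient between two paired
    score lists: (concordant pairs - discordant pairs) divided by
    the total number of pairs.  O(n^2) sketch, assuming no ties."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1   # both variables rank the pair the same way
            elif s < 0:
                discordant += 1   # the pair is ranked in opposite orders
    return (concordant - discordant) / (n * (n - 1) / 2)
```

A value near +1 indicates that institutions scoring high on one network tend to score high on the other, which is the kind of interdependence the study reports.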

  2. Defending sleepwalkers with science and an illustrative case.

    PubMed

    Cartwright, Rosalind D; Guilleminault, Christian

    2013-07-15

    To test whether laboratory-based research differentiating sleepwalkers (SW) from controls (C) can be applied in an uncontrolled forensic case as evidence that the alleged crime was committed during an arousal from sleep in which the mind is not fully conscious due to a SW disorder. A PSG study recorded 8 months after the defendant was charged was analyzed independently by spectral analysis. Slow wave activity (SWA) and cyclic alternating pattern (CAP) rates were computed. Clinical interviews and police records were reviewed for data regarding the defendant's sleep prior to the event and use of drugs, alcohol, and stimulants. The SWA distribution was abnormally low and flat, significantly lower than published controls; in the first NREM cycle, the CAP rate (55) was above normal. Two weeks of prior sleep deprivation was confirmed from interviews and the defendant's observed daytime sleepiness. Caffeine intake the day before the event was calculated at 826 mg over 14 hours. Snoring and a mild breathing disorder were present in the PSG. Testimony based on spectral analysis of a PSG recorded following an alleged criminal event supported a SW explanation for the non-rational behaviors charged. The defendant was acquitted of all charges and has been successfully treated.

  3. Classification of communication signals of the little brown bat

    NASA Astrophysics Data System (ADS)

    Melendez, Karla V.; Jones, Douglas L.; Feng, Albert S.

    2005-09-01

    Little brown bats, Myotis lucifugus, are known for their ability to echolocate and utilize their echolocation system to navigate, locate, and identify prey. Their echolocation signals have been characterized in detail, but their communication signals are poorly understood despite their widespread use during social interactions. The goal of this study was to characterize the communication signals of little brown bats. Sound recordings were made overnight on five individual bats (housed separately from a large group of captive bats) for 7 nights, using a Pettersson D240x ultrasound bat detector and a Nagra ARES-BB digital recorder. The spectral and temporal characteristics of recorded sounds were first analyzed using BATSOUND software from Pettersson. Sounds were first classified by visual observation of the calls' temporal pattern and spectral composition, and later using an automatic classification scheme based on multivariate statistical parameters in MATLAB. Human- and machine-based analysis revealed five discrete classes of the bats' communication signals: downward frequency-modulated calls, constant frequency calls, broadband noise bursts, broadband chirps, and broadband click trains. Future studies will focus on analysis of the calls' spectrotemporal modulations to discriminate any subclasses that may exist. [Research supported by Grant R01-DC-04998 from the National Institute for Deafness and Communication Disorders.]
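An automatic classification scheme of this kind can be sketched as nearest-centroid assignment over simple multivariate call features. The feature choices, centroid values, and class names below are invented for illustration (the study used multivariate statistics in MATLAB):

```python
import math

# Mean feature vector per call class: (peak frequency in kHz,
# duration in ms, bandwidth in kHz).  All values here are invented
# placeholders, not measured bat-call statistics.
CENTROIDS = {
    "FM sweep": (45.0, 3.0, 30.0),
    "constant frequency": (20.0, 10.0, 2.0),
    "click train": (60.0, 1.0, 40.0),
}

def classify_call(features, centroids=CENTROIDS):
    """Assign a call's feature vector to the class whose mean
    feature vector is nearest in Euclidean distance."""
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))
```

In practice the class centroids would be estimated from hand-labeled training calls, with features standardized so no single axis dominates the distance.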

  4. StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab

    NASA Astrophysics Data System (ADS)

    Grund, Michael

    2017-08-01

    SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting from single-event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside, on the ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time gives the user maximum flexibility in processing individual multi-event splitting measurements for a single recording station. Besides the provided functions of the plugin, no other external program is needed for the multi-event analyses, since StackSplit performs within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin are demonstrated with data from a long-running seismological recording station in Finland.

  5. Design of the SGML-based electronic patient record system with the use of object-oriented analysis methods.

    PubMed

    Kuikka, E; Eerola, A; Porrasmaa, J; Miettinen, A; Komulainen, J

    1999-01-01

    Since a patient record is typically a document updated by many users, required to be represented in many different layouts, and transferred from place to place, it is a good candidate for structured representation and coding using the SGML document standard. The use of SGML requires that the structure of the document be defined in advance by a Document Type Definition (DTD) and that the document follow it. This paper presents a method which derives an SGML DTD by starting from a description of the usage of the patient record in medical care and nursing.

  6. MATLAB-based automated patch-clamp system for awake behaving mice

    PubMed Central

    Siegel, Jennifer J.; Taylor, William; Chitwood, Raymond A.; Johnston, Daniel

    2015-01-01

    Automation has been an important part of biomedical research for decades, and the use of automated and robotic systems is now standard for such tasks as DNA sequencing, microfluidics, and high-throughput screening. Recently, Kodandaramaiah and colleagues (Nat Methods 9: 585–587, 2012) demonstrated, using anesthetized animals, the feasibility of automating blind patch-clamp recordings in vivo. Blind patch is a good target for automation because it is a complex yet highly stereotyped process that revolves around analysis of a single signal (electrode impedance) and movement along a single axis. Here, we introduce an automated system for blind patch-clamp recordings from awake, head-fixed mice running on a wheel. In its design, we were guided by 3 requirements: easy-to-use and easy-to-modify software; seamless integration of behavioral equipment; and efficient use of time. The resulting system employs equipment that is standard for patch recording rigs, moderately priced, or simple to make. It is written entirely in MATLAB, a programming environment that has an enormous user base in the neuroscience community and many available resources for analysis and instrument control. Using this system, we obtained 19 whole cell patch recordings from neurons in the prefrontal cortex of awake mice, aged 8–9 wk. Successful recordings had series resistances that averaged 52 ± 4 MΩ and required 5.7 ± 0.6 attempts to obtain. These numbers are comparable with those of experienced electrophysiologists working manually, and this system, written in a simple and familiar language, will be useful to many cellular electrophysiologists who wish to study awake behaving mice. PMID:26084901

  7. Using Spider-Web Patterns To Determine Toxicity

    NASA Technical Reports Server (NTRS)

    Noever, David A.; Cronise, Raymond J.; Relwani, Rachna A.

    1995-01-01

    Method of determining toxicities of chemicals involves recording and analysis of spider-web patterns. It is based on the observation that spiders exposed to various chemicals spin webs that differ, in various ways, from normal webs. The method is a potential alternative to toxicity testing on higher animals.

  8. Frequency-duration analysis of dissolved-oxygen concentrations in two southwestern Wisconsin streams

    USGS Publications Warehouse

    Greb, Steven R.; Graczyk, David J.

    2007-01-01

    Historically, dissolved-oxygen (DO) data have been collected in the same manner as other water-quality constituents, typically at infrequent intervals as a grab sample or an instantaneous meter reading. Recent years have seen an increase in continuous water-quality monitoring with electronic dataloggers. This new technique requires new approaches in the statistical analysis of the continuous record. This paper presents an application of frequency-duration analysis to the continuous DO records of a coldwater and a warmwater stream in rural southwestern Wisconsin. This method offers a quick, concise way to summarize large time-series databases in an easily interpretable manner. Even though the two streams had similar mean DO concentrations, frequency-duration analyses showed distinct differences in their DO-concentration regimes. This type of analysis also may be useful in relating DO concentrations to biological effects and in predicting low DO occurrences.
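One building block of a frequency-duration analysis of a continuous DO record can be sketched as extracting the duration of every excursion below a threshold; tabulating these run lengths across thresholds yields the frequency-duration summary. The series and threshold in the example are illustrative:

```python
def exceedance_durations(series, threshold):
    """Given a continuous DO record (one value per time step, mg/L),
    return the length of every run of consecutive values below the
    threshold.  Tabulating these run lengths over a range of
    thresholds gives a frequency-duration summary of low-DO events.
    Illustrative sketch; data and threshold are invented."""
    runs, current = [], 0
    for value in series:
        if value < threshold:
            current += 1          # still inside a low-DO excursion
        elif current:
            runs.append(current)  # excursion just ended; record its length
            current = 0
    if current:
        runs.append(current)      # record an excursion still open at the end
    return runs
```

This captures why two streams with the same mean DO can differ sharply: one may dip below a biological threshold in many short events, the other in a few long ones.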

  9. "Willing but unwilling": attitudinal barriers to adoption of home-based health information technology among older adults.

    PubMed

    Young, Rachel; Willis, Erin; Cameron, Glen; Geana, Mugur

    2014-06-01

    While much research focuses on adoption of electronic health-care records and other information technology among health-care providers, less research explores patient attitudes. This qualitative study examines barriers to adoption of home-based health information technology, particularly personal electronic health records, among older adults. We conducted in-depth interviews (30-90 min duration) with 35 American adults, aged 46-72 years, to determine their perceptions of and attitudes toward home-based health information technology. Analysis of interview data revealed that most barriers to adoption fell under four themes: technological discomfort, privacy or security concerns, lack of relative advantage, and perceived distance from the user representation. Based on our findings, systems to promote home-based health information technology should incorporate familiar computer applications, alleviate privacy and security concerns, and align with older adults' active and engaged self-image.

  10. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  11. The HISTMAG database: combining historical, archaeomagnetic and volcanic data

    NASA Astrophysics Data System (ADS)

    Arneitz, Patrick; Leonhardt, Roman; Schnepp, Elisabeth; Heilig, Balázs; Mayrhofer, Franziska; Kovacs, Peter; Hejda, Pavel; Valach, Fridrich; Vadasz, Gergely; Hammerl, Christa; Egli, Ramon; Fabian, Karl; Kompein, Niko

    2017-09-01

    Records of the past geomagnetic field fall into two main categories: instrumental historical observations on the one hand, and field estimates based on the magnetization acquired by rocks, sediments and archaeological artefacts on the other. In this paper, a new database combining historical, archaeomagnetic and volcanic records is presented. HISTMAG is a relational database, implemented in MySQL, and can be accessed via a web-based interface (http://www.conrad-observatory.at/zamg/index.php/data-en/histmag-database). It combines available global historical data compilations covering the last ∼500 yr with archaeomagnetic and volcanic data collections from the last 50 000 yr. Furthermore, new historical and archaeomagnetic records, mainly from central Europe, have been acquired. In total, 190 427 records are currently available in the HISTMAG database, the majority of them historical declination measurements (155 525). The original database structure was complemented by new fields that allow a detailed description of the different data types, and a user-comment function provides the possibility of scientific discussion about individual records. The HISTMAG database therefore supports thorough reliability and uncertainty assessments of the widely different data sets, which are an essential basis for geomagnetic field reconstructions. A database analysis revealed a systematic offset in declination records derived from compass roses on historical geographical maps, through comparison with other historical records, whereas maps created for mining activities proved a reliable source.

  12. Two-dimensional random surface model for asperity-contact in elastohydrodynamic lubrication

    NASA Technical Reports Server (NTRS)

    Coy, J. J.; Sidik, S. M.

    1979-01-01

    Relations for the asperity-contact time function during elastohydrodynamic lubrication of a ball bearing are presented. The analysis is based on a two-dimensional random surface model, and actual profile traces of the bearing surfaces are used as statistical sample records. The results of the analysis show that the transition from 90 percent contact to 1 percent contact occurs within a dimensionless film thickness range of approximately four to five. This thickness ratio is several times larger than values reported in the literature, where one-dimensional random surface models were used. It is shown that low-pass filtering of the statistical records brings the present results into agreement with those in the literature.

  13. Characterization and Pathogenicity of Alternaria vanuatuensis, a New Record from Allium Plants in Korea and China

    PubMed Central

    Li, Mei Jia; Deng, Jian Xin; Paul, Narayan Chandra

    2014-01-01

    Alternaria isolates from different Allium plants were characterized by multilocus sequence analysis. Based on sequences of the β-tubulin (BT2b), Alternaria allergen a1 (Alt a1), and RNA polymerase II second largest subunit (RPB2) genes, together with phylogenetic analysis, the isolates were divided into two groups, identical to representative isolates of A. porri (EGS48-147) and A. vanuatuensis (EGS45-018). The conidial characteristics and pathogenicity of A. vanuatuensis also supported the molecular findings. This is the first record of A. vanuatuensis E. G. Simmons & C. F. Hill from Korea and China. PMID:25606017

  14. You Learn by Your Mistakes. Effective Training Strategies Based on the Analysis of Video-Recorded Worked-Out Examples

    ERIC Educational Resources Information Center

    Cattaneo, Alberto A. P.; Boldrini, Elena

    2017-01-01

    This paper presents an empirical study on procedural learning from errors that was conducted within the field of vocational education. It examines whether, and to what extent, procedural learning can benefit more from the detection and written analysis of errors (experimental condition) than from the correct elements (control group). The study…

  15. The Impact of California's Class Size Reduction Initiative on Student Achievement: Detailed Findings from Eight School Districts.

    ERIC Educational Resources Information Center

    Mitchell, Douglas E.; Mitchell, Ross E.

    This report presents a comprehensive preliminary analysis of how California's Class Size Reduction (CSR) initiative has impacted student achievement during the first 2 years of implementation. The analysis is based on complete student, classroom, and teacher records from 26,126 students in 1,174 classrooms from 83 schools in 8 Southern California…

  16. Study on user interface of pathology picture archiving and communication system.

    PubMed

    Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom

    2014-01-01

    There is a need to improve the pathology workflow. A workflow task analysis was performed using a pathology picture archiving and communication system (Pathology PACS) in order to propose a user interface for the Pathology PACS that takes user experience into account. An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on the results, a user interface for the Pathology PACS was proposed. Hierarchical task analysis of the Pathology PACS identified 17 tasks: 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. Frequently used menu items were identified and schematized. The proposed user interface is a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.

  17. Impact of tailored feedback in assessment of communication skills for medical students.

    PubMed

    Uhm, Seilin; Lee, Gui H; Jin, Jeong K; Bak, Yong I; Jeoung, Yeon O; Kim, Chan W

    2015-01-01

    Finding effective ways of teaching and assessing communication skills remains a challenging part of medical education. This study explores the usefulness and effectiveness of providing additional feedback, based on qualitative analysis, in the assessment of communication skills in undergraduate medical training. We also examined the possibilities of using qualitative analysis to develop tailored strategies for improving communication skills training. The study was carried out on medical students (n=87) undergoing their final-year clinical performance examination on communication skills with a standardized patient; their performances were video-recorded and transcribed. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided. We assessed the level of acceptance of communication-skills scores between the study and non-study groups and, within the study group, before and after feedback based on qualitative analysis. There was a statistically significant increase in the level of acceptance after the additional feedback, with the percentage of agreement increasing from 15.4% to 80.8% (p<0.001). Incorporating feedback based on qualitative analysis into communication-skills assessment gives medical students essential information for learning and self-reflection, which could potentially lead to improved communication skills. As evident from our study, feedback becomes more meaningful and effective when supplemented with qualitative analysis.

  18. Impact of tailored feedback in assessment of communication skills for medical students

    PubMed Central

    Uhm, Seilin; Lee, Gui H.; Jin, Jeong K.; Bak, Yong I.; Jeoung, Yeon O.; Kim, Chan W.

    2015-01-01

    Background Finding effective ways of teaching and assessing communication skills remains a challenging part of medical education. This study explores the usefulness and effectiveness of providing additional feedback, based on qualitative analysis, in the assessment of communication skills in undergraduate medical training. We also examined the possibilities of using qualitative analysis to develop tailored strategies for improving communication skills training. Methods This study was carried out on medical students (n=87) undergoing their final-year clinical performance examination on communication skills with a standardized patient; their performances were video-recorded and transcribed. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided. We assessed the level of acceptance of communication-skills scores between the study and non-study groups and, within the study group, before and after feedback based on qualitative analysis. Results There was a statistically significant increase in the level of acceptance after the additional feedback, with the percentage of agreement increasing from 15.4% to 80.8% (p<0.001). Conclusions Incorporating feedback based on qualitative analysis into communication-skills assessment gives medical students essential information for learning and self-reflection, which could potentially lead to improved communication skills. As evident from our study, feedback becomes more meaningful and effective when supplemented with qualitative analysis. PMID:26154864

  19. Smart Extraction and Analysis System for Clinical Research.

    PubMed

    Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung

    2017-05-01

    With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issues of data preparation, structuring, and sharing. We developed a robust analysis system called the smart extraction and analysis system (SEAS), which consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for descriptive and predictive analysis that compiles survival statistics and predicts the future chance of survival. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS was evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system is 99% for semistructured text and 97% for unstructured text. Furthermore, the automated, unstructured information extraction reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients were found in a terminal or dead state at the highest clinical stage of disease (level IV). Similarly, there was an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic information extraction and survival prediction. SEAS reduces the time and effort human resources spend unnecessarily on manual tasks.
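    The SAS subsystem's descriptive output (survival proportions by clinical stage and risk-factor status) can be illustrated with a small sketch. The cohort below is entirely synthetic and the effect sizes are invented; this is a stand-in for the kind of statistics the abstract reports, not the SEAS CART model itself:

```python
import numpy as np

# Synthetic cohort (hypothetical; the real SEAS input is clinical records)
rng = np.random.default_rng(1)
n = 500
stage = rng.integers(1, 5, n)        # clinical stage I..IV
risk = rng.integers(0, 2, n)         # any lifestyle risk factor present?
# Invented ground truth: survival probability falls with stage and risk
p_alive = np.clip(0.92 - 0.20 * (stage - 1) - 0.15 * risk, 0.0, 1.0)
alive = rng.random(n) < p_alive      # simulated outcome

# Descriptive survival statistics of the kind the SAS subsystem compiles
for s in range(1, 5):
    m = stage == s
    print(f"stage {s}: n={m.sum():3d}  alive={100 * alive[m].mean():5.1f}%")
print(f"risk-factor positive, alive: {100 * alive[risk == 1].mean():5.1f}%")
```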

  20. Stormwater Characterization and Lagoon Sediment Analysis, Grand Forks Air Force Base, North Dakota

    DTIC Science & Technology

    1990-08-01

    tetrachloroethylene, and 0.0026 mg/l ethyl benzene. Analyses showed no pesticides. 4. Extraction Procedure (EP) Analysis. An AFOEHL contractor performed EP extraction ...runoff met North Dakota state stream standards. Lagoon sediment did not contain Extraction Procedure hazardous chemicals. Stormwater runoff exceeded...Standards for Water Quality for the State of North Dakota (Extracts) 39 D Site/Analysis Summary 69 E Lift Station Flow Records 73 F Wastewater

  1. 76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-06

    ... network. Vetting requests, analyses, and results will be stored separately on a classified computer... DEPARTMENT OF STATE [Public Notice 7709] Privacy Act; System of Records: State-78, Risk Analysis... a system of records, Risk Analysis and Management Records, State-78, pursuant to the provisions of...

  2. Distinguishing low frequency oscillations within the 1/f spectral behaviour of electromagnetic brain signals.

    PubMed

    Demanuele, Charmaine; James, Christopher J; Sonuga-Barke, Edmund Js

    2007-12-10

    It has been acknowledged that the frequency spectrum of measured electromagnetic (EM) brain signals shows a decrease in power with increasing frequency. This spectral behaviour may lead to difficulty in distinguishing event-related peaks from ongoing brain activity in the electro- and magnetoencephalographic (EEG and MEG) signal spectra. This can become an issue especially in the analysis of low frequency oscillations (LFOs) - below 0.5 Hz - which are currently being observed in signal recordings linked with specific pathologies such as epileptic seizures or attention deficit hyperactivity disorder (ADHD), in sleep studies, etc. In this work we propose a simple method that can be used to compensate for this 1/f trend hence achieving spectral normalisation. This method involves filtering the raw measured EM signal through a differentiator prior to further data analysis. Applying the proposed method to various exemplary datasets including very low frequency EEG recordings, epileptic seizure recordings, MEG data and Evoked Response data showed that this compensating procedure provides a flat spectral base onto which event related peaks can be clearly observed. Findings suggest that the proposed filter is a useful tool for the analysis of physiological data especially in revealing very low frequency peaks which may otherwise be obscured by the 1/f spectral activity inherent in EEG/MEG recordings.
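    The proposed compensation can be sketched with a plain first difference, which acts as a simple digital differentiator. The synthetic 1/f background, the buried 0.3 Hz oscillation, and all parameters below are illustrative, not from the paper's datasets:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 250.0, 8192
t = np.arange(n) / fs

# Synthesize a 1/f background by shaping white noise in the frequency domain
freqs = np.fft.rfftfreq(n, 1 / fs)
spec = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
spec[1:] /= np.sqrt(freqs[1:])          # amplitude ~ f^-1/2  =>  power ~ 1/f
spec[0] = 0.0
eeg = np.fft.irfft(spec, n) + 0.05 * np.sin(2 * np.pi * 0.3 * t)  # buried 0.3 Hz LFO

def loglog_slope(x):
    """Least-squares slope of log power vs log frequency (0.1-40 Hz band)."""
    p = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(x.size, 1 / fs)
    keep = (f > 0.1) & (f < 40.0)
    return np.polyfit(np.log(f[keep]), np.log(p[keep]), 1)[0]

prewhitened = np.diff(eeg)              # first difference ~ differentiator filter
print(f"raw slope: {loglog_slope(eeg):+.2f}, filtered slope: {loglog_slope(prewhitened):+.2f}")
```

    The raw record shows the expected negative log-log slope (power falling with frequency), while the differentiated record is much flatter, so low-frequency peaks are no longer dwarfed by the 1/f trend.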

  3. [Questions on the first operation with ethyl ether as anaesthetic by Dr. Peter Parker].

    PubMed

    Chen, Q

    2017-01-28

    Ethyl ether was the first effective general anaesthetic to gain acceptance. It was introduced into China by an American missionary, Dr. Peter Parker, in one of the historical events of medical communication between China and the West. In his records of the first operation with ether, however, Dr. Parker unusually omitted the patient's medical record number and the date of the operation, while this information was available for all other operations with ether anaesthesia. This was very unusual for a doctor like Peter Parker, who always recorded every important case in detail in the hospital reports; it seems he omitted the information deliberately rather than carelessly. Based on an analysis of Parker's reports, we conclude that the anaesthetic was in fact ineffective in this case. Possible explanations are outlined and examined through a discussion grounded in the situation Parker faced in the late Qing era.

  4. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, all bedside monitoring equipment is connected into the system using information-integrating technology. After statistical analysis of the patient data by data-mining technology, patient status can be evaluated automatically against a risk-prediction standard and a decision support system, so that the anesthetist can perform reasonable and safe clinical procedures. With clinical processes recorded electronically, standard record tables can be generated and the clinical workflow is optimized as well. With the system, patient data of various kinds can be collected, stored, analyzed and archived; anesthesia documents can be generated; and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical documentation.

  5. Semantic extraction and processing of medical records for patient-oriented visual index

    NASA Astrophysics Data System (ADS)

    Zheng, Weilin; Dong, Wenjie; Chen, Xiangjiao; Zhang, Jianguo

    2012-02-01

    To gain a comprehensive and complete understanding of a patient's healthcare status, doctors need to search for patient medical records in different healthcare information systems, such as PACS, RIS, HIS and USIS, as a reference for diagnosis and treatment decisions. However, these procedures are time-consuming and tedious. To solve this problem, we developed a patient-oriented visual index system (VIS) that uses visualization technology to show health status and to retrieve the patient's examination information stored in each system through a 3D human model. In this presentation, we describe a new approach to extracting semantic and characteristic information from medical record systems such as RIS/USIS to create the 3D visual index. The approach includes the following steps: (1) building a medical characteristic semantic knowledge base; (2) developing a natural language processing (NLP) engine to perform semantic analysis and logical judgment on text-based medical records; (3) applying the knowledge base and NLP engine to medical records to extract medical characteristics (e.g., positive focus information), and then mapping the extracted information to the related organs/parts of the 3D human model to create the visual index. We tested the procedure on 559 radiological reports containing 853 focuses and correctly extracted information for 828 of them, a focus-extraction success rate of about 97.1%.

  6. 2011 Tohoku tsunami hydrographs, currents, flow velocities and ship tracks based on video and TLS measurements

    NASA Astrophysics Data System (ADS)

    Fritz, Hermann M.; Phillips, David A.; Okayasu, Akio; Shimozono, Takenori; Liu, Haijiang; Takeda, Seiichi; Mohammed, Fahad; Skanavis, Vassilis; Synolakis, Costas E.; Takahashi, Tomoyuki

    2013-04-01

    The March 11, 2011, magnitude Mw 9.0 earthquake off the Tohoku coast of Japan caused catastrophic damage and loss of life to a tsunami-aware population. The mid-afternoon tsunami arrival, combined with survivors equipped with cameras on top of vertical evacuation buildings, provided fragmented, spatially and temporally resolved inundation recordings. This report focuses on the surveys at 9 tsunami eyewitness video recording locations in Miyako, Kamaishi, Kesennuma and Yoriisohama along Japan's Sanriku coast and the subsequent video image calibration, processing, tsunami hydrograph and flow velocity analysis. Selected tsunami video recording sites were explored, eyewitnesses interviewed and some ground control points recorded during the initial tsunami reconnaissance in April 2011. A follow-up survey in June 2011 focused on terrestrial laser scanning (TLS) at locations with high-quality eyewitness videos. We acquired precise topographic data using TLS at the video sites, producing a 3-dimensional "point cloud" dataset; a camera mounted on the Riegl VZ-400 scanner yields photorealistic 3D images, and integrated GPS measurements allow accurate georeferencing. The original video recordings were recovered from eyewitnesses and the Japanese Coast Guard (JCG). The analysis of the tsunami videos follows an adapted four-step procedure originally developed for the analysis of 2004 Indian Ocean tsunami videos at Banda Aceh, Indonesia (Fritz et al., 2006). The first step requires the calibration of the sector of view present in the eyewitness video recording based on ground control points measured in the LiDAR data. In a second step, the video image motion induced by the panning of the video camera is determined from subsequent images by particle image velocimetry (PIV) applied to fixed objects. The third step involves the transformation of the raw tsunami video images from image coordinates to world coordinates with a direct linear transformation (DLT) procedure. Finally, the instantaneous tsunami surface current and flooding velocity vector maps are determined by applying the digital PIV analysis method to the rectified tsunami video images with floating debris clusters. Tsunami currents up to 11 m/s were measured in Kesennuma Bay, making navigation impossible (Fritz et al., 2012). Tsunami hydrographs are derived from the videos based on water surface elevations at surface-piercing objects identified in the acquired topographic TLS data. Apart from a dominant tsunami crest, the hydrograph at Kamaishi also reveals a subsequent drawdown to minus 10 m, exposing the harbor bottom. In some cases ship moorings resisted the main tsunami crest only to be broken by the extreme drawdown, setting vessels adrift for hours. Further, we discuss the complex effects of coastal structures on inundation and outflow hydrographs and flow velocities. Lastly, a perspective on the recovery and reconstruction process is provided, based on numerous revisits of identical sites between April 2011 and July 2012.
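    The DLT step in the third stage maps image coordinates to world coordinates. A minimal planar-homography sketch of that transformation is shown below; the correspondences and the "true" matrix are invented for illustration, and real tsunami-video work would use surveyed ground control points:

```python
import numpy as np

def dlt_homography(img_pts, world_pts):
    """Estimate the 3x3 planar homography H (image -> world) from >= 4
    point correspondences via the direct linear transformation (DLT)."""
    rows = []
    for (x, y), (X, Y) in zip(img_pts, world_pts):
        rows.append([x, y, 1, 0, 0, 0, -x * X, -y * X, -X])
        rows.append([0, 0, 0, x, y, 1, -x * Y, -y * Y, -Y])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 3)      # null-space vector = flattened H (up to scale)

def apply_h(H, pts):
    """Apply a homography to Nx2 points (homogeneous divide at the end)."""
    p = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Invented ground-control correspondences: pixel coords -> metric world coords
H_true = np.array([[1.2, 0.1, 5.0], [0.0, 0.9, -2.0], [1e-3, 0.0, 1.0]])
img = np.array([[0., 0.], [100., 0.], [100., 80.], [0., 80.], [50., 40.]])
world = apply_h(H_true, img)

H_est = dlt_homography(img, world)
print("max reprojection error:", np.max(np.abs(apply_h(H_est, img) - world)))
```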

  7. Characterisation of slope directional resonance by analysing ambient noise instantaneous polarisation

    NASA Astrophysics Data System (ADS)

    Del Gaudio, Vincenzo; Wasowski, Janusz

    2014-05-01

    Several studies have shown that the dynamic response of landslide-prone slopes to seismic shaking can play an important role in failure triggering during earthquakes. It has also been demonstrated that slope seismic response is often characterised by directional resonance phenomena. Directivity can be revealed by the analysis of ambient noise recordings using the HVNR method, based on the azimuthal variation of spectral ratios between the spectral amplitudes of the horizontal (H) and vertical (V) components of the noise recording. Directional resonance is then revealed by a preferential polarisation of H/V ratio peaks, whose frequencies correspond to resonance frequencies and whose amplitudes depend on the impedance contrast between surface material and bedrock. H/V ratio amplitudes can potentially provide information on amplification factors as well; however, the relation is not straightforward, depending on the nature of the waves contributing to the ambient noise. Thus, it is desirable to distinguish different kinds of noise wave packets and, in particular, to isolate the contribution of Rayleigh waves, which appear to better reflect site response properties. To identify Rayleigh wave packets in noise recordings, a new approach was tested, based on a technique of instantaneous polarisation analysis. The results are promising for the investigation of site-response directional properties, particularly under complex site conditions, where resonance can be characterised by multiple anisotropic peaks. In our preliminary tests of noise recordings at a site on a landslide-affected slope, only a small fraction of data samples (on the order of 1%) were identified as Rayleigh-type waves, likely because the noise recording was dominated by overlapping signals with different kinds of polarisation. Thus, Rayleigh polarisation could be recognised only when the energy of this kind of wave was prevalent. However, a relatively short noise recording (on the order of 30-45 minutes) yields a large number (on the order of thousands) of estimates of H/V amplitude and azimuth, providing robust statistics for recognising ground vibration properties that reflect site response. Tests at sites where directional resonance properties had been verified through the analysis of seismic event recordings showed that more coherent H/V ratios and directivity estimates are obtained by selecting Rayleigh-type data samples than by analysing the entire data set or SH-type wave packets. This offers the possibility of reducing the uncertainties in data interpretation related to the nature of the noise wavefield.
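    The H/V spectral-ratio computation at the heart of the HVNR method can be sketched as follows. The three-component noise and the 2 Hz horizontal resonance are synthetic and purely illustrative (real HVNR practice also involves window selection, smoothing and azimuthal rotation, omitted here):

```python
import numpy as np

def hv_ratio(north, east, vertical, fs, nseg=1024):
    """Mean H/V spectral ratio over non-overlapping Hann-windowed segments."""
    def psd(x):
        segs = x[: x.size // nseg * nseg].reshape(-1, nseg) * np.hanning(nseg)
        return np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0)
    h = np.sqrt((psd(north) + psd(east)) / 2.0)   # quadratic mean of horizontals
    return np.fft.rfftfreq(nseg, 1 / fs), h / np.sqrt(psd(vertical))

# Synthetic ambient noise with a horizontal resonance near 2 Hz (illustrative)
rng = np.random.default_rng(2)
fs, n = 100.0, 60_000
t = np.arange(n) / fs
resonance = 3.0 * np.sin(2 * np.pi * 2.0 * t)
north = rng.normal(0, 1, n) + resonance
east = rng.normal(0, 1, n) + resonance
vertical = rng.normal(0, 1, n)

f, ratio = hv_ratio(north, east, vertical, fs)
print(f"H/V peak at {f[np.argmax(ratio)]:.2f} Hz")
```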

  8. Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces.

    PubMed Central

    Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.

    1997-01-01

    This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data, including video recordings of health care workers as they interact with systems such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively based usability analyses are detailed, and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620

  9. Automatic identification of artifacts in electrodermal activity data.

    PubMed

    Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind

    2015-01-01

    Recently, wearable devices have allowed for long-term, ambulatory measurement of electrodermal activity (EDA). Although ambulatory recordings can be noisy and recording artifacts can easily be mistaken for physiological responses during analysis, to date there has been no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts and provides an empirical evaluation of its classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.

  10. Relative effects of posture and activity on human height estimation from surveillance footage.

    PubMed

    Ramstrand, Nerrolyn; Ramstrand, Simon; Brolund, Per; Norell, Kristin; Bergström, Peter

    2011-10-10

    Height estimations based on security camera footage are often requested by law enforcement authorities. While valid and reliable techniques have been established to determine vertical distances from video frames, there is a discrepancy between a person's true static height and their height as measured when assuming different postures or when in motion (e.g., walking). The aim of the research presented in this report was to accurately record the height of subjects as they performed a variety of activities typically observed in security camera footage and to compare the results to height recorded using a standard height-measuring device. Forty-six able-bodied adults participated in this study and were recorded using a 3D motion analysis system while performing eight different tasks. Height measurements captured using the 3D motion analysis system were compared to static height measurements in order to determine relative differences. It is anticipated that the results presented in this report can be used by forensic image analysis experts as a basis for correcting height estimations of people captured on surveillance footage. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  11. Generalized Skew Coefficients of Annual Peak Flows for Rural, Unregulated Streams in West Virginia

    USGS Publications Warehouse

    Atkins, John T.; Wiley, Jeffrey B.; Paybins, Katherine S.

    2009-01-01

    Generalized skew was determined from analysis of records from 147 streamflow-gaging stations in or near West Virginia. The analysis followed guidelines established by the Interagency Advisory Committee on Water Data in Bulletin 17B, except that stations having 50 or more years of record were used instead of the less restrictive recommendation of 25 or more years of record. The generalized-skew analysis included contouring, averaging, and regression of station skews, with the best method considered the one with the smallest mean square error (MSE), defined here as the mean of the squared deviations of the individual base-10 logarithms of peak flow from the mean of all such logarithms: MSE = (1/n) Σ [log10(Qi) − mean(log10 Q)]². Contouring of station skews was the best method for determining generalized skew for West Virginia, with an MSE of about 0.2174, an improvement over the MSE of about 0.3025 for the national map presented in Bulletin 17B.
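    The MSE criterion described in the abstract reduces to the mean squared deviation of the log10 peak flows from their mean. A small sketch, with an invented annual peak-flow record (not a West Virginia station):

```python
import numpy as np

def mse_log_peaks(peaks):
    """MSE as defined in the abstract: mean squared deviation of the
    base-10 logarithms of annual peak flows from their mean."""
    logs = np.log10(np.asarray(peaks, dtype=float))
    return float(np.mean((logs - logs.mean()) ** 2))

annual_peaks_cfs = [1200, 980, 2100, 1750, 860, 3020, 1430]  # hypothetical record
print(round(mse_log_peaks(annual_peaks_cfs), 4))
```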

  12. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extraction methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was 43% lower than in previous ECG clustering schemes. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
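    The abstract does not spell out the least-squares weighting itself, so the following is only a generic stand-in: an unsupervised relevance score built from PCA loadings, which likewise outputs a feature weighting vector and a reduced feature subset (function name and scheme are our illustration, not the authors' method):

```python
import numpy as np

def pca_feature_weights(X, n_keep=3):
    """Unsupervised feature relevance from PCA loadings: each feature
    is weighted by how strongly it loads on high-variance directions
    (equivalently, its share of the total variance)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2 / (len(X) - 1)          # variance along each component
    w = (var[:, None] * Vt ** 2).sum(axis=0)
    keep = np.argsort(w)[::-1][:n_keep]  # indices of most relevant features
    return w, keep

# toy demo: one feature with clearly dominant variance
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
X[:, 0] *= 10.0
w, keep = pca_feature_weights(X, n_keep=3)
```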

  13. SSVEP recognition using common feature analysis in brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2015-04-15

    Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) applications. Although the CCA method outperforms traditional power spectral density analysis through multi-channel detection, it additionally requires pre-constructed reference signals of sine-cosine waves. It is also prone to overfitting when using a short time window, since the reference signals include no features from the training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency from the same subject should share some common features that may bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method to exploit the latent common features as natural reference signals when using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and MCCA methods for SSVEP recognition when using a short time window (i.e., less than 1 s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective While the secondary use of medical data has gained attention, its adoption has been constrained by the need to protect patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concern rare diseases, so rigorous security management measures are required. Materials and methods Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue with secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results Testing confirmed that our system could correctly complete computation of the average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computations including conditional expressions and/or comparisons of values, for example, the t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in an encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’, which works interactively, while secure computation protocols generally require a significant amount of processing time. Conclusions We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677

  15. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained by the need to protect patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concern rare diseases, so rigorous security management measures are required. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue with secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of the average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computations including conditional expressions and/or comparisons of values, for example, the t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in an encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R', which works interactively, while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
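    The secret-sharing primitive underlying such systems can be illustrated with additive shares: each server holds one random-looking share per record, yet share-wise sums reconstruct the total, so aggregate statistics are computed without any single server seeing a record. A minimal sketch (the field size and names are our illustrative choices; the actual system integrates with 'R' and a full protocol):

```python
import random

PRIME = 2 ** 61 - 1  # illustrative field modulus

def share(value, n=3):
    """Split an integer into n additive shares modulo PRIME; any
    subset of fewer than n shares reveals nothing about the value."""
    parts = [random.randrange(PRIME) for _ in range(n - 1)]
    parts.append((value - sum(parts)) % PRIME)
    return parts

def reconstruct(parts):
    return sum(parts) % PRIME

# each of 3 servers holds one share of every record; summing locally
# yields shares of the total, so the average is computed without any
# single server ever seeing an individual medical record
records = [120, 340, 95]
per_server = list(zip(*[share(v) for v in records]))
sum_shares = [sum(col) % PRIME for col in per_server]
total = reconstruct(sum_shares)
average = total / len(records)
```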

  16. Review of health maintenance program findings, 1960-1974

    NASA Technical Reports Server (NTRS)

    White, E. S.

    1975-01-01

    A preliminary analysis of the employees' examination records in the automated medical data base at the NASA Wallops Flight Center, Va., is presented, with an emphasis on the primary mission of the program: the early detection and control of cardiovascular disease.

  17. Monitoring the fetal heart rate variability during labor.

    PubMed

    Moslem, B; Mohydeen, A; Bazzi, O

    2015-08-01

    In line with the main goal of our ongoing work of estimating heart rate variability (HRV) from fetal electrocardiogram (FECG) signals to monitor the health of the fetus, we investigate in this paper the possibility of extracting the fetal HRV directly from abdominal composite recordings. Our proposed approach is based on a combination of two techniques: periodic component analysis (PiCA) and recursive least squares (RLS) adaptive filtering. The fetal HRV of the estimated FECG signal is compared to a reference value extracted from an FECG signal recorded using a spiral electrode attached directly to the fetal scalp. The results obtained show that the fetal HRV can be evaluated directly from the abdominal composite recordings without the need to record an external reference signal.
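    The RLS stage of such a pipeline can be sketched in a few lines: predict the target channel from a reference signal and keep the prediction error as the cleaned output. This is a textbook RLS implementation for illustration, not the authors' code, and the toy signals are invented:

```python
import numpy as np

def rls_cancel(d, x, order=4, lam=0.99, delta=0.01):
    """Recursive least squares adaptive filter: predict d[k] from the
    last `order` samples of x; the error e = d - y is what remains of
    d after the component correlated with x has been cancelled."""
    w = np.zeros(order)
    P = np.eye(order) / delta      # large initial inverse correlation matrix
    xb = np.zeros(order)           # regressor buffer (most recent first)
    e = np.zeros(len(d))
    for k in range(len(d)):
        xb = np.concatenate(([x[k]], xb[:-1]))
        y = w @ xb
        e[k] = d[k] - y
        g = P @ xb / (lam + xb @ P @ xb)
        w = w + g * e[k]
        P = (P - np.outer(g, xb @ P)) / lam
    return e, w

# toy demo: d is an FIR-filtered copy of x, so RLS should cancel it
rng = np.random.default_rng(1)
x = rng.standard_normal(500)
d = 0.5 * x - 0.2 * np.concatenate(([0.0], x[:-1]))
e, w = rls_cancel(d, x)
```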

  18. Shear-wave velocity compilation for Northridge strong-motion recording sites

    USGS Publications Warehouse

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.

  19. Hospitalization costs of severe bacterial pneumonia in children: comparative analysis considering different costing methods

    PubMed Central

    Nunes, Sheila Elke Araujo; Minamisava, Ruth; Vieira, Maria Aparecida da Silva; Itria, Alexander; Pessoa, Vicente Porfirio; de Andrade, Ana Lúcia Sampaio Sgambatti; Toscano, Cristiana Maria

    2017-01-01

    ABSTRACT Objective To determine and compare hospitalization costs of bacterial community-acquired pneumonia cases via different costing methods from the Brazilian Public Unified Health System perspective. Methods Cost-of-illness study based on primary data collected from a sample of 59 children aged between 28 days and 35 months and hospitalized due to bacterial pneumonia. Direct medical and non-medical costs were considered, and three costing methods were employed: micro-costing based on medical record review, micro-costing based on therapeutic guidelines and gross-costing based on the Brazilian Public Unified Health System reimbursement rates. Cost estimates obtained via the different methods were compared using the Friedman test. Results Cost estimates of inpatient cases of severe pneumonia amounted to R$ 780,70/$Int. 858.7 (medical record review), R$ 641,90/$Int. 706.90 (therapeutic guidelines) and R$ 594,80/$Int. 654.28 (Brazilian Public Unified Health System reimbursement rates). Costs estimated via micro-costing (medical record review or therapeutic guidelines) did not differ significantly (p=0.405), while estimates based on reimbursement rates were significantly lower compared to estimates based on therapeutic guidelines (p<0.001) or record review (p=0.006). Conclusion Brazilian Public Unified Health System costs estimated via different costing methods differ significantly, with gross-costing yielding lower cost estimates. Given that costs estimated by the different micro-costing methods are similar, and that the method based on therapeutic guidelines is easier to apply and less expensive, it may be a valuable alternative for estimating hospitalization costs of bacterial community-acquired pneumonia in children. PMID:28767921

  20. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.

  1. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029

  2. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  3. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  4. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  5. 40 CFR 312.26 - Reviews of Federal, State, Tribal, and local government records.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... records or data bases of government records of the subject property and adjoining properties must be... data bases of such government records and local government records and data bases of such records... registered, or state-permitted or registered waste management activities. Such records or data bases that may...

  6. Dynamic Photorefractive Memory and its Application for Opto-Electronic Neural Networks.

    NASA Astrophysics Data System (ADS)

    Sasaki, Hironori

    This dissertation describes the analysis of photorefractive crystal dynamics and its application to opto-electronic neural network systems. The realization of a dynamic photorefractive memory is investigated in terms of the following aspects: fast memory update, uniform grating multiplexing schedules, and the prevention of partial erasure of existing gratings. The fast memory update is realized by the selective erasure process, which superimposes a new grating on the original one with an appropriate phase shift. The dynamics of the selective erasure process is analyzed using the first-order photorefractive material equations and experimentally confirmed. The effects of beam coupling and fringe bending on the selective erasure dynamics are also analyzed by numerically solving a combination of coupled wave equations and the photorefractive material equation. An incremental recording technique is proposed as a uniform grating multiplexing schedule and compared with the conventional scheduled recording technique in terms of phase distribution in the presence of an external dc electric field, as well as image gray scale dependence. The theoretical analysis and experimental results proved the superiority of incremental recording over scheduled recording. A novel recirculating information memory architecture is proposed and experimentally demonstrated to prevent partial degradation of existing gratings when the memory is accessed. Gratings are circulated through a memory feedback loop based on the incremental recording dynamics, demonstrating robust read/write/erase capabilities. The dynamic photorefractive memory is applied to opto-electronic neural network systems. A module architecture based on the page-oriented dynamic photorefractive memory is proposed. This module architecture can implement two complementary interconnection organizations, fan-in and fan-out. 
The module system scalability and the learning capabilities are theoretically investigated using the photorefractive dynamics described in previous chapters of the dissertation. The implementation of the feed-forward image compression network with 900 input and 9 output neurons with 6-bit interconnection accuracy is experimentally demonstrated. Learning of the Perceptron network that determines sex based on input face images of 900 pixels is also successfully demonstrated.

  7. Holocene vegetation and climate dynamics of NE China based on the pollen record from Sihailongwan Maar Lake

    NASA Astrophysics Data System (ADS)

    Stebich, Martina; Rehfeld, Kira; Schlütz, Frank; Tarasov, Pavel E.; Liu, Jiaqi; Mingram, Jens

    2015-09-01

    High-resolution palynological analysis on annually laminated sediments of Sihailongwan Maar Lake (SHL) provides new insights into the Holocene vegetation and climate dynamics of NE China. The robust chronology of the presented record is based on varve counting and AMS radiocarbon dates from terrestrial plant macro-remains. In addition to the qualitative interpretation of the pollen data, we provide quantitative reconstructions of vegetation and climate based on the method of biomization and weighted averaging partial least squares regression (WA-PLS) technique, respectively. Power spectra were computed to investigate the frequency domain distribution of proxy signals and potential natural periodicities. Pollen assemblages, pollen-derived biome scores and climate variables as well as the cyclicity pattern indicate that NE China experienced significant changes in temperature and moisture conditions during the Holocene. Within the earliest phase of the Holocene, a large-scale reorganization of vegetation occurred, reflecting the reconstructed shift towards higher temperatures and precipitation values and the initial Holocene strengthening and northward expansion of the East Asian summer monsoon (EASM). Afterwards, summer temperatures remain at a high level, whereas the reconstructed precipitation shows an increasing trend until approximately 4000 cal. yr BP. Since 3500 cal. yr BP, temperature and precipitation values decline, indicating moderate cooling and weakening of the EASM. A distinct periodicity of 550-600 years and evidence of a Mid-Holocene transition from a temperature-triggered to a predominantly moisture-triggered climate regime are derived from the power spectra analysis. The results obtained from SHL are largely consistent with other palaeoenvironmental records from NE China, substantiating the regional nature of the reconstructed vegetation and climate patterns. 
However, the reconstructed climate changes contrast with the moisture evolution recorded in S China and the mid-latitude (semi-)arid regions of N China. Whereas a clear insolation-related trend of monsoon intensity over the Holocene is lacking from the SHL record, variations in the coupled atmosphere-Pacific Ocean system can largely explain the reconstructed changes in NE China.

  8. Object-based image analysis for cadastral mapping using satellite images

    NASA Astrophysics Data System (ADS)

    Kohli, D.; Crommelinck, S.; Bennett, R.; Koeva, M.; Lemmen, C.

    2017-10-01

    Cadasters, together with land registries, form a core ingredient of any land administration system. Cadastral maps comprise the extent, ownership and value of land, which are essential for recording and updating land records. Traditional methods for cadastral surveying and mapping often prove to be labor, cost and time intensive; alternative approaches are thus being researched for creating such maps. With the advent of very high resolution (VHR) imagery, satellite remote sensing offers a tremendous opportunity for (semi-)automation of cadastral boundary detection. In this paper, we explore the potential of the object-based image analysis (OBIA) approach for this purpose by applying two segmentation methods, i.e. MRS (multi-resolution segmentation) and ESP (estimation of scale parameter), to identify visible cadastral boundaries. Results show that a balance between a high percentage of completeness and correctness is hard to achieve: a low error of commission often comes with a high error of omission. However, we conclude that the resulting segments/land use polygons can potentially be used as a base for further aggregation into tenure polygons using participatory mapping.

  9. Analytical report of the 2016 dengue outbreak in Córdoba city, Argentina.

    PubMed

    Rotela, Camilo; Lopez, Laura; Frías Céspedes, María; Barbas, Gabriela; Lighezzolo, Andrés; Porcasi, Ximena; Lanfri, Mario A; Scavuzzo, Carlos M; Gorla, David E

    2017-11-06

    After elimination of the Aedes aegypti vector in South America in the 1960s, dengue outbreaks started to reoccur during the 1990s; strongly in Argentina since 1998. In 2016, Córdoba City had the largest dengue outbreak in its history. In this article we report on this outbreak, including a spatio-temporal analysis of cases and vectors in the city. A total of 653 dengue cases were recorded by the laboratory-based dengue surveillance system and georeferenced by their residential addresses. Case maps were generated from epidemiological week 1 (beginning of January) to week 19 (mid-May). The temporal evolution of the dengue outbreak was analysed globally, and three specific, high-incidence zones were detected using Knox analysis to characterise their spatio-temporal attributes. Field and remotely sensed data were collected and analysed in real time, and a vector presence map based on the MaxEnt approach was generated to define hotspots, towards which the pesticide-based strategy was then targeted. The recorded pattern of case evolution within the community suggests that dengue control measures should be improved.

  10. An Analysis of a Computerized System for Managing Curriculum Decisions and Tracking Student Progress in a Home-Based Pre-School Education Project.

    ERIC Educational Resources Information Center

    Lutz, John E.; And Others

    The degree of success of the computerized Child-Based Information System (CBIS) was analyzed in two areas--presenting, delivering, and managing a developmental curriculum; and recording, filing, and monitoring child tracking data, including requirements for Individualized Education Plans (IEP's). Preschool handicapped and high-risk children and…

  11. Peer Assessment of Webpage Design: Behavioral Sequential Analysis Based on Eye-Tracking Evidence

    ERIC Educational Resources Information Center

    Hsu, Ting-Chia; Chang, Shao-Chen; Liu, Nan-Cen

    2018-01-01

    This study employed an eye-tracking machine to record the process of peer assessment. Each web page was divided into several regions of interest (ROIs) based on the frame design and content. A total of 49 undergraduate students with a visual learning style participated in the experiment. This study investigated the peer assessment attitudes of the…

  12. The Feasibility of Personal Digital Assistants (PDAs) to Collect Dietary Intake Data in Low-Income Pregnant Women

    ERIC Educational Resources Information Center

    Fowles, Eileen R.; Gentry, Breine

    2008-01-01

    Objectives: To determine the feasibility of using personal digital assistant (PDA)-based technology for tracking and analysis of food intake in low-income pregnant women. Design: Descriptive. Participants provided an initial 24-hour dietary recall and recorded their food intake using a PDA-based software program for 2 days. Setting: Recruitment…

  13. 2011 Tohoku tsunami video and TLS based measurements: hydrographs, currents, inundation flow velocities, and ship tracks

    NASA Astrophysics Data System (ADS)

    Fritz, H. M.; Phillips, D. A.; Okayasu, A.; Shimozono, T.; Liu, H.; Takeda, S.; Mohammed, F.; Skanavis, V.; Synolakis, C. E.; Takahashi, T.

    2012-12-01

    The March 11, 2011, magnitude Mw 9.0 earthquake off the coast of the Tohoku region caused catastrophic damage and loss of life in Japan. The mid-afternoon tsunami arrival combined with survivors equipped with cameras on top of vertical evacuation buildings provided spontaneous spatially and temporally resolved inundation recordings. This report focuses on the surveys at 9 tsunami eyewitness video recording locations in Miyako, Kamaishi, Kesennuma and Yoriisohama along Japan's Sanriku coast and the subsequent video image calibration, processing, tsunami hydrograph and flow velocity analysis. Selected tsunami video recording sites were explored, eyewitnesses interviewed and some ground control points recorded during the initial tsunami reconnaissance in April, 2011. A follow-up survey in June, 2011 focused on terrestrial laser scanning (TLS) at locations with high quality eyewitness videos. We acquired precise topographic data using TLS at the video sites producing a 3-dimensional "point cloud" dataset. A camera mounted on the Riegl VZ-400 scanner yields photorealistic 3D images. Integrated GPS measurements allow accurate georeferencing. The original video recordings were recovered from eyewitnesses and the Japanese Coast Guard (JCG). The analysis of the tsunami videos follows an adapted four-step procedure originally developed for the analysis of 2004 Indian Ocean tsunami videos at Banda Aceh, Indonesia (Fritz et al., 2006). The first step requires the calibration of the sector of view present in the eyewitness video recording based on ground control points measured in the LiDAR data. In a second step the video image motion induced by the panning of the video camera was determined from subsequent images by particle image velocimetry (PIV) applied to fixed objects. The third step involves the transformation of the raw tsunami video images from image coordinates to world coordinates with a direct linear transformation (DLT) procedure. 
    Finally, the instantaneous tsunami surface current and flooding velocity vector maps are determined by applying the digital PIV analysis method to the rectified tsunami video images with floating debris clusters. Tsunami currents of up to 11 m/s were measured in Kesennuma Bay, making navigation impossible. Tsunami hydrographs are derived from the videos based on water surface elevations at surface-piercing objects identified in the acquired topographic TLS data. Apart from a dominant tsunami crest, the hydrograph at Kamaishi also reveals a subsequent drawdown to -10 m, exposing the harbor bottom. In some cases, ship moorings resisted the main tsunami crest only to be broken by the extreme drawdown, setting vessels adrift for hours. Further, we discuss the complex effects of coastal structures on inundation and outflow hydrographs and flow velocities.

  14. Automated detection of records in biological sequence databases that are inconsistent with the literature.

    PubMed

    Bouadjenek, Mohamed Reda; Verspoor, Karin; Zobel, Justin

    2017-07-01

    We investigate and analyse the data quality of nucleotide sequence databases with the objective of automatically detecting data anomalies and suspicious records. Specifically, we demonstrate that the published literature associated with each data record can be used to automatically evaluate its quality, by cross-checking the consistency of the key content of the database record with the referenced publications. Focusing on GenBank, we describe a set of quality indicators based on the relevance paradigm of information retrieval (IR). Then, we use these quality indicators to train an anomaly detection algorithm to classify records as "confident" or "suspicious". Our experiments on the PubMed Central collection show that assessing the coherence between the literature and database records through our algorithms is an effective mechanism for assisting curators in performing data cleansing. Although fewer than 0.25% of the records in our data set are known to be faulty, we would expect that there are many more in GenBank that have not yet been identified. By automated comparison with the literature, they can be identified with a precision of up to 10% and a recall of up to 30%, while strongly outperforming several baselines. While these results leave substantial room for improvement, they reflect both the very imbalanced nature of the data and the limited explicitly labelled data that is available. Overall, the obtained results show promise for the development of a new kind of approach to detecting low-quality and suspicious sequence records based on literature analysis and consistency. From a practical point of view, this will greatly help curators to identify inconsistent records in large-scale sequence databases by highlighting records that are likely to be inconsistent with the literature. Copyright © 2017 Elsevier Inc. All rights reserved.
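    A toy version of one relevance-style quality indicator: score the similarity between a record's key content and its linked publication, and flag low-similarity records. The names, threshold, and bag-of-words similarity are our illustrative choices; the paper combines several such indicators in a trained anomaly detector:

```python
from collections import Counter
import math

def cosine(a, b):
    """Bag-of-words cosine similarity between two text snippets."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def label(record_text, abstract, threshold=0.2):
    """Flag a record as 'suspicious' when its key content has low
    similarity to the linked publication's abstract."""
    return "confident" if cosine(record_text, abstract) >= threshold else "suspicious"
```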

  15. Assessment of modal-pushover-based scaling procedure for nonlinear response history analysis of ordinary standard bridges

    USGS Publications Warehouse

    Kalkan, E.; Kwong, N.

    2012-01-01

    The earthquake engineering profession is increasingly utilizing nonlinear response history analyses (RHA) to evaluate seismic performance of existing structures and proposed designs of new structures. One of the main ingredients of nonlinear RHA is a set of ground motion records representing the expected hazard environment for the structure. When recorded motions do not exist (as is the case in the central United States) or when high-intensity records are needed (as is the case in San Francisco and Los Angeles), ground motions from other tectonically similar regions need to be selected and scaled. The modal-pushover-based scaling (MPS) procedure was recently developed to determine scale factors for a small number of records such that the scaled records provide accurate and efficient estimates of “true” median structural responses. The adjective “accurate” refers to the discrepancy between the benchmark responses and those computed from the MPS procedure. The adjective “efficient” refers to the record-to-record variability of responses. In this paper, the accuracy and efficiency of the MPS procedure are evaluated by applying it to four types of existing Ordinary Standard bridges typical of reinforced concrete bridge construction in California. These bridges are the single-bent overpass, multi-span bridge, curved bridge, and skew bridge. As compared with benchmark analyses of unscaled records using a larger catalog of ground motions, it is demonstrated that the MPS procedure provided an accurate estimate of the engineering demand parameters (EDPs) accompanied by significantly reduced record-to-record variability of the EDPs. Thus, it is a useful tool for scaling ground motions as input to nonlinear RHAs of Ordinary Standard bridges.

  16. Documentation for assessment of modal pushover-based scaling procedure for nonlinear response history analysis of "ordinary standard" bridges

    USGS Publications Warehouse

    Kalkan, Erol; Kwong, Neal S.

    2010-01-01

    The earthquake engineering profession is increasingly utilizing nonlinear response history analyses (RHA) to evaluate seismic performance of existing structures and proposed designs of new structures. One of the main ingredients of nonlinear RHA is a set of ground-motion records representing the expected hazard environment for the structure. When recorded motions do not exist (as is the case for the central United States), or when high-intensity records are needed (as is the case for San Francisco and Los Angeles), ground motions from other tectonically similar regions need to be selected and scaled. The modal-pushover-based scaling (MPS) procedure recently was developed to determine scale factors for a small number of records, such that the scaled records provide accurate and efficient estimates of 'true' median structural responses. The adjective 'accurate' refers to the discrepancy between the benchmark responses and those computed from the MPS procedure. The adjective 'efficient' refers to the record-to-record variability of responses. Herein, the accuracy and efficiency of the MPS procedure are evaluated by applying it to four types of existing 'ordinary standard' bridges typical of reinforced-concrete bridge construction in California. These bridges are the single-bent overpass, multi span bridge, curved-bridge, and skew-bridge. As compared to benchmark analyses of unscaled records using a larger catalog of ground motions, it is demonstrated that the MPS procedure provided an accurate estimate of the engineering demand parameters (EDPs) accompanied by significantly reduced record-to-record variability of the responses. Thus, the MPS procedure is a useful tool for scaling ground motions as input to nonlinear RHAs of 'ordinary standard' bridges.

  17. Ground Motion in Central Mexico: A Comprehensive Analysis

    NASA Astrophysics Data System (ADS)

    Ramirez-Guzman, L.; Juarez, A.; Rábade, S.; Aguirre, J.; Bielak, J.

    2015-12-01

    This study presents a detailed analysis of the ground motion in Central Mexico based on numerical simulations, as well as broadband and strong ground motion records. We describe and evaluate a velocity model for Central Mexico derived from noise and regional earthquake cross-correlations, which is used throughout this research to estimate the ground motion in the region. The 3D crustal model includes a geotechnical structure of the Valley of Mexico (VM), subduction zone geometry, and 3D velocity distributions. The latter are based on more than 200 low magnitude (Mw < 4.5) earthquakes and two years of noise recordings. Our analysis emphasizes the ground motion in the Valley of Mexico originating from intra-slab deep events and temblors located along the Pacific coast. We also quantify the effects of the Trans-Mexican Volcanic Belt (TMVB) and the low-velocity deposits on the ground motion. The 3D octree-based finite element wave propagation computations, valid up to 1 Hz, reveal that the inclusion of a basin with a structure as complex as the Valley of Mexico dramatically enhances the regional effects induced by the TMVB. Moreover, the basin not only produces ground motion amplification and anomalous duration, but it also favors energy focusing into zones of Mexico City where structures typically undergo high levels of damage.

  18. Structural-change localization and monitoring through a perturbation-based inverse problem.

    PubMed

    Roux, Philippe; Guéguen, Philippe; Baillet, Laurent; Hamze, Alaa

    2014-11-01

    Structural-change detection and characterization, or structural-health monitoring, is generally based on modal analysis, for detection, localization, and quantification of changes in structure. Classical methods combine both variations in frequencies and mode shapes, which require accurate and spatially distributed measurements. In this study, the detection and localization of a local perturbation are assessed by analysis of frequency changes (in the fundamental mode and overtones) that are combined with a perturbation-based linear inverse method and a deconvolution process. This perturbation method is applied first to a bending beam with the change considered as a local perturbation of the Young's modulus, using a one-dimensional finite-element model for modal analysis. Localization is successful, even for extended and multiple changes. In a second step, the method is numerically tested under ambient-noise vibration from the beam support with local changes that are shifted step by step along the beam. The frequency values are revealed using the random decrement technique that is applied to the time-evolving vibrations recorded by one sensor at the free extremity of the beam. Finally, the inversion method is experimentally demonstrated at the laboratory scale with data recorded at the free end of a Plexiglas beam attached to a metallic support.
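    The random decrement technique mentioned above can be sketched as averaging response segments that all start from the same trigger condition: the random excitation averages out, leaving a free-decay signature from which frequencies can be read. A minimal version with an upward level-crossing trigger; the trigger choice and names are illustrative.

    ```python
    def random_decrement(signal, threshold, seg_len):
        """Random decrement signature: the average of all segments that
        begin at an upward crossing of the trigger threshold."""
        segments = []
        for i in range(1, len(signal) - seg_len):
            if signal[i - 1] < threshold <= signal[i]:  # upward crossing
                segments.append(signal[i:i + seg_len])
        if not segments:
            raise ValueError("no threshold crossings found")
        # Point-wise average across all triggered segments.
        return [sum(col) / len(segments) for col in zip(*segments)]
    ```

    Applied to the time-evolving vibration recorded at the free end of the beam, the frequency content of this signature tracks the structural changes.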

  19. Early prediction of cerebral palsy by computer-based video analysis of general movements: a feasibility study.

    PubMed

    Adde, Lars; Helbostad, Jorunn L; Jensenius, Alexander R; Taraldsen, Gunnar; Grunewaldt, Kristine H; Støen, Ragnhild

    2010-08-01

    The aim of this study was to investigate the predictive value of a computer-based video analysis of the development of cerebral palsy (CP) in young infants. A prospective study of general movements used recordings from 30 high-risk infants (13 males, 17 females; mean gestational age 31wks, SD 6wks; range 23-42wks) between 10 and 15 weeks post term when fidgety movements should be present. Recordings were analysed using computer vision software. Movement variables, derived from differences between subsequent video frames, were used for quantitative analyses. CP status was reported at 5 years. Thirteen infants developed CP (eight hemiparetic, four quadriparetic, one dyskinetic; seven ambulatory, three non-ambulatory, and three unknown function), of whom one had fidgety movements. Variability of the centroid of motion had a sensitivity of 85% and a specificity of 71% in identifying CP. By combining this with variables reflecting the amount of motion, specificity increased to 88%. Nine out of 10 children with CP, and for whom information about functional level was available, were correctly predicted with regard to ambulatory and non-ambulatory function. Prediction of CP can be provided by computer-based video analysis in young infants. The method may serve as an objective and feasible tool for early prediction of CP in high-risk infants.
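    A minimal sketch of the movement variables described above: motion is approximated by absolute differences between subsequent grayscale frames, and the variability of the centroid of motion is the standard deviation of the per-frame centroids. This illustrates the idea only, not the authors' software; frames here are plain nested lists.

    ```python
    import statistics

    def motion_centroids(frames):
        """Centroid of the absolute frame-difference ('motion image') for
        each pair of subsequent grayscale frames (nested lists of rows)."""
        cents = []
        for prev, cur in zip(frames, frames[1:]):
            total = sx = sy = 0.0
            for y, (row_p, row_c) in enumerate(zip(prev, cur)):
                for x, (p, c) in enumerate(zip(row_p, row_c)):
                    d = abs(c - p)
                    total += d
                    sx += d * x
                    sy += d * y
            cents.append((sx / total, sy / total) if total else (0.0, 0.0))
        return cents

    def centroid_variability(cents):
        """CSD-like measure: standard deviation of centroid coordinates."""
        return (statistics.pstdev([c[0] for c in cents]),
                statistics.pstdev([c[1] for c in cents]))
    ```

    Low centroid variability corresponds to the monotonous movement pattern associated with absent fidgety movements.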

  20. Spatiotemporal analysis of single-trial EEG of emotional pictures based on independent component analysis and source location

    NASA Astrophysics Data System (ADS)

    Liu, Jiangang; Tian, Jie

    2007-03-01

    The present study combined the Independent Component Analysis (ICA) and low-resolution brain electromagnetic tomography (LORETA) algorithms to identify the spatial distribution and time course of single-trial EEG record differences between neural responses to emotional stimuli vs. the neutral. Single-trial multichannel (129-sensor) EEG records were collected from 21 healthy, right-handed subjects viewing emotional (pleasant/unpleasant) and neutral pictures selected from the International Affective Picture System (IAPS). For each subject, the single-trial EEG records for each emotional condition were concatenated with the neutral, and a three-step analysis was applied to each of them in the same way. First, ICA was performed to decompose each concatenated single-trial EEG record into temporally independent and spatially fixed components, namely independent components (ICs). ICs associated with artifacts were isolated. Second, clustering analysis classified, across subjects, the temporally and spatially similar ICs into the same clusters, in which a nonparametric permutation test for Global Field Power (GFP) of IC projection scalp maps identified significantly different temporal segments of each emotional condition vs. neutral. Third, the brain regions accounting for those significant segments were localized spatially with LORETA analysis. In each cluster, a voxel-by-voxel randomization test identified significantly different brain regions between each emotional condition and the neutral. Compared to the neutral, both emotional picture types elicited activation in the visual, temporal, ventromedial and dorsomedial prefrontal cortex and anterior cingulate gyrus. In addition, the pleasant pictures activated the left middle prefrontal cortex and the posterior precuneus, while the unpleasant pictures activated the right orbitofrontal cortex, posterior cingulate gyrus and somatosensory region. Our results are consistent with other functional imaging studies, while also revealing the temporal dynamics of emotional processing in specific brain structures with high temporal resolution.

  1. On Being Echolalic: An Analysis of the Interactional and Phonetic Aspects of an Autistic's Language.

    ERIC Educational Resources Information Center

    Local, John; Wootton, Tony

    1996-01-01

    A case study analyzed the echolalia behavior of an autistic 11-year-old boy, based on recordings made in his home and school. Focus was on the subset of immediate echolalia referred to as pure echoing. Using an approach informed by conversation analysis and descriptive phonetics, distinctions are drawn between different forms of pure echo. It is…

  2. Computer programs for describing the recession of ground-water discharge and for estimating mean ground-water recharge and discharge from streamflow records-update

    USGS Publications Warehouse

    Rutledge, A.T.

    1998-01-01

    The computer programs included in this report can be used to develop a mathematical expression for recession of ground-water discharge and estimate mean ground-water recharge and discharge. The programs are intended for analysis of the daily streamflow record of a basin where one can reasonably assume that all, or nearly all, ground water discharges to the stream except for that which is lost to riparian evapotranspiration, and where regulation and diversion of flow can be considered to be negligible. The program RECESS determines the master recession curve of streamflow recession during times when all flow can be considered to be ground-water discharge and when the profile of the ground-water-head distribution is nearly stable. The method uses a repetitive interactive procedure for selecting several periods of continuous recession, and it allows for nonlinearity in the relation between time and the logarithm of flow. The program RORA uses the recession-curve displacement method to estimate the recharge for each peak in the streamflow record. The method is based on the change in the total potential ground-water discharge that is caused by an event. Program RORA is applied to a long period of record to obtain an estimate of the mean rate of ground-water recharge. The program PART uses streamflow partitioning to estimate a daily record of base flow under the streamflow record. The method designates base flow to be equal to streamflow on days that fit a requirement of antecedent recession, linearly interpolates base flow for other days, and is applied to a long period of record to obtain an estimate of the mean rate of ground-water discharge. The results of programs RORA and PART correlate well with each other and compare reasonably with results of the corresponding manual method.
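    The PART-style partitioning step can be sketched as follows: base flow is set equal to streamflow on days that satisfy an antecedent-recession requirement, and is linearly interpolated (capped at streamflow) on the other days. The fixed 3-day requirement and names below are simplifications; the actual program derives the requirement from basin characteristics.

    ```python
    def part_base_flow(streamflow, recession_days=3):
        """PART-style streamflow partitioning into a daily base-flow record."""
        n = len(streamflow)
        base = [None] * n
        # Designate base flow = streamflow on days of antecedent recession.
        for i in range(n):
            if i >= recession_days and all(
                streamflow[j] <= streamflow[j - 1]
                for j in range(i - recession_days + 1, i + 1)
            ):
                base[i] = streamflow[i]
        known = [i for i in range(n) if base[i] is not None]
        if not known:
            return streamflow[:]  # no recession periods found
        # Interpolate the remaining days, never exceeding streamflow.
        for i in range(n):
            if base[i] is None:
                left = max((k for k in known if k < i), default=None)
                right = min((k for k in known if k > i), default=None)
                if left is None:
                    base[i] = base[right]
                elif right is None:
                    base[i] = base[left]
                else:
                    w = (i - left) / (right - left)
                    base[i] = base[left] + w * (base[right] - base[left])
                base[i] = min(base[i], streamflow[i])
        return base
    ```

    The mean of the resulting daily base-flow record over a long period estimates the mean ground-water discharge rate.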

  3. LabTrove: A Lightweight, Web Based, Laboratory “Blog” as a Route towards a Marked Up Record of Work in a Bioscience Research Laboratory

    PubMed Central

    Milsted, Andrew J.; Hale, Jennifer R.; Frey, Jeremy G.; Neylon, Cameron

    2013-01-01

    Background The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. Methodology/Principal Findings We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. Conclusions/Significance LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. 
LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for reproducibility, reuse, repurposing, and redeployment. PMID:23935832

  4. LabTrove: a lightweight, web based, laboratory "blog" as a route towards a marked up record of work in a bioscience research laboratory.

    PubMed

    Milsted, Andrew J; Hale, Jennifer R; Frey, Jeremy G; Neylon, Cameron

    2013-01-01

    The electronic laboratory notebook (ELN) has the potential to replace the paper notebook with a marked-up digital record that can be searched and shared. However, it is a challenge to achieve these benefits without losing the usability and flexibility of traditional paper notebooks. We investigate a blog-based platform that addresses the issues associated with the development of a flexible system for recording scientific research. We chose a blog-based approach with the journal characteristics of traditional notebooks in mind, recognizing the potential for linking together procedures, materials, samples, observations, data, and analysis reports. We implemented the LabTrove blog system as a server process written in PHP, using a MySQL database to persist posts and other research objects. We incorporated a metadata framework that is both extensible and flexible while promoting consistency and structure where appropriate. Our experience thus far is that LabTrove is capable of providing a successful electronic laboratory recording system. LabTrove implements a one-item one-post system, which enables us to uniquely identify each element of the research record, such as data, samples, and protocols. This unique association between a post and a research element affords advantages for monitoring the use of materials and samples and for inspecting research processes. The combination of the one-item one-post system, consistent metadata, and full-text search provides us with a much more effective record than a paper notebook. The LabTrove approach provides a route towards reconciling the tensions and challenges that lie ahead in working towards the long-term goals for ELNs. LabTrove, an electronic laboratory notebook (ELN) system from the Smart Research Framework, based on a blog-type framework with full access control, facilitates the scientific experimental recording requirements for reproducibility, reuse, repurposing, and redeployment.

  5. Emergency Response Capability Baseline Needs Assessment Compliance Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharry, John A.

    2013-09-16

    This document is the second of a two-part analysis of the Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyses the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.

  6. Dealing with noise and physiological artifacts in human EEG recordings: empirical mode methods

    NASA Astrophysics Data System (ADS)

    Runnova, Anastasiya E.; Grubov, Vadim V.; Khramova, Marina V.; Hramov, Alexander E.

    2017-04-01

    In the paper we propose a new method for removing noise and physiological artifacts from human EEG recordings based on empirical mode decomposition (the Hilbert-Huang transform). As physiological artifacts we consider specific oscillatory patterns that cause problems during EEG analysis and can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). We introduce the algorithm of the proposed method, whose steps include empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of these modes, and reconstruction of the initial EEG signal. We show the efficiency of the method on the example of filtering a human EEG signal to remove eye-movement artifacts.
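    Assuming the empirical modes (IMFs) have already been computed by an EMD routine, the removal-and-reconstruction steps can be sketched as below. Flagging artifact modes by their correlation with a simultaneously recorded reference channel (e.g., EOG) is one simple criterion; the 0.7 threshold is an illustrative assumption.

    ```python
    def pearson(x, y):
        """Pearson correlation between two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x) ** 0.5
        vy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (vx * vy) if vx and vy else 0.0

    def clean_eeg(imfs, reference, corr_threshold=0.7):
        """Reconstruct the EEG from its empirical modes, dropping modes
        strongly correlated with the artifact reference channel."""
        kept = [m for m in imfs if abs(pearson(m, reference)) < corr_threshold]
        # Reconstruction is simply the sum of the retained modes.
        return [sum(vals) for vals in zip(*kept)]
    ```

    Because EMD is a complete decomposition, summing the retained modes returns a signal of the original length with the flagged oscillatory patterns removed.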

  7. Psychoacoustic Analysis of Synthesized Jet Noise

    NASA Technical Reports Server (NTRS)

    Okcu, Selen; Rathsam, Jonathan; Rizzi, Stephen A.

    2013-01-01

    An aircraft noise synthesis capability is being developed so the annoyance caused by proposed aircraft can be assessed during the design stage. To make synthesized signals as realistic as possible, high fidelity simulation is required for source (e.g., engine noise, airframe noise), propagation and receiver effects. This psychoacoustic study tests whether the jet noise component of synthesized aircraft engine noise can be made more realistic using a low frequency oscillator (LFO) technique to simulate fluctuations in level observed in recordings. Jet noise predictions are commonly made in the frequency domain based on models of time-averaged empirical data. The synthesis process involves conversion of the frequency domain prediction into an audible pressure time history. However, because the predictions are time-invariant, the synthesized sound lacks fluctuations observed in recordings. Such fluctuations are hypothesized to be perceptually important. To introduce time-varying characteristics into jet noise synthesis, a method has been developed that modulates measured or predicted 1/3-octave band levels with a (<20Hz) LFO. The LFO characteristics are determined through analysis of laboratory jet noise recordings. For the aft emission angle, results indicate that signals synthesized using a generic LFO are perceived as more similar to recordings than those using no LFO, and signals synthesized with an angle-specific LFO are more similar to recordings than those synthesized with a generic LFO.
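    The LFO modulation step can be sketched as follows for a single 1/3-octave band: the time-invariant predicted level becomes time-varying by adding a sinusoidal low-frequency oscillation. A sinusoid is the simplest choice; the study derives LFO characteristics from jet noise recordings, so the 5 Hz rate and 3 dB depth here are placeholders only.

    ```python
    import math

    def band_level_with_lfo(level_db, duration_s, sample_rate,
                            lfo_freq_hz=5.0, lfo_depth_db=3.0):
        """Time-varying level for one 1/3-octave band: the static predicted
        level modulated by a sinusoidal low-frequency (<20 Hz) oscillator."""
        n = int(duration_s * sample_rate)
        return [
            level_db + lfo_depth_db
            * math.sin(2 * math.pi * lfo_freq_hz * t / sample_rate)
            for t in range(n)
        ]
    ```

    An angle-specific LFO would use rate and depth values measured for that emission angle instead of generic ones, which is the comparison the listening test makes.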

  8. Local soil effects on the Ground Motion Prediction model for the Racha region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Shengelia, I.; Otinashvili, M.; Tvaliashvili, A.

    2016-12-01

    The Caucasus is a region of numerous natural hazards and ensuing disasters. Analysis of losses due to past disasters indicates that the most catastrophic events in the region have historically been strong earthquakes. Estimation of expected ground motion is fundamental to earthquake hazard assessment. Peak ground acceleration was selected for the analysis because it is the most commonly used parameter in attenuation relations and provides useful information for seismic hazard assessment. Site ground conditions, the main issue of this study, have a significant influence on earthquake records: the same earthquake recorded at the same distance may cause different damage depending on the ground conditions. Earthquake records were selected for the Racha region of Georgia, which has the highest seismic activity in the region. Next, new ground motion prediction (GMP) models were obtained based on new digital data recorded in the same area. After removing the site effect, equivalent rock-site records were obtained. Thus, two GMP models were derived: one for the ground surface and one for the rock site. Finally, the two models were compared in order to analyze the influence of local soil conditions on the GMP model.
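    The two-model workflow can be illustrated with a toy attenuation form: removing an empirical site amplification to recover rock-site PGA, and a generic ln(PGA) = a + b*M - c*ln(R) relation. Both the functional form and any coefficients are illustrative assumptions, not the fitted Racha models.

    ```python
    import math

    def rock_site_pga(observed_pga, site_amplification):
        """Remove the local site effect by dividing the observed peak
        ground acceleration by the site's empirical amplification factor."""
        return observed_pga / site_amplification

    def gmpe_ln_pga(magnitude, distance_km, a, b, c):
        """Generic attenuation form: ln(PGA) = a + b*M - c*ln(R)."""
        return a + b * magnitude - c * math.log(distance_km)
    ```

    Fitting the same form once to surface records and once to the deconvolved rock-site records yields the two models whose difference isolates the soil contribution.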

  9. High-efficient and high-content cytotoxic recording via dynamic and continuous cell-based impedance biosensor technology.

    PubMed

    Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping

    2016-10-01

    Cell-based bioassays are an effective method for assessing compound toxicity via cell viability, but traditional label-based methods miss much information about cell growth because of their endpoint detection, while higher throughputs are demanded to obtain dynamic information. Cell-based biosensor methods can monitor cell viability dynamically and continuously; however, the dynamic information is often ignored or seldom utilized in toxin and drug assessment. Here, we report a high-efficiency, high-content cytotoxicity recording method via dynamic and continuous cell-based impedance biosensor technology. The dynamic cell viability, inhibition ratio and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensor responds in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of the dynamic cell viability and cell growth status. Moreover, the throughputs of dynamic cytotoxicity assessment were compared between cell-based biosensor methods and label-based endpoint methods. This cell-based impedance biosensor can provide a flexible, cost- and label-efficient platform for cell viability assessment in shellfish toxin screening.
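    A minimal sketch of how dynamic viability metrics can be derived from impedance curves: a baseline-referenced cell index normalized at the time of compound addition, and an endpoint inhibition ratio against an untreated control. The formulas follow common impedance-biosensor practice and are not taken verbatim from this paper.

    ```python
    def normalized_cell_index(impedance, baseline, compound_add_idx):
        """Cell index from raw impedance readings, normalized to the
        reading taken when the compound is added (label-free viability)."""
        ci = [(z - baseline) / baseline for z in impedance]
        ref = ci[compound_add_idx]
        return [c / ref for c in ci]

    def inhibition_ratio(treated_ci, control_ci):
        """Endpoint inhibition relative to the untreated control curve."""
        return 1.0 - treated_ci[-1] / control_ci[-1]
    ```

    Because the whole normalized curve is retained, growth rate and time-of-onset can be read from it as well, which is the "high-content" advantage over single endpoint assays.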

  10. A qualitative analysis of communication between members of a hospital-based multidisciplinary lung cancer team.

    PubMed

    Rowlands, S; Callen, J

    2013-01-01

    The aim of the study was to explore how patient information is communicated between health professionals within a multidisciplinary hospital-based lung cancer team and to identify mechanisms to improve these communications. A qualitative method was employed using semi-structured in-depth interviews with a representative sample (n = 22) of members of a multidisciplinary hospital-based lung cancer team including medical, nursing and allied health professionals. Analysis was undertaken using a thematic grounded theory approach to derive key themes to describe communication patterns within the team and how communication could be improved. Two themes with sub-themes were identified: (1) characteristics of communication between team members, including the impact of role on direction of communications and doctors' dominance in communications; and (2) channels of communication, including a preference for face-to-face communication and the suboptimal roles of the Multidisciplinary Team Meeting and the hospital medical record as communication media. Traditional influences of role delineation and the dominance of doctors were found to impact on communication within the multidisciplinary hospital-based lung cancer team. Existing guidelines on implementation of multidisciplinary cancer care fail to address barriers to effective team communication. The paper-based medical record does not support team communications and alternative electronic solutions need to be used. © 2012 Blackwell Publishing Ltd.

  11. Estimating challenge load due to disease outbreaks and other challenges using reproduction records of sows.

    PubMed

    Mathur, P K; Herrero-Medrano, J M; Alexandri, P; Knol, E F; ten Napel, J; Rashidi, H; Mulder, H A

    2014-12-01

    A method was developed and tested to estimate challenge load due to disease outbreaks and other challenges in sows using reproduction records. The method was based on reproduction records from a farm with known disease outbreaks. It was assumed that the reduction in weekly reproductive output within a farm is proportional to the magnitude of the challenge. As the challenge increases beyond a certain threshold, it is manifested as an outbreak. The reproduction records were divided into 3 datasets. The first dataset, called the Training dataset, consisted of 57,135 reproduction records from 10,901 sows from 1 farm in Canada with several outbreaks of porcine reproductive and respiratory syndrome (PRRS). The known disease status of sows was regressed on the traits number born alive, number of losses as a combination of stillbirths and mummified piglets, and number of weaned piglets. The regression coefficients from this analysis were then used as weighting factors for derivation of an index measure called the challenge load indicator. These weighting factors were derived with i) a two-step approach using residuals or year-week solutions estimated from a previous step, and ii) a single-step approach using the trait values directly. Two types of models were used for each approach: a logistic regression model and a general additive model. The estimates of the challenge load indicator were then compared based on their ability to detect PRRS outbreaks in a Test dataset consisting of records from 65,826 sows from 15 farms in the Netherlands. These farms differed from the Canadian farm with respect to PRRS virus strains, severity and frequency of outbreaks. The single-step approach using a general additive model was best and detected 14 out of the 15 outbreaks. This approach was then further validated using the third dataset, consisting of reproduction records of 831,855 sows in 431 farms located in different countries in Europe and America.
A total of 41 out of 48 outbreaks detected by the data analysis were confirmed based on diagnostic information received from the farms. Among these, 30 outbreaks were due to PRRS while 11 were due to other diseases and challenging conditions. The results suggest that the proposed method could be useful for estimation of challenge load and detection of challenge phases such as disease outbreaks.
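The single-step indicator can be sketched as a weighted sum of weekly reproduction-trait deviations, with regression coefficients from the training farm acting as the weights and a threshold turning the index into outbreak flags. The trait names, weights, and threshold below are hypothetical.

```python
def challenge_load_indicator(weekly_records, weights):
    """Weekly challenge-load index: a weighted sum of reproduction-trait
    deviations, with regression coefficients as the weighting factors."""
    return [
        sum(weights[trait] * value for trait, value in week.items())
        for week in weekly_records
    ]

def detect_outbreaks(index, threshold):
    """Flag the weeks whose challenge-load indicator exceeds the threshold."""
    return [week for week, value in enumerate(index) if value > threshold]

# Hypothetical weights: losses raise the index; births and weanings lower it.
weights = {"born_alive_dev": -0.5, "losses_dev": 1.2, "weaned_dev": -0.8}
```

A week with fewer live births, more losses, and fewer weaned piglets than the farm baseline therefore scores high, and a sustained run of high-scoring weeks marks a challenge phase.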

  12. An e-consent-based shared EHR system architecture for integrated healthcare networks.

    PubMed

    Bergmann, Joachim; Bott, Oliver J; Pretschner, Dietrich P; Haux, Reinhold

    2007-01-01

    Virtual integration of distributed patient data promises advantages over a consolidated health record, but raises questions mainly about practicability and authorization concepts. Our work aims at the specification and development of a virtual shared health record architecture using a patient-centred integration and authorization model. A literature survey summarizes considerations of current architectural approaches. Complemented by a methodical analysis in two regional settings, a formal architecture model was specified and implemented. Results presented in this paper are a survey of architectural approaches for shared health records and an architecture model for a virtual shared EHR, which combines a patient-centred integration policy with provider-oriented document management. An electronic consent system assures that access to the shared record remains under the control of the patient. A corresponding system prototype has been developed and is currently being introduced and evaluated in a regional setting. The proposed architecture is capable of partly replacing message-based communications. Operating highly available provider repositories for the virtual shared EHR requires advanced technology and probably means additional costs for care providers. Acceptance of the proposed architecture depends on transparently embedding document validation and digital signature into the work processes. The paradigm shift from paper-based messaging to a "pull model" needs further evaluation.

  13. A detailed taxonomy of Upper Cretaceous and lower Tertiary Crassatellidae in the Eastern United States; an example of the nature of extinction at the boundary

    USGS Publications Warehouse

    Wingard, G. Lynn

    1993-01-01

    Current theories on the causes of extinction at the Cretaceous-Tertiary boundary have been based on previously published data; however, few workers have stopped to ask the question, 'How good is the basic data set?' To test the accuracy of the published record, a quantitative and qualitative analysis of the Crassatellidae (Mollusca, Bivalvia) of the Gulf and Mid-Atlantic Coastal Plains of the United States for the Upper Cretaceous and lower Tertiary was conducted. Thirty-eight species names and four generic names are used in publications for the Crassatellidae within the geographic and stratigraphic constraints of this analysis. Fourteen of the 38 species names are represented by statistically valid numbers of specimens and were tested by using canonical discriminant analysis. All 38 names, with the exception of 1 invalid name and 4 names for which no representative specimen could be located, were evaluated qualitatively. The results show that the published fossil record is highly inaccurate. Only 8 valid, recognizable species exist in the Crassatellidae within the limits of this study, 14 names are synonymized, and 11 names are represented by indeterminate molds or poorly preserved specimens. Three of the four genera are well founded; the fourth is based on the juvenile of another genus and therefore synonymized. This detailed taxonomic analysis of the Crassatellidae illustrates that the published fossil record is not reliable. Calculations of evolutionary and paleobiologic significance based on poorly defined, overly split fossil groups, such as the Crassatellidae, are biased in the following ways: rates of evolution and extinction are higher; faunal turnover at mass extinctions appears more catastrophic; species diversity is high; average species durations are shortened; and geographic ranges are restricted. 
The data on the taxonomically standardized Crassatellidae show evolutionary rates one-quarter to one-half that of the published fossil record; faunal change at the Cretaceous-Tertiary boundary that was not catastrophic; a constant number of species on each side of the Cretaceous-Tertiary boundary; a decrease in abundance in the Tertiary; and lower species diversity, longer average species durations, and expanded geographic ranges. Similar detailed taxonomic studies need to be conducted on other groups of organisms to test the patterns illustrated for the Crassatellidae and to determine the extent and direction of the bias in the published fossil record. Answers to our questions about evolutionary change cannot be found in the literature but rather with the fossils themselves. Evolution and extinction occur within small populations of species groups, and it is only through detailed analysis of these groups that we can achieve an understanding of the causes and effects of evolution and extinction.

  14. Instrumentation for low noise nanopore-based ionic current recording under laser illumination

    NASA Astrophysics Data System (ADS)

    Roelen, Zachary; Bustamante, José A.; Carlsen, Autumn; Baker-Murray, Aidan; Tabard-Cossa, Vincent

    2018-01-01

    We describe a nanopore-based optofluidic instrument capable of performing low-noise ionic current recordings of individual biomolecules under laser illumination. In such systems, simultaneous optical measurements generally introduce significant parasitic noise in the electrical signal, which can severely reduce the instrument sensitivity, critically hindering the monitoring of single-molecule events in the ionic current traces. Here, we present design rules and describe simple adjustments to the experimental setup to mitigate the different noise sources encountered when integrating optical components into an electrical nanopore system. In particular, we address the contributions to the electrical noise spectra from illuminating the nanopore during ionic current recording and mitigate those effects through control of the illumination source and the use of a PDMS layer on the SiNx membrane. We demonstrate the effectiveness of our noise minimization strategies by showing the detection of DNA translocation events during membrane illumination with a signal-to-noise ratio of ˜10 at 10 kHz bandwidth. The instrumental guidelines for noise minimization that we report are applicable to a wide range of nanopore-based optofluidic systems and offer the possibility of enhancing the quality of synchronous optical and electrical signals obtained during single-molecule nanopore-based analysis.
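
    As a back-of-the-envelope illustration of the figure quoted above, the signal-to-noise ratio of a translocation event is commonly taken as the mean current blockade divided by the RMS noise of the open-pore baseline at the recording bandwidth. The sketch below is hypothetical: the current levels and noise amplitude are invented round numbers, not values from the instrument described.

```python
import numpy as np

def event_snr(baseline_trace, baseline_level, event_level):
    """Event SNR: mean current blockade over RMS noise of the open-pore baseline."""
    noise_rms = np.std(baseline_trace)            # I_RMS at the recording bandwidth
    blockade = abs(baseline_level - event_level)  # mean current drop during an event
    return blockade / noise_rms

# Hypothetical trace (values in pA): 8 nA open-pore current with 100 pA RMS noise,
# and a DNA event blocking 1 nA of current.
rng = np.random.default_rng(0)
baseline = 8000.0 + 100.0 * rng.normal(size=50_000)
snr = event_snr(baseline, 8000.0, 7000.0)
print(round(snr, 1))
```

    With ~100 pA RMS baseline noise and a 1 nA blockade, the ratio comes out near 10, the order of magnitude quoted in the abstract.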

  15. Implementation of electronic logbook for trainees of general surgery in Thailand.

    PubMed

    Aphinives, Potchavit

    2013-01-01

    All trainees are required to keep a record of their surgical skills and experience throughout the training period in a logbook format. A paper-based logbook has several limitations; therefore, an electronic logbook was introduced to replace it. The electronic logbook program was developed in November 2005 and was designed as a web-based application built on PHP scripts running under the Apache web server with a MySQL database. Only simplified and essential data, such as hospital number, diagnosis, surgical procedure, and pathological findings, are recorded. The electronic logbook databases between academic years 2006 and 2011 were analyzed. The annually recorded surgical procedures gradually increased from 41,214 procedures in 2006 to 66,643 procedures in 2011. Around one-third of all records were not verified by attending staff, i.e., 27.59% (2006), 31.69% (2007), 18.06% (2008), 28.42% (2009), 30.18% (2010), and 31.41% (2011). In academic year 2011, the three most common procedural groups were the colon, rectum & anus group, the appendix group, and the vascular group, respectively. Advantages of the electronic logbook included more efficient data access, an increased ability to monitor trainees and trainers, and analysis of procedural varieties among the training institutes.

  16. A strategy to study regional hydrology and terrestrial ecosystem processes using satellite remote sensing, ground-based data and computer modeling

    NASA Technical Reports Server (NTRS)

    Vorosmarty, C.; Grace, A.; Moore, B.; Choudhury, B.; Willmott, C. J.

    1990-01-01

    A strategy is presented for integrating scanning multichannel microwave radiometer data from the Nimbus-7 satellite with meteorological station records and computer simulations of land surface hydrology, terrestrial nutrient cycling, and trace gas emission. Analysis of the observations together with radiative transfer analysis shows that in the tropics the temporal and spatial variations of the polarization difference are determined primarily by the structure and phenology of vegetation and seasonal inundations of major rivers and wetlands. It is concluded that the proposed surface hydrology model, along with climatological records, and, potentially, 37-GHz data for phenology, will provide inputs to a terrestrial ecosystem model that predicts regional net primary production and CO2 gas exchange.

  17. Hidden Markov model analysis of force/torque information in telemanipulation

    NASA Technical Reports Server (NTRS)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    A model for the prediction and analysis of sensor information recorded during robotic performance of telemanipulation tasks is presented. The model uses the hidden Markov model to describe the task structure, the operator's or intelligent controller's goal structure, and the sensor signals. A methodology for constructing the model parameters based on engineering knowledge of the task is described. It is concluded that the model and its optimal state estimation algorithm, the Viterbi algorithm, are very successful at segmenting the data record into phases corresponding to subgoals of the task. The model provides a rich modeling structure within a statistical framework, which enables it to represent complex systems and be robust to real-world sensory signals.
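
    The Viterbi decoding named above can be sketched compactly. The example below is a generic discrete-emission HMM decoder, not the authors' implementation; the two "task phases" and the low/high force observations are invented for illustration.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely state path for a discrete-emission HMM (log domain).

    obs: observation symbol indices; pi: initial state probabilities;
    A[i, j]: transition prob i -> j; B[i, k]: prob of symbol k in state i.
    """
    n_states, T = len(pi), len(obs)
    logA, logB = np.log(A), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]          # best log-prob ending in each state
    back = np.zeros((T, n_states), dtype=int)     # argmax back-pointers
    for t in range(1, T):
        scores = delta[:, None] + logA            # [prev, curr] candidate scores
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(n_states)] + logB[:, obs[t]]
    path = [int(np.argmax(delta))]                # trace back the optimal path
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Invented two-phase task: state 0 "approach" (low force), state 1 "contact" (high force)
pi = np.array([0.9, 0.1])
A = np.array([[0.8, 0.2], [0.1, 0.9]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
obs = [0, 0, 1, 1, 1]                             # low, low, high, high, high
path = viterbi(obs, pi, A, B)
print(path)                                       # segments the record into two phases
```

    The decoded path segments the observation record into contiguous phases, which is exactly the subgoal-segmentation use reported in the abstract.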

  18. Global Ocean Evaporation: How Well Can We Estimate Interannual to Decadal Variability?

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Bosilovich, Michael G.; Roberts, Jason B.; Wang, Hailan

    2015-01-01

    Evaporation from the world's oceans constitutes the largest component of the global water balance. It is important not only as the ultimate source of moisture that is tied to the radiative processes determining Earth's energy balance but also to freshwater availability over land, governing habitability of the planet. Here we focus on variability of ocean evaporation on scales from interannual to decadal by appealing to three sources of data: the new MERRA-2 (Modern-Era Retrospective analysis for Research and Applications -2); climate models run with historical sea-surface temperatures, ice and atmospheric constituents (so-called AMIP experiments); and state-of-the-art satellite retrievals from the Seaflux and HOAPS (Hamburg Ocean-Atmosphere Parameters and Fluxes from Satellite) projects. Each of these sources has distinct advantages as well as drawbacks. MERRA-2, like other reanalyses, synthesizes evaporation estimates consistent with observationally constrained physical and dynamical models-but data stream discontinuities are a major problem for interpreting multi-decadal records. The climate models used in data assimilation can also be run with lesser constraints such as with SSTs and sea-ice (i.e. AMIPs) or with additional, minimal observations of surface pressure and marine observations that have longer and less fragmentary observational records. We use the new ERA-20C reanalysis produced by ECMWF embodying the latter methodology. Still, the model physics biases in climate models and the lack of a predicted surface energy balance are of concern. Satellite retrievals and comparisons to ship-based measurements offer the most observationally-based estimates, but sensor inter-calibration, algorithm retrieval assumptions, and short records are dominant issues. Our strategy depends on maximizing the advantages of these combined records. 
The primary diagnostic tool used here is an analysis of bulk aerodynamic computations produced by these sources, using a first-order Taylor series expansion of wind speed, SST, near-surface stability, and relative humidity variations around climatology to gauge the importance of these components. We find that the MERRA-2 evaporation record is strongly influenced by the availability of wind speed and humidity from passive microwave imagers beginning in the late 1980s as well as by the SST record. The trend over the period 1980 to present is nearly 10%. The AMIP and ERA-20C trends are much smaller. We find that ENSO-related signals involving both wind speed and thermodynamic variability remain the primary signal in the latter and are confirmed by satellite retrievals. We present uncertainty estimates based on the various data sources and discuss the implications for GEWEX water and energy budget science challenges.
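
    The first-order Taylor decomposition described above can be illustrated on a standard bulk aerodynamic formula, E = ρ·C_E·U·(q_s − q_a): the evaporation anomaly is attributed to a wind term and a thermodynamic term evaluated about climatology. The numbers below are invented round values, not taken from MERRA-2 or the satellite products.

```python
import numpy as np

# Bulk aerodynamic evaporation (one standard form): E = rho * Ce * U * (qs - qa).
rho, Ce = 1.2, 1.2e-3        # air density (kg m^-3), moisture exchange coefficient
U0, dq0 = 7.0, 5.0e-3        # climatological wind speed (m/s), humidity deficit (kg/kg)
dU, ddq = 0.5, 0.4e-3        # one month's anomalies about climatology (invented)

E = lambda U, dq: rho * Ce * U * dq
exact_anom = E(U0 + dU, dq0 + ddq) - E(U0, dq0)

# First-order Taylor terms attribute the anomaly to wind vs. thermodynamics:
wind_term = rho * Ce * dq0 * dU      # (dE/dU) * dU
humid_term = rho * Ce * U0 * ddq     # (dE/d(qs-qa)) * d(qs-qa)
print(wind_term, humid_term, exact_anom)
```

    The two linear terms reproduce the exact anomaly up to the small second-order cross term ρ·C_E·δU·δq, which is what makes the decomposition useful for gauging the relative importance of wind and thermodynamic variability.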

  19. Spatial-phase-modulation-based study of polyvinyl-alcohol/acrylamide photopolymers in the low spatial frequency range.

    PubMed

    Gallego, Sergi; Márquez, André; Méndez, David; Marini, Stephan; Beléndez, Augusto; Pascual, Inmaculada

    2009-08-01

    Photopolymers are appealing materials for the fabrication of diffractive optical elements (DOEs). We evaluate the possibilities of polyvinyl-alcohol/acrylamide-based photopolymers to store diffractive elements with low spatial frequencies. We record gratings with different spatial frequencies in the material and analyze the material behavior by measuring the transmitted and reflected orders as a function of exposure. We study two different compositions for the photopolymer, with and without a cross-linker. The values of diffraction efficiency achieved for both compositions make the material suitable for recording DOEs with long spatial periods. Assuming a Fermi-Dirac-function-based profile, we fitted the diffracted intensities (up to the eighth order) to obtain the phase profile of the recorded gratings. This analysis shows that it is possible to achieve a phase shift larger than 2π rad with steep edges in the periodic phase profile. In the case of the measurements in reflection, we have obtained information dealing with the surface profile, which shows that it has a smooth shape with an extremely large phase-modulation depth.
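
    The link between a periodic phase profile and the measured diffraction orders can be sketched numerically: for a thin phase grating, the order amplitudes are the Fourier coefficients of the complex transmittance exp(iφ(x)) over one period. The Fermi-Dirac-shaped profile below is illustrative only; its depth and edge width are invented, not fitted values from the paper.

```python
import numpy as np

def order_efficiencies(phase_profile, n_orders=8):
    """Diffraction efficiencies of a thin periodic phase grating.

    Far-field order amplitudes are the Fourier coefficients of the complex
    transmittance exp(i*phi(x)) taken over one grating period.
    """
    trans = np.exp(1j * phase_profile)
    coeffs = np.fft.fft(trans) / trans.size
    eff = np.abs(coeffs) ** 2
    return {k: float(eff[k % trans.size]) for k in range(-n_orders, n_orders + 1)}

# Invented Fermi-Dirac-shaped phase step within one period: depth > 2*pi, steep edges
x = np.linspace(0.0, 1.0, 4096, endpoint=False)
phi0, w = 2.2 * np.pi, 0.01
phi = phi0 / ((1 + np.exp(-(x - 0.25) / w)) * (1 + np.exp((x - 0.75) / w)))
eff = order_efficiencies(phi)
print(eff[0], eff[1])        # zero and first diffracted orders
```

    Because the transmittance has unit modulus, the efficiencies over all orders sum to one, which gives a quick sanity check on any fitted profile of this kind.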

  20. Analysis of Direct Recordings from the Surface of the Human Brain

    NASA Astrophysics Data System (ADS)

    Towle, Vernon L.

    2006-03-01

    Recording electrophysiologic signals directly from the cortex of patients with chronically implanted subdural electrodes provides an opportunity to map the functional organization of human cortex. In addition to direct cortical stimulation, sensory evoked potentials and electrocorticography (ECoG) can also be used. The analysis of ECoG power spectra and inter-electrode lateral coherence patterns may be helpful in identifying important eloquent cortical areas and epileptogenic regions in cortical multifocal epilepsy. Analysis of interictal ECoG coherence can reveal pathological cortical areas that are functionally distinct from patent cortex. Subdural ECoGs have been analyzed from 50 medically refractory pediatric epileptic patients as part of their routine surgical work-up. Recording arrays were implanted over the frontal, parietal, occipital or temporal lobes for 4-10 days, depending on the patient's seizure semiology and imaging studies. Segments of interictal ECoG ranging in duration from 5 sec to 45 min were examined to identify areas of increased local coherence. Ictal records were examined to identify the stages and spread of the seizures. Immediately before a seizure began, lateral coherence values decreased, reorganized, and then increased during the late ictal and post-ictal periods. When computed over relatively long interictal periods (45 min), coherence patterns were found to be highly stable (r = 0.97, p < .001), and only changed gradually over days. On the other hand, when calculated over short periods of time (5 sec), coherence patterns were highly dynamic. Coherence patterns revealed a rich topography, with reduced coherence across sulci and major fissures. Areas that participate in receptive and expressive speech can be mapped through event-related potentials and analysis of task-specific changes in power spectra. 
Information processing is associated with local increases in high frequency activity, with concomitant changes in coherence, suggestive of a transiently active language network. Our findings suggest that analysis of coherence patterns can supplement visual inspection of conventional records to help identify pathological regions of cortex. With further study, it is hoped that analysis of single channel dynamics, along with analysis of multichannel lateral coherence patterns, and the functional holographic technique may allow determination of the boundaries of epileptic foci based on brief interictal recordings, possibly obviating the current need for extended monitoring of seizures.
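
    Inter-electrode coherence of the kind analyzed above is typically estimated with Welch-averaged cross-spectra. The snippet below applies `scipy.signal.coherence` to synthetic two-channel data: a shared 8 Hz rhythm buried in independent noise. The sampling rate, rhythm frequency, and noise levels are assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy.signal import coherence

fs = 512.0                           # assumed ECoG-like sampling rate (Hz)
t = np.arange(0.0, 60.0, 1 / fs)     # one 60 s interictal segment
rng = np.random.default_rng(1)

shared = np.sin(2 * np.pi * 8.0 * t)             # common 8 Hz rhythm
ch1 = shared + 0.8 * rng.normal(size=t.size)     # two electrodes seeing it in noise
ch2 = shared + 0.8 * rng.normal(size=t.size)

# Welch-averaged magnitude-squared coherence between the electrode pair
f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=1024)
i8 = int(np.argmin(np.abs(f - 8.0)))
print(round(float(Cxy[i8]), 2))      # high at the shared rhythm, low elsewhere
```

    The contrast between long stable estimates and short dynamic ones reported above corresponds to how many segments are averaged: more segments lower the variance (and the bias floor) of the coherence estimate.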

  1. Temperature variations in the southern Great Lakes during the last deglaciation: Comparison between pollen and GDGT proxies

    NASA Astrophysics Data System (ADS)

    Watson, Benjamin I.; Williams, John W.; Russell, James M.; Jackson, Stephen T.; Shane, Linda; Lowell, Thomas V.

    2018-02-01

    Our understanding of deglacial climate history in the southern Great Lakes region of the United States is primarily based upon fossil pollen data, with few independent and multi-proxy climate reconstructions. Here we introduce a new, well-dated fossil pollen record from Stotzel-Leis, OH, and a new deglacial temperature record based on branched glycerol dialkyl glycerol tetraethers (brGDGTs) at Silver Lake, OH. We compare these new data to previously published records and to a regional stack of pollen-based temperature reconstructions from Stotzel-Leis, Silver Lake, and three other well-dated sites. The new and previously published pollen records at Stotzel-Leis are similar, but our new age model brings vegetation events into closer alignment with known climatic events such as the Younger Dryas (YD). brGDGT-inferred temperatures correlate strongly with pollen-based regional temperature reconstructions, with the strongest correlation obtained for a global soil-based brGDGT calibration (r2 = 0.88), lending confidence to the deglacial reconstructions and the use of brGDGT and regional pollen stacks as paleotemperature proxies in eastern North America. However, individual pollen records show large differences in timing, rates, and amplitudes of inferred temperature change, indicating caution with paleoclimatic inferences based on single-site pollen records. From 16.0 to 10.0ka, both proxies indicate that regional temperatures rose by ∼10 °C, roughly double the ∼5 °C estimates for the Northern Hemisphere reported in prior syntheses. Change-point analysis of the pollen stack shows accelerated warming at 14.0 ± 1.2ka, cooling at 12.6 ± 0.4ka, and warming from 11.6 ± 0.5ka into the Holocene. The timing of Bølling-Allerød (B-A) warming and YD onset in our records lags those reported in syntheses of temperature records from the northern mid-latitudes by ∼300-500 years. 
This discrepancy is too large to be attributed to uncertainties in radiocarbon dating, and correlation between pollen and brGDGT temperature reconstructions rules out vegetation lags as a cause. However, the YD termination appears synchronous among the brGDGT record, regional pollen stack, and Northern Hemisphere stack. The cause of the larger and lagged temperature changes in the southern Great Lakes relative to Northern Hemisphere averages remains unclear, but may be due to the effects of continentality and ice sheet extent on regional climate evolution.
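
    Change-point analysis of the kind applied to the pollen stack can be sketched with a simple least-squares search: fit separate lines on each side of every candidate break and keep the break that minimizes the total residual sum of squares. The synthetic series below imitates slow-then-accelerated deglacial warming; all numbers are invented, not the published reconstruction.

```python
import numpy as np

def best_changepoint(t, y):
    """Single change point by least squares: fit one line on each side of every
    candidate break; keep the break minimizing total residual sum of squares."""
    best_rss, best_k = np.inf, None
    for k in range(3, len(t) - 3):                # at least 3 points per segment
        rss = 0.0
        for sl in (slice(0, k), slice(k, None)):
            coef = np.polyfit(t[sl], y[sl], 1)
            resid = y[sl] - np.polyval(coef, t[sl])
            rss += float(resid @ resid)
        if rss < best_rss:
            best_rss, best_k = rss, k
    return best_k

# Invented series: slow warming before 14.0 ka, faster warming after (age in ka BP)
rng = np.random.default_rng(2)
age = np.linspace(16.0, 10.0, 61)
temp = np.where(age > 14.0, 0.5 * (16.0 - age), 1.0 + 2.0 * (14.0 - age))
temp = temp + 0.1 * rng.normal(size=age.size)
k = best_changepoint(age, temp)
print(round(float(age[k]), 1))                    # detected acceleration near 14.0 ka
```

    Real change-point methods add uncertainty estimates on the break age (e.g. the ± 1.2 ka quoted above), typically via bootstrap or Bayesian posterior sampling rather than the single point estimate shown here.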

  2. Temperature variations in the southern Great Lakes during the last deglaciation: Comparison between pollen and GDGT proxies

    USGS Publications Warehouse

    Watson, Benjamin I.; Williams, John W.; Russell, James M.; Jackson, Stephen T.; Shane, Linda; Lowell, Thomas V.

    2018-01-01

    Our understanding of deglacial climate history in the southern Great Lakes region of the United States is primarily based upon fossil pollen data, with few independent and multi-proxy climate reconstructions. Here we introduce a new, well-dated fossil pollen record from Stotzel-Leis, OH, and a new deglacial temperature record based on branched glycerol dialkyl glycerol tetraethers (brGDGTs) at Silver Lake, OH. We compare these new data to previously published records and to a regional stack of pollen-based temperature reconstructions from Stotzel-Leis, Silver Lake, and three other well-dated sites. The new and previously published pollen records at Stotzel-Leis are similar, but our new age model brings vegetation events into closer alignment with known climatic events such as the Younger Dryas (YD). brGDGT-inferred temperatures correlate strongly with pollen-based regional temperature reconstructions, with the strongest correlation obtained for a global soil-based brGDGT calibration (r2 = 0.88), lending confidence to the deglacial reconstructions and the use of brGDGT and regional pollen stacks as paleotemperature proxies in eastern North America. However, individual pollen records show large differences in timing, rates, and amplitudes of inferred temperature change, indicating caution with paleoclimatic inferences based on single-site pollen records. From 16.0 to 10.0ka, both proxies indicate that regional temperatures rose by ∼10 °C, roughly double the ∼5 °C estimates for the Northern Hemisphere reported in prior syntheses. Change-point analysis of the pollen stack shows accelerated warming at 14.0 ± 1.2ka, cooling at 12.6 ± 0.4ka, and warming from 11.6 ± 0.5ka into the Holocene. The timing of Bølling-Allerød (B-A) warming and YD onset in our records lags those reported in syntheses of temperature records from the northern mid-latitudes by ∼300–500 years. 
This discrepancy is too large to be attributed to uncertainties in radiocarbon dating, and correlation between pollen and brGDGT temperature reconstructions rules out vegetation lags as a cause. However, the YD termination appears synchronous among the brGDGT record, regional pollen stack, and Northern Hemisphere stack. The cause of the larger and lagged temperature changes in the southern Great Lakes relative to Northern Hemisphere averages remains unclear, but may be due to the effects of continentality and ice sheet extent on regional climate evolution.

  3. VetCompass Australia: A National Big Data Collection System for Veterinary Science.

    PubMed

    McGreevy, Paul; Thomson, Peter; Dhand, Navneet K; Raubenheimer, David; Masters, Sophie; Mansfield, Caroline S; Baldwin, Timothy; Soares Magalhaes, Ricardo J; Rand, Jacquie; Hill, Peter; Peaston, Anne; Gilkerson, James; Combs, Martin; Raidal, Shane; Irwin, Peter; Irons, Peter; Squires, Richard; Brodbelt, David; Hammond, Jeremy

    2017-09-26

    VetCompass Australia is a veterinary medical records-based research program coordinated with the global VetCompass endeavor to maximize its quality and effectiveness for Australian companion animals (cats, dogs, and horses). Bringing together all seven Australian veterinary schools, it is the first nationwide surveillance system collating clinical records on companion-animal diseases and treatments. The VetCompass data service collects and aggregates real-time clinical records for researchers to interrogate, delivering sustainable and cost-effective access to data from hundreds of veterinary practitioners nationwide. Analysis of these clinical records will reveal geographical and temporal trends in the prevalence of inherited and acquired diseases, identify frequently prescribed treatments, revolutionize clinical auditing, help the veterinary profession to rank research priorities, and assure evidence-based companion-animal curricula in veterinary schools. VetCompass Australia will progress in three phases: (1) roll-out of the VetCompass platform to harvest Australian veterinary clinical record data; (2) development and enrichment of the coding (data-presentation) platform; and (3) creation of a world-first, real-time surveillance interface with natural language processing (NLP) technology. The first of these three phases is described in the current article. Advances in the collection and sharing of records from numerous practices will enable veterinary professionals to deliver a vastly improved level of care for companion animals that will improve their quality of life.

  4. Automated Assessment of Child Vocalization Development Using LENA

    ERIC Educational Resources Information Center

    Richards, Jeffrey A.; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-01-01

    Purpose: To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Method: Assessment was based on full-day audio…

  5. 40 CFR 63.495 - Back-end process provisions-procedures to determine compliance with residual organic HAP...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...

  6. 40 CFR 63.495 - Back-end process provisions-procedures to determine compliance with residual organic HAP...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...

  7. 40 CFR 63.495 - Back-end process provisions-procedures to determine compliance with residual organic HAP...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...

  8. 40 CFR 63.495 - Back-end process provisions-procedures to determine compliance with residual organic HAP...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...

  9. GIS-based smartphone application to support water quality data collection

    USDA-ARS?s Scientific Manuscript database

    Thorough registration of locations, conditions, and operations at sampling sites is essential for any field survey work. The collected information of this type has to be systematic, transferable, and accessible from different locations of data analysis. The handheld data-recording with smart devices...

  10. Estimating monetary damages from flooding in the United States under a changing climate

    EPA Science Inventory

    A national-scale analysis of potential changes in monetary damages from flooding under climate change. The approach uses empirically based statistical relationships between historical precipitation and flood damage records from 18 hydrologic regions of the United States, along w...

  11. The Use of Correctional “NO!” Approach to Reduce Destructive Behavior on Autism Student of CANDA Educational Institution in Surakarta

    NASA Astrophysics Data System (ADS)

    Anggraini, N.

    2017-02-01

    This research aims to reduce destructive behavior, such as throwing learning materials, in an autistic student using the correctional “NO!” approach at the CANDA educational institution in Surakarta. The research uses the Single Subject Research (SSR) method with an A-B design, i.e., baseline and intervention. The subject of this research is one autistic student of the CANDA educational institution, named G.A.P. Data were collected through direct observation, recording events during the baseline and intervention phases. Data were analyzed by simple descriptive statistical analysis and are displayed in graphical form. Based on the results of the data analysis, it can be concluded that destructive behavior, such as throwing learning materials, was significantly reduced after the intervention. Based on these results, the correctional “NO!” approach can be used by teachers or therapists to reduce destructive behavior in autistic students.

  12. Encryption Characteristics of Two USB-based Personal Health Record Devices

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2007-01-01

    Personal health records (PHRs) hold great promise for empowering patients and increasing the accuracy and completeness of health information. We reviewed two small USB-based PHR devices that allow a patient to easily store and transport their personal health information. Both devices offer password protection and encryption features. Analysis of the devices shows that they store their data in a Microsoft Access database. Due to a flaw in the encryption of this database, recovering the user’s password can be accomplished with minimal effort. Our analysis also showed that, rather than encrypting health information with the password chosen by the user, the devices stored the user’s password as a string in the database and then encrypted that database with a common password set by the manufacturer. This is another serious vulnerability. This article describes the weaknesses we discovered, outlines three critical flaws with the security model used by the devices, and recommends four guidelines for improving the security of similar devices. PMID:17460132
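
    One standard remedy for the flaw described above, offered here as a sketch rather than the authors' specific recommendation, is to derive the encryption key from the user's password with a key-derivation function, so the password itself never needs to be stored anywhere on the device:

```python
import hashlib, hmac, os

def derive_key(password: str, salt: bytes) -> bytes:
    """PBKDF2-HMAC-SHA256 key derivation: the password itself is never stored.
    Data would be encrypted under this derived key, so recovering the password
    requires a brute-force search rather than reading a field out of a database."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = os.urandom(16)            # per-device random salt, safe to store in the clear
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32            # 256-bit key

# A password attempt is checked by re-deriving and comparing in constant time:
attempt_ok = hmac.compare_digest(key, derive_key("correct horse battery staple", salt))
print(attempt_ok)
```

    This directly addresses the two weaknesses found in the reviewed devices: no stored password string to read out, and no manufacturer-wide master password protecting every unit.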

  13. Classification of pregnancy and labor contractions using a graph theory based analysis.

    PubMed

    Nader, N; Hassan, M; Falou, W; Diab, A; Al-Omar, S; Khalil, M; Marque, C

    2015-08-01

    In this paper, we propose a new framework to characterize the electrohysterographic (EHG) signals recorded during pregnancy and labor. The approach is based on the analysis of the propagation of the uterine electrical activity. The processing pipeline includes i) the estimation of the statistical dependencies between the different recorded EHG signals, ii) the characterization of the obtained connectivity matrices using network measures, and iii) the use of these measures in a clinical application: the classification between pregnancy and labor. Due to its robustness to volume conduction, we used the imaginary part of coherence to produce the connectivity matrix, which is then transformed into a graph. We evaluate the performance of several graph measures. We also compare the results with the parameters most used in the literature: the peak frequency combined with the propagation velocity (PV+PF). Our results show that the use of network measures is a promising tool for classifying labor and pregnancy contractions, with a small superiority of the graph strength over PV+PF.
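
    The pipeline above can be sketched as follows. The imaginary part of coherency is used because a zero-lag common source (volume conduction) contributes only to the real part, while a genuinely propagated signal arrives with a phase lag. All channel data below are synthetic; the 2 Hz rhythm, lags, and noise levels are assumptions, not EHG recordings.

```python
import numpy as np
from scipy.signal import csd

def imag_coherence(x, y, fs, nperseg=400):
    """Magnitude of the imaginary part of coherency between two channels."""
    f, sxy = csd(x, y, fs=fs, nperseg=nperseg)
    _, sxx = csd(x, x, fs=fs, nperseg=nperseg)
    _, syy = csd(y, y, fs=fs, nperseg=nperseg)
    return f, np.abs(np.imag(sxy / np.sqrt(sxx * syy)))

fs = 200.0
t = np.arange(0.0, 120.0, 1 / fs)
rng = np.random.default_rng(3)
base = np.sin(2 * np.pi * 2.0 * t)                  # synthetic 2 Hz rhythm
lagged = np.sin(2 * np.pi * 2.0 * t - np.pi / 4)    # genuinely propagated: phase lag
ch = [base + 0.5 * rng.normal(size=t.size),         # electrode 1
      lagged + 0.5 * rng.normal(size=t.size),       # electrode 2 (lagged copy)
      base + 0.5 * rng.normal(size=t.size)]         # electrode 3 (zero-lag copy)

# Connectivity matrix at the rhythm frequency, then node "strength"
C = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        if i != j:
            f, ic = imag_coherence(ch[i], ch[j], fs)
            C[i, j] = ic[int(np.argmin(np.abs(f - 2.0)))]
strength = C.sum(axis=1)          # graph strength: sum of each node's edge weights
print(np.round(C, 2))
```

    The zero-lag pair (electrodes 1 and 3) gets a near-zero edge even though ordinary coherence between them would be high, which is precisely the volume-conduction robustness the authors exploit.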

  14. Modification of Rat Lung Decellularization Protocol Based on Dynamic Conductometry of Working Solution.

    PubMed

    Kuevda, E V; Gubareva, E A; Gumenyuk, I S; Sotnichenko, A S; Gilevich, I V; Nakokhov, R Z; Rusinova, T V; Yudina, T G; Red'ko, A N; Alekseenko, S N

    2017-03-01

    We modified the protocol for obtaining biological scaffolds of rat lungs based on dynamic recording of the specific resistivity of the working detergent solution (conductometry) during perfusion decellularization. Terminating sodium deoxycholate exposure after the ionic-equilibrium plateau was reached did not impair the quality of decellularization and preserved structural matrix components, which was confirmed by morphological analysis and a quantitative assay of residual DNA.

  15. Did recent world record marathon runners employ optimal pacing strategies?

    PubMed

    Angus, Simon D

    2014-01-01

    We apply statistical analysis to high-frequency (1 km) split data for the two most recent world-record marathon runs: Run 1 (2:03:59, 28 September 2008) and Run 2 (2:03:38, 25 September 2011). Based on studies in the endurance cycling literature, we develop two principles to approximate 'optimal' pacing in the field marathon. By utilising GPS and weather data, we test, and then de-trend, each athlete's field response to gradient and headwind on course, recovering standardised proxies for power-based pacing traces. The resultant traces were analysed to ascertain whether either runner followed optimal pacing principles, and to characterise any deviations from optimality. Whereas gradient was insignificant, headwind was a significant factor in running speed variability for both runners, with Runner 2 targeting the (optimal) parallel variation principle, whilst Runner 1 did not. After adjusting for these responses, neither runner followed the (optimal) 'even' power pacing principle, with Runner 2's macro-pacing strategy fitting a sinusoidal oscillator with an exponentially expanding envelope whilst Runner 1 followed a U-shaped, quadratic form. The study suggests that: (a) better pacing strategy could provide elite marathon runners with an economical pathway to significant performance improvements at world-record level; and (b) the data and analysis herein are consistent with a complex-adaptive model of power regulation.
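
    The de-trending step can be illustrated with ordinary least squares: regress split speed on headwind and keep the residuals as the standardized pacing trace. The splits, wind profile, and response coefficient below are synthetic, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(4)
km = np.arange(1, 43)                             # 42 one-km splits (synthetic)
headwind = 3.0 * np.sin(2 * np.pi * km / 42.0)    # out-and-back wind profile, m/s
speed = 5.72 - 0.04 * headwind + 0.02 * rng.normal(size=km.size)  # split speed, m/s

# Regress split speed on headwind; the residual trace is the de-trended pacing proxy
slope, intercept = np.polyfit(headwind, speed, 1)
pacing = speed - slope * headwind                 # environmental response removed
print(round(float(slope), 3))                     # recovers the ~ -0.04 wind response
```

    In the paper's setting the same idea is applied to both gradient and headwind before the residual traces are tested against the two optimal-pacing principles.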

  16. A stacking method and its applications to Lanzarote tide gauge records

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; van Ruymbeke, Michel; Cadicheanu, Nicoleta

    2009-12-01

    A time-period analysis tool based on stacking is introduced in this paper. The original idea comes from the classical tidal analysis method. It is assumed that the period of each major tidal component is precisely determined from astronomical constants and is unchangeable with time at a given point on the Earth. We sum the tidal records at a fixed tidal-component center period T and then take the mean. Stacking can significantly increase the signal-to-noise ratio (SNR) once a sufficient number of stacking cycles is reached. The stacking results were fitted using a sinusoidal function, and the amplitude and phase of the fitting curve are computed by the least-squares method. The advantages of the method are that: (1) an individual periodic signal can be isolated by stacking; (2) one can construct a linear Stacking-Spectrum (SSP) by varying the stacking period Ts; (3) the time-period distribution of a singular component can be approximated by a Sliding-Stacking approach. The shortcoming of the method is that, in order to isolate a low-energy frequency or separate nearby frequencies, we need a long enough series with a high sampling rate. The method was tested on a numerical series and then applied to 1788 days of Lanzarote tide gauge records as an example.
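
    The stacking idea can be sketched in a few lines: fold the record at a trial period, average the cycles, then fit a sinusoid to the folded cycle by least squares to recover amplitude and phase. The synthetic "tide gauge" below uses an invented period and amplitude, and a trial period that is an integer number of samples for simplicity.

```python
import numpy as np

def stack(series, period_samples):
    """Fold a record at a trial period and average the cycles: components coherent
    at the trial period add, incoherent noise averages down as 1/sqrt(n_cycles)."""
    n = (len(series) // period_samples) * period_samples
    return series[:n].reshape(-1, period_samples).mean(axis=0)

rng = np.random.default_rng(5)
t = np.arange(24 * 1788)                   # ~1788 days of hourly samples (synthetic)
P = 25                                     # trial period in samples (invented)
noisy = 0.2 * np.sin(2 * np.pi * t / P) + 1.0 * rng.normal(size=t.size)

folded = stack(noisy, P)                   # SNR rises with ~1716 stacked cycles
# Least-squares sinusoid fit to the folded cycle gives amplitude and phase
ph = 2 * np.pi * np.arange(P) / P
design = np.column_stack([np.sin(ph), np.cos(ph)])
a, b = np.linalg.lstsq(design, folded, rcond=None)[0]
amp, phase = np.hypot(a, b), np.arctan2(b, a)
print(round(float(amp), 2))                # recovers the 0.2 amplitude buried in noise
```

    Sweeping the trial period P and recording the fitted amplitude at each value is exactly the Stacking-Spectrum (SSP) construction described above; doing so in a sliding window gives the Sliding-Stacking variant.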

  17. Paleoflood Data, Extreme Floods and Frequency: Data and Models for Dam Safety Risk Scenarios

    NASA Astrophysics Data System (ADS)

    England, J. F.; Godaire, J.; Klinger, R.

    2007-12-01

    Extreme floods and probability estimates are crucial components in dam safety risk analysis and scenarios for water-resources decision making. The field-based collection of paleoflood data provides needed information on the magnitude and probability of extreme floods at locations of interest in a watershed or region. The stratigraphic record present along streams in the form of terrace and floodplain deposits represents a direct indicator of the magnitude of large floods on a river, and may provide records 10 to 100 times longer than conventional stream gaging records of large floods. Paleoflood data are combined with gage and historical streamflow estimates to gain insight into flood frequency scaling, model extrapolations and uncertainty, and to provide input scenarios to risk analysis event trees. We illustrate current data collection and flood frequency modeling approaches via case studies in the western United States, including the American River in California and the Arkansas River in Colorado. These studies demonstrate the integration of applied field geology, hydraulics, and surface-water hydrology. Results from these studies illustrate the gains in information content on extreme floods, provide data-based means to separate flood generation processes, guide flood frequency model extrapolations, and reduce uncertainties. These data and scenarios strongly influence water resources management decisions.

  18. Mining of Business-Oriented Conversations at a Call Center

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nasukawa, Tetsuya; Watanabe, Hideo

    Recently it has become feasible to transcribe textual records of telephone conversations at call centers using automatic speech recognition. In this research, we extended a text mining system for call summary records and constructed a conversation mining system for business-oriented conversations at a call center. To acquire useful business insights from conversational data through a text mining system, it is critical to identify appropriate textual segments and expressions as the viewpoints to focus on. In the analysis of call summary data using a text mining system, experts defined the viewpoints for the analysis by looking at sample records and preparing dictionaries based on frequent keywords in the sample dataset. With conversations, however, it is difficult to identify such viewpoints manually and in advance, because the target data consists of complete transcripts that are often lengthy and redundant. In this research, we defined a model of business-oriented conversations and proposed a mining method that identifies segments with an impact on the outcomes of the conversations and then extracts useful expressions from each of these identified segments. In the experiment, we processed real datasets from a car rental service center and constructed a mining system. With this system, we show the effectiveness of the method based on the defined conversation model.

  19. Independent component analysis for automatic note extraction from musical trills

    NASA Astrophysics Data System (ADS)

    Brown, Judith C.; Smaragdis, Paris

    2004-05-01

    The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.
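The higher-order separation contrasted with PCA above can be illustrated with a minimal symmetric FastICA sketch in plain numpy (tanh contrast, whitening via eigendecomposition). The demo signals and 2x2 mixing matrix are hypothetical stand-ins, not the piano-trill data:

```python
import numpy as np

def fastica_2x2(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity) for a 2-source mixture."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten: rotate/scale so the mixed channels are uncorrelated, unit variance.
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    W = rng.standard_normal((2, 2))
    for _ in range(n_iter):
        g = np.tanh(W @ Z)                        # contrast nonlinearity
        W_new = g @ Z.T / Z.shape[1] - np.diag((1 - g**2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)           # symmetric decorrelation
        W = u @ vt
    return W @ Z

# Demo: two statistically independent sources mixed into two channels.
t = np.linspace(0, 8, 4000)
S = np.vstack([np.sin(2 * np.pi * 5 * t),               # pure tone
               np.sign(np.sin(2 * np.pi * 3.3 * t))])   # square wave
X = np.array([[1.0, 0.6], [0.5, 1.0]]) @ S              # mixing matrix
S_hat = fastica_2x2(X)                                  # recovered sources
```

Recovered components come back in arbitrary order and sign, which is why separation quality is usually judged by absolute correlation with the true sources.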

  20. Assessment of Evidence Base from Medical Debriefs Data on Space Motion Sickness Incidence and Treatment

    NASA Technical Reports Server (NTRS)

    Younker, D.R.; Daniels, V.R.; Boyd, J.L.; Putcha, L.

    2008-01-01

    An objective of this data compilation and analysis project is to examine the incidence and treatment efficacy of common pathophysiological disturbances during spaceflight. Analysis of medical debriefs data indicated that astronauts used medications to alleviate symptoms of four major ailments: sleep disturbances, space motion sickness (SMS), pain (headache, back pain) and sinus congestion. In the present compilation and analysis of SMS treatment during space missions, subject demographics (gender, age, first-time or repeat flyer), incidence and severity of SMS symptoms, and subjective treatment efficacy were examined from 317 crewmember debrief records spanning STS-1 through STS-89. Preliminary analysis of the data revealed that 50% of crewmembers reported SMS symptoms on at least one flight and 22% never experienced them. In addition, 387 medication dosing episodes were reported, and promethazine was the most commonly used medication. Results of the analysis of symptom checklists, medication use/efficacy, and gender and flight-record differences in incidence and treatment efficacy will be presented. Evidence gaps for treatment efficacy, along with medication use trend analysis, will be identified.

  1. Required number of records for ASCE/SEI 7 ground-motion scaling procedure

    USGS Publications Warehouse

    Reyes, Juan C.; Kalkan, Erol

    2011-01-01

    The procedures and criteria in 2006 IBC (International Council of Building Officials, 2006) and 2007 CBC (International Council of Building Officials, 2007) for the selection and scaling of ground motions for use in nonlinear response history analysis (RHA) of structures are based on ASCE/SEI 7 provisions (ASCE, 2005, 2010). According to ASCE/SEI 7, earthquake records should be selected from events whose magnitudes, fault distances, and source mechanisms are consistent with the maximum considered earthquake, and then scaled so that the average value of the 5-percent-damped response spectra for the set of scaled records is not less than the design response spectrum over the period range 0.2Tn to 1.5Tn (where Tn is the fundamental vibration period of the structure). If at least seven ground motions are analyzed, the design values of engineering demand parameters (EDPs) are taken as the average of the EDPs determined from the analyses; if fewer than seven are analyzed, the design values are taken as the maximum EDPs. ASCE/SEI 7 requires a minimum of three ground motions. These limits on the number of records are based on engineering experience rather than on a comprehensive evaluation. This study statistically examines the required number of records for the ASCE/SEI 7 procedure, such that the scaled records provide accurate, efficient, and consistent estimates of "true" structural responses. Based on elastic-perfectly-plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI 7 scaling procedure is applied to 480 sets of ground motions, each containing between three and ten records. The records in each set were selected (i) randomly, (ii) considering their spectral shapes, or (iii) considering their spectral shapes and design spectral-acceleration value, A(Tn). Compared with benchmark (that is, "true") responses from unscaled records drawn from a larger catalog of ground motions, the ASCE/SEI 7 scaling procedure is shown to be overly conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the EDPs, with reduced record-to-record variability of the responses. Consistency in accuracy and efficiency is achieved only if records are selected on the basis of their spectral shape and A(Tn).
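The suite-scaling criterion described above can be sketched numerically: find the smallest single factor that lifts the average spectrum of the set to or above the design spectrum across 0.2Tn to 1.5Tn. A simplified numpy sketch with hypothetical inputs (real practice also involves per-record scaling and further code-specified checks):

```python
import numpy as np

def asce7_suite_scale(periods, record_spectra, design_spectrum, Tn):
    """Smallest factor f such that the mean of the scaled 5%-damped spectra
    is not less than the design spectrum over 0.2*Tn..1.5*Tn.

    periods: (m,) spectral periods in seconds
    record_spectra: (n_records, m) spectral accelerations, one row per record
    design_spectrum: (m,) design spectral accelerations
    """
    mask = (periods >= 0.2 * Tn) & (periods <= 1.5 * Tn)
    mean_spec = record_spectra.mean(axis=0)
    # mean_spec * f >= design everywhere in the window; the binding period
    # is wherever the ratio design/mean is largest.
    return np.max(design_spectrum[mask] / mean_spec[mask])
```

The returned factor is governed by the period in the window where the unscaled suite average falls furthest below the design spectrum.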

  2. Calibration of an M L scale for South Africa using tectonic earthquake data recorded by the South African National Seismograph Network: 2006 to 2009

    NASA Astrophysics Data System (ADS)

    Saunders, Ian; Ottemöller, Lars; Brandt, Martin B. C.; Fourie, Christoffel J. S.

    2013-04-01

    A relation to determine local magnitude (ML) based on the original Richter definition is empirically derived from synthetic Wood-Anderson seismograms recorded by the South African National Seismograph Network. In total, 263 earthquakes in the distance range 10 to 1,000 km, representing 1,681 trace amplitudes measured in nanometers from synthesized Wood-Anderson records on the vertical channel, were considered to derive an attenuation relation appropriate for South Africa through multiple regression analysis. Additionally, station corrections were determined for 26 stations during the regression analysis, resulting in values ranging between -0.31 and 0.50. The most appropriate ML scale for South Africa from this study satisfies the equation: ML = log10(A) + 1.149 log10(R) + 0.00063 R + 2.04 - S. The anelastic attenuation term derived from this study indicates that ground-motion attenuation is significantly different from Southern California but comparable with stable continental regions.
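The derived relation can be applied directly; a small sketch with a hypothetical function name, where A is the synthesized Wood-Anderson amplitude in nanometers, R the distance in km (the calibration covers 10 to 1,000 km), and S the station correction:

```python
import math

def local_magnitude(A_nm, R_km, station_corr=0.0):
    """South African ML from the paper's derived relation:
    ML = log10(A) + 1.149*log10(R) + 0.00063*R + 2.04 - S
    """
    return (math.log10(A_nm) + 1.149 * math.log10(R_km)
            + 0.00063 * R_km + 2.04 - station_corr)
```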

  3. Rapid earthquake detection through GPU-Based template matching

    NASA Astrophysics Data System (ADS)

    Mu, Dawei; Lee, En-Jui; Chen, Po

    2017-12-01

    The template-matching algorithm (TMA) has been widely adopted for improving the reliability of earthquake detection. The TMA is based on calculating the normalized cross-correlation coefficient (NCC) between a collection of selected template waveforms and the continuous waveform recordings of seismic instruments. In realistic applications, the computational cost of the TMA is much higher than that of traditional techniques. In this study, we provide an analysis of the TMA and show how the GPU architecture provides an almost ideal environment for accelerating the TMA and NCC-based pattern recognition algorithms in general. So far, our best-performing GPU code has achieved a speedup factor of more than 800 with respect to a common sequential CPU code. We demonstrate the performance of our GPU code using seismic waveform recordings from the ML 6.6 Meinong earthquake sequence in Taiwan.
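The NCC computation at the heart of the TMA can be sketched on the CPU as follows (hypothetical function; the GPU speedup reported above comes from evaluating the many independent lags in parallel rather than in a Python loop):

```python
import numpy as np

def ncc_scan(trace, template):
    """Normalized cross-correlation of a template against every lag of a
    continuous trace. Each output value is the correlation coefficient
    between the template and one sliding window, so values lie in [-1, 1]."""
    n = len(template)
    zt = (template - template.mean()) / template.std()
    out = np.empty(len(trace) - n + 1)
    for i in range(out.size):
        w = trace[i:i + n]
        s = w.std()
        out[i] = 0.0 if s == 0 else np.mean((w - w.mean()) / s * zt)
    return out
```

Detections are declared where the NCC trace exceeds a chosen threshold; the argmax locates the best-matching window.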

  4. Identification of Bearing Failure Using Signal Vibrations

    NASA Astrophysics Data System (ADS)

    Yani, Irsyadi; Resti, Yulia; Burlian, Firmansyah

    2018-04-01

    Vibration analysis can be used to identify damage to mechanical systems such as journal bearings. Failure can be identified by observing the vibration spectrum obtained by measuring the vibration signal occurring in a mechanical system. Bearings are among the machine elements most commonly used in mechanical systems. The main purpose of this research is to monitor bearing condition and to identify bearing failure in a mechanical system by observing the resulting vibration. In this study, data were collected by recording the sounds caused by vibration of the mechanical system, and a database of bearing failures was then built from these vibration sound recordings. The next step was to group the bearing damage by type based on the database obtained. The results show that the success rate in identifying bearing damage is 98%.

  5. Novel method based on video tracking system for simultaneous measurement of kinematics and flow in the wake of a freely swimming fish

    NASA Astrophysics Data System (ADS)

    Wu, Guanhao; Yang, Yan; Zeng, Lijiang

    2006-11-01

    A novel method based on a video tracking system for the simultaneous measurement of kinematics and flow in the wake of a freely swimming fish is described. Spontaneous and continuous swimming behaviors of a variegated carp (Cyprinus carpio) are recorded by two cameras mounted on a translation stage that is controlled to track the fish. By processing the images recorded during tracking, the detailed kinematics, based on calculated midlines, and a quantitative analysis of the flow in the wake during a low-speed turn and burst-and-coast swimming are revealed. We also draw the trajectory of the fish during a continuous swimming bout containing several moderate maneuvers. The results prove that our method is effective for studying fish maneuvers from both kinematic and hydrodynamic viewpoints.

  6. Isolation, purification and analysis of dissolved organic carbon from Gohagoda uncontrolled open dumpsite leachate, Sri Lanka.

    PubMed

    Vithanage, Meththika; Wijesekara, Hasintha; Mayakaduwa, S S

    2017-07-01

    Dissolved organic carbon (DOC) fractions were extracted and analyzed from the leachate of an uncontrolled dumpsite at Gohagoda, Sri Lanka. The DOC fractions, humic acid (HA), fulvic acid (FA) and the hydrophilic (Hyd) fraction, were isolated and purified using resin techniques. Spectroscopic techniques and elemental analysis were performed to characterize the DOCs. The maximum TOC and DOC values recorded were 56,955 and 28,493 mg/L, respectively. Of the total DOC, the Hyd fraction dominated, accounting for ∼60%, while HA and FA constituted ∼22% and ∼17%, respectively, indicating the mature phase of the dumpsite. The elemental analysis of the DOCs revealed carbon content varying as HA > FA > Hyd, while hydrogen and nitrogen were similar in each fraction. The N/C ratio for HA was recorded as 0.18, following a similar trend to old dumpsite leachate elsewhere. The O/C ratios for HA and FA were as high as 1.0 and 9.3, respectively, indicating a high degree of carbon mineralization in the leachates. A high content of carboxylic, phenolic and lactone groups was observed in all DOCs, revealing their potential to transport toxic substances. The results strongly suggest that DOCs in dumpsite leachate pose a risk to the aquatic and terrestrial environment.

  7. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
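The spike-detection stage can be illustrated with a toy threshold detector on a fluorescence trace (a simplified stand-in for illustration only; NeuroCa's actual algorithms are more elaborate):

```python
import numpy as np

def detect_onsets(trace, k=3.0):
    """Flag rising edges where a fluorescence trace crosses mean + k*std.
    Returns the sample indices of the below-to-above threshold crossings,
    i.e. one index per detected calcium transient onset."""
    thr = trace.mean() + k * trace.std()
    above = trace > thr
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1
```

On a real ΔF/F trace, the threshold would typically be derived from a robust baseline estimate rather than the global mean and standard deviation.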

  8. Special Flood Hazard Evaluation Report, Maumee River, Defiance and Paulding Counties, Ohio

    DTIC Science & Technology

    1988-01-01

    into the Flood Flow Frequency Analysis (FFFA) computer program (Reference 3) to determine the discharge-frequency relationship for the Maumee River...although the flood may occur in any year. It is based on statistical analysis of streamflow records available for the watershed and analysis of rainfall...

  9. Recording Brain Electromagnetic Activity During the Administration of the Gaseous Anesthetic Agents Xenon and Nitrous Oxide in Healthy Volunteers.

    PubMed

    Pelentritou, Andria; Kuhlmann, Levin; Cormack, John; Woods, Will; Sleigh, Jamie; Liley, David

    2018-01-13

    Anesthesia arguably provides one of the only systematic ways to study the neural correlates of global consciousness/unconsciousness. However, to date most neuroimaging or neurophysiological investigations in humans have been confined to the study of γ-Amino-Butyric-Acid-(GABA)-receptor-agonist-based anesthetics, while the effects of the dissociative N-Methyl-D-Aspartate-(NMDA)-receptor-antagonist-based anesthetics ketamine, nitrous oxide (N2O) and xenon (Xe) are largely unknown. This paper describes the methods underlying the simultaneous recording of magnetoencephalography (MEG) and electroencephalography (EEG) from healthy males during inhalation of the gaseous anesthetic agents N2O and Xe. Combining MEG and EEG data enables the assessment of electromagnetic brain activity during anesthesia at high temporal, and moderate spatial, resolution. Here we describe a detailed protocol, refined over multiple recording sessions, that includes subject recruitment, anesthesia equipment setup in the MEG scanner room, data collection and basic data analysis. In this protocol, each participant is exposed to varying levels of Xe and N2O in a repeated-measures cross-over design. Following relevant baseline recordings, participants are exposed to step-wise increasing inspired concentrations of Xe and N2O of 8, 16, 24 and 42%, and 16, 32 and 47%, respectively, during which their level of responsiveness is tracked with an auditory continuous performance task (aCPT). Results are presented for a number of recordings to highlight the sensor-level properties of the raw data, the spectral topography, the minimization of head movements, and the unequivocal level-dependent effects on the auditory evoked responses. This paradigm describes a general approach to the recording of electromagnetic signals associated with the action of different kinds of gaseous anesthetics, which can be readily adapted for use with volatile and intravenous anesthetic agents.
It is expected that the method outlined can contribute to the understanding of the macro-scale mechanisms of anesthesia by enabling methodological extensions involving source space imaging and functional network analysis.

  10. Is it acceptable to video-record palliative care consultations for research and training purposes? A qualitative interview study exploring the views of hospice patients, carers and clinical staff.

    PubMed

    Pino, Marco; Parry, Ruth; Feathers, Luke; Faull, Christina

    2017-09-01

    Research using video recordings can advance understanding of healthcare communication and improve care, but making and using video recordings carries risks. To explore views of hospice patients, carers and clinical staff about whether videoing patient-doctor consultations is acceptable for research and training purposes. We used semi-structured group and individual interviews to gather hospice patients, carers and clinical staff views. We used Braun and Clark's thematic analysis. Interviews were conducted at one English hospice to inform the development of a larger video-based study. We invited patients with capacity to consent and whom the care team judged were neither acutely unwell nor severely distressed (11), carers of current or past patients (5), palliative medicine doctors (7), senior nurses (4) and communication skills educators (5). Participants viewed video-based research on communication as valuable because of its potential to improve communication, care and staff training. Video-based research raised concerns including its potential to affect the nature and content of the consultation and threats to confidentiality; however, these were not seen as sufficient grounds for rejecting video-based research. Video-based research was seen as acceptable and useful providing that measures are taken to reduce possible risks across the recruitment, recording and dissemination phases of the research process. Video-based research is an acceptable and worthwhile way of investigating communication in palliative medicine. Situated judgements should be made about when it is appropriate to involve individual patients and carers in video-based research on the basis of their level of vulnerability and ability to freely consent.

  11. Is it acceptable to video-record palliative care consultations for research and training purposes? A qualitative interview study exploring the views of hospice patients, carers and clinical staff

    PubMed Central

    Pino, Marco; Parry, Ruth; Feathers, Luke; Faull, Christina

    2017-01-01

    Background: Research using video recordings can advance understanding of healthcare communication and improve care, but making and using video recordings carries risks. Aim: To explore views of hospice patients, carers and clinical staff about whether videoing patient–doctor consultations is acceptable for research and training purposes. Design: We used semi-structured group and individual interviews to gather hospice patients, carers and clinical staff views. We used Braun and Clark’s thematic analysis. Setting/participants: Interviews were conducted at one English hospice to inform the development of a larger video-based study. We invited patients with capacity to consent and whom the care team judged were neither acutely unwell nor severely distressed (11), carers of current or past patients (5), palliative medicine doctors (7), senior nurses (4) and communication skills educators (5). Results: Participants viewed video-based research on communication as valuable because of its potential to improve communication, care and staff training. Video-based research raised concerns including its potential to affect the nature and content of the consultation and threats to confidentiality; however, these were not seen as sufficient grounds for rejecting video-based research. Video-based research was seen as acceptable and useful providing that measures are taken to reduce possible risks across the recruitment, recording and dissemination phases of the research process. Conclusion: Video-based research is an acceptable and worthwhile way of investigating communication in palliative medicine. Situated judgements should be made about when it is appropriate to involve individual patients and carers in video-based research on the basis of their level of vulnerability and ability to freely consent. PMID:28590153

  12. Algorithm for Video Summarization of Bronchoscopy Procedures

    PubMed Central

    2011-01-01

    Background The duration of bronchoscopy examinations varies considerably depending on the diagnostic and therapeutic procedures used. It can last more than 20 minutes if a complex diagnostic work-up is included. With wide access to videobronchoscopy, the whole procedure can be recorded as a video sequence. Common practice relies on an active attitude of the bronchoscopist who initiates the recording process and usually chooses to archive only selected views and sequences. However, it may be important to record the full bronchoscopy procedure as documentation when liability issues are at stake. Furthermore, an automatic recording of the whole procedure enables the bronchoscopist to focus solely on the performed procedures. Video recordings registered during bronchoscopies include a considerable number of frames of poor quality due to blurry or unfocused images. It seems that such frames are unavoidable due to the relatively tight endobronchial space, rapid movements of the respiratory tract due to breathing or coughing, and secretions which occur commonly in the bronchi, especially in patients suffering from pulmonary disorders. Methods The use of recorded bronchoscopy video sequences for diagnostic, reference and educational purposes could be considerably extended with efficient, flexible summarization algorithms. Thus, the authors developed a prototype system to create shortcuts (called summaries or abstracts) of bronchoscopy video recordings. Such a system, based on models described in previously published papers, employs image analysis methods to exclude frames or sequences of limited diagnostic or education value. Results The algorithm for the selection or exclusion of specific frames or shots from video sequences recorded during bronchoscopy procedures is based on several criteria, including automatic detection of "non-informative" frames, frames showing the branching of the airways, and frames including pathological lesions.
Conclusions The paper focuses on the challenge of generating summaries of bronchoscopy video recordings. PMID:22185344

  13. Observed SWE trends and climate analysis for Northwest Pacific North America: validation for future projection of SWE using the CRCM and VIC

    NASA Astrophysics Data System (ADS)

    Bennett, K. E.; Bronaugh, D.; Rodenhuis, D.

    2008-12-01

    Observational databases of snow water equivalent (SWE) have been collected from Alaska, the western US states, the Canadian provinces of British Columbia, Alberta and Saskatchewan, and the territories of the NWT and the Yukon. These databases were initially validated to remove inconsistencies and errors in the station records, dates or geographic co-ordinates of the stations. The cleaned data were then analysed for historical (1950 to 2006) trends using emerging techniques for trend detection based on first-of-the-month estimates for January to June. Analysis of SWE showed spatial variability in the count of records across the six-month period, and this study illustrated differences between Canadian and US (or northern and southern) collection practices. Two different data sets (one gridded and one station-based) were then used to analyse April 1st records, for which there was the greatest spatial spread of station records for analysis with climate information. Initial results show spatial variability (in both magnitude and direction) in the trend results, and climate correlations and principal components indicate different drivers of change in SWE across the western US, Canada and north to Alaska. These results will be used to validate future predictions of SWE that are being undertaken using the Canadian Regional Climate Model (CRCM) and the Variable Infiltration Capacity (VIC) hydrologic model for western North America (CRCM) and British Columbia (VIC).

  14. Reappraisal of fetal abdominal circumference in an Asian population: analysis of 50,131 records.

    PubMed

    Lu, Szu-Ching; Chang, Chiung-Hsin; Yu, Chen-Hsiang; Kang, Lin; Tsai, Pei-Ying; Chang, Fong-Ming

    2008-03-01

    Fetuses from different populations may show different growth patterns. In obstetrics, fetal abdominal circumference (AC) is a very useful index for assessing fetal growth. In this study, we attempted to establish normal fetal growth curves of AC in an Asian population in South Taiwan. We reviewed our computerized ultrasound database of fetal AC records from January 1991 to December 2006. During the 16-year study period, only fetuses examined by ultrasonography at gestational ages between 14 and 41 weeks were included. We excluded extreme outlying records at both tails after initial analysis. Eventually, 50,131 AC records were included in the final analysis. The observed gestation-specific AC values and the predicted AC values were calculated. The best-fit regression equation of AC versus gestational age is a second-order polynomial. In general, fetal AC values in our population showed patterns similar to those in Western populations. In addition, we established a table of predicted AC values by gestational age, including the 5th, 10th, 50th, 90th and 95th centiles, for clinical reference. To the best of our knowledge, our series is the largest sample of AC reported in the medical literature. We believe that the gestational age-specific nomogram of fetal AC is important for further clinical assessment of fetal growth.
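The curve-fitting step, a second-order polynomial of AC versus gestational age, can be sketched with np.polyfit on synthetic stand-in data. The coefficients below are illustrative inventions, not the paper's fitted values:

```python
import numpy as np

# Synthetic stand-in data: assumed true curve AC = -0.2*GA^2 + 18*GA - 60 (mm)
# plus measurement noise; illustrative only.
rng = np.random.default_rng(0)
ga = rng.uniform(14, 41, 5000)                   # gestational age, weeks
ac = -0.2 * ga**2 + 18 * ga - 60 + rng.normal(0, 5, ga.size)

coef = np.polyfit(ga, ac, deg=2)                 # best-fit 2nd-order polynomial
predict = np.poly1d(coef)                        # predicted AC at a given GA
```

Centile curves such as the 5th and 95th would then be built around the predicted mean using the residual spread at each gestational age.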

  15. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. Their performance in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving-average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among the MSPE measures. Results CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes in EEG recordings and statistically distinguish the awake, unconscious and recovery of consciousness (RoC) states. Compared to single-scale SPE and RPE, the MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed at loss of consciousness.
Conclusions MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug-effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803
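Single-scale permutation entropy, the building block of the MSPE measures discussed above, can be sketched as follows (a minimal numpy implementation; the multiscale variants first coarse-grain or moving-average the signal):

```python
import math
import numpy as np

def permutation_entropy(x, m=3, tau=1):
    """Normalized Shannon permutation entropy of a 1-D signal.

    Counts the ordinal patterns (rank orderings) of m samples spaced tau
    apart, then returns the Shannon entropy of the pattern distribution
    normalized by log(m!): 0 for a fully regular signal, near 1 for one
    with maximal pattern diversity.
    """
    n = len(x) - (m - 1) * tau
    idx = np.arange(n)[:, None] + tau * np.arange(m)   # embedding windows
    patterns = np.argsort(x[idx], axis=1)              # ordinal pattern per window
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / n
    return float(-(p * np.log(p)).sum() / math.log(math.factorial(m)))
```

A monotonic ramp produces a single ordinal pattern (entropy 0), while white noise spreads probability across all m! patterns (entropy near 1), which is the contrast these measures exploit between deep anesthesia and the awake EEG.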

  16. Acoustic communication at the water's edge: evolutionary insights from a mudskipper.

    PubMed

    Polgar, Gianluca; Malavasi, Stefano; Cipolato, Giacomo; Georgalas, Vyron; Clack, Jennifer A; Torricelli, Patrizia

    2011-01-01

    Coupled behavioural observations and acoustical recordings of aggressive dyadic contests showed that the mudskipper Periophthalmodon septemradiatus communicates acoustically while out of water. An analysis of intraspecific variability showed that specific acoustic components may act as tags for individual recognition, further supporting the sounds' communicative value. A correlative analysis amongst acoustical properties and video-acoustical recordings in slow-motion supported first hypotheses on the emission mechanism. Acoustic transmission through the wet exposed substrate was also discussed. These observations were used to support an "exaptation hypothesis", i.e. the maintenance of key adaptations during the first stages of water-to-land vertebrate eco-evolutionary transitions (based on eco-evolutionary and palaeontological considerations), through a comparative bioacoustic analysis of aquatic and semiterrestrial gobiid taxa. In fact, a remarkable similarity was found between mudskipper vocalisations and those emitted by gobioids and other soniferous benthonic fishes.

  17. Determining team cognition from delay analysis using cross recurrence plot.

    PubMed

    Hajari, Nasim; Cheng, Irene; Bin Zheng; Basu, Anup

    2016-08-01

    Team cognition is an important factor in evaluating and determining team performance. Forming a team with good shared cognition is even more crucial for laparoscopic surgery applications. In this study, we analyzed the eye-tracking data of two surgeons during a simulated laparoscopic operation and performed cross recurrence analysis (CRA) on the recorded data to study the delay behaviour of good-performer and poor-performer teams. Dual eye-tracking data for twenty-two dyad teams were recorded during a laparoscopic task, and the teams were divided into good performers and poor performers based on task times. We then studied the delay between the two team members in each group. The results indicated that good-performer teams show a smaller delay than poor-performer teams. This finding is consistent with gaze-overlap analysis between team members and therefore provides good evidence of shared cognition between team members.

  18. #LancerHealth: Using Twitter and Instagram as a tool in a campus wide health promotion initiative.

    PubMed

    Santarossa, Sara; Woodruff, Sarah J

    2018-02-05

    The present study aimed to explore using popular technology that people already have/use as a health promotion tool, in a campus-wide social media health promotion initiative entitled #LancerHealth. During a two-week period, the university community was asked to share photos on Twitter and Instagram of "What does being healthy on campus look like to you?", while tagging the image with #LancerHealth. All publicly tagged media were collected using the Netlytic software and analysed. Text analysis (N=234 records, Twitter; N=141 records, Instagram) revealed that the majority of the conversation was positive and focused on health and the university. Social network analysis, based on five network properties, showed a small network with little interaction. Lastly, photo coding analysis (N=71 unique images) indicated that the majority of the shared images were of physical activity (52%) and on campus (80%). Further research into this area is warranted.

  19. Acoustic Communication at the Water's Edge: Evolutionary Insights from a Mudskipper

    PubMed Central

    Polgar, Gianluca; Malavasi, Stefano; Cipolato, Giacomo; Georgalas, Vyron; Clack, Jennifer A.; Torricelli, Patrizia

    2011-01-01

    Coupled behavioural observations and acoustical recordings of aggressive dyadic contests showed that the mudskipper Periophthalmodon septemradiatus communicates acoustically while out of water. An analysis of intraspecific variability showed that specific acoustic components may act as tags for individual recognition, further supporting the sounds' communicative value. A correlative analysis amongst acoustical properties and video-acoustical recordings in slow-motion supported first hypotheses on the emission mechanism. Acoustic transmission through the wet exposed substrate was also discussed. These observations were used to support an “exaptation hypothesis”, i.e. the maintenance of key adaptations during the first stages of water-to-land vertebrate eco-evolutionary transitions (based on eco-evolutionary and palaeontological considerations), through a comparative bioacoustic analysis of aquatic and semiterrestrial gobiid taxa. In fact, a remarkable similarity was found between mudskipper vocalisations and those emitted by gobioids and other soniferous benthonic fishes. PMID:21738663

  20. Signal analysis of accelerometry data using gravity-based modeling

    NASA Astrophysics Data System (ADS)

    Davey, Neil P.; James, Daniel A.; Anderson, Megan E.

    2004-03-01

    Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of the data is difficult due to interference sources, including the interaction of external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometry outputs were derived, thus giving an ideal, or noiseless, dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for investigation of swimming. Model data were compared against recorded data and showed strong correlation. Comparison of recorded and modelled data can be used to identify changes in body movement; this is especially useful when cyclic patterns are present in the activity. Strong correlations between data sets allowed the development of signal processing algorithms for swimming stroke analysis, which were developed first on the pure noiseless dataset and then applied to performance data. Video analysis was also used to validate study results and has shown potential to provide acceptable results.
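
    The model-versus-recording comparison rests on correlating an ideal, noiseless dataset with noisy measurements. A minimal sketch, with an assumed stroke rate and noise level standing in for real swim data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal (noiseless) model output: the constant gravity component along
# the sensor axis plus a cyclic stroke pattern.
t = np.linspace(0, 10, 500)            # 10 s sampled at 50 Hz
stroke_hz = 0.8                        # assumed stroke rate
model = 9.81 + 1.5 * np.sin(2 * np.pi * stroke_hz * t)

# "Recorded" data: the same movement corrupted by sensor noise and
# interference, modelled here as Gaussian noise.
recorded = model + rng.normal(0.0, 0.3, t.size)

# A strong correlation indicates the model captures the underlying
# cyclic movement; a drop in correlation flags a change in body movement.
r = np.corrcoef(model, recorded)[0, 1]
```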

  1. Rapid climate change from north Andean Lake Fúquene pollen records driven by obliquity: implications for a basin-wide biostratigraphic zonation for the last 284 ka

    NASA Astrophysics Data System (ADS)

    Bogotá-A, R. G.; Groot, M. H. M.; Hooghiemstra, H.; Lourens, L. J.; Van der Linden, M.; Berrio, J. C.

    2011-11-01

    This paper compares a new super-high-resolution pollen record from a central location in Lake Fúquene (4°N) with three pollen records from marginal sites in the same lake basin, located at 2540 m elevation in the Eastern Cordillera of Colombia. We harmonized the pollen sum of all records, and provided previously published records of climate change with an improved age model using a new approach for long continental pollen records. We moved away from subjective curve matching and applied a more objective procedure including radiocarbon ages, cyclostratigraphy, and orbital tuning, using the new 284 ka long Fúquene Basin Composite record (Fq-BC) as the backbone (Groot et al., 2011). We showed that a common ~9 m cycle in the arboreal pollen percentage (AP%) records reflects obliquity forcing and drives vegetational and climatic change. The AP% records were tuned to the 41 kyr component filtered from the standard benthic δ18O LR04 record. Changes in sediment supply to the lake are reflected in concert by the four records, making frequency analysis in the depth domain an adequate method to compare records from the same basin. We calibrated the original 14C ages and, where necessary, used biostratigraphic correlation, i.e. for records shorter than one obliquity cycle. Pollen records from the periphery of the lake showed changes in the abundance of Alnus and Weinmannia forests more clearly, while the centrally located record Fq-9C shows a more integrated signal of regional vegetation change. The revised age models show that core Fq-2 reflects the last 44 ka and composite record Fq-7C the last 85.5 ka. Marginally located core Fq-3 has an age of 133 ka at 32 m core depth, and the lowermost 11 m of sediments appear to be of older but unknown age. The longest record, Fq-BC, shows ~60 yr resolution over the period of 284-27 ka.
All pollen records support a common regional vegetation development, leading to a robust reconstruction of long series of submillennial climate oscillations reflecting Dansgaard-Oeschger (DO) cycles. Reconstructed climate variability in the tropical Andes since marine isotope stage (MIS) 8 compares well with NGRIP (δ18O-based), EPICA Dome C (δD-based) and the Mediterranean sea surface temperature record MD01-2443/44 (UK'37-based), underpinning the global significance of the climate record from this tropical Andean lake. A basin-wide biostratigraphy is presented, and we concluded, albeit with varying robustness, that each core is representative of regional vegetational and climatic change.
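
    The tuning step of filtering a 41 kyr (obliquity) component out of a record can be sketched with a simple FFT band-pass. The band edges and the synthetic record below are illustrative assumptions, not the authors' processing chain:

```python
import numpy as np

def bandpass_41kyr(signal, dt_kyr, low=1/54, high=1/33):
    """Isolate the obliquity (~41 kyr) band of an evenly sampled record
    with a brick-wall FFT filter; frequencies are in cycles per kyr."""
    freqs = np.fft.rfftfreq(signal.size, d=dt_kyr)
    spec = np.fft.rfft(signal - signal.mean())
    spec[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spec, n=signal.size)

# Synthetic AP%-like record: a 41 kyr obliquity cycle plus a 23 kyr
# precession cycle, sampled every 1 kyr over 284 kyr.
t = np.arange(0.0, 284.0, 1.0)
record = np.sin(2 * np.pi * t / 41) + 0.5 * np.sin(2 * np.pi * t / 23)
obliquity = bandpass_41kyr(record, dt_kyr=1.0)
```

    The filtered component recovers the 41 kyr cycle while suppressing the precession band, which is the kind of target signal a record can then be tuned against.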

  2. An integrated platform for simultaneous multi-well field potential recording and Fura-2-based calcium transient ratiometry in human induced pluripotent stem cell (hiPSC)-derived cardiomyocytes.

    PubMed

    Rast, Georg; Weber, Jürgen; Disch, Christoph; Schuck, Elmar; Ittrich, Carina; Guth, Brian D

    2015-01-01

    Human induced pluripotent stem cell-derived cardiomyocytes are available from various sources and they are being evaluated for safety testing. Several platforms are available offering different assay principles and read-out parameters: patch-clamp and field potential recording, imaging or photometry, impedance measurement, and recording of contractile force. Routine use will establish which assay principle and which parameters best serve the intended purpose. We introduce a combination of field potential recording and calcium ratiometry from spontaneously beating cardiomyocytes as a novel assay providing a complementary read-out parameter set. Field potential recording is performed using a commercial multi-well multi-electrode array platform. Calcium ratiometry is performed using a fiber optic illumination and silicon avalanche photodetectors. Data condensation and statistical analysis are designed to enable statistical inference of differences and equivalence with regard to a solvent control. Simultaneous recording of field potentials and calcium transients from spontaneously beating monolayers was done in a nine-well format. Calcium channel blockers (e.g. nifedipine) and a blocker of calcium store release (ryanodine) can be recognized and discriminated based on the calcium transient signal. An agonist of L-type calcium channels, FPL 64176, increased and prolonged the calcium transient, whereas BAY K 8644, another L-type calcium channel agonist, had no effect. Both FPL 64176 and various calcium channel antagonists have chronotropic effects, which can be discriminated from typical "chronotropic" compounds, like (±)isoprenaline (positive) and arecaidine propargyl ester (negative), based on their effects on the calcium transient. 
Despite technical limitations in temporal resolution and in exactly matching the composite calcium transient with the field potential of a subset of cells, the combined recording platform enables a refined interpretation of the field potential recording and a more reliable identification of drug effects on calcium handling.
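
    Fura-2 ratiometry converts the background-corrected 340/380 nm fluorescence ratio into a calcium estimate via the standard Grynkiewicz equation. A minimal sketch; the calibration constants (kd, rmin, rmax, beta) and the toy transient are placeholders, not values from this platform:

```python
import numpy as np

def fura2_calcium(f340, f380, kd=224.0, rmin=0.2, rmax=8.0, beta=10.0):
    """Grynkiewicz estimate of [Ca2+] (nM) from background-corrected
    Fura-2 fluorescence at 340 and 380 nm excitation.  kd, rmin, rmax
    and beta are calibration constants; the values here are placeholders."""
    r = f340 / f380
    return kd * beta * (r - rmin) / (rmax - r)

# Toy transient: during a beat, emission under 340 nm excitation rises
# while emission under 380 nm excitation falls.
f340 = np.array([1.0, 1.4, 2.0, 1.4, 1.0])
f380 = np.array([1.0, 0.8, 0.6, 0.8, 1.0])
ca = fura2_calcium(f340, f380)
```

    Because the two excitation signals move in opposite directions, the ratio cancels dye loading and illumination artefacts that affect both channels equally.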

  3. Rural telemedicine project in northern New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zink, S.; Hahn, H.; Rudnick, J.

    A virtual electronic medical record system is being securely deployed over the Internet in northern New Mexico using TeleMed, a multimedia medical records management system that uses CORBA-based client-server technology and a distributed database architecture. The goal of the NNM Rural Telemedicine Project is to implement TeleMed in fifteen rural clinics and two hospitals within a 25,000 square mile area of northern New Mexico. Evaluation of the project consists of three components: job task analysis, an audit of immunized children, and time motion studies. Preliminary results of the evaluation components are presented.

  4. Stage description, new combination and new records of Neotropical Brachycercinae (Ephemeroptera: Caenidae).

    PubMed

    Angeli, Kamila Batista; Salles, Frederico Falcão; Paresque, Roberta; Molineri, Carlos; Lima, Lucas Ramos Costa

    2016-03-08

    We present taxonomic contributions and new records for Neotropical Brachycercinae based on material from Brazil. We performed a phylogenetic analysis in order to test the relationship between Alloretochus Sun & McCafferty, 2008 and Latineosus Sun & McCafferty, 2008, and Alloretochus sigillatus was recovered in the Latineosus clade. Therefore, we propose a new combination, Latineosus sigillatus comb. n. The nymph of Latineosus sigillatus is described and is associated with the imago through molecular tools. Moreover, Alloretochus peruanicus (Soldán, 1986) is reported for the first time from Brazil.

  5. OPTOELECTRONICS, FIBER OPTICS, AND OTHER ASPECTS OF QUANTUM ELECTRONICS: Use of a photothermoplastic disk in memories based on one-dimensional holograms

    NASA Astrophysics Data System (ADS)

    Gulanyan, É. Kh; Mikaélyan, A. L.; Molchanova, L. V.; Sidorov, Vladimir A.; Fedorov, I. V.

    1989-08-01

    An analysis was made of the results of multichannel recording of one-dimensional holograms in a reusable photothermoplastic carrier, particularly in the form of a disk. It was found experimentally that, in the development and erasure of optical data recorded in a photothermoplastic carrier, one could use not only heating by an electric current, but also heating by YAG:Nd laser radiation (λ = 1.06 μm) or semiconductor laser radiation (λ = 0.82 μm).

  6. NACA documents database project

    NASA Technical Reports Server (NTRS)

    Smith, Ruth S.

    1991-01-01

    The plan to get the entire National Advisory Committee on Aeronautics (NACA) collection online, with quality records, led to the NACA Documents Database Project. The project has a twofold purpose: (1) to develop the definitive bibliography of NACA-produced and/or NACA-held documents; and (2) to make that bibliography and the associated documents available to the aerospace community. This study supports the first objective by providing an analysis of the NACA collection and its bibliographic records, and supports the second objective by defining the NACA archive and recommending methodologies for meeting the project objectives.

  7. [Meteorological risk factors of stroke].

    PubMed

    Lebedev, I A; Gilvanov, V A; Akinina, S A; Anishchenko, L I

    2013-01-01

    Based on a correlation analysis of strokes recorded in Khanty-Mansiysk over 5 years and standard meteorological factors, we found a significant relationship between the frequency of stroke and daily temperature amplitude. A positive correlation was identified between the frequency of stroke and between-day changes in air temperature in combination with changes in atmospheric pressure over 3 h. The maximal number of strokes was recorded in December, April, May and July, and the minimal number in January, June, August and September. The frequency of stroke and of fatal outcomes did not depend on the season.

  8. Prospecting by sampling and analysis of airborne particulates and gases

    DOEpatents

    Sehmel, G.A.

    1984-05-01

    A method is claimed for prospecting by sampling airborne particulates or gases at a ground position and recording wind direction values at the time of sampling. The samples are subsequently analyzed to determine the concentrations of a desired material or the ratios of the desired material to other identifiable materials in the collected samples. By comparing the measured concentrations or ratios to expected background data in the vicinity sampled, one can select recorded wind directions indicative of the upwind position of the land-based source of the desired material.
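
    The selection step the claim describes, comparing measured concentrations against expected background and keeping the wind directions that exceed it, can be sketched as follows (all numbers hypothetical):

```python
import numpy as np

# Hypothetical samples: wind direction (degrees) recorded at sampling
# time and the measured concentration of the desired material.
wind_deg = np.array([10, 45, 90, 135, 180, 225, 270, 315])
conc = np.array([2.1, 2.0, 9.5, 8.7, 2.2, 1.9, 2.0, 2.1])
background = 2.5  # expected background level for the vicinity sampled

# Wind directions whose samples exceed background indicate the upwind
# bearing of the land-based source.
source_bearings = wind_deg[conc > background]
```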

  9. Synthetic Training Data Generation for Activity Monitoring and Behavior Analysis

    NASA Astrophysics Data System (ADS)

    Monekosso, Dorothy; Remagnino, Paolo

    This paper describes a data generator that produces synthetic data to simulate observations from an array of environment monitoring sensors. The overall goal of our work is to monitor the well-being of one occupant in a home. Sensors are embedded in a smart home to unobtrusively record environmental parameters. Based on the sensor observations, behavior analysis and modeling are performed. However, behavior analysis and modeling require large data sets, collected over long periods of time, to achieve the expected level of accuracy. A data generator was developed based on initial data, i.e. data collected over periods lasting weeks, to facilitate concurrent data collection and algorithm development. The data generator is based on statistical inference techniques. Variation is introduced into the data using perturbation models.
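
    A perturbation model of this kind can be sketched as follows: activity statistics estimated from a few weeks of real observations are resampled with Gaussian noise to generate arbitrarily many synthetic days. The activities and parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical statistics estimated from a few weeks of real sensor
# observations: mean onset time (hours) of recurring daily activities
# and its observed spread.
activity_stats = {
    "wake":      {"onset": 7.0, "sd": 0.5},
    "breakfast": {"onset": 7.8, "sd": 0.6},
    "leave":     {"onset": 8.5, "sd": 0.4},
}

def synthetic_day(stats, rng):
    """One synthetic day: each activity onset is the observed mean
    perturbed by Gaussian noise (the perturbation model)."""
    return {name: rng.normal(p["onset"], p["sd"]) for name, p in stats.items()}

days = [synthetic_day(activity_stats, rng) for _ in range(1000)]
mean_wake = float(np.mean([d["wake"] for d in days]))
```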

  10. Text mining facilitates database curation - extraction of mutation-disease associations from Bio-medical literature.

    PubMed

    Ravikumar, Komandur Elayavilli; Wagholikar, Kavishwar B; Li, Dingcheng; Kocher, Jean-Pierre; Liu, Hongfang

    2015-06-06

    Advances in next-generation sequencing technology have accelerated the pace of individualized medicine (IM), which aims to incorporate genetic/genomic information into medicine. One immediate need in interpreting sequencing data is the assembly of information about genetic variants and their corresponding associations with other entities (e.g., diseases or medications). Even with dedicated effort to capture such information in biological databases, much of this information remains 'locked' in the unstructured text of biomedical publications. There is a substantial lag between publication and the subsequent abstraction of such information into databases. Multiple text mining systems have been developed, but most of them focus on sentence-level association extraction, with performance evaluation based on gold-standard text annotations specifically prepared for text mining systems. We developed and evaluated a text mining system, MutD, which extracts protein mutation-disease associations from MEDLINE abstracts by incorporating discourse-level analysis, using a benchmark data set extracted from curated database records. MutD achieves an F-measure of 64.3% for reconstructing protein mutation disease associations in curated database records. The discourse-level analysis component of MutD contributed a gain of more than 10% in F-measure when compared against sentence-level association extraction. Our error analysis indicates that 23 of the 64 precision errors are true associations that were not captured by database curators, and 68 of the 113 recall errors are caused by the absence of associated disease entities in the abstract. After adjusting for the defects in the curated database, the revised F-measure of MutD in association detection reaches 81.5%. Our quantitative analysis reveals that MutD can effectively extract protein mutation disease associations when benchmarking based on curated database records. 
The analysis also demonstrates that incorporating discourse level analysis significantly improved the performance of extracting the protein-mutation-disease association. Future work includes the extension of MutD for full text articles.
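
    The F-measure used throughout is the harmonic mean of precision and recall computed from true positives, false positives, and false negatives; the counts in the example are illustrative, not the study's:

```python
def f_measure(tp, fp, fn):
    """F1: harmonic mean of precision tp/(tp+fp) and recall tp/(tp+fn)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts only: moving false positives that curators missed
# into the true positives (and dropping false negatives whose disease
# never appears in the abstract) is the kind of adjustment that raises
# a reported F-measure, as in the revision from 64.3% to 81.5% above.
score = f_measure(tp=8, fp=2, fn=2)
```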

  11. Analyzing a 35-Year Hourly Data Record: Why So Difficult?

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    At the Goddard Distributed Active Archive Center, we have recently added a 35-year record of output data from the North American Land Data Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers users a variety of data summarization and visualization capabilities that operate at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data has proven surprisingly resistant to application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations, at both the algorithm and system level, with 35 years of hourly data. Failures arose, sometimes unexpectedly, from command line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These serve as an early warning sign for the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid such issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving user communities that try to scale up their current techniques to analyze Big Data.
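
    One standard mitigation for the memory overflows described above is streaming the record in bounded-size chunks rather than loading 35 years of hourly data at once; a minimal sketch on synthetic data (not NLDAS):

```python
import numpy as np

def chunked_mean(chunks):
    """Streaming mean over a long record processed in bounded-size
    chunks, so no more than one chunk is ever held in memory."""
    total, count = 0.0, 0
    for chunk in chunks:
        total += float(np.sum(chunk))
        count += chunk.size
    return total / count

# 35 years of synthetic hourly values, delivered one year at a time.
rng = np.random.default_rng(0)
years = (rng.normal(15.0, 8.0, 365 * 24) for _ in range(35))
m = chunked_mean(years)
```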

  12. A stratigraphic framework for abrupt climatic changes during the Last Glacial period based on three synchronized Greenland ice-core records: refining and extending the INTIMATE event stratigraphy

    NASA Astrophysics Data System (ADS)

    Rasmussen, Sune O.; Bigler, Matthias; Blockley, Simon P.; Blunier, Thomas; Buchardt, Susanne L.; Clausen, Henrik B.; Cvijanovic, Ivana; Dahl-Jensen, Dorthe; Johnsen, Sigfus J.; Fischer, Hubertus; Gkinis, Vasileios; Guillevic, Myriam; Hoek, Wim Z.; Lowe, J. John; Pedro, Joel B.; Popp, Trevor; Seierstad, Inger K.; Steffensen, Jørgen Peder; Svensson, Anders M.; Vallelonga, Paul; Vinther, Bo M.; Walker, Mike J. C.; Wheatley, Joe J.; Winstrup, Mai

    2014-12-01

    Due to their outstanding resolution and well-constrained chronologies, Greenland ice-core records provide a master record of past climatic changes throughout the Last Interglacial-Glacial cycle in the North Atlantic region. As part of the INTIMATE (INTegration of Ice-core, MArine and TErrestrial records) project, protocols have been proposed to ensure consistent and robust correlation between different records of past climate. A key element of these protocols has been the formal definition and ordinal numbering of the sequence of Greenland Stadials (GS) and Greenland Interstadials (GI) within the most recent glacial period. The GS and GI periods are the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. We present here a more detailed and extended GS/GI template for the whole of the Last Glacial period. It is based on a synchronization of the NGRIP, GRIP, and GISP2 ice-core records that allows the parallel analysis of all three records on a common time scale. The boundaries of the GS and GI periods are defined based on a combination of stable-oxygen isotope ratios of the ice (δ18O, reflecting mainly local temperature) and calcium ion concentrations (reflecting mainly atmospheric dust loading) measured in the ice. The data not only resolve the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice-core records more than two decades ago, but also better resolve a number of short-lived climatic oscillations, some defined here for the first time. Using this revised scheme, we propose a consistent approach for discriminating and naming all the significant abrupt climatic events of the Last Glacial period that are represented in the Greenland ice records. The final product constitutes an extended and better resolved Greenland stratotype sequence, against which other proxy records can be compared and correlated. 
It also provides a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations.

  13. Feasibility of an electronic stethoscope system for monitoring neonatal bowel sounds.

    PubMed

    Dumas, Jasmine; Hill, Krista M; Adrezin, Ronald S; Alba, Jorge; Curry, Raquel; Campagna, Eric; Fernandes, Cecilia; Lamba, Vineet; Eisenfeld, Leonard

    2013-09-01

    Bowel dysfunction remains a major problem in neonates. Traditional auscultation of bowel sounds as a diagnostic aid in neonatal gastrointestinal complications is limited by skill and by the inability to document and reassess. Consequently, we built a unique prototype to investigate the feasibility of an electronic monitoring system for continuous assessment of bowel sounds. We obtained Institutional Review Board approval for the investigational study to test our system. The system incorporated a prototype stethoscope head with a built-in microphone connected to a digital recorder. Recordings made over extended periods were evaluated for quality. We also considered the acoustic environment of the hospital where the stethoscope was used. The stethoscope head was attached to the abdomen with a hydrogel patch designed especially for this purpose. We used the system to obtain recordings from eight healthy, full-term babies. A scoring system was used to determine loudness, clarity, and ease of recognition, comparing it to the traditional stethoscope. The recording duration was initially two hours and was increased to a maximum of eight hours. Median duration of attachment was three hours (3.75, 2.68). Based on the scoring, the bowel sound recording was perceived to be as loud and clear in sound reproduction as a traditional stethoscope. We determined that room noise and other noises were significant forms of interference in the recordings, which at times prevented analysis. However, no sound quality drift was noted in the recordings and no patient discomfort was noted. Minimal erythema was observed over the fixation site, which subsided within one hour. We demonstrated the long-term recording of infant bowel sounds. Our contributions included a prototype stethoscope head, which was affixed using a specially designed hydrogel adhesive patch. Such a recording can be reviewed and reassessed, which is new technology and an improvement over current practice. 
The use of this system should also, theoretically, reduce risk of infection. Based on our research we concluded that while automatic assessment of bowel sounds is feasible over an extended period, there will be times when analysis is not possible. One limitation is noise interference. Our larger goals include producing a meaningful vital sign to characterize bowel sounds that can be produced in real-time, as well as providing automatic control for patient feeding pumps.

  14. Testing an automated method to estimate ground-water recharge from streamflow records

    USGS Publications Warehouse

    Rutledge, A.T.; Daniel, C.C.

    1994-01-01

    The computer program RORA allows automated analysis of streamflow hydrographs to estimate ground-water recharge. Output from the program, which is based on the recession-curve-displacement method (often referred to as the Rorabaugh method, for whom the program is named), was compared to estimates of recharge obtained from a manual analysis of 156 years of streamflow record from 15 streamflow-gaging stations in the eastern United States. Statistical tests showed that there was no significant difference between paired estimates of annual recharge by the two methods. Tests of results produced by the four workers who performed the manual method showed that results can differ significantly between workers. Twenty-two percent of the variation between manual and automated estimates could be attributed to having different workers perform the manual method. The program RORA will produce estimates of recharge equivalent to those produced manually, greatly increase the speed of analysis, and reduce the subjectivity inherent in manual analysis.
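
    The recession-curve-displacement calculation at the heart of RORA estimates recharge for each streamflow peak from the displacement between the pre-peak and post-peak recession curves, extrapolated to the critical time. A minimal sketch of that single-event formula (illustrative numbers; RORA itself automates peak detection, extrapolation, and unit conversion):

```python
def rorabaugh_recharge(q1, q2, k_days):
    """Recharge for one streamflow peak by recession-curve displacement:
    R = 2 * (Q2 - Q1) * K / 2.3026, where Q1 and Q2 are the pre- and
    post-peak recessions extrapolated to the critical time and K is the
    recession index (days per log cycle of flow decline).  The result
    is in flow-times-days units (e.g. cfs-days); dividing by drainage
    area gives a depth of recharge."""
    return 2.0 * (q2 - q1) * k_days / 2.3026

# Hypothetical event: extrapolated flows in cfs, K = 60 days/log cycle.
recharge = rorabaugh_recharge(q1=10.0, q2=16.0, k_days=60.0)
```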

  15. Initial commented checklist of Iranian mayflies, with new area records and description of Procloeon caspicum sp. n. (Insecta, Ephemeroptera, Baetidae)

    PubMed Central

    Bojková, Jindřiška; Sroka, Pavel; Soldán, Tomáš; Namin, Javid Imanpour; Staniczek, Arnold H.; Polášek, Marek; Hrivniak, Ľuboš; Abdoli, Ashgar; Godunko, Roman J.

    2018-01-01

    An initial checklist of mayflies (Ephemeroptera) of Iran is compiled based on a critical review of available literature data, complemented with new data from 38 localities of Gilan and Ardabil provinces. At present, altogether only 46 species and 25 genera are known from Iran; 18 species are reported as new to Iran in this study. Some previously published data are critically evaluated and doubtful taxa are excluded from the list. A basic analysis of the distribution and biogeography of recorded species is given. Procloeon (Pseudocentroptilum) caspicum Sroka, sp. n. is described based on a mature larva and egg. Critical differential diagnostic characters distinguishing the species from related taxa are discussed in detail. PMID:29674922

  16. PhotoMEA: an opto-electronic biosensor for monitoring in vitro neuronal network activity.

    PubMed

    Ghezzi, Diego; Pedrocchi, Alessandra; Menegon, Andrea; Mantero, Sara; Valtorta, Flavia; Ferrigno, Giancarlo

    2007-02-01

    PhotoMEA is a biosensor for the analysis of an in vitro neuronal network, fully based on optical methods. It works by stimulating neurons with caged glutamate and recording neuronal activity with voltage-sensitive fluorescent dyes (VSD). The main advantage is that it is possible to stimulate even at the sub-single-neuron level and to record the activity of the entire network in the culture at high resolution. A large-scale view of neuronal intercommunication offers a unique opportunity for testing the ability of drugs to affect neuronal properties as well as alterations in the behaviour of the entire network. The concept and a prototype for validation are described here in detail.

  17. Approximate classification of mining tremors harmfulness based on free-field and building foundation vibrations

    NASA Astrophysics Data System (ADS)

    Kuzniar, Krystyna; Stec, Krystyna; Tatara, Tadeusz

    2018-04-01

    The paper compares the results of an approximate evaluation of mining tremor harmfulness performed on the basis of free-field vibrations and simultaneously measured building foundation vibrations. The focus is on an office building located in the Upper Silesian Basin (USB). The empirical Mining Intensity Scale GSI-GZWKW-2012 has been applied to classify the harmfulness of the rockbursts. This scale is based on measurements of free-field vibrations but, for research purposes, it was also applied to building foundation vibrations. The analysis was carried out using a set of 156 ground-foundation pairs of velocity vibration records as well as a set of 156 pairs of acceleration records induced by the same mining tremors.

  18. Digital data detection and synchronization

    NASA Technical Reports Server (NTRS)

    Noack, T. L.; Morris, J. F.

    1973-01-01

    The primary accomplishments have been in the analysis and simulation of receivers and bit synchronizers. It has been discovered that tracking rate effects play a rather fundamental role in both receiver and synchronizer performance, but that data relating to recorder time-base error, needed for the proper characterization of this phenomenon, are in rather short supply. It is possible to obtain operationally useful tape recorder time-base-error data from high signal-to-noise ratio tapes using synchronizers with relatively wideband tracking loops. Low signal-to-noise ratio tapes examined in the same way would not be synchronizable. Additional areas of interest covered are receiver false lock, cycle slipping, and other unusual phenomena, which have been described to some extent in this and earlier reports and simulated during the study.

  19. Predictions of Experimentally Observed Stochastic Ground Vibrations Induced by Blasting

    PubMed Central

    Kostić, Srđan; Perc, Matjaž; Vasović, Nebojša; Trajković, Slobodan

    2013-01-01

    In the present paper, we investigate the blast induced ground motion recorded at the limestone quarry “Suva Vrela” near Kosjerić, which is located in the western part of Serbia. We examine the recorded signals by means of surrogate data methods and a determinism test, in order to determine whether the recorded ground velocity is stochastic or deterministic in nature. Longitudinal, transversal and the vertical ground motion component are analyzed at three monitoring points that are located at different distances from the blasting source. The analysis reveals that the recordings belong to a class of stationary linear stochastic processes with Gaussian inputs, which could be distorted by a monotonic, instantaneous, time-independent nonlinear function. Low determinism factors obtained with the determinism test further confirm the stochastic nature of the recordings. Guided by the outcome of time series analysis, we propose an improved prediction model for the peak particle velocity based on a neural network. We show that, while conventional predictors fail to provide acceptable prediction accuracy, the neural network model with four main blast parameters as input, namely total charge, maximum charge per delay, distance from the blasting source to the measuring point, and hole depth, delivers significantly more accurate predictions that may be applicable on site. We also perform a sensitivity analysis, which reveals that the distance from the blasting source has the strongest influence on the final value of the peak particle velocity. This is in full agreement with previous observations and theory, thus additionally validating our methodology and main conclusions. PMID:24358140
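
    The conventional predictors the neural network is benchmarked against are typically scaled-distance attenuation laws fitted in log-log space. A sketch of that baseline on synthetic data (the law's constants and the records are assumptions, not the quarry measurements):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic blast records following an assumed USBM-style scaled-distance
# law PPV = K * (D / sqrt(Q))**(-b) with lognormal scatter; K and b are
# illustrative, not values fitted at the quarry.
n = 400
q = rng.uniform(10.0, 100.0, n)      # max charge per delay (kg)
d = rng.uniform(50.0, 500.0, n)      # distance to measuring point (m)
ppv = 1140.0 * (d / np.sqrt(q)) ** -1.6 * rng.lognormal(0.0, 0.1, n)

# Conventional predictor: least-squares fit of
# log PPV = log K - b * log(D / sqrt(Q)).
scaled_distance = d / np.sqrt(q)
slope, intercept = np.polyfit(np.log(scaled_distance), np.log(ppv), 1)
b_hat, k_hat = -slope, float(np.exp(intercept))
```

    The neural network in the study replaces this two-parameter law with a four-input model, which is why it can capture effects (e.g. of total charge and hole depth) that the scaled-distance fit ignores.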

  20. Predictors of Sex Offender Treatment Completion.

    ERIC Educational Resources Information Center

    Moore, Donna L.; Bergman, Barbara A.; Knox, Pamela L.

    1999-01-01

    Reviews records of 126 incarcerated offenders who participated in a prison-based sex offender treatment program. Discriminant function analysis reveals that offenders who completed treatment were more often diagnosed with a substance disorder, had a history of nonviolent offenses, and were less often diagnosed as having an antisocial personality…

  1. 75 FR 42000 - Requirements for Fingerprint-Based Criminal History Records Checks for Individuals Seeking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-20

    ... Compatibility VIII. Plain Language IX. Voluntary Consensus Standards X. Finding of No Significant Environmental Impact: Availability XI. Paperwork Reduction Act Statement XII. Regulatory Analysis: Availability XIII... received seven comment letters from interested parties: Four from RTR licensees, one from the Nuclear...

  2. Recording and automated analysis of naturalistic bioptic driving.

    PubMed

    Luo, Gang; Peli, Eli

    2011-05-01

    People with moderate central vision loss are legally permitted to drive with a bioptic telescope in 39 US states and the Netherlands, but the safety of bioptic driving remains highly controversial. There is no scientific evidence about bioptic use and its impact on safety. We propose searching for evidence by recording naturalistic driving activities in patients' cars. In a pilot study we used an analogue video system to record two bioptic drivers' daily driving activities for 10 and 5 days, respectively. In this technical report, we also describe our novel digital system that collects vehicle manoeuvre information and enables recording over more extended periods, and discuss our approach to analyzing the vast amount of data. Our observations of telescope use by the pilot subjects were quite different from their reports in a previous survey. One subject used the telescope only seven times in nearly 6 h of driving. For the other subject, the average interval between telescope use was about 2 min, and mobile (cell) phone use in one trip extended the interval to almost 5 min. We demonstrate that computerized analysis of lengthy recordings based on video, GPS, acceleration, and black box data can be used to select informative segments for efficient off-line review of naturalistic driving behaviours. The inconsistency between self-reports and objective data, as well as the infrequent telescope use, underscores the importance of recording bioptic driving behaviours in naturalistic conditions over extended periods. We argue that the new recording system is important for understanding bioptic use behaviours and bioptic driving safety. © 2011 The College of Optometrists.
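
    The segment-selection step described above (flagging informative episodes in long recordings using, for example, acceleration data) can be sketched as a simple threshold-and-merge pass. The threshold and merge gap below are illustrative assumptions, not the authors' parameters:

```python
def flag_segments(accel, threshold=2.5, min_gap=5):
    """Return (start, end) sample-index pairs where |acceleration|
    exceeds a threshold, merging events closer than min_gap samples."""
    segments = []
    for i, a in enumerate(accel):
        if abs(a) <= threshold:
            continue
        if segments and i - segments[-1][1] <= min_gap:
            segments[-1][1] = i  # extend the current segment
        else:
            segments.append([i, i])  # start a new segment
    return [(s, e) for s, e in segments]

# A short synthetic trace with two hard-braking episodes
trace = [0.1, 0.2, 3.0, 3.1, 0.2, 0.1, 0.0, 2.8, 0.1]
segments = flag_segments(trace, min_gap=2)
```

    In practice such flags would be combined with video, GPS, and black box channels before an analyst reviews the selected clips off-line.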

  3. Recording and automated analysis of naturalistic bioptic driving

    PubMed Central

    Luo, Gang; Peli, Eli

    2011-01-01

    Purpose People with moderate central vision loss are legally permitted to drive with a bioptic telescope in 39 US states and the Netherlands, but the safety of bioptic driving remains highly controversial. There is no scientific evidence about bioptic use and its impact on safety. We propose searching for evidence by recording naturalistic driving activities in patients' cars. Methods In a pilot study we used an analogue video system to record two bioptic drivers' daily driving activities for 10 and 5 days, respectively. In this technical report, we also describe our novel digital system that collects vehicle maneuver information and enables recording over more extended periods, and discuss our approach to analyzing the vast amount of data. Results Our observations of telescope use by the pilot subjects were quite different from their reports in a previous survey. One subject used the telescope only 7 times in nearly 6 hours of driving. For the other subject, the average interval between telescope use was about 2 minutes, and cell phone use in one trip extended the interval to almost 5 minutes. We demonstrate that computerized analysis of lengthy recordings based on video, GPS, acceleration, and black box data can be used to select informative segments for efficient off-line review of naturalistic driving behaviors. Conclusions The inconsistency between self-reports and objective data, as well as infrequent telescope use, underscores the importance of recording bioptic driving behaviors in naturalistic conditions over extended periods. We argue that the new recording system is important for understanding bioptic use behaviors and bioptic driving safety. PMID:21410498

  4. Initial assessment of facial nerve paralysis based on motion analysis using an optical flow method.

    PubMed

    Samsudin, Wan Syahirah W; Sundaraj, Kenneth; Ahmad, Amirozi; Salleh, Hasriah

    2016-01-01

    An initial assessment method that can classify as well as categorize the severity of paralysis into one of six levels according to the House-Brackmann (HB) system based on facial landmarks motion using an Optical Flow (OF) algorithm is proposed. The desired landmarks were obtained from the video recordings of 5 normal and 3 Bell's Palsy subjects and tracked using the Kanade-Lucas-Tomasi (KLT) method. A new scoring system based on the motion analysis using area measurement is proposed. This scoring system uses the individual scores from the facial exercises and grades the paralysis based on the HB system. The proposed method has obtained promising results and may play a pivotal role towards improved rehabilitation programs for patients.
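
    A hypothetical sketch of an area-based asymmetry score derived from tracked landmark displacements follows. The function names and the index formula are illustrative assumptions, not the paper's actual area measurement or HB scoring system:

```python
def swept_area(displacements):
    """Sum of per-landmark displacement magnitudes over an exercise, a
    crude proxy for the area of facial motion on one side of the face."""
    return sum((dx ** 2 + dy ** 2) ** 0.5 for dx, dy in displacements)

def asymmetry_index(left, right):
    """0.0 for perfectly symmetric motion, approaching 1.0 as one side
    stops moving; larger values would map to higher HB grades."""
    a_left, a_right = swept_area(left), swept_area(right)
    total = a_left + a_right
    return abs(a_left - a_right) / total if total else 0.0

# (dx, dy) displacements for two landmarks per side during one exercise
healthy = asymmetry_index([(1, 0), (0, 1)], [(1, 0), (0, 1)])
palsy = asymmetry_index([(1, 0), (0, 1)], [(0.1, 0), (0, 0.1)])
```

    In the paper, the displacements themselves come from KLT tracking of landmarks detected in the video recordings.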

  5. Analyses Reveal Record-Shattering Global Warm Temperatures in 2015

    NASA Image and Video Library

    2017-12-08

    2015 was the warmest year since modern record-keeping began in 1880, according to a new analysis by NASA’s Goddard Institute for Space Studies. The record-breaking year continues a long-term warming trend — 15 of the 16 warmest years on record have now occurred since 2001. Credits: Scientific Visualization Studio/Goddard Space Flight Center Details: Earth’s 2015 surface temperatures were the warmest since modern record keeping began in 1880, according to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA). Globally-averaged temperatures in 2015 shattered the previous mark set in 2014 by 0.23 degrees Fahrenheit (0.13 Celsius). Only once before, in 1998, has the new record been greater than the old record by this much. The 2015 temperatures continue a long-term warming trend, according to analyses by scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York (GISTEMP). NOAA scientists agreed with the finding that 2015 was the warmest year on record based on separate, independent analyses of the data. Because weather station locations and measurements change over time, there is some uncertainty in the individual values in the GISTEMP index. Taking this into account, NASA analysis estimates 2015 was the warmest year with 94 percent certainty.

  6. 40 CFR 98.337 - Records that must be retained.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... analysis records for carbon content of zinc bearing materials, flux materials (e.g., limestone, dolomite...). (7) Sampling and analysis records for carbon content of electrode materials. (8) You must keep... records specified in paragraphs (b)(1) through (b)(7) of this section. (1) Records of all analyses and...

  7. Real-Time EEG Signal Enhancement Using Canonical Correlation Analysis and Gaussian Mixture Clustering

    PubMed Central

    Huang, Chih-Sheng; Yang, Wen-Yu; Chuang, Chun-Hsiang; Wang, Yu-Kai

    2018-01-01

    Electroencephalogram (EEG) signals are usually contaminated with various artifacts, such as signals associated with muscle activity, eye movement, and body motion, which have a noncerebral origin. The amplitude of such artifacts is larger than that of the electrical activity of the brain, so they mask the cortical signals of interest, resulting in biased analysis and interpretation. Several blind source separation methods have been developed to remove artifacts from EEG recordings. However, the iterative process for measuring separation within multichannel recordings is computationally intractable. Moreover, manually excluding the artifact components requires a time-consuming offline process. This work proposes a real-time artifact removal algorithm based on canonical correlation analysis (CCA), feature extraction, and the Gaussian mixture model (GMM) to improve the quality of EEG signals. The CCA was used to decompose EEG signals into components, followed by feature extraction to extract representative features and GMM clustering of these features into groups to recognize and remove artifacts. The feasibility of the proposed algorithm was demonstrated by effectively removing artifacts caused by blinks, head/body movement, and chewing from EEG recordings while preserving the temporal and spectral characteristics of the signals that are important to cognitive research. PMID:29599950
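
    A common criterion in CCA-based EEG cleaning is that muscle artifacts are broadband and weakly autocorrelated, while cortical rhythms are smoother. A minimal pure-Python sketch of that ranking measure follows; it is illustrative only, since the paper's pipeline adds feature extraction and GMM clustering on top:

```python
import math
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a component's time course. Low values
    flag likely muscle-artifact components; smooth cortical rhythms
    score high."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    if var == 0:
        return 0.0
    cov = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    return cov / var

random.seed(0)
rhythm = [math.sin(2 * math.pi * i / 50) for i in range(200)]  # smooth
artifact = [random.gauss(0, 1) for _ in range(200)]            # broadband
```

    Components ranked below some autocorrelation threshold would be zeroed out before reconstructing the cleaned channels.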

  8. Construction and evaluation of FiND, a fall risk prediction model of inpatients from nursing data.

    PubMed

    Yokota, Shinichiroh; Ohe, Kazuhiko

    2016-04-01

    To construct and evaluate an easy-to-use fall risk prediction model based on the daily condition of inpatients, using secondary-use electronic medical record system data. The present authors scrutinized electronic medical record system data and created a dataset for analysis by including inpatient fall report data and Intensity of Nursing Care Needs data. The authors divided the analysis dataset into training data and testing data, constructed the fall risk prediction model FiND from the training data, and tested the model using the testing data. The dataset for analysis contained 1,230,604 records from 46,241 patients. The sensitivity of the model constructed from the training data was 71.3% and the specificity was 66.0%. The verification result from the testing dataset was almost equivalent to the theoretical value. Although the model's accuracy did not surpass that of models developed in previous research, the authors believe FiND will be useful in medical institutions all over Japan because it is composed of few variables (only age, sex, and the Intensity of Nursing Care Needs items) and its accuracy on unseen data was verified. © 2016 Japan Academy of Nursing Science.
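
    The reported sensitivity (71.3%) and specificity (66.0%) follow the standard confusion-matrix definitions; a quick sketch with made-up counts chosen to reproduce those rates (not the study's actual confusion matrix):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN) among patients who fell;
    specificity = TN / (TN + FP) among patients who did not."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only: 1,000 fallers and 1,000 non-fallers
sens, spec = sensitivity_specificity(tp=713, fn=287, tn=660, fp=340)
```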

  9. Improving MEG source localizations: an automated method for complete artifact removal based on independent component analysis.

    PubMed

    Mantini, D; Franciotti, R; Romani, G L; Pizzella, V

    2008-03-01

    The major limitation for the acquisition of high-quality magnetoencephalography (MEG) recordings is the presence of disturbances of physiological and technical origin: eye movements, cardiac signals, muscular contractions, and environmental noise are serious problems for MEG signal analysis. In recent years, multi-channel MEG systems have undergone rapid technological development in terms of noise reduction, and many processing methods have been proposed for artifact rejection. Independent component analysis (ICA) has already been shown to be an effective and generally applicable technique for concurrently removing artifacts and noise from MEG recordings. However, no standardized automated system based on ICA has become available so far, because of the intrinsic difficulty of reliably categorizing the source signals obtained with this technique. In this work, approximate entropy (ApEn), a measure of data regularity, is successfully used for the classification of the signals produced by ICA, allowing for automated artifact rejection. The proposed method has been tested using MEG data sets collected during somatosensory, auditory and visual stimulation. It was demonstrated to be effective in attenuating both biological artifacts and environmental noise, reconstructing clear signals that can be used to improve brain source localizations.
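
    Approximate entropy itself is compact enough to sketch: low ApEn indicates a regular, predictable signal (such as a periodic cardiac artifact), high ApEn an irregular one. The parameter choices below (m = 2, r = 0.2 of the standard deviation) are common defaults, not necessarily the paper's:

```python
import math
import random

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r*std) of a 1-D signal."""
    n = len(x)
    mean = sum(x) / n
    std = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    tol = r * std

    def phi(m):
        # Count, for each length-m pattern, how many patterns stay
        # within tolerance in the Chebyshev (max-difference) sense.
        patterns = [x[i:i + m] for i in range(n - m + 1)]
        log_sum = 0.0
        for p in patterns:
            matches = sum(
                1 for q in patterns
                if max(abs(a - b) for a, b in zip(p, q)) <= tol
            )
            log_sum += math.log(matches / len(patterns))
        return log_sum / len(patterns)

    return phi(m) - phi(m + 1)

regular = [i % 2 for i in range(60)]  # strictly alternating signal
random.seed(1)
irregular = [random.random() for _ in range(60)]
```

    In the paper's pipeline, a score like this is computed per ICA component, and components below/above suitable thresholds are rejected automatically.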

  10. Evaluation of arterial propagation velocity based on the automated analysis of the Pulse Wave Shape

    NASA Astrophysics Data System (ADS)

    Clara, F. M.; Scandurra, A. G.; Meschino, G. J.; Passoni, L. I.

    2011-12-01

    This paper proposes the automatic estimation of the arterial propagation velocity from raw pulse wave records measured in the region of the radial artery. A fully automatic process is proposed to select and analyze typical pulse cycles from the raw data. An adaptive neuro-fuzzy inference system, together with a heuristic search, is used to find a functional approximation of the pulse wave. The estimation of the propagation velocity is carried out via the analysis of the functional approximation obtained with the fuzzy model. Analysis of the pulse wave records with the proposed methodology showed only small differences compared with the method used so far, which relies on strong user interaction. To evaluate the proposed methodology, we estimated the propagation velocity in a population of healthy men across a wide range of ages. These studies found that propagation velocity increases linearly with age and shows considerable dispersion among healthy individuals. We conclude that this process could be used to indirectly evaluate the propagation velocity of the aorta, which is related to physiological age in healthy individuals and to life expectancy in cardiovascular patients.

  11. High-speed digital phonoscopy images analyzed by Nyquist plots

    NASA Astrophysics Data System (ADS)

    Yan, Yuling

    2012-02-01

    Vocal-fold vibration is a key dynamic event in voice production, and the vibratory characteristics of the vocal fold correlate closely with voice quality and health condition. Laryngeal imaging provides a direct means to observe vocal fold vibration; in the past, however, available modalities were either too slow or impractical to resolve the actual vocal fold vibrations. This limitation has now been overcome by high-speed digital imaging (HSDI) (or high-speed digital phonoscopy), which records images of the vibrating vocal folds at a rate of 2000 frames per second or higher, fast enough to resolve a sustained phonatory vocal fold vibration. The subsequent image-based functional analysis of voice is essential to better understanding the mechanism underlying voice production, as well as assisting the clinical diagnosis of voice disorders. Our primary objective is to develop a comprehensive analytical platform for voice analysis using HSDI recordings. So far, we have developed various analytical approaches for HSDI-based voice analysis. These include Nyquist plots and associated analyses, which are used along with the FFT and spectrogram in the analysis of HSDI data representing normal voice and specific voice pathologies.

  12. Methods for peak-flow frequency analysis and reporting for streamgages in or near Montana based on data through water year 2015

    USGS Publications Warehouse

    Sando, Steven K.; McCarthy, Peter M.

    2018-05-10

    This report documents the methods for peak-flow frequency (hereinafter “frequency”) analysis and reporting for streamgages in and near Montana following implementation of the Bulletin 17C guidelines. The methods are used to provide estimates of peak-flow quantiles for 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for selected streamgages operated by the U.S. Geological Survey Wyoming-Montana Water Science Center (WY–MT WSC). These annual exceedance probabilities correspond to 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals, respectively.Standard procedures specific to the WY–MT WSC for implementing the Bulletin 17C guidelines include (1) the use of the Expected Moments Algorithm analysis for fitting the log-Pearson Type III distribution, incorporating historical information where applicable; (2) the use of weighted skew coefficients (based on weighting at-site station skew coefficients with generalized skew coefficients from the Bulletin 17B national skew map); and (3) the use of the Multiple Grubbs-Beck Test for identifying potentially influential low flows. For some streamgages, the peak-flow records are not well represented by the standard procedures and require user-specified adjustments informed by hydrologic judgement. The specific characteristics of peak-flow records addressed by the informed-user adjustments include (1) regulated peak-flow records, (2) atypical upper-tail peak-flow records, and (3) atypical lower-tail peak-flow records. In all cases, the informed-user adjustments use the Expected Moments Algorithm fit of the log-Pearson Type III distribution using the at-site station skew coefficient, a manual potentially influential low flow threshold, or both.Appropriate methods can be applied to at-site frequency estimates to provide improved representation of long-term hydroclimatic conditions. 
The methods for improving at-site frequency estimates by weighting with regional regression equations and by Maintenance of Variance Extension Type III record extension are described. Frequency analyses were conducted for 99 example streamgages to indicate various aspects of the frequency-analysis methods described in this report. The frequency analyses and results for the example streamgages are presented in a separate data release associated with this report consisting of tables and graphical plots that are structured to include information concerning the interpretive decisions involved in the frequency analyses. Further, the separate data release includes the input files to the PeakFQ program, version 7.1, including the peak-flow data file and the analysis specification file that were used in the peak-flow frequency analyses. Peak-flow frequencies are also reported in separate data releases for selected streamgages in the Beaverhead River and Clark Fork Basins and also for selected streamgages in the Ruby, Jefferson, and Madison River Basins.
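
    The correspondence between the annual exceedance probabilities and recurrence intervals quoted above is simply T = 1/p, which can be checked directly:

```python
def recurrence_interval(aep_percent):
    """Recurrence interval in years for an annual exceedance
    probability given in percent: T = 1 / (AEP / 100)."""
    return 100.0 / aep_percent

# The pairs listed in the report; 42.9 percent maps to the 2.33-year
# interval only after rounding.
pairs = {50: 2, 20: 5, 10: 10, 4: 25, 2: 50, 1: 100, 0.5: 200, 0.2: 500}
exact = all(abs(recurrence_interval(p) - t) < 1e-6 for p, t in pairs.items())
```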

  13. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform

    PubMed Central

    Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979

  14. Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.

    PubMed

    Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong

    2016-01-01

    We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.

  15. Techniques for Soundscape Retrieval and Synthesis

    NASA Astrophysics Data System (ADS)

    Mechtley, Brandon Michael

    The study of acoustic ecology is concerned with the manner in which life interacts with its environment as mediated through sound. As such, a central focus is that of the soundscape: the acoustic environment as perceived by a listener. This dissertation examines the application of several computational tools in the realms of digital signal processing, multimedia information retrieval, and computer music synthesis to the analysis of the soundscape. Namely, these tools include a) an open source software library, Sirens, which can be used to segment long environmental field recordings into individual sonic events and to compare these events in terms of acoustic content, b) a graph-based retrieval system that can use these measures of acoustic similarity, together with measures of semantic similarity from the lexical database WordNet, to perform both text-based retrieval and automatic annotation of environmental sounds, and c) new techniques for the dynamic, real-time parametric morphing of multiple field recordings, informed by the geographic paths along which they were recorded.

  16. Online Recorded Data-Based Composite Neural Control of Strict-Feedback Systems With Application to Hypersonic Flight Dynamics.

    PubMed

    Xu, Bin; Yang, Daipeng; Shi, Zhongke; Pan, Yongping; Chen, Badong; Sun, Fuchun

    2017-09-25

    This paper investigates the online recorded data-based composite neural control of uncertain strict-feedback systems within the backstepping framework. In each step of the virtual control design, a neural network (NN) is employed for uncertainty approximation. In previous works, most designs aim directly at system stability, ignoring how the NN actually works as an approximator. In this paper, to enhance the learning ability, a novel prediction error signal is constructed to provide additional correction information for the NN weight update using online recorded data. In this way, the neural approximation precision is greatly improved, and convergence can be faster. Furthermore, a sliding mode differentiator is employed to approximate the derivative of the virtual control signal, so the complex analysis of the backstepping design can be avoided. The closed-loop stability is rigorously established, and the boundedness of the tracking error is guaranteed. In simulations of hypersonic flight dynamics, the proposed approach exhibits better tracking performance.

  17. On-line Tool Wear Detection on DCMT070204 Carbide Tool Tip Based on Noise Cutting Audio Signal using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Prasetyo, T.; Amar, S.; Arendra, A.; Zam Zami, M. K.

    2018-01-01

    This study develops an on-line detection system to predict the wear of a DCMT070204 tool tip during the cutting process. The machine used in this research is a CNC ProTurn 9000 cutting ST42 steel cylinders. The audio signal was captured using a microphone placed on the tool post and recorded in Matlab at a sampling rate of 44.1 kHz with a frame size of 1024 samples. The recorded data comprise 110 samples derived from the audio signal while cutting with a normal tool and with a worn tool. Signal features were then extracted in the frequency domain using the Fast Fourier Transform, and features were selected based on correlation analysis. Tool wear classification was performed using an artificial neural network with the 33 selected input features, trained with the back-propagation method. Classification performance testing yielded an accuracy of 74%.
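
    The frequency-domain feature extraction step can be sketched with a naive DFT (a real implementation would use an FFT); the frame size and the single peak-bin feature below are illustrative, not the study's 33 selected features:

```python
import cmath
import math

def dft_magnitudes(frame):
    """Naive DFT magnitude spectrum of one audio frame. For a
    1024-sample frame at 44.1 kHz, bin k is centred at
    k * 44100 / 1024 Hz; only the first half of the spectrum is
    returned, since the input is real-valued."""
    n = len(frame)
    return [
        abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)))
        for k in range(n // 2)
    ]

# A 64-sample frame containing a single pure tone at bin 8
frame = [math.sin(2 * math.pi * 8 * t / 64) for t in range(64)]
mags = dft_magnitudes(frame)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
```

    Spectral features like these, computed per frame, would then be filtered by correlation analysis before feeding the classifier.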

  18. 1,500 year quantitative reconstruction of winter precipitation in the Pacific Northwest

    PubMed Central

    Steinman, Byron A.; Abbott, Mark B.; Mann, Michael E.; Stansell, Nathan D.; Finney, Bruce P.

    2012-01-01

    Multiple paleoclimate proxies are required for robust assessment of past hydroclimatic conditions. Currently, estimates of drought variability over the past several thousand years are based largely on tree-ring records. We produced a 1,500-y record of winter precipitation in the Pacific Northwest using a physical model-based analysis of lake sediment oxygen isotope data. Our results indicate that during the Medieval Climate Anomaly (MCA) (900–1300 AD) the Pacific Northwest experienced exceptional wetness in winter and that during the Little Ice Age (LIA) (1450–1850 AD) conditions were drier, contrasting with hydroclimatic anomalies in the desert Southwest and consistent with climate dynamics related to the El Niño Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO). These findings are somewhat discordant with drought records from tree rings, suggesting that differences in seasonal sensitivity between the two proxies allow a more complete understanding of the climate system and likely explain disparities in inferred climate trends over centennial timescales. PMID:22753510

  19. Large historical growth in global terrestrial gross primary production

    DOE PAGES

    Campbell, J. E.; Berry, J. A.; Seibt, U.; ...

    2017-04-05

    Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle–climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.

  20. Large historical growth in global terrestrial gross primary production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, J. E.; Berry, J. A.; Seibt, U.

    Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle–climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.

  1. iCHRCloud: Web & Mobile based Child Health Imprints for Smart Healthcare.

    PubMed

    Singh, Harpreet; Mallaiah, Raghuram; Yadav, Gautam; Verma, Nitin; Sawhney, Ashu; Brahmachari, Samir K

    2017-11-29

    Reducing child mortality through quality care is a foremost concern of all nations. Thus, in the current IT era, the healthcare industry needs to focus on adapting information technology to healthcare services. Barring a few preliminary attempts to digitalize basic hospital administrative and clinical functions, even today in India, child health and vaccination records are still maintained on paper. Also, errors in manually plotting parameters on growth charts result in missed opportunities for early detection of growth disorders in children. To address these concerns, we present India's first hospital-linked, affordable, automated vaccination and real-time child growth monitoring cloud-based application: Integrated Child Health Record cloud (iCHRcloud). The application is based on the HL7 protocol, enabling integration with a hospital's HIS/EMR system. It provides a Java (Enterprise Service Bus and Hibernate) based web portal for doctors and a mobile application for parents, enhancing doctor-parent engagement. It leverages Highcharts to automate chart preparation and provides data access via push notifications (GCM and APNS) to parents on iOS and Android mobile platforms. iCHRcloud was also recognized as one of the best innovative solutions in three nationwide challenges in India in 2016. iCHRcloud offers a seamless, secure (256-bit HTTPS) and sustainable solution to reduce child mortality. Detailed analysis of preliminary data from 16,490 child health records highlights the diversified needs of various demographic regions. Thus, a primary lesson is to implement better validation strategies to fulfill the customized requirements of the entire population. This paper presents a first glimpse of the data and the power of analytics in policy framing.

  2. An eConsent-based System Architecture Supporting Cooperation in Integrated Healthcare Networks.

    PubMed

    Bergmann, Joachim; Bott, Oliver J; Hoffmann, Ina; Pretschner, Dietrich P

    2005-01-01

    The economic need for efficient healthcare leads to cooperative shared care networks. A virtual electronic health record is required which integrates patient-related information but reflects the distributed infrastructure and restricts access to those health professionals involved in the care process. Our work aims at the specification and development of a system architecture that fulfills these requirements, to be used in concrete regional pilot studies. Methodical analysis and specification were performed in a healthcare network using the formal method and modelling tool MOSAIK-M. The complexity of the application field was reduced by focusing on the scenario of thyroid disease care, which still includes various forms of interdisciplinary cooperation. The result is an architecture for a secure distributed electronic health record for integrated care networks, specified in terms of a MOSAIK-M-based system model. The architecture proposes business processes, application services, and a sophisticated security concept, providing a platform for distributed, document-based, patient-centred, and secure cooperation. A corresponding system prototype has been developed for pilot studies, using advanced application server technologies. The architecture combines consolidated patient-centred document management with a decentralized system structure without the need for replication management. An eConsent-based approach assures that access to the distributed health record remains under the control of the patient. The proposed architecture replaces message-based communication approaches, because it implements a virtual health record providing complete and current information. Acceptance of the new communication services depends on compatibility with the clinical routine. Unique and cross-institutional identification of a patient is also a challenge, but will lose significance as common patient cards are established.

  3. Development and pilot study of an essential set of indicators for general surgery services.

    PubMed

    Soria-Aledo, Victor; Angel-Garcia, Daniel; Martinez-Nicolas, Ismael; Rebasa Cladera, Pere; Cabezali Sanchez, Roger; Pereira García, Luis Francisco

    2016-11-01

    At present there is a lack of appropriate quality measures for benchmarking in general surgery units of the Spanish National Health System. The aim of this study is to present the selection, development and pilot-testing of an initial set of surgical quality indicators for this purpose. A modified Delphi was performed with experts from the Spanish Surgeons Association in order to prioritize previously selected indicators. Then, a pilot study was carried out in a public hospital, encompassing a qualitative analysis of feasibility for the prioritized indicators and an additional qualitative and quantitative three-rater reliability assessment for the medical record-based indicators. Observed inter-rater agreement, prevalence-adjusted and bias-adjusted kappa, and non-adjusted kappa were computed, using a systematic random sample (n=30) for each of these indicators. Twelve out of 13 proposed indicators were feasible: 5 medical record-based indicators and 7 indicators based on administrative databases. Of the medical record-based indicators, 3 were reliable (observed agreement >95%, adjusted kappa index >0.6 or non-adjusted kappa index >0.6 for composites and their components) and 2 needed further refinement. Currently, the medical record-based indicators could be used for comparison purposes, whilst further research must be done for validation and risk-adjustment of outcome indicators from administrative databases. Compliance results for the adequacy of informed consent, diagnosis-to-treatment delay in colorectal cancer, and antibiotic prophylaxis show room for improvement in the pilot-tested hospital. Copyright © 2016 AEC. Published by Elsevier España, S.L.U. All rights reserved.
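
    The reliability criteria quoted (observed agreement > 95%, adjusted kappa > 0.6) can be computed from a 2x2 table of two raters' binary judgments. A sketch with illustrative counts follows, where PABAK denotes the prevalence-adjusted and bias-adjusted kappa:

```python
def agreement_stats(a, b, c, d):
    """For two raters' binary judgments of n records: a = both yes,
    b = rater 1 only, c = rater 2 only, d = both no. Returns observed
    agreement Po, Cohen's kappa, and PABAK, which for binary ratings
    reduces to 2 * Po - 1."""
    n = a + b + c + d
    po = (a + d) / n
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    kappa = (po - pe) / (1 - pe) if pe < 1 else 1.0
    pabak = 2 * po - 1
    return po, kappa, pabak

# Illustrative 30-record sample (matching the study's n per indicator),
# not the study's actual ratings.
po, kappa, pabak = agreement_stats(a=14, b=1, c=1, d=14)
```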

  4. An application of HOMER and ACMANT for homogenising monthly precipitation records in Ireland

    NASA Astrophysics Data System (ADS)

    Coll, John; Curley, Mary; Domonkos, Peter; Aguilar, Enric; Walsh, Seamus; Sweeney, John

    2015-04-01

    Climate change studies based only on raw long-term data are potentially flawed due to the many breaks introduced from non-climatic sources. Accurate climate data are therefore an essential prerequisite for climate-related decision making, and quality-controlled, homogenised climate data are becoming integral to European Union Member State efforts to deliver climate services. Ireland has a good repository of monthly precipitation data at approximately 1900 locations stored in the Met Éireann database. The record length at individual precipitation stations varies greatly. However, an audit of the data established the continuous record length at each station and the number of missing months, and on this basis two initial subsets of station series (n = 88 and n = 110) were identified for preliminary homogenisation efforts. The HOMER joint detection algorithm was applied to the combined network of these 198 longer station series on an Ireland-wide basis, where contiguous intact monthly records ranged from ~40 to 71 years (1941-2010). In the country-wide analysis HOMER detected 91 breaks in total, distributed across 63 (~32%) of the series analysed. In a separate approach, four sub-series clusters (n = 38-61) for the 1950-2010 period were used in a parallel analysis applying both ACMANT and HOMER to a regionalised split of the 198 series. By comparison, ACMANT detected a considerably higher number of breaks across the four regional series clusters: 238, distributed across 123 (~62%) of the series analysed. These preliminary results indicate a relatively high proportion of detected breaks in the series, a situation not generally reflected in observed later 20th century precipitation records across Europe (Domonkos, 2014). 
However, this elevated ratio of series with detected breaks (~32% in HOMER and ~62% in ACMANT) parallels the break detection rate in a recent analysis of series in the Netherlands (Buishand et al., 2013). In the case of Ireland, the climate is even more markedly maritime than that of the Netherlands and the spatial correlations between the Irish series are high (>0.8). It is therefore likely that both HOMER and ACMANT are detecting relatively small breaks in the series; e.g. the overall range of correction amplitudes derived by HOMER was small, and corrections were only applied to sections of the corrected series. As Ireland has a relatively dense network of highly correlated station series, we anticipate continued high detection rates as the analysis is extended to incorporate a greater number of station series, and expect the ongoing work to quantify the extent of any breaks in Ireland's monthly precipitation series. KEY WORDS: Ireland, precipitation, time series, homogenisation, HOMER, ACMANT. References: Buishand, T.A., DeMartino, G., Spreeuw, J.N., Brandsma, T. (2013). Homogeneity of precipitation series in the Netherlands and their trends in the past century. International Journal of Climatology, 33:815-833. Domonkos, P. (2014). Homogenisation of precipitation time series with ACMANT. Theoretical and Applied Climatology, 118:1-2. DOI 10.1007/s00704-014-1298-5.
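
    Neither HOMER's joint detection nor the ACMANT algorithm is reproduced here, but the relative-homogenisation idea both rely on can be sketched: difference a candidate series against a highly correlated reference and scan for a shift in the mean with an SNHT-like two-phase statistic. The function names, noise levels, and the detection threshold below are illustrative assumptions only.

```python
import random
from statistics import mean, pstdev

def detect_break(candidate, reference):
    """Return the index of the most likely break point in the
    candidate-minus-reference difference series, or None.
    A simplified two-phase test, not the HOMER/ACMANT algorithms."""
    diff = [c - r for c, r in zip(candidate, reference)]
    sd = pstdev(diff)
    if sd == 0:
        return None
    n = len(diff)
    best_t, best_stat = None, 0.0
    for t in range(5, n - 5):              # require 5 values on each side
        z1 = (mean(diff[:t]) - mean(diff)) / sd
        z2 = (mean(diff[t:]) - mean(diff)) / sd
        stat = t * z1 * z1 + (n - t) * z2 * z2   # SNHT-like statistic
        if stat > best_stat:
            best_t, best_stat = t, stat
    return best_t if best_stat > 20.0 else None  # ad hoc threshold

random.seed(1)
ref = [100 + random.gauss(0, 5) for _ in range(60)]
# candidate shares the reference climate signal but gains a +8 mm shift at index 30
cand = [r + random.gauss(0, 2) + (8 if i >= 30 else 0) for i, r in enumerate(ref)]
print(detect_break(cand, ref))
```

With the simulated +8 mm shift at index 30, the scan flags a break at or very near that index; real homogenisation additionally handles multiple breaks, seasonality, and the computation of corrections.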

  5. Data cleaning and management protocols for linked perinatal research data: a good practice example from the Smoking MUMS (Maternal Use of Medications and Safety) Study.

    PubMed

    Tran, Duong Thuy; Havard, Alys; Jorm, Louisa R

    2017-07-11

    Data cleaning is an important quality assurance step in data linkage research. This paper presents the data cleaning and preparation process for a large-scale cross-jurisdictional Australian study (the Smoking MUMS Study) evaluating the utilisation and safety of smoking cessation pharmacotherapies during pregnancy. Perinatal records for all deliveries (2003-2012) in the States of New South Wales (NSW) and Western Australia were linked by State-based data linkage units to State-based data collections including hospital separation, emergency department and death data (mothers and babies) and congenital defect notifications (babies in NSW). A national data linkage unit linked pharmaceutical dispensing data for the mothers. All linkages were probabilistic. Twenty-two steps assessed the uniqueness of records and the consistency of items within and across data sources, resolved discrepancies in the linkages between units, and identified women with records in both States. State-based linkages yielded a cohort of 783,471 mothers and 1,232,440 babies. Likely false positive links relating to 3703 mothers were identified. Corrections to the baby's date of birth and age, and to parity, were made for 43,578 records, while 1996 records were flagged as duplicates. Checks for the uniqueness of the matches between State and national linkages detected 3404 ID clusters suggestive of missed links in the State linkages, and identified 1986 women who had records in both States. Analysis of content data can identify inaccurate links that cannot be detected by data linkage units that have access to personal identifiers only. Perinatal researchers are encouraged to adopt the methods presented to ensure quality and consistency among studies using linked administrative data.
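
    The flavour of these consistency checks can be sketched in a few lines; the field names (`ppn`, `baby_dob`, `source`) and the two checks shown are hypothetical simplifications for illustration, not the study's actual 22-step protocol:

```python
from datetime import date

# Hypothetical simplified linked perinatal records keyed by a person ID ("ppn").
records = [
    {"ppn": 1, "baby_dob": date(2010, 3, 1), "source": "perinatal"},
    {"ppn": 1, "baby_dob": date(2010, 3, 1), "source": "perinatal"},  # exact duplicate
    {"ppn": 2, "baby_dob": date(2011, 5, 9), "source": "perinatal"},
    {"ppn": 2, "baby_dob": date(2011, 6, 9), "source": "hospital"},   # DOB disagrees across sources
]

def flag_duplicates(recs):
    """Flag records identical on all fields, keeping the first occurrence."""
    seen, dups = set(), []
    for i, r in enumerate(recs):
        key = tuple(sorted(r.items()))
        if key in seen:
            dups.append(i)
        seen.add(key)
    return dups

def dob_conflicts(recs):
    """Return IDs whose linked records disagree on the baby's date of birth."""
    by_id = {}
    for r in recs:
        by_id.setdefault(r["ppn"], set()).add(r["baby_dob"])
    return sorted(pid for pid, dobs in by_id.items() if len(dobs) > 1)

print(flag_duplicates(records))  # -> [1]
print(dob_conflicts(records))    # -> [2]
```

In the study itself such content checks operate across millions of records and feed back into the resolution of likely false positive links.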

  6. Seasonal-Scale Dating of a Shallow Ice Core From Greenland Using Oxygen Isotope Matching Between Data and Simulation

    NASA Astrophysics Data System (ADS)

    Furukawa, Ryoto; Uemura, Ryu; Fujita, Koji; Sjolte, Jesper; Yoshimura, Kei; Matoba, Sumito; Iizuka, Yoshinori

    2017-10-01

    A precise age scale based on annual layer counting is essential for investigating past environmental changes from ice core records. However, subannual scale dating is hampered by the irregular intraannual variabilities of oxygen isotope (δ18O) records. Here we propose a dating method based on matching the δ18O variations between ice core records and records simulated by isotope-enabled climate models. We applied this method to a new δ18O record from an ice core obtained from a dome site in southeast Greenland. The close similarity between the δ18O records from the ice core and models enables correlation and the production of a precise age scale, with an accuracy of a few months. A missing δ18O minimum in the 1995/1996 winter is an example of an indistinct δ18O seasonal cycle. Our analysis suggests that the missing δ18O minimum is likely caused by a combination of warm air temperature, weak moisture transport, and cool ocean temperature. Based on the age scale, the average accumulation rate from 1960 to 2014 is reconstructed as 1.02 m yr-1 in water equivalent. The annual accumulation rate shows an increasing trend with a slope of 3.6 mm yr-1, which is mainly caused by the increase in the autumn accumulation rate of 2.6 mm yr-1. This increase is likely linked to the enhanced hydrological cycle caused by the decrease in Arctic sea ice area. Unlike the strong seasonality of precipitation amount in the ERA reanalysis data in the southeast dome region, our reconstructed accumulation rate suggests a weak seasonality.
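
    The core of such data-model matching is finding the offset that best aligns the measured and simulated δ18O series. As a toy sketch (not the authors' procedure), a lag search maximising the Pearson correlation between an idealised monthly cycle and a shifted copy recovers the imposed two-month offset:

```python
from math import pi, sin, sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def best_lag(core, simulated, max_lag=6):
    """Lag (in samples) maximising the correlation between the ice-core
    series and the model-simulated series."""
    n = len(core)
    scores = {}
    for k in range(-max_lag, max_lag + 1):
        pairs = [(core[i], simulated[i + k]) for i in range(n) if 0 <= i + k < n]
        xs, ys = zip(*pairs)
        scores[k] = pearson(xs, ys)
    return max(scores, key=scores.get)

sim = [sin(2 * pi * i / 12) for i in range(60)]                # idealised monthly d18O cycle
core = [sim[i - 2] if i >= 2 else sim[0] for i in range(60)]   # "core" lags the model by 2 months
print(best_lag(core, sim))  # -> -2
```

The actual dating problem is harder: the match is piecewise (depth must be mapped to time), and indistinct seasonal cycles such as the missing 1995/1996 winter minimum require the kind of meteorological interpretation described above.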

  7. Recording of electrohysterogram laplacian potential.

    PubMed

    Alberola-Rubio, J; Garcia-Casado, J; Ye-Lin, Y; Prats-Boluda, G; Perales, A

    2011-01-01

    Preterm birth is the main cause of neonatal morbidity. Noninvasive recording of uterine myoelectrical activity (the electrohysterogram, EHG) could be an alternative to the monitoring of uterine dynamics, which is currently based on tocodynamometers (TOCO). The analysis of uterine electromyogram characteristics could aid the early diagnosis of preterm birth. Laplacian recordings of other bioelectrical signals have been shown to enhance spatial selectivity and to reduce interference in comparison to monopolar and bipolar surface recordings. The main objective of this paper is to test the feasibility of noninvasive recording of uterine myoelectrical activity by means of Laplacian techniques. Four bipolar EHG signals, a discrete Laplacian obtained from five monopolar electrodes, and the signals picked up by two active concentric-ringed electrodes were recorded in 5 women with spontaneous or induced labor. Intrauterine pressure (IUP) and TOCO were also recorded simultaneously. To evaluate the uterine contraction detectability of the different noninvasive methods in comparison to IUP, the contractions consistency index (CCI) was calculated. Results show that TOCO is less consistent (83%) than most EHG bipolar recording channels (91%, 83%, 87%, and 76%) in detecting the uterine contractions identified in IUP. Moreover, the Laplacian EHG signals picked up by ringed electrodes proved to be as consistent (91%) as the best bipolar recordings, while also significantly reducing ECG interference.
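
    A discrete Laplacian from five monopolar electrodes is conventionally estimated with a five-point finite difference: the sum of the four surrounding potentials minus four times the centre potential, scaled by the electrode spacing. The following is a textbook sketch of that estimate, not the authors' exact montage or scaling:

```python
def discrete_laplacian(center, surround, spacing=1.0):
    """Five-point finite-difference estimate of the surface Laplacian:
    (sum of the 4 surrounding potentials - 4 * centre potential) / spacing^2."""
    assert len(surround) == 4
    return (sum(surround) - 4.0 * center) / spacing ** 2

# A sample where the centre electrode sees a local peak in potential:
print(discrete_laplacian(5.0, [1.0, 1.0, 1.0, 1.0]))  # -> -16.0
```

Because the estimate differences out potentials common to all five sites, spatially widespread sources such as the maternal ECG are attenuated relative to local uterine activity, which is the rationale for the Laplacian montages tested in the paper.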

  8. Nurses' Experiences of an Initial and Reimplemented Electronic Health Record Use.

    PubMed

    Chang, Chi-Ping; Lee, Ting-Ting; Liu, Chia-Hui; Mills, Mary Etta

    2016-04-01

    The electronic health record is a key component of healthcare information systems. Numerous hospitals have adopted electronic health records to replace paper-based records, to document care processes and to improve care quality. Integrating a healthcare information system into traditional daily nursing operations requires time and effort for nurses to become familiar with the new technology. In the stages of electronic health record implementation, smooth adoption can streamline clinical nursing activities. To explore the adoption process, a descriptive qualitative study design with focus group interviews was conducted 3 months after and 2 years after electronic health record system implementation (the system was aborted for 1 year in between) in one hospital located in southern Taiwan. Content analysis was performed on the interview data, and six main themes were derived. In the first stage: (1) liability, work stress, and anticipation of the electronic health record; (2) slow network speed and user-unfriendly design hindering the learning process; (3) insufficient information technology/organization support. In the second stage: (4) getting used to the electronic health record and further system requirements; (5) benefits of the electronic health record in time saving and documentation; (6) unrealistic expectations of information technology competence and future use. It was concluded that user-friendly design, together with information technology support and manpower backup, would facilitate the adoption process.

  9. KINEMATIC VARIABLES AND BLOOD ACID-BASE STATUS IN THE ANALYSIS OF COLLEGIATE SWIMMERS’ ANAEROBIC CAPACITY

    PubMed Central

    Bielec, G.; Makar, P.; Laskowski, R.

    2013-01-01

    Short-duration repeated maximal efforts are often used in swimming training to improve lactate tolerance, which gives swimmers the ability to maintain a high work rate for a longer period of time. The aim of the study was to examine the kinematics of swimming and their relation to changes in blood acid-base status and potassium level. Seven collegiate swimmers, with at least 6 years of training experience, volunteered to participate in the study. The test consisted of 8 x 25 m front crawl performed with maximum effort. The rest period between repetitions was set to five seconds. Blood samples were taken from the fingertip at rest, after warm-up and in the 3rd minute after completion of the test. The swimming was video recorded for later analysis of time, velocity and technique (stroke index). Based on the swimming velocity results, the obtained curve can be divided into a phase of rapid decrease in velocity and a phase of relatively stable velocities. The breaking point in swimming velocity across repetitions was taken as the swimming velocity threshold, and it was highly correlated with the decrease in blood acid-base status (pH r=0.82, BE r=0.87, HCO3- r=0.76; p<0.05 in all cases). There was no correlation between stroke index or fatigue index and blood acid-base status. Analysis of swimming speed in the 8 x 25 m test appears helpful in evaluating lactate tolerance (anaerobic capacity) in collegiate swimmers. PMID:24744491

  10. A systematic intercomparison of regional flood frequency analysis models in a simulation framework

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Laio, Francesco; Claps, Pierluigi

    2015-04-01

    Regional frequency analysis (RFA) is a well-established methodology for estimating the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting for limited temporal information at a site (no data or a short time series) by exploiting observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of the methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework in which to carry out the intercomparison: we synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario, where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. The regional approaches considered include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis, which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model, which defines a homogeneous subregion for each site; (iv) a spatially-smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. 
A comprehensive analysis of the simulation results shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches have comparable performances. Moreover, as expected, regional estimates are much more reliable than the at-site estimates. If the scenario is heterogeneous, the performances of the regional models depend on the pattern of heterogeneity; in general, however, the spatially-smooth regional approach performs better than the others, and its performance improves with increasing record length. For heterogeneous scenarios, the at-site estimates appear comparatively more efficient than in the homogeneous case, and in general less biased than the regional estimates.
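
    The simulation framework can be illustrated with a minimal sketch of its homogeneous scenario (pure Python, illustrative GEV parameters, not the study's code): sample station records from a common GEV parent, pool the records scaled by their at-site means, and compare the resulting empirical 200-year growth factor with the exact one, which is known in the virtual world:

```python
import math
import random

def gev_sample(mu, sigma, xi, rng):
    """Inverse-transform sample from a GEV distribution (xi != 0)."""
    u = rng.random()
    return mu + sigma / xi * ((-math.log(u)) ** (-xi) - 1.0)

def gev_quantile(mu, sigma, xi, T):
    """T-year return level of a GEV distribution (xi != 0)."""
    return mu + sigma / xi * ((-math.log(1.0 - 1.0 / T)) ** (-xi) - 1.0)

def gev_mean(mu, sigma, xi):
    """Mean of a GEV distribution (0 < xi < 1), used here as the index flood."""
    return mu + sigma * (math.gamma(1.0 - xi) - 1.0) / xi

def empirical_quantile(values, p):
    s = sorted(values)
    return s[min(len(s) - 1, int(p * len(s)))]

rng = random.Random(42)
mu, sigma, xi, T = 100.0, 30.0, 0.1, 200

# Exact growth factor: 200-year quantile over the index flood (the parent mean).
true_gf = gev_quantile(mu, sigma, xi, T) / gev_mean(mu, sigma, xi)

# Homogeneous scenario: 50 virtual stations, 40 years each, same parent.
pooled = []
for _ in range(50):
    record = [gev_sample(mu, sigma, xi, rng) for _ in range(40)]
    site_mean = sum(record) / len(record)          # at-site index flood
    pooled.extend(x / site_mean for x in record)   # dimensionless pooled sample

regional_gf = empirical_quantile(pooled, 1.0 - 1.0 / T)
print(round(true_gf, 2), round(regional_gf, 2))
```

A single 40-year record cannot resolve the 0.995 quantile at all, which is the sense in which regional pooling substitutes space for time; the study's heterogeneous scenarios instead vary the parent parameters across stations before applying each regional model.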

  11. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    PubMed

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fitting for the Automatic IDentification of functional Events (AIDE) directly from real-world fNIRS neuroimaging data. To investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from the simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions respectively (condition 1, social: interact with a person; condition 2, non-social: interact with an object). AIDE managed to recover 3/4 and 3/6 events for conditions 1 and 2 respectively. The identified functional events were then matched to behavioural data from video recordings of the participant's movements and actions. 
Our results suggest that "brain-first" rather than "behaviour-first" analysis is possible, and that the present method can provide a novel solution for analysing real-world fNIRS data, filling the gap between real-life testing and functional neuroimaging. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
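
    The GLM scan at the core of AIDE can be reduced to a toy sketch: slide a boxcar regressor across candidate onsets, fit each candidate by ordinary least squares, and keep the onset leaving the smallest residual. Haemodynamic response convolution and the real algorithm's refinements are deliberately omitted; all names and parameters are illustrative:

```python
import random

def fit_rss(y, x):
    """Ordinary least-squares fit y ~ b*x + c; return the residual sum of squares."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    b = sum((a - mx) * (v - my) for a, v in zip(x, y)) / sxx if sxx else 0.0
    c = my - b * mx
    return sum((v - (b * a + c)) ** 2 for a, v in zip(x, y))

def find_onset(signal, duration):
    """Scan candidate onsets for a boxcar regressor of fixed duration and
    return the onset whose GLM fit leaves the smallest residual."""
    n = len(signal)
    return min(
        range(0, n - duration),
        key=lambda t: fit_rss(signal,
                              [1.0 if t <= i < t + duration else 0.0 for i in range(n)]),
    )

rng = random.Random(0)
# Synthetic "haemodynamic" trace: noise plus a response starting at sample 40.
sig = [rng.gauss(0, 0.3) + (1.0 if 40 <= i < 60 else 0.0) for i in range(120)]
print(find_onset(sig, 20))
```

With this signal-to-noise ratio the scan lands at or very near the true onset of 40; AIDE extends the same least-squares idea to multiple events, realistic HRF-shaped regressors, and multichannel data.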

  12. Radar mechanocardiography: a novel analysis of the mechanical behavior of the heart.

    PubMed

    Tavakolian, Kouhyar; Zadeh, Faranak M; Chuo, Yindar; Siu, Tiffany; Vaseghi, Ali; Kaminska, Bozena

    2008-01-01

    This paper presents a novel system for detecting the mechanical movement of the heart, mechanocardiography (MCG), with no connection to the subject's body. The signal is acquired using radar technology and is highly correlated with the acceleration-based ballistocardiogram (BCG) recorded directly from the sternum. It is shown that heart and breathing rates can be reliably detected using this system.

  13. Utilizing IHE-based Electronic Health Record systems for secondary use.

    PubMed

    Holzer, K; Gall, W

    2011-01-01

    Due to the increasing adoption of Electronic Health Records (EHRs) for primary use, the number of electronic documents stored in such systems will soar in the near future. In order to benefit from this development in secondary fields such as medical research, it is important to define requirements for the secondary use of EHR data. Furthermore, analysing the extent to which an IHE (Integrating the Healthcare Enterprise)-based architecture would fulfill these requirements could provide further information on upcoming obstacles to the secondary use of EHRs. A catalog of eight core requirements for secondary use of EHR data was derived from the published literature, a risk analysis of the IHE profile MPQ (Multi-Patient Queries), and an analysis of relevant questions. The IHE-based architecture for cross-domain, patient-centered document sharing was extended to a cross-patient architecture. We propose an IHE-based architecture for cross-patient and cross-domain secondary use of EHR data. Evaluation of this architecture against the eight core requirements revealed full fulfillment of six requirements and partial fulfillment of two. Although not regarded as a primary goal in modern electronic healthcare, the re-use of existing electronic medical documents in EHRs for research and other fields of secondary application holds enormous potential for the future. Further research in this respect is necessary.

  14. Record of hospitalizations for ambulatory care sensitive conditions: validation of the hospital information system.

    PubMed

    Rehem, Tania Cristina Morais Santa Barbara; de Oliveira, Maria Regina Fernandes; Ciosak, Suely Itsuko; Egry, Emiko Yoshikawa

    2013-01-01

    To estimate the sensitivity, specificity, and positive and negative predictive values of the Unified Health System's Hospital Information System for the appropriate recording of hospitalizations for ambulatory care-sensitive conditions. Hospital information system records for conditions which are sensitive to ambulatory care, and for those which are not, were considered for analysis, taking the medical records as the gold standard. Through simple random sampling, a sample of 816 medical records was defined and selected by means of a list of random numbers generated using the Statistical Package for the Social Sciences. Sensitivity was 81.89%, specificity was 95.19%, the positive predictive value was 77.61% and the negative predictive value was 96.27%. In the study setting, the Hospital Information System (SIH) was more specific than sensitive, with nearly 20% of care-sensitive conditions not detected. There are no previous validation studies in Brazil of the Hospital Information System records for hospitalizations sensitive to primary health care. These results are relevant when one considers that this system is one of the bases for assessing the effectiveness of primary health care.
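
    The four reported measures follow directly from a 2x2 validation table against the gold standard. The counts below are hypothetical (the abstract reports only the rates and the sample size of 816), chosen so the resulting rates roughly match those reported:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table
    (here the gold standard is the medical record)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts summing to the reported sample of 816 records:
m = diagnostic_metrics(tp=113, fp=33, fn=25, tn=645)
print({k: round(v, 3) for k, v in m.items()})
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of care-sensitive conditions in the sample, so they would shift in settings with a different case mix.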

  15. Gait functional assessment: Spatio-temporal analysis and classification of barefoot plantar pressure in a group of 11-12-year-old children.

    PubMed

    Latour, Ewa; Latour, Marek; Arlet, Jarosław; Adach, Zdzisław; Bohatyrewicz, Andrzej

    2011-07-01

    Analysis of pedobarographic data requires geometric identification of specific anatomical areas extracted from recorded plantar pressures. This approach has led to ambiguity in measurements that may underlie the inconsistency of conclusions reported in pedobarographic studies. The goal of this study was to design a new analysis method, less susceptible to the projection accuracy of anthropometric points and distance estimation, based on rarely used spatio-temporal indices. Six pedobarographic records per person (three per foot) from a group of 60 children aged 11-12 years were obtained and analyzed. The basis of the analysis was the mutual relationship between two spatio-temporal indices created by the excursion of the peak-pressure point and the center-of-pressure point on the dynamic pedobarogram. A classification of weight-shift patterns was elaborated and performed, and their frequencies of occurrence were assessed. This new method allows an assessment of body weight shift across the plantar pressure surface based on distribution analysis of spatio-temporal indices not affected by the shape of this surface. Analysis of the distribution of the created index confirmed the existence of typical ways of shifting weight across the plantar surface of the foot during gait, as well as large intrasubject variability in their occurrence. This method may serve as a basis for the interpretation of foot functional features and may extend the clinical usefulness of pedobarography. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Construction patterns of birds' nests provide insight into nest-building behaviours.

    PubMed

    Biddle, Lucia; Goodman, Adrian M; Deeming, D Charles

    2017-01-01

    Previous studies have suggested that birds and mammals select materials needed for nest building based on their thermal or structural properties, although the amounts or properties of the materials used have been recorded for only a very small number of species. Some of the behaviours underlying the construction of nests can be indirectly determined by careful deconstruction of the structure and measurement of the biomechanical properties of the materials used. Here we examined this idea in an investigation of Bullfinch (Pyrrhula pyrrhula) nests as a model for open-nesting songbird species that construct a "twig" nest, and tested the hypothesis that materials in different parts of nests serve different functions. The quantities of materials present in the nest base, sides and cup were recorded before structural analysis. Structural analysis showed that the bases of the outer nests were composed of significantly thicker, stronger and more rigid materials than the side walls, which in turn were significantly thicker, stronger and more rigid than the materials used in the cup. These results suggest that the placement of particular materials in nests may not be random, but further work is required to determine whether the final structure of a nest accurately reflects the construction process.

  17. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    PubMed

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

    The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level, and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system (one of the components of an EMR system at a tertiary teaching hospital in Korea) using an object-oriented system analysis and design approach. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). The Rational Unified Process (RUP) and the Unified Modeling Language were used as the development process and the modeling notation, respectively. Following the scenario and RUP approach, user requirements were formulated into use case sets, and the sequence of activities in each scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.

  18. The Soil Moisture Dependence of TRMM Microwave Imager Rainfall Estimates

    NASA Astrophysics Data System (ADS)

    Seyyedi, H.; Anagnostou, E. N.

    2011-12-01

    This study presents an in-depth analysis of the dependence of overland rainfall estimates from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) on soil moisture conditions at the land surface. TMI retrievals are verified against rainfall fields derived from a high-resolution rain-gauge network (MESONET) covering Oklahoma. Soil moisture (SOM) patterns are extracted from recorded data for 2000-2007 with 30-minute temporal resolution. The area is divided into wet and dry regions based on normalized SOM (Nsom) values. A statistical comparison between the two groups is conducted based on recorded ground station measurements and the corresponding passive microwave retrievals from TMI overpasses at the respective MESONET station locations and times. The zero-order error statistics show that the Probability of Detection (POD) is higher for the wet regions (higher Nsom values) than for the dry regions. The False Alarm Ratio (FAR) and volumetric FAR are lower for the wet regions, and the volumetric missed rain is also lower for the wet regions than for the dry regions. Analysis of the MESONET-to-TMI ratio values shows that TMI tends to overestimate surface rainfall intensities below 12 mm/h; however, the magnitude of the overestimation over the wet regions is lower than over the dry regions.
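
    The POD and FAR statistics used above come from a rain/no-rain contingency count of satellite retrievals against the gauge reference. A minimal sketch with made-up data and an illustrative 0.1 mm/h detection threshold:

```python
def categorical_scores(satellite, gauge, threshold=0.1):
    """Probability of Detection and False Alarm Ratio for rain/no-rain
    detection against the gauge reference (illustrative threshold)."""
    hits = misses = false_alarms = 0
    for s, g in zip(satellite, gauge):
        sat_rain, gauge_rain = s >= threshold, g >= threshold
        if gauge_rain and sat_rain:
            hits += 1
        elif gauge_rain:
            misses += 1
        elif sat_rain:
            false_alarms += 1
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

gauge = [0.0, 1.2, 0.0, 3.4, 0.5, 0.0, 2.0, 0.0]  # mm/h at the gauge
sat   = [0.0, 1.0, 0.6, 2.8, 0.0, 0.0, 1.5, 0.0]  # mm/h retrieved by satellite
pod, far = categorical_scores(sat, gauge)
print(round(pod, 2), round(far, 2))  # -> 0.75 0.25
```

The volumetric variants reported in the study weight each hit, miss, and false alarm by the rainfall amount rather than counting occurrences.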

  19. THE ECOLOGY OF TICKS TRANSMITTING ROCKY MOUNTAIN SPOTTED FEVER IN THE EASTERN UNITED STATES.

    DTIC Science & Technology

    occurrence of Rocky Mountain spotted fever in Virginia, based upon analysis of medical case records reported to the Virginia State Health Department and the...some reports of laboratory investigations done in support of the field investigations. Infection with Rocky Mountain spotted fever was found in 6

  20. RAMP: a computer system for mapping regional areas

    Treesearch

    Bradley B. Nickey

    1975-01-01

    Until 1972, the U.S. Forest Service's Individual Fire Reports recorded locations by the section-township-range system. These earlier fire reports, therefore, lacked congruent locations. RAMP (Regional Area Mapping Procedure) was designed to make the reports more useful for quantitative analysis. This computer-based technique converts locations expressed in...

  1. Monitoring Performance: It's All in the Data.

    ERIC Educational Resources Information Center

    Ward, Phillip

    1992-01-01

    Data collection and analysis help coaches ensure that they conduct practices and make decisions based on objective and consistently recorded information. The article discusses how, where, and when coaches should collect data on athletes, noting that data collection should include training loads, body weight, sleep, health, and injuries. (SM)

  2. Estimating error cross-correlations in soil moisture data sets using extended collocation analysis

    USDA-ARS?s Scientific Manuscript database

    Consistent global soil moisture records are essential for studying the role of hydrologic processes within the larger earth system. Various studies have shown the benefit of assimilating satellite-based soil moisture data into water balance models or merging multi-source soil moisture retrievals int...

  3. Students' Experiences in Interdisciplinary Problembased Learning: A Discourse Analysis of Group Interaction

    ERIC Educational Resources Information Center

    Imafuku, Rintaro; Kataoka, Ryuta; Mayahara, Mitsuori; Suzuki, Hisayoshi; Saiki, Takuya

    2014-01-01

    Interdisciplinary problem-based learning (PBL) aims to provide students with opportunities to develop the skills needed to work with different health professionals in a collaborative manner. This discourse study examined the processes of collective knowledge construction among Japanese students in PBL tutorials. Analyses of video-recorded data…

  4. Empirical Data Collection and Analysis Using Camtasia and Transana

    ERIC Educational Resources Information Center

    Thorsteinsson, Gisli; Page, Tom

    2009-01-01

    One possible technique for collecting empirical data is video recording of a computer screen with specific screen-capture software. This method of collecting empirical data shows how students use BSCWII (Be Smart Cooperate Worldwide--a web-based collaboration/groupware environment) to coordinate their work and collaborate in…

  5. 47 CFR 36.214 - Long distance message revenue-Account 5100.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... relative number of minutes-of-use in the study area. Effective July 1, 2001 through June 30, 2012, all study areas shall apportion Wideband Message Service revenues among the jurisdictions using the relative... are directly assigned based on their subsidiary record categories or on the basis of analysis and...

  6. Seismic response analysis of an instrumented building structure

    USGS Publications Warehouse

    Li, H.-J.; Zhu, S.-Y.; Celebi, M.

    2003-01-01

    The Sheraton-Universal Hotel, an instrumented building in North Hollywood, USA, is selected as a case study in this paper. The finite element method is used to produce a linear time-invariant structural model, and the SAP2000 program is employed for time-history analysis of the instrumented structure under the base excitation of strong motions recorded in the basement during the Northridge, California earthquake of 17 January 1994. The calculated structural responses are compared with the recorded data in both the time domain and the frequency domain, and the effects of structural parameter evaluation and indeterminate factors are discussed. Some features of the structural response, such as why the peak acceleration responses on the ninth floor are larger than those on the sixteenth floor, are also explained.

  7. An animal tracking system for behavior analysis using radio frequency identification.

    PubMed

    Catarinucci, Luca; Colella, Riccardo; Mainetti, Luca; Patrono, Luigi; Pieretti, Stefano; Secco, Andrea; Sergi, Ilaria

    2014-09-01

    Evaluating the behavior of mice and rats has substantially contributed to the progress of research in many scientific fields. Researchers commonly observe recorded video of animal behavior and manually record their observations for later analysis, but this approach has several limitations. The authors developed an automated system for tracking and analyzing the behavior of rodents that is based on radio frequency identification (RFID) in an ultra-high-frequency bandwidth. They provide an overview of the system's hardware and software components as well as describe their technique for surgically implanting passive RFID tags in mice. Finally, the authors present the findings of two validation studies to compare the accuracy of the RFID system versus commonly used approaches for evaluating the locomotor activity and object exploration of mice.

  8. Changes in behavior as side effects in methylphenidate treatment: review of the literature.

    PubMed

    Konrad-Bindl, Doris Susanne; Gresser, Ursula; Richartz, Barbara Maria

    2016-01-01

    Our review of the scientific literature focused on an analysis of studies describing instances of methylphenidate treatment leading (or not) to behavioral changes in the pediatric, adolescent, and adult populations. We conducted a literature search in PubMed, Medline, and Google using the keywords "methylphenidate", "behavioral changes", "adverse effects", and "side effects". A total of 44 studies were identified as reporting on the effects and adverse effects of methylphenidate administration and were included in the analysis. Five studies specifically set out to study, record, and discuss changes in behavior. Eight studies did not set out to study behavioral effects but nevertheless recorded and discussed them. A total of 28 studies recorded behavioral effects but failed to discuss them further. Three studies did not include behavioral effects. This review records what data have been published on changes in behavior associated with the use of methylphenidate. While there is some evidence to suggest that methylphenidate causes changes in behavior, the majority of the studies reviewed paid little or no attention to this issue. Based on the available data, it is impossible to determine the point at which such behavioral effects occur, and the frequency of their occurrence cannot be established with certainty. Nor is it possible to determine whether behavioral effects persist once treatment is discontinued. In conclusion, despite countless publications and extensive administration, especially to children, we have insufficient data to judge the long-term effects and risks of taking methylphenidate.

  9. A large-aperture low-cost hydrophone array for tracking whales from small boats.

    PubMed

    Miller, B; Dawson, S

    2009-11-01

    A passive sonar array designed for tracking diving sperm whales in three dimensions from a single small vessel is presented, and the advantages and limitations of operating this array from a 6 m boat are described. The system consists of four free floating buoys, each with a hydrophone, built-in recorder, and global positioning system receiver (GPS), and one vertical stereo hydrophone array deployed from the boat. Array recordings are post-processed onshore to obtain diving profiles of vocalizing sperm whales. Recordings are synchronized using a GPS timing pulse recorded onto each track. Sensitivity analysis based on hyperbolic localization methods is used to obtain probability distributions for the whale's three-dimensional location for vocalizations received by at least four hydrophones. These localizations are compared to those obtained via isodiachronic sequential bound estimation. Results from deployment of the system around a sperm whale in the Kaikoura Canyon in New Zealand are shown.
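The hyperbolic localization step described above can be sketched in a few lines: given the hydrophone positions and the time-differences-of-arrival (TDOAs) of a click relative to a reference hydrophone, search for the source position that best explains the measurements. This is only an illustration of the idea, not the authors' software; the sound speed, receiver layout, and brute-force grid search are assumptions made here for clarity (a real system would use a probabilistic or least-squares solver and propagate timing uncertainty).

```python
import math

SOUND_SPEED = 1500.0  # m/s, nominal speed of sound in seawater (assumption)

def tdoa_residual(candidate, receivers, tdoas, c=SOUND_SPEED):
    """Sum of squared differences between the measured TDOAs (relative to
    receiver 0) and those predicted for a candidate source position."""
    dists = [math.dist(candidate, r) for r in receivers]
    err = 0.0
    for i in range(1, len(receivers)):
        predicted = (dists[i] - dists[0]) / c
        err += (tdoas[i] - predicted) ** 2
    return err

def grid_localize(receivers, tdoas, xs, ys, zs):
    """Brute-force search over a coarse 3-D grid for the position that best
    explains the measured TDOAs (a stand-in for hyperbolic localization)."""
    best, best_err = None, float("inf")
    for x in xs:
        for y in ys:
            for z in zs:
                e = tdoa_residual((x, y, z), receivers, tdoas)
                if e < best_err:
                    best, best_err = (x, y, z), e
    return best
```

With five non-coplanar receivers (as in the four buoys plus vertical array), noise-free TDOAs pin the source down to a single grid cell; in practice the residual surface widens into the probability distributions the abstract mentions.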

  10. [Quality assurance of hospital medical records as a risk management tool].

    PubMed

    Terranova, Giuseppina; Cortesi, Elisabetta; Briani, Silvia; Giannini, Raffaella

    2006-01-01

    A retrospective analysis of hospital medical records was performed jointly by the Medicolegal department of the Pistoia Local Health Unit N. 3 and by the management of the SS. Cosma and Damiano di Pescia Hospital. Evaluation was based on ANDEM criteria, JCAHO standards, and the 1992 discharge abstract guidelines of the Italian Health Ministry. In the first phase of the study, data were collected and processed for each hospital ward and then discussed with clinicians and audited. After auditing, appropriate actions were agreed upon for correcting identified problems. Approximately one year later a second smaller sample of medical records was evaluated and a higher compliance rate with the established corrective actions was found in all wards for all data categories. In this study the evaluation of medical records can be considered in the wider context of risk management, a multidisciplinary process directed towards identifying and monitoring risk through the use of appropriate quality indicators.

  11. Closed-loop optical neural stimulation based on a 32-channel low-noise recording system with online spike sorting

    NASA Astrophysics Data System (ADS)

    Nguyen, T. K. T.; Navratilova, Z.; Cabral, H.; Wang, L.; Gielen, G.; Battaglia, F. P.; Bartic, C.

    2014-08-01

    Objective. Closed-loop operation of neuro-electronic systems is desirable for both scientific and clinical (neuroprosthesis) applications. Integrating optical stimulation with recording capability further enhances the selectivity of neural stimulation. We have developed a system enabling the local delivery of optical stimuli and the simultaneous electrical measurement of neural activity in a closed-loop approach. Approach. The signal analysis is performed online through the implementation of a template matching algorithm. The system performance is demonstrated with recorded data and in awake rats. Main results. Specifically, neural activities are simultaneously recorded, detected, and classified online (through spike sorting) from 32 channels, and used to trigger a light-emitting diode (LED) light source via generated TTL signals. Significance. A total processing time of 8 ms is achieved, suitable for online optogenetic studies of brain mechanisms.
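The online template matching the abstract mentions can be illustrated with a minimal sketch: a detected spike snippet is compared against stored unit templates by normalized correlation and assigned to the best match, which could then gate the TTL trigger. This is an assumption-laden toy version, not the published implementation; the similarity measure, the 0.8 acceptance threshold, and the template shapes are invented for illustration.

```python
import math

def norm_corr(a, b):
    """Normalized cross-correlation (cosine similarity) of two equal-length
    snippets; 1.0 means a perfect shape match regardless of amplitude."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    if na == 0 or nb == 0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def classify_spike(snippet, templates, threshold=0.8):
    """Assign a detected spike snippet to the best-matching unit template
    (the online spike-sorting step); return None if nothing is close enough."""
    scores = {name: norm_corr(snippet, t) for name, t in templates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```

Because the comparison is a handful of multiply-adds per template, this style of classifier is cheap enough to run inside a millisecond-scale closed loop.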

  12. Electronic 12-Hour Dietary Recall (e-12HR): Comparison of a Mobile Phone App for Dietary Intake Assessment With a Food Frequency Questionnaire and Four Dietary Records.

    PubMed

    Béjar, Luis María; Reyes, Óscar Adrián; García-Perea, María Dolores

    2018-06-15

    One of the greatest challenges in nutritional epidemiology is improving upon traditional self-reporting methods for the assessment of habitual dietary intake. The aim of this study was to evaluate the relative validity of a new method known as the current-day dietary recall (or current-day recall), based on a smartphone app called 12-hour dietary recall, for determining the habitual intake of a series of key food and drink groups using a food frequency questionnaire (FFQ) and four dietary records as reference methods. University students over the age of 18 years recorded their consumption of certain groups of food and drink using 12-hour dietary recall for 28 consecutive days. During this 28-day period, they also completed four dietary records on randomly selected days. Once the monitoring period was over, subjects then completed an FFQ. The two methods were compared using the Spearman correlation coefficient (SCC), a cross-classification analysis, and weighted kappa. A total of 87 participants completed the study (64% women, 56/87; 36% men, 31/87). For e-12HR versus FFQ, for all food and drink groups, the average SCC was 0.70. Cross-classification analysis revealed that the average percentage of individuals classified in the exact agreement category was 51.5%; exact agreement + adjacent was 91.8%, and no participant (0%) was classified in the extreme disagreement category. The average weighted kappa was 0.51. For e-12HR versus the four dietary records, for all food and drink groups, the average SCC was 0.63. Cross-classification analysis revealed that the average percentage of individuals classified in the exact agreement category was 47.1%; exact agreement + adjacent was 89.2%; and no participant (0%) was classified in the extreme disagreement category. The average weighted kappa was 0.47. 
Current-day recall, based on the 12-hour dietary recall app, was found to be in good agreement with the two reference methods (FFQ and four dietary records), demonstrating its potential usefulness for categorizing individuals according to their habitual dietary intake of certain food and drink groups. ©Luis María Béjar, Óscar Adrián Reyes, María Dolores García-Perea. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 15.06.2018.
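The agreement statistics used in the comparison, Spearman correlation and weighted kappa, can be computed directly. The sketch below is a minimal pure-Python illustration, not the study's analysis code; it assumes linear kappa weights and, for brevity, rank data without ties.

```python
def ranks(values):
    """Rank observations from 1..n (assumes no ties, for brevity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

def weighted_kappa(r1, r2, k):
    """Linearly weighted kappa for two raters over k ordered categories
    (0..k-1); disagreements are penalized by how many categories apart."""
    n = len(r1)
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    do = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    de = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - do / de
```

Perfect agreement yields kappa of 1.0 and maximal ordered disagreement yields -1.0, which is why values around 0.5, as reported above, are read as moderate agreement.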

  13. Implications of high amplitude atmospheric CO2 fluctuations on past millennium climate change

    NASA Astrophysics Data System (ADS)

    van Hoof, Thomas; Kouwenberg, Lenny; Wagner-Cremer, Friederike; Visscher, Henk

    2010-05-01

    Stomatal frequency analysis of leaves of land plants preserved in peat and lake deposits can provide a proxy record of pre-industrial atmospheric CO2 concentration complementary to measurements in Antarctic ice cores. Stomatal frequency based CO2 trends from the USA and NW Europe support the presence of significant CO2 variability during the first half of the last millennium (Kouwenberg et al., 2005; Wagner et al., 2004; van Hoof et al., 2008). The timing of the most significant perturbation in the stomata records (1200 AD) is in agreement with an observed CO2 fluctuation in the D47 Antarctic ice-core record (Barnola et al., 1995; van Hoof et al., 2005). The amplitude of the stomatal frequency based CO2 changes (>34 ppmv) exceeds the maximum amplitude of CO2 variability in the D47 ice core (<10 ppmv). A modelling experiment taking into account firn-densification based smoothing processes in the D47 ice core showed, however, that the amplitude difference between the stomata record and the D47 ice core can be explained by natural smoothing processes in the ice (van Hoof et al., 2005). This observation gives credence to the existence of high-amplitude CO2 fluctuations during the last millennium and suggests that high resolution ice core CO2 records should be regarded as a smoothed representation of the atmospheric CO2 signal. In the present study, potential marine and terrestrial sources and sinks associated with the observed atmospheric CO2 perturbation will be discussed. The magnitude of the observed CO2 variability implies that inferred changes in CO2 radiative forcing are of a similar magnitude to variations ascribed to other forcing mechanisms (e.g. solar forcing and volcanism), thereby challenging the IPCC concept of CO2 as an insignificant preindustrial climate forcing factor.
References
Barnola J.M., M. Anklin, J. Porcheron, D. Raynaud, J. Schwander and B. Stauffer 1995. CO2 evolution during the last millennium as recorded by Antarctic and Greenland ice. Tellus, v. 47B, pp. 264-272.
Kouwenberg L.L.R., F. Wagner, W.M. Kürschner and H. Visscher 2005. Atmospheric CO2 fluctuations during the last millennium reconstructed by stomatal frequency analysis of Tsuga heterophylla needles. Geology, v. 33, no. 1, pp. 33-36.
van Hoof T.B., K.A. Kaspers, F. Wagner, R.S.W. van de Wal, W.M. Kürschner and H. Visscher 2005. Atmospheric CO2 during the 13th century AD: reconciliation of data from ice core measurements and stomatal frequency analysis. Tellus B, v. 57, pp. 351-355.
van Hoof T.B., F. Wagner-Cremer, W.M. Kürschner and H. Visscher 2008. A role for atmospheric CO2 in preindustrial climate forcing. Proceedings of the National Academy of Sciences of the USA, v. 105, no. 41, pp. 15815-15818.
Wagner F., L.L.R. Kouwenberg, T.B. van Hoof and H. Visscher 2004. Reproducibility of Holocene atmospheric CO2 records based on stomatal frequency. Quaternary Science Reviews, v. 23, pp. 1947-1954.

  14. Design and development of an ethnically-diverse imaging informatics-based eFolder system for multiple sclerosis patients.

    PubMed

    Ma, Kevin C; Fernandez, James R; Amezcua, Lilyana; Lerner, Alex; Shiroishi, Mark S; Liu, Brent J

    2015-12-01

    MRI has been used to identify multiple sclerosis (MS) lesions in the brain and spinal cord visually. Integrating patient information into an electronic patient record system has become key to modern patient care in recent years. Clinically, it is also necessary to track patients' progress in longitudinal studies in order to provide a comprehensive understanding of disease progression and response to treatment. As the amount of required data increases, there is a need for an efficient systematic solution to store and analyze MS patient data, disease profiles, and disease tracking for both clinical and research purposes. An imaging informatics based system, called MS eFolder, has been developed as an integrated patient record system for data storage and analysis of MS patients. The eFolder system, with a DICOM-based database, includes a module for lesion contouring by radiologists, an MS lesion quantification tool to quantify MS lesion volume in 3D, and brain parenchyma fraction analysis, and provides quantitative analysis and tracking of volume changes in longitudinal studies. Patient data, including MR images, have been collected retrospectively at the University of Southern California Medical Center (USC) and Los Angeles County Hospital (LAC). The MS eFolder utilizes web-based components, such as a browser-based graphical user interface (GUI) and a web-based database. The eFolder database stores patient clinical data (demographics, MS disease history, family history, etc.), MR imaging-related data found in DICOM headers, and lesion quantification results. Lesion quantification results are derived from radiologists' contours on brain MRI studies and quantified into 3-dimensional volumes and locations. Quantified results of white matter lesions are integrated into a structured report based on the DICOM-SR protocol and templates. The user interface displays patient clinical information, original MR images, and structured reports of quantified results. 
The GUI also includes a data mining tool to handle unique search queries for MS. System workflow and dataflow steps have been designed based on the IHE post-processing workflow profile, including workflow process tracking, MS lesion contouring and quantification of MR images at a post-processing workstation, and storage of quantitative results as DICOM-SR in a DICOM-based storage system. The web-based GUI is designed to display zero-footprint DICOM web-accessible data objects (WADO) and the SR objects. The MS eFolder system has been designed and developed as an integrated data storage and mining solution for both clinical and research environments, while providing unique features such as quantitative lesion analysis and disease tracking over a longitudinal study. The comprehensive integrated image and clinical database provided by MS eFolder offers a platform for treatment assessment, outcomes analysis, and decision support. The proposed system also serves as a platform for future quantitative analysis derived automatically from CAD algorithms, which can be integrated within the system for individual disease tracking and future MS-related research. Ultimately, the eFolder provides a decision-support infrastructure that can eventually add value to the overall electronic medical record. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Design and development of an ethnically-diverse imaging informatics-based eFolder system for multiple sclerosis patients

    PubMed Central

    Ma, Kevin C.; Fernandez, James R.; Amezcua, Lilyana; Lerner, Alex; Shiroishi, Mark S.; Liu, Brent J.

    2016-01-01

    Purpose MRI has been used to identify multiple sclerosis (MS) lesions in the brain and spinal cord visually. Integrating patient information into an electronic patient record system has become key to modern patient care in recent years. Clinically, it is also necessary to track patients' progress in longitudinal studies in order to provide a comprehensive understanding of disease progression and response to treatment. As the amount of required data increases, there is a need for an efficient systematic solution to store and analyze MS patient data, disease profiles, and disease tracking for both clinical and research purposes. Method An imaging informatics based system, called MS eFolder, has been developed as an integrated patient record system for data storage and analysis of MS patients. The eFolder system, with a DICOM-based database, includes a module for lesion contouring by radiologists, an MS lesion quantification tool to quantify MS lesion volume in 3D, and brain parenchyma fraction analysis, and provides quantitative analysis and tracking of volume changes in longitudinal studies. Patient data, including MR images, have been collected retrospectively at the University of Southern California Medical Center (USC) and Los Angeles County Hospital (LAC). The MS eFolder utilizes web-based components, such as a browser-based graphical user interface (GUI) and a web-based database. The eFolder database stores patient clinical data (demographics, MS disease history, family history, etc.), MR imaging-related data found in DICOM headers, and lesion quantification results. Lesion quantification results are derived from radiologists' contours on brain MRI studies and quantified into 3-dimensional volumes and locations. Quantified results of white matter lesions are integrated into a structured report based on the DICOM-SR protocol and templates. 
The user interface displays patient clinical information, original MR images, and structured reports of quantified results. The GUI also includes a data mining tool to handle unique search queries for MS. System workflow and dataflow steps have been designed based on the IHE post-processing workflow profile, including workflow process tracking, MS lesion contouring and quantification of MR images at a post-processing workstation, and storage of quantitative results as DICOM-SR in a DICOM-based storage system. The web-based GUI is designed to display zero-footprint DICOM web-accessible data objects (WADO) and the SR objects. Summary The MS eFolder system has been designed and developed as an integrated data storage and mining solution for both clinical and research environments, while providing unique features such as quantitative lesion analysis and disease tracking over a longitudinal study. The comprehensive integrated image and clinical database provided by MS eFolder offers a platform for treatment assessment, outcomes analysis, and decision support. The proposed system also serves as a platform for future quantitative analysis derived automatically from CAD algorithms, which can be integrated within the system for individual disease tracking and future MS-related research. Ultimately, the eFolder provides a decision-support infrastructure that can eventually add value to the overall electronic medical record. PMID:26564667

  16. Speleothem records of western Mediterranean. Hydrological variability along the Last Interglacial Period and marine linkages

    NASA Astrophysics Data System (ADS)

    Torner, Judit; Cacho, Isabel; Moreno, Ana; Stoll, Heather; Belmonte, Anchel; Sierro, Francisco J.; Frigola, Jaime; Martrat, Belen; Fornós, Joan; Arnau Fernández, Pedro; Hellstrom, John; Cheng, Hai; Edwards, R. Lawrence

    2016-04-01

    This study aims to identify and characterize regional hydrological variability in the western Mediterranean region on the basis of different geochemical parameters (δ18O, δ13C, and Mg/Ca ratios). Speleothems have been recovered from several caves, one located in the southern central Pyrenees and the others in the Balearic Islands. Their chronologies have been constructed from U/Th absolute dating and indicate that the speleothem sequences cover the end of the last interglacial and the glacial inception. One of the most remarkable features of the records is the intense and abrupt shift toward more arid conditions that marks the end of the last interglacial (MIS 5e). Furthermore, our speleothem records also show relatively humid but highly variable hydrological conditions during the interstadial periods from MIS 5c to 5a. These speleothem records have been compared with newly generated western Mediterranean marine records from the Balearic Sea (MD99-2343) and the Alboran Sea (ODP-977). The marine records include (1) proxies of sea surface temperature and of changes in evaporation-precipitation rates, based on paired analysis of δ18O and Mg/Ca ratios in the planktonic foraminifer Globigerina bulloides; and (2) proxies of deep-water currents associated with the Western Mediterranean Deep Water (WMDW), based on grain size analyses. The results reveal that arid conditions on land were coeval with cold sea surface sub-stages (MIS 5b and 5d) and with increases in the intensity of WMDW-related currents. By contrast, humid and hydrologically unstable atmospheric conditions were synchronous with warm sea surface sub-stages and lower WMDW-related current intensities (MIS 5a, c, and e). Consequently, our results provide strong evidence of atmospheric-oceanic coupling, involving parallel changes in both surface and deep western Mediterranean Sea conditions during the last interglacial period and the glacial inception.

  17. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  18. Commissioning of a CERN Production and Analysis Facility Based on xrootd

    NASA Astrophysics Data System (ADS)

    Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim

    2011-12-01

    The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and for exporting data to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper describes the xrootd-based CERN production and analysis facility for the ATLAS experiment, and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.

  19. Determination of subthalamic nucleus location by quantitative analysis of despiked background neural activity from microelectrode recordings obtained during deep brain stimulation surgery.

    PubMed

    Danish, Shabbar F; Baltuch, Gordon H; Jaggi, Jurg L; Wong, Stephen

    2008-04-01

    Microelectrode recording during deep brain stimulation surgery is a useful adjunct for subthalamic nucleus (STN) localization. We hypothesize that information in the nonspike background activity can help identify STN boundaries. We present results from a novel quantitative analysis that accomplishes this goal. Thirteen consecutive microelectrode recordings were retrospectively analyzed. Spikes were removed from the recordings with an automated algorithm. The remaining "despiked" signals were converted via root mean square amplitude and curve length calculations into "feature profile" time series. Subthalamic nucleus boundaries determined by inspection, based on sustained deviations from baseline for each feature profile, were compared against those determined intraoperatively by the clinical neurophysiologist. Feature profile activity within STN exhibited a sustained rise in 10 of 13 tracks (77%). The sensitivity of STN entry was 60% and 90% for curve length and root mean square amplitude, respectively, when agreement within 0.5 mm of the neurophysiologist's prediction was used. Sensitivities were 70% and 100% for 1 mm accuracy. Exit point sensitivities were 80% and 90% for both features within 0.5 mm and 1.0 mm, respectively. Reproducible activity patterns in deep brain stimulation microelectrode recordings can allow accurate identification of STN boundaries. Quantitative analyses of this type may provide useful adjunctive information for electrode placement in deep brain stimulation surgery.
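The feature-profile construction described above (a despiked signal converted into root mean square amplitude and curve length time series) can be sketched as a simple windowed computation. The window length and the use of non-overlapping windows are assumptions for illustration; the paper's exact parameters are not reproduced here.

```python
import math

def sliding_features(signal, win):
    """Convert a despiked recording into 'feature profile' time series:
    root-mean-square amplitude and curve length over non-overlapping windows."""
    rms, clen = [], []
    for start in range(0, len(signal) - win + 1, win):
        w = signal[start:start + win]
        rms.append(math.sqrt(sum(x * x for x in w) / win))
        # curve length: total absolute point-to-point variation in the window
        clen.append(sum(abs(w[i + 1] - w[i]) for i in range(win - 1)))
    return rms, clen
```

A sustained rise in either profile above its baseline is the kind of deviation the authors used to mark entry into the STN; curve length responds to high-frequency activity even when amplitude barely changes.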

  20. Barriers to Retrieving Patient Information from Electronic Health Record Data: Failure Analysis from the TREC Medical Records Track

    PubMed Central

    Edinger, Tracy; Cohen, Aaron M.; Bedrick, Steven; Ambert, Kyle; Hersh, William

    2012-01-01

    Objective: Secondary use of electronic health record (EHR) data relies on the ability to retrieve accurate and complete information about desired patient populations. The Text Retrieval Conference (TREC) 2011 Medical Records Track was a challenge evaluation allowing comparison of systems and algorithms to retrieve patients eligible for clinical studies from a corpus of de-identified medical records, grouped by patient visit. Participants retrieved cohorts of patients relevant to 35 different clinical topics, and visits were judged for relevance to each topic. This study identified the most common barriers to identifying specific clinic populations in the test collection. Methods: Using the runs from track participants and judged visits, we analyzed the five non-relevant visits most often retrieved and the five relevant visits most often overlooked. Categories were developed iteratively to group the reasons for incorrect retrieval for each of the 35 topics. Results: Reasons fell into nine categories for non-relevant visits and five categories for relevant visits. Non-relevant visits were most often retrieved because they contained a non-relevant reference to the topic terms. Relevant visits were most often infrequently retrieved because they used a synonym for a topic term. Conclusions: This failure analysis provides insight into areas for future improvement in EHR-based retrieval with techniques such as more widespread and complete use of standardized terminology in retrieval and data entry systems. PMID:23304287

  1. Barriers to retrieving patient information from electronic health record data: failure analysis from the TREC Medical Records Track.

    PubMed

    Edinger, Tracy; Cohen, Aaron M; Bedrick, Steven; Ambert, Kyle; Hersh, William

    2012-01-01

    Secondary use of electronic health record (EHR) data relies on the ability to retrieve accurate and complete information about desired patient populations. The Text Retrieval Conference (TREC) 2011 Medical Records Track was a challenge evaluation allowing comparison of systems and algorithms to retrieve patients eligible for clinical studies from a corpus of de-identified medical records, grouped by patient visit. Participants retrieved cohorts of patients relevant to 35 different clinical topics, and visits were judged for relevance to each topic. This study identified the most common barriers to identifying specific clinic populations in the test collection. Using the runs from track participants and judged visits, we analyzed the five non-relevant visits most often retrieved and the five relevant visits most often overlooked. Categories were developed iteratively to group the reasons for incorrect retrieval for each of the 35 topics. Reasons fell into nine categories for non-relevant visits and five categories for relevant visits. Non-relevant visits were most often retrieved because they contained a non-relevant reference to the topic terms. Relevant visits were most often infrequently retrieved because they used a synonym for a topic term. This failure analysis provides insight into areas for future improvement in EHR-based retrieval with techniques such as more widespread and complete use of standardized terminology in retrieval and data entry systems.

  2. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S wave arrivals based on their lower apparent phase velocities and distinct higher amount of generated rotational motion relative to compressional (P-) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. 
The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic material showing high (>0.4 down to 60 m) Poisson's ratios. Our new model can be used in future studies to better constrain the deep interior of the Moon. Given the rich information derived from the minimalistic recording configuration, our results demonstrate that wavefield gradient analysis should be critically considered for future space missions that aim to explore the interior structure of extraterrestrial objects by seismic methods. Additionally, we anticipate that the proposed shear wave identification methodology can also be applied to the routinely recorded vertical component data from land seismic exploration on Earth.
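The core idea of wavefield gradient analysis, estimating apparent phase velocity from the ratio of the temporal to the spatial derivative of the recorded wavefield, can be illustrated with a synthetic plane wave. The finite-difference spacings and the single-frequency wave below are illustrative assumptions, not the LSPE processing parameters.

```python
import math

def apparent_velocity(u, x, t, dx=1.0, dt=1e-4):
    """Estimate apparent horizontal phase velocity at (x, t) from central
    finite-difference approximations of the temporal and spatial gradients
    of a wavefield u(x, t): v_app = -(du/dt)/(du/dx) for a plane wave."""
    dudt = (u(x, t + dt) - u(x, t - dt)) / (2 * dt)
    dudx = (u(x + dx, t) - u(x - dx, t)) / (2 * dx)
    return -dudt / dudx

# Synthetic plane wave travelling at c = 2000 m/s (illustrative values)
c, f = 2000.0, 10.0
omega = 2 * math.pi * f
k = omega / c
plane = lambda x, t: math.sin(omega * t - k * x)
```

Because S waves travel slower than P waves, their lower apparent velocity (together with their stronger rotational signature) is what lets gradient-based analysis pick them out of the reverberant coda.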

  3. MindEdit: A P300-based text editor for mobile devices.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2017-01-01

    Practical application of Brain-Computer Interfaces (BCIs) requires that the whole BCI system be portable. The mobility of BCI systems involves two aspects: making the electroencephalography (EEG) recording devices portable, and developing software applications with low computational complexity that can run on low computational-power devices such as tablets and smartphones. This paper addresses the development of MindEdit, a P300-based text editor for Android-based devices. Given the limited resources and computational power of mobile devices, a novel ensemble classifier is employed that uses Principal Component Analysis (PCA) features to identify P300 evoked potentials in EEG recordings. PCA computations in the proposed method are performed per channel, as opposed to concatenating all channels as in traditional feature extraction methods; the method thus has lower computational complexity than traditional P300 detection methods. The performance of the method is demonstrated on data recorded with MindEdit on an Android tablet using the Emotiv wireless neuroheadset. Results demonstrate the capability of the introduced PCA ensemble classifier to classify P300 data with a maximum average accuracy of 78.37±16.09% for cross-validation data and 77.5±19.69% for online test data using only 10 trials per symbol and a 33-character training dataset. Our analysis indicates that the introduced method outperforms traditional feature extraction methods. For faster operation of MindEdit, a variable-number-of-trials scheme is introduced that resulted in an online average accuracy of 64.17±19.6% and a maximum bitrate of 6.25 bit/min. These results demonstrate the efficacy of using the developed BCI application with mobile devices. Copyright © 2016 Elsevier Ltd. All rights reserved.
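    The channel-wise PCA idea can be sketched as follows. This is a hedged illustration, not the authors' implementation: each channel gets its own PCA basis computed from a small per-channel covariance matrix, avoiding the single large eigendecomposition of the concatenated-channel approach. The data shapes and component count are assumptions.

```python
import numpy as np

def channelwise_pca_features(epochs, n_components=3):
    """epochs: array (n_trials, n_channels, n_samples); returns (n_trials, n_channels * n_components)."""
    n_trials, n_channels, _ = epochs.shape
    feats = []
    for ch in range(n_channels):
        X = epochs[:, ch, :]                  # all trials for this channel
        Xc = X - X.mean(axis=0)               # center over trials
        cov = Xc.T @ Xc / max(n_trials - 1, 1)
        w, V = np.linalg.eigh(cov)            # eigenvalues ascending
        comps = V[:, ::-1][:, :n_components]  # top principal components
        feats.append(Xc @ comps)              # per-channel projections
    return np.hstack(feats)

rng = np.random.default_rng(0)
epochs = rng.standard_normal((20, 14, 64))    # e.g. 14-channel Emotiv-like epochs
F = channelwise_pca_features(epochs)
print(F.shape)  # (20, 42)
```

Each eigendecomposition here is on a 64×64 matrix instead of one (14·64)×(14·64) matrix, which is the complexity saving the abstract describes.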

  4. Results from a survey of national immunization programmes on home-based vaccination record practices in 2013

    PubMed Central

    Young, Stacy L.; Gacic-Dobo, Marta; Brown, David W.

    2015-01-01

    Background Data on home-based record (HBR) practices within national immunization programmes are non-existent, making it difficult to determine whether current efforts of immunization programmes related to basic recording of immunization services are appropriately focused. Methods During January 2014, WHO and the United Nations Children's Fund sent a one-page questionnaire to 195 countries to obtain information on HBRs, including the type of record used, the number of records printed, whether records were provided free of charge or required by schools, whether there was a stock-out and the duration of any stock-outs that occurred, as well as the total expenditure for printing HBRs during 2013. Results A total of 140 countries returned a completed HBR questionnaire. Two countries were excluded from analysis because they did not use an HBR during 2013. HBR types varied across countries (vaccination-only cards, 32/138 [23.1%]; vaccination plus growth monitoring records, 31/138 [22.4%]; child health books, 48/138 [34.7%]; a combination of these, 27/138 [19.5%] countries). HBRs were provided free of charge in 124/138 (89.8%) respondent countries. HBRs were required for school entry in 62/138 (44.9%) countries. Nearly a quarter of countries reported HBR stock-outs during 2013. Computed printing cost per record was

  5. Results from a survey of national immunization programmes on home-based vaccination record practices in 2013.

    PubMed

    Young, Stacy L; Gacic-Dobo, Marta; Brown, David W

    2015-07-01

    Data on home-based record (HBR) practices within national immunization programmes are non-existent, making it difficult to determine whether current efforts of immunization programmes related to basic recording of immunization services are appropriately focused. During January 2014, WHO and the United Nations Children's Fund sent a one-page questionnaire to 195 countries to obtain information on HBRs, including the type of record used, the number of records printed, whether records were provided free of charge or required by schools, whether there was a stock-out and the duration of any stock-outs that occurred, as well as the total expenditure for printing HBRs during 2013. A total of 140 countries returned a completed HBR questionnaire. Two countries were excluded from analysis because they did not use an HBR during 2013. HBR types varied across countries (vaccination-only cards, 32/138 [23.1%]; vaccination plus growth monitoring records, 31/138 [22.4%]; child health books, 48/138 [34.7%]; a combination of these, 27/138 [19.5%] countries). HBRs were provided free of charge in 124/138 (89.8%) respondent countries. HBRs were required for school entry in 62/138 (44.9%) countries. Nearly a quarter of countries reported HBR stock-outs during 2013. Computed printing cost per record was

  6. Analysis and Visualization of 3D Motion Data for UPDRS Rating of Patients with Parkinson's Disease.

    PubMed

    Piro, Neltje E; Piro, Lennart K; Kassubek, Jan; Blechschmidt-Trapp, Ronald A

    2016-06-21

    Remote monitoring of Parkinson's Disease (PD) patients with inertial sensors is a relevant method for better assessment of symptoms. We present a new approach to symptom quantification based on motion data: automatic Unified Parkinson's Disease Rating Scale (UPDRS) classification combined with an animated 3D avatar that gives the neurologist the impression of having the patient live in front of them. In this study we compared the UPDRS ratings of the pronation-supination task derived from: (a) an examination based on video recordings as a clinical reference; (b) an automatically classified UPDRS; and (c) a UPDRS rating from the assessment of the animated 3D avatar. Data were recorded using Magnetic, Angular Rate, Gravity (MARG) sensors with 15 subjects performing a pronation-supination movement of the hand. After preprocessing, the data were classified with a J48 classifier and animated as a 3D avatar. The video recordings of the movements, as well as the 3D avatar, were examined by movement disorder specialists and rated on the UPDRS. The mean agreement between (a) the video-based ratings and (b) the automatically classified UPDRS is 0.48; between (a) and (c), the 3D avatar ratings, it is 0.47. The 3D avatar is thus similarly suitable for assessing the UPDRS as video recordings for the examined task and will be further developed by the research team.
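    The classification step can be illustrated with a hedged toy sketch. The paper trains a J48 decision tree on MARG-sensor features; here we only show the flavor of the idea with two simple features extracted from a pronation-supination angular-velocity trace and a hypothetical hand-set decision rule mapping them to a UPDRS-like 0-3 score. All thresholds and signals are invented for illustration.

```python
import math

def rate_pronation_supination(omega, dt):
    """omega: angular velocity about the forearm axis in deg/s, sampled every dt seconds."""
    angle, lo, hi = 0.0, 0.0, 0.0
    for w in omega:                       # integrate velocity to rotation angle
        angle += w * dt
        lo, hi = min(lo, angle), max(hi, angle)
    amplitude = hi - lo                   # rotation range in degrees
    mean_speed = sum(abs(w) for w in omega) / len(omega)
    # Hypothetical decision rules: smaller, slower movements score worse.
    if amplitude > 140 and mean_speed > 120:
        return 0                          # full, fast movement
    if amplitude > 100:
        return 1
    if amplitude > 60:
        return 2
    return 3

dt = 0.01                                 # simulated 100 Hz gyroscope
healthy = [750 * math.sin(2 * math.pi * 1.5 * k * dt) for k in range(400)]
slowed = [350 * math.sin(2 * math.pi * 1.5 * k * dt) for k in range(400)]
print(rate_pronation_supination(healthy, dt))  # 0
print(rate_pronation_supination(slowed, dt))   # 2
```

A trained decision tree replaces the hand-set thresholds in the real system, but the input (features derived from integrated sensor motion) is of this kind.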

  7. Annual and average estimates of water-budget components based on hydrograph separation and PRISM precipitation for gaged basins in the Appalachian Plateaus Region, 1900-2011

    USGS Publications Warehouse

    Nelms, David L.; Messinger, Terence; McCoy, Kurt J.

    2015-07-14

    As part of the U.S. Geological Survey’s Groundwater Resources Program study of the Appalachian Plateaus aquifers, annual and average estimates of water-budget components based on hydrograph separation and precipitation data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM) were determined at 849 continuous-record streamflow-gaging stations from Mississippi to New York, covering the period 1900 to 2011. Only complete calendar years (January to December) of streamflow record at each gage were used to determine estimates of base flow, which is that part of streamflow attributed to groundwater discharge; such estimates can serve as a proxy for annual recharge. For each year, estimates of annual base flow, runoff, and base-flow index were determined using computer programs—PART, HYSEP, and BFI—that have automated the separation procedures. These streamflow-hydrograph analysis methods are provided with version 1.0 of the U.S. Geological Survey Groundwater Toolbox, which is a new program that provides graphing, mapping, and analysis capabilities in a Windows environment. Annual values of precipitation were estimated by calculating the average of cell values intercepted by basin boundaries as previously defined in the GAGES-II dataset. Estimates of annual evapotranspiration were then calculated from the difference between precipitation and streamflow.
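    The separation idea can be sketched with a simplified fixed-interval method in the spirit of HYSEP. This is a hedged stand-in, not the USGS PART/HYSEP/BFI code: daily streamflow is split into N-day blocks, the minimum of each block is taken as base flow for every day in that block (capped by the observed flow), and the base-flow index is the ratio of base flow to total flow.

```python
def fixed_interval_baseflow(q, n=5):
    """q: daily streamflow values; returns a base-flow series of the same length."""
    base = []
    for start in range(0, len(q), n):
        block = q[start:start + n]
        bmin = min(block)                          # block minimum as base flow
        base.extend(min(bmin, qi) for qi in block) # never exceed observed flow
    return base

q = [10, 12, 30, 80, 45, 25, 18, 15, 14, 13]       # a storm peak over base flow
bf = fixed_interval_baseflow(q, n=5)
bfi = sum(bf) / sum(q)                             # base-flow index
print(bf)              # [10, 10, 10, 10, 10, 13, 13, 13, 13, 13]
print(round(bfi, 2))   # 0.44
```

The production programs use more elaborate interval widths (tied to drainage area) and smoothing, but the block-minimum logic above is the core of the fixed-interval variant.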

  8. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiersma, R; Grelewicz, Z; Belcher, A

    Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems: (1) records can be easily lost or destroyed, (2) data are difficult to access — one must physically hunt down records, (3) there are poor or no means of historical data analysis, and (4) there is no remote monitoring of machine performance off-site. To address these issues, a cloud based QA data management system was developed and implemented. Methods: A responsive tablet interface, accessible from any web browser and designed to optimize clinic workflow with easy navigation, was implemented in HTML/JavaScript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage CPU-intensive tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation across independent process threads or different servers so that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information succeeded in proactively identifying performance degradation of a LINAC CBCT scanner. Conclusion: A fully comprehensive cloud based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would otherwise have been missed by a paper or spreadsheet based QA system.
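    The resource-management pattern described in the Methods can be sketched with Python's standard library: CPU-heavy QA jobs are handed to a pool of workers so the web front end is never blocked. This is a hedged illustration; the function names and phantom list are stand-ins, not the actual system's API.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_phantom(name):
    # Stand-in for a real image-analysis routine (e.g. uniformity, resolution checks)
    return name, "pass"

def run_qa_batch(phantoms, workers=4):
    # Dispatch each phantom analysis to a worker thread and gather the results
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(analyze_phantom, phantoms))

results = run_qa_batch(["Gammex ACR", "Las Vegas", "Leeds", "Catphan"])
print(results["Catphan"])  # pass
```

For truly CPU-bound image analysis, `ProcessPoolExecutor` (or separate servers, as the abstract mentions) would replace the thread pool; the dispatch pattern is the same.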

  9. Auto correlation analysis of coda waves from local earthquakes for detecting temporal changes in shallow subsurface structures - The 2011 Tohoku-Oki, Japan, earthquake -

    NASA Astrophysics Data System (ADS)

    Nakahara, H.

    2013-12-01

    For monitoring temporal changes in subsurface structures, I propose to use auto correlation functions of coda waves from local earthquakes recorded at surface receivers, which probably contain more body waves than surface waves. Because the use of coda waves requires earthquakes, the time resolution for monitoring decreases; at regions with high seismicity, however, it may still be possible to monitor subsurface structures with sufficient time resolution. Studying the 2011 Tohoku-Oki (Mw 9.0), Japan, earthquake, for which velocity changes have already been reported by previous studies, I try to validate the method. KiK-net stations in northern Honshu are used in the analysis. For each moderate earthquake, normalized auto correlation functions of surface records are stacked over time windows in the S-wave coda. Aligning the stacked normalized auto correlation functions with time, I search for changes in the arrival times of phases. Phases at lag times of less than 1 s are studied because changes at shallow depths are the focus. Based on the stretching method, temporal variations in the arrival times are measured at the stations. Clear phase delays are found to be associated with the mainshock and to gradually recover with time. The phase delays are on the order of 10% on average, with a maximum of about 50% at some stations. For validation, a deconvolution analysis using surface and subsurface records at the same stations is conducted. The results show that the phase delays from the deconvolution analysis are slightly smaller than those from the auto correlation analysis, which implies that the phases on the auto correlations are caused by larger velocity changes at shallower depths. The auto correlation analysis seems to have an accuracy of several percent, which is much coarser than that of methods using earthquake doublets and borehole array data, so it is best suited to detecting larger changes. In spite of this disadvantage, the analysis remains attractive because it can be applied to many surface records in regions where no boreholes are available. Acknowledgements: Seismograms recorded by KiK-net, managed by the National Research Institute for Earth Science and Disaster Prevention (NIED), were used in this study. This study was partially supported by the JST J-RAPID program and JSPS KAKENHI Grant Numbers 24540449 and 23540449.
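    The stretching method used above can be sketched as follows: the current trace is resampled with trial stretch factors, and the factor giving the highest correlation with a reference trace estimates the relative arrival-time (velocity) change. This is a hedged, pure-Python illustration on a synthetic sinusoid, not the author's processing code.

```python
import math

def interp(trace, t):
    # Linear interpolation of a list at fractional index t
    i = int(t)
    if i >= len(trace) - 1:
        return trace[-1]
    frac = t - i
    return trace[i] * (1 - frac) + trace[i + 1] * frac

def corr(a, b):
    # Pearson correlation coefficient
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def best_stretch(reference, current, max_eps=0.1, steps=201):
    # Scan stretch factors and keep the one maximizing correlation
    best, best_c = 0.0, -2.0
    for k in range(steps):
        eps = -max_eps + 2 * max_eps * k / (steps - 1)
        stretched = [interp(current, i * (1 + eps)) for i in range(len(reference))]
        c = corr(reference, stretched)
        if c > best_c:
            best, best_c = eps, c
    return best

ref = [math.sin(0.3 * i) for i in range(200)]
cur = [math.sin(0.3 * i / 1.05) for i in range(220)]  # phases delayed by 5%
print(round(best_stretch(ref, cur), 3))  # 0.05
```

The recovered stretch factor tracks the relative phase delay (and hence the velocity change) between the two traces; real applications add windowing and sub-grid refinement.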

  10. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems.

    PubMed

    Seo, Hwa Jeong; Kim, Hye Hyeon; Kim, Ju Han

    2011-09-01

    Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. SWOT analysis shows that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and managed systematically. In a system based on mutual trust between an individual and an organization, the owner of the medical information grants data access only to the specific person authorized for backup. By employing SWOT analysis, we concluded that a linkage among medical organizations or between an individual and an organization can provide an efficient backup system.

  11. A SWOT Analysis of the Various Backup Scenarios Used in Electronic Medical Record Systems

    PubMed Central

    Seo, Hwa Jeong; Kim, Hye Hyeon

    2011-01-01

    Objectives Electronic medical records (EMRs) are increasingly being used by health care services. Currently, if an EMR shutdown occurs, even for a moment, patient safety and care can be seriously impacted. Our goal was to determine the methodology needed to develop an effective and reliable EMR backup system. Methods Our "independent backup system by medical organizations" paradigm implies that individual medical organizations develop their own EMR backup systems within their organizations. A "personal independent backup system" is defined as an individual privately managing his/her own medical records, whereas in a "central backup system by the government" the government controls all the data. A "central backup system by private enterprises" implies that individual companies retain control over their own data. A "cooperative backup system among medical organizations" refers to a networked system established through mutual agreement. The "backup system based on mutual trust between an individual and an organization" means that the medical information backup system at the organizational level is established through mutual trust. Results SWOT analysis shows that a cooperative backup system among medical organizations can be established through a network composed of various medical agencies and managed systematically. In a system based on mutual trust between an individual and an organization, the owner of the medical information grants data access only to the specific person authorized for backup. Conclusions By employing SWOT analysis, we concluded that a linkage among medical organizations or between an individual and an organization can provide an efficient backup system. PMID:22084811

  12. A multimedia perioperative record keeper for clinical research.

    PubMed

    Perrino, A C; Luther, M A; Phillips, D B; Levin, F L

    1996-05-01

    To develop a multimedia perioperative record keeper that provides: 1. synchronous, real-time acquisition of multimedia data; 2. on-line access to the patient's chart data; and 3. advanced data analysis capabilities through integrated multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry-standard hardware components and graphical software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by critiques from numerous users. As a result of this process, a suite of custom software applications was created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.

  13. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  14. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  15. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  16. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  17. Wayside Bearing Fault Diagnosis Based on a Data-Driven Doppler Effect Eliminator and Transient Model Analysis

    PubMed Central

    Liu, Fang; Shen, Changqing; He, Qingbo; Zhang, Ao; Liu, Yongbin; Kong, Fanrang

    2014-01-01

    A fault diagnosis strategy based on the wayside acoustic monitoring technique is investigated for locomotive bearing fault diagnosis. Inspired by the transient modeling analysis method based on correlation filtering analysis, a so-called Parametric-Mother-Doppler-Wavelet (PMDW) is constructed with six parameters, including a center characteristic frequency and five kinematic model parameters. A Doppler effect eliminator containing a PMDW generator, a correlation filtering analysis module, and a signal resampler is developed to remove the Doppler effect embedded in the recorded acoustic signal of the bearing. Through the Doppler effect eliminator, the five kinematic model parameters can be identified from the signal itself; the signal resampler then eliminates the Doppler effect using the identified parameters. Once the embedded Doppler effect is eliminated, the transient model analysis method, which can detect early bearing faults, is employed to detect localized bearing faults. The effectiveness of the proposed fault diagnosis strategy is verified via simulation studies and applications to diagnosing locomotive roller bearing defects. PMID:24803197
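    The resampling stage alone can be sketched as follows. This is a hedged illustration with the kinematic parameters assumed known (the paper identifies them from the signal itself via correlation filtering): each uniform emission time te maps to a reception time tr = te + r(te)/c, so interpolating the recording at those times undoes the Doppler distortion. All numbers below are invented for the demonstration.

```python
import numpy as np

def remove_doppler(recorded, fs, v, d, t0, c=340.0):
    """Resample a wayside recording onto uniform source-emission times.
    v: source speed (m/s), d: closest-approach distance (m), t0: closest-approach time (s)."""
    n = len(recorded)
    te = np.arange(n) / fs                      # uniform emission times
    r = np.sqrt(d ** 2 + (v * (te - t0)) ** 2)  # source-receiver distance
    idx = (te + r / c - r[0] / c) * fs          # reception time -> sample index
    return np.interp(idx, np.arange(n), recorded)

# Build a Doppler-distorted recording of a 500 Hz source tone, then restore it
fs, v, d, t0, c = 8000, 30.0, 3.0, 0.5, 340.0
te = np.arange(8000) / fs
r = np.sqrt(d ** 2 + (v * (te - t0)) ** 2)
tr = te + r / c                                  # emission -> reception mapping
tr_grid = np.arange(8000) / fs + tr[0]           # uniform receiver sampling times
recorded = np.sin(2 * np.pi * 500 * np.interp(tr_grid, tr, te))
restored = remove_doppler(recorded, fs, v, d, t0, c)
tone = np.sin(2 * np.pi * 500 * te)
print(np.max(np.abs(restored - tone)[100:-100]) < 0.1)  # True
```

The recorded tone sweeps from about 544 Hz (approaching) to 460 Hz (receding); after resampling it is restored to a constant 500 Hz, up to small interpolation error.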

  18. Building a Consistent Long-Term SSS Data Record from Multi-Satellite Measurements: A Case Study in the Eastern Tropical Pacific (SPURS-2)

    NASA Astrophysics Data System (ADS)

    Melnichenko, O.; Hacker, P. W.; Wentz, F. J.; Meissner, T.; Maximenko, N. A.; Potemra, J. T.

    2016-12-01

    To address the need for a consistent, continuous, long-term, high-resolution sea surface salinity (SSS) dataset for ocean research and applications, a trial SSS analysis is produced for the eastern tropical Pacific from multi-satellite observations. The new SSS data record is a synergy of data from two satellite missions. The beginning segment, covering the period from September 2011 to June 2015, utilizes Aquarius SSS data and is based on the optimum interpolation analysis developed at the University of Hawaii. The analysis is produced on a 0.25-degree grid and uses a dedicated bias-correction algorithm to correct the satellite retrievals for large-scale biases with respect to in-situ data. The time series is continued with the Soil Moisture Active Passive (SMAP) satellite-based SSS data provided by Remote Sensing Systems (RSS). To ensure consistency and continuity in the data record, SMAP SSS fields are adjusted using a set of optimally designed spatial filters and in-situ, primarily Argo, data to: (i) remove large-scale satellite biases, and (ii) reduce small-scale noise, while preserving the high spatial and temporal resolution of the dataset. The consistency between the two subsets of the data record is evaluated during their overlapping period in April-June 2015. Verification studies show that SMAP SSS agrees very well with Aquarius SSS, while providing better spatial resolution. The 5-year time series of SSS in the SPURS-2 domain (125°W, 10°N) shows fresher-than-normal SSS during the most recent El Niño event; the year-mean difference is about 0.5 psu. The annual cycle during the El Niño year also appears to be much weaker than in a normal year.
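    The large-scale bias-correction idea can be shown with a heavily simplified 1-D sketch (the actual analysis uses optimally designed 2-D spatial filters): the satellite-minus-in-situ difference is smoothed with a wide running mean so that only its large-scale component is removed, preserving small-scale satellite features. Window width and values are illustrative.

```python
def remove_large_scale_bias(sat, insitu, window=5):
    """Subtract the running-mean (large-scale) part of the sat-minus-in-situ difference."""
    diff = [s - a for s, a in zip(sat, insitu)]
    half = window // 2
    corrected = []
    for i in range(len(sat)):
        seg = diff[max(0, i - half):i + half + 1]   # local window of differences
        corrected.append(sat[i] - sum(seg) / len(seg))
    return corrected

sat = [35.2, 35.3, 35.1, 35.4, 35.2, 35.3]          # satellite SSS, ~+0.2 psu biased
insitu = [35.0, 35.1, 34.9, 35.2, 35.0, 35.1]       # collocated in-situ (e.g. Argo) SSS
out = remove_large_scale_bias(sat, insitu)
print([round(x, 2) for x in out])  # [35.0, 35.1, 34.9, 35.2, 35.0, 35.1]
```

Because the bias here is spatially uniform, the correction recovers the in-situ values exactly; with spatially varying bias, only scales wider than the window are removed.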

  19. Analysis of Medication Error Reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Young, Jonathan; Santell, John

    In medicine, as in many areas of research, technological innovation and the shift from paper-based information to electronic records have created a climate of ever-increasing availability of raw data. There has been, however, a corresponding lag in our ability to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, the majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, starting with a specific question that needs to be answered. Newer data analysis tools have been developed that allow the researcher not only to ask specific questions but also to "mine" data: approach an area of interest without preconceived questions and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, the United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP's MEDMARX℠ reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from "pick-lists" (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed, including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.
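    The two findings mentioned above, the value of language normalization and the day-of-week distribution, can be illustrated with a hedged toy example. The records and the normalization table are invented; real normalization would use a curated drug lexicon.

```python
from collections import Counter
from datetime import date

# Hypothetical normalization table: brand names and salt forms map to one drug
NORMALIZE = {"hydromorphone hcl": "hydromorphone", "dilaudid": "hydromorphone"}

reports = [                              # (report date, free-text drug name)
    ("2004-03-01", "Dilaudid"),
    ("2004-03-02", "hydromorphone HCl"),
    ("2004-03-08", "hydromorphone"),
]

drugs = Counter()
weekdays = Counter()
for day, drug in reports:
    name = drug.lower().strip()          # case-fold before lookup
    drugs[NORMALIZE.get(name, name)] += 1
    weekdays[date.fromisoformat(day).strftime("%A")] += 1

print(drugs["hydromorphone"])  # 3: one drug, not three, after normalization
print(weekdays["Monday"])      # 2
```

Without normalization the same drug would be counted as three distinct items, distorting any frequency analysis; the weekday tally is the kind of distributional view the text-mining effort surfaced.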

  20. Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue

    PubMed Central

    Porsdam Mann, Sebastian; Sahakian, Barbara J.

    2016-01-01

    Advances in data science allow for sophisticated analysis of increasingly large datasets. In the medical context, large volumes of data collected for healthcare purposes are contained in electronic health records (EHRs). The real-life character and sheer amount of data contained in them make EHRs an attractive resource for public health and biomedical research. However, medical records contain sensitive information that could be misused by third parties. Medical confidentiality and respect for patients' privacy and autonomy protect patient data, barring access to health records unless consent is given by the data subject. This creates a situation in which much of the beneficial records-based research is prevented from being used or is seriously undermined, because the refusal of consent by some patients introduces a systematic deviation, known as selection bias, from a representative sample of the general population, thus distorting research findings. Although research exemptions for the requirement of informed consent exist, they are rarely used in practice due to concerns over liability and a general culture of caution. In this paper, we argue that the problem of research access to sensitive data can be understood as a tension between the medical duties of confidentiality and beneficence. We attempt to show that the requirement of informed consent is not appropriate for all kinds of records-based research by distinguishing studies involving minimal risk from those that feature moderate or greater risks. We argue that the duty of easy rescue—the principle that persons should benefit others when this can be done at no or minimal risk to themselves—grounds the removal of consent requirements for minimally risky records-based research. Drawing on this discussion, we propose a risk-adapted framework for the facilitation of ethical uses of health data for the benefit of society. This article is part of the themed issue ‘The ethical impact of data science’. 
PMID:28336803

  1. Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue.

    PubMed

    Porsdam Mann, Sebastian; Savulescu, Julian; Sahakian, Barbara J

    2016-12-28

    Advances in data science allow for sophisticated analysis of increasingly large datasets. In the medical context, large volumes of data collected for healthcare purposes are contained in electronic health records (EHRs). The real-life character and sheer amount of data contained in them make EHRs an attractive resource for public health and biomedical research. However, medical records contain sensitive information that could be misused by third parties. Medical confidentiality and respect for patients' privacy and autonomy protect patient data, barring access to health records unless consent is given by the data subject. This creates a situation in which much of the beneficial records-based research is prevented from being used or is seriously undermined, because the refusal of consent by some patients introduces a systematic deviation, known as selection bias, from a representative sample of the general population, thus distorting research findings. Although research exemptions for the requirement of informed consent exist, they are rarely used in practice due to concerns over liability and a general culture of caution. In this paper, we argue that the problem of research access to sensitive data can be understood as a tension between the medical duties of confidentiality and beneficence. We attempt to show that the requirement of informed consent is not appropriate for all kinds of records-based research by distinguishing studies involving minimal risk from those that feature moderate or greater risks. We argue that the duty of easy rescue, the principle that persons should benefit others when this can be done at no or minimal risk to themselves, grounds the removal of consent requirements for minimally risky records-based research. Drawing on this discussion, we propose a risk-adapted framework for the facilitation of ethical uses of health data for the benefit of society. This article is part of the themed issue 'The ethical impact of data science'.
© 2015 The Authors.

  2. Modification of the Miyake-Apple technique for simultaneous anterior and posterior video imaging of wet laboratory-based corneal surgery.

    PubMed

    Tan, Johnson C H; Meadows, Howard; Gupta, Aanchal; Yeung, Sonia N; Moloney, Gregory

    2014-03-01

    The aim of this study was to describe a modification of the Miyake-Apple posterior video analysis for the simultaneous visualization of the anterior and posterior corneal surfaces during wet laboratory-based deep anterior lamellar keratoplasty (DALK). A human donor corneoscleral button was affixed to a microscope slide and placed onto a custom-made mounting box. A big bubble DALK was performed on the cornea in the wet laboratory. An 11-diopter intraocular lens was positioned over the aperture of the back camera of an iPhone. This served to video record the posterior view of the corneoscleral button during the big bubble formation. An overhead operating microscope with an attached video camcorder recorded the anterior view during the surgery. The anterior and posterior views of the wet laboratory-based DALK surgery were simultaneously captured and edited using video editing software. The formation of the big bubble can be studied. This video recording camera system has the potential to act as a valuable research and teaching tool in corneal lamellar surgery, especially in the behavior of the big bubble formation in DALK.

  3. Gait Recognition Using Wearable Motion Recording Sensors

    NASA Astrophysics Data System (ADS)

    Gafurov, Davrondzhon; Snekkenes, Einar

    2009-12-01

    This paper presents an alternative approach in which gait data are collected by sensors attached to the person's body. Such wearable sensors record the motion (e.g. acceleration) of body parts during walking. The recorded motion signals are then investigated for person recognition purposes. We analyzed acceleration signals from the foot, hip, pocket and arm. Applying various methods, the best EERs obtained for foot-, pocket-, arm- and hip-based user authentication were 5%, 7%, 10% and 13%, respectively. Furthermore, we present the results of our analysis of the security of gait-based authentication. Studying gait-based user authentication (in the case of hip motion) under three attack scenarios, we found that minimal-effort mimicking does not improve an impostor's chances of acceptance. However, impostors who know the closest person to them in the database, or the genders of the users, can be a threat to gait-based authentication. We also provide some new insights into the uniqueness of gait in the case of foot motion. In particular, we found the following: sideways motion of the foot provides the most discrimination, compared with the up-down or forward-backward directions; and different segments of the gait cycle provide different levels of discrimination.
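    The EER figures above mark the operating point where the false accept rate (impostors wrongly accepted) equals the false reject rate (genuine users wrongly rejected). Below is a minimal numpy sketch of how an EER can be estimated from distance scores; the score distributions are synthetic stand-ins, not the paper's data:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Scan thresholds for the point where false-accept and false-reject
    rates are closest; scores are distances, so accept when score <= t."""
    best_gap, eer = 1.0, None
    for t in np.sort(np.concatenate([genuine, impostor])):
        frr = np.mean(genuine > t)    # genuine users rejected
        far = np.mean(impostor <= t)  # impostors accepted
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

rng = np.random.default_rng(0)
genuine = rng.normal(1.0, 0.5, 500)   # small distances: same person
impostor = rng.normal(3.0, 1.0, 500)  # larger distances: different person
eer = equal_error_rate(genuine, impostor)
print(f"EER ~ {eer:.1%}")
```

    A lower EER means better separation between genuine and impostor scores; the 5% foot-based result above corresponds to the best-separated distributions.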

  4. An information system for epidemiology based on a computer-based medical record.

    PubMed

    Verdier, C; Flory, A

    1994-12-01

    We present a new way to build an information system addressing problems in epidemiology. Based on our analysis of current and future requirements, a system is proposed which allows for collection, organization and distribution of data within a computer network. In this application, two broad communities of users, physicians and epidemiologists, can be identified, each with their own perspectives and goals. The different requirements of each community led us to a client-service centered architecture which provides the functionality required by the two groups. The resulting physician workstation provides help for recording and querying medical information about patients and from a pharmacological database. All information is classified and coded so that it can be retrieved for pharmaco-economic studies. The service center receives information from physician workstations and permits organizations in charge of statistical studies to work with "real" data recorded during patient encounters. This leads to a new approach in epidemiology: studies can be carried out with more efficient data acquisition. For modelling the information system, we use an object-oriented approach. We have observed that the object-oriented representation, particularly its concepts of generalization, aggregation and encapsulation, is well suited to our problem.

  5. SETTING UP FARM RECORDS TO PROVIDE FOR ANALYSIS.

    ERIC Educational Resources Information Center

    Illinois Univ., Urbana. Coll. of Agriculture.

    Resource material on farm record analysis, for use in high school vocational agriculture and adult farmer classes, was designed by subject matter specialists, teacher educators, supervisors, and teachers to provide textual material for students on the purposes of records, analysis measures, inventories, depreciation schedules, financial transaction…

  6. Modified polymethylmethacrylate as a base for thermostable optical recording media

    NASA Astrophysics Data System (ADS)

    Krul, L. P.; Matusevich, V.; Hoff, D.; Kowarschik, R.; Matusevich, Yu. I.; Butovskaya, G. V.; Murashko, E. A.

    2007-07-01

    The possibility of improving the thermal properties of holographic gratings in a photosensitive system based on polymethylmethacrylate (PMMA), while simultaneously enhancing the adhesion of the photopolymer to soda-lime glass, is demonstrated. The modified PMMA was prepared by radical copolymerisation of methylmethacrylate (MMA) with acrylic acid (AA). Polymer films deposited from samples of the copolymer of MMA with AA containing 9,10-phenanthrenequinone additives were used as a photosensitive material for the recording of holographic gratings. It is possible to generate gratings that are thermally stable up to 200 °C using this modified PMMA. Dynamic thermogravimetry, differential thermal analysis and thermomechanical analysis were used to determine the dependence of the thermal stability of the modified PMMA on the composition and structure of its macromolecules.

  7. Wheezing recognition algorithm using recordings of respiratory sounds at the mouth in a pediatric population.

    PubMed

    Bokov, Plamen; Mahut, Bruno; Flaud, Patrice; Delclaux, Christophe

    2016-03-01

    Respiratory diseases in children are a common reason for physician visits. A diagnostic difficulty arises when parents hear wheezing that is no longer present during the medical consultation. Thus, an outpatient objective tool for recognition of wheezing is of clinical value. We developed a wheezing recognition algorithm from respiratory sounds recorded with a smartphone placed near the mouth. A total of 186 recordings were obtained in a pediatric emergency department, mostly in toddlers (mean age 20 months). After exclusion of recordings with artefacts and those with a single clinical operator auscultation, 95 recordings with the agreement of two operators on auscultation diagnosis (27 with wheezing and 68 without) were subjected to a two-phase algorithm (signal analysis and pattern classification using machine learning algorithms) to classify the recordings. The best performance (71.4% sensitivity and 88.9% specificity) was observed with a Support Vector Machine-based algorithm. We further tested the algorithm on a set of 39 recordings with a single operator and found fair agreement (kappa = 0.28, 95% CI [0.12, 0.45]) between the algorithm and the operator. The main advantage of such an algorithm is that it works with contact-free sound recording, which is particularly valuable in the pediatric population. Copyright © 2016 Elsevier Ltd. All rights reserved.
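    The "fair agreement" above is Cohen's kappa, which discounts the agreement two raters would reach by chance. A short sketch with made-up rating vectors (1 = wheezing, 0 = no wheezing; the data are purely illustrative):

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters over the same items."""
    a, b = np.asarray(a), np.asarray(b)
    p_obs = np.mean(a == b)  # raw observed agreement
    # agreement expected if the raters labelled independently
    p_exp = sum(np.mean(a == lab) * np.mean(b == lab)
                for lab in np.union1d(a, b))
    return (p_obs - p_exp) / (1 - p_exp)

algorithm = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]  # hypothetical classifier output
clinician = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0]  # hypothetical auscultation labels
print(round(cohens_kappa(algorithm, clinician), 2))  # → 0.58
```

    By the usual Landis-Koch reading, values around 0.2-0.4 count as "fair" agreement, matching the study's kappa of 0.28.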

  8. The use of missing birth record data as a marker for adverse reproductive outcomes: a geocoded analysis of birth record data.

    PubMed Central

    Headley, Adrienne J.; Fulcomer, Mark C.; Bastardi, Matthew M.; Im, Wansoo; Sass, Marcia M.; Chung, Katherine

    2006-01-01

    Adverse reproductive outcomes (AROs) disproportionately affect black American infants and significantly contribute to the U.S. infant mortality rate. Without accurate understanding of AROs, there remains little hope of ameliorating infant mortality rates or eliminating infant health disparities. However, despite the importance of monitoring infant mortality rates and health disparities, birth record data quality is not assured. Racial disparities in the reporting of birth record data have been documented, and missing birth record data for AROs appears to be disproportionate. Due to the extent of missing birth record data, innovative strategies have been developed to evaluate relationships between maternal socioeconomic status (SES) and community-based ARO rates. Because addresses convey aggregate information about income level, education and occupation, ZIP codes, census tracts and census block-groups have been applied to geocoding efforts. The goals of this study are to: 1) analyze the extent of missing birth record data for New Jersey areas with high rates of an ARO (preterm birth), 2) evaluate associations between the extent of missing birth record data and other AROs, and 3) consider how geocoding strategies could be applied to provide a basis for understanding maternal SES risk factors and ARO resource allocation for at-risk communities. PMID:16895276

  9. Comparison between uroflowmetry and sonouroflowmetry in recording of urinary flow in healthy men.

    PubMed

    Krhut, Jan; Gärtner, Marcel; Sýkora, Radek; Hurtík, Petr; Burda, Michal; Luňáček, Libor; Zvarová, Katarína; Zvara, Peter

    2015-08-01

    To evaluate the accuracy of sonouroflowmetry in recording urinary flow parameters and voided volume. A total of 25 healthy male volunteers (age 18-63 years) were included in the study. All participants were asked to carry out uroflowmetry synchronously with recording of the sound generated by the urine stream hitting the water level in the urine collection receptacle, using a dedicated cell phone. From 188 recordings, 34 were excluded because of voided volume <150 mL or technical problems during recording. The sonouroflowmetry recording was visualized in the form of a trace representing sound intensity over time. Subsequently, the matching datasets of uroflowmetry and sonouroflowmetry were compared with respect to flow time, voided volume, maximum flow rate and average flow rate. Pearson's correlation coefficient was used to compare parameters recorded by uroflowmetry with those calculated from sonouroflowmetry recordings. The flow pattern recorded by sonouroflowmetry showed good correlation with the uroflowmetry trace. A strong correlation (Pearson's correlation coefficient 0.87) was documented between uroflowmetry-recorded flow time and the duration of the sound signal recorded with sonouroflowmetry. A moderate correlation was observed for voided volume (Pearson's correlation coefficient 0.68) and average flow rate (Pearson's correlation coefficient 0.57). A weak correlation (Pearson's correlation coefficient 0.38) between maximum flow rate recorded using uroflowmetry and sonouroflowmetry-recorded peak sound intensity was documented. The present study shows that the basic concept of utilizing sound analysis for estimation of urinary flow parameters and voided volume is valid. However, further development of this technology and standardization of the recording algorithm are required. © 2015 The Japanese Urological Association.
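    All of the comparisons above reduce to Pearson's correlation coefficient on paired measurements. A minimal sketch; the paired flow times below are hypothetical values chosen for illustration, not the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's correlation coefficient between two paired series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical paired flow times (s): uroflowmetry vs. sound-signal duration
uroflow = [22.1, 35.4, 18.0, 27.6, 30.2, 24.8]
sono = [23.0, 34.1, 19.5, 26.8, 31.0, 26.2]
r = pearson_r(uroflow, sono)
print(f"r = {r:.2f}")  # a strong correlation, in the spirit of the 0.87 reported
```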

  10. Trends in basic mathematical competencies of beginning undergraduates in Ireland, 2003-2013

    NASA Astrophysics Data System (ADS)

    Treacy, Páraic; Faulkner, Fiona

    2015-11-01

    Deficiencies in beginning undergraduate students' basic mathematical skills have been an issue of concern in higher education, particularly in the past 15 years. This issue has been tracked and analysed in a number of universities in Ireland and internationally through student scores recorded in mathematics diagnostic tests. Students beginning their science-based and technology-based undergraduate courses in the University of Limerick have had their basic mathematics skills tested, without any prior warning, through a 40-question diagnostic test administered during their initial service mathematics lecture since 1998. Data gathered through this diagnostic test have been recorded in a database kept at the university and explored to track trends in the mathematical competency of these beginning undergraduates. This paper details findings from an analysis of the database between 2003 and 2013, outlining changes in the mathematical competencies of these beginning undergraduates in an attempt to determine reasons for such changes. The analysis found that the proportion of students tested who are predicted to be at risk of failing their service mathematics end-of-semester examinations increased significantly between 2003 and 2013. Furthermore, when students' performance in secondary-level mathematics was controlled for, the performance of beginning undergraduates in 2013 was statistically significantly below that of beginning undergraduates 10 years previously.

  11. Medication-related cognitive artifacts used by older adults with heart failure

    PubMed Central

    Mickelson, Robin S.; Willis, Matt; Holden, Richard J.

    2015-01-01

    Objective To use a human factors perspective to examine how older adult patients with heart failure use cognitive artifacts for medication management. Methods We performed a secondary analysis of data collected from 30 patients and 14 informal caregivers enrolled in a larger study of heart failure self-care. Data included photographs, observation notes, interviews, video recordings, medical record data, and surveys. These data were analyzed using an iterative content analysis. Results Findings revealed that medication management was complex, inseparable from other patient activities, distributed across people, time, and place, and complicated by knowledge gaps. We identified fifteen types of cognitive artifacts including medical devices, pillboxes, medication lists, and electronic personal health records used for: 1) measurement/evaluation; 2) tracking/communication; 3) organization/administration; and 4) information/sensemaking. These artifacts were characterized by fit and misfit with the patient’s sociotechnical system and demonstrated both advantages and disadvantages. We found that patients often modified or “finished the design” of existing artifacts and relied on “assemblages” of artifacts, routines, and actors to accomplish their self-care goals. Conclusions Cognitive artifacts are useful but sometimes are poorly designed or are not used optimally. If appropriately designed for usability and acceptance, paper-based and computer-based information technologies can improve medication management for individuals living with chronic illness. These technologies can be designed for use by patients, caregivers, and clinicians; should support collaboration and communication between these individuals; can be coupled with home-based and wearable sensor technology; and must fit their users’ needs, limitations, abilities, tasks, routines, and contexts of use. PMID:26855882

  12. 77 FR 65913 - Privacy Act of 1974: Systems of Records.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-31

    ... performing clerical, stenographic, or data analysis functions, or by reproduction of records by electronic or... performing clerical, stenographic, or data analysis functions, or by reproduction of records by electronic or... Services (OGIS) National Archives and Records Administration, in connection with mediation of FOIA requests...

  13. Estimation of ground motion parameters

    USGS Publications Warehouse

    Boore, David M.; Joyner, W.B.; Oliver, A.A.; Page, R.A.

    1978-01-01

    Strong motion data from western North America for earthquakes of magnitude greater than 5 are examined to provide the basis for estimating peak acceleration, velocity, displacement, and duration as a function of distance for three magnitude classes. A subset of the data (from the San Fernando earthquake) is used to assess the effects of structural size and of geologic site conditions on peak motions recorded at the base of structures. Small but statistically significant differences are observed in peak values of horizontal acceleration, velocity and displacement recorded on soil at the base of small structures compared with values recorded at the base of large structures. The peak acceleration tends to be less, and the peak velocity and displacement tend to be greater, on average at the base of large structures than at the base of small structures. In the distance range used in the regression analysis (15-100 km) the values of peak horizontal acceleration recorded at soil sites in the San Fernando earthquake are not significantly different from the values recorded at rock sites, but values of peak horizontal velocity and displacement are significantly greater at soil sites than at rock sites. Some consideration is given to the prediction of ground motions at close distances, where there are insufficient recorded data points. As might be expected from the lack of data, published relations for predicting peak horizontal acceleration give widely divergent estimates at close distances (three well-known relations predict accelerations from 0.33 g to slightly over 1 g at a distance of 5 km from a magnitude 6.5 earthquake).
After considering the physics of the faulting process, the few available data close to faults, and the modifying effects of surface topography, at the present time it would be difficult to accept estimates less than about 0.8 g, 110 cm/s, and 40 cm, respectively, for the mean values of peak acceleration, velocity, and displacement at rock sites within 5 km of fault rupture in a magnitude 6.5 earthquake. These estimates can be expected to change as more data become available.

  14. Systematic Analysis of the Decision Rules of Traditional Chinese Medicine

    PubMed Central

    Bin-Rong, Ma; Xi-Yuan, Jiang; Su-Ming, Liso; Huai-ning, Zhu; Xiu-ru, Lin

    1981-01-01

    Chinese traditional medicine has evolved over many centuries, and has accumulated a body of observed relationships between symptoms, signs and prognoses, and the efficacy of alternative treatments and prescriptions. With the assistance of a computer-based clinical data base for recording the diagnostic and therapeutic practice of skilled practitioners of Chinese traditional medicine, a systematic program is being conducted to identify and define the clinical decision-making rules that underlie current practice.

  15. Time/Frequency Analysis of Terrestrial Impact Crater Records

    NASA Astrophysics Data System (ADS)

    Chang, Heon-Young

    2006-09-01

    The terrestrial impact cratering record recently has been examined in the time domain by Chang & Moon (2005). It was found that the ˜ 26 Myr periodicity in the impact cratering rate exists over the last ˜ 250 Myrs. Such a periodicity can be found regardless of the lower limit of the diameter up to D ˜ 35 km. It immediately drew both support and criticism. The aim of this paper is two-fold: (1) to test whether the reported periodicities can be obtained with an independent method, and (2) to see, as attempted earlier, whether the phase is modulated. To achieve these goals we employ time/frequency analysis and, for the first time, apply this method to the terrestrial impact cratering records. We have confirmed that, without exception, noticeable peaks appear around ˜ 25 Myr, corresponding to a frequency of ˜ 0.04 (Myr)^{-1}. We also find longer periodicities in the database that includes small impact craters. Although the time/frequency analysis allows us to observe phase variations directly, we find no indication of such changes. Instead, modes display slow variations of power in time. The time/frequency analysis shows a nonstationary behavior of the modes: the power can grow from just above the noise level and then decrease back to its initial level over a time of order 10 Myr.
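    The paper's time/frequency method is more elaborate, but the basic idea of recovering a ˜26 Myr periodicity from an event record can be sketched with a plain FFT periodogram of binned event counts. The cratering rate and counts below are simulated, not the actual crater catalogue:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1.0                   # bin width in Myr
t = np.arange(0, 250, dt)  # the last ~250 Myr
period = 26.0
rate = 1.0 + 0.8 * np.cos(2 * np.pi * t / period)  # modulated event rate
counts = rng.poisson(2.0 * rate)  # synthetic impact counts per bin

# Periodogram of the mean-subtracted count series
spec = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
freqs = np.fft.rfftfreq(len(counts), d=dt)
peak = freqs[1:][np.argmax(spec[1:])]  # skip the zero-frequency bin
print(f"dominant period ~ {1 / peak:.1f} Myr")
```

    With only ~250 bins the frequency resolution is coarse, which hints at why phase behaviour is hard to pin down from such records.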

  16. Toward an Improvement of the Analysis of Neural Coding.

    PubMed

    Alegre-Cortés, Javier; Soto-Sánchez, Cristina; Albarracín, Ana L; Farfán, Fernando D; Val-Calvo, Mikel; Ferrandez, José M; Fernandez, Eduardo

    2017-01-01

    Machine learning and artificial intelligence have strong roots in principles of neural computation. Some examples are the structure of the first perceptron, inspired by the retina; neuroprosthetics based on ganglion cell recordings; and Hopfield networks. In addition, machine learning provides a powerful set of tools to analyze neural data, which has already proved its efficacy in fields of research as distant as speech recognition, behavioral state classification, and LFP recordings. However, despite the huge technological advances of recent years in dimensionality reduction, pattern selection, and clustering of neural data, there has not been a proportional development of the analytical tools used for Time-Frequency (T-F) analysis in neuroscience. Bearing this in mind, we make the case for using non-linear, non-stationary tools, EMD algorithms in particular, to transform oscillatory neural data (EEG, EMG, spike oscillations…) into the T-F domain prior to its analysis with machine learning tools. We maintain that, to reach meaningful conclusions, the transformed data we analyze must be as faithful as possible to the original recording, so that distortions forced into the data by restrictions of the T-F computation do not propagate into the results of the machine learning analysis. Moreover, bioinspired computation such as brain-machine interfaces may be enriched by a more precise definition of neuronal coding in which the non-linearities of neuronal dynamics are considered.
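    EMD-based T-F analysis first decomposes the signal into intrinsic mode functions (IMFs) by sifting, then obtains instantaneous amplitude and frequency of each IMF from its analytic signal. The sketch below shows only the second step, applied to a synthetic chirp standing in for one IMF; the FFT construction is a numpy stand-in for scipy.signal.hilbert, and the sifting itself is omitted:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (zero out negative frequencies)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(spectrum * h)

fs = 500.0                                 # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
imf = np.sin(2 * np.pi * (8 + 2 * t) * t)  # chirp: 8 Hz sweeping up to 16 Hz

z = analytic_signal(imf)
amplitude = np.abs(z)                      # instantaneous amplitude envelope
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz
print(f"median instantaneous frequency ~ {np.median(inst_freq):.1f} Hz")
```

    Because the instantaneous frequency follows the chirp sample by sample rather than assuming stationarity, this representation preserves non-stationary structure that a fixed-window spectrogram would smear.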

  17. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false How do agencies manage records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests..., CARTOGRAPHIC, AND RELATED RECORDS MANAGEMENT § 1237.30 How do agencies manage records on nitrocellulose-base...

  18. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false How do agencies manage records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests..., CARTOGRAPHIC, AND RELATED RECORDS MANAGEMENT § 1237.30 How do agencies manage records on nitrocellulose-base...

  19. 36 CFR 1237.30 - How do agencies manage records on nitrocellulose-base and cellulose-acetate base film?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false How do agencies manage records on nitrocellulose-base and cellulose-acetate base film? 1237.30 Section 1237.30 Parks, Forests..., CARTOGRAPHIC, AND RELATED RECORDS MANAGEMENT § 1237.30 How do agencies manage records on nitrocellulose-base...

  20. Analysis of Earthquake Recordings Obtained from the Seafloor Earthquake Measurement System (SEMS) Instruments Deployed off the Coast of Southern California

    USGS Publications Warehouse

    Boore, D.M.; Smith, C.E.

    1999-01-01

    For more than 20 years, a program has been underway to obtain records of earthquake shaking on the seafloor at sites offshore of southern California, near oil platforms. The primary goal of the program is to obtain data that can help determine if ground motions at offshore sites are significantly different than those at onshore sites; if so, caution may be necessary in using onshore motions as the basis for the seismic design of oil platforms. We analyze data from eight earthquakes recorded at six offshore sites; these are the most important data recorded on these stations to date. Seven of the earthquakes were recorded at only one offshore station; the eighth event was recorded at two sites. The earthquakes range in magnitude from 4.7 to 6.1. Because of the scarcity of multiple recordings from any one event, most of the analysis is based on the ratio of spectra from vertical and horizontal components of motion. The results clearly show that the offshore motions have very low vertical motions compared to those from an average onshore site, particularly at short periods. Theoretical calculations find that the water layer has little effect on the horizontal components of motion but that it produces a strong spectral null on the vertical component at the resonant frequency of P waves in the water layer. The vertical-to-horizontal ratios for a few selected onshore sites underlain by relatively low shear-wave velocities are similar to the ratios from offshore sites for frequencies less than about one-half the water layer P-wave resonant frequency, suggesting that the shear-wave velocities beneath a site are more important than the water layer in determining the character of the ground motions at lower frequencies.
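    The core quantity in the analysis above is the ratio of vertical to horizontal amplitude spectra. A toy numpy sketch follows; the two components are synthetic white noise with a deliberately weaker vertical channel, purely to illustrate the computation:

```python
import numpy as np

def smoothed_amp_spectrum(x, width=5):
    """Amplitude spectrum with simple moving-average smoothing."""
    amp = np.abs(np.fft.rfft(x - np.mean(x)))
    kernel = np.ones(width) / width
    return np.convolve(amp, kernel, mode="same")

rng = np.random.default_rng(2)
n, dt = 4096, 0.01                   # ~41 s at 100 Hz sampling
horizontal = rng.normal(size=n)      # synthetic horizontal ground motion
vertical = 0.3 * rng.normal(size=n)  # weaker synthetic vertical motion

freqs = np.fft.rfftfreq(n, d=dt)
ratio = smoothed_amp_spectrum(vertical) / smoothed_amp_spectrum(horizontal)
print(f"median V/H ratio ~ {np.median(ratio):.2f}")
```

    In the study, a pronounced dip of this ratio near the P-wave resonant frequency of the water layer is the signature of the offshore sites.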

  1. Similarities and differences between on-scalp and conventional in-helmet magnetoencephalography recordings

    PubMed Central

    Pfeiffer, Christoph; Ruffieux, Silvia; Jousmäki, Veikko; Hämäläinen, Matti; Schneiderman, Justin F.; Lundqvist, Daniel

    2017-01-01

    The development of new magnetic sensor technologies that promise sensitivities approaching that of conventional MEG technology while operating at less extreme temperatures has catalysed the growing field of on-scalp MEG. The feasibility of on-scalp MEG has been demonstrated by benchmarking new sensor technologies, performing neuromagnetic recordings in close proximity to the head surface, against state-of-the-art in-helmet MEG sensor technology. However, earlier work has provided little information about how these two approaches compare, or about the reliability of observed differences. Herein, we present such a comparison, based on recordings of the N20m component of the somatosensory evoked field as elicited by electric median nerve stimulation. As expected from the proximity differences between the on-scalp and in-helmet sensors, the magnitude of the N20m activation as recorded with the on-scalp sensor was higher than that of the in-helmet sensors. The dipole pattern of the on-scalp recordings was also more spatially confined than that of the conventional recordings. Our results furthermore revealed unexpected temporal differences in the peak of the N20m component. An analysis protocol was therefore developed for assessing the reliability of this observed difference. We used this protocol to examine our findings in terms of differences in sensor sensitivity between the two types of MEG recordings. The measurements and subsequent analysis drew attention to the fact that great care must be taken when measuring the field close to the zero-line crossing of the dipolar field, since it is heavily dependent on the orientation of the sensors. Taken together, our findings provide reliable evidence that on-scalp and in-helmet sensors measure neural sources in mostly similar ways. PMID:28742118

  2. Climate variability and volcanic history of the Eastern Romanian Carpathians since early MIS 3 recorded in sediments from Mohoş crater

    NASA Astrophysics Data System (ADS)

    Bormann, M.; Veres, D.; Wulf, S.; Papadopoulou, M.; Panagiotopoulos, K.; Schaebitz, F.

    2015-12-01

    We present a 30m long sediment record covering the last ca. 50,000 years from the in-filled Mohoş crater (46°05' N; 25°55' E) located on Ciomadul volcano (Romania), retrieved in 2014. The record consists of bog and lacustrine sediments inter-bedded with tephra deposits. Ciomadul volcano, hosting the superimposed craters of Mohoş and Sf. Ana, is the youngest volcanic edifice in the Carpathian-Balkan region. Thus, tephra analysis of the Mohoş sediments gives valuable insights into the volcanic history of the region, which mainly arises from the younger crater of Sf. Ana and several secondary domes. For investigations into past climate history, the Mohoş sediment sequence has been analysed using a multi-proxy approach including geophysical, geochemical and sedimentological parameters. Multi-sensor core logging and ITRAX X-ray fluorescence scanning have been performed at high resolution, whereas grain size analysis, TOC and C/N ratios supplement the geophysical and geochemical data. Chronological control is based on radiocarbon and luminescence dating. We also present first results of the tephra analysis of the Mohoş sediment record and their correlation to medium-distal pyroclastic deposits originating in this volcanic field. We further discuss responses of this mid-altitude site (1050 m a.s.l.) to past climate oscillations since early MIS 3. To date, the Mohoş core record provides the longest time series from the Carpathian region. This study is part of the Collaborative Research Centre 806 "Our Way To Europe; Culture-Environment Interaction and Human Mobility in the Late Quaternary" (www.sfb806.de); subproject B2.

  3. Extreme Flood Events Over the Past 300 Years Recorded in the Sediments of a Mountain Lake in the Altay Mountains, Northwestern China

    NASA Astrophysics Data System (ADS)

    Wu, J.; Zhou, J.; Shen, B.; Zeng, H.

    2017-12-01

    Global climate change has the potential to accelerate the hydrological cycle, which may further enhance the temporal frequency of regional extreme floods. Climatic models predict that intra-annual rainfall variability will intensify, which will shift current rainfall regimes towards more extreme systems with lower precipitation frequencies, longer dry periods, and larger individual precipitation events worldwide. Understanding the temporal variations of extreme floods that occur in response to climate change is essential to anticipate the trends in flood magnitude and frequency in the context of global warming. However, currently available instrumental data are not long enough for capturing the most extreme events, thus the acquisition of long duration datasets for historical floods that extend beyond available instrumental records is clearly an important step in discerning trends in flood frequency and magnitude with respect to climate change. In this study, a reconstruction of paleofloods over the past 300 years was conducted through an analysis of grain sizes from the sediments of Kanas Lake in the Altay Mountains of northwestern China. Grain parameters and frequency distributions both demonstrate that two abrupt environment changes exist within the lake sedimentary sequence. Based on canonical discriminant analysis (CDA) and C-M pattern analysis, two flood events corresponding to ca. 1760 AD and ca. 1890 AD were identified, both of which occurred during warmer and wetter climate conditions according to tree-ring records. These two flood events are also evidenced by lake sedimentary records in the Altay and Tianshan areas. Furthermore, through a comparison with other records, the flood event in ca. 1760 AD seems to have occurred in both the arid central Asia and the Alps in Europe, and thus may have been associated with changes in the North Atlantic Oscillation (NAO) index.

  4. Detection of the Sleep Stages Throughout Non-Obtrusive Measures of Inter-Beat Fluctuations and Motion: Night and Day Sleep of Female Shift Workers

    NASA Astrophysics Data System (ADS)

    Mendez, Martin O.; Palacios-Hernandez, Elvia R.; Alba, Alfonso; Kortelainen, Juha M.; Tenhunen, Mirja L.; Bianchi, Anna M.

    Automatic sleep staging based on inter-beat fluctuations and motion signals recorded through a pressure bed sensor during sleep is presented. The analysis of sleep was based on the three major divisions of sleep time: wake, non-rapid eye movement (nREM) and rapid eye movement (REM) sleep stages. Twelve sleep recordings, from six females working alternating shifts, with their respective annotations were used in the study. Six recordings were acquired during the night and six during the day after a night shift. A time-variant autoregressive model was used to extract features from the inter-beat fluctuations, which were then fed to a Support Vector Machine classifier. Accuracy, kappa index, and percentage of time in wake, REM and nREM were used as performance measures. Comparison between the automatic sleep staging and the standard clinical annotations shows mean values of 87% for accuracy, 0.58 for the kappa index, and mean errors of 5% for sleep stages. The performance measures were similar for night and day sleep recordings. In this sample of recordings, the results suggest that inter-beat fluctuations and motions acquired in a non-obtrusive way carry valuable information related to sleep macrostructure and could be used to support experts in extensive evaluation and monitoring of sleep.
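    The feature-extraction stage described above (a time-variant autoregressive model over the inter-beat series) can be approximated by fitting AR coefficients in sliding windows via the Yule-Walker equations. The window length, hop, model order, and synthetic inter-beat series below are illustrative assumptions, not the study's settings:

```python
import numpy as np

def ar_coeffs(x, order=4):
    """Fit AR(order) coefficients by solving the Yule-Walker equations."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # biased autocorrelation estimates r[0..order]
    r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)]) / n
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

def sliding_ar_features(signal, win=256, hop=128, order=4):
    """Time-variant AR features: one coefficient vector per window."""
    return np.array([ar_coeffs(signal[s:s + win], order)
                     for s in range(0, len(signal) - win + 1, hop)])

rng = np.random.default_rng(3)
# Synthetic inter-beat interval series: slow oscillation plus noise
ibi = 0.8 + 0.05 * np.sin(np.arange(2000) / 30) + 0.01 * rng.normal(size=2000)
features = sliding_ar_features(ibi)
print(features.shape)  # one 4-coefficient feature vector per window
```

    Each row of the feature matrix would then be fed to the classifier (a Support Vector Machine in the study), together with motion-derived features.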

  5. Requirements for the structured recording of surgical device data in the digital operating room.

    PubMed

    Rockstroh, Max; Franke, Stefan; Neumuth, Thomas

    2014-01-01

    Due to the increasing complexity of the surgical working environment, increasingly technical solutions must be found to help relieve the surgeon. This objective is supported by a structured storage concept for all relevant device data. In this work, we present a concept and prototype development of a storage system to address intraoperative medical data. The requirements of such a system are described, and solutions for data transfer, processing, and storage are presented. In a subsequent study, a prototype based on the presented concept is tested for correct and complete data transmission and storage and for the ability to record a complete neurosurgical intervention with low processing latencies. In the final section, several applications for the presented data recorder are shown. The developed system based on the presented concept is able to store the generated data correctly, completely, and quickly enough even if much more data than expected are sent during a surgical intervention. The Surgical Data Recorder supports automatic recognition of the interventional situation by providing a centralized data storage and access interface to the OR communication bus. In the future, further data acquisition technologies should be integrated. Therefore, additional interfaces must be developed. The data generated by these devices and technologies should also be stored in or referenced by the Surgical Data Recorder to support the analysis of the OR situation.

  6. Development of multiscale complexity and multifractality of fetal heart rate variability.

    PubMed

    Gierałtowski, Jan; Hoyer, Dirk; Tetschke, Florian; Nowack, Samuel; Schneider, Uwe; Zebrowski, Jan

    2013-11-01

    During fetal development a complex system grows, and coordination over multiple time scales is formed towards an integrated behavior of the organism. Since essential cardiovascular and associated coordination is mediated by the autonomic nervous system (ANS), and ANS activity is reflected in recordable heart rate patterns, multiscale heart rate analysis is a tool predestined for the diagnosis of prenatal maturation. Analysis over multiple time scales requires sufficiently long data sets, while recordings of fetal heart rate, as well as the behavioral states studied, are themselves short. Care must be taken that the analysis methods used are appropriate for short data lengths. We investigated multiscale entropy and multifractal scaling exponents from 30-minute recordings of 27 normal fetuses, aged between 23 and 38 weeks of gestational age (WGA), during the quiet state. In multiscale entropy, we found complexity lower than that of non-correlated white noise over all 20 coarse-graining time scales investigated. A significant maturation-related increase in complexity was most strongly expressed at scale 2, using both sample entropy and generalized mutual information as complexity estimates. Multiscale multifractal analysis (MMA), in which the Hurst surface h(q,s) is calculated, where q is the multifractal parameter and s is the scale, was applied to the fetal heart rate data. MMA is a method derived from detrended fluctuation analysis (DFA). We modified the base algorithm of MMA to be applicable to short time series by using overlapping data windows and a reduction of the scale range. We looked for the q and s for which the Hurst exponent h(q,s) is most correlated with gestational age, and used this value of the Hurst exponent to predict gestational age based only on fetal heart rate variability properties. Comparison with the true age of the fetus gave satisfying results (error 2.17 ± 3.29 weeks; p < 0.001; R(2) = 0.52). In addition, we found that the normally used DFA scale range is non-optimal for fetal age evaluation. We conclude that 30-minute recordings are appropriate and sufficient for assessing fetal age by multiscale entropy and multiscale multifractal analysis. The predominant prognostic role of scale 2 heart beats for MSE and scale 39 heart beats (at q = -0.7) for MMA can be explored neither by single-scale complexity measures nor by standard detrended fluctuation analysis.
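    The multiscale entropy computation described above combines coarse-graining with sample entropy; a minimal numpy sketch (the parameters m and r, and the synthetic series, are illustrative, not the study's settings):

    ```python
    import numpy as np

    def coarse_grain(x, scale):
        """Non-overlapping averages of length `scale` (MSE coarse-graining)."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r): negative log of the ratio of (m+1)- to m-length
        template matches; r is a fraction of the series standard deviation."""
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def matches(mm):
            templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
            return (d <= tol).sum() - len(templ)   # exclude self-matches

        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(1)
    x = rng.standard_normal(1000)            # stand-in for an RR-interval series
    mse = [sample_entropy(coarse_grain(x, s)) for s in range(1, 6)]
    ```

    For white noise the entropy curve decreases with scale; correlated physiological series behave differently, which is what the multiscale profile exploits.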

  7. Regional-specific Stochastic Simulation of Spatially-distributed Ground-motion Time Histories using Wavelet Packet Analysis

    NASA Astrophysics Data System (ADS)

    Huang, D.; Wang, G.

    2014-12-01

    Stochastic simulation of spatially distributed ground-motion time histories is important for performance-based earthquake design of geographically distributed systems. In this study, we develop a novel technique to stochastically simulate regionalized ground-motion time histories using wavelet packet analysis. First, a transient acceleration time history is characterized by the wavelet-packet parameters proposed by Yamamoto and Baker (2013). These parameters fully characterize ground-motion time histories in terms of energy content, time- and frequency-domain characteristics, and time-frequency nonstationarity. This study further investigates the spatial cross-correlations of wavelet-packet parameters based on geostatistical analysis of 1500 regionalized ground-motion records from eight well-recorded earthquakes in California, Mexico, Japan, and Taiwan. The linear model of coregionalization (LMC) is used to develop a permissible spatial cross-correlation model for each parameter group. The geostatistical analysis of ground-motion data from different regions reveals significant dependence of the LMC structure on regional site conditions, which can be characterized by the correlation range of Vs30 in each region. In general, the spatial correlation and cross-correlation of wavelet-packet parameters are stronger if the site condition is more homogeneous. Using the region-specific spatial cross-correlation model and the cokriging technique, wavelet-packet parameters at unmeasured locations can be estimated, and regionalized ground-motion time histories can be synthesized. Case studies and blind tests demonstrate that the simulated ground motions generally agree well with the actual recorded data if the influence of regional site conditions is considered. The developed method has great potential for use in computation-based seismic analysis and loss estimation at a regional scale.
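    The notion of a correlation range used above can be illustrated with a standard exponential spatial correlation model from geostatistics; the model form and the 20 km practical range below are generic assumptions, not the paper's fitted LMC:

    ```python
    import numpy as np

    def exp_correlation(h, practical_range):
        """Exponential spatial correlation model: rho(h) = exp(-3h / range).
        By convention rho falls to ~0.05 at the practical range, so a longer
        range means parameters stay correlated over larger site separations."""
        return np.exp(-3.0 * np.asarray(h, dtype=float) / practical_range)

    # Illustrative: correlation of one ground-motion parameter between sites,
    # assuming a hypothetical 20 km practical range.
    h = np.array([0.0, 5.0, 10.0, 20.0])     # separation distances (km)
    rho = exp_correlation(h, practical_range=20.0)
    ```

    In the paper's framework such single-parameter models are combined into a linear model of coregionalization so that the full cross-correlation matrix between wavelet-packet parameters stays positive definite.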

  8. High lateral resolution exploration using surface waves from noise records

    NASA Astrophysics Data System (ADS)

    Chávez-García, Francisco José; Yokoi, Toshiaki

    2016-04-01

    Determination of the shear-wave velocity structure at shallow depths is a constant necessity in engineering or environmental projects. Given the sensitivity of Rayleigh waves to shear-wave velocity, subsoil structure exploration using surface waves is frequently used. Methods such as the spectral analysis of surface waves (SASW) or multi-channel analysis of surface waves (MASW) determine phase velocity dispersion from surface waves generated by an active source recorded on a line of geophones. Using MASW, it is important that the receiver array be as long as possible to increase the precision at low frequencies. However, this implies that possible lateral variations are discarded. Hayashi and Suzuki (2004) proposed a different way of stacking shot gathers to increase lateral resolution. They combined strategies used in MASW with the common mid-point (CMP) summation currently used in reflection seismology. In their common mid-point with cross-correlation method (CMPCC), they cross-correlate traces sharing CMP locations before determining phase velocity dispersion. Another recent approach to subsoil structure exploration is based on seismic interferometry. It has been shown that cross-correlation of a diffuse field, such as seismic noise, allows the estimation of the Green's Function between two receivers. Thus, a virtual-source seismic section may be constructed from the cross-correlation of seismic noise records obtained in a line of receivers. In this paper, we use the seismic interferometry method to process seismic noise records obtained in seismic refraction lines of 24 geophones, and analyse the results using CMPCC to increase the lateral resolution of the results. Cross-correlation of the noise records allows reconstructing seismic sections with virtual sources at each receiver location. The Rayleigh wave component of the Green's Functions is obtained with a high signal-to-noise ratio. 
Using CMPCC analysis of the virtual-source seismic lines, we are able to identify lateral variations of phase velocity inside the seismic line, and increase the lateral resolution compared with results of conventional analysis.
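    The core interferometric step, cross-correlating noise records from two receivers to recover the inter-receiver travel time, can be sketched as follows (synthetic data; a real workflow preprocesses the records and stacks many windows):

    ```python
    import numpy as np

    def noise_crosscorrelation(a, b, max_lag):
        """Time-domain cross-correlation of two equal-length noise records,
        returned for lags -max_lag..+max_lag; stacking many such correlations
        approximates the inter-receiver Green's function."""
        full = np.correlate(a, b, mode="full")    # lags -(n-1)..(n-1)
        mid = len(b) - 1                          # index of zero lag
        return full[mid - max_lag:mid + max_lag + 1]

    rng = np.random.default_rng(2)
    n, delay = 2000, 15                           # delay: travel time in samples
    src = rng.standard_normal(n + delay)          # common noise wavefield
    near = src[delay:][:n]                        # receiver hit first
    far = src[:n]                                 # same wavefield, delayed copy
    cc = noise_crosscorrelation(far, near, max_lag=50)
    lag = np.argmax(np.abs(cc)) - 50              # recovers the inter-receiver delay
    ```

    The peak of the stacked correlation sits at the travel time between the two receivers, which is why a line of noise records can be turned into virtual-source seismic sections.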

  9. Rhizosphere of Avicennia marina (Forsk.) Vierh. as a landmark for polythene degrading bacteria.

    PubMed

    Shahnawaz, Mohd; Sangale, Manisha K; Ade, Avinash B

    2016-07-01

    Due to their high durability, low cost, and ease of manufacture, 311 million tons of plastic-based products are manufactured around the globe per annum. The slow rate of plastic degradation leads to the generation of millions of tons of plastic waste per annum, which is of great environmental concern. Of the total plastic waste generated, polythene accounts for about 64 %. Various methods are available in the literature to tackle plastic waste, and biodegradation is considered the most accepted, eco-friendly, and cost-effective method of polythene waste disposal. In the present study, an attempt has been made to isolate, screen, and characterize the most efficient polythene-degrading bacteria by using the rhizosphere soil of Avicennia marina as a landmark. From 12 localities along the west coast of India, a total of 123 bacterial isolates were recorded. The maximum percent weight loss (% WL; 21.87 ± 6.37 %) was recorded with VASB14 at pH 3.5 after 2 months of shaking at room temperature. The maximum percent weight gain (13.87 ± 3.6 %) was reported with MANGB5 at pH 7. The maximum percent loss in tensile strength (% loss in TS; 87.50 ± 4.8 %) was documented with VASB1 at pH 9.5. Only the results based on % loss in TS were reproducible. Further, the level of degradation was confirmed by scanning electron microscopy (SEM) and Fourier transform infrared spectroscopy (FTIR) analysis. In SEM analysis, cracks were found on the surface of the degraded polythene, and a mass of bacterial cells was also recorded on the weight-gained polythene strips. The maximum reduction in carbonyl index (4.14 %) relative to the untreated polythene strip was recorded with Lysinibacillus fusiformis strain VASB14/WL. Based on 16S ribosomal RNA (rRNA) gene sequence homology, the most efficient polythene-degrading bacteria were identified as L. fusiformis strain VASB14/WL and Bacillus cereus strain VASB1/TS.
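    The weight-loss and tensile-strength measures reported above follow the usual percent-change definitions used in polythene biodegradation assays; the numbers in the example are illustrative, not the study's raw data:

    ```python
    def percent_weight_loss(initial_mg, final_mg):
        """% WL = 100 * (initial - final) / initial."""
        return 100.0 * (initial_mg - final_mg) / initial_mg

    def percent_ts_loss(initial_ts, final_ts):
        """% loss in tensile strength, the measure the study found reproducible."""
        return 100.0 * (initial_ts - final_ts) / initial_ts

    # Hypothetical strip masses chosen so the result lands near the
    # reported maximum % WL of 21.87:
    wl = percent_weight_loss(100.0, 78.1)
    ```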

  10. Seven New Recorded Species in Five Genera of the Strophariaceae in Korea.

    PubMed

    Cho, Hae Jin; Lee, Hyun; Park, Jae Young; Park, Myung Soo; Kim, Nam Kyu; Eimes, John A; Kim, Changmu; Han, Sang-Kuk; Lim, Young Woon

    2016-09-01

    Most known species in the Strophariaceae are decomposers and grow on various kinds of organic matter. Approximately 18 genera and 1,316 species in the Strophariaceae have been reported worldwide. Through an ongoing survey of indigenous fungi in Korea, 29 specimens belonging to the Strophariaceae were collected from 2012 to 2016. These specimens were identified based on morphological characteristics and molecular analysis of internal transcribed spacer sequences. Fifteen taxa were confirmed, with eight species matching those previously recorded. Seven species in five genera were shown to be new records in Korea: Galerina marginata, Gymnopilus crociphyllus, Gymnopilus picreus, Hebeloma birrus, Hebeloma cavipes, Pholiota multicingulata, and Psilocybe thaizapoteca. In this study, we provide detailed morphological descriptions of these species and investigate their evolutionary relationships by constructing phylogenetic trees.

  11. Seven New Recorded Species in Five Genera of the Strophariaceae in Korea

    PubMed Central

    Cho, Hae Jin; Lee, Hyun; Park, Jae Young; Park, Myung Soo; Kim, Nam Kyu; Eimes, John A.; Kim, Changmu; Han, Sang-Kuk

    2016-01-01

    Most known species in the Strophariaceae are decomposers and grow on various kinds of organic matter. Approximately 18 genera and 1,316 species in the Strophariaceae have been reported worldwide. Through an ongoing survey of indigenous fungi in Korea, 29 specimens belonging to the Strophariaceae were collected from 2012 to 2016. These specimens were identified based on morphological characteristics and molecular analysis of internal transcribed spacer sequences. Fifteen taxa were confirmed, with eight species matching those previously recorded. Seven species in five genera were shown to be new records in Korea: Galerina marginata, Gymnopilus crociphyllus, Gymnopilus picreus, Hebeloma birrus, Hebeloma cavipes, Pholiota multicingulata, and Psilocybe thaizapoteca. In this study, we provide detailed morphological descriptions of these species and investigate their evolutionary relationships by constructing phylogenetic trees. PMID:27790064

  12. Web-based biobank system infrastructure monitoring using Python, Perl, and PHP.

    PubMed

    Norling, Martin; Kihara, Absolomon; Kemp, Steve

    2013-12-01

    The establishment and maintenance of biobanks is only as worthwhile as the security and logging of the biobank contents. We have designed a monitoring system that continuously measures temperature and gas content, records the movement of samples in and out of the biobank, and also records the opening and closing of the freezers, storing the results and images in a database. We have also incorporated an early-warning feature that sends out alerts, via SMS and email, to responsible persons if any measurement is recorded outside the acceptable limits, guaranteeing the integrity of biobanked samples as well as reagents used in sample analysis. A surveillance system like this increases the value of any biobank, as the initial investment is small and the value of having trustworthy samples for future research is high.
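    The early-warning logic can be sketched as a simple limit check per sensor reading; the sensor names and acceptable ranges below are hypothetical, and the real system dispatches alerts via SMS and email rather than returning strings:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Limit:
        low: float
        high: float

    # Hypothetical acceptable ranges; a real deployment would load these
    # from the biobank's configuration.
    LIMITS = {"freezer_temp_c": Limit(-85.0, -70.0), "co2_percent": Limit(0.0, 5.0)}

    def check_reading(sensor, value, limits=LIMITS):
        """Return an alert message if a measurement falls outside its
        acceptable limits, mirroring the early-warning feature described."""
        lim = limits[sensor]
        if not (lim.low <= value <= lim.high):
            return f"ALERT {sensor}={value} outside [{lim.low}, {lim.high}]"
        return None

    msg = check_reading("freezer_temp_c", -60.0)   # out of range -> alert string
    ok = check_reading("freezer_temp_c", -80.0)    # in range -> None
    ```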

  13. Electronic health record analysis via deep poisson factor models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized by Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales efficiently, with significant computational gains compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM), based on diagnosis codes and laboratory tests, out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.
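    The generative side of a Poisson factor model with the Bernoulli-Poisson link can be sketched as follows; dimensions and priors are illustrative, and the paper's MCMC inference is not shown:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Illustrative dimensions: D observed count features, K hidden binary
    # units, N patients.
    D, K, N = 20, 5, 100
    Phi = rng.gamma(1.0, 1.0, size=(D, K))        # nonnegative factor loadings

    # Bernoulli-Poisson link: a latent count c ~ Poisson(lam) is thresholded
    # to a binary unit h = 1[c > 0], so P(h = 1) = 1 - exp(-lam).
    lam = rng.gamma(1.0, 1.0, size=(N, K))
    H = (rng.poisson(lam) > 0).astype(float)      # binary hidden units

    # Each modality's observed counts are Poisson with rate Phi @ h.
    X = rng.poisson(H @ Phi.T)                    # (N, D) count matrix
    ```

    The augmentation through a latent Poisson count is what makes this link conjugate-friendly for MCMC, which is the source of the computational gains the abstract reports over logistic links.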

  14. Electronic health record analysis via deep poisson factor models

    DOE PAGES

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.; ...

    2016-01-01

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized by Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales efficiently, with significant computational gains compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM), based on diagnosis codes and laboratory tests, out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.

  15. Long Term Activity Analysis in Surveillance Video Archives

    ERIC Educational Resources Information Center

    Chen, Ming-yu

    2010-01-01

    Surveillance video recording is becoming ubiquitous in daily life for public areas such as supermarkets, banks, and airports. The rate at which surveillance video is being generated has accelerated demand for machine understanding to enable better content-based search capabilities. Analyzing human activity is one of the key tasks to understand and…

  16. Computer-Mediated Social Support for Physical Activity: A Content Analysis

    ERIC Educational Resources Information Center

    Stragier, Jeroen; Mechant, Peter; De Marez, Lieven; Cardon, Greet

    2018-01-01

    Purpose: Online fitness communities are a recent phenomenon experiencing growing user bases. They can be considered as online social networks in which recording, monitoring, and sharing of physical activity (PA) are the most prevalent practices. They have added a new dimension to the social experience of PA in which online peers function as…

  17. An Emic Lens into Online Learning Environments in PBL in Undergraduate Dentistry

    ERIC Educational Resources Information Center

    Bridges, Susan

    2015-01-01

    Whilst face-to-face tutorial group interaction has been the focus of quantitative and qualitative studies in problem-based learning (PBL), little work has explored the independent learning phase of the PBL cycle from an interactionist perspective. An interactional ethnographic logic of inquiry guided collection and analysis of video recordings and…

  18. The Mastersingers: Language and Practice in an Operatic Masterclass

    ERIC Educational Resources Information Center

    Atkinson, Paul

    2013-01-01

    The paper presents a microethnographic examination of an operatic masterclass, based on a transcribed video recording of just one such class. It is a companion piece to a more generalised ethnographic account of such masterclasses as pedagogic events. The detailed analysis demonstrates the close relationship between spoken and unspoken actions in…

  19. Bodily Experiences in Secondary School Biology

    ERIC Educational Resources Information Center

    Orlander, Auli Arvola; Wickman, Per-Olof

    2011-01-01

    This is a study of teaching about the human body. It is based on transcribed material from interviews with 15-year-old students and teachers about their experiences of sex education and from recordings of classroom interactions during a dissection. The analysis is focused on the relationship between what students are supposed to learn about the…

  20. Real-time Author Co-citation Mapping for Online Searching.

    ERIC Educational Resources Information Center

    Lin, Xia; White, Howard D.; Buzydlowski, Jan

    2003-01-01

    Describes the design and implementation of a prototype visualization system, AuthorLink, to enhance author searching. AuthorLink is based on author co-citation analysis and visualization mapping algorithms. AuthorLink produces interactive author maps in real time from a database of 1.26 million records supplied by the Institute for Scientific…
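    Author co-citation analysis of the kind AuthorLink builds on starts from pairwise co-citation counts; a minimal sketch with made-up data (not AuthorLink's code or its ISI database):

    ```python
    from collections import Counter
    from itertools import combinations

    def cocitation_counts(reference_lists):
        """Count how often pairs of authors are cited together in the same
        document; these counts form the co-citation matrix that mapping
        systems project into interactive author maps."""
        counts = Counter()
        for refs in reference_lists:
            for a, b in combinations(sorted(set(refs)), 2):
                counts[(a, b)] += 1
        return counts

    # Each inner list: authors cited by one document (illustrative data).
    docs = [["Salton", "White", "Lin"], ["White", "Lin"], ["Salton", "White"]]
    cc = cocitation_counts(docs)
    ```

    The resulting matrix is then reduced (e.g. by multidimensional scaling or a self-organizing map) to place frequently co-cited authors near each other on the map.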
