Sample records for analysis time compared

  1. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
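
    As a toy illustration of the reduced, feature-based representation described above, the sketch below summarizes each series by a small feature vector and compares series in feature space. The three features are an illustrative assumption, standing in for the thousands of methods the paper actually catalogues.

```python
import math

def feature_vector(ts):
    """Reduce a time series to a few interpretable summary features
    (illustrative feature set only, not the paper's method library)."""
    n = len(ts)
    mean = sum(ts) / n
    var = sum((x - mean) ** 2 for x in ts) / n
    std = math.sqrt(var)
    # lag-1 autocorrelation: near 1 for smooth series, near 0 for noise
    ac1 = sum((ts[i] - mean) * (ts[i + 1] - mean) for i in range(n - 1)) / (n * var)
    # fraction of successive differences that change sign (roughness proxy)
    d = [ts[i + 1] - ts[i] for i in range(n - 1)]
    sign_changes = sum(1 for i in range(len(d) - 1) if d[i] * d[i + 1] < 0) / (len(d) - 1)
    return (std, ac1, sign_changes)

def distance(f, g):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, g)))

# A smooth sine and an erratic deterministic pseudo-noise series should
# land far apart in feature space, while a series matches itself exactly.
smooth = [math.sin(0.1 * i) for i in range(500)]
noisy = [math.sin(17.0 * i * i) for i in range(500)]  # deterministic pseudo-noise
print(distance(feature_vector(smooth), feature_vector(smooth)))  # 0.0
```

With such vectors in hand, ordinary clustering or nearest-neighbour search organizes series and methods alike, which is the organizing idea of the paper.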

  2. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  3. Defense Planning In A Time Of Conflict: A Comparative Analysis Of The 2001-2014 Quadrennial Defense Reviews, And Implications For The Army--Executive Summary

    DTIC Science & Technology

    2018-01-01

    Defense Planning in a Time of Conflict: A Comparative Analysis of the 2001–2014 Quadrennial Defense Reviews, and Implications for … The purpose of the project was to perform a comparative historical review of the four Quadrennial Defense Reviews (QDRs) conducted since the first … Planning in a Time of Conflict: A Comparative Analysis of the 2001–2014 Quadrennial Defense Reviews, and Implications for the Army—that documented …

  4. A comparative analysis of spectral exponent estimation techniques for 1/fβ processes with applications to the analysis of stride interval time series

    PubMed Central

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f) = 1/fβ. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509

  5. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series.

    PubMed

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f)=1/f(β). The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
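
    The paper's preferred estimator is the averaged wavelet coefficient method; as a simpler baseline in the same family of spectral-exponent estimators, the sketch below estimates β by linear regression of log-power on log-frequency, applied to a synthetic signal whose spectrum follows 1/fβ exactly by construction. This is a sketch of a common baseline, not the paper's method.

```python
import math

def periodogram_beta(x, kmax):
    """Estimate the spectral exponent beta of a 1/f^beta process by
    linear regression of log-power on log-frequency (periodogram fit,
    a simple baseline; the paper favors the averaged wavelet method)."""
    n = len(x)
    logs = []
    for k in range(1, kmax + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        logs.append((math.log(k), math.log(power)))
    mf = sum(f for f, _ in logs) / len(logs)
    mp = sum(p for _, p in logs) / len(logs)
    slope = (sum((f - mf) * (p - mp) for f, p in logs)
             / sum((f - mf) ** 2 for f, _ in logs))
    return -slope  # S(f) ~ f^(-beta), so beta = -slope

# Synthesize a signal whose power at bins 1..64 is proportional to 1/f:
# each sinusoid has amplitude k**(-beta/2) and a deterministic pseudo-random phase.
n, beta = 512, 1.0
phases = [2 * math.pi * ((math.sin(k) * 43758.5453) % 1.0) for k in range(65)]
x = [sum(k ** (-beta / 2) * math.cos(2 * math.pi * k * t / n + phases[k])
         for k in range(1, 65)) for t in range(n)]
beta_hat = periodogram_beta(x, 64)
print(beta_hat)  # close to 1.0
```

Because the synthetic components sit exactly on DFT bins, the log-log relation is exactly linear and the estimator recovers β = 1 up to floating-point error.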

  6. Comparative analysis of the modified enclosed energy metric for self-focusing holograms from digital lensless holographic microscopy.

    PubMed

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2015-06-01

    A comparative analysis of the performance of the modified enclosed energy (MEE) method for self-focusing holograms recorded with digital lensless holographic microscopy is presented. Although the MEE method has been published previously, no extended analysis of its performance has been reported. We have tested the MEE in terms of the minimum axial distance allowed between the set of reconstructed holograms used to search for the focal plane, and the elapsed time to obtain the focused image. These parameters have been compared with those of some methods already reported in the literature. The MEE achieves better results in terms of self-focusing quality, but at a higher computational cost. Despite its longer processing time, the method remains fast enough to be technologically attractive. Modeled and experimental holograms have been utilized in this work to perform the comparative study.
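
    The self-focusing idea can be sketched generically: reconstruct the hologram at each candidate axial distance and keep the plane that maximizes a focus metric. The sketch below uses plain intensity variance and a toy reconstruction stand-in; it is not the modified enclosed energy metric itself.

```python
def sharpest_plane(reconstruct, distances, metric):
    """Generic self-focusing search: evaluate a focus metric on the
    reconstruction at each candidate distance, return the best distance."""
    best_d, best_score = None, float("-inf")
    for d in distances:
        img = reconstruct(d)
        score = metric(img)
        if score > best_score:
            best_d, best_score = d, score
    return best_d

def variance_metric(img):
    """Intensity variance: a common, simple focus measure (not the MEE)."""
    flat = [p for row in img for p in row]
    m = sum(flat) / len(flat)
    return sum((p - m) ** 2 for p in flat) / len(flat)

# Toy stand-in for numerical reconstruction: contrast is highest at d = 7
# and the image gets flatter the further we move from that plane.
def fake_reconstruct(d):
    blur = 1.0 + abs(d - 7)
    return [[((i + j) % 2) / blur for j in range(8)] for i in range(8)]

print(sharpest_plane(fake_reconstruct, range(1, 15), variance_metric))  # 7
```

The comparison in the paper concerns exactly the two quantities this loop exposes: how finely `distances` must be spaced, and how long the loop takes per metric.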

  7. Advance in ERG Analysis: From Peak Time and Amplitude to Frequency, Power, and Energy

    PubMed Central

    Lina, Jean-Marc; Lachapelle, Pierre

    2014-01-01

    Purpose. To compare time domain (TD: peak time and amplitude) analysis of the human photopic electroretinogram (ERG) with measures obtained in the frequency domain (Fourier analysis: FA) and in the time-frequency domain (continuous (CWT) and discrete (DWT) wavelet transforms). Methods. Normal ERGs (n = 40) were analyzed using traditional peak time and amplitude measurements of the a- and b-waves in the TD and descriptors extracted from FA, CWT, and DWT. Selected descriptors were also compared in their ability to monitor the long-term consequences of disease process. Results. Each method extracted relevant information but had distinct limitations (i.e., temporal and frequency resolutions). The DWT offered the best compromise by allowing us to extract more relevant descriptors of the ERG signal at the cost of lesser temporal and frequency resolutions. Follow-ups of disease progression were more prolonged with the DWT (max 29 years compared to 13 with TD). Conclusions. Standardized time domain analysis of retinal function should be complemented with advanced DWT descriptors of the ERG. This method should allow more sensitive/specific quantifications of ERG responses, facilitate follow-up of disease progression, and identify diagnostically significant changes of ERG waveforms that are not resolved when the analysis is only limited to time domain measurements. PMID:25061605
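
    A minimal example of DWT-based waveform descriptors, assuming a Haar wavelet and using per-scale detail energy as a stand-in for the paper's actual ERG descriptors:

```python
import math

def haar_dwt(signal):
    """Full Haar wavelet decomposition: returns detail coefficients per
    level (finest first) and the final approximation coefficient.
    Signal length must be a power of two."""
    levels, approx = [], list(signal)
    while len(approx) > 1:
        pairs = [(approx[2 * i], approx[2 * i + 1]) for i in range(len(approx) // 2)]
        details = [(a - b) / 2 for a, b in pairs]
        approx = [(a + b) / 2 for a, b in pairs]
        levels.append(details)
    return levels, approx[0]

def band_energies(signal):
    """Energy of the detail coefficients at each scale: a simple example
    of scale-wise DWT descriptors (the paper's descriptors differ)."""
    levels, _ = haar_dwt(signal)
    return [sum(c * c for c in d) for d in levels]

# A 64-sample oscillatory stand-in for an ERG trace.
erg_like = [math.sin(2 * math.pi * 8 * t / 64) for t in range(64)]
print(band_energies(erg_like))
```

Each entry corresponds to one time-frequency scale, which is why such descriptors can separate waveform changes that single peak-time or amplitude numbers cannot.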

  8. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, T.L.; Mullen, M.F.; Wangen, L.E.

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
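
    The two residual tests can be sketched as follows. The covariance and residual values are hypothetical, chosen so that a correlated fault pattern passes every univariate z-test yet is clearly flagged by the Mahalanobis distance, which is exactly why testing all variables simultaneously can matter.

```python
import math

def zscore_alarms(residual, sigmas, threshold=3.0):
    """Univariate test: flag any residual more than `threshold` standard
    deviations from zero (one z-test per measured variable)."""
    return [abs(r) / s > threshold for r, s in zip(residual, sigmas)]

def mahalanobis2(residual, cov):
    """Squared Mahalanobis distance of a 2-vector residual, testing all
    variables jointly (2x2 covariance inverted in closed form)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    x, y = residual
    return (x * (inv[0][0] * x + inv[0][1] * y)
            + y * (inv[1][0] * x + inv[1][1] * y))

# Hypothetical model-vs-measurement residuals with correlated noise:
cov = [[1.0, 0.8], [0.8, 1.0]]
ok = [0.5, 0.6]      # consistent with the model
leak = [2.0, -2.0]   # anti-correlated pattern univariate tests can miss
print(zscore_alarms(leak, [1.0, 1.0]), mahalanobis2(leak, cov))
```

Here each component of `leak` is only two standard deviations out, so no univariate alarm fires, while the joint distance of 40 is far beyond any reasonable chi-squared threshold for two degrees of freedom.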

  9. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES, by Amanda Donnelly. A thesis … The work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including net assessment, scenarios and …

  10. Real-Time Classification of Exercise Exertion Levels Using Discriminant Analysis of HRV Data.

    PubMed

    Jeong, In Cheol; Finkelstein, Joseph

    2015-01-01

    Heart rate variability (HRV) has been shown to reflect activation of the sympathetic nervous system; however, it is not clear which set of HRV parameters is optimal for real-time classification of exercise exertion levels. No studies have compared the potential of the two types of HRV parameters (time-domain and frequency-domain) for predicting exercise exertion level using discriminant analysis. The main goal of this study was to compare the potential of HRV time-domain parameters versus HRV frequency-domain parameters in classifying exercise exertion level. Rest, exercise, and recovery categories were used in the classification models. Overall classification agreement of 79.5% for the time-domain parameters, compared with 52.8% for the frequency-domain parameters, demonstrated that the time-domain parameters had higher potential for classifying exercise exertion levels.
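
    A sketch of common HRV time-domain parameters, computed from hypothetical RR-interval series. The parameter subset is an assumption: the study does not list its exact feature set.

```python
import math

def hrv_time_domain(rr_ms):
    """Common HRV time-domain parameters from RR intervals in milliseconds:
    mean RR, SDNN, RMSSD, and pNN50 (illustrative subset only)."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    pnn50 = sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd, "pnn50": pnn50}

# Hypothetical RR series (ms): variable at rest, fast and regular in exercise.
rest = [820, 810, 835, 790, 845, 800, 830]
exercise = [420, 415, 425, 418, 422, 419, 421]
print(hrv_time_domain(rest)["sdnn"] > hrv_time_domain(exercise)["sdnn"])  # True
```

Feature dictionaries like these would then be the inputs to a discriminant-analysis classifier of exertion category.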

  11. Detection of myocardial ischemia by automated, motion-corrected, color-encoded perfusion maps compared with visual analysis of adenosine stress cardiovascular magnetic resonance imaging at 3 T: a pilot study.

    PubMed

    Doesch, Christina; Papavassiliu, Theano; Michaely, Henrik J; Attenberger, Ulrike I; Glielmi, Christopher; Süselbeck, Tim; Fink, Christian; Borggrefe, Martin; Schoenberg, Stefan O

    2013-09-01

    The purpose of this study was to compare automated, motion-corrected, color-encoded (AMC) perfusion maps with qualitative visual analysis of adenosine stress cardiovascular magnetic resonance imaging for detection of flow-limiting stenoses. Myocardial perfusion measurements applying the standard adenosine stress imaging protocol and a saturation-recovery temporal generalized autocalibrating partially parallel acquisition (t-GRAPPA) turbo fast low angle shot (Turbo FLASH) magnetic resonance imaging sequence were performed in 25 patients using a 3.0-T MAGNETOM Skyra (Siemens Healthcare Sector, Erlangen, Germany). Perfusion studies were analyzed using AMC perfusion maps and qualitative visual analysis. Angiographically detected coronary artery (CA) stenoses greater than 75%, or of 50% or more with a myocardial perfusion reserve index less than 1.5, were considered hemodynamically relevant. Diagnostic performance and time requirement for both methods were compared. Interobserver and intraobserver reliability were also assessed. A total of 29 CA stenoses were included in the analysis. Sensitivity, specificity, positive predictive value, negative predictive value, and accuracy for detection of ischemia on a per-patient basis were comparable using the AMC perfusion maps compared to visual analysis. On a per-CA territory basis, the attribution of an ischemia to the respective vessel was facilitated using the AMC perfusion maps. Interobserver and intraobserver reliability were better for the AMC perfusion maps (concordance correlation coefficient, 0.94 and 0.93, respectively) compared to visual analysis (concordance correlation coefficient, 0.73 and 0.79, respectively). In addition, in comparison to visual analysis, the AMC perfusion maps significantly reduced analysis time from 7.7 (3.1) to 3.2 (1.9) minutes (P < 0.0001). The AMC perfusion maps yielded a diagnostic performance on a per-patient and on a per-CA territory basis comparable with the visual analysis. Furthermore, this approach demonstrated higher interobserver and intraobserver reliability as well as better time efficiency when compared to visual analysis.

  12. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time, such as statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study evaluating a computerized decision support system.
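
    Interrupted time series with segmented regression can be sketched as an ordinary least-squares fit of the standard four-parameter ITS model (baseline level, baseline trend, level change, trend change). The data below are hypothetical, constructed with a known level drop at the intervention so the fit is easy to check.

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def segmented_fit(y, change_point):
    """OLS fit of the standard ITS model:
    y_t = b0 + b1*t + b2*post_t + b3*(t - change_point)*post_t,
    where post_t indicates observations at or after the intervention."""
    X = [[1.0, t, float(t >= change_point), (t - change_point) * (t >= change_point)]
         for t in range(len(y))]
    k = 4
    XtX = [[sum(row[a] * row[b] for row in X) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(y))) for a in range(k)]
    return solve(XtX, Xty)

# Hypothetical QI data: baseline trend 0.5/period, level drop of 5 at t = 10.
y = [20 + 0.5 * t - (5 if t >= 10 else 0) for t in range(20)]
b0, b1, b2, b3 = segmented_fit(y, 10)
print(round(b2, 6))  # level change: -5.0
```

The level-change coefficient `b2` and trend-change coefficient `b3` are the quantities an ITS evaluation reports; an SPC chart of the same data would instead flag the shift through control-limit rules.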

  13. Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.

    PubMed

    Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy

    2015-12-30

    While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphic processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real-time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT). The whole-brain data analysis with more than 100,000 voxels and more than 250 volumes is completed in less than 300 ms, much faster than the time required to acquire the fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM. We verified that reduced slice-timing correction for real-time analysis had comparable output with off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time and it is capable of processing these steps on all available data at a given time, without need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, while the number of samples should be considered in real-time GLM. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Target Identification Using Harmonic Wavelet Based ISAR Imaging

    NASA Astrophysics Data System (ADS)

    Shreyamsha Kumar, B. K.; Prabhakar, B.; Suryanarayana, K.; Thilagavathi, V.; Rajagopal, R.

    2006-12-01

    A new approach has been proposed to reduce the computations involved in ISAR imaging, which uses a harmonic wavelet (HW)-based time-frequency representation (TFR). Since the HW-based TFR falls into the category of nonparametric time-frequency (T-F) analysis tools, it is computationally efficient compared to parametric T-F analysis tools such as the adaptive joint time-frequency transform (AJTFT), the adaptive wavelet transform (AWT), and the evolutionary AWT (EAWT). Further, the performance of the proposed method of ISAR imaging is compared with ISAR imaging by other nonparametric T-F analysis tools such as the short-time Fourier transform (STFT) and the Choi-Williams distribution (CWD). In ISAR imaging, the use of the HW-based TFR provides similar or better results with a significant (92%) computational advantage compared to that obtained by the CWD. The ISAR images thus obtained are identified using a neural network-based classification scheme with a feature set invariant to translation, rotation, and scaling.

  15. Selecting risk factors: a comparison of discriminant analysis, logistic regression and Cox's regression model using data from the Tromsø Heart Study.

    PubMed

    Brenn, T; Arnesen, E

    1985-01-01

    For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis, otherwise Cox's method should be used.

  16. Standardization of pitch-range settings in voice acoustic analysis.

    PubMed

    Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C

    2009-05-01

    Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
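
    A minimal sketch of the idea of preset pitch-range parameters: an autocorrelation F0 estimate whose lag search is restricted to a sex-specific window, so no per-file hand tuning is needed. The preset values below are illustrative assumptions, not the study's settings.

```python
import math

def estimate_f0(samples, rate, floor_hz, ceiling_hz):
    """Autocorrelation F0 estimate restricted to a preset pitch window
    [floor_hz, ceiling_hz]. Sketch only; production voice analysis uses
    more robust pitch trackers."""
    lo = int(rate / ceiling_hz)   # shortest candidate period in samples
    hi = int(rate / floor_hz)     # longest candidate period in samples
    best_lag, best_r = lo, float("-inf")
    for lag in range(lo, hi + 1):
        r = sum(samples[i] * samples[i + lag] for i in range(len(samples) - lag))
        if r > best_r:
            best_lag, best_r = lag, r
    return rate / best_lag

# Hypothetical sex-specific presets in Hz (illustrative, not the study's):
PRESETS = {"male": (75, 300), "female": (100, 500)}

rate = 8000
tone = [math.sin(2 * math.pi * 120 * n / rate) for n in range(2000)]
print(estimate_f0(tone, rate, *PRESETS["male"]))  # near 120 Hz
```

Batch-applying one preset per speaker group is what turns weeks of case-by-case hand analysis into an unattended script run; the paper's caution about pathological voices applies because such voices can fall outside any preset window.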

  17. The error and bias of supplementing a short, arid climate, rainfall record with regional vs. global frequency analysis

    NASA Astrophysics Data System (ADS)

    Endreny, Theodore A.; Pashiardis, Stelios

    2007-02-01

    Robust and accurate estimates of rainfall frequencies are difficult to make with short, arid-climate rainfall records; however, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. Analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to sea and elevation. Regional statistical algorithms found the sites passed discordancy tests of coefficient of variation, skewness and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated in the regional analysis method 500 times, and then goodness of fit tests identified the best candidate distribution as the general extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate location, shape, and scale parameters. In the global based analysis, the distribution was a priori prescribed as GEV Type II, the shape parameter was a priori set to 0.15, and a time interval term was constructed to use one set of parameters for all time intervals. Relative RMSE values were approximately equal at 10% for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE had a parabolic-shaped time interval trend. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic-shaped time interval trend was found for the global method. The global method relative RMSE and bias trended with time interval, which may be caused by fitting a single scale value for all time intervals.
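
    The method of L-moments starts from sample probability-weighted moments. A sketch of the first three sample L-moments (Hosking's unbiased estimators), computed on hypothetical rainfall depths; fitting the GEV parameters from these is a further step not shown here.

```python
def sample_l_moments(data):
    """First two sample L-moments (l1, l2) and the L-skewness ratio t3,
    via unbiased probability-weighted moments b0, b1, b2."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1 = b0                      # L-location (the mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
    return l1, l2, l3 / l2

# Hypothetical annual-maximum rainfall depths in mm (not the Cyprus data):
rain = [12.0, 18.0, 25.0, 31.0, 40.0, 55.0, 78.0, 120.0]
l1, l2, t3 = sample_l_moments(rain)
print(l1, l2, t3)
```

A right-skewed sample like this yields a positive t3, which is what points regional frequency analysis toward heavy-tailed candidates such as GEV Type II.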

  18. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    PubMed

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

    Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
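
    The data-reduction idea, though not OCTpy's actual implementation, can be sketched as a fold-change filter over peak areas in the two comparative samples; analyte names and areas below are hypothetical.

```python
def select_changed_analytes(pre, post, fold=2.0):
    """Keep only analytes whose peak area changed at least `fold`-fold
    between comparative samples (e.g., soil pre-/post-bioremediation).
    Mimics the data-reduction idea behind OCTpy, not its implementation."""
    formed, degraded = [], []
    for name in set(pre) | set(post):
        a, b = pre.get(name, 0.0), post.get(name, 0.0)
        if a == 0.0 and b > 0.0:
            formed.append(name)          # appeared after treatment
        elif b == 0.0 and a > 0.0:
            degraded.append(name)        # disappeared after treatment
        elif b >= fold * a:
            formed.append(name)
        elif a >= fold * b:
            degraded.append(name)
    return sorted(formed), sorted(degraded)

# Hypothetical peak areas from pre- and post-treatment chromatograms:
pre = {"pyrene": 900.0, "anthracene": 400.0, "unknown_17": 50.0}
post = {"pyrene": 120.0, "anthracene": 380.0, "quinone_3": 75.0}
print(select_changed_analytes(pre, post))
```

Everything that passes the filter is a candidate for human review, which is how such a script shrinks the analyst's list from thousands of peaks to a few dozen.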

  19. Alignment of high-throughput sequencing data inside in-memory databases.

    PubMed

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    In the era of high-throughput DNA sequencing, performant analysis of DNA sequences is of high importance. Computer-supported DNA analysis is still a time-intensive task. In this paper we explore the potential of a new in-memory database technology by using SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL, to compare execution time and memory management. To ensure that the results are comparable, MySQL was run in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures containing exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, performance analysis between HANA and MySQL was made by comparing the execution time of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, which suggests that there is high potential within the new in-memory concepts, leading to further developments of DNA analysis procedures in the future.
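
    Exact read search, the step compared across the two databases, can be sketched in a few lines of plain substring scanning. Note that BWA itself uses a Burrows-Wheeler index rather than scanning, which is what makes genome-scale exact search fast; the reference string here is a toy stand-in, not GRCh37.

```python
def exact_alignments(reference, reads):
    """Report every exact occurrence of each read in the reference,
    by naive substring scanning (illustrative; real aligners use an index)."""
    hits = {}
    for read in reads:
        positions, start = [], reference.find(read)
        while start != -1:
            positions.append(start)
            start = reference.find(read, start + 1)
        hits[read] = positions
    return hits

ref = "ACGTACGTGACCA"              # toy reference sequence
reads = ["ACGT", "GAC", "TTTT"]
print(exact_alignments(ref, reads))  # {'ACGT': [0, 4], 'GAC': [8], 'TTTT': []}
```

Expressed as a stored procedure, this same per-read loop is what the in-memory engines race against each other on.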

  20. Bayesian analyses of time-interval data for environmental radiation monitoring.

    PubMed

    Luo, Peng; Sharp, Julia L; DeVol, Timothy A

    2013-01-01

    Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and conventional frequentist analyses of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for Statistical Computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but allowed a decision to be made with fewer pulses at relatively higher radiation levels. In addition, for cases in which the source was present only briefly (less than the count time), time-interval information is more sensitive for detecting a change than count information, since the source counts are diluted by the background counts over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
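
    The core conjugate update behind Bayesian analysis of time-interval data: inter-pulse intervals of a Poisson source are exponentially distributed, so a Gamma prior on the rate yields a Gamma posterior. The sketch below shows only this update (the paper's decision rules are more elaborate); the interval values are hypothetical.

```python
def posterior_rate(intervals, prior_shape=1.0, prior_rate=1.0):
    """Conjugate Bayesian update for a Poisson pulse rate from time-interval
    data: exponential likelihood with a Gamma(shape, rate) prior gives a
    Gamma(shape + n, rate + sum(intervals)) posterior. Returns its mean."""
    n, total = len(intervals), sum(intervals)
    post_shape = prior_shape + n
    post_rate = prior_rate + total
    return post_shape / post_rate  # posterior mean rate in pulses/second

# Hypothetical inter-pulse intervals in seconds:
background = [1.1, 0.9, 1.3, 0.8, 1.0]   # roughly 1 count per second
source = [0.11, 0.09, 0.10, 0.12, 0.08]  # roughly 10 counts per second
print(posterior_rate(background), posterior_rate(source))
```

The same five pulses shift the posterior far more for the strong source because the update depends on elapsed time, not on waiting out a fixed count window: this is the sense in which interval data supports earlier decisions at higher radiation levels.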

  21. Harmonic versus LigaSure hemostasis technique in thyroid surgery: A meta-analysis

    PubMed Central

    Upadhyaya, Arun; Hu, Tianpeng; Meng, Zhaowei; Li, Xue; He, Xianghui; Tian, Weijun; Jia, Qiang; Tan, Jian

    2016-01-01

    Harmonic scalpel and LigaSure vessel-sealing systems have been suggested as options for saving surgical time and reducing postoperative complications. The aim of the present meta-analysis was to compare surgical time, postoperative complications and other parameters between them for the open thyroidectomy procedure. Studies were retrieved from MEDLINE, the Cochrane Library, EMBASE and ISI Web of Science until December 2015. All randomized controlled trials (RCTs) comparing the Harmonic scalpel and LigaSure during open thyroidectomy were selected. Following data extraction, statistical analyses were performed. Among the 24 studies evaluated for eligibility, 7 RCTs with 981 patients were included. The Harmonic scalpel significantly reduced surgical time compared with LigaSure techniques (mean difference, −8.79 min; 95% confidence interval, −15.91 to −1.67; P=0.02). However, no significant difference was observed in intraoperative blood loss, postoperative blood loss, duration of hospital stay, thyroid weight or postoperative serum calcium level in either group. The present meta-analysis indicated superiority of the Harmonic scalpel only in terms of surgical time compared with LigaSure hemostasis techniques in open thyroid surgery. PMID:27446546
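
    Inverse-variance pooling of per-study mean differences is the basic computation behind such a pooled surgical-time estimate. This is a fixed-effect sketch (the review may have used a random-effects model), and the study-level numbers are made up for illustration, not the review's data.

```python
import math

def fixed_effect_pool(effects, ses):
    """Fixed-effect inverse-variance pooling of per-study mean differences:
    weight each study by 1/SE^2, return the pooled estimate and a 95% CI."""
    weights = [1.0 / (se * se) for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-study surgical-time differences (minutes, Harmonic minus
# LigaSure) and their standard errors:
effects = [-10.2, -6.5, -12.0, -7.4]
ses = [3.0, 2.5, 4.0, 2.8]
pooled, ci = fixed_effect_pool(effects, ses)
print(pooled, ci)
```

A pooled difference whose confidence interval excludes zero is what lets a meta-analysis report a significant time saving even when individual trials are small.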

  22. Estimating the effect of immortal-time bias in urological research: a case example of testosterone-replacement therapy.

    PubMed

    Wallis, Christopher J D; Saskin, Refik; Narod, Steven A; Law, Calvin; Kulkarni, Girish S; Seth, Arun; Nam, Robert K

    2017-10-01

    To quantify the effect of immortal-time bias in an observational study examining the effect of cumulative testosterone exposure on mortality. We used a population-based matched cohort study of men aged ≥66 years, newly treated with testosterone-replacement therapy (TRT), and matched controls from 2007 to 2012 in Ontario, Canada, to quantify the effects of immortal-time bias. We used generalised estimating equations to determine the association between cumulative TRT exposure and mortality. Results produced by models using time-fixed and time-varying exposures were compared. Further, we undertook a systematic review of PubMed to identify studies addressing immortal-time bias or time-varying exposures in the urological literature and qualitatively summarised these. Among 10 311 TRT-exposed men and 28 029 controls, the use of a time-varying exposure resulted in attenuation of treatment effects compared with an analysis that did not account for immortal-time bias. While both analyses showed a decreased risk of death for patients in the highest tertile of TRT exposure, the effect was overestimated by the time-fixed analysis (adjusted hazard ratio [aHR] 0.56, 95% confidence interval [CI] 0.52-0.61) compared with the time-varying analysis (aHR 0.67, 95% CI 0.62-0.73). Of the 1 241 studies employing survival analysis identified in the literature, nine manuscripts met the criteria for inclusion. Of these, five used a time-varying analytical method; each was a large, population-based retrospective cohort study assessing potential harms of pharmacological agents. Where exposures vary over time, a time-varying analysis is necessary to draw meaningful conclusions. Failure to use a time-varying analysis will result in overestimation of a beneficial effect. However, time-varying exposures are uncommonly utilised in manuscripts published in prominent urological journals.
    © 2017 The Authors. BJU International © 2017 BJU International, published by John Wiley & Sons Ltd.
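The mechanism of immortal-time bias can be reproduced with a toy simulation. All rates and the eligibility rule below are invented for illustration; crucially, treatment is given no true effect, yet the naive time-fixed ("ever treated") classification makes it look protective.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
follow_up = 5.0  # years of follow-up

# Death times under the null: treatment has NO effect on survival.
death_time = rng.exponential(scale=5.0, size=n)

# Half the cohort is ever offered treatment (hypothetical), at a random
# start time; treatment can only begin if the subject is still alive then.
start_time = rng.uniform(0.0, follow_up, size=n)
offered = rng.random(n) < 0.5
treated = offered & (start_time < death_time)  # time-fixed "ever treated"

died = death_time < follow_up
death_rate_treated = died[treated].mean()
death_rate_untreated = died[~treated].mean()

print(f"5-year death rate, 'treated':   {death_rate_treated:.3f}")
print(f"5-year death rate, 'untreated': {death_rate_untreated:.3f}")
```

Everyone labelled "treated" was, by construction, guaranteed to survive until their treatment started, so the time-fixed comparison manufactures a survival benefit out of nothing; classifying the person-time before treatment start as unexposed (a time-varying exposure) removes this artefact.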

  3. Comparative study of predicted and experimentally detected interplanetary shocks

    NASA Astrophysics Data System (ADS)

    Kartalev, M. D.; Grigorov, K. G.; Smith, Z.; Dryer, M.; Fry, C. D.; Sun, Wei; Deehr, C. S.

    2002-03-01

    We compare the real-time space-weather predictions of shock arrival times at 1 AU made by the USAF/NOAA Shock Time of Arrival (STOA) and Interplanetary Shock Propagation Model (ISPM) models, and by the Exploration Physics International/University of Alaska Hakamada-Akasofu-Fry Solar Wind Model (HAF-v2), with a real-time analysis of ACE plasma and field data. The comparison is made using an algorithm developed on the basis of wavelet data analysis and an MHD identification procedure. Shock parameters are estimated for selected "candidate events". An automated Web-based interface periodically processes solar wind observations made by ACE at L1. Near-real-time results, as well as an archive of registered events of interest, are available on a specially developed web site. A number of events are considered. These studies are essential for the validation of real-time space-weather forecasts made from solar data.

  4. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
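hctsa itself is a MATLAB framework computing over 7,700 features; the underlying idea, representing each time series by a vector of interpretable features, can be sketched in a few lines. This toy three-feature version is illustrative only and is not hctsa's actual feature set.

```python
import numpy as np

def features(x):
    """Map a time series to a small, interpretable feature vector."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    lag1_autocorr = np.mean(z[:-1] * z[1:])
    # Spectral entropy: how evenly power is spread across frequencies.
    p = np.abs(np.fft.rfft(z)) ** 2
    p = p / p.sum()
    spectral_entropy = -np.sum(p * np.log(p + 1e-12))
    return np.array([x.std(), lag1_autocorr, spectral_entropy])

rng = np.random.default_rng(0)
t = np.arange(500)
sine = np.sin(2 * np.pi * t / 50)   # strongly ordered dynamics
noise = rng.standard_normal(500)    # no temporal structure

# Rows = time series, columns = features: the representation that
# classifiers or clustering algorithms then operate on.
X = np.vstack([features(sine), features(noise)])
print(X)
```

Classification, feature selection, and visualization then work on the rows of `X`; hctsa automates this pipeline at the scale of thousands of features.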

  5. A spectral power analysis of driving behavior changes during the transition from nondistraction to distraction.

    PubMed

    Wang, Yuan; Bao, Shan; Du, Wenjun; Ye, Zhirui; Sayer, James R

    2017-11-17

    This article investigated and compared frequency-domain and time-domain characteristics of drivers' behaviors before and after the start of distracted driving. Data from an existing naturalistic driving study were used. The fast Fourier transform (FFT) was applied for the frequency-domain analysis to explore changes in drivers' behavior patterns between nondistracted (before the start of a visual-manual task) and distracted (after the start of a visual-manual task) driving periods. Average relative spectral power in a low frequency range (0-0.5 Hz) and the standard deviation in a 10-s time window of vehicle control variables (i.e., lane offset, yaw rate, and acceleration) were calculated and compared. Sensitivity analyses were also applied to examine the reliability of the time- and frequency-domain analyses. Results of the mixed-model analyses from both domains showed significant degradation in lateral control performance after engaging in visual-manual tasks while driving. Results of the sensitivity analyses suggested that the frequency-domain analysis was less sensitive to the frequency bandwidth, whereas the time-domain analysis was more sensitive to the time intervals selected for variation calculations. Different time-interval selections can result in significantly different standard deviation values, whereas average spectral power analysis of yaw rate in both low and high frequency bandwidths showed consistent results: higher variation values were observed during distracted driving than during nondistracted driving. This study suggests that driver-state detection needs to consider behavior changes during the prestarting period, instead of focusing only on periods with a physical presence of distraction, such as cell phone use. Lateral control measures can be a better indicator for distraction detection than longitudinal controls. In addition, frequency-domain analyses proved to be a more robust and consistent method of assessing driving performance than time-domain analyses.
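The relative spectral power measure used in this kind of analysis can be sketched as follows, with synthetic sinusoids standing in for the naturalistic lane-offset data (the 0-0.5 Hz band is the one named in the abstract; the sampling rate and signal frequencies are invented for illustration):

```python
import numpy as np

def relative_low_band_power(x, fs, f_cut=0.5):
    """Fraction of total (mean-removed) spectral power at or below f_cut Hz."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return power[freqs <= f_cut].sum() / power.sum()

fs = 10.0                     # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)  # 60 s of data

slow_weaving = np.sin(2 * np.pi * 0.2 * t)     # slow drifting lane-keeping
fast_correction = np.sin(2 * np.pi * 2.0 * t)  # rapid steering corrections

rel_slow = relative_low_band_power(slow_weaving, fs)
rel_fast = relative_low_band_power(fast_correction, fs)
print(rel_slow, rel_fast)  # slow signal concentrates its power below 0.5 Hz
```

Averaging this quantity over many windows, and comparing pre- versus post-task periods, gives the kind of band-power contrast the study reports.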

  6. Maximum Entropy Method applied to Real-time Time-Dependent Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Zempo, Yasunari; Toogoshi, Mitsuki; Kano, Satoru S.

    The Maximum Entropy Method (MEM) is widely used for the analysis of time-series data such as earthquake records, which have fairly long periodicity but only short observation windows. We have examined MEM for the optical analysis of time-series data from real-time TDDFT. Such analysis usually relies on the Fourier transform (FT), and attention must be paid to the lower-energy part of the spectrum, such as the band gap, which requires long time evolution; the computational cost then becomes quite expensive. Since MEM is based on the autocorrelation of the signal, in which periodicity is described through differences of time lags, its resolving power at lower energies is naturally small compared with that at higher energies. To overcome this difficulty, our MEM has two features: the raw data are repeated many times and concatenated, which provides high resolution in the lower-energy region; and, together with the repeated data, an appropriate phase for the target frequency is introduced to reduce the side effects of the artificial periodicity. We compared our improved MEM with FT spectra for small-to-medium-size molecules. The MEM spectrum is clearly sharper than that of FT, and the new technique provides higher resolution in fewer time steps than FT. This work was partially supported by JSPS Grants-in-Aid for Scientific Research (C) Grant number 16K05047, Sumitomo Chemical, Co. Ltd., and Simulatio Corp.

  7. Spontaneous Swallowing Frequency Has Potential to Identify Dysphagia in Acute Stroke

    PubMed Central

    Carnaby, Giselle D; Sia, Isaac; Khanna, Anna; Waters, Michael

    2014-01-01

    Background and Purpose Spontaneous swallowing frequency has been described as an index of dysphagia in various health conditions. This study evaluated the potential of spontaneous swallow frequency analysis as a screening protocol for dysphagia in acute stroke. Methods In a cohort of 63 acute stroke cases swallow frequency rates (swallows per minute: SPM) were compared to stroke and swallow severity indices, age, time from stroke to assessment, and consciousness level. Mean differences in SPM were compared between patients with vs. without clinically significant dysphagia. ROC analysis was used to identify the optimal threshold in SPM which was compared to a validated clinical dysphagia examination for identification of dysphagia cases. Time series analysis was employed to identify the minimally adequate time period to complete spontaneous swallow frequency analysis. Results SPM correlated significantly with stroke and swallow severity indices but not with age, time from stroke onset, or consciousness level. Patients with dysphagia demonstrated significantly lower SPM rates. SPM differed by dysphagia severity. ROC analysis yielded a threshold of SPM ≤ 0.40 which identified dysphagia (per the criterion referent) with 0.96 sensitivity, 0.68 specificity, and 0.96 negative predictive value. Time series analysis indicated that a 5 to 10 minute sampling window was sufficient to calculate spontaneous swallow frequency to identify dysphagia cases in acute stroke. Conclusions Spontaneous swallowing frequency presents high potential to screen for dysphagia in acute stroke without the need for trained, available personnel. PMID:24149008

  8. Spontaneous swallowing frequency has potential to identify dysphagia in acute stroke.

    PubMed

    Crary, Michael A; Carnaby, Giselle D; Sia, Isaac; Khanna, Anna; Waters, Michael F

    2013-12-01

    Spontaneous swallowing frequency has been described as an index of dysphagia in various health conditions. This study evaluated the potential of spontaneous swallow frequency analysis as a screening protocol for dysphagia in acute stroke. In a cohort of 63 acute stroke cases, swallow frequency rates (swallows per minute [SPM]) were compared with stroke and swallow severity indices, age, time from stroke to assessment, and consciousness level. Mean differences in SPM were compared between patients with versus without clinically significant dysphagia. Receiver operating characteristic curve analysis was used to identify the optimal threshold in SPM, which was compared with a validated clinical dysphagia examination for identification of dysphagia cases. Time series analysis was used to identify the minimally adequate time period to complete spontaneous swallow frequency analysis. SPM correlated significantly with stroke and swallow severity indices but not with age, time from stroke onset, or consciousness level. Patients with dysphagia demonstrated significantly lower SPM rates. SPM differed by dysphagia severity. Receiver operating characteristic curve analysis yielded a threshold of SPM≤0.40 that identified dysphagia (per the criterion referent) with 0.96 sensitivity, 0.68 specificity, and 0.96 negative predictive value. Time series analysis indicated that a 5- to 10-minute sampling window was sufficient to calculate spontaneous swallow frequency to identify dysphagia cases in acute stroke. Spontaneous swallowing frequency presents high potential to screen for dysphagia in acute stroke without the need for trained, available personnel.
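The screening logic of the SPM≤0.40 threshold can be sketched with hypothetical per-patient rates. The ten SPM values below are invented for illustration and are not the study's data; only the 0.40 cutoff comes from the abstract.

```python
# Hypothetical swallows-per-minute (SPM) values, not the study's data.
spm_dysphagia = [0.10, 0.22, 0.31, 0.38, 0.55]     # criterion-positive patients
spm_no_dysphagia = [0.30, 0.62, 0.75, 0.90, 1.10]  # criterion-negative patients

THRESHOLD = 0.40  # screen positive when SPM <= threshold

tp = sum(s <= THRESHOLD for s in spm_dysphagia)     # true positives
fn = len(spm_dysphagia) - tp                        # missed cases
fp = sum(s <= THRESHOLD for s in spm_no_dysphagia)  # false alarms
tn = len(spm_no_dysphagia) - fp                     # correct negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
npv = tn / (tn + fn)  # negative predictive value

print(sensitivity, specificity, npv)
```

A receiver operating characteristic curve is built by sweeping `THRESHOLD` over all observed SPM values and plotting sensitivity against 1 − specificity; the study's reported operating point (0.96 sensitivity, 0.68 specificity) corresponds to one such sweep on the real cohort.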

  9. Droplet Microarray Based on Superhydrophobic-Superhydrophilic Patterns for Single Cell Analysis.

    PubMed

    Jogia, Gabriella E; Tronser, Tina; Popova, Anna A; Levkin, Pavel A

    2016-12-09

    Single-cell analysis provides fundamental information on individual cell response to different environmental cues and is of growing interest in cancer and stem cell research. However, existing methods still face challenges in performing such analysis in a high-throughput manner whilst remaining cost-effective. Here we established the Droplet Microarray (DMA) as a miniaturized screening platform for high-throughput single-cell analysis. Using the method of limiting dilution and varying cell density and seeding time, we optimized the distribution of single cells on the DMA. We established culturing conditions for single cells in individual droplets on the DMA, achieving nearly 100% survival of single cells and doubling times comparable with those of cells cultured in bulk using conventional methods. Our results demonstrate that the DMA is a suitable platform for single-cell analysis, which offers a number of advantages over existing technologies, allowing treatment, staining and spot-to-spot analysis of single cells over time using conventional analysis methods such as microscopy.

  10. G14A-06: Analysis of the DORIS, GNSS, SLR, VLBI and Gravimetric Time Series at the GGOS Core Sites

    NASA Technical Reports Server (NTRS)

    Moreaux, G.; Lemoine, F.; Luceri, V.; Pavlis, E.; MacMillan, D.; Bonvalot, S.; Saunier, J.

    2017-01-01

    Analysis of the time series at the 3-4 multi-technique GGOS core sites to compare the spectral content of the space-geodetic and gravimetric time series, and to evaluate the level of agreement between the space-geodesy measurements and the physical tie vectors.

  11. Scalability of Comparative Analysis, Novel Algorithms and Tools (MICW - Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Mavrommatis, Kostas

    2017-12-22

    DOE JGI's Kostas Mavrommatis, chair of the Scalability of Comparative Analysis, Novel Algorithms and Tools panel, at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  12. A new finite element formulation for computational fluid dynamics. IX - Fourier analysis of space-time Galerkin/least-squares algorithms

    NASA Technical Reports Server (NTRS)

    Shakib, Farzin; Hughes, Thomas J. R.

    1991-01-01

    A Fourier stability and accuracy analysis of the space-time Galerkin/least-squares method as applied to a time-dependent advective-diffusive model problem is presented. Two time discretizations are studied: a constant-in-time approximation and a linear-in-time approximation. Corresponding space-time predictor multi-corrector algorithms are also derived and studied. The behavior of the space-time algorithms is compared to algorithms based on semidiscrete formulations.
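A Fourier (von Neumann) stability analysis assigns each spatial mode an amplification factor G(θ) per time step and requires |G(θ)| ≤ 1 for all θ. As a hedged illustration of the technique only, the sketch below analyses a simple forward-time, centred-space (FTCS) scheme for the same advective-diffusive model problem, not the Galerkin/least-squares discretizations studied in the paper.

```python
import numpy as np

def amplification_factor(theta, c, d):
    """G(theta) for FTCS applied to u_t + a u_x = nu u_xx.

    c = a*dt/dx (Courant number), d = nu*dt/dx**2 (diffusion number).
    """
    return 1.0 - 1j * c * np.sin(theta) - 4.0 * d * np.sin(theta / 2.0) ** 2

theta = np.linspace(0.0, np.pi, 2001)  # sweep all resolvable wavenumbers

# Stable parameter pair: FTCS needs c**2 <= 2*d and 2*d <= 1.
g_stable = np.abs(amplification_factor(theta, c=0.5, d=0.4)).max()

# Pure advection (d = 0) is unconditionally unstable for FTCS.
g_unstable = np.abs(amplification_factor(theta, c=0.5, d=0.0)).max()

print(g_stable, g_unstable)
```

The same procedure, substituting the scheme's own update rule into G(θ) and checking max|G|, is what the paper carries out for the constant-in-time and linear-in-time space-time formulations.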

  13. Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs

    DTIC Science & Technology

    2014-06-01

    comparable Internet topologies in less time. We compare these by modeling the union of traceroute outputs as graphs, and study the graphs using standard graph-theoretical measurements such as vertex and edge count and average vertex degree.

  14. Optimisation and validation of a rapid and efficient microemulsion liquid chromatographic (MELC) method for the determination of paracetamol (acetaminophen) content in a suppository formulation.

    PubMed

    McEvoy, Eamon; Donegan, Sheila; Power, Joe; Altria, Kevin

    2007-05-09

    A rapid and efficient oil-in-water microemulsion liquid chromatographic method has been optimised and validated for the analysis of paracetamol in a suppository formulation. Excellent linearity, accuracy, precision and assay results were obtained. Lengthy sample pre-treatment/extraction procedures were eliminated owing to the solubilising power of the microemulsion, and rapid analysis times were achieved. The method was optimised to achieve rapid analysis times and relatively high peak efficiencies. A standard microemulsion composition of 33 g SDS, 66 g butan-1-ol and 8 g n-octane in 1 L of 0.05% TFA, modified with acetonitrile, was shown to be suitable for the rapid analysis of paracetamol in highly hydrophobic preparations under isocratic conditions. Validated assay results and the overall analysis time of the optimised method were compared with British Pharmacopoeia reference methods. Sample preparation and analysis times for the MELC analysis of paracetamol in a suppository were extremely rapid compared with the reference method, and similar assay results were achieved. A gradient MELC method using the same microemulsion has been optimised for the resolution of paracetamol and five of its related substances in approximately 7 min.

  15. Migration of Undergraduate First-Time Transfers: Snapshot Analysis 2006-2008

    ERIC Educational Resources Information Center

    South Carolina Commission on Higher Education, 2010

    2010-01-01

    The Commission on Higher Education had a student intern from USC-Columbia initiate an analysis of data on the migration of undergraduate first-time transfers to compare trends, growth, and proportions of transfers to and from various sectors and institution types over a three-year period, from 2006-2008. Staff have refined the analysis and…

  16. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study.

    PubMed

    van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-08-07

    Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis.
Analysis of additional datasets is needed in order to validate and refine the application for general use.
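The core of what AutoVAR automates, fitting candidate VAR(p) models and scoring them with an information criterion, can be sketched in numpy. This is a least-squares VAR fit on simulated data for illustration; AutoVAR's actual search space, diagnostics, and model checks are much richer.

```python
import numpy as np

def fit_var(y, p):
    """Least-squares fit of a VAR(p); returns coefficients and AIC."""
    T, d = y.shape
    # Regressors: intercept plus p lags of all d variables.
    X = np.hstack([np.ones((T - p, 1))] +
                  [y[p - k - 1:T - k - 1] for k in range(p)])
    Y = y[p:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    sigma = resid.T @ resid / (T - p)          # residual covariance
    k_params = d * (1 + p * d)                 # estimated coefficients
    aic = np.log(np.linalg.det(sigma)) + 2.0 * k_params / (T - p)
    return B, aic

# Simulate a 2-variable VAR(1) with a known coefficient matrix A.
rng = np.random.default_rng(42)
A = np.array([[0.6, 0.2], [0.1, 0.5]])
y = np.zeros((400, 2))
for t in range(1, 400):
    y[t] = A @ y[t - 1] + rng.standard_normal(2)

# Score every candidate order and keep the AIC-minimising model.
aics = {p: fit_var(y, p)[1] for p in range(1, 6)}
best_p = min(aics, key=aics.get)
A_hat = fit_var(y, best_p)[0][1:3].T  # estimated lag-1 coefficient block
print("selected order:", best_p)
```

Replacing the manual steps, fit each candidate, compare AIC/BIC, pick the best, with this kind of exhaustive loop is exactly the labour the paper argues can be automated.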

  17. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    PubMed Central

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis.
Analysis of additional datasets is needed in order to validate and refine the application for general use. PMID:26254160

  18. Agreement between self-reported data on medicine use and prescription records vary according to method of analysis and therapeutic group.

    PubMed

    Nielsen, Merete Willemoes; Søndergaard, Birthe; Kjøller, Mette; Hansen, Ebba Holme

    2008-09-01

    This study compared national self-reported data on medicine use and national prescription records at the individual level. Data from the nationally representative Danish health survey conducted in 2000 (n=16,688) were linked at the individual level to national prescription records covering 1999-2000. Kappa statistics and 95% confidence intervals were calculated. Applying the legend time method to medicine groups used mainly on a chronic basis revealed good to very good agreement between the two data sources, whereas medicines used as needed showed fair to moderate agreement. When a fixed-time window was applied for analysis, agreement was unchanged for medicines used mainly on a chronic basis, whereas agreement increased somewhat compared to the legend time method when analyzing medicines used as needed. Agreement between national self-reported data and national prescription records differed according to method of analysis and therapeutic group. A fixed-time window is an appropriate method of analysis for most therapeutic groups.
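The kappa statistic used for these agreement comparisons measures concordance beyond what chance alone would produce. A minimal implementation for a 2x2 agreement table follows; the counts are invented for illustration, not the survey's data.

```python
def cohens_kappa(both_yes, self_only, register_only, both_no):
    """Cohen's kappa for two binary raters (e.g., self-report vs. register)."""
    n = both_yes + self_only + register_only + both_no
    p_observed = (both_yes + both_no) / n
    # Chance agreement from the marginal "yes" rates of each source.
    p_self_yes = (both_yes + self_only) / n
    p_reg_yes = (both_yes + register_only) / n
    p_chance = p_self_yes * p_reg_yes + (1 - p_self_yes) * (1 - p_reg_yes)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical counts: 40 agree on use, 40 agree on non-use, 20 disagree.
kappa = cohens_kappa(both_yes=40, self_only=10, register_only=10, both_no=40)
print(kappa)  # ≈ 0.6, conventionally read as "good" agreement
```

The study's contrast between the legend time method and a fixed-time window amounts to changing how the register-side "yes" is defined before computing exactly this statistic.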

  19. Laparoendoscopic single-site surgery varicocelectomy versus conventional laparoscopic varicocele ligation: A meta-analysis

    PubMed Central

    Li, Mingchao; Wang, Zhengyun

    2016-01-01

    Objective To perform a meta-analysis of data from available published studies comparing laparoendoscopic single-site surgery varicocelectomy (LESSV) with conventional transperitoneal laparoscopic varicocele ligation. Methods A comprehensive data search was performed in PubMed and Embase to identify randomized controlled trials and comparative studies that compared the two surgical approaches for the treatment of varicoceles. Results Six studies were included in the meta-analysis. LESSV required a significantly longer operative time than conventional laparoscopic varicocelectomy but was associated with significantly less postoperative pain at 6 h and 24 h, a shorter recovery time and greater patient satisfaction with the cosmetic outcome. There was no difference between the two surgical approaches in terms of postoperative semen quality or the incidence of complications. Conclusion These data suggest that LESSV offers a well tolerated and efficient alternative to conventional laparoscopic varicocelectomy, with less pain, a shorter recovery time and better cosmetic satisfaction. Further well-designed studies are required to confirm these findings and update the results of this meta-analysis. PMID:27688686

  20. Improved analysis of ground vibrations produced by man-made sources.

    PubMed

    Ainalis, Daniel; Ducarne, Loïc; Kaufmann, Olivier; Tshibangu, Jean-Pierre; Verlinden, Olivier; Kouroussis, Georges

    2018-03-01

    Man-made sources of ground vibration must be carefully monitored in urban areas in order to ensure that structural damage and discomfort to residents are prevented or minimised. The research presented in this paper provides a comparative evaluation of various methods used to analyse a series of tri-axial ground vibration measurements generated by rail, road, and explosive blasting. The first part of the study is focused on comparing various techniques to estimate the dominant frequency, including time-frequency analysis. The comparative evaluation of the various methods to estimate the dominant frequency revealed that, depending on the method used, there can be significant variation in the estimates obtained. A new and improved analysis approach using the continuous wavelet transform is also presented, using the time-frequency distribution to estimate the localised dominant frequency and peak particle velocity. The technique can be used to accurately identify the level and frequency content of a ground vibration signal as it varies with time, and to identify the number of times the threshold limits of damage are exceeded. Copyright © 2017 Elsevier B.V. All rights reserved.
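The paper's approach uses the continuous wavelet transform; the underlying idea, tracking a localized dominant frequency as it changes through the record, can be sketched with a simpler windowed FFT. The two-tone synthetic signal, sampling rate, and window length below are invented for illustration.

```python
import numpy as np

fs = 200.0                   # sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)  # 4 s record
# A signal whose dominant frequency changes halfway through, loosely
# mimicking a low-frequency ground roll followed by a higher-frequency arrival.
x = np.where(t < 2.0,
             np.sin(2 * np.pi * 5.0 * t),
             np.sin(2 * np.pi * 20.0 * t))

def dominant_frequency(segment, fs):
    """Frequency of the largest spectral peak in one analysis window."""
    spectrum = np.abs(np.fft.rfft(segment))
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

window = int(fs)  # 1 s analysis windows
doms = [dominant_frequency(x[i:i + window], fs)
        for i in range(0, len(x) - window + 1, window)]
print(doms)  # localized dominant frequency, one value per window
```

A wavelet-based version replaces the fixed window with scale-dependent analysis, giving better time resolution at high frequencies, which is why the paper prefers it for transient blast signals.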

  1. Evaluation of the Circulatory Dynamics by using the Windkessel Model in Different Body Positions

    NASA Astrophysics Data System (ADS)

    Kotani, Kiyoshi; Iida, Fumiaki; Ogawa, Yutaro; Takamasu, Kiyoshi; Jimbo, Yasuhiko

    The autonomic nervous system is important in maintaining homeostasis through the opposing effects of sympathetic and parasympathetic nervous activity on organs. However, it is known that the two are at times simultaneously increased or decreased, for example in cases of strong fear or depression. It is therefore necessary to evaluate sympathetic and parasympathetic nervous activity independently. In this paper, we propose a method to evaluate sympathetic nervous activity by analyzing decreases in blood pressure using the Windkessel model. Experiments were performed in the sitting and standing positions for 380 s each. First, we evaluated the effect of the analysis length on the Windkessel time constant. We shortened the analysis length by multiplying the length of the blood-pressure decrease by constant coefficients (1.0, 0.9, and 0.8) and then cutting out the waveform for analysis. We found that the Windkessel time constant decreases as the analysis length is shortened, indicating that analysis lengths should be matched when different experiments are compared. Second, we compared the Windkessel time constant in the sitting position with that in the standing position, matching the analysis lengths. The results indicate, with a statistically significant difference (P<0.05), that the Windkessel time constant is larger in the sitting position. Our observations suggest that this difference in the Windkessel time constant is caused by sympathetic nervous activity on vascular smooth muscle.
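In the two-element Windkessel model the diastolic pressure decay is exponential, P(t) = P0·exp(-t/τ) with τ = RC, so the time constant can be estimated by a log-linear fit. The sketch below uses a noise-free synthetic decay with invented values; on measured beats, early diastole deviates from a pure exponential, which is why the analysed window length matters.

```python
import numpy as np

def windkessel_tau(t, p):
    """Estimate the Windkessel time constant from a diastolic pressure decay.

    Fits log P = log P0 - t/tau by least squares and returns tau.
    """
    slope, _ = np.polyfit(t, np.log(p), 1)
    return -1.0 / slope

tau_true = 1.5                     # s, hypothetical time constant
t = np.linspace(0.0, 0.6, 120)     # one diastolic interval
p = 120.0 * np.exp(-t / tau_true)  # ideal (noise-free) pressure decay, mmHg

tau_est = windkessel_tau(t, p)
print(tau_est)
```

For this ideal curve, trimming the window (keeping only the first 80% or 90% of the decay) would not change τ at all; the paper's observation that τ shifts with window length on real data reflects the non-exponential early part of measured beats.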

  2. Medical student professionalism narratives: a thematic analysis and interdisciplinary comparative investigation.

    PubMed

    Bernard, Aaron W; Malone, Matthew; Kman, Nicholas E; Caterino, Jeffrey M; Khandelwal, Sorabh

    2011-08-12

    Professionalism development is influenced by the informal and hidden curriculum. The primary objective of this study was to better understand this experiential learning in the setting of the Emergency Department (ED). Secondarily, the study aimed to explore differences in the informal curriculum between Emergency Medicine (EM) and Internal Medicine (IM) clerkships. A thematic analysis was conducted on 377 professionalism narratives from medical students completing a required EM clerkship from July 2008 through May 2010. The narratives were analyzed using established thematic categories from prior research as well as basic descriptive characteristics. Chi-square analysis was used to compare the frequency of thematic categories to prior research in IM. Finally, emerging themes not fully appreciated in the established thematic categories were created using grounded theory. Observations involving interactions between attending physician and patient were most abundant. The narratives were coded as positive 198 times, negative 128 times, and hybrid 37 times. The two most abundant narrative themes involved manifesting respect (36.9%) and spending time (23.7%). Both of these themes were statistically more likely to be noted by students on EM clerkships compared to IM clerkships. Finally, one new theme regarding cynicism emerged during analysis. This analysis describes an informal curriculum that is diverse in themes. Student narratives suggest their clinical experiences to be influential on professionalism development. Medical students focus on different aspects of professionalism depending on clerkship specialty.
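The chi-square comparison of theme frequencies between clerkships can be sketched as follows. The counts below are hypothetical stand-ins, not the study's data, and the statistic is computed by hand against the df = 1 critical value rather than with a statistics library.

```python
# Hypothetical theme counts: narratives mentioning a theme vs. not,
# for two clerkship specialties (not the study's actual counts).
em = {"theme": 30, "no_theme": 70}
im = {"theme": 10, "no_theme": 90}

n = sum(em.values()) + sum(im.values())
col_theme = em["theme"] + im["theme"]
col_none = em["no_theme"] + im["no_theme"]

# Pearson chi-square: sum of (observed - expected)^2 / expected.
chi2 = 0.0
for row in (em, im):
    row_total = sum(row.values())
    for observed, col_total in ((row["theme"], col_theme),
                                (row["no_theme"], col_none)):
        expected = row_total * col_total / n
        chi2 += (observed - expected) ** 2 / expected

CRITICAL_05_DF1 = 3.841  # chi-square critical value, alpha = 0.05, df = 1
print(chi2, chi2 > CRITICAL_05_DF1)
```

Exceeding the critical value is what licenses statements like "this theme was statistically more likely to be noted on EM clerkships than on IM clerkships."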

  3. Medical Student Professionalism Narratives: A Thematic Analysis and Interdisciplinary Comparative Investigation

    PubMed Central

    2011-01-01

    Background Professionalism development is influenced by the informal and hidden curriculum. The primary objective of this study was to better understand this experiential learning in the setting of the Emergency Department (ED). Secondarily, the study aimed to explore differences in the informal curriculum between Emergency Medicine (EM) and Internal Medicine (IM) clerkships. Methods A thematic analysis was conducted on 377 professionalism narratives from medical students completing a required EM clerkship from July 2008 through May 2010. The narratives were analyzed using established thematic categories from prior research as well as basic descriptive characteristics. Chi-square analysis was used to compare the frequency of thematic categories to prior research in IM. Finally, emerging themes not fully appreciated in the established thematic categories were created using grounded theory. Results Observations involving interactions between attending physician and patient were most abundant. The narratives were coded as positive 198 times, negative 128 times, and hybrid 37 times. The two most abundant narrative themes involved manifesting respect (36.9%) and spending time (23.7%). Both of these themes were statistically more likely to be noted by students on EM clerkships compared to IM clerkships. Finally, one new theme regarding cynicism emerged during analysis. Conclusions This analysis describes an informal curriculum that is diverse in themes. Student narratives suggest their clinical experiences to be influential on professionalism development. Medical students focus on different aspects of professionalism depending on clerkship specialty. PMID:21838887

  4. Vibration Signature Analysis of a Faulted Gear Transmission System

    NASA Technical Reports Server (NTRS)

    Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.

    1994-01-01

    A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data were obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time synchronous averaged vibration data were recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques which include time domain analysis methods and frequency analysis methods. Using photographs of the gear tooth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches as well as the applicability of the Wigner-Ville method in predicting gear faults.
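
    The Wigner-Ville distribution used above is, at each time index, a Fourier transform over the lag variable of the instantaneous autocorrelation x(t+τ)x*(t−τ), which is what gives it fine joint time-frequency resolution for transient gear faults. A minimal single-slice sketch on an illustrative analytic signal (not the rig data):

```python
import cmath
import math

def wigner_ville_slice(x, n, nfreq):
    """Discrete Wigner-Ville slice at time index n: Fourier transform over
    the lag m of the kernel x[n+m] * conj(x[n-m]), scanning nfreq
    frequencies across [0, 0.5) cycles/sample."""
    L = min(n, len(x) - 1 - n)  # largest symmetric lag available at n
    out = []
    for k in range(nfreq):
        f = k / (2.0 * nfreq)
        acc = 0j
        for m in range(-L, L + 1):
            acc += x[n + m] * x[n - m].conjugate() * cmath.exp(-4j * math.pi * f * m)
        out.append(acc.real)
    return out

# Analytic test signal: complex exponential at 0.125 cycles/sample;
# the distribution should concentrate at that frequency.
f0 = 0.125
sig = [cmath.exp(2j * math.pi * f0 * t) for t in range(64)]
w = wigner_ville_slice(sig, n=32, nfreq=32)
peak = max(range(len(w)), key=lambda k: w[k])
print("peak frequency:", peak / (2 * 32))  # 0.125
```

    For multi-component signals the quadratic kernel also produces cross terms, which is why pseudo- and smoothed variants are used in practice on real vibration data.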

  5. A matching framework to improve causal inference in interrupted time-series analysis.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time-series analysis (ITSA) is a popular evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome subsequent to its introduction. When ITSA is implemented without a comparison group, the internal validity may be quite poor. Therefore, adding a comparable control group to serve as the counterfactual is always preferred. This paper introduces a novel matching framework, ITSAMATCH, that creates a comparable control group by matching directly on covariates and then uses these matches in the outcomes model. We evaluate the effect of California's Proposition 99 (passed in 1988) for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. We compare ITSAMATCH results to 2 commonly used matching approaches, synthetic controls (SYNTH) and regression adjustment; SYNTH reweights nontreated units to make them comparable to the treated unit, and regression adjusts covariates directly. Methods are compared by assessing covariate balance and treatment effects. Both ITSAMATCH and SYNTH achieved covariate balance and estimated similar treatment effects. The regression model found no treatment effect and produced inconsistent covariate adjustment. While the matching framework achieved results comparable to SYNTH, it has the advantage of being technically less complicated while producing statistical estimates that are straightforward to interpret. Conversely, regression adjustment may "adjust away" a treatment effect. Given its advantages, ITSAMATCH should be considered as a primary approach for evaluating treatment effects in multiple-group time-series analysis. © 2017 John Wiley & Sons, Ltd.
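
    The single-group ITSA model underlying this framework is a segmented regression with a level-change and a trend-change term. The sketch below fits that model to noise-free synthetic data so the intervention effects are recovered exactly; ITSAMATCH itself is a Stata routine, and all numbers here are illustrative:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations with Gaussian elimination."""
    p = len(X[0])
    A = [[sum(r[a] * r[b] for r in X) for b in range(p)] for a in range(p)]
    v = [sum(r[a] * yi for r, yi in zip(X, y)) for a in range(p)]
    for c in range(p):                      # forward elimination w/ pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        v[c], v[piv] = v[piv], v[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for cc in range(c, p):
                A[r][cc] -= f * A[c][cc]
            v[r] -= f * v[c]
    beta = [0.0] * p                        # back substitution
    for r in range(p - 1, -1, -1):
        beta[r] = (v[r] - sum(A[r][cc] * beta[cc] for cc in range(r + 1, p))) / A[r][r]
    return beta

# Synthetic series: baseline level 10, trend +0.5 per period; an intervention
# at t0 = 10 drops the level by 3 and bends the trend by -0.2 (illustrative).
t0 = 10
X, y = [], []
for t in range(20):
    post = 1.0 if t >= t0 else 0.0
    X.append([1.0, t, post, post * (t - t0)])
    y.append(10 + 0.5 * t - 3 * post - 0.2 * post * (t - t0))
b0, trend, level_change, trend_change = ols(X, y)
print(level_change, trend_change)   # -3.0 and -0.2, up to rounding
```

    With real, noisy data the same design matrix would be fed to a regression with autocorrelation-robust standard errors.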

  6. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR-based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
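
    For the first design, the external calibration curve amounts to regressing Ct on log10 template amount; the slope yields the amplification efficiency, and the Ct difference between a control and a putative event converts into a copy ratio. A sketch with hypothetical Ct values:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical standard curve: Ct vs. log10(template amount); values are
# illustrative, chosen near 100% amplification efficiency.
logq = [0, 1, 2, 3, 4]
ct = [30.1, 26.7, 23.4, 20.1, 16.8]
slope, intercept = fit_line(logq, ct)
efficiency = 10 ** (-1.0 / slope)  # ~2.0 means the product doubles every cycle

# Relative copy number of a putative event vs. a single-copy control event
ct_control, ct_sample = 24.0, 23.0
copy_ratio = efficiency ** (ct_control - ct_sample)
print(f"slope={slope:.2f}, E={efficiency:.2f}, estimated copies={copy_ratio:.1f}")
```

    A confidence interval on the copy ratio, which the abstract stresses, would come from the regression's standard errors rather than the point estimate alone.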

  7. Linear and Nonlinear Analysis of Magnetic Bearing Bandwidth Due to Eddy Current Limitations

    NASA Technical Reports Server (NTRS)

    Kenny, Andrew; Palazzolo, Alan

    2000-01-01

    Finite element analysis was used to study the bandwidth of Hiperco 50A alloy and silicon iron laminated rotors and stators in magnetic bearings. A three-dimensional model was made of a heteropolar bearing in which all the flux circulated in the plane of the rotor and stator laminates. A three-dimensional model of a plate, similar to the region of a pole near the gap, was also studied with a very fine mesh. Nonlinear time-transient solutions for the net flux carried by the plate were compared to steady-state time-harmonic solutions. Both linear and quasi-nonlinear steady-state time-harmonic solutions were calculated and compared. The finite element solutions for power loss and flux bandwidth were compared to those determined from classical analytical solutions to Maxwell's equations.
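
    The classical analytical benchmark for such eddy-current calculations is the skin depth δ = √(2/(ωμσ)), which bounds how deeply alternating flux penetrates a lamination and thus its flux bandwidth. A sketch with assumed material constants (illustrative, not the paper's values):

```python
import math

def skin_depth(freq_hz, mu_r, sigma):
    """Classical eddy-current skin depth: delta = sqrt(2 / (omega * mu * sigma))."""
    mu0 = 4e-7 * math.pi              # permeability of free space, H/m
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 / (omega * mu_r * mu0 * sigma))

# Assumed, illustrative constants for a high-permeability lamination
sigma = 2.5e6   # conductivity, S/m
mu_r = 5000.0   # relative permeability
for f in (60.0, 1000.0):
    print(f"{f:7.0f} Hz: delta = {skin_depth(f, mu_r, sigma) * 1000:.3f} mm")
```

    Laminations thinner than the skin depth at the highest control frequency keep the net flux, and hence the bearing's bandwidth, from rolling off.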

  8. "USA Today": Comparative Analysis with Two National and Two Los Angeles Daily Newspapers. Research Bulletin.

    ERIC Educational Resources Information Center

    Ames, Steve; And Others

    Sections of the newspaper "USA Today" were compared with corresponding sections of four major newspapers--the "New York Times," the "Wall Street Journal," the "Los Angeles Herald Examiner," and the "Los Angeles Times"--to determine what editorial components made "USA Today" different and…

  9. Evaluation of Two Commercial Systems for Automated Processing, Reading, and Interpretation of Lyme Borreliosis Western Blots

    PubMed Central

    Binnicker, M. J.; Jespersen, D. J.; Harring, J. A.; Rollins, L. O.; Bryant, S. C.; Beito, E. M.

    2008-01-01

    The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis. PMID:18463211

  10. Evaluation of two commercial systems for automated processing, reading, and interpretation of Lyme borreliosis Western blots.

    PubMed

    Binnicker, M J; Jespersen, D J; Harring, J A; Rollins, L O; Bryant, S C; Beito, E M

    2008-07-01

    The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis.

  11. Single-incision versus conventional three-port laparoscopic appendectomy: A meta-analysis of randomized controlled trials.

    PubMed

    Chen, Jiang-Ming; Geng, Wei; Xie, Sheng-Xue; Liu, Fu-Bao; Zhao, Yi-Jun; Yu, Li-Quan; Geng, Xiao-Ping

    2015-01-01

    The aim of this article was to compare the advantages and disadvantages of single-incision laparoscopic appendectomy (SILA) and conventional three-port laparoscopic appendectomy (CTLA). A meta-analysis was performed by analyzing all randomized controlled trials (RCTs) published in English that compared SILA and CTLA for appendicitis in adults and children. These studies compared the two methods from different angles, including outcomes of interest, patient characteristics, operative time, pain visual analogue scale scores (VAS scores), length of hospital stay, time to return to full activity, resumption of diet, postoperative complications and cosmetic results. The risk ratios (RR) and mean differences (MD) with 95% confidence intervals (CIs) were employed to assess the outcomes. Seven recent RCTs encompassing 1170 patients (586 SILA and 584 CTLA cases) were included in this meta-analysis. The pooled results demonstrated that conversion rate, drain insertion, reoperation, length of hospital stay, resumption of normal diet and postoperative complications were statistically comparable between the two groups. The pooled mean difference in postoperative abdominal pain within 24 h was -0.57 in favor of the SILA technique (p = 0.05). Compared with CTLA, SILA showed a better cosmetic satisfaction score (SMD, 0.58; 95% CI, 0.32-0.83; p < 0.0001) and a shorter time to recover normal activity (WMD, -0.69; 95% CI, -1.11 to -0.26; p = 0.001). However, SILA had a longer operative time (WMD, 5.38; 95% CI, 2.94-7.83; p < 0.0001). In selected patients, SILA was confirmed to be as safe and effective as CTLA. Despite the longer operative time, SILA offers higher cosmetic satisfaction and a shorter recovery time to normal activity. Due to the limitations of the available data, further research is needed.
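
    Pooled estimates of this kind come from inverse-variance weighting of the per-trial effects. A fixed-effect sketch with hypothetical operative-time differences (the meta-analysis pooled seven RCTs; these numbers are made up for illustration):

```python
import math

def pooled_md(effects, ses):
    """Fixed-effect inverse-variance pooled mean difference with 95% CI."""
    ws = [1.0 / se ** 2 for se in ses]          # weight = 1 / variance
    md = sum(w * e for w, e in zip(ws, effects)) / sum(ws)
    se = math.sqrt(1.0 / sum(ws))
    return md, md - 1.96 * se, md + 1.96 * se

# Hypothetical per-trial operative-time differences (SILA minus CTLA, minutes)
effects = [6.0, 4.5, 5.8, 5.1]
ses = [2.0, 1.5, 2.5, 1.8]
md, lo, hi = pooled_md(effects, ses)
print(f"MD={md:.2f} min, 95% CI [{lo:.2f}, {hi:.2f}]")
```

    A CI that excludes zero, as here, corresponds to a statistically significant pooled difference; a random-effects model would widen the interval when trials are heterogeneous.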

  12. Analysis of SSEM Sensor Data Using BEAM

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Park, Han; James, Mark

    2004-01-01

    A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO-20827), Vol. 26, No. 9 (September 2002), page 32, and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO-21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
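
    The DIAD idea, fitting an autoregressive model to a signal window and flagging the window when its coefficients drift from values learned on nominal data, can be sketched with a second-order Yule-Walker fit. The processes, coefficients, and the 0.3 decision threshold below are all illustrative, not SSME values:

```python
import random

def ar2_coeffs(x):
    """AR(2) coefficients via Yule-Walker from sample autocorrelations."""
    n = len(x)
    mean = sum(x) / n
    den = sum((v - mean) ** 2 for v in x)
    def r(k):
        return sum((x[i] - mean) * (x[i + k] - mean) for i in range(n - k)) / den
    r1, r2 = r(1), r(2)
    a1 = r1 * (1 - r2) / (1 - r1 ** 2)
    a2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
    return a1, a2

random.seed(0)

# "Nominal" training signal: AR(2) process x[t] = 1.2 x[t-1] - 0.5 x[t-2] + noise
x = [0.0, 0.0]
for _ in range(2000):
    x.append(1.2 * x[-1] - 0.5 * x[-2] + random.gauss(0, 1))
baseline = ar2_coeffs(x)

# A test window whose dynamics have changed; the coefficient distance to the
# baseline exceeds a tuned threshold (0.3 here is arbitrary) and is flagged.
y = [0.0, 0.0]
for _ in range(2000):
    y.append(0.5 * y[-1] + 0.2 * y[-2] + random.gauss(0, 1))
drift = ar2_coeffs(y)
dist = ((baseline[0] - drift[0]) ** 2 + (baseline[1] - drift[1]) ** 2) ** 0.5
print("anomaly" if dist > 0.3 else "nominal")
```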

  13. Comparative Analysis of the Clinical Significance of Oscillatory Components in the Rhythmic Structure of Pulse Signal in the Diagnostics of Psychosomatic Disorders in School Age Children.

    PubMed

    Desova, A A; Dorofeyuk, A A; Anokhin, A M

    2017-01-01

    We performed a comparative analysis of the types of spectral density typical of various parameters of the pulse signal. The experimental material was obtained during the examination of school age children with various psychosomatic disorders. We also performed a typological analysis of the spectral density functions corresponding to the time series of different parameters of a single oscillation of pulse signals; the results of their comparative analysis are presented. We determined the most significant spectral components for two disorders in children: arterial hypertension and mitral valve prolapse.

  14. Time dependent analysis of assay comparability: a novel approach to understand intra- and inter-site variability over time

    NASA Astrophysics Data System (ADS)

    Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire

    2015-09-01

    We demonstrate here a novel use of statistical tools to study the intra- and inter-site variability of five early drug metabolism and pharmacokinetics in vitro assays over time. First, a tool for process control is presented. It shows the overall assay variability, allows changes due to assay adjustments to be followed, and can additionally highlight other, potentially unexpected variations. Second, we define the minimum discriminatory difference/ratio to help projects understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3-month periods and followed over time for each assay. Again, assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.

  15. Sonographically guided intrasheath percutaneous release of the first annular pulley for trigger digits, part 2: randomized comparative study of the economic impact of 3 surgical models.

    PubMed

    Rojo-Manaute, Jose Manuel; Capa-Grasa, Alberto; Del Cerro-Gutiérrez, Miguel; Martínez, Manuel Villanueva; Chana-Rodríguez, Francisco; Martín, Javier Vaquero

    2012-03-01

    Trigger digit surgery can be performed by an open approach using classic open surgery, by a wide-awake approach, or by sonographically guided first annular pulley release in day surgery and office-based ambulatory settings. Our goal was to perform a turnover and economic analysis of 3 surgical models. Two studies were conducted. The first was a turnover analysis of 57 patients allocated 4:4:1 into the surgical models: sonographically guided-office-based, classic open-day surgery, and wide-awake-office-based. Regression analysis of turnover time was monitored to assess stability (R² < 0.26). Second, on the basis of turnover times and hospital tariff revenues, we calculated the total costs, income-to-cost ratio, opportunity cost, true cost, true net income (primary variable), break-even points for sonographically guided fixed costs, and a one-way sensitivity analysis to identify thresholds among alternatives. Thirteen sonographically guided-office-based patients were withdrawn because of a learning curve influence. The wide-awake (n = 6) and classic (n = 26) models were compared to the last 25% of the sonographically guided group (n = 12), which showed significantly lower mean turnover times, income-to-cost ratios 2.52 and 10.9 times larger, and true costs 75.48 and 20.92 times lower, respectively. A true net income break-even point happened after 19.78 sonographically guided-office-based procedures. Sensitivity analysis showed a threshold between wide-awake and last 25% sonographically guided true costs if the last 25% sonographically guided turnover times reached 65.23 and 27.81 minutes, respectively. This trial comparing surgical models was underpowered and is inconclusive on turnover times; however, the sonographically guided-office-based approach showed shorter turnover times and better economic results with a quick recoup of the costs of sonographically assisted surgery.

  16. Is Canadian Healthcare Affordable? A Comparative Analysis of the Canadian Healthcare System from 2004 to 2014.

    PubMed

    Soril, Lesley J J; Adams, Ted; Phipps-Taylor, Madeleine; Winblad, Ulrika; Clement, Fiona

    2017-08-01

    To compare cost-related non-adherence (CRNA), serious problems paying medical bills and average annual out-of-pocket cost over time in five countries. Repeated cross-sectional analysis of the Commonwealth Fund International Health Policy survey from 2004 to 2014. Responses were compared between Canada, the UK, Australia, New Zealand and the US. Compared to the UK, respondents in Canada, Australia and New Zealand were two to three times and respondents in the US were eight times more likely to experience CRNA; these odds remained stable over time. From 2004 to 2014, Canadian respondents paid US $852-1,767 out-of-pocket for care. The US reported the largest risks of serious problems paying for care (13-18.5%), highest out-of-pocket costs (US $2,060-3,319) and greatest rise in expenditures. Over the 10-year period, financial barriers to care were identified in Canada and internationally. Such persistent challenges are of great concern to countries striving for equitable access to healthcare. Copyright © 2017 Longwoods Publishing.

  17. Combining synthetic controls and interrupted time series analysis to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome. The internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robust evaluation framework that combines the synthetic controls method (SYNTH) to generate a comparable control group and ITSA regression to assess covariate balance and estimate treatment effects. We evaluate the effect of California's Proposition 99 for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. SYNTH is used to reweight nontreated units to make them comparable to the treated unit. These weights are then used in ITSA regression models to assess covariate balance and estimate treatment effects. Covariate balance was achieved for all but one covariate. While California experienced a significant decrease in the annual trend of cigarette sales after Proposition 99, there was no statistically significant treatment effect when compared to synthetic controls. The advantage of using this framework over regression alone is that it ensures that a comparable control group is generated. Additionally, it offers a common set of statistical measures familiar to investigators, the capability for assessing covariate balance, and enhancement of the evaluation with a comprehensive set of postestimation measures. Therefore, this robust framework should be considered as a primary approach for evaluating treatment effects in multiple group time series analysis. © 2018 John Wiley & Sons, Ltd.

  18. Evaluation of the microsoft kinect skeletal versus depth data analysis for timed-up and go and figure of 8 walk tests.

    PubMed

    Hotrabhavananda, Benjamin; Mishra, Anup K; Skubic, Marjorie; Hotrabhavananda, Nijaporn; Abbott, Carmen

    2016-08-01

    We compared the performance of the Kinect skeletal data with the Kinect depth data in capturing different gait parameters during the Timed-Up and Go Test (TUG) and Figure of 8 Walk Test (F8W). The gait parameters considered were stride length, stride time, and walking speed for the TUG, and number of steps and completion time for the F8W. A marker-based Vicon motion capture system was used for the ground-truth measurements. Five healthy participants were recruited for the experiment and were asked to perform three trials of each task. Results show that depth data analysis yields stride length and stride time measures with significantly lower percentile errors compared to the skeletal data analysis. However, the skeletal and depth data performed similarly, with less than 3% absolute mean percentile error, in determining the walking speed for the TUG and both parameters of the F8W. The results show the potential of Kinect depth data analysis in computing many gait parameters, whereas the Kinect skeletal data can also be used for walking speed in the TUG and the F8W gait parameters.

  19. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    PubMed

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Comparative analysis of the numerical methods for estimation of predictability time of NEAs motion. (Russian Title: Сравнительный анализ численных методов оценивания времени предсказуемости движения АСЗ)

    NASA Astrophysics Data System (ADS)

    Bykova, L. E.; Galushina, T. Yu.; Razdymakhina, O. N.

    2011-07-01

    The paper presents the results of a comparative analysis of different algorithms for determining the predictability time of near-Earth asteroid (NEA) motion. Three algorithms have been considered: the shadow path method, the variation method, and MEGNO analysis, in which the indicator of dynamical chaos is a time-weighted integral of the maximum Lyapunov characteristic exponent. The developed algorithms and software complex have been applied to identify the chaotic motion of some NEAs. It is shown that MEGNO analysis can separate regular and chaotic asteroid motion with sufficient accuracy over relatively short time intervals.
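
    The maximum Lyapunov characteristic exponent that MEGNO integrates can be illustrated on a one-dimensional map, where it is the orbit average of log|f′(x)|: positive for chaotic motion, negative for regular motion. The logistic map here is only a stand-in for the asteroid dynamics:

```python
import math

def lyapunov_logistic(r, x0=0.3, n=100_000, burn=1000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1-2x)|."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n

l_reg = lyapunov_logistic(3.2)    # periodic regime: negative exponent
l_chaos = lyapunov_logistic(4.0)  # chaotic regime: positive, near log 2
print(f"regular {l_reg:+.3f}, chaotic {l_chaos:+.3f}")
```

    The predictability time scales roughly as the inverse of a positive exponent, which is the quantity the compared algorithms estimate for asteroid orbits.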

  1. The change of adjacent segment after cervical disc arthroplasty compared with anterior cervical discectomy and fusion: a meta-analysis of randomized controlled trials.

    PubMed

    Dong, Liang; Xu, Zhengwei; Chen, Xiujin; Wang, Dongqi; Li, Dichen; Liu, Tuanjing; Hao, Dingjun

    2017-10-01

    Many meta-analyses have been performed to study the efficacy of cervical disc arthroplasty (CDA) compared with anterior cervical discectomy and fusion (ACDF); however, there are few data referring to adjacent segment within these meta-analyses, or investigators are unable to arrive at the same conclusion in the few meta-analyses about adjacent segment. With the increased concerns surrounding adjacent segment degeneration (ASDeg) and adjacent segment disease (ASDis) after anterior cervical surgery, it is necessary to perform a comprehensive meta-analysis to analyze adjacent segment parameters. To perform a comprehensive meta-analysis to elaborate adjacent segment motion, degeneration, disease, and reoperation of CDA compared with ACDF. Meta-analysis of randomized controlled trials (RCTs). PubMed, Embase, and Cochrane Library were searched for RCTs comparing CDA and ACDF before May 2016. The analysis parameters included follow-up time, operative segments, adjacent segment motion, ASDeg, ASDis, and adjacent segment reoperation. The risk of bias scale was used to assess the papers. Subgroup analysis and sensitivity analysis were used to analyze the reason for high heterogeneity. Twenty-nine RCTs fulfilled the inclusion criteria. Compared with ACDF, the rate of adjacent segment reoperation in the CDA group was significantly lower (p<.01), and the advantage of that group in reducing adjacent segment reoperation increases with increasing follow-up time by subgroup analysis. There was no statistically significant difference in ASDeg between CDA and ACDF within the 24-month follow-up period; however, the rate of ASDeg in CDA was significantly lower than that of ACDF with the increase in follow-up time (p<.01). There was no statistically significant difference in ASDis between CDA and ACDF (p>.05). Cervical disc arthroplasty provided a lower adjacent segment range of motion (ROM) than did ACDF, but the difference was not statistically significant. 
Compared with ACDF, the advantages of CDA were lower ASDeg and adjacent segment reoperation. However, there was no statistically significant difference in ASDis and adjacent segment ROM. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Accuracy of the Garmin 920 XT HRM to perform HRV analysis.

    PubMed

    Cassirame, Johan; Vanhaesebrouck, Romain; Chevrolat, Simon; Mourot, Laurent

    2017-12-01

    Heart rate variability (HRV) analysis is widely used to investigate autonomic cardiac drive. The method requires an RR-interval time series, which can be obtained from an electrocardiogram (ECG) or from a heart rate monitor (HRM), e.g. the Garmin 920 XT device. The purpose of this investigation was to assess the accuracy of RR time series measurements from a Garmin 920 XT HRM as compared to a standard ECG, and to verify whether the measurements thus obtained are suitable for HRV analysis. RR time series were collected simultaneously with an ECG (Powerlab system, AD Instruments, Castle Hill, Australia) and a Garmin 920 XT in 11 healthy subjects during three conditions, namely the supine position, the standing position and moderate exercise. In a first step, we compared the RR time series obtained with both tools using the Bland-Altman method to obtain the limits of agreement in all three conditions. In a second step, we compared the results of HRV analysis between the ECG RR time series and the Garmin 920 XT series. Results show that the accuracy of this system is in accordance with the literature in terms of the limits of agreement: in the supine position, the bias and limits of agreement were 0.01 (-2.24 to +2.26) ms; in the standing position, -0.01 (-3.12 to +3.11) ms; and during exercise, -0.01 (-4.43 to +4.40) ms. Regarding HRV analysis, we did not find any difference in the supine position, but the standing and exercise conditions both showed small modifications.
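
    The Bland-Altman limits of agreement quoted above are the mean paired difference plus or minus 1.96 times its standard deviation. A sketch with hypothetical RR intervals in milliseconds (not the study's recordings):

```python
import math

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired RR intervals (ms) from an ECG and a heart rate monitor
ecg = [812, 805, 798, 820, 815, 808, 801, 817]
hrm = [810, 806, 799, 818, 816, 806, 802, 815]
bias, lo, hi = bland_altman(ecg, hrm)
print(f"bias={bias:.2f} ms, limits of agreement [{lo:.2f}, {hi:.2f}] ms")
```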

  3. Purification of pharmaceutical preparations using thin-layer chromatography to obtain mass spectra with Direct Analysis in Real Time and accurate mass spectrometry.

    PubMed

    Wood, Jessica L; Steiner, Robert R

    2011-06-01

    Forensic analysis of pharmaceutical preparations requires a comparative analysis with a standard of the suspected drug in order to identify the active ingredient. Analytical standards can be expensive to purchase or unattainable from the drug manufacturers. Direct Analysis in Real Time (DART™) is a novel, ambient ionization technique, typically coupled with a JEOL AccuTOF™ (accurate mass) mass spectrometer. While a fast and easy technique to perform, a drawback of using DART™ is the lack of component separation of mixtures prior to ionization. Various in-house pharmaceutical preparations were purified using thin-layer chromatography (TLC) and mass spectra were subsequently obtained using the AccuTOF™-DART™ technique. Utilizing TLC prior to sample introduction provides a simple, low-cost solution to acquiring mass spectra of the purified preparation. Each spectrum was compared against an in-house molecular formula list to confirm the accurate mass elemental compositions. Spectra of purified ingredients of known pharmaceuticals were added to an in-house library for use as comparators for casework samples. Resolving isomers from one another can be accomplished using collision-induced dissociation after ionization. Challenges arose when the pharmaceutical preparation required an optimized TLC solvent to achieve proper separation and purity of the standard. Purified spectra were obtained for 91 preparations and included in an in-house drug standard library. Primary standards would only need to be purchased when pharmaceutical preparations not previously encountered are submitted for comparative analysis. TLC prior to DART™ analysis demonstrates a time-efficient and cost-saving technique for the forensic drug analysis community. Copyright © 2011 John Wiley & Sons, Ltd.

  4. Ultra wide-band localization and SLAM: a comparative study for mobile robot navigation.

    PubMed

    Segura, Marcelo J; Auat Cheein, Fernando A; Toibero, Juan M; Mut, Vicente; Carelli, Ricardo

    2011-01-01

    In this work, a comparative study between an Ultra Wide-Band (UWB) localization system and a Simultaneous Localization and Mapping (SLAM) algorithm is presented. Due to its high bandwidth and short pulse lengths, UWB potentially allows great accuracy in range measurements based on Time of Arrival (TOA) estimation. SLAM algorithms recursively estimate the map of an environment and the pose (position and orientation) of a mobile robot within that environment. The comparative study presented here involves the performance analysis of implementing, in parallel, a UWB-based localization system and a SLAM algorithm on a mobile robot navigating within an environment. Real-time results as well as an error analysis are also shown in this work.
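The TOA ranging principle the abstract relies on reduces to range = c·t, after which a position fix follows from several anchor ranges by linearized least squares. The anchor layout and timing below are a made-up example, not the paper's experimental setup.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def toa_range(t_seconds: float) -> float:
    """Range implied by a one-way time-of-arrival measurement."""
    return C * t_seconds

def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """2-D least-squares position fix from three or more anchors.

    Subtracting the first anchor's circle equation from the others
    linearizes |p - a_i|^2 = r_i^2 into A p = b.
    """
    a0, r0 = anchors[0], ranges[0]
    A = 2 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # hypothetical
true_pos = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)
est = trilaterate(anchors, ranges)
```

The nanosecond-scale timing of UWB pulses is what makes this workable: at c, 1 ns of timing error is about 30 cm of range error.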

  5. Modified Right Heart Contrast Echocardiography Versus Traditional Method in Diagnosis of Right-to-Left Shunt: A Comparative Study.

    PubMed

    Wang, Yi; Zeng, Jie; Yin, Lixue; Zhang, Mei; Hou, Dailun

    2016-01-01

    The purpose of this study was to evaluate the reliability, effectiveness, and safety of modified right heart contrast transthoracic echocardiography (cTTE) in comparison with the traditional method. We performed modified right heart cTTE using saline mixed with a small sample of the patient's own blood; samples were agitated with varying intensity. The study protocol involved microscopic analysis and patient evaluation. 1. Microscopic analysis: after two contrast samples had been agitated 10 or 20 times, they underwent a comparison of bubble size, bubble number, and red blood cell morphology. 2. Patient analysis: 40 patients with suspected right-to-left shunt (RLS) were enrolled. All patients underwent right heart contrast echocardiography. Oxygen saturation, transit time and duration, presence of RLS, and changes in indirect bilirubin and urobilinogen concentrations were compared afterward. The modified method generated more bubbles (P<0.05), but the differences in bubble size were not significant (P>0.05). Twenty-four patients were diagnosed with RLS (60%) using the modified method, compared to 16 patients (40%) with the traditional method. The transit time of the ASb20 group was the shortest (P<0.05); however, the duration time in this group was much longer (P<0.05). Also, in semi-quantitative analysis the mean rank of RLS was higher after injecting the modified contrast agent agitated 20 times (P<0.05). Modified right heart contrast echocardiography is a reliable, effective and safe method of detecting cardiovascular RLS.

  6. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    PubMed

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis for the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, which was significantly shorter: 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. 
This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
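The primary outcome above was compared with the Wilcoxon rank sum test. A minimal sketch of that test (normal approximation, average ranks for ties), run on illustrative chair-time figures rather than the study's data:

```python
import math

def rank_sum_test(x, y):
    """Wilcoxon rank-sum z statistic and two-sided p (normal approximation)."""
    data = sorted([(v, "x") for v in x] + [(v, "y") for v in y])
    ranks = [0.0] * len(data)
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j][0] == data[i][0]:
            j += 1
        for k in range(i, j):          # tied values share the average rank
            ranks[k] = (i + j + 1) / 2
        i = j
    w = sum(r for r, (_, g) in zip(ranks, data) if g == "x")
    n1, n2 = len(x), len(y)
    mean = n1 * (n1 + n2 + 1) / 2
    var = n1 * n2 * (n1 + n2 + 1) / 12
    z = (w - mean) / math.sqrt(var)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

digital = [25, 27, 28, 30, 26]        # hypothetical chair times, minutes
conventional = [32, 35, 33, 31, 36]
z, p = rank_sum_test(digital, conventional)
```

For samples this small an exact test would normally be preferred; the normal approximation keeps the sketch short.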

  7. [Surgery for colorectal cancer since the introduction of the Netherlands national screening programme. Investigations into changes in number of resections and waiting times for surgery].

    PubMed

    de Neree Tot Babberich, M P M; van der Willik, E M; van Groningen, J T; Ledeboer, M; Wiggers, T; Wouters, M W J M

    2017-01-01

    To investigate the impact of the Netherlands national colorectal cancer screening programme on the number of surgical resections for colorectal carcinoma and on waiting times for surgery. Descriptive study. Data were extracted from the Dutch Surgical Colorectal Audit. Patients with primary colorectal cancer surgery between 2011-2015 were included. The volume and median waiting times for the years 2011-2015 are described. Waiting times from first tumor-positive biopsy until the operation (biopsy-operation) and from first preoperative visit to the surgeon until the operation (visit-operation) were analyzed with univariate and multivariate linear regression. Separate analyses of visit-operation were done for academic and non-academic hospitals, and for screening compared to non-screening patients. In 2014 there was an increase of 1469 (15%) patients compared to 2013; in 2015 the increase was 1168 (11%) patients compared to 2014. In 2014 and 2015, 1359 (12%) and 3111 (26%) patients, respectively, were referred to the surgeon through screening. The median waiting time biopsy-operation decreased significantly (β: 0.94, 95% CI) over the years 2014-2015 compared to 2011-2013. In non-academic hospitals, the waiting time visit-operation also decreased significantly (β: 0.89, 95% CI 0.87-0.90) over the years 2014-2015 compared to 2011-2013. No difference in waiting times was found between patients referred to the surgeon through screening and non-screening patients. There has been a clear increase in volume since the introduction of the colorectal cancer screening programme, without an increase in waiting time until surgery.

  8. Clinical Evaluation of Dental Restorative Materials

    DTIC Science & Technology

    1989-01-01

    use of an Actuarial Life Table Survival Analysis procedure. The median survival time for anterior composites was 13.5 years, as compared to 12.1 years...dental materials. For the first time in clinical biomaterials research, we used a statistical approach of Survival Analysis which utilized the... analysis has been established to assure uniformity in usage. This scale is now in use by clinical investigators throughout the country. Its use at the

  9. Comparison among Reconstruction Algorithms for Quantitative Analysis of 11C-Acetate Cardiac PET Imaging.

    PubMed

    Shi, Ximin; Li, Nan; Ding, Haiyan; Dang, Yonghong; Hu, Guilan; Liu, Shuai; Cui, Jie; Zhang, Yue; Li, Fang; Zhang, Hui; Huo, Li

    2018-01-01

    Kinetic modeling of dynamic 11C-acetate PET imaging provides quantitative information for myocardium assessment. The quality and quantitation of PET images are known to depend on the PET reconstruction method. This study aims to investigate the impact of reconstruction algorithms on the quantitative analysis of dynamic 11C-acetate cardiac PET imaging. Patients with suspected alcoholic cardiomyopathy (N = 24) underwent 11C-acetate dynamic PET imaging after a low-dose CT scan. PET images were reconstructed using four algorithms: filtered backprojection (FBP), ordered subsets expectation maximization (OSEM), OSEM with time-of-flight (TOF), and OSEM with both time-of-flight and point-spread-function modeling (TPSF). Standardized uptake values (SUVs) at different time points were compared among images reconstructed using the four algorithms. Time-activity curves (TACs) in the myocardium and the blood pools of the ventricles were generated from the dynamic image series. Kinetic parameters K1 and k2 were derived using a 1-tissue-compartment model for kinetic modeling of cardiac flow from 11C-acetate PET images. Significant image quality improvement was found in the images reconstructed using the iterative OSEM-type algorithms (OSEM, TOF, and TPSF) compared with FBP. However, no statistical differences in SUVs were observed among the four reconstruction methods at the selected time points. The kinetic parameters K1 and k2 also exhibited no statistical difference among the four reconstruction algorithms in terms of mean value and standard deviation. In the correlation analysis, however, OSEM reconstruction presented relatively higher residuals in correlation with FBP reconstruction compared with TOF and TPSF reconstruction, and the TOF and TPSF reconstructions were highly correlated with each other. All the tested reconstruction algorithms performed similarly for quantitative analysis of 11C-acetate cardiac PET imaging. TOF and TPSF yielded highly consistent kinetic parameter results, with superior image quality compared with FBP; OSEM was relatively less reliable. Both TOF and TPSF are recommended for cardiac 11C-acetate kinetic analysis.
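The 1-tissue-compartment model from which K1 and k2 are derived can be sketched as a forward simulation of dCt/dt = K1·Cp(t) − k2·Ct(t) with simple Euler integration. The plasma input function and rate constants below are illustrative assumptions, not patient-derived values.

```python
import numpy as np

def one_tissue_tac(t, cp, K1, k2):
    """Tissue time-activity curve from dCt/dt = K1*Cp(t) - k2*Ct(t)."""
    ct = np.zeros_like(t)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        ct[i] = ct[i - 1] + dt * (K1 * cp[i - 1] - k2 * ct[i - 1])
    return ct

t = np.linspace(0.0, 10.0, 1001)   # minutes
cp = np.exp(-0.3 * t)              # assumed mono-exponential plasma input
tac = one_tissue_tac(t, cp, K1=0.8, k2=0.2)
```

Fitting K1 and k2 in practice runs this model inside a nonlinear least-squares loop against the measured myocardial TAC, with the blood-pool TAC as the input function.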

  10. Assessment of the usefulness of the standardized uptake values and the radioactivity levels for the preoperative diagnosis of thyroid cancer measured by using 18F-FDG PET/CT dual-time-point imaging

    NASA Astrophysics Data System (ADS)

    Lee, Hyeon-Guck; Hong, Seong-Jong; Cho, Jae-Hwan; Han, Man-Seok; Kim, Tae-Hyung; Lee, Ik-Han

    2013-02-01

    The purpose of this study was to assess and compare the changes in the SUV (standardized uptake value), the 18F-FDG (18F-fluorodeoxyglucose) uptake pattern, and the radioactivity level for the diagnosis of thyroid cancer via dual-time-point 18F-FDG PET/CT (positron emission tomography/computed tomography) imaging. Moreover, the study aimed to verify the usefulness and significance of SUV values and radioactivity levels in discriminating tumor malignancy. A retrospective analysis was performed on 40 patients who received 18F-FDG PET/CT for thyroid cancer as a primary tumor. To set the background, we compared changes in values by calculating the dispersion of scattered rays in the neck area and the lung apex, and by comparing the mean and SD (standard deviation) values of the maxSUV and the radioactivity levels. According to the statistical analysis of the changes in 18F-FDG uptake for the diagnosis of thyroid cancer, a high similarity was observed between the SUVs and the radioactivity levels, with a coefficient of determination of R2 = 0.939. Moreover, similar results were observed in the assessment of tumor malignancy using dual-time-point imaging. The quantitative analysis method for assessing tumor malignancy using radioactivity levels was neither specific nor discriminative compared to the semi-quantitative analysis method.
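The SUV used in dual-time-point analysis is the tissue activity concentration normalized by injected dose per unit body weight. A minimal sketch with illustrative numbers (not the study's measurements):

```python
def suv(activity_conc_bq_per_ml: float, injected_dose_bq: float,
        body_weight_g: float) -> float:
    """SUV = tissue activity concentration / (injected dose / body weight),
    assuming tissue density of 1 g/ml so the result is dimensionless."""
    return activity_conc_bq_per_ml / (injected_dose_bq / body_weight_g)

# Illustrative: 5 kBq/ml lesion uptake, 370 MBq injected, 70 kg patient
val = suv(5_000.0, 370e6, 70_000.0)
```

Dual-time-point protocols then compare the SUV at the early and delayed scans; a rising SUV between the two time points is conventionally read as suggestive of malignancy.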

  11. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating that the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on the FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human-in-the-Loop (HITL) studies of SATS HVO and baseline operations.

  12. Three list scheduling temporal partitioning algorithm of time space characteristic analysis and compare for dynamic reconfigurable computing

    NASA Astrophysics Data System (ADS)

    Chen, Naijin

    2013-03-01

    Level Based Partitioning (LBP), Cluster Based Partitioning (CBP) and Enhanced Static List (ESL) temporal partitioning algorithms, based on the adjacency matrix and the adjacency table, are designed and implemented in this paper, and the partitioning time and memory occupation of the three algorithms are compared. Experimental results show that the LBP algorithm has the shortest partitioning time and better parallelism; as far as memory occupation and partitioning time are concerned, the algorithms based on the adjacency table need less partitioning time and less memory.
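The "level based" idea behind LBP can be sketched as an ASAP levelization of the task DAG: each node is assigned the earliest level at which all of its predecessors are already placed, and each level then forms a temporal partition. The graph below is an illustrative example, not one of the paper's benchmarks.

```python
from collections import defaultdict, deque

def levelize(edges, nodes):
    """Return {node: level} for a DAG given as an (u, v) edge list,
    using Kahn-style topological traversal (adjacency-list storage)."""
    succ = defaultdict(list)
    indeg = {n: 0 for n in nodes}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    level = {n: 0 for n in nodes}
    q = deque(n for n in nodes if indeg[n] == 0)
    while q:
        u = q.popleft()
        for v in succ[u]:
            level[v] = max(level[v], level[u] + 1)  # after all predecessors
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return level

edges = [("a", "c"), ("b", "c"), ("c", "d"), ("b", "d")]
levels = levelize(edges, ["a", "b", "c", "d"])
```

The adjacency-list (`defaultdict(list)`) storage used here mirrors the paper's "adjacency table" variant: memory grows with the number of edges rather than with the square of the node count, which is the source of the reported memory advantage for sparse task graphs.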

  13. Economics of adopting solar photovoltaic energy systems in irrigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matlin, R.W.; Katzman, M.T.

    An economic analysis of adopting solar photovoltaic energy systems for irrigation, as compared with conventional fossil-fuel energy sources, has been made. The basis for this analysis is presented, along with a discussion of the time of initial profitability, the time of optimal investment, the effects of the tax system, the cost per acre that would make irrigation unviable, and possible governmental incentives that would promote the deployment of photovoltaic irrigation systems between the time of initial profitability and the time of optimal investment.

  14. Comparing sequencing assays and human-machine analyses in actionable genomics for glioblastoma.

    PubMed

    Wrzeszczynski, Kazimierz O; Frank, Mayu O; Koyama, Takahiko; Rhrissorrakrai, Kahn; Robine, Nicolas; Utro, Filippo; Emde, Anne-Katrin; Chen, Bo-Juen; Arora, Kanika; Shah, Minita; Vacic, Vladimir; Norel, Raquel; Bilal, Erhan; Bergmann, Ewa A; Moore Vogel, Julia L; Bruce, Jeffrey N; Lassman, Andrew B; Canoll, Peter; Grommes, Christian; Harvey, Steve; Parida, Laxmi; Michelini, Vanessa V; Zody, Michael C; Jobanputra, Vaidehi; Royyuru, Ajay K; Darnell, Robert B

    2017-08-01

    To analyze a glioblastoma tumor specimen with 3 different platforms and compare potentially actionable calls from each. Tumor DNA was analyzed by a commercial targeted panel. In addition, tumor-normal DNA was analyzed by whole-genome sequencing (WGS) and tumor RNA was analyzed by RNA sequencing (RNA-seq). The WGS and RNA-seq data were analyzed by a team of bioinformaticians and cancer oncologists, and separately by IBM Watson Genomic Analytics (WGA), an automated system for prioritizing somatic variants and identifying drugs. More variants were identified by WGS/RNA analysis than by targeted panels. WGA completed a comparable analysis in a fraction of the time required by the human analysts. The development of an effective human-machine interface in the analysis of deep cancer genomic datasets may provide potentially clinically actionable calls for individual patients in a more timely and efficient manner than currently possible. NCT02725684.

  15. Le futur linguistique: temps lineaire ou temps ramifie (The Linguistic Future: Linear or Branching Time)?

    ERIC Educational Resources Information Center

    Martin, Robert

    1981-01-01

    Discusses the problems posed by a semantic analysis of the future tense in French, addressing particularly its double use as a tense and as a mood. The distinction between linear and branching time, or, certainty and possibility, central to this discussion, leads to a comparative analysis of future and conditional. (MES)

  16. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer-generated data representative of 16 Autoregressive Integrated Moving Average (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (1,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)
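The interrupted time-series design being compared can be sketched by simulating a (1,0,0) — i.e. AR(1) — process with a level shift at the interruption, then estimating the shift with an intervention dummy. This ignores the ARIMA error modeling the paper evaluates; it only illustrates the design, with made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(n, phi, shift_at, shift):
    """AR(1) series with a step-change (intervention) at index shift_at."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal(scale=0.5)
    y[shift_at:] += shift
    return y

n, cut = 200, 100
y = simulate_ar1(n, phi=0.4, shift_at=cut, shift=3.0)

# OLS with an intercept and a post-interruption dummy; beta[1] estimates
# the level shift (standard errors would need the AR error structure).
X = np.column_stack([np.ones(n), (np.arange(n) >= cut).astype(float)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The point of the paper's comparison is precisely that naive OLS inference is misleading when the AR structure is misidentified, which is why the assumed (1,0,0) and (3,0,0) models are tested against the known identification.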

  17. Analysis of Time-on-Task, Behavior Experiences, and Performance in Two Online Courses with Different Authentic Learning Tasks

    ERIC Educational Resources Information Center

    Park, Sanghoon

    2017-01-01

    This paper reports the findings of a comparative analysis of online learner behavioral interactions, time-on-task, attendance, and performance at different points throughout a semester (beginning, during, and end) based on two online courses: one course offering authentic discussion-based learning activities and the other course offering authentic…

  18. Comparing the performance of FA, DFA and DMA using different synthetic long-range correlated time series

    PubMed Central

    Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier

    2012-01-01

    Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on what is the best method and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when using different generators of LRC time series. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
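A compact sketch of Detrended Fluctuation Analysis (DFA), one of the estimators compared above: integrate the series, fit and remove a linear trend in windows of each scale, and read the scaling exponent off the log-log slope of the fluctuation function. For uncorrelated (white) noise the exponent should be close to 0.5; the scales and sample size are illustrative choices.

```python
import numpy as np

def dfa_exponent(x, scales):
    """DFA(1) scaling exponent of series x over the given window sizes."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    # slope of log F(s) vs log s is the DFA exponent
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(42)
alpha = dfa_exponent(rng.normal(size=20000), scales=[16, 32, 64, 128, 256])
```

The CDMA estimator favored by the paper differs mainly in the detrending step, replacing the per-window polynomial fit with a centered moving average.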

  19. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for the differential diagnosis of breast lesions, using a new qualitative analysis (the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Areas under the receiver operating characteristic curve (AUROC) were calculated to evaluate the diagnostic performance of both qualitative and quantitative analyses for the differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis obtained the best diagnostic performance in this study (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.
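The AUROC reported for each SWS parameter can be computed directly from ranks (the Mann-Whitney form): the probability that a randomly chosen positive case scores above a randomly chosen negative case. The shear-wave speeds below are hypothetical, not the study's measurements.

```python
import numpy as np

def auroc(scores_pos, scores_neg):
    """P(random positive scores above random negative); ties count 1/2."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

malignant = [6.1, 7.3, 5.8, 8.0]   # hypothetical SWS-max values, m/s
benign = [3.2, 4.1, 5.9, 2.8]
auc = auroc(malignant, benign)
```

This pairwise form makes the interpretation of the paper's 0.805 concrete: in roughly four of five malignant-benign pairs, the malignant lesion had the higher SWS-max.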

  20. Racial Earnings Differentials and Performance Pay

    ERIC Educational Resources Information Center

    Heywood, John S.; O'Halloran, Patrick L.

    2005-01-01

    A comparative analysis of output-based pay and time-rate pay is presented. It is observed that racial or gender earnings discrimination is more likely under time-rate pay and supervisory evaluations than under output-based pay.

  1. Improving Reports Turnaround Time: An Essential Healthcare Quality Dimension.

    PubMed

    Khan, Mustafa; Khalid, Parwaiz; Al-Said, Youssef; Cupler, Edward; Almorsy, Lamia; Khalifa, Mohamed

    2016-01-01

    Turnaround time is one of the most important healthcare performance indicators. King Faisal Specialist Hospital and Research Center in Jeddah, Saudi Arabia worked on reducing the reports turnaround time of its neurophysiology lab from more than two weeks to only five working days for 90% of cases. The main quality improvement methodology used was FOCUS PDCA. Using root cause analysis, Pareto analysis and qualitative survey methods, the main factors contributing to the delay in turnaround time and the suggested improvement strategies were identified and implemented: restructuring transcriptionists' daily tasks, rescheduling physicians' time and alerting them to new reports, engaging consultants, consistent coordination, and prioritizing critical reports. After implementation, 92% of reports were verified within 5 days, compared to only 6% before implementation; 7% of reports were verified in 5 days to 2 weeks, and only 1% of reports needed more than 2 weeks, compared to 76% before implementation.

  2. Periods of High Intensity Solar Proton Flux

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Adams, James H.; Dietrich, William F.

    2012-01-01

    An analysis is presented of the times during a space mission when specified solar proton flux levels are exceeded, including both the total time and continuous time periods during missions. Results for the solar maximum and solar minimum phases of the solar cycle are presented and compared for a broad range of proton energies and shielding levels. This type of approach is more amenable to reliability analysis for spacecraft systems and instrumentation than standard statistical models.

  3. The fluoroscopy time, door to balloon time, contrast volume use and prevalence of vascular access site failure with transradial versus transfemoral approach in ST segment elevation myocardial infarction: A systematic review & meta-analysis.

    PubMed

    Singh, Sukhchain; Singh, Mukesh; Grewal, Navsheen; Khosla, Sandeep

    2015-12-01

    The authors aimed to conduct the first systematic review and meta-analysis in STEMI patients evaluating the vascular access site failure rate, fluoroscopy time, door-to-balloon time and contrast volume used with the transradial versus transfemoral approach (TRA vs TFA) for PCI. The PubMed, CINAHL, clinicaltrials.gov, Embase and CENTRAL databases were searched for randomized trials comparing TRA versus TFA. Random effect models were used to conduct this meta-analysis. Fourteen randomized trials comprising 3758 patients met the inclusion criteria. The access site failure rate was significantly higher with TRA compared to TFA (RR 3.30, CI 2.16-5.03; P=0.000). Random effect inverse variance weighted prevalence rate meta-analysis showed that the access site failure rate was predicted to be 4% (95% CI 3.0-6.0%) with TRA versus 1% (95% CI 0.0-1.0%) with TFA. Door-to-balloon time (standardized mean difference [SMD] 0.30 min, 95% CI 0.23-0.37 min; P=0.000) and fluoroscopy time (SMD 0.14 min, 95% CI 0.06-0.23 min; P=0.001) were also significantly higher with TRA. There was no difference in the amount of contrast volume used with TRA versus TFA (SMD -0.05 ml, 95% CI -0.14 to 0.04 ml; P=0.275). Statistical heterogeneity was low in the cross-over rate and contrast volume comparisons, moderate in fluoroscopy time, but high in the door-to-balloon time comparison. Operators need to consider the higher cross-over rate with TRA compared to TFA in STEMI patients while attempting PCI. Fluoroscopy and door-to-balloon times are negligibly higher with TRA, but there is no difference in terms of contrast volume used. Copyright © 2015 Elsevier Inc. All rights reserved.
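The random-effects pooling used in reviews like this one is commonly the DerSimonian-Laird estimator: study effects are weighted by inverse variance inflated by a between-study variance term tau². A minimal sketch over illustrative effect sizes and variances (not the trials' actual data):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate and tau^2 (between-study variance)."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                               # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)          # fixed-effect pooled mean
    q = np.sum(w * (y - y_fe) ** 2)           # Cochran's Q heterogeneity
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)             # truncate at zero
    w_re = 1.0 / (v + tau2)                   # random-effects weights
    return np.sum(w_re * y) / np.sum(w_re), tau2

# Hypothetical standardized mean differences and their variances
pooled, tau2 = dersimonian_laird([0.30, 0.10, 0.25, 0.40],
                                 [0.01, 0.02, 0.015, 0.01])
```

When tau² is zero the estimate collapses to the fixed-effect result; large tau² (high heterogeneity, as reported for door-to-balloon time above) pulls the weights toward equality across studies.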

  4. Time-localized frequency analysis of ultrasonic guided waves for nondestructive testing

    NASA Astrophysics Data System (ADS)

    Shin, Hyeon Jae; Song, Sung-Jin

    2000-05-01

    A time-localized frequency (TLF) analysis is employed for guided wave mode identification and improved guided wave applications. To analyze the time-localized frequency content of digitized ultrasonic signals, TLF analysis consists of splitting the time-domain signal into overlapping segments, weighting each with a Hanning window, and forming the columns of discrete Fourier transforms. The result is presented as a frequency-versus-time diagram showing frequency variation along the signal arrival time. To demonstrate the utility of TLF analysis, an experimental group velocity dispersion pattern obtained by TLF analysis is compared with the dispersion diagram obtained from the theory of elasticity. The sample piping is carbon steel piping used for the underground transportation of natural gas. Guided wave propagation characteristics of the piping are considered with TLF analysis and wave structure concepts. TLF analysis is used for the detection of simulated corrosion defects and the assessment of weld joints using ultrasonic guided waves. TLF analysis has revealed that the difficulty of mode identification in multi-mode propagation can be overcome. The group velocity dispersion pattern obtained by TLF analysis agrees well with theoretical results.
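The TLF procedure described above (overlapping segments, Hanning window, per-segment DFT) is essentially a short-time Fourier transform. A minimal sketch on a synthetic two-tone signal, standing in crudely for a dispersive arrival; segment length, overlap, and the signal itself are illustrative choices.

```python
import numpy as np

def tlf_map(x, fs, seg_len=256, overlap=128):
    """Return (times, freqs, magnitudes) of a windowed, overlapped DFT."""
    window = np.hanning(seg_len)
    step = seg_len - overlap
    starts = range(0, len(x) - seg_len + 1, step)
    mags = np.array([np.abs(np.fft.rfft(x[s:s + seg_len] * window))
                     for s in starts])
    times = np.array([(s + seg_len / 2) / fs for s in starts])
    freqs = np.fft.rfftfreq(seg_len, d=1 / fs)
    return times, freqs, mags

fs = 1000.0
t = np.arange(0.0, 1.0, 1 / fs)
# 100 Hz in the first half, 200 Hz in the second half of the record
x = np.where(t < 0.5, np.sin(2 * np.pi * 100 * t),
             np.sin(2 * np.pi * 200 * t))
times, freqs, mags = tlf_map(x, fs)
```

Plotting `mags` against `times` and `freqs` gives the frequency-versus-arrival-time diagram from which the group velocity dispersion pattern is read.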

  5. [Dealing with Waiting Times in Health Systems - An International Comparative Overview].

    PubMed

    Finkenstädt, V

    2015-10-01

    Waiting times in the health system are a form of rationing that exists in many countries. Previous studies on this topic mainly relate to the problem of the international comparability of waiting times, or to the presentation of national strategies for reducing them. This review adds to that analysis by examining how the OECD countries deal with waiting times in the health-care system and investigating what information about waiting is published, and for what purpose. Furthermore, waiting times and the type of health system financing are compared. A systematic internet search on waiting times in the health-care system was conducted on the websites of the competent authorities (ministries of health or other authorities and institutions). The identified publications were then examined to determine the purpose for which they were deployed. Finally, the OECD Health Data were analysed to determine the relationship between tax and contribution financing of public health-care expenditure. The primary form of financing was compared with the results of the waiting time analysis. 16 OECD countries were identified that officially collect and publish administrative data on waiting times on the internet. The data are processed differently depending on the country. By providing this information, two main objectives are pursued: public monitoring of waiting times in the health system (14 countries) and information for patients on waiting times (9 countries). Official statistics on waiting times exist mainly in countries with tax-financed health systems, whereas this is not the case in the majority of OECD countries with health systems funded through contributions. The publication of administrative waiting times data is primarily intended to inform patients and to serve as a performance indicator of access to health care. Even if data on waiting times are published, the publication of indicators and the management of waiting lists alone will not solve the problem. 
Rather, the analysis shows that in tax-funded health systems access to medical care is frequently rationed and the demand side is often regulated by waiting lists. © Georg Thieme Verlag KG Stuttgart · New York.

  6. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
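One of the numerical indicators named above, the Modal Assurance Criterion (MAC), compares mode-shape vectors between two models as a normalized squared inner product: values near 1 indicate the same shape up to scaling. The small shape vectors below are illustrative, not from any launch-vehicle model.

```python
import numpy as np

def mac(phi_a: np.ndarray, phi_b: np.ndarray) -> np.ndarray:
    """MAC matrix for two mode-shape sets (columns are modes):
    MAC[i, j] = |phi_a_i . phi_b_j|^2 / (|phi_a_i|^2 * |phi_b_j|^2)."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a ** 2, axis=0), np.sum(phi_b ** 2, axis=0))
    return num / den

# Two modes at three DOFs; model B's second mode is a scaled copy of A's,
# so its MAC entry should be exactly 1.
phi_a = np.array([[1.0, 0.5],
                  [2.0, -1.0],
                  [1.0, 0.5]])
phi_b = np.array([[1.0, 1.0],
                  [2.1, -2.0],
                  [0.9, 1.0]])
m = mac(phi_a, phi_b)
```

Mode tracking then reduces to pairing each mode of one model with the column of the MAC matrix where its row attains its maximum, which is what the adaptive tracking algorithm automates for large model sets.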

  7. A cost analysis of introducing an infectious disease specialist-guided antimicrobial stewardship in an area with relatively low prevalence of antimicrobial resistance.

    PubMed

    Lanbeck, Peter; Ragnarson Tennvall, Gunnel; Resman, Fredrik

    2016-07-27

    Antimicrobial stewardship programs have been widely introduced in hospitals as a response to increasing antimicrobial resistance. Although such programs are commonly used, the long-term effects on antimicrobial resistance as well as the societal economics are uncertain. We performed a cost analysis of an antimicrobial stewardship program introduced in Malmö, Sweden during 20 weeks of 2013, compared with a corresponding control period in 2012. All direct costs and opportunity costs related to the stewardship intervention were calculated for both periods. Costs during the stewardship period were directly compared to costs in the control period and extrapolated to a yearly cost. Two main analyses were performed, one including only comparable direct costs (analysis one) and one including comparable direct and opportunity costs (analysis two). An extra analysis including all comparable direct costs, including costs related to length of hospital stay (analysis three), was performed but deemed unrepresentative. According to analysis one, the cost per year was SEK 161 990; in analysis two the cost per year was SEK 5 113. Since the two cohorts were skewed in terms of size and infection severity as a consequence of the program, and since short-term patient outcomes have been demonstrated to be unchanged by the intervention, the costs pertaining to patient outcomes were not included in the analysis, and we suggest that analysis two provides the most correct cost calculation. In this analysis, the main cost drivers were physician time and nursing time. A sensitivity analysis of analysis two suggested relatively modest variation under changing assumptions. The total yearly cost of introducing an infectious disease specialist-guided, audit-based antimicrobial stewardship program in a department of internal medicine, including direct costs and opportunity costs, was calculated to be as low as SEK 5 113.

  8. ALASKAN RTMA GRAPHICS

    Science.gov Websites

    Alaskan RTMA Graphics: this page displays Alaskan Real-Time Mesoscale Analyses. DISCLAIMER: The Alaskan Real-Time Mesoscale Analysis tool is in its developmental stage.

  9. Ultra Wide-Band Localization and SLAM: A Comparative Study for Mobile Robot Navigation

    PubMed Central

    Segura, Marcelo J.; Auat Cheein, Fernando A.; Toibero, Juan M.; Mut, Vicente; Carelli, Ricardo

    2011-01-01

    In this work, a comparative study between an Ultra Wide-Band (UWB) localization system and a Simultaneous Localization and Mapping (SLAM) algorithm is presented. Due to its high bandwidth and short pulse lengths, UWB potentially allows great accuracy in range measurements based on Time of Arrival (TOA) estimation. SLAM algorithms recursively estimate the map of an environment and the pose (position and orientation) of a mobile robot within that environment. The comparative study presented here involves the performance analysis of implementing, in parallel, a UWB localization-based system and a SLAM algorithm on a mobile robot navigating within an environment. Real-time results as well as an error analysis are also shown in this work. PMID:22319397

  10. Induction of osteoporosis with its influence on osteoporotic determinants and their interrelationships in rats by DEXA.

    PubMed

    Heiss, Christian; Govindarajan, Parameswari; Schlewitz, Gudrun; Hemdan, Nasr Y A; Schliefke, Nathalie; Alt, Volker; Thormann, Ulrich; Lips, Katrin Susanne; Wenisch, Sabine; Langheinrich, Alexander C; Zahner, Daniel; Schnettler, Reinhard

    2012-06-01

    As women are the population most affected by multifactorial osteoporosis, research is focused on unraveling the underlying mechanism of osteoporosis induction in rats by combining ovariectomy (OVX) either with calcium, phosphorus, vitamin C and vitamin D2/D3 deficiency, or with administration of a glucocorticoid (dexamethasone). Different skeletal sites of sham, OVX-Diet and OVX-Steroid rats were analyzed by Dual Energy X-ray Absorptiometry (DEXA) at time points of 0, 4 and 12 weeks to determine and compare osteoporotic factors such as bone mineral density (BMD), bone mineral content (BMC), area, body weight and percent fat among the different groups and time points. Comparative analyses and interrelationships among osteoporotic determinants were also determined by regression analysis. T scores were below −2.5 in OVX-Diet rats at 4 and 12 weeks post-OVX. OVX-Diet rats revealed a more pronounced osteoporotic status, with lower BMD and BMC than their steroid counterparts, with the spine and pelvis as the most affected skeletal sites. An increase in percent fat was observed irrespective of the osteoporosis inducer applied. Comparative analyses and interrelationships between osteoporotic determinants, which are rarely studied in animals, indicate the necessity of analyzing BMC and area along with BMD to obtain meaningful information leading to proper prediction of the probability of osteoporotic fractures. The enhanced osteoporotic effect observed in OVX-Diet rats indicates that estrogen dysregulation combined with the diet treatment induces and enhances osteoporosis over time when compared to the steroid group. Comparative and regression analyses indicate the need to determine BMC along with BMD and area in osteoporotic determination.

  11. An Analysis on a Dynamic Amplifier and Calibration Methods for a Pseudo-Differential Dynamic Comparator

    NASA Astrophysics Data System (ADS)

    Paik, Daehwa; Miyahara, Masaya; Matsuzawa, Akira

    This paper analyzes a pseudo-differential dynamic comparator with a dynamic pre-amplifier. The transient gain of a dynamic pre-amplifier is derived and applied to equations for the thermal noise and the regeneration time of a comparator. This analysis enhances understanding of the roles of the transistor parameters in the pre-amplifier's gain. Based on the calculated gain, two calibration methods are also analyzed: one calibrates a load capacitance and the other calibrates a bypass current. The analysis helps designers estimate the accuracy of calibration, the dead-zone of a comparator with a calibration circuit, and the influence of PVT variation. The analyzed comparator uses 90-nm CMOS technology as an example, and each estimate is compared with simulation results.

  12. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.

    PubMed

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time series from heterogeneous populations which express differing trends. PCGA uses the polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in a common underlying signal obtained from PCGA groups is quantified using Monte Carlo simulations. For validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other considered methods. Furthermore, PCGA groups in each example allowed the strength of a common underlying signal to be enhanced, performing comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of the mechanisms causing time-series population gradients, as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
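    The grouping step described above can be sketched as follows. This is a minimal illustration of the PCGA idea (grouping series by the polar angle of their loadings on the first two principal components), not the authors' implementation; in particular, the gap-based grouping rule used here is an assumption for illustration.

    ```python
    import numpy as np

    def pcga_groups(X, n_groups=2):
        """Sketch of the PCGA grouping idea: series are grouped by the
        polar angle of their loadings on the first two principal
        components. X has shape (n_time, n_series)."""
        Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
        _, _, Vt = np.linalg.svd(Z, full_matrices=False)
        theta = np.arctan2(Vt[1], Vt[0])           # angle in the PC1-PC2 loading plane
        n = theta.size
        order = np.argsort(theta)
        sorted_th = theta[order]
        # circular gaps between consecutive angles; cut at the n_groups largest gaps
        gaps = np.diff(np.append(sorted_th, sorted_th[0] + 2 * np.pi))
        cut_idx = np.sort(np.argsort(gaps)[-n_groups:])
        labels_sorted = np.array([np.sum(cut_idx < j) for j in range(n)]) % n_groups
        groups = np.empty(n, dtype=int)
        groups[order] = labels_sorted
        return groups, theta
    ```

    On a toy population containing both increasing and decreasing trends, the two trend directions land on roughly opposite angles and fall into different groups.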

  13. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations

    PubMed Central

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time series from heterogeneous populations which express differing trends. PCGA uses the polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in a common underlying signal obtained from PCGA groups is quantified using Monte Carlo simulations. For validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other considered methods. Furthermore, PCGA groups in each example allowed the strength of a common underlying signal to be enhanced, performing comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of the mechanisms causing time-series population gradients, as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA. PMID:27467508

  14. A time-series method for automated measurement of changes in mitotic and interphase duration from time-lapse movies.

    PubMed

    Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W

    2011-01-01

    Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only two features: nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence the timing of cell division.
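    As a toy illustration of the two-feature idea, the sketch below flags candidate mitotic frames (chromatin condensation shrinks the nuclear area while the average histone signal rises) and converts runs of flagged frames into durations. The thresholds and the frame interval are assumptions for illustration, not values from the paper.

    ```python
    import numpy as np

    def mitotic_durations(area, intensity, area_frac=0.6, int_frac=1.3, dt_min=3.0):
        """Flag frames where nuclear area falls below a fraction of its
        median while average intensity rises above its median (assumed
        thresholds), then convert runs of flagged frames to durations
        in minutes given an assumed frame interval dt_min."""
        area = np.asarray(area, dtype=float)
        intensity = np.asarray(intensity, dtype=float)
        mitotic = (area < area_frac * np.median(area)) & \
                  (intensity > int_frac * np.median(intensity))
        durations, run = [], 0
        for flag in mitotic:
            if flag:
                run += 1
            elif run:
                durations.append(run * dt_min)
                run = 0
        if run:
            durations.append(run * dt_min)
        return durations
    ```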

  15. Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks.

    PubMed

    Saad, E W; Prokhorov, D V; Wunsch, D C

    1998-01-01

    Three networks are compared for low false alarm stock trend predictions. Short-term trends, particularly attractive for neural network analysis, can be used profitably in scenarios such as option trading, but only with significant risk. Therefore, we focus on limiting false alarms, which improves the risk/reward ratio by preventing losses. To predict stock trends, we exploit time delay, recurrent, and probabilistic neural networks (TDNN, RNN, and PNN, respectively), utilizing conjugate gradient and multistream extended Kalman filter training for TDNN and RNN. We also discuss different predictability analysis techniques and perform an analysis of predictability based on a history of daily closing price. Our results indicate that all the networks are feasible, the primary preference being one of convenience.

  16. Planning, Re-Bordering and Setting Times: A Comparative Analysis of European and Latin American "Education Spaces"

    ERIC Educational Resources Information Center

    Rambla, Xavier

    2013-01-01

    The article compares educational regionalisation in Europe and Latin America. This analysis unveils the influence of three social phenomena in the two case studies, namely power, fields of activity and knowledge. Mostly, it focuses on the initiatives led by the European Union and the Organisation of Ibero-American States in order to implement…

  17. Forming the Academic Profession in East Asia: A Comparative Analysis. East Asia: History, Politics, Sociology, Culture. A Routledge Series.

    ERIC Educational Resources Information Center

    Kim, Terri

    This book is a comparative examination of the formation of the academic profession in Korea and Malaya, and later, South Korea, Malaysia, and Singapore, from colonial times. The analysis takes into account the connections and disconnections between the colonial and postcolonial periods in shaping the academic profession. The chapters are: (1)…

  18. Social and Professional Status and Political Values in Russia, Germany, and the United States (A Comparative Analysis)

    ERIC Educational Resources Information Center

    Khavenson, T. E.; Migol', E. V.

    2012-01-01

    In this article, the authors propose that a comparative analysis of the political values of societies that have a developing democracy and societies that have existed for a considerable length of time under the conditions of a democratic regime will make it possible to discern differences in the correlation of political values that are…

  19. Approaching German Culture: A Tentative Analysis

    ERIC Educational Resources Information Center

    Tinsley, Royal; Woloshin, David

    1974-01-01

    A comparative analysis of the five universal problems of cultural orientation: 1) human nature, 2) social relations, 3) man and nature, 4) time, 5) space, as they are reflected in German and American culture. (PP)

  20. Bending and stretching finite element analysis of anisotropic viscoelastic composite plates

    NASA Technical Reports Server (NTRS)

    Hilton, Harry H.; Yi, Sung

    1990-01-01

    Finite element algorithms have been developed to analyze linear anisotropic viscoelastic plates, with or without holes, subjected to mechanical (bending, tension), temperature, and hygrothermal loadings. The analysis is based on Laplace transforms rather than direct time integrations in order to improve the accuracy of the results and save on extensive computational time and storage. The time dependent displacement fields in the transverse direction for the cross ply and angle ply laminates are calculated and the stacking sequence effects of the laminates are discussed in detail. Creep responses for the plates with or without a circular hole are also studied. The numerical results compare favorably with analytical solutions, i.e. within 1.8 percent for bending and 10(exp -3) percent for tension. The tension results of the present method are compared with those using the direct time integration scheme.

  1. Economic Evaluation of Teledentistry in Cleft Lip and Palate Patients.

    PubMed

    Teoh, Jonathan; Hsueh, Arthur; Mariño, Rodrigo; Manton, David; Hallett, Kerrod

    2018-06-01

    To assess the use of Teledentistry (TD) in delivering specialist dental services at the Royal Children's Hospital (RCH) for rural and regional patients and to conduct an economic evaluation by building a decision model to estimate the costs and effectiveness of Teledental consultations compared with standard consultations at the RCH. A model-based analysis was conducted to determine the potential costs of implementing TD at the RCH. The outcome measure was timely consultations (whether the patient presented within an appropriate time according to the recommended schedule). Dental records at the RCH of those who presented for orthodontic or pediatric dental consultations were assessed. A cost-effectiveness analysis (CEA), comparing TD with the traditional method of consultation, was conducted. One-way sensitivity analysis was performed to test the robustness of the results. A total of 367 TD-appropriate consultations were identified, of which 241 were timely (65.7%). The mean cost of an RCH consultation was A$431.29, with the mean TD consultation costing A$294.35. This represents a cost saving of A$136.95 per appointment. The CEA found TD to be a dominant option, with cost savings of A$3,160.81 for every additional timely consultation. The model indicated that 36.7 days of clinic time may be freed up at the RCH to treat other patients and expand capacity. These results were robust when performing one-way sensitivity analysis. When taking a societal perspective, the implementation of TD is likely to be a cost-effective alternative compared with the standard practice of face-to-face consultation at the RCH.

  2. Creep analysis of silicone for podiatry applications.

    PubMed

    Janeiro-Arocas, Julia; Tarrío-Saavedra, Javier; López-Beceiro, Jorge; Naya, Salvador; López-Canosa, Adrián; Heredia-García, Nicolás; Artiaga, Ramón

    2016-10-01

    This work shows an effective methodology to characterize the creep-recovery behavior of silicones before their application in podiatry. The aim is to characterize, model and compare the creep-recovery properties of different types of silicone used in podiatry orthotics. The creep-recovery phenomena of silicones used in podiatry orthotics are characterized by dynamic mechanical analysis (DMA). Silicones provided by Herbitas are compared by observing their viscoelastic properties by Functional Data Analysis (FDA) and nonlinear regression. The relationship between strain and time is modeled by fixed and mixed effects nonlinear regression to compare podiatry silicones easily and intuitively. Functional ANOVA and the Kohlrausch-Williams-Watts (KWW) model with fixed and mixed effects allow us to compare different silicones by observing the values of the fitting parameters and their physical meaning. The differences between silicones are related to variations in the breadth of the creep-recovery time distribution and in the instantaneous deformation-permanent strain. Nevertheless, the mean creep-relaxation time is the same for all the studied silicones. Silicones used in palliative orthoses have higher instantaneous deformation-permanent strain and a narrower creep-recovery distribution. The proposed methodology based on DMA, FDA and nonlinear regression is a useful tool to characterize and choose the proper silicone for each podiatry application according to their viscoelastic properties. Copyright © 2016 Elsevier Ltd. All rights reserved.
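    A minimal sketch of fitting a KWW-type (stretched-exponential) creep model to strain-time data. The functional form (instantaneous strain plus a delayed stretched-exponential component) and the synthetic data are assumptions for illustration, not the paper's DMA measurements or regression setup.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def kww_creep(t, eps_inst, eps_delay, tau, beta):
        """Illustrative KWW creep model: instantaneous strain eps_inst
        plus a delayed component eps_delay with characteristic time tau
        and breadth (stretching) exponent beta."""
        return eps_inst + eps_delay * (1.0 - np.exp(-(t / tau) ** beta))

    # Fit the model to synthetic creep data (assumed, not the paper's data).
    t = np.linspace(0.01, 100.0, 200)
    true_params = (0.5, 2.0, 10.0, 0.6)
    strain = kww_creep(t, *true_params)
    popt, _ = curve_fit(kww_creep, t, strain, p0=(0.3, 1.5, 5.0, 0.5))
    ```

    With noiseless synthetic data the fit recovers the generating parameters; on real DMA curves the parameter estimates (notably beta, the breadth of the creep-recovery time distribution) are what the abstract compares across silicones.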

  3. Waste water processing technology for Space Station Freedom - Comparative test data analysis

    NASA Technical Reports Server (NTRS)

    Miernik, Janie H.; Shah, Burt H.; Mcgriff, Cindy F.

    1991-01-01

    Comparative tests were conducted to choose the optimum technology for waste water processing on SSF. A thermoelectric integrated membrane evaporation (TIMES) subsystem and a vapor compression distillation subsystem (VCD) were built and tested to compare urine processing capability. Water quality, performance, and specific energy were compared for conceptual designs intended to function as part of the water recovery and management system of SSF. The VCD is considered the most mature and efficient technology and was selected to replace the TIMES as the baseline urine processor for SSF.

  4. Comparison of suprapatellar and infrapatellar intramedullary nailing for tibial shaft fractures: a systematic review and meta-analysis.

    PubMed

    Yang, Liqing; Sun, Yuefeng; Li, Ge

    2018-06-14

    The optimal surgical approach for tibial shaft fractures remains controversial. We performed a meta-analysis of randomized controlled trials (RCTs) to compare the clinical efficacy and prognosis of infrapatellar and suprapatellar intramedullary nailing in the treatment of tibial shaft fractures. PubMed, OVID, Embase, ScienceDirect, and Web of Science were searched up to December 2017 for comparative RCTs of infrapatellar and suprapatellar intramedullary nailing for tibial shaft fractures. Primary outcomes were blood loss, visual analog scale (VAS) score, range of motion, Lysholm knee scores, and fluoroscopy times. Secondary outcomes were length of hospital stay and postoperative complications. We assessed statistical heterogeneity for each outcome with the use of a standard χ2 test and the I2 statistic. The meta-analysis was undertaken using Stata 14.0. Four RCTs involving 293 participants were included in our study. The present meta-analysis indicated that there were significant differences between infrapatellar and suprapatellar intramedullary nailing regarding total blood loss, VAS scores, Lysholm knee scores, and fluoroscopy times. Suprapatellar intramedullary nailing significantly reduced total blood loss, postoperative knee pain, and fluoroscopy times compared to the infrapatellar approach. Additionally, it was associated with improved Lysholm knee scores. High-quality RCTs are still required for further investigation.
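    The heterogeneity assessment mentioned above (a χ2 test on Cochran's Q together with the I2 statistic) follows standard fixed-effect meta-analysis formulas, sketched below. The study effects and variances in the example are illustrative, not values from the review.

    ```python
    import numpy as np

    def cochran_q_and_i2(effects, variances):
        """Cochran's Q and the I^2 statistic for between-study
        heterogeneity under a fixed-effect model (standard formulas,
        not code from the review itself)."""
        effects = np.asarray(effects, dtype=float)
        w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)       # fixed-effect pooled estimate
        q = np.sum(w * (effects - pooled) ** 2)        # Cochran's Q (chi-squared, k-1 df)
        df = effects.size - 1
        i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
        return pooled, q, i2
    ```

    For two equally weighted studies with effects 0 and 2, the pooled estimate is 1, Q = 2 on 1 degree of freedom, and I2 = 50%.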

  5. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
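    The construction can be sketched at the symbolic level: sliding windows are mapped to ordinal (permutation) patterns, transition counts between consecutive patterns define the network, and a random walk over the network regenerates a pattern sequence. This is a simplified illustration that stops at the symbol sequence rather than mapping symbols back to amplitude values, and it is not the authors' code.

    ```python
    import numpy as np
    from collections import defaultdict

    def ordinal_symbols(x, m=3):
        """Map each length-m window of the series to its ordinal
        (permutation) pattern, encoded as a tuple."""
        return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

    def ordinal_network(symbols):
        """Transition counts between consecutive ordinal patterns."""
        counts = defaultdict(lambda: defaultdict(int))
        for a, b in zip(symbols[:-1], symbols[1:]):
            counts[a][b] += 1
        return counts

    def random_walk(counts, start, n_steps, rng):
        """Regenerate a symbolic sequence by walking the network,
        choosing successors with transition-frequency probabilities."""
        seq = [start]
        for _ in range(n_steps - 1):
            succ = counts[seq[-1]]
            nodes = list(succ)
            probs = np.array([succ[v] for v in nodes], dtype=float)
            seq.append(nodes[rng.choice(len(nodes), p=probs / probs.sum())])
        return seq
    ```

    Every transition taken by the walk has positive count in the network, so the surrogate sequence respects the ordinal transition structure of the original series.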

  6. Regenerating time series from ordinal networks

    NASA Astrophysics Data System (ADS)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  7. Efficacy comparison between cryoablation and radiofrequency ablation for patients with cavotricuspid valve isthmus dependent atrial flutter: a meta-analysis

    NASA Astrophysics Data System (ADS)

    Chen, Yi-He; Lin, Hui; Xie, Cheng-Long; Zhang, Xiao-Ting; Li, Yi-Gang

    2015-06-01

    We perform this meta-analysis to compare the efficacy and safety of cryoablation versus radiofrequency ablation for patients with cavotricuspid valve isthmus dependent atrial flutter. By searching the EMBASE, MEDLINE, PubMed and Cochrane electronic databases from March 1986 to September 2014, 7 randomized clinical trials were included. Acute (risk ratio [RR]: 0.93; P = 0.14) and long-term (RR: 0.94; P = 0.08) success rates were slightly lower in the cryoablation group than in the radiofrequency ablation group, but the difference was not statistically significant. Additionally, the fluoroscopy time was nonsignificantly reduced (weighted mean difference [WMD]: -2.83; P = 0.29), whereas procedure time was significantly longer (WMD: 25.95; P = 0.01) in the cryoablation group compared with the radiofrequency ablation group. Furthermore, pain perception during the catheter ablation was substantially less in the cryoablation group than in the radiofrequency ablation group (standardized mean difference [SMD]: -2.36; P < 0.00001). Thus, our meta-analysis demonstrated that cryoablation and radiofrequency ablation produce comparable acute and long-term success rates for patients with cavotricuspid valve isthmus dependent atrial flutter. Meanwhile, cryoablation tends to reduce fluoroscopy time and significantly reduces pain perception, at the cost of a significantly prolonged procedure time.

  8. Efficacy comparison between cryoablation and radiofrequency ablation for patients with cavotricuspid valve isthmus dependent atrial flutter: a meta-analysis

    PubMed Central

    Chen, Yi-He; Lin, Hui; Xie, Cheng-Long; Zhang, Xiao-Ting; Li, Yi-Gang

    2015-01-01

    We perform this meta-analysis to compare the efficacy and safety of cryoablation versus radiofrequency ablation for patients with cavotricuspid valve isthmus dependent atrial flutter. By searching the EMBASE, MEDLINE, PubMed and Cochrane electronic databases from March 1986 to September 2014, 7 randomized clinical trials were included. Acute (risk ratio [RR]: 0.93; P = 0.14) and long-term (RR: 0.94; P = 0.08) success rates were slightly lower in the cryoablation group than in the radiofrequency ablation group, but the difference was not statistically significant. Additionally, the fluoroscopy time was nonsignificantly reduced (weighted mean difference [WMD]: −2.83; P = 0.29), whereas procedure time was significantly longer (WMD: 25.95; P = 0.01) in the cryoablation group compared with the radiofrequency ablation group. Furthermore, pain perception during the catheter ablation was substantially less in the cryoablation group than in the radiofrequency ablation group (standardized mean difference [SMD]: −2.36; P < 0.00001). Thus, our meta-analysis demonstrated that cryoablation and radiofrequency ablation produce comparable acute and long-term success rates for patients with cavotricuspid valve isthmus dependent atrial flutter. Meanwhile, cryoablation tends to reduce fluoroscopy time and significantly reduces pain perception, at the cost of a significantly prolonged procedure time. PMID:26039980

  9. Time-Course Gene Set Analysis for Longitudinal Gene Expression Data

    PubMed Central

    Hejblum, Boris P.; Skinner, Jason; Thiébaut, Rodolphe

    2015-01-01

    Gene set analysis methods, which consider predefined groups of genes in the analysis of genomic data, have been successfully applied for analyzing gene expression data in cross-sectional studies. The time-course gene set analysis (TcGSA) introduced here is an extension of gene set analysis to longitudinal data. The proposed method relies on random effects modeling with maximum likelihood estimates. It allows the use of all available repeated measurements while dealing with unbalanced data due to missing at random (MAR) measurements. TcGSA is a hypothesis-driven method that identifies a priori defined gene sets with significant expression variations over time, taking into account the potential heterogeneity of expression within gene sets. When biological conditions are compared, the method indicates whether the time patterns of gene sets significantly differ according to these conditions. The interest of the method is illustrated by its application to two real-life datasets: an HIV therapeutic vaccine trial (DALIA-1 trial), and data from a recent study on influenza and pneumococcal vaccines. In the DALIA-1 trial, TcGSA revealed a significant change in gene expression over time within 69 gene sets during vaccination, whereas a standard univariate individual gene analysis corrected for multiple testing, as well as a standard Gene Set Enrichment Analysis (GSEA) for time series, both failed to detect any significant pattern change over time. When applied to the second illustrative dataset, TcGSA allowed the identification of 4 gene sets that were ultimately found to be linked with the influenza vaccine as well, although previous analyses had associated them only with the pneumococcal vaccine. In our simulation study, TcGSA exhibits good statistical properties and increased power compared to other approaches for analyzing time-course expression patterns of gene sets. The method is made available for the community through an R package. PMID:26111374

  10. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    NASA Astrophysics Data System (ADS)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid discrimination between them. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used to extract endmembers for initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps using the endmembers. Then, each endmember is re-estimated as the mean value of its "purified" pixels, i.e. the residual of each mixed pixel after excluding the contributions of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with the "separate unmixing" approach.
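    The two alternating steps (NNLS abundance estimation and mean-of-"purified"-pixels endmember update) can be sketched as below. This is a simplified, single-image illustration of the described scheme, not the authors' implementation; the dominance rule and scaling are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def estimate_abundances(pixels, endmembers):
        """Abundance step: per-pixel nonnegative least squares against
        the current endmember matrix.
        pixels: (n_pixels, n_bands); endmembers: (n_end, n_bands)."""
        E = endmembers.T                                    # (n_bands, n_end)
        A = np.zeros((pixels.shape[0], endmembers.shape[0]))
        for i, p in enumerate(pixels):
            A[i], _ = nnls(E, p)
        return A

    def update_endmembers(pixels, A, endmembers):
        """Update step: each endmember becomes the mean of its 'purified'
        pixels, i.e. the residual of each pixel it dominates after
        removing the contributions of all other endmembers."""
        new_E = endmembers.copy()
        dominant = A.argmax(axis=1)
        for k in range(endmembers.shape[0]):
            mask = dominant == k
            if not mask.any():
                continue
            # residual after excluding all nondominant endmembers: a_k * E_k
            purified = pixels[mask] - (A[mask] @ endmembers
                                       - np.outer(A[mask][:, k], endmembers[k]))
            # divide by the dominant abundance to recover the pure signature
            scale = np.clip(A[mask][:, k], 1e-6, None)[:, None]
            new_E[k] = (purified / scale).mean(axis=0)
        return new_E
    ```

    On exact linear mixtures, the abundance step recovers the true fractions and the update step reproduces the generating endmembers, so the iteration is at a fixed point.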

  11. Principal components analysis of the photoresponse nonuniformity of a matrix detector.

    PubMed

    Ferrero, Alejandro; Alda, Javier; Campos, Joaquín; López-Alonso, Jose Manuel; Pons, Alicia

    2007-01-01

    Principal component analysis is used to identify and quantify spatial distributions of relative photoresponse as a function of the exposure time for a visible CCD array. The analysis suggests a simple way to define an invariant photoresponse nonuniformity and to compare it with the invariant pattern obtained at long exposure times. Experimental radiant-exposure data, acquired at several irradiance levels in a stable and well-controlled environment, are used.

  12. A comparison of computer-assisted detection (CAD) programs for the identification of colorectal polyps: performance and sensitivity analysis, current limitations and practical tips for radiologists.

    PubMed

    Bell, L T O; Gandhi, S

    2018-06-01

    To directly compare the accuracy and speed of analysis of two commercially available computer-assisted detection (CAD) programs in detecting colorectal polyps. In this retrospective single-centre study, patients who had colorectal polyps identified on computed tomography colonography (CTC) and subsequent lower gastrointestinal endoscopy, were analysed using two commercially available CAD programs (CAD1 and CAD2). Results were compared against endoscopy to ascertain sensitivity and positive predictive value (PPV) for colorectal polyps. Time taken for CAD analysis was also calculated. CAD1 demonstrated a sensitivity of 89.8%, PPV of 17.6% and mean analysis time of 125.8 seconds. CAD2 demonstrated a sensitivity of 75.5%, PPV of 44.0% and mean analysis time of 84.6 seconds. The sensitivity and PPV for colorectal polyps and CAD analysis times can vary widely between current commercially available CAD programs. There is still room for improvement. Generally, there is a trade-off between sensitivity and PPV, and so further developments should aim to optimise both. Information on these factors should be made routinely available, so that an informed choice on their use can be made. This information could also potentially influence the radiologist's use of CAD results. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  13. Cost minimisation analysis: kilovoltage imaging with automated repositioning versus electronic portal imaging in image-guided radiotherapy for prostate cancer.

    PubMed

    Gill, S; Younie, S; Rolfo, A; Thomas, J; Siva, S; Fox, C; Kron, T; Phillips, D; Tai, K H; Foroudi, F

    2012-10-01

    To compare the treatment time and cost of prostate cancer fiducial marker image-guided radiotherapy (IGRT) using orthogonal kilovoltage imaging (KVI) and automated couch shifts and orthogonal electronic portal imaging (EPI) and manual couch shifts. IGRT treatment delivery times were recorded automatically on either unit. Costing was calculated from real costs derived from the implementation of a new radiotherapy centre. To derive cost per minute for EPI and KVI units the total annual setting up and running costs were divided by the total annual working time. The cost per IGRT fraction was calculated by multiplying the cost per minute by the duration of treatment. A sensitivity analysis was conducted to test the robustness of our analysis. Treatment times without couch shift were compared. Time data were analysed for 8648 fractions, 6057 from KVI treatment and 2591 from EPI treatment from a total of 294 patients. The median time for KVI treatment was 6.0 min (interquartile range 5.1-7.4 min) and for EPI treatment it was 10.0 min (interquartile range 8.3-11.8 min) (P value < 0.0001). The cost per fraction for KVI was A$258.79 and for EPI was A$345.50. The cost saving per fraction for KVI varied between A$66.09 and A$101.64 by sensitivity analysis. In patients where no couch shift was made, the median treatment delivery time for EPI was 8.8 min and for KVI was 5.1 min. Treatment time is less on KVI units compared with EPI units. This is probably due to automation of couch shift and faster evaluation of imaging on KVI units. Annual running costs greatly outweigh initial setting up costs and therefore the cost per fraction was less with KVI, despite higher initial costs. The selection of appropriate IGRT equipment can make IGRT practical within radiotherapy departments. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  14. Apollo 15 time and motion study

    NASA Technical Reports Server (NTRS)

    Kubis, J. F.; Elrod, J. T.; Rusnak, R.; Barnes, J. E.

    1972-01-01

    A time and motion study of Apollo 15 lunar surface activity led to examination of four distinct areas of crewman activity: an analysis of lunar mobility, a comparative analysis of tasks performed in 1-g training and lunar EVA, an analysis of the metabolic cost of two activities performed in several EVAs, and a fall/near-fall analysis. The mobility analysis showed that the crewmen used three basic mobility patterns (modified walk, hop, side step) while on the lunar surface. These patterns served as adaptive modes to compensate for the uneven terrain and varied soil conditions the crewmen encountered. A comparison of the time required to perform tasks at the final 1-g lunar EVA training sessions with the time required to perform the same tasks on the lunar surface indicates that, in almost all cases, it took significantly more time (on the order of 40%) to perform tasks on the moon. This increased time was observed even after extraneous factors (e.g., hardware difficulties) were factored out.

  15. Impact of Milrinone Administration in Adult Cardiac Surgery Patients: Updated Meta-Analysis.

    PubMed

    Ushio, Masahiro; Egi, Moritoki; Wakabayashi, Junji; Nishimura, Taichi; Miyatake, Yuji; Obata, Norihiko; Mizobuchi, Satoshi

    2016-12-01

    To determine the effects of milrinone on short-term mortality in cardiac surgery patients with focus on the presence or absence of heterogeneity of the effect. A systematic review and meta-analysis. Five hundred thirty-seven adult cardiac surgery patients from 12 RCTs. Milrinone administration. The authors conducted a systematic Medline and Pubmed search to assess the effect of milrinone on short-term mortality in adult cardiac surgery patients. Subanalysis was performed according to the timing for commencement of milrinone administration and the type of comparators. The primary outcome was any short-term mortality. Overall analysis showed no difference in mortality rates in patients who received milrinone and patients who received comparators (odds ratio = 1.25, 95% CI 0.45-3.51, p = 0.67). In subanalysis for the timing to commence milrinone administration and the type of comparators, odds ratio for mortality varied from 0.19 (placebo as control drug, start of administration after cardiopulmonary bypass) to 2.58 (levosimendan as control drug, start of administration after cardiopulmonary bypass). Among RCTs to assess the effect of milrinone administration in adult cardiac surgery patients, there are wide variations of the odds ratios of administration of milrinone for short-term mortality according to the comparators and the timing of administration. This fact may suggest that a simple pooling meta-analysis is not applicable for assessing the risk and benefit of milrinone administration in an adult cardiac surgery cohort. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Conventional and narrow bore short capillary columns with cyclodextrin derivatives as chiral selectors to speed-up enantioselective gas chromatography and enantioselective gas chromatography-mass spectrometry analyses.

    PubMed

    Bicchi, Carlo; Liberto, Erica; Cagliero, Cecilia; Cordero, Chiara; Sgorbini, Barbara; Rubiolo, Patrizia

    2008-11-28

    The analysis of complex real-world samples of vegetable origin requires rapid and accurate routine methods, enabling laboratories to increase sample throughput and productivity while reducing analysis costs. This study examines shortening enantioselective GC (ES-GC) analysis time following the approaches used in fast GC. ES-GC separations arise from weak enantiomer-CD host-guest interactions; the separation is thermodynamically driven and strongly influenced by temperature. As a consequence, fast temperature rates can interfere with enantiomeric discrimination, so the use of short and/or narrow bore columns is a possible approach to speeding up ES-GC analyses. The performance of ES-GC with a conventional inner diameter (I.D.) column (25 m length x 0.25 mm I.D., 0.15 microm and 0.25 microm d(f)) coated with 30% of 2,3-di-O-ethyl-6-O-tert-butyldimethylsilyl-beta-cyclodextrin in PS-086 is compared to that of a short conventional-I.D. column (5 m length x 0.25 mm I.D., 0.15 microm d(f)) and of narrow bore columns of different lengths (1, 2, 5 and 10 m long x 0.10 mm I.D., 0.10 microm d(f)) in analysing racemate standards of pesticides and of flavour and fragrance compounds, as well as real-world samples. The short conventional-I.D. column gave shorter analysis times and comparable or lower resolutions with the racemate standards, depending mainly on analyte volatility. Narrow-bore columns were tested under different analysis conditions; they provided shorter analysis times and resolutions comparable to those of conventional-I.D. ES columns. The narrow-bore columns offering the most effective compromise between separation efficiency and analysis time are the 5 and 2 m columns; in combination with mass spectrometry as detector, applied to lavender and bergamot essential oil analyses, these reduced analysis time by a factor of at least three while leaving separation of the chiral markers unaltered.

  17. Using time-frequency analysis to determine time-resolved detonation velocity with microwave interferometry.

    PubMed

    Kittell, David E; Mares, Jesus O; Son, Steven F

    2015-04-01

    Two time-frequency analysis methods based on the short-time Fourier transform (STFT) and continuous wavelet transform (CWT) were used to determine time-resolved detonation velocities with microwave interferometry (MI). The results were directly compared to well-established analysis techniques consisting of a peak-picking routine as well as a phase unwrapping method (i.e., quadrature analysis). The comparison is conducted on experimental data consisting of transient detonation phenomena observed in triaminotrinitrobenzene and ammonium nitrate-urea explosives, representing high and low quality MI signals, respectively. Time-frequency analysis proved much more capable of extracting useful and highly resolved velocity information from low quality signals than the phase unwrapping and peak-picking methods. Additionally, control of the time-frequency methods is mainly constrained to a single parameter which allows for a highly unbiased analysis method to extract velocity information. In contrast, the phase unwrapping technique introduces user based variability while the peak-picking technique does not achieve a highly resolved velocity result. Both STFT and CWT methods are proposed as improved additions to the analysis methods applied to MI detonation experiments, and may be useful in similar applications.
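    The STFT half of this approach is compact to sketch: compute a spectrogram, pick the peak ("ridge") frequency in each time slice, and map beat frequency to velocity. A minimal scipy version, with a synthetic linear chirp standing in for the MI beat signal and a placeholder frequency-to-velocity scale factor (the sample rate, chirp range, and wavelength are assumptions for illustration, not experimental values):

```python
import numpy as np
from scipy.signal import chirp, stft

fs = 1.0e6                            # sample rate, Hz (illustrative)
t = np.arange(0, 1e-3, 1 / fs)
# Synthetic MI beat signal: frequency ramps up as the detonation accelerates.
x = chirp(t, f0=50e3, t1=t[-1], f1=200e3)

f, t_seg, Zxx = stft(x, fs=fs, nperseg=128, noverlap=96)
ridge = f[np.argmax(np.abs(Zxx), axis=0)]  # peak frequency per time slice

# Beat frequency maps linearly to velocity; the scale factor below
# (microwave wavelength in the explosive / 2) is a placeholder value.
wavelength = 2.0e-2                        # metres, assumed
velocity = ridge * wavelength / 2
```

    The single tunable parameter referred to in the abstract corresponds here to the window length `nperseg`, which sets the time-frequency trade-off of the ridge estimate.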

  18. A comparative analysis of alternative approaches for quantifying nonlinear dynamics in cardiovascular system.

    PubMed

    Chen, Yun; Yang, Hui

    2013-01-01

    Heart rate variability (HRV) analysis has emerged as an important research topic to evaluate autonomic cardiac function. However, traditional time and frequency-domain analysis characterizes and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. Note that these extracted nonlinear features provide information about nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate the performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. Three nonlinear methods are evaluated not only individually but also in combination using three classification algorithms, i.e., linear discriminate analysis, quadratic discriminate analysis and k-nearest neighbors. Experimental results show that three nonlinear methods capture nonlinear dynamics from different perspectives and the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features are shown to have the promise to identify the disorders in autonomic cardiovascular function.
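    Of the three feature families, multiscale entropy is the most compact to sketch: coarse-grain the series at each scale, then compute sample entropy on each coarse-grained series. A minimal numpy version follows; the parameter choices (m = 2, r = 0.2·SD) are conventional defaults, not necessarily those used in the study, and the test series are toy signals rather than HRV recordings.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn: -log of the chance that runs matching for m points
    (within tolerance r, Chebyshev distance) also match for m + 1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def count(length):
        w = np.lib.stride_tricks.sliding_window_view(x, length)[: n - m]
        d = np.max(np.abs(w[:, None, :] - w[None, :, :]), axis=-1)
        return np.sum(d <= r) - (n - m)   # exclude self-matches

    return -np.log(count(m + 1) / count(m))

def multiscale_entropy(x, scales):
    """Coarse-grain by non-overlapping averaging, then take SampEn."""
    x = np.asarray(x, dtype=float)
    return [sample_entropy(x[: len(x) // s * s].reshape(-1, s).mean(axis=1))
            for s in scales]

rng = np.random.default_rng(1)
noise = rng.standard_normal(600)                 # irregular: high entropy
sine = np.sin(np.linspace(0, 12 * np.pi, 600))   # regular: low entropy
mse_noise = multiscale_entropy(noise, [1, 2])
```

    A regular signal yields a lower sample entropy than an irregular one, which is the property the classifiers above exploit when separating healthy subjects from heart failure patients.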

  19. Maternal Time Use and Nurturing: Analysis of the Association Between Breastfeeding Practice and Time Spent Interacting with Baby.

    PubMed

    Smith, Julie P; Forrester, Robert

    2017-06-01

    Breastfeeding supports child development through complex mechanisms that are not well understood. Numerous studies have compared how well breastfeeding and nonbreastfeeding mothers interact with their child, but few examine how much interaction occurs. Our study of weekly time use among 156 mothers of infants aged 3-9 months investigated whether lactating mothers spend more time providing emotional support or cognitive stimulation to their infants than nonbreastfeeding mothers, and whether the amount of such interactive time is associated with breastfeeding intensity. Mothers were recruited via mothers' and baby groups, infant health clinics, and childcare services, and used an electronic device to record their 24-hour time use for 7 days. Sociodemographic and feeding status data were collected by questionnaire. Statistical analysis using linear mixed modeling and residual maximum likelihood analysis compared maternal time use for those giving "some breastfeeding" and those "not breastfeeding." Analysis was also conducted for more detailed feeding subgroups. Breastfeeding and nonbreastfeeding mothers had broadly similar socioeconomic and demographic characteristics. Breastfeeding was found to be associated with more mother-child interaction time, a difference only partially explained by weekly maternal employment hours or other interactive care activities such as play or reading. This study presents data suggesting that lactating mothers spent significantly more hours weekly on milk feeding, on carrying, holding, or soothing their infant, and on providing childcare overall than nonlactating mothers. Understanding the mechanisms by which child mental health and development benefit from breastfeeding may have important implications for policies and intervention strategies, and could be usefully informed by suitably designed time use studies.

  20. A comparison of the wavelet and short-time fourier transforms for Doppler spectral analysis.

    PubMed

    Zhang, Yufeng; Guo, Zhenyu; Wang, Weilian; He, Side; Lee, Ting; Loew, Murray

    2003-09-01

    Doppler spectrum analysis provides a non-invasive means to measure blood flow velocity and to diagnose arterial occlusive disease. The time-frequency representation of the Doppler blood flow signal is normally computed by using the short-time Fourier transform (STFT). This transform requires stationarity of the signal during a finite time interval, and thus imposes some constraints on the representation estimate. In addition, the STFT has a fixed time-frequency window, making it inaccurate to analyze signals having relatively wide bandwidths that change rapidly with time. In the present study, wavelet transform (WT), having a flexible time-frequency window, was used to investigate its advantages and limitations for the analysis of the Doppler blood flow signal. Representations computed using the WT with a modified Morlet wavelet were investigated and compared with the theoretical representation and those computed using the STFT with a Gaussian window. The time and frequency resolutions of these two approaches were compared. Three indices, the normalized root-mean-squared errors of the minimum, the maximum and the mean frequency waveforms, were used to evaluate the performance of the WT. Results showed that the WT can not only be used as an alternative signal processing tool to the STFT for Doppler blood flow signals, but can also generate a time-frequency representation with better resolution than the STFT. In addition, the WT method can provide both satisfactory mean frequencies and maximum frequencies. This technique is expected to be useful for the analysis of Doppler blood flow signals to quantify arterial stenoses.

  1. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.

  2. Multiscale multifractal time irreversibility analysis of stock markets

    NASA Astrophysics Data System (ADS)

    Jiang, Chenguang; Shang, Pengjian; Shi, Wenbin

    2016-11-01

    Time irreversibility is one of the most important properties of nonstationary time series. Complex time series often demonstrate even multiscale time irreversibility, such that not only the original but also coarse-grained time series are asymmetric over a wide range of scales. We study the multiscale time irreversibility of time series. In this paper, we develop a method called multiscale multifractal time irreversibility analysis (MMRA), which allows us to extend the description of time irreversibility to include its dependence on segment size and statistical moments. We test the effectiveness of MMRA in detecting the multifractality and time irreversibility of time series generated from the delayed Henon map and a binomial multifractal model. We then apply our method to the time irreversibility analysis of stock markets in different regions. We find that the emerging market has a higher multifractality degree and time irreversibility compared with developed markets. In this sense, the MMRA method may provide new angles for assessing the evolution stage of stock markets.
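    A deliberately simplified illustration of the coarse-graining idea (not the full MMRA statistic): measure asymmetry as the fraction of positive increments at each scale, which sits near 0.5 for a time-reversible series and departs from 0.5 for an irreversible one. The test series below are toy examples chosen to make the contrast obvious.

```python
import numpy as np

def irreversibility_index(x, scales):
    """Fraction of positive increments per coarse-graining scale: ~0.5 for
    a time-reversible series, away from 0.5 for an asymmetric one.
    (A simplified stand-in for the MMRA statistic, for illustration.)"""
    x = np.asarray(x, dtype=float)
    out = []
    for s in scales:
        g = x[: len(x) // s * s].reshape(-1, s).mean(axis=1)
        d = np.diff(g)
        d = d[d != 0]                 # ignore flat steps
        out.append(float(np.mean(d > 0)))
    return out

rng = np.random.default_rng(2)
reversible = rng.standard_normal(3000)               # white noise: symmetric
sawtooth = np.tile(np.r_[np.arange(9.0), 0.0], 300)  # slow rise, sharp drop
idx_rev = irreversibility_index(reversible, [1, 2, 4])
idx_saw = irreversibility_index(sawtooth, [1, 2, 4])
```

    The sawtooth rises slowly and drops sharply, so reversing it in time produces a statistically different series; white noise is indistinguishable from its reverse.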

  3. Direct stenting versus balloon predilation: Jury is still out.

    PubMed

    Belardi, Jorge A; Albertal, Mariano

    2017-08-01

    Compared to balloon predilation, direct stenting (DS) shortens procedural time and reduces radiation and contrast exposure. A meta-analysis that included 7 studies comparing these 2 strategies revealed lower adverse event rate with DS. Studies included in the present meta-analysis were mostly observational and utilized first generation drug-eluting stent. Patient and lesion selection may explain these positive results. © 2017 Wiley Periodicals, Inc.

  4. 42 CFR 488.61 - Special procedures for approval and re-approval of organ transplant centers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... governing body on a sustainable basis, and has requested more time to design or implement additional... months of the Systems Improvement Agreement; (v) A comparative effectiveness analysis that compares...

  5. Comparative analysis of dynamic pricing strategies for managed lanes.

    DOT National Transportation Integrated Search

    2015-06-01

    The objective of this research is to investigate and compare the performances of different : dynamic pricing strategies for managed lanes facilities. These pricing strategies include real-time : traffic responsive methods, as well as refund options a...

  6. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE PAGES

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    2017-12-20

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.

  7. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.

  8. Boundary element analysis of corrosion problems for pumps and pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyasaka, M.; Amaya, K.; Kishimoto, K.

    1995-12-31

    Three-dimensional (3D) and axi-symmetric boundary element methods (BEM) were developed to quantitatively estimate cathodic protection and macro-cell corrosion. For 3D analysis, a multiple-region method (MRM) was developed in addition to a single-region method (SRM). The validity and usefulness of the BEMs were demonstrated by comparing numerical results with experimental data from galvanic corrosion systems of a cylindrical model and a seawater pipe, and from a cathodic protection system of an actual seawater pump. It was shown that a highly accurate analysis could be performed for fluid machines handling seawater with complex 3D fields (e.g. seawater pump) by taking account of flow rate and time dependencies of the polarization curve. Compared to the 3D BEM, the axi-symmetric BEM permitted large reductions in numbers of elements and nodes, which greatly simplified analysis of axi-symmetric fields such as pipes. Computational accuracy and CPU time were compared between analyses using two approximation methods for polarization curves: a logarithmic-approximation method and a linear-approximation method.

  9. Evaluation of joint findings with gait analysis in children with hemophilia.

    PubMed

    Cayir, Atilla; Yavuzer, Gunes; Sayli, Revide Tülin; Gurcay, Eda; Culha, Vildan; Bozkurt, Murat

    2014-01-01

    Hemophilic arthropathy due to recurrent joint bleeding leads to physical, psychological and socioeconomic problems in children with hemophilia and reduces their quality of life. The purpose of this study was to evaluate joint damage through various parameters and to determine functional deterioration in the musculoskeletal system during walking using kinetic and kinematic gait analysis. Physical examination and kinetic and kinematic gait analysis findings of 19 hemophilic patients aged 7-20 years were compared with those of age, sex and leg length matched controls. Stride time was longer in the hemophilia group (p=0.001) compared to the age matched healthy control group, while hip, knee and ankle joint rotation angles were more limited (p=0.001, p=0.035 and p=0.001, respectively). In the hemophilia group, the extensor moment of the knee joint in the stance phase was less than that in the control group (p=0.001). Stride time was longer in the severe hemophilia group compared to the mild-moderate hemophilia and control groups (p=0.011 and p=0.001, respectively). Rotation angle of the ankle was wider in the control group compared to the other two groups (p=0.001 for both). Rotation angle of the ankle joint was narrower in the severe hemophilia group compared to the others (p=0.001 for each). Extensor moment of the knee joint was greater in the control group compared to the other two groups (p=0.003 and p=0.001, respectively). Walking velocity was higher in the control group compared to the severe hemophilia group. Kinetic and kinematic gait analysis has the sensitivity to detect minimal changes in biomechanical parameters. Gait analysis can be used as a reliable method to detect early joint damage.

  10. Introducing conjoint analysis method into delayed lotteries studies: its validity and time stability are higher than in adjusting.

    PubMed

    Białek, Michał; Markiewicz, Łukasz; Sawicki, Przemysław

    2015-01-01

    Delayed lotteries are much more common in everyday life than pure lotteries. Usually, we need to wait to find out the outcome of a risky decision (e.g., investing in a stock market, engaging in a relationship). However, most research has studied time discounting and probability discounting in isolation, using methodologies designed specifically to track changes in one parameter. The most commonly used method is adjusting, but its reported validity and time stability in research on discounting are suboptimal. The goal of this study was to introduce a novel method for analyzing delayed lotteries, conjoint analysis, which is hypothetically more suitable for analyzing individual preferences in this area. A set of two studies compared conjoint analysis with adjusting. The results suggest that individual discounting-strength parameters estimated with conjoint analysis have higher predictive value (Studies 1 and 2) and are more stable over time (Study 2) compared to adjusting. Despite the exploratory character of the reported studies, we discuss these findings and suggest that future research on delayed lotteries should be cross-validated using both methods.

  11. Comparison of causality analysis on simultaneously measured fMRI and NIRS signals during motor tasks.

    PubMed

    Anwar, Abdul Rauf; Muthalib, Makii; Perrey, Stephane; Galka, Andreas; Granert, Oliver; Wolff, Stephan; Deuschl, Guenther; Raethjen, Jan; Heute, Ulrich; Muthuraman, Muthuraman

    2013-01-01

    Brain activity can be measured using different modalities. Since most of the modalities tend to complement each other, it seems promising to measure them simultaneously. In the research presented here, data recorded simultaneously from functional magnetic resonance imaging (fMRI) and near infrared spectroscopy (NIRS) are subjected to causality analysis using time-resolved partial directed coherence (tPDC). Time-resolved partial directed coherence uses the principle of state space modelling to estimate multivariate autoregressive (MVAR) coefficients. This method is useful to visualize both the frequency and time dynamics of causality between the time series. Afterwards, causality results from the different modalities are compared by estimating the Spearman correlation. In the present study, we used directionality vectors to analyze correlation, rather than actual signal vectors. Results show that causality analysis of the fMRI correlates more closely to causality results of oxy-NIRS than to those of deoxy-NIRS in the case of a finger sequencing task. However, in the case of simple finger tapping, no clear difference between oxy-fMRI and deoxy-fMRI correlation is identified.

  12. Functional magnetic resonance imaging activation detection: fuzzy cluster analysis in wavelet and multiwavelet domains.

    PubMed

    Jahanian, Hesamoddin; Soltanian-Zadeh, Hamid; Hossein-Zadeh, Gholam-Ali

    2005-09-01

    To present novel feature spaces, based on multiscale decompositions obtained by scalar wavelet and multiwavelet transforms, to remedy problems associated with high dimension of functional magnetic resonance imaging (fMRI) time series (when they are used directly in clustering algorithms) and their poor signal-to-noise ratio (SNR) that limits accurate classification of fMRI time series according to their activation contents. Using randomization, the proposed method finds wavelet/multiwavelet coefficients that represent the activation content of fMRI time series and combines them to define new feature spaces. Using simulated and experimental fMRI data sets, the proposed feature spaces are compared to the cross-correlation (CC) feature space and their performances are evaluated. In these studies, the false positive detection rate is controlled using randomization. To compare different methods, several points of the receiver operating characteristics (ROC) curves, using simulated data, are estimated and compared. The proposed features suppress the effects of confounding signals and improve activation detection sensitivity. Experimental results show improved sensitivity and robustness of the proposed method compared to the conventional CC analysis. More accurate and sensitive activation detection can be achieved using the proposed feature spaces compared to CC feature space. Multiwavelet features show superior detection sensitivity compared to the scalar wavelet features. (c) 2005 Wiley-Liss, Inc.

  13. Comprehensive Numerical Analysis of Finite Difference Time Domain Methods for Improving Optical Waveguide Sensor Accuracy

    PubMed Central

    Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly

    2016-01-01

    This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time savings in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties, which should be calculated as in the previous method. Generally, a small number of arithmetic processes, which result in a shorter simulation time, are desired. The alternating direction implicit technique can be considered a significant step forward for improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.
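    For context, the baseline these implicit schemes improve on is the explicit Yee update, which in one dimension reduces to two interleaved array updates. Below is a textbook sketch in normalized units at the Courant limit, not the ADI/LOD formulation discussed in the paper; grid size, source position, and pulse shape are illustrative.

```python
import numpy as np

# 1-D free-space Yee FDTD in normalized units at the Courant limit
# (S = c*dt/dx = 1), where a disturbance advances one cell per step.
n_cells, n_steps, src = 400, 150, 100
ez = np.zeros(n_cells)
hy = np.zeros(n_cells - 1)

for n in range(n_steps):
    hy += ez[1:] - ez[:-1]                    # H update from the curl of E
    ez[1:-1] += hy[1:] - hy[:-1]              # E update from the curl of H
    ez[src] += np.exp(-((n - 30) / 8) ** 2)   # soft Gaussian source

# The right-going pulse peaks near cell src + (n_steps - 30).
peak = src + int(np.argmax(np.abs(ez[src:])))
```

    The explicit scheme is only stable up to the Courant limit; the appeal of ADI- and LOD-FDTD is precisely that they remain stable for larger time steps, at the cost of solving implicit sub-steps.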

  14. Comparison of time-frequency distribution techniques for analysis of spinal somatosensory evoked potential.

    PubMed

    Hu, Y; Luk, K D; Lu, W W; Holmes, A; Leong, J C

    2001-05-01

    Spinal somatosensory evoked potential (SSEP) has been employed to monitor the integrity of the spinal cord during surgery. To detect both temporal and spectral changes in SSEP waveforms, an investigation of the application of time-frequency analysis (TFA) techniques was conducted. SSEP signals from 30 scoliosis patients were analysed using different techniques; short time Fourier transform (STFT), Wigner-Ville distribution (WVD), Choi-Williams distribution (CWD), cone-shaped distribution (CSD) and adaptive spectrogram (ADS). The time-frequency distributions (TFD) computed using these methods were assessed and compared with each other. WVD, ADS, CSD and CWD showed better resolution than STFT. Comparing normalised peak widths, CSD showed the sharpest peak width (0.13+/-0.1) in the frequency dimension, and a mean peak width of 0.70+/-0.12 in the time dimension. Both WVD and CWD produced cross-term interference, distorting the TFA distribution, but this was not seen with CSD and ADS. CSD appeared to give a lower mean peak power bias (10.3%+/-6.2%) than ADS (41.8%+/-19.6%). Application of the CSD algorithm showed both good resolution and accurate spectrograms, and is therefore recommended as the most appropriate TFA technique for the analysis of SSEP signals.

  15. The timing resolution of scintillation-detector systems: Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Choong, Woon-Seng

    2009-11-01

    Recent advancements in fast scintillating materials and fast photomultiplier tubes (PMTs) have stimulated renewed interest in time-of-flight (TOF) positron emission tomography (PET). It is well known that the improvement in the timing resolution in PET can significantly reduce the noise variance in the reconstructed image resulting in improved image quality. In order to evaluate the timing performance of scintillation detectors used in TOF PET, we use Monte Carlo analysis to model the physical processes (crystal geometry, crystal surface finish, scintillator rise time, scintillator decay time, photoelectron yield, PMT transit time spread, PMT single-electron response, amplifier response and time pick-off method) that can contribute to the timing resolution of scintillation-detector systems. In the Monte Carlo analysis, the photoelectron emissions are modeled by a rate function, which is used to generate the photoelectron time points. The rate function, which is simulated using Geant4, represents the combined intrinsic light emissions of the scintillator and the subsequent light transport through the crystal. The PMT output signal is determined by the superposition of the PMT single-electron response resulting from the photoelectron emissions. The transit time spread and the single-electron gain variation of the PMT are modeled in the analysis. Three practical time pick-off methods are considered in the analysis. Statistically, the best timing resolution is achieved with the first photoelectron timing. The calculated timing resolution suggests that a leading edge discriminator gives better timing performance than a constant fraction discriminator and produces comparable results when a two-threshold or three-threshold discriminator is used. For a typical PMT, the effect of detector noise on the timing resolution is negligible. 
The calculated timing resolution is found to improve with increasing mean photoelectron yield, decreasing scintillator decay time and decreasing transit time spread. However, substantial improvement in the timing resolution from a reduced transit time spread is obtained only if the first-photoelectron timing is less than the transit time spread. While the calculated timing performance does not appear to be affected by the pixel size of the crystal, it improves for an etched crystal compared with a polished crystal. In addition, the calculated timing resolution degrades with increasing crystal length. These observations can be explained by studying the initial photoelectron rate. Experimental measurements show reasonably good agreement with the calculated timing resolution. The Monte Carlo analysis developed in this work will allow us to optimize scintillation detectors for timing and to understand the physical factors limiting their performance.
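The first-photoelectron time pick-off described above can be sketched as a toy Monte Carlo: sample photoelectron emission times from an exponential scintillator decay, add Gaussian PMT transit-time spread, and take the earliest photoelectron per event. All parameter values (decay constant, yields, TTS) are illustrative assumptions, not the paper's Geant4-derived rate function, and the scintillator rise time and single-electron response are ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_photoelectron_times(n_events=1000, mean_yield=2000,
                              tau_decay=40.0, tts_sigma=0.28):
    """First-photoelectron arrival times (ns) for scintillation events:
    exponential decay emission plus Gaussian PMT transit-time spread.
    Parameter values are illustrative (roughly LSO-like), not the paper's."""
    times = np.empty(n_events)
    for i in range(n_events):
        n_pe = rng.poisson(mean_yield)            # photoelectron yield per event
        emit = rng.exponential(tau_decay, n_pe)   # scintillator decay
        emit += rng.normal(0.0, tts_sigma, n_pe)  # PMT transit-time spread
        times[i] = emit.min()                     # first-photoelectron pick-off
    return times

def fwhm(t):
    return 2.355 * t.std()                        # Gaussian-equivalent FWHM

t_low_yield = first_photoelectron_times(mean_yield=500)
t_high_yield = first_photoelectron_times(mean_yield=4000)
```

Consistent with the abstract, the higher-yield detector produces a narrower first-photoelectron distribution, i.e. better timing resolution.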

  16. The timing resolution of scintillation-detector systems: Monte Carlo analysis.

    PubMed

    Choong, Woon-Seng

    2009-11-07

    Recent advancements in fast scintillating materials and fast photomultiplier tubes (PMTs) have stimulated renewed interest in time-of-flight (TOF) positron emission tomography (PET). It is well known that the improvement in the timing resolution in PET can significantly reduce the noise variance in the reconstructed image resulting in improved image quality. In order to evaluate the timing performance of scintillation detectors used in TOF PET, we use Monte Carlo analysis to model the physical processes (crystal geometry, crystal surface finish, scintillator rise time, scintillator decay time, photoelectron yield, PMT transit time spread, PMT single-electron response, amplifier response and time pick-off method) that can contribute to the timing resolution of scintillation-detector systems. In the Monte Carlo analysis, the photoelectron emissions are modeled by a rate function, which is used to generate the photoelectron time points. The rate function, which is simulated using Geant4, represents the combined intrinsic light emissions of the scintillator and the subsequent light transport through the crystal. The PMT output signal is determined by the superposition of the PMT single-electron response resulting from the photoelectron emissions. The transit time spread and the single-electron gain variation of the PMT are modeled in the analysis. Three practical time pick-off methods are considered in the analysis. Statistically, the best timing resolution is achieved with the first photoelectron timing. The calculated timing resolution suggests that a leading edge discriminator gives better timing performance than a constant fraction discriminator and produces comparable results when a two-threshold or three-threshold discriminator is used. For a typical PMT, the effect of detector noise on the timing resolution is negligible. 
The calculated timing resolution is found to improve with increasing mean photoelectron yield, decreasing scintillator decay time and decreasing transit time spread. However, substantial improvement in the timing resolution from a reduced transit time spread is obtained only if the first-photoelectron timing is less than the transit time spread. While the calculated timing performance does not appear to be affected by the pixel size of the crystal, it improves for an etched crystal compared with a polished crystal. In addition, the calculated timing resolution degrades with increasing crystal length. These observations can be explained by studying the initial photoelectron rate. Experimental measurements show reasonably good agreement with the calculated timing resolution. The Monte Carlo analysis developed in this work will allow us to optimize scintillation detectors for timing and to understand the physical factors limiting their performance.

  17. Two- and three-dimensional transvaginal ultrasound with power Doppler angiography and gel infusion sonography for diagnosis of endometrial malignancy.

    PubMed

    Dueholm, M; Christensen, J W; Rydbjerg, S; Hansen, E S; Ørtoft, G

    2015-06-01

To evaluate the diagnostic efficiency of two-dimensional (2D) and three-dimensional (3D) transvaginal ultrasonography, power Doppler angiography (PDA) and gel infusion sonography (GIS) at offline analysis for recognition of malignant endometrium compared with real-time evaluation during scanning, and to determine optimal image parameters at 3D analysis. One hundred and sixty-nine consecutive women with postmenopausal bleeding and endometrial thickness ≥ 5 mm underwent systematic evaluation of endometrial pattern on 2D imaging, and 2D videoclips and 3D volumes were later analyzed offline. Histopathological findings at hysteroscopy or hysterectomy were used as the reference standard. The efficiency of the different techniques for diagnosis of malignancy was calculated and compared. 3D image parameters, endometrial volume and 3D vascular indices were assessed. Optimal 3D image parameters were transformed by logistic regression into a risk of endometrial cancer (REC) score, including scores for body mass index, endometrial thickness and endometrial morphology on gray-scale imaging, PDA and GIS. Offline 2D and 3D analyses were equivalent, but had lower diagnostic performance than real-time evaluation during scanning. Their diagnostic performance was not markedly improved by the addition of PDA or GIS, but their efficiency was comparable with that of real-time 2D-GIS in offline examinations of good image quality. On logistic regression, the 3D parameters from the REC-score system had the highest diagnostic efficiency. The area under the curve of the REC-score system at 3D-GIS (0.89) was not improved by inclusion of vascular indices or endometrial volume calculations. Real-time evaluation during scanning is most efficient, but offline 2D and 3D analysis is useful for prediction of endometrial cancer when good image quality can be obtained. 
The diagnostic efficiency at 3D analysis may be improved by use of REC-scoring systems, without the need for calculation of vascular indices or endometrial volume. The optimal imaging modality appears to be real-time 2D-GIS. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.

  18. Comparative ergonomic workflow and user experience analysis of MRI versus fluoroscopy-guided vascular interventions: an iliac angioplasty exemplar case study.

    PubMed

    Fernández-Gutiérrez, Fabiola; Martínez, Santiago; Rube, Martin A; Cox, Benjamin F; Fatahi, Mahsa; Scott-Brown, Kenneth C; Houston, J Graeme; McLeod, Helen; White, Richard D; French, Karen; Gueorguieva, Mariana; Immel, Erwin; Melzer, Andreas

    2015-10-01

A methodological framework is introduced to assess and compare a conventional fluoroscopy protocol for peripheral angioplasty with a new magnetic resonance imaging (MRI)-guided protocol. Different scenarios were considered during interventions on a perfused arterial phantom with regard to time-based and cognitive task analysis, user experience and ergonomics. Three clinicians with different expertise performed a total of 43 simulated common iliac angioplasties (9 fluoroscopic, 34 MRI-guided) in two blocks of sessions. Six different configurations for MRI guidance were tested in the first block. Four of them were evaluated in the second block and compared to the fluoroscopy protocol. The durations of the relevant stages were collected, and interventions were audio-visually recorded from different perspectives. A cued retrospective protocol analysis (CRPA) was undertaken, including personal interviews. In addition, ergonomic constraints in the MRI suite were evaluated. Significant performance differences were found between the MRI configurations and fluoroscopy. Two configurations [with times of 8.56 (0.64) and 9.48 (1.13) min] led to reduced procedure times for MRI guidance, comparable to fluoroscopy [8.49 (0.75) min]. The CRPA pointed out the main factors influencing clinical procedure performance. The ergonomic analysis quantified musculoskeletal risks for interventional radiologists when utilising MRI. Several alternatives were suggested to prevent potential low-back injuries. This work presents a step towards the implementation of efficient operational protocols for MRI-guided procedures based on an integral and multidisciplinary framework, applicable to the assessment of current vascular protocols. The use of the first-user perspective raises the possibility of establishing new forms of clinical training and education.

  19. Induction of osteoporosis with its influence on osteoporotic determinants and their interrelationships in rats by DEXA

    PubMed Central

    Heiss, Christian; Govindarajan, Parameswari; Schlewitz, Gudrun; Hemdan, Nasr Y.A.; Schliefke, Nathalie; Alt, Volker; Thormann, Ulrich; Lips, Katrin Susanne; Wenisch, Sabine; Langheinrich, Alexander C.; Zahner, Daniel; Schnettler, Reinhard

    2012-01-01

Background: As women are the population most affected by multifactorial osteoporosis, research is focused on unraveling the underlying mechanism of osteoporosis induction in rats by combining ovariectomy (OVX) either with calcium, phosphorus, vitamin C and vitamin D2/D3 deficiency, or with administration of glucocorticoid (dexamethasone). Material/Methods: Different skeletal sites of sham, OVX-Diet and OVX-Steroid rats were analyzed by Dual Energy X-ray Absorptiometry (DEXA) at time points of 0, 4 and 12 weeks to determine and compare osteoporotic factors such as bone mineral density (BMD), bone mineral content (BMC), area, body weight and percent fat among the different groups and time points. Interrelationships among osteoporotic determinants were also assessed by regression analysis. Results: T scores were below −2.5 in OVX-Diet rats at 4 and 12 weeks post-OVX. OVX-Diet rats showed a more pronounced osteoporotic status, with lower BMD and BMC than their steroid counterparts, the spine and pelvis being the most affected skeletal sites. An increase in percent fat was observed irrespective of the osteoporosis inducer applied. Comparative analysis and interrelationships between osteoporotic determinants, rarely studied in animals, indicate the necessity of analyzing BMC and area along with BMD to obtain meaningful information for proper prediction of the probability of osteoporotic fractures. Conclusions: The enhanced osteoporotic effect observed in OVX-Diet rats indicates that estrogen dysregulation combined with diet treatment induces and enhances osteoporosis over time compared with the steroid group. Comparative and regression analysis indicates the need to determine BMC along with BMD and area in osteoporotic assessment. PMID:22648240

  20. Comparative Analysis of AhR-Mediated TCDD-Elicited Gene Expression in Human Liver Adult Stem Cells

    PubMed Central

    Kim, Suntae; Dere, Edward; Burgoon, Lyle D.; Chang, Chia-Cheng; Zacharewski, Timothy R.

    2009-01-01

    Time course and dose-response studies were conducted in HL1-1 cells, a human liver cell line with stem cell–like characteristics, to assess the differential gene expression elicited by 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) compared with other established models. Cells were treated with 0.001, 0.01, 0.1, 1, 10, or 100 nM TCDD or dimethyl sulfoxide vehicle control for 12 h for the dose-response study, or with 10 nM TCDD or vehicle for 1, 2, 4, 8, 12, 24, or 48 h for the time course study. Elicited changes were monitored using a human cDNA microarray with 6995 represented genes. Empirical Bayes analysis identified 144 genes differentially expressed at one or more time points following treatment. Most genes exhibited dose-dependent responses including CYP1A1, CYP1B1, ALDH1A3, and SLC7A5 genes. Comparative analysis of HL1-1 differential gene expression to human HepG2 data identified 74 genes with comparable temporal expression profiles including 12 putative primary responses. HL1-1–specific changes were related to lipid metabolism and immune responses, consistent with effects elicited in vivo. Furthermore, comparative analysis of HL1-1 cells with mouse Hepa1c1c7 hepatoma cell lines and C57BL/6 hepatic tissue identified 18 and 32 commonly regulated orthologous genes, respectively, with functions associated with signal transduction, transcriptional regulation, metabolism and transport. Although some common pathways are affected, the results suggest that TCDD elicits species- and model-specific gene expression profiles. PMID:19684285

  1. Survival time and effect of selected predictor variables on survival in owned pet cats seropositive for feline immunodeficiency and leukemia virus attending a referral clinic in northern Italy.

    PubMed

    Spada, Eva; Perego, Roberta; Sgamma, Elena Assunta; Proverbio, Daniela

    2018-02-01

Feline immunodeficiency virus (FIV) and feline leukemia virus (FeLV) are among the most important feline infectious diseases worldwide. This retrospective study investigated survival times and the effects of selected predictor factors on survival time in a population of owned pet cats in Northern Italy testing positive for the presence of FIV antibodies and FeLV antigen. One hundred and three retrovirus-seropositive cats (53 FIV-seropositive, 40 FeLV-seropositive, and 10 FIV+FeLV-seropositive) were included in the study. A population of 103 retrovirus-seronegative age- and sex-matched cats was selected. Survival time was calculated and compared between retrovirus-seronegative, FIV-, FeLV- and FIV+FeLV-seropositive cats using Kaplan-Meier survival analysis. Cox proportional-hazards regression analysis was used to study the effect of selected predictor factors (male gender, peripheral blood cytopenia as reduced red blood cell (RBC) count, leukopenia, neutropenia and lymphopenia, hypercreatininemia and reduced albumin-to-globulin ratio) on survival time in the retrovirus-seropositive populations. Median survival times for seronegative, FIV-, FeLV- and FIV+FeLV-seropositive cats were 3960, 2040, 714 and 77 days, respectively. Compared with retrovirus-seronegative cats, median survival time was significantly lower (P < 0.001) in FeLV- and FIV+FeLV-seropositive cats. Median survival time in FeLV- and FIV+FeLV-seropositive cats was also significantly lower (P < 0.001) when compared with FIV-seropositive cats. The hazard ratio of death in FeLV- and FIV+FeLV-seropositive cats was 3.4 and 7.4 times higher, respectively, than in seronegative cats, and 2.3 and 4.8 times higher than in FIV-seropositive cats. 
A Cox proportional-hazards regression analysis showed that FIV and FeLV-seropositive cats with reduced RBC counts at time of diagnosis of seropositivity had significantly shorter survival times when compared to FIV and FeLV-seropositive cats with normal RBC counts at diagnosis. In summary, FIV-seropositive status did not significantly affect longevity of cats in this study, unlike FeLV and FIV+FeLV-seropositivity. Reduced RBC counts at time of FIV and FeLV diagnosis could impact negatively on the longevity of seropositive cats and therefore blood counts should always be evaluated at diagnosis and follow-up of retrovirus-seropositive cats. Copyright © 2017 Elsevier B.V. All rights reserved.
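The Kaplan-Meier median survival comparison used in the study can be illustrated with a minimal sketch. The toy survival times and censoring flags below are invented for illustration and are not the study's data.

```python
import numpy as np

def kaplan_meier_median(times, events):
    """Kaplan-Meier estimator: returns the first time at which the survival
    curve drops to <= 0.5 (the median survival time), or None if the median
    is not reached. `events` is 1 = death, 0 = censored."""
    order = np.argsort(times)
    times = np.asarray(times)[order]
    events = np.asarray(events)[order]
    n_at_risk = len(times)
    surv = 1.0
    for t, e in zip(times, events):
        if e:                              # deaths shrink the survival curve
            surv *= 1.0 - 1.0 / n_at_risk
        n_at_risk -= 1                     # censored subjects leave the risk set
        if surv <= 0.5:
            return t
    return None

# Invented toy cohorts (days): the seropositive group dies earlier.
median_pos = kaplan_meier_median([77, 120, 300, 714, 900, 1200],
                                 [1, 1, 1, 1, 0, 1])
median_neg = kaplan_meier_median([900, 1500, 2040, 3000, 3960, 5000],
                                 [0, 1, 1, 1, 1, 0])
```

Censored observations reduce the number at risk without dropping the curve, which is why simply averaging observed death times would bias the comparison.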

  2. Compressive buckling analysis of hat-stiffened panel

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1991-01-01

Buckling analysis was performed on a hat-stiffened panel subjected to uniaxial compression. Both local buckling and global buckling were analyzed. It was found that the global buckling load was several times higher than the local buckling load. The predicted local buckling loads compared favorably with both experimental data and finite-element analysis.

  3. Influence of Time-Series Normalization, Number of Nodes, Connectivity and Graph Measure Selection on Seizure-Onset Zone Localization from Intracranial EEG.

    PubMed

    van Mierlo, Pieter; Lie, Octavian; Staljanssens, Willeke; Coito, Ana; Vulliémoz, Serge

    2018-04-26

We investigated the influence of processing steps in the estimation of multivariate directed functional connectivity during seizures recorded with intracranial EEG (iEEG) on seizure-onset zone (SOZ) localization. We studied the effect of (i) the number of nodes, (ii) time-series normalization, (iii) the choice of multivariate time-varying connectivity measure: the Adaptive Directed Transfer Function (ADTF) or Adaptive Partial Directed Coherence (APDC), and (iv) graph theory measure: outdegree or shortest path length. First, simulations were performed to quantify the influence of the various processing steps on the accuracy of SOZ localization. Afterwards, the SOZ was estimated from a 113-electrode iEEG seizure recording and compared with the resection that rendered the patient seizure-free. The simulations revealed that the ADTF is preferred over the APDC to localize the SOZ from ictal iEEG recordings. Normalizing the time series before analysis resulted in a 25-35% increase in correctly localized SOZs, while adding more nodes to the connectivity analysis led to a moderate decrease of 10% when comparing 128 with 32 input nodes. The real-seizure connectivity estimates localized the SOZ inside the resection area using the ADTF coupled to outdegree or shortest path length. Our study showed that normalizing the time series is an important pre-processing step, while adding nodes to the analysis only marginally affected SOZ localization. The study shows that directed multivariate Granger-based connectivity analysis is feasible with many input nodes (>100) and that normalization of the time series before connectivity analysis is preferred.
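Two of the processing steps examined, z-scoring each channel's time series and ranking nodes by outdegree, can be sketched generically. The connectivity matrix below is a made-up toy, not an ADTF estimate, and the functions are illustrative helpers rather than the authors' pipeline.

```python
import numpy as np

def zscore(x, axis=-1):
    """Normalize each channel's time series (the pre-processing step the
    study found important before connectivity estimation)."""
    x = np.asarray(x, float)
    return (x - x.mean(axis, keepdims=True)) / x.std(axis, keepdims=True)

def outdegree_soz(conn):
    """Given a directed connectivity matrix conn[i, j] = influence of node i
    on node j (diagonal ignored), return per-node outdegree and the index of
    the putative seizure-onset node. A generic graph measure, not the ADTF
    estimator itself."""
    conn = np.asarray(conn, float).copy()
    np.fill_diagonal(conn, 0.0)
    out = conn.sum(axis=1)          # sum of outgoing connection weights
    return out, int(out.argmax())

# Toy 4-node network: node 2 drives all the others.
C = np.array([[0.0, 0.1, 0.0, 0.1],
              [0.1, 0.0, 0.1, 0.0],
              [0.6, 0.7, 0.0, 0.5],
              [0.0, 0.1, 0.1, 0.0]])
deg, soz = outdegree_soz(C)
```

In this toy network the driver node has the largest outgoing weight sum, which is the intuition behind using outdegree to flag the seizure-onset zone.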

  4. Nonlinear Analysis of Motor Activity Shows Differences between Schizophrenia and Depression: A Study Using Fourier Analysis and Sample Entropy

    PubMed Central

    Hauge, Erik R.; Berle, Jan Øystein; Oedegaard, Ketil J.; Holsten, Fred; Fasmer, Ole Bernt

    2011-01-01

    The purpose of this study has been to describe motor activity data obtained by using wrist-worn actigraphs in patients with schizophrenia and major depression by the use of linear and non-linear methods of analysis. Different time frames were investigated, i.e., activity counts measured every minute for up to five hours and activity counts made hourly for up to two weeks. The results show that motor activity was lower in the schizophrenic patients and in patients with major depression, compared to controls. Using one minute intervals the depressed patients had a higher standard deviation (SD) compared to both the schizophrenic patients and the controls. The ratio between the root mean square successive differences (RMSSD) and SD was higher in the schizophrenic patients compared to controls. The Fourier analysis of the activity counts measured every minute showed that the relation between variance in the low and the high frequency range was lower in the schizophrenic patients compared to the controls. The sample entropy was higher in the schizophrenic patients compared to controls in the time series from the activity counts made every minute. The main conclusions of the study are that schizophrenic and depressive patients have distinctly different profiles of motor activity and that the results differ according to period length analysed. PMID:21297977
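The linear and nonlinear measures named above (SD, RMSSD and sample entropy) can be sketched as follows. This is a plain textbook implementation, not the authors' analysis code, and the sine/noise series stand in for regular versus irregular activity records.

```python
import numpy as np

def rmssd(x):
    """Root mean square of successive differences."""
    d = np.diff(np.asarray(x, float))
    return float(np.sqrt(np.mean(d * d)))

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r): -log of the conditional probability that
    sequences matching for m points also match for m + 1 points.
    Plain O(n^2) implementation for illustration only."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()                  # common tolerance choice
    def matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ)):
            dist = np.max(np.abs(templ - templ[i]), axis=1)
            c += int(np.sum(dist <= r)) - 1   # exclude the self-match
        return c
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(1)
regular = np.sin(0.3 * np.arange(300))     # predictable signal: low SampEn
noise = rng.standard_normal(300)           # unpredictable signal: high SampEn
```

A predictable series yields low sample entropy and a low RMSSD/SD ratio; an irregular series yields high values of both, which is the direction of the group differences reported above.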

  5. High-Speed Real-Time Resting-State fMRI Using Multi-Slab Echo-Volumar Imaging

    PubMed Central

    Posse, Stefan; Ackley, Elena; Mutihac, Radu; Zhang, Tongsheng; Hummatov, Ruslan; Akhtari, Massoud; Chohan, Muhammad; Fisch, Bruce; Yonas, Howard

    2013-01-01

    We recently demonstrated that ultra-high-speed real-time fMRI using multi-slab echo-volumar imaging (MEVI) significantly increases sensitivity for mapping task-related activation and resting-state networks (RSNs) compared to echo-planar imaging (Posse et al., 2012). In the present study we characterize the sensitivity of MEVI for mapping RSN connectivity dynamics, comparing independent component analysis (ICA) and a novel seed-based connectivity analysis (SBCA) that combines sliding-window correlation analysis with meta-statistics. This SBCA approach is shown to minimize the effects of confounds, such as movement, and CSF and white matter signal changes, and enables real-time monitoring of RSN dynamics at time scales of tens of seconds. We demonstrate highly sensitive mapping of eloquent cortex in the vicinity of brain tumors and arterio-venous malformations, and detection of abnormal resting-state connectivity in epilepsy. In patients with motor impairment, resting-state fMRI provided focal localization of sensorimotor cortex compared with more diffuse activation in task-based fMRI. The fast acquisition speed of MEVI enabled segregation of cardiac-related signal pulsation using ICA, which revealed distinct regional differences in pulsation amplitude and waveform, elevated signal pulsation in patients with arterio-venous malformations and a trend toward reduced pulsatility in gray matter of patients compared with healthy controls. Mapping cardiac pulsation in cortical gray matter may carry important functional information that distinguishes healthy from diseased tissue vasculature. This novel fMRI methodology is particularly promising for mapping eloquent cortex in patients with neurological disease, having variable degree of cooperation in task-based fMRI. 
In conclusion, ultra-high-speed real-time fMRI enhances the sensitivity of mapping the dynamics of resting-state connectivity and cerebro-vascular pulsatility for clinical and neuroscience research applications. PMID:23986677

  6. Ultrasonic dissection versus conventional electrocautery during gastrectomy for gastric cancer: a meta-analysis of randomized controlled trials.

    PubMed

    Sun, Z C; Xu, W G; Xiao, X M; Yu, W H; Xu, D M; Xu, H M; Gao, H L; Wang, R X

    2015-04-01

Use of ultrasonic surgical instruments is gaining popularity for dissection and coagulation in open surgery. However, there is still no consensus on the efficacy and safety of their use compared with conventional surgical technique in open gastrectomy for gastric cancer. The aim of this meta-analysis was to evaluate the role and surgical outcomes of ultrasonic dissection (UD) compared with conventional electrocautery (EC). A systematic literature search was performed to identify all studies comparing UD and EC in gastric cancer surgery. Intraoperative and postoperative outcomes were compared using weighted mean differences (WMDs) and odds ratios (ORs). Five studies comprising 489 patients were included in this meta-analysis. The results showed that, compared with EC, UD was associated with significantly shorter operation time (P = 0.03), less intraoperative blood loss (P = 0.002), lower morbidity (P = 0.02), and reduced postoperative hospital stay (P = 0.03). However, there was no significant difference between the two surgical techniques with regard to postoperative abdominal drainage (P = 0.17) and total cost in hospital (P = 0.59). Compared with EC, the use of UD during open gastrectomy can provide improved outcomes for operation time, intraoperative blood loss, overall morbidity, and postoperative hospital stay. It appears that UD can be used instead of conventional EC in open gastric cancer surgery, although larger trials with long follow-up should be performed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Impact of perioperative blood pressure variability on health resource utilization after cardiac surgery: an analysis of the ECLIPSE trials.

    PubMed

    Aronson, Solomon; Levy, Jerrold H; Lumb, Philip D; Fontes, Manuel; Wang, Yamei; Crothers, Tracy A; Sulham, Katherine A; Navetta, Marco S

    2014-06-01

To examine the impact of blood pressure control on hospital health resource utilization using data from the ECLIPSE trials. Post-hoc analysis of data from 3 prospective, open-label, randomized clinical trials (ECLIPSE trials). Sixty-one medical centers in the United States. Patients 18 years or older undergoing cardiac surgery. Clevidipine was compared with nitroglycerin, sodium nitroprusside, and nicardipine. The ECLIPSE trials included 3 individual randomized open-label studies comparing clevidipine to nitroglycerin, sodium nitroprusside, and nicardipine. Blood pressure control was assessed as the integral of the cumulative area under the curve (AUC) outside specified systolic blood pressure ranges, such that a lower AUC represents less variability. This analysis examined surgery duration, time to extubation, and intensive care unit (ICU) and hospital length of stay (LOS) in patients with AUC≤10 mmHg×min/h compared to patients with AUC>10 mmHg×min/h. One thousand four hundred ten patients were included for analysis; 736 patients (52%) had an AUC≤10 mmHg×min/h, and 674 (48%) had an AUC>10 mmHg×min/h. The duration of surgery and ICU LOS were similar between groups. Time to extubation and postoperative LOS were both significantly shorter (p = 0.05 and p<0.0001, respectively) in patients with AUC≤10. Multivariate analysis demonstrated that AUC≤10 was significantly and independently associated with decreased time to extubation (hazard ratio 1.132, p = 0.0261) and postoperative LOS (hazard ratio 1.221, p = 0.0006). Based on data derived from the ECLIPSE studies, increased perioperative BP variability is associated with delayed time to extubation and increased postoperative LOS. Copyright © 2014 Elsevier Inc. All rights reserved.
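The AUC metric, the cumulative area of systolic pressure excursions outside a target range normalized per hour of monitoring, can be sketched as follows. The threshold values, function name and sample trace are hypothetical illustrations, not the ECLIPSE protocol's exact ranges.

```python
import numpy as np

def bp_auc_per_hour(t_min, sbp, low=75.0, high=135.0):
    """Cumulative area (mmHg x min) of systolic blood pressure excursions
    outside [low, high], normalized per hour of monitoring -- the style of
    AUC metric analyzed above. Thresholds here are hypothetical."""
    t = np.asarray(t_min, float)
    s = np.asarray(sbp, float)
    # Distance outside the target band at each sample (0 when in range).
    excess = np.maximum(s - high, 0.0) + np.maximum(low - s, 0.0)
    # Trapezoidal integration over time.
    area = float(np.sum(0.5 * (excess[1:] + excess[:-1]) * np.diff(t)))
    hours = (t[-1] - t[0]) / 60.0
    return area / hours

# One hour of monitoring with a brief hypertensive excursion to 145 mmHg.
auc = bp_auc_per_hour([0.0, 30.0, 60.0], [135.0, 145.0, 135.0])
```

Patients whose trace stays inside the band accumulate zero area, so a small AUC per hour corresponds to tight blood pressure control.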

  8. Comparing sequencing assays and human-machine analyses in actionable genomics for glioblastoma

    PubMed Central

    Wrzeszczynski, Kazimierz O.; Frank, Mayu O.; Koyama, Takahiko; Rhrissorrakrai, Kahn; Robine, Nicolas; Utro, Filippo; Emde, Anne-Katrin; Chen, Bo-Juen; Arora, Kanika; Shah, Minita; Vacic, Vladimir; Norel, Raquel; Bilal, Erhan; Bergmann, Ewa A.; Moore Vogel, Julia L.; Bruce, Jeffrey N.; Lassman, Andrew B.; Canoll, Peter; Grommes, Christian; Harvey, Steve; Parida, Laxmi; Michelini, Vanessa V.; Zody, Michael C.; Jobanputra, Vaidehi; Royyuru, Ajay K.

    2017-01-01

    Objective: To analyze a glioblastoma tumor specimen with 3 different platforms and compare potentially actionable calls from each. Methods: Tumor DNA was analyzed by a commercial targeted panel. In addition, tumor-normal DNA was analyzed by whole-genome sequencing (WGS) and tumor RNA was analyzed by RNA sequencing (RNA-seq). The WGS and RNA-seq data were analyzed by a team of bioinformaticians and cancer oncologists, and separately by IBM Watson Genomic Analytics (WGA), an automated system for prioritizing somatic variants and identifying drugs. Results: More variants were identified by WGS/RNA analysis than by targeted panels. WGA completed a comparable analysis in a fraction of the time required by the human analysts. Conclusions: The development of an effective human-machine interface in the analysis of deep cancer genomic datasets may provide potentially clinically actionable calls for individual patients in a more timely and efficient manner than currently possible. ClinicalTrials.gov identifier: NCT02725684. PMID:28740869

  9. Neural correlates of gait variability in people with multiple sclerosis with fall history.

    PubMed

    Kalron, Alon; Allali, Gilles; Achiron, Anat

    2018-05-28

To investigate the association between step time variability and related brain structures according to fall status in people with multiple sclerosis (PwMS). The study included 225 PwMS. Whole-brain MRI was performed with a high-resolution 3.0-Tesla MR scanner, in addition to volumetric analysis based on 3D T1-weighted images using the FreeSurfer image analysis suite. Step time variability was measured by an electronic walkway. Participants were defined as "fallers" (at least two falls during the previous year) or "non-fallers". One hundred and five PwMS were defined as fallers and had greater step time variability compared with non-fallers (5.6% (S.D.=3.4) vs. 3.4% (S.D.=1.5); p=0.001). MS fallers exhibited reduced volume in the left caudate and both cerebellar hemispheres compared with non-fallers. Using linear regression analysis, no association was found between gait variability and related brain structures in the total cohort and the non-faller group. However, the analysis found an association of left hippocampus and left putamen volumes with step time variability in the faller group (p=0.031 and 0.048, respectively), controlling for total cranial volume, walking speed, disability, age and gender. Nevertheless, according to the hierarchical regression model, the contribution of these brain measures to predicting gait variability was relatively small compared with walking speed. An association between low left hippocampal and putamen volumes and step time variability was found in PwMS with a history of falls, suggesting that brain structural characteristics may be related to falls and increased gait variability in PwMS. This article is protected by copyright. All rights reserved.

  10. Part-time versus full-time occlusion therapy for treatment of amblyopia: A meta-analysis.

    PubMed

    Yazdani, Negareh; Sadeghi, Ramin; Momeni-Moghaddam, Hamed; Zarifmahmoudi, Leili; Ehsaei, Asieh; Barrett, Brendan T

    2017-06-01

To compare full-time occlusion (FTO) and part-time occlusion (PTO) therapy in the treatment of amblyopia, with the secondary aim of evaluating the minimum number of hours of part-time patching required for maximal effect from occlusion. A literature search was performed in PubMed, Scopus, Science Direct, Ovid, Web of Science and the Cochrane library. Methodological quality of the literature was evaluated according to the Oxford Centre for Evidence-Based Medicine and the modified Newcastle-Ottawa scale. Statistical analyses were performed using Comprehensive Meta-Analysis (version 2, Biostat Inc., USA). The present meta-analysis included six studies [three randomized controlled trials (RCTs) and three non-RCTs]. The pooled standardized difference in mean visual acuity change was 0.337 [lower and upper limits: -0.009, 0.683] higher in the FTO than in the PTO group; however, this difference was not statistically significant (P = 0.056, Cochrane Q = 20.4 (P = 0.001), I² = 75.49%). Egger's regression intercept was 5.46 (P = 0.04). The pooled standardized difference in mean visual acuity change was 1.097 [lower and upper limits: 0.68, 1.513] higher in the FTO arm (P < 0.001), and 0.7 [lower and upper limits: 0.315, 1.085] higher in the PTO arm (P < 0.001), compared with PTO of less than two hours. This meta-analysis shows no statistically significant difference between PTO and FTO in the treatment of amblyopia. However, our results suggest that the minimum effective PTO duration to observe maximal improvement in visual acuity is six hours per day.
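The heterogeneity statistics reported above (Cochran's Q and I²) come from standard inverse-variance pooling, which can be sketched in a few lines. The effect sizes below are toy numbers, not the included studies' data.

```python
import numpy as np

def pooled_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooled estimate, with Cochran's Q and
    the I^2 heterogeneity statistic reported alongside it."""
    y = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)     # inverse-variance weights
    pooled = float(np.sum(w * y) / np.sum(w))
    q = float(np.sum(w * (y - pooled) ** 2))   # Cochran's Q
    df = len(y) - 1
    # I^2: proportion of total variation due to between-study heterogeneity.
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, q, i2

# Toy effect sizes with equal variances, heterogeneous on purpose.
pooled, q, i2 = pooled_fixed_effect([0.0, 0.5, 1.0], [0.01, 0.01, 0.01])
```

A large I² (as in the abstract's 75.49%) signals that the studies disagree beyond sampling error, which is why pooled estimates there should be read cautiously.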

  11. Cross-recurrence quantification analysis of categorical and continuous time series: an R package

    PubMed Central

    Coco, Moreno I.; Dale, Rick

    2014-01-01

This paper describes the R package crqa, which performs cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order to capture closely the goings-on of dynamic interaction, and uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. The methods available in crqa allow researchers in cognitive science to pose such questions as how recurrent two people are at some level of analysis, what the characteristic lag time is for one person to maximally match another, or whether one person is leading another. First, we set the theoretical ground for understanding the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence and show with the current package how to carry out analyses applying them. We end the paper by comparing the computational efficiency and consistency of results of the crqa R package with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736

  12. Detrended fluctuation analysis of non-stationary cardiac beat-to-beat interval of sick infants

    NASA Astrophysics Data System (ADS)

    Govindan, Rathinaswamy B.; Massaro, An N.; Al-Shargabi, Tareq; Niforatos Andescavage, Nickie; Chang, Taeun; Glass, Penny; du Plessis, Adre J.

    2014-11-01

    We performed detrended fluctuation analysis (DFA) of cardiac beat-to-beat intervals (RRis) collected from sick newborn infants over 1-4 day periods. We calculated four different metrics from the DFA fluctuation function: the DFA exponents αL (>40 beats, up to one-fourth of the record length) and αS (15-30 beats), the root-mean-square (RMS) fluctuation on a short time scale (20-50 beats), and the RMS fluctuation on a long time scale (110-150 beats). Except for αL, all metrics clearly distinguished two groups of newborn infants (favourable vs. adverse) with well-characterized outcomes. However, the RMS fluctuations distinguished the two groups more consistently over time than αS did. Furthermore, RMS distinguished the RRi of the two groups earlier than the DFA exponent. In all three measures, the favourable outcome group displayed higher values, indicating a higher magnitude of (auto)correlation and variability, and thus normal physiology, compared to the adverse outcome group.
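    The DFA procedure underlying these metrics (integrate the demeaned series, detrend it linearly within windows of increasing size, and read the scaling exponent off the log-log slope of RMS fluctuation versus window size) can be sketched as follows. This is a generic DFA-1 illustration, not the authors' code:

```python
import math, random

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis (DFA-1): integrate the demeaned
    series, remove a linear trend inside windows of each size, and fit
    the log-log slope of RMS fluctuation F(n) against window size n."""
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:                       # cumulative sum (the "profile")
        s += v - mean
        y.append(s)
    log_n, log_f = [], []
    for n in scales:
        sq, count = 0.0, 0
        for start in range(0, len(y) - n + 1, n):
            seg = y[start:start + n]
            tm = (n - 1) / 2.0
            sm = sum(seg) / n
            sxx = sum((t - tm) ** 2 for t in range(n))
            slope = sum((t - tm) * (si - sm)
                        for t, si in enumerate(seg)) / sxx
            for t, si in enumerate(seg):
                resid = si - (sm + slope * (t - tm))
                sq += resid * resid
            count += n
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sq / count)))
    k = len(scales)
    mn, mf = sum(log_n) / k, sum(log_f) / k
    return sum((a - mn) * (b - mf) for a, b in zip(log_n, log_f)) / \
           sum((a - mn) ** 2 for a in log_n)

random.seed(1)
white = [random.gauss(0, 1) for _ in range(4096)]
alpha = dfa_alpha(white, [8, 16, 32, 64, 128])
print(round(alpha, 2))   # white noise should scale with alpha near 0.5
```

Uncorrelated noise yields α near 0.5, while long-range correlated physiology (like healthy RRi series) yields larger exponents, which is what the αL/αS comparison in the abstract exploits.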

  13. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability to anticipate the ground shaking from future strong earthquakes before they can be appropriately used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in identifying the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake.
In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which permits a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which warrants the reliability of the analysis. By analysing the seismicity of the Vrancea region in some detail, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  14. Estimation and analysis of multifactor productivity in truck transportation : 1987 - 2003

    DOT National Transportation Integrated Search

    2009-02-01

    The analysis has three objectives: 1) to estimate multifactor productivity (MFP) in truck transportation during 1987-2003; 2) to examine changes in multifactor productivity in U.S. truck transportation over time, and to compare these changes...

  15. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  16. Novel Analytic Methods Needed for Real-Time Continuous Core Body Temperature Data

    PubMed Central

    Hertzberg, Vicki; Mac, Valerie; Elon, Lisa; Mutic, Nathan; Mutic, Abby; Peterman, Katherine; Tovar-Aguilar, J. Antonio; Economos, Jeannie; Flocks, Joan; McCauley, Linda

    2017-01-01

    Affordable measurement of core body temperature, Tc, in a continuous, real-time fashion is now possible. With this advance comes a new data analysis paradigm for occupational epidemiology. We characterize issues arising after obtaining Tc data over 188 workdays for 83 participating farmworkers, a population vulnerable to the effects of rising temperatures due to climate change. We describe a novel approach to these data using smoothing and functional data analysis. This approach highlights different aspects of the data compared to describing Tc at a single time point or summarizing the time course into an indicator function (e.g., did Tc ever exceed 38°C, the threshold limit value for occupational heat exposure). Participants working in ferneries had significantly higher Tc at some point during the workday compared to those working in nurseries, despite a shorter workday for fernery participants. Our results typify the challenges and opportunities in analyzing big data streams from real-time physiologic monitoring. PMID:27756853

  17. Novel Analytic Methods Needed for Real-Time Continuous Core Body Temperature Data.

    PubMed

    Hertzberg, Vicki; Mac, Valerie; Elon, Lisa; Mutic, Nathan; Mutic, Abby; Peterman, Katherine; Tovar-Aguilar, J Antonio; Economos, Eugenia; Flocks, Joan; McCauley, Linda

    2016-10-18

    Affordable measurement of core body temperature (Tc) in a continuous, real-time fashion is now possible. With this advance comes a new data analysis paradigm for occupational epidemiology. We characterize issues arising after obtaining Tc data over 188 workdays for 83 participating farmworkers, a population vulnerable to the effects of rising temperatures due to climate change. We describe a novel approach to these data using smoothing and functional data analysis. This approach highlights different aspects of the data compared with describing Tc at a single time point or summarizing the time course into an indicator function (e.g., did Tc ever exceed 38 °C, the threshold limit value for occupational heat exposure). Participants working in ferneries had significantly higher Tc at some point during the workday compared with those working in nurseries, despite a shorter workday for fernery participants. Our results typify the challenges and opportunities in analyzing big data streams from real-time physiologic monitoring.

  18. Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.

    PubMed

    Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X

    2015-01-01

    Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.

  19. A real-world, multi-site, observational study of infusion time and treatment satisfaction with rheumatoid arthritis patients treated with intravenous golimumab or infliximab.

    PubMed

    Daniel, Shoshana R; McDermott, John D; Le, Cathy; Pierce, Christine A; Ziskind, Michael A; Ellis, Lorie A

    2018-05-25

    To assess real-world infusion times for golimumab (GLM-IV) and infliximab (IFX) in rheumatoid arthritis (RA) patients, and factors associated with treatment satisfaction. An observational study assessed infusion time, including clinic visit duration, RA medication preparation and infusion time, and infusion process time. Satisfaction was assessed by a modified Treatment Satisfaction Questionnaire for Medication (patient) and study-specific questionnaires (patient and clinic personnel). Comparative statistical testing for patient data utilized analysis of variance for continuous measures and Fisher's exact or chi-square test for categorical measures. Multivariate analysis was performed for the primary time endpoints and patient satisfaction. One hundred and fifty patients were enrolled from six US sites (72 GLM-IV, 78 IFX). The majority of patients were female (80.0%) and Caucasian (88.7%). GLM-IV required fewer vials per infusion (3.7) compared to IFX (4.9; p = .0001). Clinic visit duration (minutes) was shorter for GLM-IV (65.1) compared to IFX (153.1; p < .0001), as were total infusion time for RA medication (32.8 GLM-IV, 119.5 IFX; p < .0001) and infusion process time (45.8 GLM-IV, 134.1 IFX; p < .0001). Patients treated with GLM-IV reported higher satisfaction ratings for infusion time (p < .0001) and total visit time (p = .0003). Clinic personnel reported higher satisfaction with GLM-IV than IFX specific to medication preparation time, ease of mixing RA medication, frequency of patients requiring pre-medication, and infusion time. Findings may not be representative of care delivery for all RA infusion practices or RA patients. Shorter overall clinic visit duration, infusion process, and RA medication infusion times were observed for GLM-IV compared to IFX. Shorter infusion times were associated with higher patient and clinic personnel satisfaction ratings.

  20. Comparative antianaerobic activities of doripenem determined by MIC and time-kill analysis.

    PubMed

    Credito, Kim L; Ednie, Lois M; Appelbaum, Peter C

    2008-01-01

    Against 447 anaerobe strains, the investigational carbapenem doripenem had an MIC50 of 0.125 µg/ml and an MIC90 of 1 µg/ml. Results were similar to those for imipenem, meropenem, and ertapenem. Time-kill studies showed that doripenem had very good bactericidal activity compared to other carbapenems, with 99.9% killing of 11 strains at 2× MIC after 48 h.

  1. [Comparative clinical analysis of cesarean section technique by Misgav Ladach method and Pfannenstiel method].

    PubMed

    Popiela, A; Pańszczyk, M; Korzeniewski, J; Baranowski, W

    2000-04-01

    Clinical and biochemical parameters were analysed in 55 patients who underwent caesarean section performed using the Misgav Ladach method, compared to a reference group of 41 patients who underwent caesarean section using the Pfannenstiel method. Shorter operation time, shorter hospitalisation time and less postoperative morbidity were observed in the Misgav Ladach group. The Misgav Ladach method therefore appears to have advantages over the Pfannenstiel method.

  2. Laparoscopic anterior versus endoscopic posterior approach for adrenalectomy: a shift to a new golden standard?

    PubMed

    Vrielink, O M; Wevers, K P; Kist, J W; Borel Rinkes, I H M; Hemmer, P H J; Vriens, M R; de Vries, J; Kruijff, S

    2017-08-01

    There has been an increased utilization of the posterior retroperitoneal approach (PRA) for adrenalectomy alongside the "classic" laparoscopic transabdominal technique (LTA). The aim of this study was to compare both procedures based on outcome variables at various ranges of tumor size. A retrospective analysis was performed on 204 laparoscopic transabdominal (UMC Groningen) and 57 retroperitoneal (UMC Utrecht) adrenalectomies between 1998 and 2013. We applied a univariate and multivariate regression analysis. Mann-Whitney and chi-squared tests were used to compare outcome variables between both approaches. Both mean operation time and median blood loss were significantly lower in the PRA group with 102.1 (SD 33.5) vs. 173.3 (SD 59.1) minutes (p < 0.001) and 0 (0-200) vs. 50 (0-1000) milliliters (p < 0.001), respectively. The shorter operation time in PRA was independent of tumor size. Complication rates were higher in the LTA (19.1%) compared to PRA (8.8%). There was no significant difference in recovery time between both approaches. Application of the PRA decreases operation time, blood loss, and complication rates compared to LTA. This might encourage institutions that use the LTA to start using PRA in patients with adrenal tumors, independent of tumor size.

  3. A comparative study of visual reaction time in table tennis players and healthy controls.

    PubMed

    Bhabhor, Mahesh K; Vidja, Kalpesh; Bhanderi, Priti; Dodhia, Shital; Kathrotia, Rajesh; Joshi, Varsha

    2013-01-01

    Visual reaction time is the time required to respond to a visual stimulus. The present study measured visual reaction time in 209 subjects: 50 table tennis (TT) players and 159 healthy controls. Visual reaction time was measured with the direct RT computerized software in healthy controls and table tennis players. Simple visual reaction time was measured: during testing, the visual stimulus was presented eighteen times and the average reaction time was taken as the final reaction time. The study shows that table tennis players had faster reaction times than healthy controls. On multivariate analysis, TT players had reaction times 74.121 sec faster (95% CI 49.4-98.8 sec) than non-TT players of the same age and BMI, and playing TT had a more profound influence on visual reaction time than BMI. Our study concludes that persons involved in sports have good reaction times compared to controls. These results support the view that playing table tennis benefits eye-hand reaction time and improves concentration and alertness.

  4. Pulse oximeter sensor application during neonatal resuscitation: a randomized controlled trial.

    PubMed

    Louis, Deepak; Sundaram, Venkataseshan; Kumar, Praveen

    2014-03-01

    This study was done to compare two techniques of pulse oximeter sensor application during neonatal resuscitation for faster signal detection. Applying the sensor to the infant first (STIF) and then to the oximeter was compared with applying the sensor to the oximeter first (STOF) and then to the infant, in neonates of ≥28 weeks' gestation. The primary outcome was the time from completion of sensor application to a reliable signal, defined as stable display of heart rate and saturation. Time from birth to sensor application, time taken for sensor application, time from birth to reliable signal, and need to reapply the sensor were secondary outcomes. An intention-to-treat analysis was done, and subgroup analyses were done by gestation and need for resuscitation. One hundred fifty neonates were randomized, 75 to each technique. The median (IQR) time from sensor application to detection of a reliable signal was longer in the STIF group compared with the STOF group (16 [15-17] vs. 10 [6-18] seconds; P < 0.001). Time taken for application of the sensor was longer with the STIF technique than with the STOF technique (12 [10-16] vs. 11 [9-15] seconds; P = 0.04). Time from birth to reliable signal did not differ between the two methods (STIF: 61 [52-76] seconds; STOF: 58 [47-73] seconds; P = 0.09). Time taken for signal acquisition was longer with STIF than with STOF in both subgroups. In the delivery room setting, the STOF method recognized saturation and heart rate faster than the STIF method. The time from birth to reliable signal was similar with the two methods.

  5. Towards scar-free surgery: An analysis of the increasing complexity from laparoscopic surgery to NOTES

    PubMed Central

    Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.

    2014-01-01

    Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing existing NOTES methods with laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and of several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques were found at the substep level of the hierarchy and in the instruments being used. The timeline analysis showed an increase in the time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize hybrid methods, i.e. combinations of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods for performing cholecystectomy, with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks, leading to an increase in surgical time. 
The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811

  6. Immortal time bias in observational studies of time-to-event outcomes.

    PubMed

    Jones, Mark; Fowler, Robert

    2016-12-01

    The purpose of this study is to show, through simulation and example, the magnitude and direction of immortal time bias when an inappropriate analysis is used. We compare four methods of analysis for observational studies of time-to-event outcomes: logistic regression, the standard Cox model, landmark analysis, and the time-dependent Cox model, using an example data set of patients critically ill with influenza and a simulation study. For the example data set, logistic regression, the standard Cox model, and landmark analysis all showed some evidence that treatment with oseltamivir protects against mortality in patients critically ill with influenza. However, when the time-dependent nature of treatment exposure is taken into account using a time-dependent Cox model, there is no longer evidence of a protective effect of treatment. The simulation study showed that, under various scenarios, the time-dependent Cox model consistently provides unbiased treatment effect estimates, whereas the standard Cox model leads to bias in favor of treatment. Logistic regression and landmark analysis may also lead to bias. To minimize the risk of immortal time bias in observational studies of survival outcomes, we strongly suggest that time-dependent exposures be included as time-dependent variables in hazard-based analyses.
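    The mechanism of immortal time bias is easy to reproduce by simulation. The sketch below (illustrative only, not the paper's simulation design) gives treatment no true effect, yet a naive "ever treated" comparison shows spurious protection, because anyone classified as treated necessarily survived until treatment started; a person-time classification that splits follow-up at treatment start removes the artifact:

```python
import random

random.seed(42)
N, FOLLOW_UP, HAZARD = 20000, 10.0, 0.2
naive_deaths = {True: 0, False: 0}
naive_total = {True: 0, False: 0}
events = {True: 0, False: 0}          # deaths per exposure state
person_time = {True: 0.0, False: 0.0}

for i in range(N):
    death = random.expovariate(HAZARD)        # treatment has NO true effect
    scheduled = random.uniform(0, FOLLOW_UP) if i % 2 == 0 else None
    treated = scheduled is not None and death > scheduled  # must survive to start
    end = min(death, FOLLOW_UP)
    died = death <= FOLLOW_UP
    # naive "ever treated" comparison (misclassifies pre-treatment time)
    naive_total[treated] += 1
    naive_deaths[treated] += died
    # time-dependent classification: split follow-up at treatment start
    if treated:
        person_time[False] += scheduled
        person_time[True] += end - scheduled
        events[True] += died
    else:
        person_time[False] += end
        events[False] += died

naive_rr = (naive_deaths[True] / naive_total[True]) / \
           (naive_deaths[False] / naive_total[False])
rate_ratio = (events[True] / person_time[True]) / \
             (events[False] / person_time[False])
print(f"naive risk ratio {naive_rr:.2f}  person-time rate ratio {rate_ratio:.2f}")
```

The naive risk ratio comes out well below 1 despite a null effect, while the person-time rate ratio stays near 1, mirroring the direction of bias the abstract reports for the standard Cox model versus the time-dependent Cox model.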

  7. Challenges to validity in single-group interrupted time series analysis.

    PubMed

    Linden, Ariel

    2017-04-01

    Single-group interrupted time series analysis (ITSA) is a popular evaluation methodology in which a single unit of observation is studied; the outcome variable is serially ordered as a time series, and the intervention is expected to "interrupt" the level and/or trend of the time series subsequent to its introduction. The most common threat to validity is history: the possibility that some other event caused the observed effect in the time series. Although history limits the ability to draw causal inferences from single ITSA models, it can be controlled for by using a comparable control group to serve as the counterfactual. Time series data from two natural experiments (the effect of Florida's 2000 repeal of its motorcycle helmet law on motorcycle fatalities, and California's 1988 Proposition 99 to reduce cigarette sales) are used to illustrate how history biases the results of single-group ITSA, as opposed to when that group's results are contrasted with those of a comparable control group. In the first example, an external event occurring at the same time as the helmet repeal appeared to be the cause of a rise in motorcycle deaths, but this was only revealed when Florida was contrasted with comparable control states. Conversely, in the second example, a decreasing trend in cigarette sales prior to the intervention raised questions about a treatment effect attributed to Proposition 99, but the effect was reinforced when California was contrasted with comparable control states. Results of single-group ITSA should be considered preliminary, and interpreted with caution, until a more robust study design can be implemented.
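    A minimal single-group ITSA can be sketched as segmented regression: fit separate trends before and after the interruption and report the level change (jump) and trend change. The toy below uses synthetic data and a deliberately simplified two-segment fit; it is illustrative, not the study's method, and having no control group it is subject to exactly the history threat the abstract describes:

```python
import random

def ols(xs, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def itsa(series, t0):
    """Fit separate trends before and after the interruption at index t0
    and report the level change (jump at t0) and the trend change."""
    pre_t, post_t = list(range(t0)), list(range(t0, len(series)))
    a1, b1 = ols(pre_t, series[:t0])
    a2, b2 = ols(post_t, series[t0:])
    level_change = (a2 + b2 * t0) - (a1 + b1 * t0)  # gap at the interruption
    return level_change, b2 - b1

random.seed(7)
# synthetic monthly outcome: slope 0.5 per month, an 8-unit drop at month 30
y = [10 + 0.5 * t - (8 if t >= 30 else 0) + random.gauss(0, 0.5)
     for t in range(60)]
jump, trend_change = itsa(y, 30)
print(round(jump, 1), round(trend_change, 2))
```

The fitted jump recovers the built-in level drop; in a real evaluation the same quantities would also be estimated for a comparable control series to rule out history.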

  8. Time-frequency analysis of phonocardiogram signals using wavelet transform: a comparative study.

    PubMed

    Ergen, Burhan; Tatar, Yetkin; Gulcur, Halil Ozcan

    2012-01-01

    Analysis of phonocardiogram (PCG) signals provides a non-invasive means to determine abnormalities caused by cardiovascular system pathology. In general, time-frequency representation (TFR) methods are used to study the PCG signal because it is a non-stationary bio-signal. The continuous wavelet transform (CWT) is especially suitable for the analysis of non-stationary signals and for obtaining a TFR, because of its high resolution in both time and frequency, and it has recently become a favourite tool. It decomposes a signal in terms of elementary contributions called wavelets, which are shifted and dilated copies of a fixed mother wavelet function, and yields a joint TFR. Although the basic characteristics of the wavelets are similar, each type of wavelet produces a different TFR. In this study, eight of the best-known real wavelets are examined on typical PCG signals indicating heart abnormalities, in order to determine the best wavelet for obtaining a reliable TFR. For this purpose, the wavelet energy and frequency spectrum estimations based on the CWT, and the spectra of the chosen wavelets, were compared with the energy distribution and the autoregressive frequency spectra in order to determine the most suitable wavelet. The results show that the Morlet wavelet is the most reliable for the time-frequency analysis of PCG signals.
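    A CWT with a Morlet mother wavelet amounts to convolving the signal with scaled, shifted copies of the wavelet, with scale mapped to frequency via f ≈ ω0/(2πs). A self-contained sketch by direct convolution (a real-valued Morlet and a synthetic tone, illustrative rather than an optimized implementation):

```python
import math

def morlet_power(x, dt, freqs, omega0=5.0):
    """CWT with a (real) Morlet mother wavelet, evaluated by direct
    convolution; returns the mean wavelet power at each frequency."""
    n = len(x)
    power = []
    for f in freqs:
        s = omega0 / (2 * math.pi * f)        # scale for this frequency
        k_max = int(4 * s / dt)               # truncate wavelet at 4 widths
        kernel = [math.exp(-0.5 * (k * dt / s) ** 2) *
                  math.cos(omega0 * k * dt / s)
                  for k in range(-k_max, k_max + 1)]
        norm = dt / math.sqrt(s)
        total, count = 0.0, 0
        for c in range(k_max, n - k_max):     # valid (non-edge) positions
            coeff = norm * sum(kernel[j] * x[c + j - k_max]
                               for j in range(len(kernel)))
            total += coeff * coeff
            count += 1
        power.append(total / count)
    return power

fs = 100.0
x = [math.sin(2 * math.pi * 8 * t / fs) for t in range(400)]   # 8 Hz tone
freqs = list(range(2, 21))
power = morlet_power(x, 1 / fs, freqs)
peak = freqs[power.index(max(power))]
print(peak)
```

The power peaks at the tone's frequency; for a PCG the same scan over scales would localize murmur energy in both time and frequency.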

  9. Hilbert-Huang Transform: A Spectral Analysis Tool Applied to Sunspot Number and Total Solar Irradiance Variations, as well as Near-Surface Atmospheric Variables

    NASA Astrophysics Data System (ADS)

    Barnhart, B. L.; Eichinger, W. E.; Prueger, J. H.

    2010-12-01

    The Hilbert-Huang transform (HHT) is a relatively new data analysis tool used to analyze nonstationary and nonlinear time series data. It consists of an algorithm called empirical mode decomposition (EMD), which extracts the cyclic components embedded within time series data, and of Hilbert spectral analysis (HSA), which displays the time- and frequency-dependent energy contributions of each component in the form of a spectrogram. The method can be considered a generalized form of Fourier analysis that describes the intrinsic cycles of data with basis functions whose amplitudes and phases may vary with time. The HHT will be introduced and compared to current spectral analysis tools such as Fourier analysis, short-time Fourier analysis, wavelet analysis and Wigner-Ville distributions. A number of applications are also presented which demonstrate the strengths and limitations of the tool, including analyses of sunspot number variability and total solar irradiance proxies, as well as globally averaged temperature and carbon dioxide concentration. In addition, near-surface atmospheric quantities such as temperature and wind velocity are analyzed to demonstrate the nonstationarity of the atmosphere.
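    The HSA half of the HHT rests on the analytic signal: zero the negative frequencies of the spectrum, invert, and differentiate the unwrapped phase to get instantaneous frequency. A minimal sketch of that step alone (plain O(n²) DFT, no EMD), recovering the frequency of a test tone:

```python
import cmath, math

def analytic_signal(x):
    """Analytic signal via the frequency-domain construction: double the
    positive frequencies and zero the negative ones (plain O(n^2) DFT,
    so keep n small)."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2
        elif k > n / 2:
            X[k] = 0
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

fs, f0, n = 64.0, 5.0, 256
x = [math.cos(2 * math.pi * f0 * t / fs) for t in range(n)]
z = analytic_signal(x)
# instantaneous frequency from the unwrapped phase increments
phases = [cmath.phase(v) for v in z]
inst = []
for a, b in zip(phases, phases[1:]):
    d = b - a
    while d <= -math.pi:
        d += 2 * math.pi
    while d > math.pi:
        d -= 2 * math.pi
    inst.append(d * fs / (2 * math.pi))
mid = inst[20:-20]                     # trim edge effects
mean_f = sum(mid) / len(mid)
print(round(mean_f, 2))
```

In the full HHT, this phase differentiation is applied to each EMD mode separately, which is what lets the amplitude and frequency of every intrinsic cycle vary with time.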

  10. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

    In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. The MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), the MSRQA reveals richer and more recognizable information on the local characteristics of diverse systems and successfully describes their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ considerably from those at a single time scale. Some systems present more distinct recurrence patterns at large time scales. This demonstrates that the new approach is effective for distinguishing three similar stock market systems and for revealing some of their inherent differences.
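    The order patterns underlying an order recurrence plot are simply the rank orderings of short windows, and coarse-graining the series before extracting them gives the multiscale view. A toy sketch (illustrative, not the authors' implementation):

```python
import math, random
from collections import Counter

def ordinal_patterns(x, dim=3):
    """Map each length-`dim` window to its rank order (an ordinal pattern)."""
    return [tuple(sorted(range(dim), key=lambda k: x[i + k]))
            for i in range(len(x) - dim + 1)]

def order_recurrence_rate(x, dim=3):
    """Fraction of point pairs (i < j) sharing the same ordinal pattern,
    i.e. the recurrence rate of the order recurrence plot."""
    pats = ordinal_patterns(x, dim)
    counts = Counter(pats)
    n = len(pats)
    pairs = sum(c * (c - 1) // 2 for c in counts.values())
    return pairs / (n * (n - 1) / 2)

def coarse_grain(x, scale):
    """Multiscale step: average consecutive non-overlapping windows."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

random.seed(3)
periodic = [math.sin(2 * math.pi * i / 20) for i in range(400)]
noise = [random.random() for _ in range(400)]
for scale in (1, 2, 4):
    rr_p = order_recurrence_rate(coarse_grain(periodic, scale))
    rr_n = order_recurrence_rate(coarse_grain(noise, scale))
    print(scale, round(rr_p, 2), round(rr_n, 2))
```

A periodic series revisits a few ordinal patterns and scores a high recurrence rate, while uncorrelated noise spreads evenly over the six possible dim-3 patterns (rate near 1/6), and tracking the rate across scales is the multiscale comparison the abstract describes.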

  11. Evaluation of automated sample preparation, retention time locked gas chromatography-mass spectrometry and data analysis methods for the metabolomic study of Arabidopsis species.

    PubMed

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat

    2011-05-27

    In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomic studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of four types (2 types × 2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were used for data analysis and compared: XCMS followed by principal component analysis (approach 1), and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2). Several features that were up- or down-regulated in the different types were detected.

  12. Harmonic Balance Computations of Fan Aeroelastic Stability

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Reddy, T. S. R.

    2010-01-01

    A harmonic balance (HB) aeroelastic analysis, which has been recently developed, was used to determine the aeroelastic stability (flutter) characteristics of an experimental fan. To assess the numerical accuracy of this HB aeroelastic analysis, a time-domain aeroelastic analysis was also used to determine the aeroelastic stability characteristics of the same fan. Both of these three-dimensional analysis codes model the unsteady flowfield due to blade vibrations using the Reynolds-averaged Navier-Stokes (RANS) equations. In the HB analysis, the unsteady flow equations are converted to a HB form and solved using a pseudo-time marching method. In the time-domain analysis, the unsteady flow equations are solved using an implicit time-marching approach. Steady and unsteady computations for two vibration modes were carried out at two rotational speeds: 100 percent (design) and 70 percent (part-speed). The steady and unsteady results obtained from the two analysis methods compare well, thus verifying the recently developed HB aeroelastic analysis. Based on the results, the experimental fan was found to have no aeroelastic instability (flutter) at the conditions examined in this study.

  13. Trends in Average Living Children at the Time of Terminal Contraception: A Time Series Analysis Over 27 Years Using ARIMA (p, d, q) Nonseasonal Model.

    PubMed

    Mumbare, Sachin S; Gosavi, Shriram; Almale, Balaji; Patil, Aruna; Dhakane, Supriya; Kadu, Aniruddha

    2014-10-01

    India's National Family Welfare Programme is dominated by sterilization, particularly tubectomy. Sterilization, being a terminal method of contraception, decides the final number of children for a couple. Many studies have shown a declining trend in the average number of living children at the time of sterilization over short periods. This study was therefore planned to perform a time series analysis of the average number of living children at the time of terminal contraception, to forecast this number up to 2020, and to compare the rates of change in various subgroups of the population. Data were preprocessed in MS Access 2007 by creating and running SQL queries. After testing the stationarity of every series with the augmented Dickey-Fuller test, time series analysis and forecasting were done using the best-fit Box-Jenkins ARIMA (p, d, q) nonseasonal model. To compare the rates of change of the average number of children at sterilization in various subgroups, analysis of covariance (ANCOVA) was applied. Forecasting showed that the replacement level of 2.1 total fertility rate (TFR) will be achieved in 2018 for couples opting for sterilization. The same will be achieved in 2020, 2016, 2018, and 2019 for rural areas, urban areas, Hindu couples, and Buddhist couples, respectively. It will not be achieved by 2020 for Muslim couples. Every stratum of the population showed a declining trend. The decline for male children and in rural areas was significantly faster than the decline for female children and in urban areas, respectively. The decline was not significantly different among Hindu, Muslim, and Buddhist couples.
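    As a drastically simplified stand-in for the Box-Jenkins ARIMA forecasting used in the study, the sketch below fits a straight-line trend to a hypothetical declining series (not the paper's data) and solves for the year the fitted trend reaches the 2.1 replacement level:

```python
def trend_crossing(years, values, threshold):
    """Fit a straight-line trend by least squares and solve for the year
    at which the fitted line first reaches `threshold` (a deliberately
    simple stand-in for the Box-Jenkins ARIMA modelling in the study)."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    slope = sum((y - my) * (v - mv) for y, v in zip(years, values)) / \
            sum((y - my) ** 2 for y in years)
    intercept = mv - slope * my
    return (threshold - intercept) / slope  # year where trend hits threshold

# Hypothetical series: average living children at sterilization, declining
years = list(range(1987, 2014))
values = [3.9 - 0.06 * (y - 1987) for y in years]  # noise-free toy data
year_21 = trend_crossing(years, values, 2.1)
print(round(year_21))
```

An ARIMA model additionally differences the series to achieve stationarity (the role of the Dickey-Fuller test in the abstract) and models the autocorrelation of the residuals, which a bare linear trend ignores.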

  14. The Synthesis of Silicon Carbide in Rhombohedral Form with Different Chemicals

    NASA Astrophysics Data System (ADS)

    KARİPER, İ. AFŞIN

    2017-06-01

    This study describes an attempt to produce silicon carbide using a simpler and less costly method. Within the study, XRD, EDX, and FTIR analyses were performed to determine the structural properties of the product, and SEM analyses were used to identify its surface properties. Characteristics such as porosity and surface area were determined through BET analysis. The starting reagents were compared with the product using FTIR analysis, whereas the product was compared, through BET analysis, with a SiC sample procured from a supplier that manufactures high-purity products. EDX analysis identified approximately 72 pct Si and 28 pct C. The vibrational peaks of the synthesized product (characteristic Si-C bonds) were observed at around 1076 cm-1 (FTIR analysis). Finally, the outcomes were compared with major publications in the literature.

  15. News Sources on Rhodesia: A Comparative Analysis.

    ERIC Educational Resources Information Center

    McCoy, Jennifer; Cholawsky, Elizabeth

    1982-01-01

    Concludes that the "London Times" and the Foreign Broadcast Information Service of the United States government provide both comprehensive and unbiased coverage of events in Rhodesia, while the "New York Times" is less complete and the "Christian Science Monitor" is selective. (FL)

  16. 40 CFR 300.430 - Remedial investigation/feasibility study and selection of remedy.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., to provide additional data for the detailed analysis and to support engineering design of remedial... threat to human health or the environment or to support the analysis and design of potential response... timely manner and no later than the early stages of the comparative analysis. The lead and support...

  17. 40 CFR 300.430 - Remedial investigation/feasibility study and selection of remedy.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., to provide additional data for the detailed analysis and to support engineering design of remedial... threat to human health or the environment or to support the analysis and design of potential response... timely manner and no later than the early stages of the comparative analysis. The lead and support...

  18. 40 CFR 300.430 - Remedial investigation/feasibility study and selection of remedy.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., to provide additional data for the detailed analysis and to support engineering design of remedial... threat to human health or the environment or to support the analysis and design of potential response... timely manner and no later than the early stages of the comparative analysis. The lead and support...

  19. 40 CFR 300.430 - Remedial investigation/feasibility study and selection of remedy.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., to provide additional data for the detailed analysis and to support engineering design of remedial... threat to human health or the environment or to support the analysis and design of potential response... timely manner and no later than the early stages of the comparative analysis. The lead and support...

  20. 40 CFR 300.430 - Remedial investigation/feasibility study and selection of remedy.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., to provide additional data for the detailed analysis and to support engineering design of remedial... threat to human health or the environment or to support the analysis and design of potential response... timely manner and no later than the early stages of the comparative analysis. The lead and support...

  1. Comparative analysis between different font types and letter styles using a nonlinear invariant digital correlation

    NASA Astrophysics Data System (ADS)

    Coronel-Beltrán, Ángel; Álvarez-Borrego, Josué

    2010-01-01

    We present, in this paper, a comparative analysis of the letters in Times New Roman (TNR), Courier New (CN) and Arial (Ar) font types in plain and italic style and the effects of five foreground/background color combinations using an invariant digital correlation system with a nonlinear filter with k = 0.3. The evaluation of the output plane with this filter is given by the peak-to-correlation energy (PCE) metric. The results show that the letters in TNR font have a better mean PCE value when compared with the CN and Ar fonts. This result is in agreement with studies on text legibility and readability in which the reaction time (RT) of participants reading a text is measured. We conclude that the PCE metric is proportional to 1/RT.
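
    The PCE metric itself is simple to compute from a correlation output plane: the squared peak value divided by the plane's total energy. The sketch below is a simplified stand-in, using a plain linear matched filter on random binary "glyphs" rather than the paper's nonlinear k = 0.3 filter on font letters, and shows that a matching target yields a higher PCE than a mismatched one.

```python
import numpy as np

def pce(corr_plane):
    """Peak-to-correlation energy: squared peak over total plane energy."""
    energy = np.sum(np.abs(corr_plane) ** 2)
    return np.max(np.abs(corr_plane)) ** 2 / energy

# Two random binary "glyphs" standing in for letter images.
rng = np.random.default_rng(1)
glyph = (rng.random((32, 32)) > 0.7).astype(float)
other = (rng.random((32, 32)) > 0.7).astype(float)

# Circular correlation via FFT: multiply one spectrum by the conjugate
# of the reference spectrum and transform back.
F = np.fft.fft2(glyph)
match = np.fft.ifft2(np.fft.fft2(glyph) * np.conj(F))    # target == reference
mismatch = np.fft.ifft2(np.fft.fft2(other) * np.conj(F))
print(pce(match) > pce(mismatch))   # matched target gives the sharper peak
```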

  2. A study on quantitative analysis of exposure dose caused by patient depending on time and distance in nuclear medicine examination

    NASA Astrophysics Data System (ADS)

    Kim, H. S.; Cho, J. H.; Shin, S. G.; Dong, K. R.; Chung, W. K.; Chung, J. E.

    2013-01-01

    This study evaluated possible actions that can help protect against and reduce radiation exposure by measuring the exposure dose for each type of isotope that is used frequently in nuclear medicine before performing numerical analysis of the effective half-life based on the measurement results. From July to August in 2010, the study targeted 10, 6 and 5 people who underwent an 18F-FDG (fludeoxyglucose) positron emission tomography (PET) scan, 99mTc-HDP bone scan, and 201Tl myocardial single-photon emission computed tomography (SPECT) scan, respectively, in the nuclear medicine department. After injecting the required medicine into the subjects, a survey meter was used to measure the dose depending on the distance from the heart and time elapsed. For the 18F-FDG PET scan, the dose decreased by approximately 66% at 90 min compared to that immediately after the injection and by 78% at a distance of 1 m compared to that at 0.3 m. In the 99mTc-HDP bone scan, the dose decreased by approximately 71% in 200 min compared to that immediately after the injection and by approximately 78% at a distance of 1 m compared to that at 0.3 m. In the 201Tl myocardial SPECT scan, the dose decreased by approximately 30% in 250 min compared to that immediately after the injection and by approximately 55% at a distance of 1 m compared to that at 0.3 m. In conclusion, this study measured the exposure doses by isotope, distance from the heart, and elapsed time, and found that the doses decreased significantly with both distance and time.

  3. Information and Complexity Measures Applied to Observed and Simulated Soil Moisture Time Series

    USDA-ARS?s Scientific Manuscript database

    Time series of soil moisture-related parameters provides important insights in functioning of soil water systems. Analysis of patterns within these time series has been used in several studies. The objective of this work was to compare patterns in observed and simulated soil moisture contents to u...

  4. Time Delay Embedding Increases Estimation Precision of Models of Intraindividual Variability

    ERIC Educational Resources Information Center

    von Oertzen, Timo; Boker, Steven M.

    2010-01-01

    This paper investigates the precision of parameters estimated from local samples of time dependent functions. We find that "time delay embedding," i.e., structuring data prior to analysis by constructing a data matrix of overlapping samples, increases the precision of parameter estimates and in turn statistical power compared to standard…

  5. Describing-function analysis of a ripple regulator with slew-rate limits and time delays

    NASA Technical Reports Server (NTRS)

    Wester, Gene W.

    1990-01-01

    The effects of time delays and slew-rate limits on the steady-state operating points and performance of a free-running ripple regulator are evaluated using describing-function analysis. The describing function of an ideal comparator (no time delays or slew rate limits) has no phase shift and is independent of frequency. It is found that turn-on delay and turn-off delay have different effects on gain and phase and cannot be combined. Comparator hysteresis affects both gain and phase; likewise, time delays generally affect both gain and phase. It is found that the effective time delay around the feedback loop is one half the sum of turn-on and turn-off delays, regardless of whether the delays are caused by storage time or slew rate limits. Expressions are formulated for the switching frequency, switch duty ratio, dc output, and output ripple. For the case of no hysteresis, a simple, graphical solution for the switching frequency is possible, and the resulting switching frequency is independent of first-order variations of input or load.

  6. Laparoendoscopic single-site adrenalectomy versus conventional laparoscopic surgery: a systematic review and meta-analysis of observational studies.

    PubMed

    Wang, Linhui; Wu, Zhenjie; Li, Mingmin; Cai, Chen; Liu, Bing; Yang, Qing; Sun, Yinghao

    2013-06-01

    To assess the surgical efficacy and potential advantages of laparoendoscopic single-site adrenalectomy (LESS-AD) compared with conventional laparoscopic adrenalectomy (CL-AD) based on published literature. An online systematic search in electronic databases, including Pubmed, Embase, and the Cochrane Library, as well as manual bibliography searches were performed. All studies that compared LESS-AD with CL-AD were included. The outcome measures were the patient demographics, tumor size, blood loss, operative time, time to resumption of oral intake, hospital stay, postoperative pain, cosmesis satisfaction score, rates of complication, conversion, and transfusion. A meta-analysis of the results was conducted. A total of 443 patients were included: 171 patients in the LESS-AD group and 272 patients in the CL-AD group (nine studies). There was no significant difference between the two groups in any of the demographic parameters except for lesion size (age: P=0.24; sex: P=0.35; body mass index: P=0.79; laterality: P=0.76; size: P=0.002). There was no significant difference in estimated blood loss, time to oral intake resumption, and length of stay between the two groups. The LESS-AD patients had a significantly lower postoperative visual analog pain score compared with the CL-AD group, but a longer operative time was noted. Both groups had a comparable cosmetic satisfaction score. The two groups had a comparable rate of complication, conversion, and transfusion. In early experience, LESS-AD appears to be a safe and feasible alternative to its conventional laparoscopic counterpart with decreased postoperative pain noted, albeit with a longer operative time. As a promising and emerging minimally invasive technique, however, the current evidence has not verified other potential advantages (ie, cosmesis, recovery time, convalescence, port-related complications, etc.) of LESS-AD.

  7. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids.

    PubMed

    Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing

    2008-06-04

    Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage of generating fluorescence by probe hydrolysis and strand displacement over current real-time PCR methods. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the technique can be successfully applied to nucleic acid analysis, offering an alternative with high efficiency, reliability, and flexibility at low cost.

  8. Efficient multitasking of Choleski matrix factorization on CRAY supercomputers

    NASA Technical Reports Server (NTRS)

    Overman, Andrea L.; Poole, Eugene L.

    1991-01-01

    A Choleski method is described and used to solve linear systems of equations that arise in large scale structural analysis. The method uses a novel variable-band storage scheme and is structured to exploit fast local memory caches while minimizing data access delays between main memory and vector registers. Several parallel implementations of this method are described for the CRAY-2 and CRAY Y-MP computers demonstrating the use of microtasking and autotasking directives. A portable parallel language, FORCE, is used for comparison with the microtasked and autotasked implementations. Results are presented comparing the matrix factorization times for three representative structural analysis problems from runs made in both dedicated and multi-user modes on both computers. CPU and wall clock timings are given for the parallel implementations and are compared to single processor timings of the same algorithm.

  9. Supportive Measures: An Analysis of the Trio Program--Student Support Services at East Tennessee State University from 2001-2004

    ERIC Educational Resources Information Center

    Strode, Christopher N.

    2013-01-01

    The purpose of this study was to examine the academic performance of the first-time, full-time, traditional-aged students in the Student Support Services program at East Tennessee State University. This was accomplished by comparing their academic performance with the academic performance of first-time, full-time, traditional-aged non-SSS…

  10. A New Time Measurement Method Using a High-End Global Navigation Satellite System to Analyze Alpine Skiing

    ERIC Educational Resources Information Center

    Supej, Matej; Holmberg, Hans-Christer

    2011-01-01

    Accurate time measurement is essential to temporal analysis in sport. This study aimed to (a) develop a new method for time computation from surveyed trajectories using a high-end global navigation satellite system (GNSS), (b) validate its precision by comparing GNSS with photocells, and (c) examine whether gate-to-gate times can provide more…

  11. Comparison of the intraarticular effectiveness of triamcinolone hexacetonide and triamcinolone acetonide in treatment of juvenile rheumatoid arthritis.

    PubMed

    Eberhard, Barbara A; Sison, M Cristina; Gottlieb, Beth S; Ilowite, Norman T

    2004-12-01

    To compare patients with juvenile rheumatoid arthritis (JRA) injected with triamcinolone hexacetonide (TH) or triamcinolone acetonide (TA) with respect to time to relapse. This was a retrospective chart review of 85 patients: 51 patients with JRA who had received a joint injection with TH during the period June 2000-April 2001 and 48 patients who had received a joint injection with TA during the period May 2001-March 2002 who were followed for a minimum of 15 months, after an intraarticular steroid injection. The primary endpoint variable for the study was the time to relapse of the arthritis in the affected joint following an intraarticular injection. A total of 227 joints were injected, 114 with TH and 113 with TA. In the TH group the mean time to relapse (+/- SE) was 10.14 +/- 0.49 months compared to the TA group at 7.75 +/- 0.49 months (p < 0.0001) using the log-rank test. A proportional hazards (Cox) regression analysis revealed no statistical association between sex, duration of illness, or type of arthritis and relapse time. An analysis was performed on the first intraarticular injection for each patient, with the average time to relapse for all joints injected of 10.36 +/- 0.72 months for TH compared to 8.45 +/- 0.78 months for TA (p < 0.02). A further analysis of the first knee injections showed a relapse time in the TH group of 11.11 +/- 0.81 months compared to 7.95 +/- 0.95 months for TA (p < 0.008). TH offers an advantage to TA, as there is a longer duration of action leading to an improved prolonged response rate in weight-bearing joints, particularly the knees. The results suggest that TH should be the intraarticular steroid of choice, particularly for the knee joint, in patients with JRA.

  12. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study

    PubMed Central

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias

    2018-01-01

    Abstract Objective To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) (“living” network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Design Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Data sources Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Eligibility criteria for study selection Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons were excluded for which direct and indirect evidence disagreed, based on the node-splitting (side-splitting) test (P<0.10). Outcomes and analysis Cumulative pairwise and network meta-analyses were performed for each selected comparison. Monitoring boundaries of statistical significance were constructed and the evidence against the null hypothesis was considered to be strong when the monitoring boundaries were crossed. A significance level was defined as α=5%, power of 90% (β=10%), and an anticipated treatment effect to detect equal to the final estimate from the network meta-analysis. The frequency of, and time to, strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. Results 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). 
Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis provided strong evidence against the null hypothesis (P=0.002). The median time to strong evidence against the null hypothesis was 19 years with living network meta-analysis and 23 years with living pairwise meta-analysis (hazard ratio 2.78, 95% confidence interval 1.00 to 7.72, P=0.05). Studies directly comparing the treatments of interest continued to be published for eight comparisons after strong evidence had become evident in network meta-analysis. Conclusions In comparative effectiveness research, prospectively planned living network meta-analyses produced strong evidence against the null hypothesis more often and earlier than conventional, pairwise meta-analyses. PMID:29490922

  13. Automatic Real Time Ionogram Scaler with True Height Analysis - Artist

    DTIC Science & Technology

    1983-07-01

    scaled. The corresponding autoscaled values were compared with the manually scaled h'F, h'F2, fminF, foE, foEs, h'E and h'Es. The ARTIST program... (Scientific Report No. 7: Automatic Real Time Ionogram Scaler with True Height Analysis - ARTIST.)

  14. Monte Carlo analysis of a time-dependent neutron and secondary gamma-ray integral experiment on a thick concrete and steel shield

    NASA Astrophysics Data System (ADS)

    Cramer, S. N.; Roussin, R. W.

    1981-11-01

    A Monte Carlo analysis of a time-dependent neutron and secondary gamma-ray integral experiment on a thick concrete and steel shield is presented. The energy range covered in the analysis is 15-2 MeV for neutron source energies. The multigroup MORSE code was used with the VITAMIN C 171-36 neutron-gamma-ray cross-section data set. Both neutron and gamma-ray count rates and unfolded energy spectra are presented and compared, with good general agreement, with experimental results.

  15. Use of the landmark method to address immortal person-time bias in comparative effectiveness research: a simulation study.

    PubMed

    Mi, Xiaojuan; Hammill, Bradley G; Curtis, Lesley H; Lai, Edward Chia-Cheng; Setoguchi, Soko

    2016-11-20

    Observational comparative effectiveness and safety studies are often subject to immortal person-time, a period of follow-up during which outcomes cannot occur because of the treatment definition. Common approaches, like excluding immortal time from the analysis or naïvely including immortal time in the analysis, are known to result in biased estimates of treatment effect. Other approaches, such as the Mantel-Byar and landmark methods, have been proposed to handle immortal time. Little is known about the performance of the landmark method in different scenarios. We conducted extensive Monte Carlo simulations to assess the performance of the landmark method compared with other methods in settings that reflect realistic scenarios. We considered four landmark times for the landmark method. We found that the Mantel-Byar method provided unbiased estimates in all scenarios, whereas the exclusion and naïve methods resulted in substantial bias when the hazard of the event was constant or decreased over time. The landmark method performed well in correcting immortal person-time bias in all scenarios when the treatment effect was small, and provided unbiased estimates when there was no treatment effect. The bias associated with the landmark method tended to be small when the treatment rate was higher in the early follow-up period than it was later. These findings were confirmed in a case study of chronic obstructive pulmonary disease. Copyright © 2016 John Wiley & Sons, Ltd.
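
    The immortal-time mechanism, and why a landmark analysis corrects it, can be illustrated with a small simulation (an illustrative sketch, not the authors' simulation code): survival is exponential and independent of treatment, yet classifying subjects by eventual treatment makes the treatment look protective, because subjects must survive long enough to be treated.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
death = rng.exponential(365.0, n)        # days to death; treatment has NO true effect
intended = rng.random(n) < 0.5           # half are intended for treatment...
rx_time = rng.uniform(0.0, 90.0, n)      # ...starting at a random day in [0, 90)
treated = intended & (death > rx_time)   # must survive to rx_time to be treated

# Naive analysis: classify by eventual treatment, keeping the immortal time.
# For exponential survival the hazard ratio is the inverse ratio of mean times.
naive_hr = death[~treated].mean() / death[treated].mean()   # < 1: looks "protective"

# Landmark analysis at day 90: keep only subjects alive at the landmark,
# then compare survival beyond it; the spurious effect disappears.
alive = death > 90.0
residual_t = (death - 90.0)[alive & treated].mean()
residual_c = (death - 90.0)[alive & ~treated].mean()
landmark_hr = residual_c / residual_t                       # close to 1: no effect
print(round(naive_hr, 2), round(landmark_hr, 2))
```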

  16. Color-coded perfusion analysis of CEUS for pre-interventional diagnosis of microvascularisation in cases of vascular malformations.

    PubMed

    Teusch, V I; Wohlgemuth, W A; Piehler, A P; Jung, E M

    2014-01-01

    Aim of our pilot study was the application of a contrast-enhanced color-coded ultrasound perfusion analysis in patients with vascular malformations to quantify microcirculatory alterations. 28 patients (16 female, 12 male, mean age 24.9 years) with high flow (n = 6) or slow-flow (n = 22) malformations were analyzed before intervention. An experienced examiner performed a color-coded Doppler sonography (CCDS) and a Power Doppler as well as a contrast-enhanced ultrasound after intravenous bolus injection of 1 - 2.4 ml of a second-generation ultrasound contrast medium (SonoVue®, Bracco, Milan). The contrast-enhanced examination was documented as a cine sequence over 60 s. The quantitative analysis based on color-coded contrast-enhanced ultrasound (CEUS) images included percentage peak enhancement (%peak), time to peak (TTP), area under the curve (AUC), and mean transit time (MTT). No side effects occurred after intravenous contrast injection. The mean %peak in arteriovenous malformations was almost twice as high as in slow-flow-malformations. The area under the curve was 4 times higher in arteriovenous malformations compared to the mean value of other malformations. The mean transit time was 1.4 times higher in high-flow-malformations compared to slow-flow-malformations. There was no difference regarding the time to peak between the different malformation types. The comparison between all vascular malformation and surrounding tissue showed statistically significant differences for all analyzed data (%peak, TTP, AUC, MTT; p < 0.01). High-flow and slow-flow vascular malformations had statistically significant differences in %peak (p < 0.01), AUC analysis (p < 0.01), and MTT (p < 0.05). Color-coded perfusion analysis of CEUS seems to be a promising technique for the dynamic assessment of microvasculature in vascular malformations.

  17. Comparative analysis of different weight matrices in subspace system identification for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Shokravi, H.; Bakhary, NH

    2017-11-01

    Subspace System Identification (SSI) is considered as one of the most reliable tools for identification of system parameters. Performance of a SSI scheme is considerably affected by the structure of the associated identification algorithm. Weight matrix is a variable in SSI that is used to reduce the dimensionality of the state-space equation. Generally one of the weight matrices of Principal Component (PC), Unweighted Principal Component (UPC) and Canonical Variate Analysis (CVA) is used in the structure of a SSI algorithm. An increasing number of studies in the field of structural health monitoring are using SSI for damage identification. However, studies that evaluate the performance of the weight matrices, particularly in association with accuracy, noise resistance, and time complexity properties, are very limited. In this study, the accuracy, noise-robustness, and time-efficiency of the weight matrices are compared using different qualitative and quantitative metrics. Three evaluation metrics of pole analysis, fit values and elapsed time are used in the assessment process. A numerical model of a mass-spring-dashpot and operational data are used in this research paper. It is observed that the principal components obtained using PC algorithms are more robust against noise uncertainty and give more stable results for the pole distribution. Furthermore, higher estimation accuracy is achieved using the UPC algorithm. CVA had the worst performance for pole analysis and time efficiency analysis. The superior performance of the UPC algorithm in the elapsed time is attributed to using unit weight matrices. The obtained results demonstrated that the process of reducing dimensionality in CVA and PC did not enhance time efficiency but yielded an improved modal identification in PC.

  18. A retrospective analysis of laparoscopic partial nephrectomy with segmental renal artery clamping and factors that predict postoperative renal function.

    PubMed

    Li, Pu; Qin, Chao; Cao, Qiang; Li, Jie; Lv, Qiang; Meng, Xiaoxin; Ju, Xiaobing; Tang, Lijun; Shao, Pengfei

    2016-10-01

    To evaluate the feasibility and efficiency of laparoscopic partial nephrectomy (LPN) with segmental renal artery clamping, and to analyse the factors affecting postoperative renal function. We conducted a retrospective analysis of 466 consecutive patients undergoing LPN using main renal artery clamping (group A, n = 152) or segmental artery clamping (group B, n = 314) between September 2007 and July 2015 in our department. Blood loss, operating time, warm ischaemia time (WIT) and renal function were compared between groups. Univariable and multivariable linear regression analyses were applied to assess the correlations of selected variables with postoperative glomerular filtration rate (GFR) reduction. Volumetric data and estimated GFR of a subset of 60 patients in group B were compared with GFR to evaluate the correlation between these functional variables and preserved renal function after LPN. The novel technique slightly increased operating time, WIT and intra-operative blood loss (P < 0.001), while it provided better postoperative renal function (P < 0.001) compared with the conventional technique. The blocking method and tumour characteristics were independent factors affecting GFR reduction, while WIT was not an independent factor. Correlation analysis showed that estimated GFR presented better correlation with GFR compared with kidney volume (R² = 0.794 cf. R² = 0.199) in predicting renal function after LPN. LPN with segmental artery clamping minimizes warm ischaemia injury and provides better early postoperative renal function compared with clamping the main renal artery. Kidney volume has a significantly inferior role compared with eGFR in predicting preserved renal function. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.

  19. Effectiveness of Quantitative Real Time PCR in Long-Term Follow-up of Chronic Myeloid Leukemia Patients.

    PubMed

    Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin

    2015-08-01

    To determine the use of the Quantitative Real Time PCR (RQ-PCR) assay in the follow-up of Chronic Myeloid Leukemia (CML) patients. Cross-sectional observational. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH, and RQ-PCR test results from materials of 177 CML patients, collected between 2009 and 2013, were set up for comparison analysis. Statistical analysis was performed to compare the FISH, karyotype, and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis and compared with RQ-PCR results. The chi-square test was used to compare test failure rates. Sensitivity and specificity values were 17.6 - 98% for karyotyping (p=0.118, p > 0.05) and 22.5 - 96% for FISH (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than that of karyotyping, but there was a strong correlation between them (p < 0.001). The RQ-PCR test failure rate did not correlate with the other two tests (p > 0.05); however, the difference between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Apart from situations in which karyotype analysis is needed, the RQ-PCR assay can be used alone in the follow-up of CML disease.

  20. 'TIME': A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data.

    PubMed

    Baksi, Krishanu D; Kuntal, Bhusan K; Mande, Sharmila S

    2018-01-01

    Realization of the importance of microbiome studies, coupled with the decreasing sequencing cost, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to the microbial social networks as well as their responses to perturbations. In this communication, we introduce a web based framework called 'TIME' ('Temporal Insights into Microbial Ecology'), developed specifically to obtain meaningful insights from microbiome time series data. The TIME web-server is designed to accept a wide range of popular formats as input with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time series analysis methods like Dynamic time warping, Granger causality and Dickey Fuller test to generate interactive layouts for facilitating easy biological inferences. Apart from this, a new metric for comparing metagenomic time series data has been introduced to effectively visualize the similarities/differences in the trends of the resident microbial groups. Augmenting the visualizations with the stationarity information pertaining to the microbial groups is utilized to predict the microbial competition as well as community structure. Additionally, the 'causality graph analysis' module incorporated in TIME allows predicting taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. 
We illustrate the utility of the web-server features on a few published time series microbiome data and demonstrate the ease with which it can be used to perform complex analysis.
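
    As an illustration of the kind of trend comparison TIME automates, dynamic time warping can be sketched in a few lines of NumPy. The taxa trajectories below are hypothetical, not output of the web server:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# Hypothetical relative-abundance trajectories over six time points
taxon_a = np.array([0.30, 0.32, 0.28, 0.10, 0.12, 0.11])  # drops after a perturbation
taxon_b = np.array([0.31, 0.29, 0.27, 0.12, 0.11, 0.10])  # tracks taxon_a closely
taxon_c = np.array([0.05, 0.20, 0.40, 0.45, 0.50, 0.55])  # opposite trend

d_ab = dtw_distance(taxon_a, taxon_b)
d_ac = dtw_distance(taxon_a, taxon_c)  # much larger: the trends diverge
```

    A small DTW distance marks taxa whose abundance trends move together even when the timing of changes is slightly offset between samples.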

  1. A simplified implementation of edge detection in MATLAB is faster and more sensitive than fast Fourier transform for actin fiber alignment quantification.

    PubMed

    Kemeny, Steven Frank; Clyne, Alisa Morss

    2011-04-01

    Fiber alignment plays a critical role in the structure and function of cells and tissues. While fiber alignment quantification is important to experimental analysis and several different methods for quantifying fiber alignment exist, many studies focus on qualitative rather than quantitative analysis perhaps due to the complexity of current fiber alignment methods. Speed and sensitivity were compared in edge detection and fast Fourier transform (FFT) for measuring actin fiber alignment in cells exposed to shear stress. While edge detection using matrix multiplication was consistently more sensitive than FFT, image processing time was significantly longer. However, when MATLAB functions were used to implement edge detection, MATLAB's efficient element-by-element calculations and fast filtering techniques reduced computation cost 100 times compared to the matrix multiplication edge detection method. The new computation time was comparable to the FFT method, and MATLAB edge detection produced well-distributed fiber angle distributions that statistically distinguished aligned and unaligned fibers in half as many sample images. When the FFT sensitivity was improved by dividing images into smaller subsections, processing time grew larger than the time required for MATLAB edge detection. Implementation of edge detection in MATLAB is simpler, faster, and more sensitive than FFT for fiber alignment quantification.
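
    A gradient-based orientation measurement of the kind the edge-detection approach relies on can be sketched in Python with SciPy; this is a stand-in for the paper's MATLAB implementation, and the striped "fiber" image is synthetic:

```python
import numpy as np
from scipy import ndimage

# Synthetic "actin fiber" image: horizontal stripes (fibers run along x)
y = np.arange(128)
img = np.tile(np.sin(2 * np.pi * y / 16.0)[:, None], (1, 128))

# Edge-detection route: Sobel gradients give a per-pixel orientation
gx = ndimage.sobel(img, axis=1)   # horizontal gradient
gy = ndimage.sobel(img, axis=0)   # vertical gradient
mag = np.hypot(gx, gy)
theta = np.degrees(np.arctan2(gy, gx))  # gradient angle in degrees

# Magnitude-weighted mean |angle|: near 90 degrees for horizontal fibers,
# since the gradient is perpendicular to the fiber direction
mask = mag > 0.1 * mag.max()
mean_angle = float(np.average(np.abs(theta[mask]), weights=mag[mask]))
```

    Binning the per-pixel angles into a histogram yields the fiber angle distributions the study compares between aligned and unaligned samples.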

  2. Increasing the quality, comparability and accessibility of phytoplankton species composition time-series data

    NASA Astrophysics Data System (ADS)

    Zingone, Adriana; Harrison, Paul J.; Kraberg, Alexandra; Lehtinen, Sirpa; McQuatters-Gollop, Abigail; O'Brien, Todd; Sun, Jun; Jakobsen, Hans H.

    2015-09-01

    Phytoplankton diversity and its variation over an extended time scale can provide answers to a wide range of questions relevant to societal needs. These include human health, the safe and sustained use of marine resources and the ecological status of the marine environment, including long-term changes under the impact of multiple stressors. The analysis of phytoplankton data collected at the same place over time, as well as the comparison among different sampling sites, provide key information for assessing environmental change, and for evaluating new actions that must be taken to reduce human-induced pressures on the environment. To achieve these aims, phytoplankton data may be used several decades later by users who did not participate in their production, including through automatic data retrieval and analysis. The methods used in phytoplankton species analysis vary widely among research and monitoring groups, while quality control procedures have not been implemented in most cases. Here we highlight some of the main differences in the sampling and analytical procedures applied to phytoplankton analysis and identify critical steps that are required to improve the quality and inter-comparability of data obtained at different sites and/or times. Harmonization of methods may not be a realistic goal, considering the wide range of purposes of phytoplankton time-series data collection. However, we propose that more consistent and detailed metadata and complementary information be recorded and made available along with phytoplankton time-series datasets, including description of the procedures and elements allowing for a quality control of the data. To keep up with the progress in taxonomic research, there is a need for continued training of taxonomists, and for supporting and complementing existing web resources, in order to allow a constant upgrade of knowledge in phytoplankton classification and identification. 
Efforts towards the improvement of metadata recording, data annotation and quality control procedures will ensure the internal consistency of phytoplankton time series and facilitate their comparability and accessibility, thus strongly increasing the value of the precious information they provide. Ultimately, the sharing of quality controlled data will allow one to recoup the high cost of obtaining the data through the multiple use of the time-series data in various projects over many decades.

  3. Comparison of Delta-Shape Anastomosis and Extracorporeal Billroth I Anastomosis after Laparoscopic Distal Gastrectomy for Gastric Cancer: A Systematic Review with Meta-Analysis of Short-Term Outcomes.

    PubMed

    Hu, Geng-Yuan; Tao, Feng; Ji, Ke-Wei; Wang, Wei

    2016-01-01

    The aim of this systematic review and meta-analysis is to evaluate the safety and relative benefits of delta-shape anastomosis (DA) by comparing it with conventional laparoscopy-assisted distal gastrectomy with Billroth I gastroduodenostomy (LADG BI). Studies and relevant literature regarding DA versus LADG BI were searched in the electronic databases. Operation time, postoperative complications, estimated blood loss, number of retrieved lymph nodes, time to first flatus, time to oral intake, and length of postoperative hospitalization in DA and LADG BI were pooled and compared using meta-analysis. Weighted mean differences (WMDs) and odds ratios (ORs) were calculated with 95% confidence intervals (CIs) to evaluate the effect of DA. Eight studies of 1739 patients were included in the meta-analysis. Compared with LADG BI, DA had shorter postoperative hospitalization (WMD = -0.47, 95%CI: -0.69 to -0.25, P<0.01), less blood loss (WMD = -25.90, 95%CI: -43.11 to -8.70, P<0.01), shorter time to oral intake (WMD = -0.25, 95%CI: -0.49 to -0.01, P = 0.04), and more retrieved lymph nodes (WMD = 1.36, 95%CI: 0.30 to 2.43, P = 0.01). Operation time (WMD = -0.07, 95%CI -15.58 to 15.43, P = 0.99), overall postoperative complication rate (OR = 1.05, 95%CI: 0.74 to 1.49, P = 0.63), surgical complication rate (OR = 1.02, 95%CI: 0.70 to 1.49, P = 0.90), nonsurgical complication rate (OR = 1.21, 95%CI: 0.54 to 2.72, P = 0.64), leakage rate (OR = 2.54, 95%CI: 0.92 to 7.01, P = 0.07), stricture rate (OR = 0.36, 95%CI: 0.09 to 1.44, P = 0.15), wound complication rate (OR = 0.71, 95%CI: 0.33 to 1.55, P = 0.39), time to first flatus (WMD = -0.10, 95%CI: -0.27 to 0.07, P = 0.26), and proximal surgical margin (WMD = -0.25, 95%CI: -1.14 to 0.65, P = 0.59) were not statistically different. Compared with LADG BI, DA is a safe and feasible procedure, with significantly reduced blood loss, time to oral intake, and postoperative hospitalization.
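
    Pooled weighted mean differences of this kind come from inverse-variance weighting. A minimal fixed-effect sketch with hypothetical per-study values (not the study's actual data):

```python
import numpy as np

# Hypothetical per-study mean differences (days of hospitalization) and SEs
md = np.array([-0.5, -0.4, -0.6, -0.3])
se = np.array([0.20, 0.15, 0.25, 0.18])

w = 1.0 / se**2                            # inverse-variance weights
wmd = float(np.sum(w * md) / np.sum(w))    # pooled weighted mean difference
se_pooled = float(np.sqrt(1.0 / np.sum(w)))
ci_low = wmd - 1.96 * se_pooled            # 95% confidence interval
ci_high = wmd + 1.96 * se_pooled
```

    Studies with smaller standard errors dominate the pooled estimate, which is why large trials pull the WMD toward their own results.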

  4. Criteria for Comparing Domain Analysis Approaches Version 01.00.00

    DTIC Science & Technology

    1991-12-01

    Top-Down/Bottom-Up Domain Analysis Process (1990 Version); Figure 8, FODA's Domain Analysis Process; FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984). Domain analysis is still immature. An overview of some domain analysis approaches: the FODA report illustrates the process by using the window management domain. [Fragmentary excerpt from the report's table of contents and introduction.]

  5. Highly sensitive index of sympathetic activity based on time-frequency spectral analysis of electrodermal activity.

    PubMed

    Posada-Quintero, Hugo F; Florian, John P; Orjuela-Cañón, Álvaro D; Chon, Ki H

    2016-09-01

    Time-domain indices of electrodermal activity (EDA) have been used as a marker of sympathetic tone. However, they often show high variation between subjects and low consistency, which has precluded their general use as a marker of sympathetic tone. To examine whether power spectral density analysis of EDA can provide more consistent results, we recently performed a variety of sympathetic tone-evoking experiments (43). We found a significant increase in the spectral power in the frequency range of 0.045 to 0.25 Hz when sympathetic tone-evoking stimuli were induced. The sympathetic tone assessed by the power spectral density of EDA was found to have lower variation and more sensitivity for certain, but not all, stimuli compared with the time-domain analysis of EDA. We surmise that this lack of sensitivity in certain sympathetic tone-inducing conditions with time-invariant spectral analysis of EDA may lie in its inability to characterize time-varying dynamics of the sympathetic tone. To overcome the disadvantages of time-domain and time-invariant power spectral indices of EDA, we developed a highly sensitive index of sympathetic tone based on time-frequency analysis of EDA signals. Its efficacy was tested using experiments designed to elicit sympathetic dynamics. Twelve subjects underwent four tests known to elicit sympathetic tone arousal: cold pressor, tilt table, stand test, and the Stroop task. We hypothesize that a more sensitive measure of sympathetic control can be developed using time-varying spectral analysis. Variable frequency complex demodulation, a recently developed technique for time-frequency analysis, was used to obtain spectral amplitudes associated with EDA. We found that the time-varying spectral frequency band 0.08-0.24 Hz was most responsive to stimulation. Spectral power at frequencies higher than 0.24 Hz was determined to be unrelated to sympathetic dynamics because it comprised less than 5% of the total power. 
The mean value of time-varying spectral amplitudes in the frequency band 0.08-0.24 Hz was used as the index of sympathetic tone, termed TVSymp. TVSymp was found to be overall the most sensitive to the stimuli, as evidenced by a low coefficient of variation (0.54), and higher consistency (intra-class correlation, 0.96), sensitivity (Youden's index > 0.75), and area under the receiver operating characteristic (ROC) curve (>0.8, accuracy > 0.88) compared with time-domain and time-invariant spectral indices, including heart rate variability. Copyright © 2016 the American Physiological Society.
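
    The idea of a band-limited, time-varying amplitude can be approximated with a band-pass filter plus Hilbert envelope. This is a simplification: the paper uses variable frequency complex demodulation, not this filter, and the EDA trace below is synthetic:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 4.0                       # assumed EDA sampling rate, Hz
t = np.arange(0, 120, 1 / fs)
# Synthetic EDA: a 0.15 Hz oscillation that grows after t = 60 s, plus drift
amp = np.where(t < 60, 0.2, 1.0)
eda = amp * np.sin(2 * np.pi * 0.15 * t) + 0.01 * t

# Band-pass 0.08-0.24 Hz, then take the analytic-signal envelope
b, a = butter(2, [0.08 / (fs / 2), 0.24 / (fs / 2)], btype="band")
band = filtfilt(b, a, eda)
envelope = np.abs(hilbert(band))

# Mean envelope before vs. after the simulated sympathetic arousal
tvsymp_early = float(envelope[(t > 10) & (t < 50)].mean())
tvsymp_late = float(envelope[(t > 70) & (t < 110)].mean())
```

    The envelope tracks the growing oscillation while ignoring the slow drift, which is the behaviour a TVSymp-like index needs.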

  6. Comparing Videotapes and Written Narrative Records of Second Grade Reading Classes: Selecting Methods for Particular Observational Goals.

    ERIC Educational Resources Information Center

    Gardner, C. H.; And Others

    The classroom behaviors recorded during three second grade reading lessons provide suitable evidence for comparing the relative merits of using narrative observations versus videotapes as data collection techniques. The comparative analysis illustrates the detail and precision of videotape. Primarily, videotape gives a true picture of linear time,…

  7. Transoral Incisionless Fundoplication (TIF 2.0): A Meta-Analysis of Three Randomized, Controlled Clinical Trials.

    PubMed

    Gerson, Lauren; Stouch, Bruce; Lobonţiu, Adrian

    2018-01-01

    The TIF procedure has emerged as an endoscopic treatment for patients with refractory gastro-esophageal reflux disease (GERD). Previous systematic reviews of the TIF procedure conflated findings from studies with modalities that do not reflect the current 2.0 procedure technique or refined data-backed patient selection criteria. A meta-analysis was conducted using data only from randomized studies that assessed the TIF 2.0 procedure compared to a control. The purpose of the meta-analysis was to determine the efficacy and long-term outcomes associated with performance of the TIF 2.0 procedure in patients with chronic long-term refractory GERD on optimized PPI therapy, including esophageal pH, PPI utilization and quality of life. Methods: Three prospective research questions were predicated on the outcomes of the TIF procedure compared to patients who received PPI therapy or sham, concomitant treatment for GERD, and the patient-reported quality of life. Event rates were calculated using the random effect model. Since the time of follow-up post-TIF procedure was variable, analysis was performed to incorporate the time of follow-up for each individual patient at the 3-year time point. Results: Results from this meta-analysis, including data from 233 patients, demonstrated that TIF subjects at 3 years had improved esophageal pH, a decrease in PPI utilization, and improved quality of life. Conclusions: In a meta-analysis of randomized, controlled trials (RCTs), the TIF procedure data for patients with GERD refractory to PPIs produces significant changes, compared with sham or PPI therapy, in esophageal pH, decreased PPI utilization, and improved quality of life.

  8. Hybrid Wavelet De-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series

    NASA Astrophysics Data System (ADS)

    WANG, D.; Wang, Y.; Zeng, X.

    2017-12-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.
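
    The WD step can be illustrated with a one-level Haar wavelet shrinkage written directly in NumPy. This is a toy stand-in for the paper's wavelet de-noising; the signal and threshold are invented:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet shrinkage: transform, soft-threshold details, invert."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    out = np.empty_like(x)
    out[0::2] = (a + d) / np.sqrt(2)       # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 256)
clean = np.sin(t)                          # smooth underlying signal
noisy = clean + rng.normal(0, 0.3, t.size)
denoised = haar_denoise(noisy, thresh=0.3)

err_noisy = float(np.mean((noisy - clean) ** 2))
err_denoised = float(np.mean((denoised - clean) ** 2))
```

    Because a smooth hydrological signal concentrates in the approximation coefficients while noise spreads into the details, shrinking the details lowers the reconstruction error before the forecasting stage runs.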

  9. Autoregressive modeling for the spectral analysis of oceanographic data

    NASA Technical Reports Server (NTRS)

    Gangopadhyay, Avijit; Cornillon, Peter; Jackson, Leland B.

    1989-01-01

    Over the last decade there has been a dramatic increase in the number and volume of data sets useful for oceanographic studies. Many of these data sets consist of long temporal or spatial series derived from satellites and large-scale oceanographic experiments. These data sets are, however, often 'gappy' in space, irregular in time, and always of finite length. The conventional Fourier transform (FT) approach to the spectral analysis is thus often inapplicable, or where applicable, it provides questionable results. Here, through comparative analysis with the FT for different oceanographic data sets, the possibilities offered by autoregressive (AR) modeling to perform spectral analysis of gappy, finite-length series, are discussed. The applications demonstrate that as the length of the time series becomes shorter, the resolving power of the AR approach as compared with that of the FT improves. For the longest data sets examined here, 98 points, the AR method performed only slightly better than the FT, but for the very short ones, 17 points, the AR method showed a dramatic improvement over the FT. The application of the AR method to a gappy time series, although a secondary concern of this manuscript, further underlines the value of this approach.
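
    A Yule-Walker AR spectrum estimated from a deliberately short record, of the kind compared with the FT above, can be sketched as follows (illustrative only; the model order and the 24-point test series are arbitrary choices):

```python
import numpy as np

def ar_psd(x, order, freqs):
    """Yule-Walker AR spectrum of a short series x at the given frequencies."""
    x = x - x.mean()
    n = len(x)
    # Biased autocovariance estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])        # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])     # innovation variance
    k = np.arange(1, order + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return sigma2 / denom

rng = np.random.default_rng(1)
n = np.arange(24)                                 # deliberately short record
x = np.sin(2 * np.pi * 0.2 * n) + 0.1 * rng.normal(size=n.size)

freqs = np.linspace(0.01, 0.49, 200)
psd = ar_psd(x, order=4, freqs=freqs)
f_peak = float(freqs[np.argmax(psd)])             # should sit near 0.2
```

    Even with 24 points the AR spectrum resolves a sharp peak, whereas a raw periodogram of the same record would smear it, which mirrors the short-series advantage reported above.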

  10. Authentication of animal fats using direct analysis in real time (DART) ionization-mass spectrometry and chemometric tools.

    PubMed

    Vaclavik, Lukas; Hrbek, Vojtech; Cajka, Tomas; Rohlik, Bo-Anne; Pipek, Petr; Hajslova, Jana

    2011-06-08

    A combination of direct analysis in real time (DART) ionization coupled to time-of-flight mass spectrometry (TOFMS) and chemometrics was used for animal fat (lard and beef tallow) authentication. This novel instrumentation was employed for rapid profiling of triacylglycerols (TAGs) and polar compounds present in fat samples and their mixtures. Additionally, fat isolated from pork, beef, and pork/beef admixtures was analyzed. Mass spectral records were processed by principal component analysis (PCA) and stepwise linear discriminant analysis (LDA). DART-TOFMS profiles of TAGs were found to be more suitable for the purpose of discrimination among the examined fat types as compared to profiles of polar compounds. The LDA model developed using TAG data enabled not only reliable classification of samples representing neat fats but also detection of admixed lard and tallow at adulteration levels of 5 and 10% (w/w), respectively. The presented approach was also successfully applied to minced meat prepared from pork and beef with comparable fat content. Using the DART-TOFMS TAG profiles of fat isolated from meat mixtures, detection of 10% pork added to beef and vice versa was possible.
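
    The PCA-then-LDA pipeline used for fat classification can be sketched with scikit-learn on simulated spectral profiles. The intensity matrix below is fabricated for illustration, not DART-TOFMS data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
# Simulated TAG-profile intensities (20 samples x 50 m/z bins):
# the two classes differ along a few informative bins
lard = rng.normal(0.0, 1.0, (10, 50));   lard[:, :5] += 3.0
tallow = rng.normal(0.0, 1.0, (10, 50)); tallow[:, :5] -= 3.0
X = np.vstack([lard, tallow])
y = np.array([0] * 10 + [1] * 10)

scores = PCA(n_components=3).fit_transform(X)      # compress the profiles
lda = LinearDiscriminantAnalysis().fit(scores, y)  # discriminate in PC space
acc = float(lda.score(scores, y))                  # training accuracy
```

    Reducing the mass-spectral profiles to a few principal components before LDA avoids fitting a discriminant in a space with far more variables than samples, which is the usual reason the two steps are chained in chemometrics.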

  11. The effects of the framing of time on delay discounting.

    PubMed

    DeHart, William Brady; Odum, Amy L

    2015-01-01

    We examined the effects of the framing of time on delay discounting. Delay discounting is the process by which delayed outcomes are devalued as a function of time. Time in a titrating delay discounting task is often framed in calendar units (e.g., as 1 week, 1 month, etc.). When time is framed as a specific date, delayed outcomes are discounted less compared to the calendar format. Other forms of framing time, however, have not been explored. All participants completed a titrating calendar-unit delay-discounting task for money. Participants were also assigned to one of two delay discounting tasks: time as dates (e.g., June 1st, 2015) or time in units of days (e.g., 5000 days), using the same delay distribution as the calendar delay-discounting task. Time framed as dates resulted in less discounting compared to the calendar method, whereas time framed as days resulted in greater discounting compared to the calendar method. The hyperboloid model fit best compared to the hyperbola and exponential models. How time is framed may alter how participants attend to the delays as well as how the delayed outcome is valued. Altering how time is framed may serve to improve adherence to goals with delayed outcomes. © Society for the Experimental Analysis of Behavior.
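
    The hyperboloid model reported as the best fit, V = A/(1 + kD)^s, can be fitted with SciPy. The indifference points below are hypothetical, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperboloid(D, k, s):
    """Myerson-Green hyperboloid discounting of a unit amount: V = 1/(1 + kD)^s."""
    return 1.0 / (1.0 + k * D) ** s

delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)   # days
# Hypothetical indifference points (fraction of the delayed amount retained)
values = np.array([0.95, 0.85, 0.65, 0.45, 0.33, 0.22])

(k_hat, s_hat), _ = curve_fit(hyperboloid, delays, values, p0=[0.01, 1.0])
fitted = hyperboloid(delays, k_hat, s_hat)
sse = float(np.sum((values - fitted) ** 2))   # residual sum of squares
```

    A larger fitted k means steeper discounting, so framing manipulations like the days-versus-dates contrast above can be summarized as shifts in k.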

  12. Gravity Tides Extracted from Relative Gravimeter Data by Combining Empirical Mode Decomposition and Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Hongjuan; Guo, Jinyun; Kong, Qiaoli; Chen, Xiaodong

    2018-04-01

    The static observation data from a relative gravimeter contain noise and signals such as gravity tides. This paper focuses on extracting the gravity tides from static relative gravimeter data, for the first time applying the combined method of empirical mode decomposition (EMD) and independent component analysis (ICA), called the EMD-ICA method. The experimental results from the CG-5 gravimeter (SCINTREX Limited, Ontario, Canada) data show that the gravity tides time series derived by EMD-ICA are consistent with the theoretical reference (Longman formula) and the RMS of their differences only reaches 4.4 μGal. The time series of the gravity tides derived by EMD-ICA have a strong correlation with the theoretical time series, with a correlation coefficient greater than 0.997. The accuracy of the gravity tides estimated by EMD-ICA is comparable to the theoretical model and is slightly higher than that of independent component analysis (ICA) alone. EMD-ICA overcomes ICA's limitation of requiring multiple simultaneous observations and slightly improves the extraction accuracy and reliability of gravity tides from relative gravimeter data compared to those estimated with ICA.

  13. Meta-analysis of Prolene Hernia System mesh versus Lichtenstein mesh in open inguinal hernia repair.

    PubMed

    Sanjay, Pandanaboyana; Watt, David G; Ogston, Simon A; Alijani, Afshin; Windsor, John A

    2012-10-01

    This study was designed to systematically analyse all published randomized clinical trials comparing the Prolene Hernia System (PHS) mesh and Lichtenstein mesh for open inguinal hernia repair. A literature search was performed using the Cochrane Colorectal Cancer Group Controlled Trials Register, the Cochrane Central Register of Controlled Trials in the Cochrane Library, MEDLINE, Embase and Science Citation Index Expanded. Randomized trials comparing the Lichtenstein Mesh repair (LMR) with the Prolene Hernia System were included. Statistical analysis was performed using Review Manager Version 5.1 software. The primary outcome measures were hernia recurrence and chronic pain after operation. Secondary outcome measures included surgical time, peri-operative complications, time to return to work, early and long-term postoperative complications. Six randomized clinical trials were identified as suitable, containing 1313 patients. There was no statistical difference between the two types of repair in operation time, time to return to work, incidence of chronic groin pain, hernia recurrence or long-term complications. The PHS group had a higher rate of peri-operative complications, compared to Lichtenstein mesh repair (risk ratio (RR) 0.71, 95% confidence interval 0.55-0.93, P=0.01). The use of PHS mesh was associated with an increased risk of peri-operative complications compared to LMR. Both mesh repair techniques have comparable short- and long-term outcomes. Copyright © 2012 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
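
    Risk ratios such as the peri-operative complication RR reported above are typically pooled on the log scale; a DerSimonian-Laird random-effects sketch with invented (events, total) counts, arranged so the LMR arm has fewer complications, matching the direction reported above:

```python
import numpy as np

# Invented per-study 2x2 counts: (events, total) per arm
lmr = np.array([[12, 100], [8, 90], [15, 120], [10, 110]], dtype=float)
phs = np.array([[18, 105], [14, 95], [22, 118], [16, 112]], dtype=float)

# Per-study log risk ratio (LMR vs PHS) and its variance
log_rr = np.log((lmr[:, 0] / lmr[:, 1]) / (phs[:, 0] / phs[:, 1]))
var = 1 / lmr[:, 0] - 1 / lmr[:, 1] + 1 / phs[:, 0] - 1 / phs[:, 1]

# DerSimonian-Laird random-effects pooling on the log scale
w = 1 / var
fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - fixed) ** 2)               # Cochran's Q
tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_star = 1 / (var + tau2)                           # weights with heterogeneity
pooled_rr = float(np.exp(np.sum(w_star * log_rr) / np.sum(w_star)))
```

    Pooling on the log scale keeps the ratio symmetric around 1; the heterogeneity term tau2 widens the weights when studies disagree more than chance allows.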

  14. Digital microfluidic platform for multiplexing enzyme assays: implications for lysosomal storage disease screening in newborns.

    PubMed

    Sista, Ramakrishna S; Eckhardt, Allen E; Wang, Tong; Graham, Carrie; Rouse, Jeremy L; Norton, Scott M; Srinivasan, Vijay; Pollack, Michael G; Tolun, Adviye A; Bali, Deeksha; Millington, David S; Pamula, Vamsee K

    2011-10-01

    Newborn screening for lysosomal storage diseases (LSDs) has been gaining considerable interest owing to the availability of enzyme replacement therapies. We present a digital microfluidic platform to perform rapid, multiplexed enzymatic analysis of acid α-glucosidase (GAA) and acid α-galactosidase to screen for Pompe and Fabry disorders. The results were compared with those obtained using standard fluorometric methods. We performed bench-based, fluorometric enzymatic analysis on 60 deidentified newborn dried blood spots (DBSs), plus 10 Pompe-affected and 11 Fabry-affected samples, at Duke Biochemical Genetics Laboratory using a 3-mm punch for each assay and an incubation time of 20 h. We used a digital microfluidic platform to automate fluorometric enzymatic assays at Advanced Liquid Logic Inc. using extract from a single punch for both assays, with an incubation time of 6 h. Assays were also performed with an incubation time of 1 h. Assay results were generally comparable, although mean enzymatic activity for GAA using microfluidics was approximately 3 times higher than that obtained using bench-based methods, which could be attributed to higher substrate concentration. Clear separation was observed between the normal and affected samples at both 6- and 1-h incubation times using digital microfluidics. A digital microfluidic platform compared favorably with a clinical reference laboratory to perform enzymatic analysis in DBSs for Pompe and Fabry disorders. This platform presents a new technology for a newborn screening laboratory to screen LSDs by fully automating all the liquid-handling operations in an inexpensive system, providing rapid results.

  15. A hybrid wavelet de-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series.

    PubMed

    Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-01-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The WD-RSPA model consistently produced smaller error measures than the three generic methods, indicating better forecasting capability. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Time series analysis of InSAR data: Methods and trends

    NASA Astrophysics Data System (ADS)

    Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique

    2016-05-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
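
    The phase-wrapping ambiguity described above is easy to demonstrate in one dimension (a toy example; real InSAR unwrapping is spatio-temporal and far harder, and the C-band wavelength value is an assumption):

```python
import numpy as np

wavelength = 0.056            # assumed C-band radar wavelength in metres
t = np.arange(0, 30)          # acquisition index
# Hypothetical steady subsidence of 2 mm per acquisition
displacement = 0.002 * t
phase = 4 * np.pi * displacement / wavelength       # true interferometric phase
wrapped = np.angle(np.exp(1j * phase))              # observed: wraps to (-pi, pi]

unwrapped = np.unwrap(wrapped)                      # resolve the 2*pi ambiguity
recovered = unwrapped * wavelength / (4 * np.pi)    # back to displacement
max_err = float(np.max(np.abs(recovered - displacement)))
```

    One-dimensional unwrapping succeeds here because the per-step phase change stays below pi; when motion between acquisitions exceeds half a wavelength, the ambiguity becomes unresolvable from a single series, which is why the algorithms reviewed above exploit spatial as well as temporal structure.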

  17. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  18. Modeling BAS Dysregulation in Bipolar Disorder.

    PubMed

    Hamaker, Ellen L; Grasman, Raoul P P P; Kamphuis, Jan Henk

    2016-08-01

    Time series analysis is a technique that can be used to analyze the data from a single subject and has great potential to investigate clinically relevant processes like affect regulation. This article uses time series models to investigate the assumed dysregulation of affect that is associated with bipolar disorder. By formulating a number of alternative models that capture different kinds of theoretically predicted dysregulation, and by comparing these in both bipolar patients and controls, we aim to illustrate the heuristic potential this method of analysis has for clinical psychology. We argue that, not only can time series analysis elucidate specific maladaptive dynamics associated with psychopathology, it may also be clinically applied in symptom monitoring and the evaluation of therapeutic interventions.
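
    One concrete form such single-subject models take is an AR(1), where the autoregressive parameter measures emotional inertia. A sketch with simulated series; the "patient"/"control" labels and parameter values are invented for illustration:

```python
import numpy as np

def ar1_coef(x):
    """Least-squares AR(1) coefficient (inertia) of a centred series."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))

rng = np.random.default_rng(3)

def simulate_ar1(phi, n=500):
    """Simulate n steps of x[t] = phi * x[t-1] + noise."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal()
    return x

control = simulate_ar1(0.2)   # affect returns quickly to baseline
patient = simulate_ar1(0.8)   # high inertia: affect lingers (dysregulation)

phi_control = ar1_coef(control)
phi_patient = ar1_coef(patient)
```

    Fitting the same model to daily mood ratings would let a clinician monitor whether the inertia parameter falls over the course of an intervention, the kind of evaluation the abstract envisions.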

  19. Comparative analysis of detection methods for congenital cytomegalovirus infection in a Guinea pig model.

    PubMed

    Park, Albert H; Mann, David; Error, Marc E; Miller, Matthew; Firpo, Matthew A; Wang, Yong; Alder, Stephen C; Schleiss, Mark R

    2013-01-01

    To assess the validity of the guinea pig as a model for congenital cytomegalovirus (CMV) infection by comparing the effectiveness of detecting the virus by real-time polymerase chain reaction (PCR) in blood, urine, and saliva. Case-control study. Academic research. Eleven pregnant Hartley guinea pigs. Blood, urine, and saliva samples were collected from guinea pig pups delivered from pregnant dams inoculated with guinea pig CMV. These samples were then evaluated for the presence of guinea pig CMV by real-time PCR, assuming 100% transmission. Thirty-one pups delivered from 9 inoculated pregnant dams and 8 uninfected control pups underwent testing for guinea pig CMV and for auditory brainstem response hearing loss. Repeated-measures analysis of variance demonstrated no statistically significant difference in weight between the infected pups and the noninfected control pups. Six infected pups demonstrated auditory brainstem response hearing loss. The sensitivity and specificity of the real-time PCR assay on saliva samples were 74.2% and 100.0%, respectively. The sensitivity of the real-time PCR on blood and urine samples was significantly lower than that on saliva samples. Real-time PCR assays of blood, urine, and saliva revealed that saliva samples show high sensitivity and specificity for detecting congenital CMV infection in guinea pigs. This finding is consistent with recent screening studies in human newborns. The guinea pig may be a good animal model in which to compare different diagnostic assays for congenital CMV infection.
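
    The reported saliva-assay figures follow from the standard definitions; the counts below are inferred from the abstract's percentages (74.2% of 31 infected pups is 23 positives; all 8 controls tested negative):

```python
# Saliva real-time PCR confusion counts inferred from the abstract
true_pos, false_neg = 23, 8    # infected pups: PCR-positive vs PCR-negative
true_neg, false_pos = 8, 0     # uninfected control pups

sensitivity = true_pos / (true_pos + false_neg)   # fraction of infected detected
specificity = true_neg / (true_neg + false_pos)   # fraction of controls cleared
```

    Sensitivity is the detection rate among truly infected animals; specificity is the rate of correct negatives among controls, which is why a zero false-positive count yields exactly 100%.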

  20. Living network meta-analysis compared with pairwise meta-analysis in comparative effectiveness research: empirical study.

    PubMed

    Nikolakopoulou, Adriani; Mavridis, Dimitris; Furukawa, Toshi A; Cipriani, Andrea; Tricco, Andrea C; Straus, Sharon E; Siontis, George C M; Egger, Matthias; Salanti, Georgia

    2018-02-28

    To examine whether the continuous updating of networks of prospectively planned randomised controlled trials (RCTs) ("living" network meta-analysis) provides strong evidence against the null hypothesis in comparative effectiveness of medical interventions earlier than the updating of conventional, pairwise meta-analysis. Empirical study of the accumulating evidence about the comparative effectiveness of clinical interventions. Database of network meta-analyses of RCTs identified through searches of Medline, Embase, and the Cochrane Database of Systematic Reviews until 14 April 2015. Network meta-analyses published after January 2012 that compared at least five treatments and included at least 20 RCTs. Clinical experts were asked to identify in each network the treatment comparison of greatest clinical interest. Comparisons for which direct and indirect evidence disagreed, based on a side-splitting (node-splitting) test (P<0.10), were excluded. Cumulative pairwise and network meta-analyses were performed for each selected comparison. Monitoring boundaries of statistical significance were constructed, and the evidence against the null hypothesis was considered strong when the monitoring boundaries were crossed. The significance level was set at α=5% with power of 90% (β=10%), and the anticipated treatment effect to detect was set equal to the final estimate from the network meta-analysis. The frequency of, and time to, strong evidence against the null hypothesis were compared between pairwise and network meta-analyses. 49 comparisons of interest from 44 networks were included; most (n=39, 80%) were between active drugs, mainly from the specialties of cardiology, endocrinology, psychiatry, and rheumatology. 29 comparisons were informed by both direct and indirect evidence (59%), 13 by indirect evidence (27%), and 7 by direct evidence (14%). Both network and pairwise meta-analysis provided strong evidence against the null hypothesis for seven comparisons, but for an additional 10 comparisons only network meta-analysis did so (P=0.002). The median time to strong evidence against the null hypothesis was 19 years with living network meta-analysis and 23 years with living pairwise meta-analysis (hazard ratio 2.78, 95% confidence interval 1.00 to 7.72, P=0.05). Studies directly comparing the treatments of interest continued to be published for eight comparisons after strong evidence had emerged in the network meta-analysis. In comparative effectiveness research, prospectively planned living network meta-analyses produced strong evidence against the null hypothesis more often and earlier than conventional, pairwise meta-analyses. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  1. Scaling Analysis of Alloy Solidification and Fluid Flow in a Rectangular Cavity

    NASA Astrophysics Data System (ADS)

    Plotkowski, A.; Fezi, K.; Krane, M. J. M.

    A scaling analysis was performed to predict trends in alloy solidification in a side-cooled rectangular cavity. The governing equations for energy and momentum were scaled in order to determine the dependence of various aspects of solidification on the process parameters for a uniform initial temperature and an isothermal boundary condition. This work improved on previous analyses by adding considerations for the cooling bulk fluid flow. The analysis predicted the time required to extinguish the superheat, the maximum local solidification time, and the total solidification time. The results were compared to a numerical simulation for an Al-4.5 wt.% Cu alloy with various initial and boundary conditions. Good agreement was found between the simulation results and the trends predicted by the scaling analysis.

  2. A time-frequency approach for the analysis of normal and arrhythmia cardiac signals.

    PubMed

    Mahmoud, Seedahmed S; Fang, Qiang; Davidović, Dragomir M; Cosic, Irena

    2006-01-01

    Previously, electrocardiogram (ECG) signals have been analyzed in either a time-indexed or a spectral form. In reality, the ECG and all other biological signals belong to the family of multicomponent nonstationary signals, for which time-frequency analysis is unavoidable. The Husimi and Wigner distributions are normally used in quantum mechanics for phase-space representations of the wavefunction. In this paper, we introduce the Husimi distribution (HD) to analyze normal and abnormal ECG signals in the time-frequency domain. The abnormal cardiac signal was taken from a patient with supraventricular arrhythmia. Simulation results show that the HD performs well in the analysis of ECG signals compared with the Wigner-Ville distribution (WVD).
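    For signals, the Husimi distribution amounts to a Gaussian-smoothed Wigner distribution, which is closely related to a spectrogram computed with a Gaussian window. A hedged NumPy sketch of such a positive time-frequency representation; this is not the authors' code, and the chirp test signal, window length, and hop size are invented for illustration:

    ```python
    import numpy as np

    def gaussian_spectrogram(x, fs, win_len=64, hop=16, sigma=0.15):
        """Power spectrogram with a Gaussian window: a Husimi-like positive
        time-frequency distribution (illustrative sketch only)."""
        n = np.arange(win_len)
        w = np.exp(-0.5 * ((n - win_len / 2) / (sigma * win_len)) ** 2)
        frames = []
        for start in range(0, len(x) - win_len + 1, hop):
            seg = x[start:start + win_len] * w
            frames.append(np.abs(np.fft.rfft(seg)) ** 2)
        tfd = np.array(frames).T                      # rows: frequency, cols: time
        freqs = np.fft.rfftfreq(win_len, d=1 / fs)
        times = (np.arange(len(frames)) * hop + win_len / 2) / fs
        return tfd, freqs, times

    # A chirp whose frequency rises with time appears as a moving spectral ridge.
    fs = 500.0
    t = np.arange(0, 2, 1 / fs)
    x = np.sin(2 * np.pi * (20 * t + 15 * t ** 2))    # 20 Hz -> 80 Hz sweep
    tfd, freqs, times = gaussian_spectrogram(x, fs)
    peak_freq_start = freqs[tfd[:, 0].argmax()]       # ridge at the first frame
    peak_freq_end = freqs[tfd[:, -1].argmax()]        # ridge at the last frame
    print(peak_freq_start, peak_freq_end)
    ```

    Unlike the Wigner-Ville distribution, this smoothed representation is nonnegative everywhere, at the cost of some time-frequency resolution.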

  3. Treating Depression during Pregnancy and the Postpartum: A Preliminary Meta-Analysis

    ERIC Educational Resources Information Center

    Bledsoe, Sarah E.; Grote, Nancy K.

    2006-01-01

    Objectives: This meta-analysis evaluates treatment effects for nonpsychotic major depression during pregnancy and postpartum comparing interventions by type and timing. Methods: Studies for decreasing depressive severity during pregnancy and postpartum applying treatment trials and standardized measures were included. Standardized mean differences…

  4. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair turnaround time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.
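    Performance trend analysis as described, observing a parameter's history to infer its expected future level, can be sketched with a moving average and a fitted slope. The readings and window length below are illustrative assumptions, not NASA procedure:

    ```python
    import numpy as np

    def trend_via_moving_average(series, window=5):
        """Smooth a parameter history with a moving average and fit a linear
        trend to the smoothed values; a positive slope indicates drift."""
        kernel = np.ones(window) / window
        smoothed = np.convolve(series, kernel, mode="valid")
        slope = np.polyfit(np.arange(len(smoothed)), smoothed, 1)[0]
        return smoothed, slope

    # Simulated temperature readings drifting upward toward a 'safe' limit.
    readings = np.array([70, 71, 70, 72, 73, 72, 74, 75, 74, 76], float)
    smoothed, slope = trend_via_moving_average(readings)
    print(f"per-observation trend: {slope:+.2f} units")
    ```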

  5. American Time-Styles: A Finite-Mixture Allocation Model for Time-Use Analysis

    ERIC Educational Resources Information Center

    Kamakura, Wagner A.

    2009-01-01

    Time-use has already been the subject of numerous studies across multiple disciplines such as economics, marketing, sociology, transportation and urban planning. However, most of this research has focused on comparing demographic groups on a few broadly defined activities (e.g., work for pay, leisure, housework, etc.). In this study we take a…

  6. The comparative analysis of the current-meter method and the pressure-time method used for discharge measurements in the Kaplan turbine penstocks

    NASA Astrophysics Data System (ADS)

    Adamkowski, A.; Krzemianowski, Z.

    2012-11-01

    The paper presents experience gathered over many years of using the current-meter and pressure-time methods for flow rate measurements in many hydropower plants. The integration techniques used in these two methods differ from the recommendations contained in the relevant international standards, mainly the graphical and arithmetical ones. The results of a comparative analysis of both methods, applied at the same time during the hydraulic performance tests of two Kaplan turbines in one of the Polish hydropower plants, are presented in the final part of the paper. For the pressure-time method, the concrete penstocks of the tested turbines required installing special measuring instrumentation inside the penstock. The comparison has shown satisfactory agreement between the results of discharge measurements executed using the two methods: maximum differences between the discharge values did not exceed 1.0%, and the average differences were not greater than 0.5%.

  7. A content analysis of food advertising on Turkish television.

    PubMed

    Akçil Ok, Mehtap; Ercan, Aydan; Kaya, Fatih Suleyman

    2016-12-01

    The aim of this study was to conduct a comprehensive content analysis of television (TV) food advertising and compare various food advertisements on free-to-air Turkish national TV channels by broadcast time (duration) and frequency over the period of a week (19-25 April 2012). TV food advertisements were the unit of content analysis in this study. Each advertisement identified as promoting a food product was analysed for content; non-food advertisements were not analysed, although they were counted as a proportion of the advertisements aired. We recorded all programmes for 4 h each per day (7 p.m.-11 p.m.), totalling 84 h. Five types of food-related advertisements were identified (basic foods, junk foods, meat products, beverages and fast food), and six types of non-food advertisements. The Student t-test and ANOVA were used to compare the mean broadcast times of all prime-time advertising for the two groups. The mean broadcast times of prime-time food and non-food advertisements differed significantly (p < 0.05). This difference is related to the 7 p.m.-8 p.m. prime-time period being considered dinner time for most Turkish families. Additionally, the number and total broadcast time of beverage advertisements increased during this period, while the broadcast time per beverage advertisement decreased (20.8 s per ad). As a result, TV food advertising increased not only during dinner time but also in overall broadcast time per advertisement. These findings may be useful for explaining how advertising can negatively influence food choices, thereby increasing public awareness of the need for health messages targeting obesity. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Uncertainty analysis in seismic tomography

    NASA Astrophysics Data System (ADS)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors, such as regularization, the inversion path, and model parameterization. The result also depends strongly on the initial velocity model and the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, our analysis shows that for manual travel-time picking the uncertainty distribution is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomography code. We used data from geo-engineering and industrial-scale investigations collected by our team from IG PAS.

  9. Learning curves for single incision and conventional laparoscopic right hemicolectomy: a multidimensional analysis.

    PubMed

    Park, Yoonah; Yong, Yuen Geng; Yun, Seong Hyeon; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung

    2015-05-01

    This study aimed to compare the learning curves and early postoperative outcomes of conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, the mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, the learning phase of SIL and CL RHC was completed between 26 and 30 cases and between 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance. The postoperative complication rate was higher in the SIL group than in the CL group, but the difference was not statistically significant (17.1% vs. 3.4%). The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a higher, though statistically insignificant, complication rate than CL RHC during the learning phase.
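    A CUSUM learning-curve analysis of the kind used here accumulates each case's deviation from a target operative time: the curve rises while cases run longer than target and turns over once proficiency is reached. A minimal sketch with invented operative times and target (not the study's data):

    ```python
    import numpy as np

    def cusum_learning_curve(operative_times, target):
        """Cumulative sum of deviations (case time minus target time)."""
        return np.cumsum(np.asarray(operative_times, float) - target)

    # Early cases run long; later cases beat the 120-minute target.
    times = [150, 145, 140, 130, 125, 118, 115, 112, 110, 108]
    curve = cusum_learning_curve(times, target=120)
    turning_point = int(np.argmax(curve)) + 1   # case at which the CUSUM peaks
    print(curve, turning_point)
    ```

    The case index at which the curve peaks is a simple estimate of where the learning phase ends.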

  10. Spatio-temporal hierarchy in the dynamics of a minimalist protein model

    NASA Astrophysics Data System (ADS)

    Matsunaga, Yasuhiro; Baba, Akinori; Li, Chun-Biu; Straub, John E.; Toda, Mikito; Komatsuzaki, Tamiki; Berry, R. Stephen

    2013-12-01

    A method for time series analysis of molecular dynamics simulation of a protein is presented. In this approach, wavelet analysis and principal component analysis are combined to decompose the spatio-temporal protein dynamics into contributions from a hierarchy of different time and space scales. Unlike the conventional Fourier-based approaches, the time-localized wavelet basis captures the vibrational energy transfers among the collective motions of proteins. As an illustrative vehicle, we have applied our method to a coarse-grained minimalist protein model. During the folding and unfolding transitions of the protein, vibrational energy transfers between the fast and slow time scales were observed among the large-amplitude collective coordinates while the other small-amplitude motions are regarded as thermal noise. Analysis employing a Gaussian-based measure revealed that the time scales of the energy redistribution in the subspace spanned by such large-amplitude collective coordinates are slow compared to the other small-amplitude coordinates. Future prospects of the method are discussed in detail.
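    The principal component half of the pipeline, extracting large-amplitude collective coordinates from a trajectory, can be sketched generically with NumPy. This is not the authors' wavelet-PCA code; the toy "trajectory" with one dominant slow mode is an invented illustration:

    ```python
    import numpy as np

    def principal_components(traj):
        """PCA of a (frames x coordinates) trajectory matrix: collective
        coordinates are projections onto eigenvectors of the covariance."""
        centered = traj - traj.mean(axis=0)
        cov = centered.T @ centered / (len(traj) - 1)
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1]        # sort by variance, descending
        return eigvals[order], eigvecs[:, order], centered @ eigvecs[:, order]

    # Toy trajectory: one large-amplitude slow mode plus small thermal noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 500)
    slow_mode = np.sin(t)[:, None] * np.array([3.0, 1.5, 0.5])
    traj = slow_mode + 0.05 * rng.standard_normal((500, 3))
    var, vecs, proj = principal_components(traj)
    print(var[0] / var.sum())   # fraction of variance in the first mode
    ```

    In the paper's scheme, a wavelet transform of projections like `proj[:, 0]` would then localize energy transfers between time scales.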

  11. A comparative study on visual choice reaction time for different colors in females.

    PubMed

    Balakrishnan, Grrishma; Uppinakudru, Gurunandan; Girwar Singh, Gaur; Bangera, Shobith; Dutt Raghavendra, Aswini; Thangavel, Dinesh

    2014-01-01

    Reaction time is one of the important methods for studying a person's central information processing speed and coordinated peripheral movement response. Visual choice reaction time is a type of reaction time that is very important for drivers, pilots, security guards, and so forth. Previous studies focused mainly on simple reaction time, and there are very few studies on visual choice reaction time. The aim of our study was to compare the visual choice reaction times for red, green, and yellow colors in 60 healthy undergraduate female volunteers. After adequate practice, visual choice reaction time was recorded for red, green, and yellow colors using a reaction time machine (RTM 608, Medicaid, Chandigarh). Repeated-measures ANOVA and Bonferroni multiple comparisons were used for analysis, and P < 0.05 was considered statistically significant. The results showed that both red and green had significantly shorter visual choice reaction times than yellow (P < 0.0001 and P = 0.0002, respectively). This could be because the mental processing time for yellow is longer than that for red and green.

  12. Estimating soil hydraulic parameters from transient flow experiments in a centrifuge using parameter optimization technique

    USGS Publications Warehouse

    Šimůnek, Jirka; Nimmo, John R.

    2005-01-01

    A modified version of the Hydrus software package that can directly or inversely simulate water flow in a transient centrifugal field is presented. The inverse solver for estimation of the soil hydraulic parameters is then applied to multirotation transient flow experiments in a centrifuge. Using time-variable water contents measured at a sequence of several rotation speeds, soil hydraulic properties were successfully estimated by numerical inversion of the transient experiments. The inverse method was then evaluated by comparing the estimated soil hydraulic properties with those determined independently using an equilibrium analysis. The optimized soil hydraulic properties compared well with those determined using the equilibrium analysis and a steady-state experiment. Multirotation experiments in a centrifuge not only offer significant time savings by accelerating the flow process but also provide significantly more information for the parameter estimation procedure compared to multistep outflow experiments in a gravitational field.

  13. Time and Frequency-Domain Cross-Verification of SLS 6DOF Trajectory Simulations

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew; McCullough, John

    2017-01-01

    The Space Launch System (SLS) Guidance, Navigation, and Control (GNC) team and its partners have developed several time- and frequency-based simulations for development and analysis of the proposed SLS launch vehicle. The simulations differ in fidelity, and some have unique functionality that allows them to perform specific analyses. Examples of the purposes of the various models are: trajectory simulation, multi-body separation, Monte Carlo, hardware in the loop, loads, and frequency-domain stability analyses. While no two simulations are identical, many of the models are essentially six degree-of-freedom (6DOF) representations of the SLS plant dynamics, hardware implementation, and flight software. Thus, at a high level, all of those models should be in agreement. Comparison of outputs from several SLS trajectory and stability analysis tools is ongoing as part of the program's current verification effort. The purpose of these comparisons is to highlight modeling and analysis differences, verify simulation data sources, identify inconsistencies and minor errors, and ultimately to verify output data as being a good representation of the vehicle and subsystem dynamics. This paper will show selected verification work in both the time and frequency domain from the current design analysis cycle of the SLS for several of the design and analysis simulations. In the time domain, the tools that will be compared are MAVERIC, CLVTOPS, SAVANT, STARS, ARTEMIS, and POST 2. For the frequency domain analysis, the tools to be compared are FRACTAL, SAVANT, and STARS. The paper will include discussion of these tools, including their capabilities, configurations, and the uses to which they are put in the SLS program. Determination of the criteria by which the simulations are compared (matching criteria) requires thoughtful consideration, and several pitfalls can severely penalize a simulation if they are not handled carefully.
The paper will discuss these considerations and will present a framework for responding to these issues when they arise. For example, small event timing differences can lead to large differences in mass properties if the criteria are to measure those properties at the same time, or large differences in altitude if the criteria are to measure those properties when the simulation experiences a staging event. Similarly, a tiny difference in phase can lead to large gain margin differences for frequency-domain comparisons of gain margins.

  14. Comparison between three option, four option and five option multiple choice question tests for quality parameters: A randomized study.

    PubMed

    Vegada, Bhavisha; Shukla, Apexa; Khilnani, Ajeetkumar; Charan, Jaykaran; Desai, Chetna

    2016-01-01

    Most academic teachers use four or five options per item in multiple choice question (MCQ) tests for formative and summative assessment. The optimal number of options per MCQ item is a matter of considerable debate among academic teachers in various educational fields, and there is a scarcity of published literature on the optimum number of options per MCQ item in medical education. The aim was to compare three-option, four-option, and five-option MCQ tests on the quality parameters: reliability, validity, item analysis, distracter analysis, and time analysis. Participants were 3rd semester M.B.B.S. students, divided randomly into three groups. Each group was randomly given one set of MCQ tests with three, four, or five options per item. Following the marking of the tests, the participants' option selections were analyzed, and the mean marks, mean time, validity, reliability, facility value, discrimination index, point biserial value, and distracter analysis of the three option formats were compared. Students scored higher (P < 0.001) and took less time (P = 0.009) to complete the three-option test compared with the four-option and five-option groups. Facility value was higher (P = 0.004) in the three-option group than in the four-option and five-option groups. There was no significant difference between the three groups in validity, reliability, or item discrimination. Nonfunctioning distracters were more common in the four-option and five-option groups than in the three-option group. Assessment based on three-option MCQs can therefore be preferred over four-option and five-option MCQs.
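    Two of the item-analysis quantities compared in the study, facility value and discrimination index, are simple proportions. A sketch with hypothetical counts (the numbers below are not the study's data; the 27% upper/lower grouping is a common convention):

    ```python
    def facility_value(correct, total):
        """Proportion of examinees answering the item correctly."""
        return correct / total

    def discrimination_index(upper_correct, lower_correct, group_size):
        """(Upper-group correct minus lower-group correct) / group size,
        using the top and bottom scoring groups (often 27% each)."""
        return (upper_correct - lower_correct) / group_size

    # Hypothetical item: 40 of 60 students correct overall;
    # 18 of 20 top scorers and 8 of 20 bottom scorers answered correctly.
    fv = facility_value(40, 60)
    di = discrimination_index(18, 8, 20)
    print(f"facility value={fv:.2f}, discrimination index={di:.2f}")
    ```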

  15. Real-Time Support on IEEE 802.11 Wireless Ad-Hoc Networks: Reality vs. Theory

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Suh, Jinwoo

    The usable throughput of an IEEE 802.11 system for an application is much less than the raw bandwidth. Although 802.11b has a theoretical maximum of 11 Mbps, more than half of the bandwidth is consumed by overhead, leaving at most 5 Mbps usable. Considering this characteristic, this paper proposes and analyzes a real-time distributed scheduling scheme based on existing IEEE 802.11 wireless ad-hoc networks, using USC/ISI's Power Aware Sensing Tracking and Analysis (PASTA) hardware platform. We compared the distributed real-time scheduling scheme with a real-time polling scheme with respect to meeting deadlines, and compared the measured real bandwidth with the theoretical result. The theoretical and experimental results show that the distributed scheduling scheme can guarantee real-time traffic and improves performance by up to 74% compared with the polling scheme.

  16. [Study of the reliability in one dimensional size measurement with digital slit lamp microscope].

    PubMed

    Wang, Tao; Qi, Chaoxiu; Li, Qigen; Dong, Lijie; Yang, Jiezheng

    2010-11-01

    To study the reliability of the digital slit lamp microscope as a tool for quantitative analysis in one-dimensional size measurement. Three single-blinded observers acquired and repeatedly measured images of 4.00 mm and 10.00 mm targets on a vernier caliper, simulating the human pupil and corneal diameter, under a China-made digital slit lamp microscope at objective magnifications of 4x, 10x, 16x, 25x, and 40x for the 4.00 mm target and 4x, 10x, and 16x for the 10.00 mm target. The correctness and precision of the measurements were compared. For the 4.00 mm images, the three investigators' average values ranged from 3.98 to 4.06 mm; for the 10.00 mm images, the average values fell within 10.00 to 10.04 mm. For the 4.00 mm images, except for A4, B25, C16, and C25, significant differences were noted between the measured values and the true value. For the 10.00 mm images, except for A10, statistically significant differences were found between the measured values and the true value. Comparing results for the same size measured at different magnifications by the same investigator, except for investigator A's measurements of the 10.00 mm dimension, the measurements of all remaining investigators differed significantly across magnifications. Comparing measurements of the same size and magnification among investigators, measurements of 4.00 mm at 4x magnification showed no significant difference; the remaining results were statistically significant. The coefficients of variation of all measurement results were less than 5%, and as magnification increased, the coefficient of variation decreased. Measurement with the digital slit lamp microscope in one dimension has good reliability, but reliability analysis should be performed before the instrument is used for quantitative analysis, in order to reduce systematic errors.
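    The coefficient of variation used as the precision criterion above is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch with invented repeated measurements of a nominal 4.00 mm target:

    ```python
    import statistics

    def coefficient_of_variation(values):
        """CV as a percentage: (sample standard deviation / mean) * 100."""
        return statistics.stdev(values) / statistics.mean(values) * 100

    # Hypothetical repeated measurements of a 4.00 mm target.
    measurements = [4.02, 3.99, 4.05, 4.01, 4.00]
    cv = coefficient_of_variation(measurements)
    print(f"CV = {cv:.2f}%")
    ```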

  17. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    PubMed

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be the most challenging due to their complex anatomy. WaveOne and OneShape are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of WaveOne and OneShape files in primary root canals using cone beam computed tomographic (CBCT) analysis. This was an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with a minimum 7 mm root length were included in the study. CBCT images were taken before and after instrumentation for each group, and dentin thickness, centering ability, canal transportation, and instrumentation time were evaluated for each group. A significant difference was found in instrumentation time and canal transportation between the two groups: WaveOne showed less canal transportation than OneShape, and the mean instrumentation time of WaveOne was significantly shorter. The reciprocating single-file system was found to be faster, with far fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  18. An analysis of international health care logistics: the benefits and implications of implementing just-in-time systems in the health care industry.

    PubMed

    Jarrett, P Gary

    2006-01-01

    The primary purpose of this study is to undertake a diagnostic investigation of the international health care logistical environment, determine whether regulatory policies or industry procedures have hindered the implementation of just-in-time (JIT) systems, and recommend operational improvements achievable by implementing JIT systems. The analysis was conducted in a systematic manner and compared the anticipated benefits with benefits validated in other industries from the implementation of JIT. An extensive literature review was conducted, and the cost and benefit outcomes achievable from a health care JIT implementation were compared with those achieved by the manufacturing, service, and retail industries. Chiefly, it was found that the health service market must be restructured to encourage greater price competition among providers, and that a new standardization process should eliminate duplication of products and realize substantial savings.

  19. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    PubMed

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with that of an analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 years, range 20-49; 82% female) underwent three consecutive 256-channel resting-state EEGs at one-year intervals. Results of the frequency analysis under AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
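    The agreement statistic reported between the two pipelines, the Pearson correlation coefficient, can be computed directly from paired per-subject values. A minimal sketch; the AA/VA spectral values below are invented for illustration, not the study's data:

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient between two paired series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical per-subject relative alpha power from the automated (AA)
    # and visually controlled (VA) pipelines.
    aa = [0.42, 0.35, 0.51, 0.29, 0.47, 0.38]
    va = [0.44, 0.33, 0.50, 0.31, 0.45, 0.40]
    print(f"r = {pearson_r(aa, va):.2f}")
    ```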

  20. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a widely used method to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy combined with surrogate data analysis is proposed as a new measure for assessing the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, presents a different way of matching signal patterns and reveals distinct complexity behaviors. Simulations conducted on synthetic data and traffic signals provide a comparative study that demonstrates the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. First, it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. Second, the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems through the similarity measure.
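    The baseline the paper modifies, classic sample entropy (SampEn), is the negative logarithm of the ratio of (m+1)-length to m-length template matches within a tolerance r. A textbook sketch, not the paper's generalized, similarity-measure-based variant; the test series are invented:

    ```python
    import math

    def sample_entropy(series, m=2, r=0.2):
        """Classic SampEn: -ln(A/B), where B counts template pairs of length m
        within Chebyshev tolerance r and A counts pairs of length m+1.
        O(n^2) in the series length."""
        n = len(series)

        def match_count(length):
            count = 0
            for i in range(n - length + 1):
                for j in range(i + 1, n - length + 1):
                    if max(abs(series[i + k] - series[j + k])
                           for k in range(length)) <= r:
                        count += 1
            return count

        b = match_count(m)
        a = match_count(m + 1)
        return -math.log(a / b) if a and b else float("inf")

    # A strictly periodic series is highly predictable (low SampEn);
    # a chaotic one is not (higher SampEn).
    periodic = [1.0, 2.0] * 50
    noisy, x = [], 0.1
    for _ in range(100):
        x = 4.0 * x * (1.0 - x)        # chaotic logistic map
        noisy.append(x)
    print(sample_entropy(periodic), sample_entropy(noisy))
    ```

    The tolerance r is usually chosen relative to the series' standard deviation (e.g. 0.2 times the SD); the absolute value used here is an illustrative simplification.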

  1. Robotic partial nephrectomy shortens warm ischemia time, reducing suturing time kinetics even for an experienced laparoscopic surgeon: a comparative analysis.

    PubMed

    Faria, Eliney F; Caputo, Peter A; Wood, Christopher G; Karam, Jose A; Nogueras-González, Graciela M; Matin, Surena F

    2014-02-01

    Laparoscopic and robotic partial nephrectomy (LPN and RPN) are strongly influenced by tumor complexity and the learning curve. We analyzed a consecutive experience of RPN and LPN to discern whether warm ischemia time (WIT) is in fact improved while accounting for these two confounding variables and, if so, by which particular component of WIT. This is a retrospective analysis of consecutive procedures performed by a single surgeon between 2002-2008 (LPN) and 2008-2012 (RPN). The durations of individual steps, including tumor excision and suturing of the intrarenal defect and parenchyma, were recorded at the time of surgery. Univariate and multivariate analyses were used to evaluate the influence on WIT of the learning curve, tumor complexity, and the time kinetics of the individual steps. Additionally, we considered the effect of RPN on the learning curve. A total of 146 LPNs and 137 RPNs were included. For renal function, WIT, suturing time, and renorrhaphy time, statistically significant differences were found in favor of RPN (p < 0.05). In the univariate analysis, surgical procedure, learning curve, clinical tumor size, and RENAL nephrometry score were statistically significant predictors of WIT (p < 0.05). RPN decreased WIT by approximately 7 min on average compared with LPN, even when adjusting for learning curve, tumor complexity, and both together (p < 0.001). We found RPN was associated with a shorter WIT when controlling for the influence of the learning curve and tumor complexity; the time required for tumor excision was not shortened, but the time required for the suturing steps was significantly shortened.

  2. Desorption atmospheric pressure photoionization and direct analysis in real time coupled with travelling wave ion mobility mass spectrometry.

    PubMed

    Räsänen, Riikka-Marjaana; Dwivedi, Prabha; Fernández, Facundo M; Kauppila, Tiina J

    2014-11-15

    Ambient mass spectrometry (MS) is a tool for screening analytes directly from sample surfaces. However, background impurities may complicate the spectra, and therefore fast separation techniques are needed. Here, we demonstrate the use of travelling wave ion mobility spectrometry in a comparative study of two ambient MS techniques. Desorption atmospheric pressure photoionization (DAPPI) and direct analysis in real time (DART) were coupled with travelling wave ion mobility mass spectrometry (TWIM-MS) for highly selective surface analysis. The ionization efficiencies of DAPPI and DART were compared. Test compounds were bisphenol A, benzo[a]pyrene, ranitidine, cortisol and α-tocopherol. DAPPI-MS and DART-TWIM-MS were also applied to the analysis of chloroquine from dried blood spots and α-tocopherol from almond surfaces, and DAPPI-TWIM-MS was applied to the analysis of pharmaceuticals and multivitamin tablets. DAPPI was approximately 100 times more sensitive than DART for bisphenol A and 10-20 times more sensitive for the other compounds. The limits of detection were between 30-290 and 330-8200 fmol for DAPPI and DART, respectively. Also, for the authentic samples, DAPPI ionized chloroquine and α-tocopherol more efficiently than DART. The mobility separation enabled the detection of species with low signal intensities, e.g. thiamine and cholecalciferol, in the DAPPI-TWIM-MS analysis of multivitamin tablets. DAPPI ionized the studied compounds of interest more efficiently than DART. For both DAPPI and DART, the mobility separation prior to MS analysis reduced the amount of chemical noise in the mass spectrum and significantly increased the signal-to-noise ratio for the analytes. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Stability analysis for a delay differential equations model of a hydraulic turbine speed governor

    NASA Astrophysics Data System (ADS)

    Halanay, Andrei; Safta, Carmen A.; Dragoi, Constantin; Piraianu, Vlad F.

    2017-01-01

    The paper aims to study the dynamic behavior of a speed governor for a hydraulic turbine using a mathematical model. The proposed nonlinear mathematical model consists of a system of delay differential equations (DDEs), which is compared with established models based on ordinary differential equations (ODEs). A new kind of nonlinearity is introduced through a time delay. The delays can characterize different running conditions of the speed governor. For example, the spool displacement of the hydraulic amplifier may be blocked by impurities in the oil supply system, so the amplifier responds with a time delay relative to the control signal. Numerical simulations are presented in a comparative manner. A stability analysis of the hydraulic control system is also performed. Conclusions about the dynamic behavior drawn from the DDE model of a hydraulic turbine speed governor are useful for modeling and controlling hydropower plants.
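A delay term can be handled numerically with a fixed-step integrator and a history buffer (the "method of steps" idea). A toy sketch for the scalar test equation x'(t) = -a·x(t) - b·x(t - τ), which is an illustrative stand-in, not the paper's governor model:

```python
import numpy as np

def integrate_dde(a, b, tau, x0, t_end, dt=0.001):
    """Fixed-step Euler integration of x'(t) = -a*x(t) - b*x(t - tau)
    with constant pre-history x(t) = x0 for t <= 0.  A circular buffer
    supplies the delayed value x(t - tau)."""
    n_delay = int(round(tau / dt))
    n_steps = int(round(t_end / dt))
    x = np.empty(n_steps + 1)
    x[0] = float(x0)
    buf = np.full(max(n_delay, 1), float(x0))   # past values, one delay deep
    for k in range(n_steps):
        x_delayed = buf[k % n_delay] if n_delay > 0 else x[k]
        x[k + 1] = x[k] + dt * (-a * x[k] - b * x_delayed)
        if n_delay > 0:
            buf[k % n_delay] = x[k]             # overwrite the oldest entry
    return x
```

With b = 0 the delay drops out and the solution reduces to exponential decay, which gives a quick correctness check; for a > |b| the delayed system remains stable.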

  4. Comprehensive transcriptome analysis and flavonoid profiling of Ginkgo leaves reveals flavonoid content alterations in day-night cycles.

    PubMed

    Ni, Jun; Dong, Lixiang; Jiang, Zhifang; Yang, Xiuli; Chen, Ziying; Wu, Yuhuan; Xu, Maojun

    2018-01-01

    Ginkgo leaves are raw materials for flavonoid extraction. Thus, the timing of their harvest is important to optimize the extraction efficiency, which benefits the pharmaceutical industry. In this research, we compared the transcriptomes of Ginkgo leaves harvested at midday and midnight. The differentially expressed genes with the highest probabilities in each step of flavonoid biosynthesis were down-regulated at midnight. Furthermore, real-time PCR corroborated the transcriptome results, indicating the decrease in flavonoid biosynthesis at midnight. The flavonoid profiles of Ginkgo leaves harvested at midday and midnight were compared, and the total flavonoid content decreased at midnight. A detailed analysis of individual flavonoids showed that most of their contents were decreased by various degrees. Our results indicated that circadian rhythms affected the flavonoid contents in Ginkgo leaves, which provides valuable information for optimizing their harvesting times to benefit the pharmaceutical industry.

  5. ICL-based TDLAS sensor for real-time breath gas analysis of carbon monoxide isotopes.

    PubMed

    Ghorbani, Ramin; Schmidt, Florian M

    2017-05-29

    We present a compact sensor for carbon monoxide (CO) in air and exhaled breath based on a room temperature interband cascade laser (ICL) operating at 4.69 µm, a low-volume circular multipass cell and wavelength modulation absorption spectroscopy. A fringe-limited (1σ) sensitivity of 6.5 × 10⁻⁸ cm⁻¹ Hz⁻¹/² and a detection limit of 9 ± 5 ppbv at 0.07 s acquisition time are achieved, which constitutes a 25-fold improvement compared to direct absorption spectroscopy. Integration over 10 s increases the precision to 0.6 ppbv. The setup also allows measuring the stable isotope ¹³CO in breath. We demonstrate quantification of indoor air CO and real-time detection of CO expirograms from healthy non-smokers and a healthy smoker before and after smoking. Isotope ratio analysis indicates depletion of ¹³CO in breath compared to natural abundance.

  6. Repressing the effects of variable speed harmonic orders in operational modal analysis

    NASA Astrophysics Data System (ADS)

    Randall, R. B.; Coats, M. D.; Smith, W. A.

    2016-10-01

    Discrete frequency components such as machine shaft orders can disrupt the operation of normal Operational Modal Analysis (OMA) algorithms. With constant speed machines, they have been removed using time synchronous averaging (TSA). This paper compares two approaches for varying speed machines. In one method, signals are transformed into the order domain and, after the removal of shaft-speed-related components by a cepstral notching method, are transformed back to the time domain to allow normal OMA. In the other, simpler approach, an exponential shortpass lifter is applied directly to the time-domain cepstrum to enhance the modal information at the expense of other disturbances. For simulated gear signals with speed variations of both ±5% and ±15%, the simpler approach was found to give better results. The TSA method is shown not to work in either case. The paper compares the results with those obtained using a stationary random excitation.
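The exponential shortpass liftering idea can be illustrated in a few lines: transform the log-magnitude spectrum to the real cepstrum, weight each quefrency bin by a decaying exponential, and transform back. This is a generic sketch of cepstral liftering, not the authors' exact OMA pipeline; the window scale `tau_frac` is an arbitrary illustrative parameter:

```python
import numpy as np

def liftered_log_spectrum(x, tau_frac=0.05):
    """Exponential shortpass liftering: take the real cepstrum of the
    log-magnitude spectrum, weight quefrency bin q by exp(-q/tau)
    (symmetrically, with tau = tau_frac * len(x)), and transform back.
    Smooth spectral structure (modes) is kept; high-quefrency content
    such as harmonic combs is attenuated."""
    n = len(x)
    log_mag = np.log(np.abs(np.fft.rfft(x)) + 1e-12)
    ceps = np.fft.irfft(log_mag, n)           # real cepstrum (even sequence)
    q = np.arange(n)
    q = np.minimum(q, n - q)                  # symmetric quefrency index
    lifter = np.exp(-q / (tau_frac * n))
    return np.fft.rfft(ceps * lifter).real    # liftered log-magnitude spectrum
```

Applied to a signal with strong tonal components plus noise, the liftered log spectrum is visibly smoother than the raw one, retaining only the broad (modal) envelope.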

  7. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 1

    NASA Technical Reports Server (NTRS)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning with the emphasis on fuel savings is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings conservatively estimated to be between 1.2 and 2.5 percent could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimum resulting in their selecting other than the optimum North Atlantic Organized Track about 50 percent of the time.

  8. Long Term Precipitation Pattern Identification and Derivation of Non Linear Precipitation Trend in a Catchment using Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, Poornima; Jothiprakash, Vinayakam

    2017-04-01

    Precipitation is the major component in the hydrologic cycle. Knowledge of not only the total amount of rainfall over a catchment but also the pattern of its spatial and temporal distribution is important for managing water resources systems efficiently. The trend is the long-term direction of a time series and determines its overall pattern. Singular Spectrum Analysis (SSA) is a time series analysis technique that decomposes the time series into small components (eigentriples). This property of SSA is used here to extract the trend component of the rainfall time series. To derive the trend, the eigentriples corresponding to it must be identified; for this purpose, the present study couples SSA with a periodogram analysis of the eigentriples. Seasonal England and Wales Precipitation (EWP) data for the period 1766-2013 were analyzed, and a nonlinear trend was derived from the precipitation data. For comparison, the Mann-Kendall (MK) test was also used to detect trends in the EWP seasonal series. The results showed that the MK test can only detect the presence of a positive or negative trend at a given significance level, whereas the proposed SSA methodology extracts the nonlinear trend in the rainfall series along with its shape. A detailed comparison of both methodologies and their results will be discussed in the presentation.
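The core SSA decomposition (embedding into a Hankel trajectory matrix, SVD, and diagonal averaging of each elementary matrix back to a series) can be sketched as follows. The window length and the grouping of eigentriples into a trend are the analyst's choices, and the study's periodogram-based selection step is not shown:

```python
import numpy as np

def ssa_components(x, window):
    """Basic SSA: embed the series in a Hankel trajectory matrix, take
    the SVD, and diagonal-average each rank-one elementary matrix back
    into a series (one component per eigentriple)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for i in range(len(s)):
        elem = s[i] * np.outer(u[:, i], vt[i])   # rank-one elementary matrix
        # Hankelization: average each anti-diagonal back to one sample.
        series = np.array([np.diag(elem[:, ::-1], k - 1 - d).mean()
                           for d in range(n)])
        comps.append(series)
    return comps
```

Summing all components reconstructs the original series exactly; a trend estimate is obtained by summing only the slowly varying leading components.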

  9. Real-time open-loop frequency response analysis of flight test data

    NASA Technical Reports Server (NTRS)

    Bosworth, J. T.; West, J. C.

    1986-01-01

    A technique has been developed to compare the open-loop frequency response of a flight test aircraft real time with linear analysis predictions. The result is direct feedback to the flight control systems engineer on the validity of predictions and adds confidence for proceeding with envelope expansion. Further, gain and phase margins can be tracked for trends in a manner similar to the techniques used by structural dynamics engineers in tracking structural modal damping.
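Given sampled open-loop frequency-response data, gain and phase margins of the kind tracked here can be read off numerically. A minimal nearest-sample sketch (assumes a dense frequency grid and a single crossover of each type; not the flight-test system's actual implementation):

```python
import numpy as np

def margins(freq, g):
    """Gain and phase margins from sampled open-loop frequency-response
    data g(jw) (complex).  Phase margin: 180 deg plus the phase at the
    gain crossover (|g| = 1); gain margin: minus the gain in dB at the
    phase crossover (phase = -180 deg).  Nearest-sample estimates."""
    mag_db = 20.0 * np.log10(np.abs(g))
    phase = np.degrees(np.unwrap(np.angle(g)))
    i_gc = np.argmin(np.abs(mag_db))          # sample closest to 0 dB
    i_pc = np.argmin(np.abs(phase + 180.0))   # sample closest to -180 deg
    return -mag_db[i_pc], 180.0 + phase[i_gc]
```

For example, for G(s) = 1/(s(s+1)(s+2)) the gain margin is 20·log10(6) ≈ 15.6 dB and the phase margin is about 53°.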

  10. MS Data Miner: a web-based software tool to analyze, compare, and share mass spectrometry protein identifications.

    PubMed

    Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J

    2012-09-01

    Data processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, data comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open sourced system includes a spectra validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM during a proteomics study comprised of more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids

    PubMed Central

    Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing

    2008-01-01

    Background Real-time PCR techniques are being widely used for nucleic acids analysis, but one limitation of current, frequently employed real-time PCR methods is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair carries an attached universal template (UT), and the FP is identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP technique can be successfully applied to nucleic acid analysis, and that it offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756
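Downstream of any real-time PCR chemistry, relative quantification is commonly done with the 2^(−ΔΔCt) method. A minimal sketch (the generic method, not specific to AUDP; it assumes roughly 100% amplification efficiency for both target and reference assays):

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    """Relative expression by the 2^(-delta-delta-Ct) method: normalize
    the target Ct to a reference gene in both the sample and the
    calibrator, then exponentiate the difference."""
    dd_ct = ((ct_target_sample - ct_ref_sample)
             - (ct_target_calib - ct_ref_calib))
    return 2.0 ** (-dd_ct)
```

For instance, a sample with ΔCt two cycles smaller than the calibrator's corresponds to a four-fold higher relative expression.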

  12. Challenges of Guarantee-Time Bias

    PubMed Central

    Giobbie-Hurder, Anita; Gelber, Richard D.; Regan, Meredith M.

    2013-01-01

    The potential for guarantee-time bias (GTB), also known as immortal time bias, exists whenever an analysis that is timed from enrollment or random assignment, such as disease-free or overall survival, is compared across groups defined by a classifying event occurring sometime during follow-up. The types of events associated with GTB are varied and may include the occurrence of objective disease response, onset of toxicity, or seroconversion. However, comparative analyses using these types of events as predictors are different from analyses using baseline characteristics that are specified completely before the occurrence of any outcome event. Recognizing the potential for GTB is not always straightforward, and it can be challenging to know when GTB is influencing the results of an analysis. This article defines GTB, provides examples of GTB from several published articles, and discusses three analytic techniques that can be used to remove the bias: conditional landmark analysis, extended Cox model, and inverse probability weighting. The strengths and limitations of each technique are presented. As an example, we explore the effect of bisphosphonate use on disease-free survival (DFS) using data from the BIG (Breast International Group) 1-98 randomized clinical trial. An analysis using a naive approach showed substantial benefit for patients who received bisphosphonate therapy. In contrast, analyses using the three methods known to remove GTB showed no statistical evidence of a reduction in risk of a DFS event with bisphosphonate therapy. PMID:23835712
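The bias is easy to reproduce in a toy simulation (hypothetical numbers, exponential survival, and no true effect of response on survival): the naive grouping makes "responders" look better simply because they had to survive long enough to respond, while a conditional landmark analysis removes the artifact:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
surv = rng.exponential(12.0, n)        # survival time (months); no true effect
resp_time = rng.exponential(6.0, n)    # time at which "response" would occur

# Naive (biased) grouping: responder = response observed before death.
responder = resp_time < surv
naive_gap = surv[responder].mean() - surv[~responder].mean()

# Conditional landmark analysis at L = 6 months: keep only patients
# still alive at L and freeze response status as of L.
L = 6.0
alive = surv > L
resp_by_L = (resp_time <= L) & alive
grp1 = surv[alive & resp_by_L] - L     # residual survival from the landmark
grp0 = surv[alive & ~resp_by_L] - L
landmark_gap = grp1.mean() - grp0.mean()
```

Despite the null, the naive gap is large (on the order of many months), while the landmark gap is near zero; the extended Cox model and inverse probability weighting address the same problem by other means.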

  13. Visualisation and quantitative analysis of the rodent malaria liver stage by real time imaging.

    PubMed

    Ploemen, Ivo H J; Prudêncio, Miguel; Douradinha, Bruno G; Ramesar, Jai; Fonager, Jannik; van Gemert, Geert-Jan; Luty, Adrian J F; Hermsen, Cornelus C; Sauerwein, Robert W; Baptista, Fernanda G; Mota, Maria M; Waters, Andrew P; Que, Ivo; Lowik, Clemens W G M; Khan, Shahid M; Janse, Chris J; Franke-Fayard, Blandine M D

    2009-11-18

    The quantitative analysis of Plasmodium development in the liver in laboratory animals and in cultured cells is hampered by low parasite infection rates and the complicated methods required to monitor intracellular development. As a consequence, this important phase of the parasite's life cycle has been poorly studied compared to blood stages, for example in screening anti-malarial drugs. Here we report the use of a transgenic P. berghei parasite, PbGFP-Luc(con), expressing the bioluminescent reporter protein luciferase to visualize and quantify parasite development in liver cells both in culture and in live mice using real-time luminescence imaging. The reporter-parasite based quantification in cultured hepatocytes by real-time imaging or using a microplate reader correlates very well with established quantitative RT-PCR methods. For the first time the liver stage of Plasmodium is visualized in whole bodies of live mice, and we were able to discriminate as few as 1-5 infected hepatocytes per liver in mice using 2D-imaging and to identify individual infected hepatocytes by 3D-imaging. The analysis of liver infections by whole body imaging shows a good correlation with quantitative RT-PCR analysis of extracted livers. The luminescence-based analysis of the effects of various drugs on in vitro hepatocyte infection shows that this method can effectively be used for in vitro screening of compounds targeting Plasmodium liver stages. Furthermore, by analysing the effect of primaquine and tafenoquine in vivo we demonstrate the applicability of real-time imaging to assess parasite drug sensitivity in the liver.
The simplicity and speed of quantitative analysis of liver-stage development by real-time imaging compared to the PCR methodologies, as well as the possibility to analyse liver development in live mice without surgery, opens up new possibilities for research on Plasmodium liver infections and for validating the effect of drugs and vaccines on the liver stage of Plasmodium.

  14. Visualisation and Quantitative Analysis of the Rodent Malaria Liver Stage by Real Time Imaging

    PubMed Central

    Douradinha, Bruno G.; Ramesar, Jai; Fonager, Jannik; van Gemert, Geert-Jan; Luty, Adrian J. F.; Hermsen, Cornelus C.; Sauerwein, Robert W.; Baptista, Fernanda G.; Mota, Maria M.; Waters, Andrew P.; Que, Ivo; Lowik, Clemens W. G. M.; Khan, Shahid M.; Janse, Chris J.; Franke-Fayard, Blandine M. D.

    2009-01-01

    The quantitative analysis of Plasmodium development in the liver in laboratory animals and in cultured cells is hampered by low parasite infection rates and the complicated methods required to monitor intracellular development. As a consequence, this important phase of the parasite's life cycle has been poorly studied compared to blood stages, for example in screening anti-malarial drugs. Here we report the use of a transgenic P. berghei parasite, PbGFP-Luc(con), expressing the bioluminescent reporter protein luciferase to visualize and quantify parasite development in liver cells both in culture and in live mice using real-time luminescence imaging. The reporter-parasite based quantification in cultured hepatocytes by real-time imaging or using a microplate reader correlates very well with established quantitative RT-PCR methods. For the first time the liver stage of Plasmodium is visualized in whole bodies of live mice, and we were able to discriminate as few as 1–5 infected hepatocytes per liver in mice using 2D-imaging and to identify individual infected hepatocytes by 3D-imaging. The analysis of liver infections by whole body imaging shows a good correlation with quantitative RT-PCR analysis of extracted livers. The luminescence-based analysis of the effects of various drugs on in vitro hepatocyte infection shows that this method can effectively be used for in vitro screening of compounds targeting Plasmodium liver stages. Furthermore, by analysing the effect of primaquine and tafenoquine in vivo we demonstrate the applicability of real-time imaging to assess parasite drug sensitivity in the liver.
The simplicity and speed of quantitative analysis of liver-stage development by real-time imaging compared to the PCR methodologies, as well as the possibility to analyse liver development in live mice without surgery, opens up new possibilities for research on Plasmodium liver infections and for validating the effect of drugs and vaccines on the liver stage of Plasmodium. PMID:19924309

  15. Effect of Concomitant Medications on the Safety and Efficacy of Extended-Release Carbidopa-Levodopa (IPX066) in Patients With Advanced Parkinson Disease: A Post Hoc Analysis.

    PubMed

    LeWitt, Peter A; Verhagen Metman, Leo; Rubens, Robert; Khanna, Sarita; Kell, Sherron; Gupta, Suneel

    Extended-release (ER) carbidopa-levodopa (CD-LD) (IPX066/RYTARY/NUMIENT) produces improvements in "off" time, "on" time without troublesome dyskinesia, and Unified Parkinson Disease Rating Scale scores compared with immediate-release (IR) CD-LD or IR CD-LD plus entacapone (CLE). Post hoc analyses of 2 ER CD-LD phase 3 trials evaluated whether the efficacy and safety of ER CD-LD relative to the respective active comparators were altered by concomitant medications (dopaminergic agonists, monoamine oxidase B [MAO-B] inhibitors, or amantadine). ADVANCE-PD (n = 393) assessed safety and efficacy of ER CD-LD versus IR CD-LD. ASCEND-PD (n = 91) evaluated ER CD-LD versus CLE. In both studies, IR- and CLE-experienced patients underwent a 6-week, open-label dose-conversion period to ER CD-LD prior to randomization. For analysis, the randomized population was divided into 3 subgroups: dopaminergic agonists, rasagiline or selegiline, and amantadine. For each subgroup, changes from baseline in PD diary measures ("off" time and "on" time with and without troublesome dyskinesia), Unified Parkinson Disease Rating Scale Parts II + III scores, and adverse events were analyzed, comparing ER CD-LD with the active comparator. Concomitant dopaminergic agonist or MAO-B inhibitor use did not diminish the efficacy (improvement in "off" time and "on" time without troublesome dyskinesia) of ER CD-LD compared with IR CD-LD or CLE, whereas the improvement with concomitant amantadine failed to reach significance. Safety and tolerability were similar among the subgroups, and ER CD-LD did not increase troublesome dyskinesia. For patients on oral LD regimens and taking a dopaminergic agonist, and/or a MAO-B inhibitor, changing from an IR to an ER CD-LD formulation provides approximately an additional hour of "good" on time.

  16. Effect of Concomitant Medications on the Safety and Efficacy of Extended-Release Carbidopa-Levodopa (IPX066) in Patients With Advanced Parkinson Disease: A Post Hoc Analysis

    PubMed Central

    LeWitt, Peter A.; Verhagen Metman, Leo; Rubens, Robert; Khanna, Sarita; Kell, Sherron; Gupta, Suneel

    2018-01-01

    Objectives Extended-release (ER) carbidopa-levodopa (CD-LD) (IPX066/RYTARY/NUMIENT) produces improvements in “off” time, “on” time without troublesome dyskinesia, and Unified Parkinson Disease Rating Scale scores compared with immediate-release (IR) CD-LD or IR CD-LD plus entacapone (CLE). Post hoc analyses of 2 ER CD-LD phase 3 trials evaluated whether the efficacy and safety of ER CD-LD relative to the respective active comparators were altered by concomitant medications (dopaminergic agonists, monoamine oxidase B [MAO-B] inhibitors, or amantadine). Methods ADVANCE-PD (n = 393) assessed safety and efficacy of ER CD-LD versus IR CD-LD. ASCEND-PD (n = 91) evaluated ER CD-LD versus CLE. In both studies, IR- and CLE-experienced patients underwent a 6-week, open-label dose-conversion period to ER CD-LD prior to randomization. For analysis, the randomized population was divided into 3 subgroups: dopaminergic agonists, rasagiline or selegiline, and amantadine. For each subgroup, changes from baseline in PD diary measures (“off” time and “on” time with and without troublesome dyskinesia), Unified Parkinson Disease Rating Scale Parts II + III scores, and adverse events were analyzed, comparing ER CD-LD with the active comparator. Results and Conclusions Concomitant dopaminergic agonist or MAO-B inhibitor use did not diminish the efficacy (improvement in “off” time and “on” time without troublesome dyskinesia) of ER CD-LD compared with IR CD-LD or CLE, whereas the improvement with concomitant amantadine failed to reach significance. Safety and tolerability were similar among the subgroups, and ER CD-LD did not increase troublesome dyskinesia. For patients on oral LD regimens and taking a dopaminergic agonist, and/or a MAO-B inhibitor, changing from an IR to an ER CD-LD formulation provides approximately an additional hour of “good” on time. PMID:29432286

  17. Modelling short time series in metabolomics: a functional data analysis approach.

    PubMed

    Montana, Giovanni; Berk, Maurice; Ebbels, Tim

    2011-01-01

    Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.
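One simple way to realize the "smooth random curve" assumption for short, sparse series is a discrete roughness-penalty smoother, solving min_f ||y − f||² + λ||D₂f||² in closed form. A sketch (a generic smoother standing in for the authors' functional estimator; λ is an illustrative tuning parameter):

```python
import numpy as np

def smooth_curve(y, lam=10.0):
    """Roughness-penalized smoother: solve (I + lam * D2'D2) f = y,
    where D2 is the second-difference operator, i.e. minimize
    ||y - f||^2 + lam * ||D2 f||^2 over all curves f."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    d2 = np.diff(np.eye(n), 2, axis=0)     # (n-2) x n second differences
    return np.linalg.solve(np.eye(n) + lam * d2.T @ d2, y)
```

Because straight lines have zero second difference, large λ shrinks the fit toward the best-fitting line, suppressing measurement noise without distorting a linear temporal profile.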

  18. Comparative Analysis of Haar and Daubechies Wavelet for Hyper Spectral Image Classification

    NASA Astrophysics Data System (ADS)

    Sharif, I.; Khare, S.

    2014-11-01

    With the number of channels in the hundreds instead of the tens, hyperspectral imagery possesses much richer spectral information than multispectral imagery. The increased dimensionality of such hyperspectral data challenges current data-analysis techniques. Conventional classification methods may not be useful without dimension-reduction pre-processing, so dimension reduction has become a significant part of hyperspectral image processing. This paper presents a comparative analysis of the efficacy of Haar and Daubechies wavelets for dimensionality reduction in achieving image classification. Spectral data reduction using wavelet decomposition could be useful because it preserves the distinctions among spectral signatures. Daubechies wavelets optimally capture polynomial trends, while the Haar wavelet is discontinuous and resembles a step function. The performance of these wavelets is compared in terms of classification accuracy and time complexity. This paper shows that wavelet reduction preserves class separation and yields better or comparable classification accuracy. In the context of the dimensionality reduction algorithm, it is found that Daubechies wavelets classify better than the Haar wavelet, although they require more processing time. The experimental results demonstrate that the classification system consistently provides over 84% classification accuracy.
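The Haar building block is a one-level transform of pairwise averages and differences. A minimal sketch (orthonormal scaling, so energy is preserved; Daubechies filters would replace the two-tap averages/differences with longer filters that annihilate polynomial trends):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform: pairwise sums and
    differences scaled by 1/sqrt(2), so signal energy is preserved."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-pass (averages)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-pass (differences)
    return approx, detail
```

For spectral-dimension reduction, keeping only the approximation halves the number of channels per level while retaining the smooth shape of each spectral signature.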

  19. Comparative investigation of body composition in male dogs using CT and body fat analysis software.

    PubMed

    Kobayashi, Toyokazu; Koie, Hiroshi; Kusumi, Akiko; Kitagawa, Masato; Kanayama, Kiichi; Otsuji, Kazuya

    2014-03-01

    In small animal veterinary practices, body condition score (BCS) is generally used to diagnose obesity. However, BCS does not constitute objective data. In this study, we investigated the value of using human body fat analysis software for male dogs. We also compared changes in body fat after neutering. Changes in body fat at the time of neutering (age 1 year) and 1 year later were compared by performing CT scanning and using human body fat analysis software. We found that body fat increased in all the individuals tested. In terms of the site of fat accumulation, subcutaneous fat was more pronounced than visceral fat with a marked increase on the dorsal side of the abdomen rather than the thorax.

  20. Real-time measurement system for the evaluation of the intima media thickness with a new edge detector.

    PubMed

    Faita, Francesco; Gemignani, Vincenzo; Bianchini, Elisabetta; Giannarelli, Chiara; Demi, Marcello

    2006-01-01

    The evaluation of the intima media thickness (IMT) of the common carotid artery (CCA) with B-mode ultrasonography represents an important index of cardiovascular risk. The IMT is defined as the distance between the leading edge of the lumen-intima interface and the leading edge of the media-adventitia interface. In order to evaluate the IMT, it is necessary to locate such edges. In this paper we developed an automatic real-time system to evaluate the IMT based on the first order absolute moment (FOAM), which is used as an edge detector, and on a pattern recognition approach. The IMT measurements were compared with manual measurements. We used regression analysis and Bland-Altman analysis to compare the results.
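As a rough illustration of a first order absolute moment style edge response, the sketch below computes, for each sample of a 1-D intensity profile, the mean absolute deviation of a local window from the window's own mean; the response peaks at intensity jumps such as the lumen-intima and media-adventitia interfaces. This is an illustrative discretization only, not the authors' exact FOAM operator:

```python
import numpy as np

def foam_1d(signal, half_width=3):
    """Edge response of a 1-D profile: at each sample, the mean absolute
    deviation of the surrounding window from the window's own mean.
    The response is near zero in flat regions and peaks at jumps."""
    signal = np.asarray(signal, dtype=float)
    out = np.zeros_like(signal)
    for i in range(half_width, len(signal) - half_width):
        w = signal[i - half_width:i + half_width + 1]
        out[i] = np.mean(np.abs(w - w.mean()))
    return out
```

Locating the two leading edges as local maxima of such a response along each scan line, and taking their distance, is the general shape of an IMT measurement.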

  1. Grayscale image segmentation for real-time traffic sign recognition: the hardware point of view

    NASA Astrophysics Data System (ADS)

    Cao, Tam P.; Deng, Guang; Elton, Darrell

    2009-02-01

    In this paper, we study several grayscale-based image segmentation methods for real-time road sign recognition applications on an FPGA hardware platform. The performance of different image segmentation algorithms under different lighting conditions is initially compared using PC simulation. Based on these results and analysis, suitable algorithms are implemented and tested on a real-time FPGA speed sign detection system. Experimental results show that the system using segmented images uses significantly fewer hardware resources on the FPGA while maintaining comparable system performance. The system is capable of processing 60 live video frames per second.
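A typical grayscale segmentation baseline for such comparisons is a global threshold. A sketch of Otsu's method, a standard histogram-based technique (shown for illustration; not necessarily one of the specific algorithms the paper tested):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's global threshold for 8-bit grayscale: choose the level
    that maximizes the between-class variance of the histogram."""
    hist = np.bincount(np.asarray(img, dtype=np.uint8).ravel(),
                       minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                 # class-0 probability up to level t
    mu = np.cumsum(p * np.arange(256))   # first moment up to level t
    mu_t = mu[-1]                        # global mean gray level
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return int(np.argmax(sigma_b))
```

Histogram-based thresholding like this maps naturally to FPGA hardware, since it needs only counters and a single pass over the pixels.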

  2. Evaluation of a new imaging tool for use with major trauma cases in the emergency department.

    PubMed

    Crönlein, Moritz; Holzapfel, Konstantin; Beirer, Marc; Postl, Lukas; Kanz, Karl-Georg; Pförringer, Dominik; Huber-Wagner, Stefan; Biberthaler, Peter; Kirchhoff, Chlodwig

    2016-11-17

    The aim of this study was to evaluate the potential benefits of a new diagnostic software prototype (Trauma Viewer, TV) that automatically reformats computed tomography (CT) data, in terms of diagnostic speed and quality, compared to CT-image evaluation using a conventional CT console. Multiple trauma CT data sets were analysed by one expert radiology fellow and one expert traumatology fellow independently twice, once using the TV and once using the secondary conventional CT console placed in the CT control room. Actual analysis time and diagnostic precision were evaluated. The TV and CT-console results were compared with each other and also with the initial multiple trauma CT reports assessed by emergency radiology fellows, which were considered the gold standard. Finally, the design and function of the Trauma Viewer were evaluated in a descriptive manner. CT data sets of 30 multiple trauma patients were enrolled. The mean time needed for analysis of one CT dataset was 2.43 min using the CT console and 3.58 min using the TV. Thus, secondary conventional CT console analysis was on average 1.15 min shorter than the TV analysis. Both readers missed a total of 11 diagnoses using the secondary conventional CT console compared to 12 missed diagnoses using the TV. However, none of these overlooked diagnoses resulted in an Abbreviated Injury Scale (AIS) score > 2, corresponding to life-threatening injuries. Even though it took the two expert fellows a little longer to analyse the CT scans on the prototype TV compared to the CT console, which can be explained by the new user interface of the TV, our preliminary results demonstrate that, after further development, the TV might serve as a new diagnostic feature in trauma room management. Its high potential to improve the time and quality of CT-based diagnoses might help in fast decision making regarding treatment of severely injured patients.

  3. The Shock and Vibration Digest. Volume 14, Number 12

    DTIC Science & Technology

    1982-12-01

    to evaluate the uses of statistical energy analysis for determining sound transmission performance. Coupling loss factors were measured and compared...measurements for the artificial (Also see No. 2623) cracks in mild-steel test pieces. 82-2676 Improvement of the Method of Statistical Energy Analysis for...eters, using a large number of free-response time histories in the application of the statistical energy analysis theory simultaneously in one analysis

  4. TimesVector: a vectorized clustering approach to the analysis of time series transcriptome data from multiple phenotypes.

    PubMed

    Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun

    2017-12-01

    Identifying biologically meaningful gene expression patterns from time series gene expression data is important for understanding the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of the time and sample dimensions. Thus, the analysis of such time series data seeks gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting three-dimensional data, i.e. gene-time-condition. The computational complexity of analysing such data is very high, even compared to the already NP-hard two-dimensional biclustering problem. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression patterns in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters to detect similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high-throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved clustering quality compared to existing triclustering tools, and only TimesVector successfully detected clusters with differential expression patterns across conditions. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr. 
Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
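    Step (i) of the pipeline, clustering time-condition concatenated vectors, can be sketched as follows. This is an illustrative reconstruction, not the TimesVector implementation: the array layout, the toy expression data and the minimal k-means are all assumptions.

    ```python
    import numpy as np

    def concat_condition_vectors(expr):
        """Flatten a (genes x conditions x timepoints) array into one
        vector per gene by concatenating the time courses of all
        conditions, as in step (i) of a triclustering pipeline."""
        genes, conds, tps = expr.shape
        return expr.reshape(genes, conds * tps)

    def kmeans(X, k, iters=50):
        """Minimal k-means; initial centers are spread over the data."""
        centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        return labels

    # toy data: 3 rising and 3 falling genes, 2 conditions, 4 time points
    rng = np.random.default_rng(1)
    up = np.cumsum(rng.uniform(0.5, 1.0, size=(3, 2, 4)), axis=2)
    down = -np.cumsum(rng.uniform(0.5, 1.0, size=(3, 2, 4)), axis=2)
    expr = np.vstack([up, down])
    labels = kmeans(concat_condition_vectors(expr), k=2)
    ```

    In this toy setting the rising and falling genes land in separate clusters; the paper's post-processing steps (ii) and (iii) would then refine such raw clusters.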

  5. Headed toward Equality? Housework Change in Comparative Perspective

    ERIC Educational Resources Information Center

    Geist, Claudia; Cohen, Philip N.

    2011-01-01

    This paper examines gendered housework in the larger context of comparative social change, asking specifically whether cross-national differences in domestic labor patterns converge over time. Our analysis of data from 13 countries (N = 11,065) from the 1994 and 2002 International Social Survey Program (ISSP) confirmed that social context matters…

  6. Comparing drinking water treatment costs to source water protection costs using time series analysis.

    EPA Science Inventory

    We present a framework to compare water treatment costs to source water protection costs, an important knowledge gap for drinking water treatment plants (DWTPs). This trade-off helps to determine what incentives a DWTP has to invest in natural infrastructure or pollution reductio...

  7. Pilot-Reported Beta-Blockers Identified by Forensic Toxicology Analysis of Postmortem Specimens.

    PubMed

    Canfield, Dennis V; Dubowski, Kurt M; Whinnery, James M; Forster, Estrella M

    2018-01-01

    This study compared beta-blockers reported by pilots with the medications found by postmortem toxicology analysis of specimens received from fatal aviation accidents between 1999 and 2015. Several studies have compared drugs using the standard approach: Compare the drug found by toxicology analysis with the drug reported by the pilot. This study uniquely examined first the pilot-reported medication and then compared it to that detected by toxicology analysis. This study will serve two purposes: (i) to determine the capability of a toxicology laboratory to detect reported medications, and (ii) to identify pilots with medications below detectable limits. All information required for this study was extracted from the Toxicology Data Base system and was searched using ToxFlo or SQL Server Management Studio. The following information was collected and analyzed: pilot-reported trade and/or generic drug, date specimens received, time of accident, type of aviation operations (CFR), state, pilot level, age, class of medical, specimen type, specimen concentration, dose reported, frequency reported associated with the accident, quantity reported, National Transportation Safety Board (NTSB) accident event number, and all NTSB reports. There were 319 pilots that either reported taking a beta-blocker or were found to be taking a beta-blocker by postmortem toxicology analysis. Time of death, therapeutic concentration and specimen type were found to be factors in the ability of the laboratory to detect beta-blockers. Beta-blockers taken by pilots will, in most cases, be found by a competent postmortem forensic toxicology laboratory at therapeutic concentrations. The dose taken by the pilot was not found to be a factor in the ability of the laboratory to identify beta-blockers. 
Time of dose, route of administration, specimen tested and therapeutic concentration of the drug were found to be factors in the ability of the laboratory to identify beta-blockers in postmortem specimens. Published by Oxford University Press 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  8. A constitutive material model for nonlinear finite element structural analysis using an iterative matrix approach

    NASA Technical Reports Server (NTRS)

    Koenig, Herbert A.; Chan, Kwai S.; Cassenti, Brice N.; Weber, Richard

    1988-01-01

    A unified numerical method for the integration of stiff, time-dependent constitutive equations is presented. The solution process is directly applied to a constitutive model proposed by Bodner. The theory addresses time-dependent inelastic behavior coupled with both isotropic hardening and directional hardening behaviors. Predicted stress-strain responses from this model are compared to experimental data from cyclic tests on uniaxial specimens. An algorithm is developed for the efficient integration of the Bodner flow equation. A comparison is made with the Euler integration method. An analysis of computational time is presented for the three algorithms.

  9. Lipidomic analysis of biological samples: Comparison of liquid chromatography, supercritical fluid chromatography and direct infusion mass spectrometry methods.

    PubMed

    Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal

    2017-11-24

    Lipidomic analysis of biological samples in clinical research represents a challenging task for analytical methods, given the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography-mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples: tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. The methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomic analyses. The nontargeted analysis of pooled samples is performed using all tested methods, and 610 lipid species within 23 lipid classes are identified. The DI method provides the most comprehensive results owing to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, with a 10 min method time. The sample consumption of the DI method is 125 times higher than for the other methods, although only 40 μL of organic solvent is used for one sample analysis, compared to 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. The methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. The results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements. 
Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. 
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind.
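    A classic candidate for the time-series similarity measures discussed here is dynamic time warping (DTW); whether the authors adopted it is not stated in this abstract, and the toy decline curves below are invented for illustration.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Dynamic time warping distance between two 1-D time series:
        finds the monotone alignment of the sequences that minimises
        the summed point-wise cost, so responses that evolve at
        different speeds still compare as similar."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # toy phenotype responses: the same motility decline at two rates,
    # plus an unaffected control
    t = np.linspace(0.0, 1.0, 20)
    fast = np.exp(-5.0 * t)
    slow = np.exp(-2.5 * t)
    flat = np.ones_like(t)
    ```

    Under DTW the two decline curves score as mutually closer than either is to the flat control, which is the property a phenotype-clustering step needs.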

  11. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. 
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind. PMID:22369037

  12. C(α) torsion angles as a flexible criterion to extract secrets from a molecular dynamics simulation.

    PubMed

    Victor Paul Raj, Fredrick Robin Devadoss; Exner, Thomas E

    2014-04-01

    Given the increasing complexity of simulated molecular systems, and the fact that simulation times have now reached milliseconds to seconds, immense amounts of data (in the gigabyte to terabyte range) are produced in current molecular dynamics simulations. Manual analysis of these data is a very time-consuming task, and important events that lead from one intermediate structure to another can become occluded in the noise resulting from random thermal fluctuations. To overcome these problems and facilitate a semi-automated data analysis, we introduce in this work a measure based on C(α) torsion angles: torsion angles formed by four consecutive C(α) atoms. This measure describes changes in the backbones of large systems on a residual length scale (i.e., a small number of residues at a time). Cluster analysis of individual C(α) torsion angles and its fuzzification led to continuous time patches representing (meta)stable conformations and to the identification of events acting as transitions between these conformations. The importance of a change in torsion angle to structural integrity is assessed by comparing this change to the average fluctuations in the same torsion angle over the complete simulation. Using this novel measure in combination with other measures such as the root mean square deviation (RMSD) and time series of distance measures, we performed an in-depth analysis of a simulation of the open form of DNA polymerase I. The times at which major conformational changes occur and the most important parts of the molecule and their interrelations were pinpointed in this analysis. The simultaneous determination of the time points and localizations of major events is a significant advantage of the new bottom-up approach presented here, as compared to many other (top-down) approaches in which only the similarity of the complete structure is analyzed.
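    The underlying quantity, a torsion angle over four consecutive Cα positions, can be computed with the standard atan2 dihedral formula. The sketch below is a generic implementation with synthetic test coordinates, not code from the paper.

    ```python
    import numpy as np

    def ca_torsion(p0, p1, p2, p3):
        """Torsion (dihedral) angle, in degrees, defined by four
        consecutive C-alpha coordinates, via the atan2 formulation
        (numerically stable near 0 and 180 degrees)."""
        b1, b2, b3 = p1 - p0, p2 - p1, p3 - p2
        n1, n2 = np.cross(b1, b2), np.cross(b2, b3)
        m = np.cross(n1, b2 / np.linalg.norm(b2))
        return np.degrees(np.arctan2(np.dot(m, n2), np.dot(n1, n2)))

    # planar test geometries: eclipsed (0 degrees) and trans (180 degrees)
    cis = ca_torsion(np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 0.0]),
                     np.array([1.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0]))
    trans = ca_torsion(np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 0.0]),
                       np.array([1.0, 0.0, 0.0]), np.array([1.0, -1.0, 0.0]))
    ```

    A trajectory-wide analysis would evaluate this angle for every window of four consecutive Cα atoms in every frame and then cluster the resulting per-angle time series, as the abstract describes.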

  13. A Fiscal Analysis of Fixed-Amount Federal Grants-in-Aid: The Case of Vocational Education.

    ERIC Educational Resources Information Center

    Patterson, Philip D., Jr.

    A fiscal analysis of fixed-amount Federal grant programs using the criteria of effectiveness, efficiency, and equity is essential to an evaluation of the Federal grant structure. Measures of program need should be current, comparable over time and among states, and subjected to sensitivity analysis so that future grants can be estimated. Income…

  14. APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS

    NASA Astrophysics Data System (ADS)

    Mehran, Babak; Nakamura, Hideki

    Evaluation of the impacts of congestion improvement schemes on travel time reliability is very significant for road authorities, since travel time reliability represents the operational performance of expressway segments. In this paper, a methodology is presented to estimate travel time reliability prior to implementation of congestion relief schemes, based on travel time variation modeling as a function of demand, capacity, weather conditions and road accidents. For the subject expressway segments, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five-minute interval by applying a Monte-Carlo simulation technique, and accidents are randomly generated based on a model that links accident rate to traffic conditions. A whole-year analysis is performed by comparing demand and available capacity for each scenario, and queue length is estimated through shockwave analysis for each time interval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways, and the buffer time index is estimated consequently as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
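    The demand-vs-capacity sampling and the buffer time index can be illustrated with a deliberately simplified Monte-Carlo sketch. All distributions, the delay rule and the constants below are invented for illustration; the paper's refined speed-flow relationships and shockwave queueing are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # hypothetical segment: free-flow travel time plus per-interval
    # random draws of demand and capacity (veh/h); numbers illustrative
    T_FREE = 10.0                                  # minutes
    demand = rng.normal(1800.0, 300.0, 100_000)
    capacity = rng.normal(2000.0, 150.0, 100_000)

    # crude delay term whenever demand exceeds available capacity
    excess = np.maximum(demand - capacity, 0.0)
    travel_time = T_FREE + 30.0 * excess / capacity  # minutes

    # buffer time index: extra buffer a traveller must add to the
    # mean travel time to arrive on time 95% of the time
    mean_tt = travel_time.mean()
    p95_tt = np.percentile(travel_time, 95)
    buffer_time_index = (p95_tt - mean_tt) / mean_tt
    ```

    The index grows with the variability of the congested tail, which is why it serves as the reliability measure in the evaluation above.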

  15. Short-time fractional Fourier methods for the time-frequency representation of chirp signals.

    PubMed

    Capus, Chris; Brown, Keith

    2003-06-01

    The fractional Fourier transform (FrFT) provides a valuable tool for the analysis of linear chirp signals. This paper develops two short-time FrFT variants which are suited to the analysis of multicomponent and nonlinear chirp signals. Outputs have similar properties to the short-time Fourier transform (STFT) but show improved time-frequency resolution. The FrFT is a parameterized transform with a parameter, a, related to chirp rate. The two short-time implementations differ in how the value of a is chosen. In the first, a global optimization procedure selects one value of a with reference to the entire signal. In the second, values of a are selected independently for each windowed section. Comparative variance measures based on the Gaussian function are given and are shown to be consistent with the uncertainty principle in fractional domains. For appropriately chosen FrFT orders, the derived fractional-domain uncertainty relationship is minimized for Gaussian-windowed linear chirp signals. The two short-time FrFT algorithms have complementary strengths, demonstrated by time-frequency representations for a multicomponent bat chirp, a highly nonlinear quadratic chirp, and an output pulse from a finite-difference sonar model with dispersive change. These representations illustrate the improvements obtained using FrFT-based algorithms compared to the STFT.

  16. Time and learning efficiency in Internet-based learning: a systematic review and meta-analysis.

    PubMed

    Cook, David A; Levinson, Anthony J; Garside, Sarah

    2010-12-01

    Authors have claimed that Internet-based instruction promotes greater learning efficiency than non-computer methods. Purpose: to determine, through a systematic synthesis of evidence in health professions education, how Internet-based instruction compares with non-computer instruction in time spent learning, and what features of Internet-based instruction are associated with improved learning efficiency. Data sources: we searched databases including MEDLINE, CINAHL, EMBASE, and ERIC from 1990 through November 2008. Study selection and data abstraction: we included all studies quantifying learning time for Internet-based instruction for health professionals, compared with other instruction. Reviewers worked independently, in duplicate, to abstract information on interventions, outcomes, and study design. Results: we identified 20 eligible studies. Random-effects meta-analysis of 8 studies comparing Internet-based with non-Internet instruction (positive numbers indicating Internet longer) revealed a pooled effect size (ES) for time of -0.10 (p = 0.63). Among comparisons of two Internet-based interventions, providing feedback adds time (ES 0.67, p = 0.003, two studies), and greater interactivity generally takes longer (ES 0.25, p = 0.089, five studies). One study demonstrated that adapting to learner prior knowledge saves time without significantly affecting knowledge scores. Other studies revealed that audio narration, video clips, interactive models, and animations increase learning time but also facilitate higher knowledge and/or satisfaction. Across all studies, time correlated positively with knowledge outcomes (r = 0.53, p = 0.021). Conclusions: on average, Internet-based instruction and non-computer instruction require similar time. Instructional strategies to enhance feedback and interactivity typically prolong learning time, but in many cases also enhance learning outcomes. Isolated examples suggest potential for improving efficiency in Internet-based instruction.
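    Random-effects pooling of per-study effect sizes can be sketched with the DerSimonian-Laird estimator, a common choice; the review does not state which estimator it used, and the effect sizes and variances below are invented, not the review's data.

    ```python
    import numpy as np

    def dersimonian_laird(es, var):
        """Random-effects pooled effect size and standard error via
        the DerSimonian-Laird between-study variance estimator."""
        es, var = np.asarray(es, float), np.asarray(var, float)
        w = 1.0 / var                                  # fixed-effect weights
        fixed_mean = np.sum(w * es) / w.sum()
        q = np.sum(w * (es - fixed_mean) ** 2)         # heterogeneity statistic
        c = w.sum() - np.sum(w ** 2) / w.sum()
        tau2 = max(0.0, (q - (len(es) - 1)) / c)       # between-study variance
        w_re = 1.0 / (var + tau2)                      # random-effects weights
        pooled = np.sum(w_re * es) / w_re.sum()
        return pooled, np.sqrt(1.0 / w_re.sum())

    # illustrative standardized effect sizes for learning time
    # (positive = intervention longer) with their variances
    pooled, se = dersimonian_laird([-0.3, 0.1, -0.05, 0.2],
                                   [0.04, 0.05, 0.03, 0.06])
    ```

    The pooled estimate is a precision-weighted average, so it always lies within the range of the individual study effects.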

  17. Mechanical analysis of statolith action in roots and rhizoids

    NASA Astrophysics Data System (ADS)

    Todd, Paul

    1994-08-01

    Published observations on the response times following gravistimulation (horizontal positioning) of Chara rhizoids and developing roots of vascular plants with normal and "starchless" amyloplasts were reviewed and compared. Statolith motion was found to be consistent with gravitational sedimentation opposed by elastic deformation of an intracellular material. The time required for a statolith to sediment to equilibrium was calculated on the basis of its buoyant density and compared with observed sedimentation times. In the examples chosen, the response time following gravistimulation (from horizontal positioning to the return of downward growth) could be related to the statolith sedimentation time. Such a relationship implies that the transduction step is rapid in comparison with the perception step following gravistimulation of rhizoids and developing roots.
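    A sedimentation-time estimate from buoyant density can be sketched as a Stokes-drag calculation. Every parameter value below is an illustrative order of magnitude, not a number taken from the paper.

    ```python
    # Order-of-magnitude statolith sedimentation under Stokes drag;
    # all parameter values are illustrative assumptions.
    G = 9.81        # gravitational acceleration, m/s^2
    R = 1.0e-6      # statolith radius, m (~1 um amyloplast)
    DRHO = 300.0    # excess (buoyant) density over cytoplasm, kg/m^3
    MU = 0.01       # effective cytoplasmic viscosity, Pa*s (>> water)
    DIST = 10e-6    # sedimentation distance to the cell wall, m

    v = 2.0 * R**2 * DRHO * G / (9.0 * MU)  # Stokes settling velocity, m/s
    t = DIST / v                            # time to traverse DIST, s
    ```

    With these assumed values the settling time comes out on the order of minutes for a ~10 µm drop, the same order as the graviresponse times discussed in the abstract.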

  18. Mechanical analysis of statolith action in roots and rhizoids.

    PubMed

    Todd, P

    1994-01-01

    Published observations on the response times following gravistimulation (horizontal positioning) of Chara rhizoids and developing roots of vascular plants with normal and "starchless" amyloplasts were reviewed and compared. Statolith motion was found to be consistent with gravitational sedimentation opposed by elastic deformation of an intracellular material. The time required for a statolith to sediment to equilibrium was calculated on the basis of its buoyant density and compared with observed sedimentation times. In the examples chosen, the response time following gravistimulation (from horizontal positioning to the return of downward growth) could be related to the statolith sedimentation time. Such a relationship implies that the transduction step is rapid in comparison with the perception step following gravistimulation of rhizoids and developing roots.

  19. Determination of red blood cell fatty acid profiles: Rapid and high-confident analysis by chemical ionization-gas chromatography-tandem mass spectrometry.

    PubMed

    Schober, Yvonne; Wahl, Hans Günther; Renz, Harald; Nockher, Wolfgang Andreas

    2017-01-01

    Cellular fatty acid (FA) profiles have been acknowledged as biomarkers in various human diseases. Nevertheless, common FA analysis by gas chromatography-mass spectrometry (GC-MS) requires a long analysis time. Hence, there is a need for feasible methods for high-throughput analysis in clinical studies. FAs were extracted from red blood cells (RBC) and derivatized to fatty acid methyl esters (FAME). A method using gas chromatography-tandem mass spectrometry (GC-MS/MS) with ammonia-induced chemical ionization (CI) was developed for the analysis of FA profiles in human RBC. We compared this method with classical single-stage GC-MS using electron impact ionization (EI). The FA profiles of 703 RBC samples were determined by GC-MS/MS. In contrast to EI, ammonia-induced CI resulted in adequate amounts of molecular ions for further fragmentation of FAME. Specific fragments for confident quantification and fragmentation were determined for 45 FAs. The GC-MS/MS method has a total run time of 9 min, compared to typical analysis times of up to 60 min in conventional GC-MS. Intra- and inter-assay variations were <10% for all FAs analyzed. Analysis of RBC FA composition revealed an age-dependent increase of the omega-3 eicosapentaenoic and docosahexaenoic acids, and a decline of the omega-6 linoleic acid with a corresponding rise of the omega-3 index. The combination of ammonia-induced CI and tandem mass spectrometry after GC separation allows for high-throughput, robust and confident analysis of FA profiles in the clinical laboratory. Copyright © 2016. Published by Elsevier B.V.

  20. Applications of satellite image processing to the analysis of Amazonian cultural ecology

    NASA Technical Reports Server (NTRS)

    Behrens, Clifford A.

    1991-01-01

    This paper examines the application of satellite image processing towards identifying and comparing resource exploitation among indigenous Amazonian peoples. The use of statistical and heuristic procedures for developing land cover/land use classifications from Thematic Mapper satellite imagery will be discussed along with actual results from studies of relatively small (100 - 200 people) settlements. Preliminary research indicates that analysis of satellite imagery holds great potential for measuring agricultural intensification, comparing rates of tropical deforestation, and detecting changes in resource utilization patterns over time.

  1. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques which are used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  2. A prospective observational study comparing a physiological scoring system with time-based discharge criteria in pediatric ambulatory surgical patients.

    PubMed

    Armstrong, James; Forrest, Helen; Crawford, Mark W

    2015-10-01

    Discharge criteria based on physiological scoring systems can be used in the postanesthesia care unit (PACU) to fast-track patients after ambulatory surgery; however, studies comparing physiological scoring systems with traditional time-based discharge criteria are lacking. The purpose of this study was to compare PACU discharge readiness times using physiological vs time-based discharge criteria in pediatric ambulatory surgical patients. We recorded physiological observations from consecutive American Society of Anesthesiologists physical status I-III patients aged 1-18 yr who were admitted to the PACU after undergoing ambulatory surgery in a tertiary academic pediatric hospital. The physiological score was a combination of the Aldrete and Chung systems. Scores were recorded every 15 min starting upon arrival in the PACU. Patients were considered fit for discharge once they attained a score ≥12 (maximum score, 14), provided no score was zero, with the time to achieve a score ≥12 defining the criteria-based discharge (CBD) time. Patients were discharged from the PACU when both the CBD and the existing time-based discharge (TBD) criteria were met. The CBD and TBD data were compared using Kaplan-Meier and log-rank analysis. Observations from 506 children are presented. Median (interquartile range [IQR]) age was 5.5 [2.8-9.9] yr. Median [IQR] CBD and TBD PACU discharge readiness times were 30 [15-45] min and 60 [45-60] min, respectively. Analysis of Kaplan-Meier curves indicated a significant difference in discharge times using the different criteria (hazard ratio, 5.43; 95% confidence interval, 4.51 to 6.53; P < 0.001). All patients were discharged home without incident. This prospective study suggests that discharge decisions based on physiological criteria have the potential for significantly speeding the transit of children through the PACU, thereby enhancing PACU efficiency and resource utilization.
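    The criteria-based discharge rule used here (combined score ≥ 12 of a maximum 14, with no component scored zero) reduces to a short predicate. The component names and the 0-2 scoring below are placeholders; the study's actual Aldrete/Chung items are not listed in this abstract.

    ```python
    def ready_for_discharge(components, threshold=12):
        """Criteria-based discharge: the total combined physiological
        score meets the threshold AND no individual component is zero."""
        return (sum(components.values()) >= threshold
                and all(score > 0 for score in components.values()))

    # placeholder component names with assumed 0-2 scoring
    stable = {"activity": 2, "respiration": 2, "circulation": 2,
              "consciousness": 2, "oxygenation": 2, "pain": 1, "bleeding": 2}
    # same total of 12, but one component at zero blocks discharge
    sedated = {"activity": 2, "respiration": 2, "circulation": 2,
               "consciousness": 0, "oxygenation": 2, "pain": 2, "bleeding": 2}
    ```

    The "no component zero" clause is what keeps a patient with one critical deficit from being fast-tracked even when the total score clears the threshold.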

  3. Pre-hospital electrocardiogram triage with telemedicine near halves time to treatment in STEMI: A meta-analysis and meta-regression analysis of non-randomized studies.

    PubMed

    Brunetti, Natale Daniele; De Gennaro, Luisa; Correale, Michele; Santoro, Francesco; Caldarola, Pasquale; Gaglione, Antonio; Di Biase, Matteo

    2017-04-01

    A shorter time to treatment has been shown to be associated with lower mortality rates in acute myocardial infarction (AMI). Several strategies have been adopted with the aim to reduce any delay in diagnosis of AMI: pre-hospital triage with telemedicine is one of such strategies. We therefore aimed to measure the real effect of pre-hospital triage with telemedicine in case of AMI in a meta-analysis study. We performed a meta-analysis of non-randomized studies with the aim to quantify the exact reduction of time to treatment achieved by pre-hospital triage with telemedicine. Data were pooled and compared by relative time reduction and 95% C.I.s. A meta-regression analysis was performed in order to find possible predictors of shorter time to treatment. Eleven studies were selected and finally evaluated in the study. The overall relative reduction of time to treatment with pre-hospital triage and telemedicine was -38/-40% (p<0.001). Absolute time reduction was significantly correlated to time to treatment in the control groups (p<0.001), while relative time reduction was independent. A non-significant trend toward shorter relative time reductions was observed over years. Pre-hospital triage with telemedicine is associated with a near halved time to treatment in AMI. The benefit is larger in terms of absolute time to treatment reduction in populations with larger delays to treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
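    The abstract's distinction between absolute and relative time reduction can be illustrated numerically. The per-study times below are invented, scaled only to mimic the reported ~40% relative reduction.

    ```python
    import numpy as np

    # invented per-study mean times to treatment (minutes):
    # control pathway vs. pre-hospital telemedicine triage
    control = np.array([120.0, 90.0, 150.0, 60.0, 180.0])
    telemed = control * np.array([0.62, 0.58, 0.61, 0.64, 0.59])

    abs_reduction = control - telemed          # minutes saved per study
    rel_reduction = abs_reduction / control    # fraction of delay saved

    # absolute savings grow with the baseline delay, while the relative
    # reduction stays near 40% -- mirroring the meta-regression finding
    r_abs = np.corrcoef(control, abs_reduction)[0, 1]
    ```

    In this toy data, as in the meta-regression, absolute savings track the control-group delay while the relative reduction is roughly constant.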

  4. Particle simulation of Coulomb collisions: Comparing the methods of Takizuka and Abe and Nanbu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Chiaming; Lin, Tungyou; Caflisch, Russel

    2008-04-20

    The interactions of charged particles in a plasma are governed by long-range Coulomb collisions. We compare two widely used Monte Carlo models for Coulomb collisions: one developed by Takizuka and Abe in 1977, the other by Nanbu in 1997. We perform deterministic and statistical error analysis with respect to particle number and time step. The two models produce similar stochastic errors, but Nanbu's model gives smaller time-step errors. Error comparisons between the two methods are presented.

  5. Comparing the Efficiencies of Third Molar Surgeries With and Without a Dentist Anesthesiologist.

    PubMed

    Reebye, Uday; Young, S; Boukas, E; Davidian, E; Carnahan, J

    2017-01-01

    Two different anesthesia models were compared in terms of surgical duration, safety outcomes, and economic implications. Third molar surgeries performed with and without a separate dentist anesthesiologist were evaluated by a retrospective analysis of surgical operative times. For more difficult surgeries, substantially shorter operative times were observed with the dentist anesthesiologist model, leading to a more favorable surgical outcome. An example calculation is presented to demonstrate the economic advantages of scheduling the participation of a dentist anesthesiologist for more difficult surgeries.

  6. A Meta-Analysis of Individualized Instruction in Dental Education.

    ERIC Educational Resources Information Center

    Dacanay, Lakshmi S; Cohen, Peter A.

    1992-01-01

    Meta-analysis of 34 comparative studies on conventional vs. individualized instruction (II) found most favored the latter but with small-moderate overall effect. Pacing had significant effect, with teacher-pacing more effective than student-paced learning. On average, II required less time than conventional teaching. Additional research on this…

  7. COMPARISON OF TWO DIFFERENT SOLID PHASE EXTRACTION/LARGE VOLUME INJECTION PROCEDURES FOR METHOD 8270

    EPA Science Inventory

    Two solid phase (SPE) and one traditional continuous liquid-liquid extraction method are compared for analysis of Method 8270 SVOCs. Productivity parameters include data quality, sample volume, analysis time and solvent waste.

    One SPE system, unique in the U.S., uses aut...

  8. Comparative transcriptome analysis during early fruit development between three seedy citrus genotypes and their seedless mutants

    USDA-ARS?s Scientific Manuscript database

    Identification of genes with differential transcript abundance (GDTA) in seedless mutants may enhance understanding of seedless citrus development. Transcriptome analysis was conducted at three time points during early fruit development (Phase 1) of three seedy citrus genotypes: Fallglo [Bower citru...

  9. A STRINGENT COMPARISON OF SAMPLING AND ANALYSIS METHODS FOR VOCS IN AMBIENT AIR

    EPA Science Inventory

    A carefully designed study was conducted during the summer of 1998 to simultaneously collect samples of ambient air by canisters and compare the analysis results to direct sorbent preconcentration results taken at the time of sample collection. A total of 32 1-h sample sets we...

  10. Clustering Financial Time Series by Network Community Analysis

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.
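
The pipeline above (similarity network from pairwise time-series comparisons, then community detection) can be sketched in a simplified form. This is a stand-in under stated assumptions: plain correlation replaces the paper's symbolic-dynamics metric, spectral bisection via the Fiedler vector of the graph Laplacian replaces a full community-detection algorithm, and the two synthetic "sectors" are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: two latent "market factors" drive two groups of 5 series each
f1, f2 = rng.normal(size=(2, 250))
series = np.array([f1 + 0.4 * rng.normal(size=250) for _ in range(5)]
                  + [f2 + 0.4 * rng.normal(size=250) for _ in range(5)])

# Link weights: the paper's symbolic-dynamics similarity is replaced here by
# pairwise correlation, clipped at zero so weights are non-negative
W = np.clip(np.corrcoef(series), 0.0, None)
np.fill_diagonal(W, 0.0)

# Community structure via spectral bisection: the sign pattern of the Fiedler
# vector (eigenvector of the 2nd-smallest Laplacian eigenvalue) splits the graph
L = np.diag(W.sum(axis=1)) - W
_, eigvecs = np.linalg.eigh(L)
labels = eigvecs[:, 1] > 0
print("community labels:", labels.astype(int))
```

On this synthetic example the two factor-driven groups are recovered as the two sides of the spectral split.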

  11. Data Intensive Analysis of Biomolecular Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straatsma, TP; Soares, Thereza A.

    2007-12-01

    The advances in biomolecular modeling and simulation made possible by increasingly powerful high-performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies allow more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations being carried out for biomolecular systems with more realistic models, multi-component environments, and longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes, and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories be carried out using a novel, more integrative, and systematic approach. We are developing a much-needed rigorous computer-science-based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the tools required for access to, and analysis of, a distributed library of generated trajectories.
Our research focuses on the following areas: (1) the development of an efficient analysis framework for very-large-scale trajectories on massively parallel architectures, (2) the development of novel methodologies that allow automated detection of events in these very large data sets, and (3) the efficient comparative analysis of multiple trajectories. The goal of the presented work is the development of new algorithms that will allow biomolecular simulation studies to become an integral tool for addressing the challenges of post-genomic biological research. The strategy for delivering the required data-intensive computing applications that can effectively deal with the volume of simulation data that will become available is based on taking advantage of large globally addressable memory architectures. The first requirement is the design of a flexible underlying data structure for single large trajectories that will form an adaptable framework for a wide range of analysis capabilities. The typical approach to trajectory analysis is to process trajectories sequentially, time frame by time frame. This is the implementation found in molecular simulation codes such as NWChem, and it was designed this way to be able to run on workstation computers and other architectures whose aggregate memory would not allow entire trajectories to be held in core. The consequence of this approach is an I/O-dominated solution that scales very poorly on parallel machines. We are instead developing tools specifically intended for large-scale machines with sufficient main memory that entire trajectories can be held in core. This greatly reduces the cost of I/O, as trajectories are read only once during the analysis.
In our current Data Intensive Analysis (DIANA) implementation, each processor determines and skips to the appropriate entry within the trajectory, which will typically be spread across multiple files, and reads the appropriate frames independently of all other processors.

  12. Data Intensive Analysis of Biomolecular Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straatsma, TP

    2008-03-01

    The advances in biomolecular modeling and simulation made possible by increasingly powerful high-performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies allow more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations being carried out for biomolecular systems with more realistic models, multi-component environments, and longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes, and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories be carried out using a novel, more integrative, and systematic approach. We are developing a much-needed rigorous computer-science-based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the tools required for access to, and analysis of, a distributed library of generated trajectories.
Our research focuses on the following areas: (1) the development of an efficient analysis framework for very-large-scale trajectories on massively parallel architectures, (2) the development of novel methodologies that allow automated detection of events in these very large data sets, and (3) the efficient comparative analysis of multiple trajectories. The goal of the presented work is the development of new algorithms that will allow biomolecular simulation studies to become an integral tool for addressing the challenges of post-genomic biological research. The strategy for delivering the required data-intensive computing applications that can effectively deal with the volume of simulation data that will become available is based on taking advantage of large globally addressable memory architectures. The first requirement is the design of a flexible underlying data structure for single large trajectories that will form an adaptable framework for a wide range of analysis capabilities. The typical approach to trajectory analysis is to process trajectories sequentially, time frame by time frame. This is the implementation found in molecular simulation codes such as NWChem, and it was designed this way to be able to run on workstation computers and other architectures whose aggregate memory would not allow entire trajectories to be held in core. The consequence of this approach is an I/O-dominated solution that scales very poorly on parallel machines. We are instead developing tools specifically intended for large-scale machines with sufficient main memory that entire trajectories can be held in core. This greatly reduces the cost of I/O, as trajectories are read only once during the analysis.
In our current Data Intensive Analysis (DIANA) implementation, each processor determines and skips to the appropriate entry within the trajectory, which will typically be spread across multiple files, and reads the appropriate frames independently of all other processors.

  13. Identification of feline polycystic kidney disease mutation using fret probes and melting curve analysis.

    PubMed

    Criado-Fornelio, A; Buling, A; Barba-Carretero, J C

    2009-02-01

    We developed and validated a real-time polymerase chain reaction (PCR) assay using fluorescent hybridization probes and melting-curve analysis to identify the PKD1 exon 29 (C→A) mutation, which is implicated in polycystic kidney disease of cats. DNA was isolated from the peripheral blood of 20 Persian cats. Applying the new real-time PCR and melting-curve analysis to these samples indicated that 13 cats (65%) were wild-type homozygotes and seven cats (35%) were heterozygotes. Both PCR-RFLP and sequencing procedures were in full agreement with the real-time PCR test results. Sequence analysis showed that the mutant gene had the expected base change compared to the wild-type gene. The new procedure is not only very reliable but also faster than the techniques currently applied for diagnosis of this mutation.

  14. Robustness properties of discrete time regulators, LQG regulators and hybrid systems

    NASA Technical Reports Server (NTRS)

    Stein, G.; Athans, M.

    1979-01-01

    Robustness properties of sampled-data LQ regulators are derived which show that these regulators have fundamentally inferior uncertainty tolerances when compared to their continuous-time counterparts. Results are also presented in stability theory, multivariable frequency-domain analysis, LQG robustness, and mathematical representations of hybrid systems.

  15. Real-time obstructive sleep apnea detection from frequency analysis of EDR and HRV using Lomb Periodogram.

    PubMed

    Fan, Shu-Han; Chou, Chia-Ching; Chen, Wei-Chen; Fang, Wai-Chi

    2015-01-01

    In this study, an effective real-time obstructive sleep apnea (OSA) detection method based on frequency analysis of the ECG-derived respiratory signal (EDR) and heart rate variability (HRV) is proposed. Compared to traditional polysomnography (PSG), which requires several physiological signals to be measured from patients, the proposed OSA detection method uses only ECG signals to determine the time interval of OSA. To make a hardware implementation feasible for real-time detection and portable application, a simplified Lomb periodogram is used to perform the frequency analysis of EDR and HRV. The experimental results indicate that the overall accuracy can be effectively increased by integrating the EDR and HRV indexes, with a specificity (Sp) of 91%, sensitivity (Se) of 95.7%, and accuracy of 93.2%.
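
The Lomb periodogram used above is suited to unevenly sampled series such as beat-to-beat HRV data, where an ordinary FFT does not apply directly. A minimal sketch with `scipy.signal.lombscargle` (note it takes angular frequencies); the 0.03 Hz synthetic oscillation is a hypothetical stand-in for an apnea-related low-frequency component, not data from the study.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
# Unevenly spaced sample times, as with beat-to-beat (HRV) data
t = np.sort(rng.uniform(0, 300, 400))
f0 = 0.03  # Hz: hypothetical low-frequency oscillation
y = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)
y -= y.mean()  # lombscargle assumes a zero-mean signal

freqs = np.linspace(0.005, 0.5, 2000)         # Hz
pgram = lombscargle(t, y, 2 * np.pi * freqs)  # lombscargle expects rad/s
peak = freqs[np.argmax(pgram)]
print(f"peak at {peak:.3f} Hz")
```

The periodogram peak recovers the embedded frequency despite the irregular sampling.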

  16. A parallelization scheme of the periodic signals tracking algorithm for isochronous mass spectrometry on GPUs

    NASA Astrophysics Data System (ADS)

    Chen, R. J.; Wang, M.; Yan, X. L.; Yang, Q.; Lam, Y. H.; Yang, L.; Zhang, Y. H.

    2017-12-01

    The periodic signals tracking algorithm has been used to determine the revolution times of ions stored in storage rings in isochronous mass spectrometry (IMS) experiments. It has been a challenge to perform real-time data analysis using the periodic signals tracking algorithm in IMS experiments. In this paper, a parallelization scheme for the periodic signals tracking algorithm is introduced and a new program is developed. The computing time of the data analysis is reduced by factors of ∼71 and ∼346 using our new program on Tesla C1060 and Tesla K20c GPUs, respectively, compared to the old program on a Xeon E5540 CPU. With the new program on the Tesla K20c GPU, we succeed in performing real-time data analysis for IMS experiments.

  17. Cerebral autoregulation in the preterm newborn using near-infrared spectroscopy: a comparison of time-domain and frequency-domain analyses

    NASA Astrophysics Data System (ADS)

    Eriksen, Vibeke R.; Hahn, Gitte H.; Greisen, Gorm

    2015-03-01

    The aim was to compare two conventional methods used to describe cerebral autoregulation (CA): frequency-domain analysis and time-domain analysis. We measured cerebral oxygenation (as a surrogate for cerebral blood flow) and mean arterial blood pressure (MAP) in 60 preterm infants. In the frequency domain, the outcome variables were coherence and gain, whereas the cerebral oximetry index (COx) and the regression coefficient were the outcome variables in the time domain. Correlation between coherence and COx was poor. The disagreement between the two methods was due to the MAP and cerebral oxygenation signals being in counterphase in three cases. High gain and high coherence may arise spuriously when cerebral oxygenation decreases as MAP increases; hence, time-domain analysis appears to be a more robust, and simpler, method of describing CA.
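
The two outcome measures compared above can be sketched directly: COx is the correlation coefficient between MAP and cerebral oxygenation in the time domain, while coherence is its frequency-domain counterpart. The signals below are synthetic, with oxygenation passively following MAP to mimic impaired autoregulation; the sampling rate, gain, and noise levels are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
fs = 1.0  # one sample per second
t = np.arange(3600)
# MAP with a slow 300-s oscillation plus noise
map_sig = 60 + 5 * np.sin(2 * np.pi * t / 300) + rng.normal(0, 1, t.size)
# Impaired autoregulation: oxygenation passively follows MAP (assumed gain 0.8)
nirs = 70 + 0.8 * (map_sig - 60) + rng.normal(0, 1, t.size)

# Time domain: COx is the correlation coefficient between MAP and oxygenation
cox = np.corrcoef(map_sig, nirs)[0, 1]

# Frequency domain: magnitude-squared coherence in the slow-wave band
freqs, coh = coherence(map_sig, nirs, fs=fs, nperseg=512)
band = (freqs > 0.002) & (freqs < 0.02)
print(f"COx={cox:.2f}, mean coherence (0.002-0.02 Hz)={coh[band].mean():.2f}")
```

With passively coupled signals both measures are high; intact autoregulation would decouple the signals and drive COx toward zero.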

  18. Comparison of three different C18 HPLC columns with different particle sizes for the optimization of aflatoxins analysis.

    PubMed

    Medina, A; Magan, N

    2012-03-15

    In this work we compared the performance of chromatography columns with particles of 5 and 3 μm against the new 2.7 μm solid-core particles for the analysis of aflatoxins B1, G1, B2, and G2 using trifluoroacetic acid pre-column derivatization. Three different columns were used, and chromatographic parameters such as retention time, resolution, limit of detection (LOD), and limit of quantification (LOQ) were obtained for all of them and compared. The results show that, compared with the traditional columns, shorter columns (100 mm × 4.6 mm) with the new solid-core particles are suitable for the analysis of these mycotoxins and allowed the analysis time to be reduced by 45.5% and 33.3% relative to columns with particle sizes of 5 μm (150 mm × 4.6 mm) and 3 μm (150 mm × 4.6 mm), respectively, without any detrimental effect on performance. This reduces analysis costs by saving organic solvent and increases the total number of analyses per day. The capability of these columns to analyze samples in different culture media was assessed using samples from yeast extract sucrose medium, corn meal agar medium, and fresh hazelnut media. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. A time-frequency analysis method to obtain stable estimates of magnetotelluric response function based on Hilbert-Huang transform

    NASA Astrophysics Data System (ADS)

    Cai, Jianhua

    2017-05-01

    The time-frequency analysis method represents a signal as a function of time and frequency, and it is considered a powerful tool for handling arbitrary non-stationary time series by using instantaneous frequency and instantaneous amplitude. It also provides a possible alternative for the analysis of the non-stationary magnetotelluric (MT) signal. Based on the Hilbert-Huang transform (HHT), a time-frequency analysis method is proposed to obtain stable estimates of the magnetotelluric response function. In contrast to conventional methods, the response function estimation is performed in the time-frequency domain using instantaneous spectra rather than in the frequency domain, which allows the response parameter content to be imaged as a function of time and frequency. The theory of the method is presented, and the mathematical model and calculation procedure used to estimate the response function from the HHT time-frequency spectrum are discussed. To evaluate the results, response function estimates are compared with estimates from a standard MT data-processing method based on the Fourier transform. All results show that apparent resistivities and phases calculated from the HHT time-frequency method are generally more stable and reliable than those determined from simple Fourier analysis. The proposed method overcomes the drawbacks of the traditional Fourier methods, and the resulting parameter estimates minimise the bias caused by the non-stationary characteristics of the MT data.
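
The instantaneous spectra underlying the method above come from the Hilbert step of the HHT (the empirical mode decomposition stage is omitted here, so this is only a partial sketch). On a mono-component chirp the instantaneous frequency is known analytically, which lets the Hilbert estimate be checked directly.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2, 1 / fs)
# Mono-component test signal: frequency sweeps linearly from 10 to 30 Hz
f_inst_true = 10 + 10 * t
x = np.cos(2 * np.pi * (10 * t + 5 * t**2))

analytic = hilbert(x)
amplitude = np.abs(analytic)                 # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
f_inst = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency

mid = slice(200, 1800)  # ignore edge effects of the transform
err = np.max(np.abs(f_inst[mid] - f_inst_true[:-1][mid]))
print(f"max instantaneous-frequency error: {err:.2f} Hz")
```

Away from the signal edges, the recovered instantaneous frequency tracks the analytic sweep closely and the envelope stays near the unit amplitude of the chirp.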

  20. Analysis of epidemiological situation of iodine deficiency in Tomsk region from 1998 to 2014

    NASA Astrophysics Data System (ADS)

    Samoilova, Y. G.; Oleynik, O. A.; Yurchenko, E. V.; Zinchuk, S. F.; Sivolobova, T. V.; Rotkank, M. A.; Mazhitova, D. S.

    2017-08-01

    The purpose of the present research is a comparative analysis of the epidemiological situation of iodine deficiency in the Tomsk region from 1998 to 2014. A total of 9901 and 15174 school-age children were examined within the medical examination of the children's population in Tomsk in 1998 and 2014, respectively. For all school pupils, anamnestic data were analyzed and anthropometric measurements carried out; ultrasonography of the thyroid gland was performed using a portable "Aloka SSD 500" real-time scanner with a 7.5 MHz linear sensor, and thyroid volume was evaluated according to Delange (1997). The excretion of inorganic iodine was determined by the cerium-arsenic method in one-time urine portions from 264 children in 1998 and 120 children in 2014. Additionally, TSH results were analyzed for 10717 newborns in 1998 and 15091 in 2014, measured in a spot of whole blood on days 4-5 after birth for full-term infants and on days 7-14 for prematurely born infants. Neonatal TSH in dried capillary blood samples was determined by fluorometric enzyme immunoassay using TSH-Neonatal kits (Delfia, Finland). Statistical processing of the data was carried out using the PSPP software package. The descriptive analysis included calculation of medians for quantitative data and of frequencies for qualitative data. The comparative analysis assessed the significance of differences by the Mann-Whitney criterion for independent data and Wilcoxon's criterion for dependent data. The comparative analysis of the epidemiological situation of iodine deficiency in Tomsk and the Tomsk region in 1998 and 2014 indicates a decrease in iodine deficiency diseases in the Tomsk region, a 27% increase in the iodine provision of the population over 15 years, and a 1.5-fold decrease in neonatal hyperthyroidism.

  1. Validation of methods to control for immortal time bias in a pharmacoepidemiologic analysis of renin-angiotensin system inhibitors in type 2 diabetes.

    PubMed

    Yang, Xilin; Kong, Alice Ps; Luk, Andrea Oy; Ozaki, Risa; Ko, Gary Tc; Ma, Ronald Cw; Chan, Juliana Cn; So, Wing Yee

    2014-01-01

    Pharmacoepidemiologic analysis can confirm whether drug efficacy in a randomized controlled trial (RCT) translates to effectiveness in real-world settings. We examined methods used to control for immortal time bias in an analysis of renin-angiotensin system (RAS) inhibitors as the reference cardioprotective drug. We analyzed data from 3928 patients with type 2 diabetes who were recruited into the Hong Kong Diabetes Registry between 1996 and 2005 and followed up to July 30, 2005. Different Cox models were used to obtain hazard ratios (HRs) for cardiovascular disease (CVD) associated with RAS inhibitors. These HRs were then compared to the HR of 0.92 reported in a recent meta-analysis of RCTs. During a median follow-up period of 5.45 years, 7.23% of patients (n = 284) developed CVD and 38.7% (n = 1519) were started on RAS inhibitors, with immortal time accounting for 39.1% of follow-up among the users. In multivariable analysis, time-dependent drug-exposure Cox models and Cox models that moved immortal time from users to nonusers both severely inflated the HR, while time-fixed models that included immortal time deflated it. Use of time-fixed Cox models that excluded immortal time resulted in an HR of 0.89 (95% CI, 0.68-1.17) for CVD associated with RAS inhibitors, which is closest to the value reported in RCTs. In pharmacoepidemiologic analysis, time-dependent drug-exposure models and models that move immortal time from users to nonusers may introduce substantial bias in investigations of the effects of RAS inhibitors on CVD in type 2 diabetes.
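
The bias patterns described above can be reproduced in a toy simulation: with a truly null drug, a time-fixed analysis that credits users' pre-treatment (immortal) time deflates the apparent user event rate, while splitting person-time at treatment start recovers equal exposed and unexposed rates. All numbers below are hypothetical; simple event rates stand in for the Cox models used in the study.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
follow_up = 10.0                       # years of follow-up
event_t = rng.exponential(8.0, n)      # event times; the "drug" truly has no effect
start_t = rng.uniform(0, follow_up, n) # intended treatment start
user = event_t > start_t               # only patients alive at start become users

obs_t = np.minimum(event_t, follow_up)
event = event_t < follow_up

# Naive time-fixed analysis: users keep their pre-treatment (immortal) time
naive_user = event[user].sum() / obs_t[user].sum()
naive_non = event[~user].sum() / obs_t[~user].sum()

# Time-dependent analysis: users' pre-start person-time is counted as unexposed
exposed_rate = event[user].sum() / (obs_t[user] - start_t[user]).sum()
unexposed_pt = start_t[user].sum() + obs_t[~user].sum()
unexposed_rate = event[~user].sum() / unexposed_pt

print(f"naive: user {naive_user:.3f} vs non-user {naive_non:.3f} per year")
print(f"time-dependent: exposed {exposed_rate:.3f} vs unexposed {unexposed_rate:.3f}")
```

The naive comparison makes the null drug look strongly protective; the person-time split restores the true hazard (1/8 per year) in both exposure states.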

  2. Work flow analysis of around-the-clock processing of blood culture samples and integrated MALDI-TOF mass spectrometry analysis for the diagnosis of bloodstream infections.

    PubMed

    Schneiderhan, Wilhelm; Grundt, Alexander; Wörner, Stefan; Findeisen, Peter; Neumaier, Michael

    2013-11-01

    Because sepsis has a high mortality rate, rapid microbiological diagnosis is required to enable efficient therapy. The effectiveness of MALDI-TOF mass spectrometry (MALDI-TOF MS) analysis in reducing turnaround times (TATs) for blood culture (BC) pathogen identification when available in a 24-h hospital setting has not been determined. On the basis of data from a total of 912 positive BCs collected within 140 consecutive days and work flow analyses of laboratory diagnostics, we evaluated different models to assess the TATs for batch-wise and for immediate-response (real-time) MALDI-TOF MS pathogen identification of positive BC results during the night shifts. The results were compared to TATs from routine BC processing and biochemical identification performed during regular working hours. Continuous BC incubation together with batch-wise MALDI-TOF MS analysis enabled significant reductions of up to 58.7 h in the mean TATs for the reporting of the bacterial species. The TAT of batch-wise MALDI-TOF MS analysis was inferior by a mean of 4.9 h when compared to the model of the immediate work flow under ideal conditions with no constraints in staff availability. Together with continuous cultivation of BC, the 24-h availability of MALDI-TOF MS can reduce the TAT for microbial pathogen identification within a routine clinical laboratory setting. Batch-wise testing of positive BC loses a few hours compared to real-time identification but is still far superior to classical BC processing. Larger prospective studies are required to evaluate the contribution of rapid around-the-clock pathogen identification to medical decision-making for septicemic patients.

  3. Statistical Analysis of Zebrafish Locomotor Response.

    PubMed

    Liu, Yiwen; Carmer, Robert; Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behaviour upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this gap, this study has analyzed a visually-driven locomotor behaviour named the visual motor response (VMR) by the Hotelling's T-squared test. This test is congruent with comparing locomotor profiles from a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive for detecting differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had negligible effect on larval activity. This finding is consistent with that from the Hotelling's T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure.
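
The Hotelling's T-squared test used above compares multivariate activity profiles between groups. A minimal two-sample implementation follows, with hypothetical per-larva activity summaries in three time bins standing in for the much higher-dimensional VMR profiles; group sizes, means, and spreads are illustrative assumptions.

```python
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2(X, Y):
    """Two-sample Hotelling's T-squared test on multivariate observations."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n1, p = X.shape
    n2 = Y.shape[0]
    d = X.mean(axis=0) - Y.mean(axis=0)
    # Pooled covariance of the two samples
    S = ((n1 - 1) * np.cov(X, rowvar=False)
         + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    # Convert T-squared to an F statistic with (p, n1 + n2 - p - 1) df
    F = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    return t2, f_dist.sf(F, p, n1 + n2 - p - 1)

rng = np.random.default_rng(2)
# Hypothetical activity profiles: 3 time bins per larva, two groups of 20
wt_a = rng.normal([1.0, 2.0, 1.5], 0.5, size=(20, 3))
wt_b = rng.normal([1.0, 2.6, 1.5], 0.5, size=(20, 3))  # differs in bin 2 only
t2, pval = hotelling_t2(wt_a, wt_b)
print(f"T2={t2:.2f}, p={pval:.4g}")
```

Unlike per-bin t-tests, the single T-squared statistic tests the whole profile at once, which matches the abstract's point about comparing locomotor profiles over a time period.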

  4. Statistical Analysis of Zebrafish Locomotor Response

    PubMed Central

    Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behaviour upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this gap, this study has analyzed a visually-driven locomotor behaviour named the visual motor response (VMR) by the Hotelling’s T-squared test. This test is congruent with comparing locomotor profiles from a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive for detecting differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had negligible effect on larval activity. This finding is consistent with that from the Hotelling’s T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure. PMID:26437184

  5. Cost-utility analysis of memantine extended release added to cholinesterase inhibitors compared to cholinesterase inhibitor monotherapy for the treatment of moderate-to-severe dementia of the Alzheimer's type in the US.

    PubMed

    Saint-Laurent Thibault, Catherine; Özer Stillman, Ipek; Chen, Stephanie; Getsios, Denis; Proskorovsky, Irina; Hernandez, Luis; Dixit, Shailja

    2015-01-01

    This study evaluates the cost-effectiveness of memantine extended release (ER) as an add-on therapy to acetylcholinesterase inhibitor (AChEI) [combination therapy] for treatment of patients with moderate-to-severe Alzheimer's disease (AD) from both a healthcare payer and a societal perspective over 3 years when compared to AChEI monotherapy in the US. A phase III trial evaluated the efficacy and safety of memantine ER for treatment of AD patients taking an AChEI. The analysis assessed the long-term costs and health outcomes using an individual patient simulation in which AD progression is modeled in terms of cognition, behavior, and functioning changes. Input parameters are based on patient-level trial data, published literature, and publicly available data sources. Changes in anti-psychotic medication use are incorporated based on a published retrospective cohort study. Costs include drug acquisition and monitoring, total AD-related medical care, and informal care associated with caregiver time. Incremental cost-utility ratio (ICUR), life years, care time for caregiver, time in community and institution, time on anti-psychotics, time by disease severity, and time without severe symptoms are reported. Costs and health outcomes are discounted at 3% per annum. Considering a societal perspective over 3 years, this analysis shows that memantine ER combined with an AChEI provides better clinical outcomes and lower costs than AChEI monotherapy. Discounted average savings were estimated at $18,355 and $20,947 per patient and quality-adjusted life-years (QALYs) increased by an average of 0.12 and 0.13 from a societal and healthcare payer perspective, respectively. Patients on combination therapy spent an average of 4 months longer living at home and spend less time in moderate-severe and severe stages of the disease. Combination therapy for patients with moderate-to-severe AD is a cost-effective treatment compared to AChEI monotherapy in the US.

  6. Quantitative EEG analysis using error reduction ratio-causality test; validation on simulated and real EEG data.

    PubMed

    Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios

    2014-01-01

    To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real-time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real-time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
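    The error reduction ratio itself is a simple quantity: the fraction of an output signal's energy explained by one candidate regressor. A minimal sketch, assuming lagged copies of one channel as the candidate set (the full ERR-causality test in the paper uses a richer nonlinear model):

```python
# Minimal sketch of the error reduction ratio (ERR) used to rank
# candidate regressors (here: lagged copies of signal x) against an
# output signal y. This is the first step of orthogonal least squares,
# not the full ERR-causality test from the paper.
import random

random.seed(1)
n, true_lag = 500, 3
x = [random.gauss(0, 1) for _ in range(n)]
# y is driven by x delayed by true_lag samples, plus noise.
y = [0.0] * n
for t in range(true_lag, n):
    y[t] = 0.9 * x[t - true_lag] + 0.1 * random.gauss(0, 1)

def err(p, out):
    """Error reduction ratio of regressor p for output out."""
    num = sum(a * b for a, b in zip(p, out)) ** 2
    den = sum(a * a for a in p) * sum(b * b for b in out)
    return num / den

# Rank candidate lags 0..6 by ERR: pair x[t - lag] with y[t].
scores = {lag: err(x[:n - lag], y[lag:]) for lag in range(7)}
best = max(scores, key=scores.get)
print(best)  # expected: 3
```

    The lag with the highest ERR recovers the simulated delay, which is the sense in which the measure yields directionality with corresponding time lags.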

  7. A comparison of the burden and resultant risk associated with occupational falls from a height and on the same level in Australia.

    PubMed

    Mangharam, Jean; Moorin, Rachael; Straker, Leon

    2016-12-01

    Occupational falls are one of the leading causes of occupational injury and death internationally. This study described the nature of occupational falls following an analysis of workers' compensation data in Western Australia. Frequencies, proportions and incidence rates were calculated following mechanism, gender, age and industry stratification. The nature of injury and the bodily locations affected were compared between fall mechanisms. Industry incidence rates were ranked and their corresponding proportions reported. Cost and lost time were described and risk scores for each burden type (incapacity, cost and lost time) were calculated and compared between fall mechanisms. Of all occupational falls, the proportion, incidence rates and risk scores of falls on the same level were consistently greater compared to falls from a height. Gender, age and industry groups that appear to be at highest risk vary with the measure used and mechanism of incident. This study translates epidemiological information into a risk score that can aid in prioritisation. Practitioner Summary: This paper presents an in-depth analysis of workers' compensation claims for falls in Western Australia. Calculated proportion, incidence rates and formulated risk scores for falls on the same level were consistently greater compared to falls from a height. Limitations associated with the analysis of large-scale data-sets are described.

  8. Can a clinical placement influence stigma? An analysis of measures of social distance.

    PubMed

    Moxham, Lorna; Taylor, Ellie; Patterson, Christopher; Perlman, Dana; Brighton, Renee; Sumskis, Susan; Keough, Emily; Heffernan, Tim

    2016-09-01

    The way people who experience mental illness are perceived by health care professionals, which often includes stigmatising attitudes, can have a significant impact on treatment outcomes and on their quality of life. To determine whether stigma towards people with mental illness varied for undergraduate nursing students who attended a non-traditional clinical placement called Recovery Camp compared to students who attended a 'typical' mental health clinical placement. Quasi-experimental. Seventy-nine third-year nursing students were surveyed; n=40 attended Recovery Camp (intervention), n=39 (comparison group) attended a 'typical' mental health clinical placement. All students completed the Social Distance Scale (SDS) pre- and post-placement and at three-month follow-up. Data analysis consisted of a one-way repeated measures analysis of variance (ANOVA) exploring parameter estimates between group scores across three time points. Two secondary repeated measures ANOVAs were performed to demonstrate the differences in SDS scores for each group across time. Pairwise comparisons demonstrated the differences between time intervals. A statistically significant difference in ratings of stigma between the intervention group and the comparison group existed. Parameter estimates revealed that stigma ratings for the intervention group were significantly reduced post-placement and remained consistently low at three-month follow-up. There was no significant difference in ratings of stigma for the comparison group over time. Students who attended Recovery Camp reported significant decreases in stigma towards people with a mental illness over time, compared to the typical placement group. Findings suggest that a therapeutic recreation based clinical placement was more successful in reducing stigma regarding mental illness in undergraduate nursing students compared to those who attended typical mental health clinical placements. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health-threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731
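    The crisp-set Qualitative Comparative Analysis step at the heart of the method can be sketched in a few lines: dichotomize conditions, build a truth table, and keep configurations whose cases consistently show the outcome. The cases and condition names below are invented for illustration, not the 19 systems studied.

```python
# Hedged sketch of crisp-set QCA's truth-table step: group cases by
# their binary condition configuration and keep configurations whose
# observed cases all show the outcome (consistency = 1).
from collections import defaultdict

# Each case: (automated?, multiple_syndromes?, nonclinical_data?) -> timely?
cases = [
    ((1, 1, 0), 1),
    ((1, 1, 0), 1),
    ((1, 1, 1), 1),
    ((0, 1, 1), 0),
    ((0, 0, 0), 0),
    ((1, 0, 1), 0),
]

table = defaultdict(list)
for config, outcome in cases:
    table[config].append(outcome)

# Keep configurations where every observed case shows the outcome.
sufficient = sorted(c for c, outs in table.items() if all(outs))
print(sufficient)  # configurations consistent with timeliness
```

    Real csQCA then minimizes these configurations into a compact solution formula; the truth table above is the input to that step.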

  10. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    PubMed

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health-threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.

  11. Effect of heating strategies on whey protein denaturation--Revisited by liquid chromatography quadrupole time-of-flight mass spectrometry.

    PubMed

    Akkerman, M; Rauh, V M; Christensen, M; Johansen, L B; Hammershøj, M; Larsen, L B

    2016-01-01

    Previous standards in the area of effect of heat treatment processes on milk protein denaturation were based primarily on laboratory-scale analysis and determination of denaturation degrees by, for example, electrophoresis. In this study, whey protein denaturation was revisited by pilot-scale heating strategies and liquid chromatography quadrupole time-of-flight mass spectrometry (LC/MS Q-TOF) analysis. Skim milk was heat treated by the use of 3 heating strategies, namely plate heat exchanger (PHE), tubular heat exchanger (THE), and direct steam injection (DSI), under various heating temperatures (T) and holding times. The effect of heating strategy on the degree of denaturation of β-lactoglobulin and α-lactalbumin was determined using LC/MS Q-TOF analysis of pH 4.5-soluble whey proteins. Furthermore, effect of heating strategy on the rennet-induced coagulation properties was studied by oscillatory rheometry. In addition, rennet-induced coagulation of heat-treated micellar casein concentrate subjected to PHE was studied. For skim milk, the whey protein denaturation increased significantly as T and holding time increased, regardless of heating method. High denaturation degrees were obtained for T >100°C using PHE and THE, whereas DSI resulted in significantly lower denaturation degrees, compared with PHE and THE. Rennet coagulation properties were impaired by increased T and holding time regardless of heating method, although DSI resulted in less impairment compared with PHE and THE. No significant difference was found between THE and PHE for effect on rennet coagulation time, whereas the curd firming rate was significantly larger for THE compared with PHE. Micellar casein concentrate possessed improved rennet coagulation properties compared with skim milk receiving equal heat treatment. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  12. Enhanced round robin CPU scheduling with burst time based time quantum

    NASA Astrophysics Data System (ADS)

    Indusree, J. R.; Prabadevi, B.

    2017-11-01

    Process scheduling is a very important functionality of an operating system. The best-known process-scheduling algorithms are the First Come First Serve (FCFS) algorithm, Round Robin (RR) algorithm, Priority scheduling algorithm and Shortest Job First (SJF) algorithm. Compared to its peers, the Round Robin (RR) algorithm has the advantage that it gives a fair share of the CPU to the processes already in the ready queue. The effectiveness of the RR algorithm greatly depends on the chosen time quantum value. In this paper, we propose an enhanced algorithm called Enhanced Round Robin with Burst-time based Time Quantum (ERRBTQ), which calculates the time quantum from the burst times of the processes already in the ready queue. The experimental results and analysis of the ERRBTQ algorithm clearly indicate improved performance when compared with conventional RR and its variants.
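    A minimal sketch of the idea, assuming the quantum is recomputed each cycle as the mean remaining burst time of the ready queue (the paper's exact ERRBTQ formula may differ):

```python
# Round-robin scheduler whose time quantum is recomputed from the
# burst times of the processes still in the ready queue (here: their
# mean). The mean-based rule is an illustrative assumption.
from collections import deque

def round_robin_dynamic(bursts):
    """Return completion order and total context switches."""
    remaining = dict(enumerate(bursts))
    queue = deque(remaining)
    order, switches = [], 0
    while queue:
        # Quantum = mean remaining burst of queued processes.
        quantum = sum(remaining[p] for p in queue) / len(queue)
        pid = queue.popleft()
        run = min(quantum, remaining[pid])
        remaining[pid] -= run
        if remaining[pid] > 1e-9:
            queue.append(pid)   # not finished: back of the queue
            switches += 1
        else:
            order.append(pid)   # finished
    return order, switches

order, switches = round_robin_dynamic([24, 3, 3])
print(order, switches)  # expected: [1, 2, 0] 1
```

    With bursts [24, 3, 3], the short jobs finish within their first turn and the long job causes only one context switch, illustrating how a burst-aware quantum reduces switching compared with a small fixed quantum.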

  13. Most suitable mother wavelet for the analysis of fractal properties of stride interval time series via the average wavelet coefficient

    PubMed Central

    Zhang, Zhenwei; VanSwearingen, Jessie; Brach, Jennifer S.; Perera, Subashan

    2016-01-01

    Human gait is a complex interaction of many nonlinear systems and stride intervals exhibit self-similarity over long time scales that can be modeled as a fractal process. The scaling exponent represents the fractal degree and can be interpreted as a biomarker of related diseases. A previous study showed that the average wavelet method provides the most accurate results to estimate this scaling exponent when applied to stride interval time series. The purpose of this paper is to determine the most suitable mother wavelet for the average wavelet method. This paper presents a comparative numerical analysis of sixteen mother wavelets using simulated and real fractal signals. Simulated fractal signals were generated under varying signal lengths and scaling exponents that indicate a range of physiologically conceivable fractal signals. Five candidate mother wavelets were chosen for their good performance on the mean square error test for both short and long signals. Next, we comparatively analyzed these five mother wavelets for physiologically relevant stride time series lengths. Our analysis showed that the symlet 2 mother wavelet provides a low mean square error and low variance for long time intervals and relatively low errors for short signal lengths. It can be considered the most suitable mother wavelet, regardless of signal length. PMID:27960102
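    The average wavelet coefficient method itself is compact: take the wavelet transform, average the absolute detail coefficients at each scale, and read the scaling exponent off the log-log slope. The sketch below uses a Haar mother wavelet purely to stay dependency-free (the paper's recommendation is symlet 2, which would require a wavelet library such as PyWavelets):

```python
# Average wavelet coefficient (AWC) sketch with a Haar mother wavelet.
# For fractional Brownian motion, mean |detail| scales as s**(H + 1/2),
# so the log-log slope minus 1/2 estimates the scaling exponent H.
import math, random

random.seed(7)
n = 4096
# Random walk = integrated white noise, a known H = 0.5 fractal signal.
signal = [0.0]
for _ in range(n - 1):
    signal.append(signal[-1] + random.gauss(0, 1))

approx, log_scale, log_coef = signal[:], [], []
for level in range(1, 7):
    detail = [(approx[2*i] - approx[2*i+1]) / math.sqrt(2)
              for i in range(len(approx) // 2)]
    approx = [(approx[2*i] + approx[2*i+1]) / math.sqrt(2)
              for i in range(len(approx) // 2)]
    mean_abs = sum(abs(d) for d in detail) / len(detail)
    log_scale.append(level)              # log2 of scale 2**level
    log_coef.append(math.log2(mean_abs))

# Least-squares slope of log2(mean |coef|) against log2(scale).
m = len(log_scale)
sx, sy = sum(log_scale), sum(log_coef)
sxx = sum(x * x for x in log_scale)
sxy = sum(x * y for x, y in zip(log_scale, log_coef))
slope = (m * sxy - sx * sy) / (m * sxx - sx * sx)
H = slope - 0.5
print(round(H, 2))  # should be near 0.5 for a random walk
```

    Swapping the Haar filter pair for the symlet 2 coefficients is the only change needed to reproduce the paper's recommended variant.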

  14. TSCAN: Pseudo-time reconstruction and evaluation in single-cell RNA-seq analysis

    PubMed Central

    Ji, Zhicheng; Ji, Hongkai

    2016-01-01

    When analyzing single-cell RNA-seq data, constructing a pseudo-temporal path to order cells based on the gradual transition of their transcriptomes is a useful way to study gene expression dynamics in a heterogeneous cell population. Currently, a limited number of computational tools are available for this task, and quantitative methods for comparing different tools are lacking. Tools for Single Cell Analysis (TSCAN) is a software tool developed to better support in silico pseudo-Time reconstruction in Single-Cell RNA-seq ANalysis. TSCAN uses a cluster-based minimum spanning tree (MST) approach to order cells. Cells are first grouped into clusters and an MST is then constructed to connect cluster centers. Pseudo-time is obtained by projecting each cell onto the tree, and the ordered sequence of cells can be used to study dynamic changes of gene expression along the pseudo-time. Clustering cells before MST construction reduces the complexity of the tree space. This often leads to improved cell ordering. It also allows users to conveniently adjust the ordering based on prior knowledge. TSCAN has a graphical user interface (GUI) to support data visualization and user interaction. Furthermore, quantitative measures are developed to objectively evaluate and compare different pseudo-time reconstruction methods. TSCAN is available at https://github.com/zji90/TSCAN and as a Bioconductor package. PMID:27179027
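    The cluster-based MST ordering can be illustrated with a toy example: build a minimum spanning tree over cluster centers with Prim's algorithm and walk it from an endpoint. The four 2-D centers are hypothetical stand-ins for cluster means of transcriptome profiles; the walk below assumes the tree is a simple path, as it is here.

```python
# Toy sketch of TSCAN-style ordering: connect cluster centers with a
# minimum spanning tree (Prim's algorithm), then walk the tree from a
# degree-1 node to get a cluster-level pseudo-time order.
import math

centers = {"A": (0, 0), "B": (1, 0.1), "C": (2.1, 0), "D": (3, 0.2)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Prim's algorithm over the complete graph of cluster centers.
nodes = list(centers)
in_tree, edges = {nodes[0]}, []
while len(in_tree) < len(nodes):
    u, v = min(((a, b) for a in in_tree for b in nodes if b not in in_tree),
               key=lambda e: dist(centers[e[0]], centers[e[1]]))
    edges.append((u, v))
    in_tree.add(v)

# Walk from a degree-1 node (works because this tree is a path).
adj = {node: [] for node in nodes}
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)
start = next(node for node in nodes if len(adj[node]) == 1)
order, prev = [start], None
while len(order) < len(nodes):
    cur = order[-1]
    nxt = next(node for node in adj[cur] if node != prev)
    prev = cur
    order.append(nxt)
print(order)  # expected: ['A', 'B', 'C', 'D']
```

    In TSCAN each cell is then projected onto the nearest tree edge, which turns this cluster-level ordering into a continuous pseudo-time per cell.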

  15. TSCAN: Pseudo-time reconstruction and evaluation in single-cell RNA-seq analysis.

    PubMed

    Ji, Zhicheng; Ji, Hongkai

    2016-07-27

    When analyzing single-cell RNA-seq data, constructing a pseudo-temporal path to order cells based on the gradual transition of their transcriptomes is a useful way to study gene expression dynamics in a heterogeneous cell population. Currently, a limited number of computational tools are available for this task, and quantitative methods for comparing different tools are lacking. Tools for Single Cell Analysis (TSCAN) is a software tool developed to better support in silico pseudo-Time reconstruction in Single-Cell RNA-seq ANalysis. TSCAN uses a cluster-based minimum spanning tree (MST) approach to order cells. Cells are first grouped into clusters and an MST is then constructed to connect cluster centers. Pseudo-time is obtained by projecting each cell onto the tree, and the ordered sequence of cells can be used to study dynamic changes of gene expression along the pseudo-time. Clustering cells before MST construction reduces the complexity of the tree space. This often leads to improved cell ordering. It also allows users to conveniently adjust the ordering based on prior knowledge. TSCAN has a graphical user interface (GUI) to support data visualization and user interaction. Furthermore, quantitative measures are developed to objectively evaluate and compare different pseudo-time reconstruction methods. TSCAN is available at https://github.com/zji90/TSCAN and as a Bioconductor package. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large scale real-time geo-information. Earth disaster management systems demand high quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real-time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on its state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the trigger of an event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing Entropy-based sensitivity analysis methods. The results have shown a 10% improvement in computational efficiency with no compromise in accuracy. It has also shown that the computational time to perform the sensitivity analysis is 0.5% of that required by the Entropy-based method. The proposed method has been applied to real world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the ultimate aim of recovering the contents of such reservoirs, both onshore and offshore. Drilling a well is always guided by technical, economic and security constraints to protect crew, equipment and the environment from injury, damage and pollution. Although risk assessment and local practice provide a high degree of security, uncertainty arises from the behaviour of the formation, which may cause critical situations at the rig. To overcome such uncertainties, real-time sensor measurements form a basis for predicting, and thus preventing, such crises; the proposed method supports the identification of the data necessary to do so.

  17. Nanoparticle separation with a miniaturized asymmetrical flow field-flow fractionation cartridge

    PubMed Central

    Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; de Mello, Andrew J.

    2015-01-01

    Asymmetrical Flow Field-Flow Fractionation (AF4) is a separation technique applicable to particles over a wide size range. Despite the many advantages of AF4, its adoption in routine particle analysis is somewhat limited by the large footprint of currently available separation cartridges, extended analysis times and significant solvent consumption. To address these issues, we describe the fabrication and characterization of miniaturized AF4 cartridges. Key features of the down-scaled platform include simplified cartridge and reagent handling, reduced analysis costs and higher throughput capacities. The separation performance of the miniaturized cartridge is assessed using certified gold and silver nanoparticle standards. Analysis of gold nanoparticle populations indicates shorter analysis times and increased sensitivity compared to conventional AF4 separation schemes. Moreover, nanoparticulate titanium dioxide populations exhibiting broad size distributions are analyzed in a rapid and efficient manner. Finally, the repeatability and reproducibility of the miniaturized platform are investigated with respect to analysis time and separation efficiency. PMID:26258119

  18. Nanoparticle separation with a miniaturized asymmetrical flow field-flow fractionation cartridge

    NASA Astrophysics Data System (ADS)

    Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; deMello, Andrew

    2015-07-01

    Asymmetrical Flow Field-Flow Fractionation (AF4) is a separation technique applicable to particles over a wide size range. Despite the many advantages of AF4, its adoption in routine particle analysis is somewhat limited by the large footprint of currently available separation cartridges, extended analysis times and significant solvent consumption. To address these issues, we describe the fabrication and characterization of miniaturized AF4 cartridges. Key features of the down-scaled platform include simplified cartridge and reagent handling, reduced analysis costs and higher throughput capacities. The separation performance of the miniaturized cartridge is assessed using certified gold and silver nanoparticle standards. Analysis of gold nanoparticle populations indicates shorter analysis times and increased sensitivity compared to conventional AF4 separation schemes. Moreover, nanoparticulate titanium dioxide populations exhibiting broad size distributions are analyzed in a rapid and efficient manner. Finally, the repeatability and reproducibility of the miniaturized platform are investigated with respect to analysis time and separation efficiency.

  19. Parasitic modulation of electromagnetic signals caused by time-varying plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Min, E-mail: merovingia1911@126.com; Li, Xiaoping; Xie, Kai

    2015-02-15

    An experiment on the propagation of electromagnetic (EM) signals in continuous time-varying plasma is described. The time-varying characteristics of plasma are considered to cause a parasitic modulation in both amplitude and phase, and the strength of this modulation, which carries the information of the electron density profile, is closely related to the plasma frequency and the incident wave frequency. Through theoretical analysis, we give an explanation and mechanism of the interaction between the continuous time-varying plasma and EM waves, which is verified by a comparative analysis with experiments performed under the same conditions. The effects of this modulation on the EM signals in the plasma sheath cannot be ignored.

  20. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    PubMed

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies use only one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.
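    The cumulative-effect variant can be sketched end to end on simulated data: summarize each subject's exposure trajectory by its area under the curve, then estimate the causal effect with the Wald ratio beta_GY / beta_GX. Everything below is synthetic, uses a continuous outcome instead of a binary disease, and stands in for the paper's functional principal component machinery:

```python
# Cumulative-exposure Mendelian randomization sketch on simulated data.
# The Wald ratio beta_GY / beta_GX estimates the causal effect of the
# cumulative (AUC) exposure when the genotype is a valid instrument.
import random

random.seed(42)

def trapezoid(ts, xs):
    """Area under a sparsely observed trajectory."""
    return sum((xs[i] + xs[i + 1]) / 2 * (ts[i + 1] - ts[i])
               for i in range(len(ts) - 1))

n, causal = 5000, 0.3
g, auc, y = [], [], []
for _ in range(n):
    gi = random.choice([0, 1, 2])            # genotype (allele count)
    ts = [0, 1, 2, 3]                        # observation times
    xs = [20 + 0.5 * gi * t + random.gauss(0, 1) for t in ts]  # BMI-like
    a = trapezoid(ts, xs)
    g.append(gi)
    auc.append(a)
    y.append(causal * a + random.gauss(0, 1))  # outcome via cumulative exposure

def beta(x, out):
    """Simple-regression slope of out on x."""
    mx, my = sum(x) / len(x), sum(out) / len(out)
    return (sum((a - mx) * (b - my) for a, b in zip(x, out))
            / sum((a - mx) ** 2 for a in x))

wald = beta(g, y) / beta(g, auc)
print(round(wald, 2))  # should be close to the true effect 0.3
```

    The ratio recovers the simulated causal effect because the genotype shifts the trajectory, and hence the AUC, but affects the outcome only through it.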

  1. Boar seminal plasma exosomes maintain sperm function by infiltrating into the sperm membrane.

    PubMed

    Du, Jian; Shen, Jian; Wang, Yuanxian; Pan, Chuanying; Pang, Weijun; Diao, Hua; Dong, Wuzi

    2016-09-13

    Seminal plasma ingredients are important for maintenance of sperm viability. This study focuses on the effect of boar seminal plasma exosomes on sperm function during long-term liquid storage. Boar seminal plasma exosomes had typical nano-structure morphology as measured by scanning electron microscopy (SEM) and molecular markers such as AWN, CD9 and CD63 by western blot analysis. The effect on sperm parameters of adding different ratios of boar seminal plasma exosomes to boar sperm preparations was analyzed. Compared to the diluent without exosomes, the diluent containing four or sixteen times the exosome concentration of the original semen had higher sperm motility, prolonged effective survival time, improved sperm plasma membrane integrity (p < 0.05), increased total antioxidant capacity (T-AOC) activity and decreased malondialdehyde (MDA) content. The diluent containing four times the exosome concentration of the original semen was found to inhibit premature capacitation, but not to influence capacitation induced in vitro. Inhibition of premature capacitation is likely related to the concentration of exosomes, which had been demonstrated to transfer proteins including AWN and PSP-1 into sperm. In addition, using fluorescence microscopy and scanning electron microscopy analysis, it was demonstrated that exosomes in the diluent bind directly to the sperm head membrane, which could improve sperm plasma membrane integrity.

  2. Boar seminal plasma exosomes maintain sperm function by infiltrating into the sperm membrane

    PubMed Central

    Du, Jian; Shen, Jian; Wang, Yuanxian; Pan, Chuanying; Pang, Weijun; Diao, Hua; Dong, Wuzi

    2016-01-01

    Seminal plasma ingredients are important for maintenance of sperm viability. This study focuses on the effect of boar seminal plasma exosomes on sperm function during long-term liquid storage. Boar seminal plasma exosomes had typical nano-structure morphology as measured by scanning electron microscopy (SEM) and molecular markers such as AWN, CD9 and CD63 by western blot analysis. The effect on sperm parameters of adding different ratios of boar seminal plasma exosomes to boar sperm preparations was analyzed. Compared to the diluent without exosomes, the diluent containing four or sixteen times the exosome concentration of the original semen had higher sperm motility, prolonged effective survival time, improved sperm plasma membrane integrity (p < 0.05), increased total antioxidant capacity (T-AOC) activity and decreased malondialdehyde (MDA) content. The diluent containing four times the exosome concentration of the original semen was found to inhibit premature capacitation, but not to influence capacitation induced in vitro. Inhibition of premature capacitation is likely related to the concentration of exosomes, which had been demonstrated to transfer proteins including AWN and PSP-1 into sperm. In addition, using fluorescence microscopy and scanning electron microscopy analysis, it was demonstrated that exosomes in the diluent bind directly to the sperm head membrane, which could improve sperm plasma membrane integrity. PMID:27542209

  3. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 2: Unsteady ducted propfan analysis computer program users manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Bettner, James L.

    1991-01-01

    The primary objective of this study was the development of a time-dependent three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The computer codes resulting from this study are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). This report is intended to serve as a computer program user's manual for the ADPAC developed under Task 2 of NASA Contract NAS3-25270, Unsteady Ducted Propfan Analysis. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was utilized for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted propfan flows. The solution scheme demonstrated efficiency and accuracy comparable with other schemes of this class.
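    The four-stage Runge-Kutta time marching at the core of the solver is easiest to see on a scalar model problem. The sketch below applies the classical four-stage scheme to du/dt = -u; ADPAC applies the same stage structure to the discretized Euler/Navier-Stokes residual on each finite volume (its stage coefficients may differ from the classical ones):

```python
# Classical four-stage Runge-Kutta time marching on du/dt = -u,
# standing in for the residual evaluation of a finite-volume solver.
import math

def rk4_step(f, u, dt):
    """Advance state u by one time step dt using four stages."""
    k1 = f(u)
    k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2)
    k4 = f(u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

u, dt = 1.0, 0.1
for _ in range(10):                 # march to t = 1
    u = rk4_step(lambda v: -v, u, dt)

print(abs(u - math.exp(-1)))        # RK4 error is O(dt**4): tiny
```

    In the flow solver, f(u) is the spatial residual (fluxes plus the added numerical dissipation), and implicit residual smoothing is applied between stages for time-accurate unsteady runs.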

  4. Modeling spatiotemporal covariance for magnetoencephalography or electroencephalography source analysis.

    PubMed

    Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M

    2007-01-01

    We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that the time courses are statistically independent. The accuracy of the structural approximation is compared to an existing model based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and each model, and scatter plots. The performance of our model and of previous models is compared in source analysis of a large number of single dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
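    The structure described above, a sum of Kronecker products of spatial component covariances and corresponding temporal covariances, compared against a single-Kronecker-product model via the Frobenius norm of the difference, can be sketched numerically. All dimensions and matrices below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive-definite matrix (illustrative stand-in)."""
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

n_space, n_time, n_comp = 4, 6, 2
spatial = [random_spd(n_space) for _ in range(n_comp)]
temporal = [random_spd(n_time) for _ in range(n_comp)]

# Model covariance: sum over components of kron(spatial_k, temporal_k),
# so each spatial component carries its own temporal covariance.
cov = sum(np.kron(s, t) for s, t in zip(spatial, temporal))

# A single-Kronecker-product formalism can only use one term:
single = np.kron(spatial[0], temporal[0])

# Frobenius-norm discrepancy, the comparison metric named in the abstract
err = np.linalg.norm(cov - single, ord="fro")
print(cov.shape, err > 0)
```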

  5. [The trial of business data analysis at the Department of Radiology by constructing the auto-regressive integrated moving-average (ARIMA) model].

    PubMed

    Tani, Yuji; Ogasawara, Katsuhiko

    2012-01-01

    This study aimed to contribute to the management of a healthcare organization by providing management information derived from time-series analysis of business data accumulated in the hospital information system, which had not previously been utilized. In this study, we examined the performance of a prediction method using the auto-regressive integrated moving-average (ARIMA) model, applied to business data obtained at the Radiology Department. We built the model using the number of radiological examinations over the past 9 years and predicted the number of radiological examinations over the final year, then compared the actual values with the forecast values. We established that the prediction method was simple and cost-effective, using free software. In addition, we were able to build a simple model by pre-processing the data to remove trend components. The difference between predicted and actual values was 10%; however, it was more important to understand the chronological change than the individual time-series values. Furthermore, our method is highly versatile and adaptable to general time-series data. Therefore, other healthcare organizations can use our method for the analysis and forecasting of their business data.
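    The core of the approach, differencing away the trend and fitting an autoregressive term, can be sketched by hand as an ARIMA(1,1,0)-style one-step forecast. The monthly counts are invented; the study itself used free statistical software rather than this hand-rolled fit.

```python
# Made-up monthly examination counts with an upward trend
counts = [100, 104, 110, 109, 115, 121, 124, 123, 130, 136, 139, 138]

# First-differencing removes the trend component (the "d = 1" in ARIMA)
diffs = [b - a for a, b in zip(counts, counts[1:])]

# Fit AR(1) on the differences by least squares: d_t ~ phi * d_{t-1}
x, y = diffs[:-1], diffs[1:]
phi = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

# One-step forecast: predict the next difference, then undifference
# back to a level by adding it to the last observed count
next_diff = phi * diffs[-1]
forecast = counts[-1] + next_diff
print(round(phi, 3), round(forecast, 1))
```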

  6. xCELLigence system for real-time label-free monitoring of growth and viability of cell lines from hematological malignancies.

    PubMed

    Martinez-Serra, Jordi; Gutierrez, Antonio; Muñoz-Capó, Saúl; Navarro-Palou, María; Ros, Teresa; Amat, Juan Carlos; Lopez, Bernardo; Marcus, Toni F; Fueyo, Laura; Suquia, Angela G; Gines, Jordi; Rubio, Francisco; Ramos, Rafael; Besalduch, Joan

    2014-01-01

    The xCELLigence system is a new technological approach that allows the real-time cell analysis of adherent tumor cells. To date, xCELLigence has not been able to monitor the growth or cytotoxicity of nonadherent cells derived from hematological malignancies. The basis of its technology relies on the use of culture plates with gold microelectrodes located in their base. We have adapted the methodology described by others to xCELLigence, based on the pre-coating of the cell culture surface with specific substrates, some of which are known to facilitate cell adhesion in the extracellular matrix. Pre-coating of the culture plates with fibronectin, compared to laminin, collagen, or gelatin, significantly induced the adhesion of most of the leukemia/lymphoma cells assayed (Jurkat, L1236, KMH2, and K562). With a fibronectin substrate, nonadherent cells deposited in a monolayer configuration, and consequently, the cell growth and viability were robustly monitored. We further demonstrate the feasibility of xCELLigence for the real-time monitoring of the cytotoxic properties of several antineoplastic agents. In order to validate this technology, the data obtained through real-time cell analysis was compared with that obtained from using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide method. This provides an excellent label-free tool for the screening of drug efficacy in nonadherent cells and discriminates optimal time points for further molecular analysis of cellular events associated with treatments, reducing both time and costs.

  7. Longitudinal study of fingerprint recognition.

    PubMed

    Yoon, Soweon; Jain, Anil K

    2015-07-14

    Human identification by fingerprints is based on the fundamental premise that ridge patterns from distinct fingers are different (uniqueness) and a fingerprint pattern does not change over time (persistence). Although the uniqueness of fingerprints has been investigated by developing statistical models to estimate the probability of error in comparing two random samples of fingerprints, the persistence of fingerprints has remained a general belief based on only a few case studies. In this study, fingerprint match (similarity) scores are analyzed by multilevel statistical models with covariates such as time interval between two fingerprints in comparison, subject's age, and fingerprint image quality. Longitudinal fingerprint records of 15,597 subjects are sampled from an operational fingerprint database such that each individual has at least five 10-print records over a minimum time span of 5 y. In regard to the persistence of fingerprints, the longitudinal analysis on a single (right index) finger demonstrates that (i) genuine match scores tend to significantly decrease when time interval between two fingerprints in comparison increases, whereas the change in impostor match scores is negligible; and (ii) fingerprint recognition accuracy at operational settings, nevertheless, tends to be stable as the time interval increases up to 12 y, the maximum time span in the dataset. However, the uncertainty of temporal stability of fingerprint recognition accuracy becomes substantially large if either of the two fingerprints being compared is of poor quality. The conclusions drawn from 10-finger fusion analysis coincide with the conclusions from single-finger analysis.

  8. Longitudinal study of fingerprint recognition

    PubMed Central

    Yoon, Soweon; Jain, Anil K.

    2015-01-01

    Human identification by fingerprints is based on the fundamental premise that ridge patterns from distinct fingers are different (uniqueness) and a fingerprint pattern does not change over time (persistence). Although the uniqueness of fingerprints has been investigated by developing statistical models to estimate the probability of error in comparing two random samples of fingerprints, the persistence of fingerprints has remained a general belief based on only a few case studies. In this study, fingerprint match (similarity) scores are analyzed by multilevel statistical models with covariates such as time interval between two fingerprints in comparison, subject’s age, and fingerprint image quality. Longitudinal fingerprint records of 15,597 subjects are sampled from an operational fingerprint database such that each individual has at least five 10-print records over a minimum time span of 5 y. In regard to the persistence of fingerprints, the longitudinal analysis on a single (right index) finger demonstrates that (i) genuine match scores tend to significantly decrease when time interval between two fingerprints in comparison increases, whereas the change in impostor match scores is negligible; and (ii) fingerprint recognition accuracy at operational settings, nevertheless, tends to be stable as the time interval increases up to 12 y, the maximum time span in the dataset. However, the uncertainty of temporal stability of fingerprint recognition accuracy becomes substantially large if either of the two fingerprints being compared is of poor quality. The conclusions drawn from 10-finger fusion analysis coincide with the conclusions from single-finger analysis. PMID:26124106

  9. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high-temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated into the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, rendering it able to accurately evaluate the derivatives of time-varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed; two applications of it are used to demonstrate the accuracy and time dependency of the differentiation process, and the results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high-fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the non-linear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time-dependent sensitivity derivatives are computed in run times comparable to those required by the FD approach.

  10. Assessing child time-activity patterns in relation to indoor cooking fires in developing countries: a methodological comparison.

    PubMed

    Barnes, Brendon; Mathee, Angela; Moiloa, Kebitsamang

    2005-01-01

    Indoor air pollution, caused by the indoor burning of biomass fuels, has been associated with an increased risk of child acute respiratory infections in developing countries. The amount of time that children spend in proximity to fires is a crucial determinant of the health impact of indoor air pollution. Researchers are reliant on social scientific methods to assess exposure based on child location patterns in relation to indoor fires. The inappropriate use of methods could lead to misclassification of exposure. The aim of this paper is to compare two methods (observations and questionnaire interview) with video analysis (which is thought to offer a more accurate assessment of exposure) in rural South African villages. Compared to video analysis, results show that observations may underestimate the amount of time that children spend very close (within 1.5 m) to fires. This is possibly due to reactivity caused by the presence of an observer. The questionnaire interview offers a more accurate assessment of the amounts of time that children spend within 1.5 m of fires at the expense of a detailed behavioural analysis. By drawing on the strengths and weaknesses of each, this paper discusses the appropriateness of methods to different research contexts.

  11. Meta-analysis of the alpha/beta ratio for prostate cancer in the presence of an overall time factor: bad news, good news, or no news?

    PubMed

    Vogelius, Ivan R; Bentzen, Søren M

    2013-01-01

    To present a novel method for meta-analysis of the fractionation sensitivity of tumors as applied to prostate cancer in the presence of an overall time factor. A systematic search for radiation dose-fractionation trials in prostate cancer was performed using PubMed and by manual search. Published trials comparing standard fractionated external beam radiation therapy with alternative fractionation were eligible. For each trial the α/β ratio and its 95% confidence interval (CI) were extracted, and the data were synthesized with each study weighted by the inverse variance. An overall time factor was included in the analysis, and its influence on α/β was investigated. Five studies involving 1965 patients were included in the meta-analysis of α/β. The synthesized α/β assuming no effect of overall treatment time was -0.07 Gy (95% CI -0.73 to 0.59), which was increased to 0.47 Gy (95% CI -0.55 to 1.50) if a single highly weighted study was excluded. In a separate analysis, 2 studies based on 10,808 patients in total allowed extraction of a synthesized estimate of a time factor of 0.31 Gy/d (95% CI 0.20 to 0.42). The time factor increased the α/β estimate to 0.58 Gy (95% CI -0.53 to 1.69)/1.93 Gy (95% CI -0.27 to 4.14) with/without the heavily weighted study. An analysis of the uncertainty of the α/β estimate showed a loss of information when the hypofractionated arm was underdosed compared with the normo-fractionated arm. The current external beam fractionation studies are consistent with a very low α/β ratio for prostate cancer, although the CIs include α/β ratios up to 4.14 Gy in the presence of a time factor. Details of the dose fractionation in the 2 trial arms have critical influence on the information that can be extracted from a study. Studies with unfortunate designs will supply little or no information about α/β regardless of the number of subjects enrolled. Copyright © 2013 Elsevier Inc. All rights reserved.
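    The fixed-effect, inverse-variance synthesis described above can be sketched as follows: each study's standard error is recovered from its reported 95% CI, converted to a weight, and the weighted mean pooled. The study values below are invented for illustration, not the trial data.

```python
import math

# Hypothetical per-study alpha/beta estimates: (estimate, ci_low, ci_high)
studies = [
    (1.2, -0.5, 2.9),
    (0.3, -0.9, 1.5),
    (-0.4, -1.6, 0.8),
]

def synthesize(studies):
    """Fixed-effect inverse-variance pooling of estimates with 95% CIs."""
    weights, estimates = [], []
    for mean, lo, hi in studies:
        se = (hi - lo) / (2 * 1.96)      # SE recovered from the 95% CI
        weights.append(1.0 / se ** 2)    # inverse-variance weight
        estimates.append(mean)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

pooled, ci = synthesize(studies)
print(round(pooled, 3), [round(c, 3) for c in ci])
```

Note how a single narrow-CI study dominates the pooled value, which is why excluding one highly weighted trial can shift the synthesized α/β noticeably.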

  12. Learning curves for single incision and conventional laparoscopic right hemicolectomy: a multidimensional analysis

    PubMed Central

    Park, Yoonah; Yong, Yuen Geng; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung

    2015-01-01

    Purpose This study aimed to compare the learning curves and early postoperative outcomes for conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). Methods This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, the mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Results Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, the learning phase of SIL and CL RHC was completed between 26 and 30 cases, and between 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance. The postoperative complication rate was higher in the SIL group than in the CL group, but the difference was not statistically significant (17.1% vs. 3.4%). Conclusion The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a statistically insignificant higher complication rate than CL RHC during the learning phase. PMID:25960990
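    A CUSUM learning-curve analysis of the kind used above can be sketched as a cumulative sum of (observed outcome − acceptable rate) tracked case by case; the learning phase ends roughly where the curve peaks and turns downward. The case outcomes and target rate below are invented.

```python
# 1 = complication, 0 = complication-free, in chronological case order
outcomes = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0]
target = 0.1   # hypothetical acceptable complication rate

cusum, running = [], 0.0
for failed in outcomes:
    running += failed - target   # rises on a complication, drifts down otherwise
    cusum.append(running)

# The steady state of complication-free performance begins roughly where
# the curve reaches its maximum and subsequently declines.
peak_case = cusum.index(max(cusum)) + 1
print(peak_case, [round(c, 1) for c in cusum])
```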

  13. Variable stiffness colonoscope versus regular adult colonoscope: meta-analysis of randomized controlled trials.

    PubMed

    Othman, M O; Bradley, A G; Choudhary, A; Hoffman, R M; Roy, P K

    2009-01-01

    The variable stiffness colonoscope (VSC) may have theoretical advantages over standard adult colonoscopes (SACs), though data are conflicting. We conducted a meta-analysis to compare the efficacies of the VSC and SAC. We searched Medline (1966-2008) and abstracts of gastroenterology scientific meetings in the 5 years to February 2008, only for randomized clinical trials (RCTs) of adult patients. Trial quality was assessed using the Delphi list. In a meta-analysis with a fixed effects model, cecal intubation rates, cecal intubation times, abdominal pain scores, sedation used, and use of ancillary maneuvers were compared in separate analyses, using weighted mean differences (WMDs), standardized mean differences (SMDs), or odds ratios (ORs). Seven RCTs satisfied the inclusion criteria (1923 patients), four comparing VSC with SAC procedures in adults, and three evaluating the pediatric VSC. There was no significant heterogeneity among the studies. The overall trial quality was adequate. The cecal intubation rate was higher with the use of the VSC (OR = 2.08, 95% confidence interval [CI] 1.29 to 3.36). The VSC was associated with lower abdominal pain scores and a decreased need for sedation during colonoscopy. Cecal intubation time was similar for the two colonoscope types (WMD = -0.21 minutes, 95% CI -0.85 to 0.43). Because of the nature of the intervention, no studies were blinded. There was no universal method for using the VSC. Compared with the SAC, VSC use was associated with a higher cecal intubation rate, less abdominal pain, and a decreased need for sedation. However, cecal intubation times were similar for the two colonoscope types.

  14. Systematic review and meta-analysis of randomized controlled trials of Simethicone for gastrointestinal endoscopic visibility.

    PubMed

    Wu, Liucheng; Cao, Yunfei; Liao, Cun; Huang, Jiahao; Gao, Feng

    2011-02-01

    The value of the supplemental use of Simethicone in endoscopy, including capsule endoscopy (CE), colonoscopy and esophagogastroduodenoscopy, has not been fully addressed and remains controversial. A systematic review and meta-analysis of randomized controlled studies on the use of Simethicone for endoscopy were carried out. The effects of this preparation on the following endpoints were examined: small bowel visualization quality (SBVQ), completion rate, gastric transit time, small bowel transit time, diagnostic yield, efficacy of bowel preparation, degree of air bubbles and duration time. A total of 13 studies were eligible for this meta-analysis; 4 studies comparing purgative or fasting plus Simethicone with purgative or fasting alone for capsule endoscopy were identified. For patients who had supplemental Simethicone before CE, the SBVQ was significantly better (odds ratio [OR] = 2.84, 95% CI: 1.74-4.65, p = 0.00), and the completion rate was comparable (OR = 0.80, 95% CI: 0.44-1.44, p = 0.454). Also, 7 studies comparing purgative plus Simethicone with purgative alone for colonoscopy were identified. For patients who had supplemental Simethicone before colonoscopy, the efficacy of colon preparation was comparable (OR = 2.06, 95% CI: 0.56-7.53, p = 0.27), but air bubbles were significantly decreased (OR = 39.32, 95% CI: 11.38-135.86, p = 0.00). Supplemental use of Simethicone before endoscopy improves the SBVQ, especially for patients who received no purgative, but does not affect the CE completion rate. It decreases air bubbles in the colonic lumen, but does not improve bowel preparation. Its effect on diagnostic yield remains controversial.

  15. Augmenting Qualitative Text Analysis with Natural Language Processing: Methodological Study.

    PubMed

    Guetterman, Timothy C; Chang, Tammy; DeJonckheere, Melissa; Basu, Tanmay; Scruggs, Elizabeth; Vydiswaran, V G Vinod

    2018-06-29

    Qualitative research methods are increasingly being used across disciplines because of their ability to help investigators understand the perspectives of participants in their own words. However, qualitative analysis is a laborious and resource-intensive process. To achieve depth, researchers are limited to smaller sample sizes when analyzing text data. One potential method to address this concern is natural language processing (NLP). Qualitative text analysis involves researchers reading data, assigning code labels, and iteratively developing findings; NLP has the potential to automate part of this process. Unfortunately, little methodological research has been done to compare automatic coding using NLP techniques and qualitative coding, which is critical to establish the viability of NLP as a useful, rigorous analysis procedure. The purpose of this study was to compare the utility of a traditional qualitative text analysis, an NLP analysis, and an augmented approach that combines qualitative and NLP methods. We conducted a 2-arm cross-over experiment to compare qualitative and NLP approaches to analyze data generated through 2 text (short message service) message survey questions, one about prescription drugs and the other about police interactions, sent to youth aged 14-24 years. We randomly assigned a question to each of the 2 experienced qualitative analysis teams for independent coding and analysis before receiving NLP results. A third team separately conducted NLP analysis of the same 2 questions. We examined the results of our analyses to compare (1) the similarity of findings derived, (2) the quality of inferences generated, and (3) the time spent in analysis. The qualitative-only analysis for the drug question (n=58) yielded 4 major findings, whereas the NLP analysis yielded 3 findings that missed contextual elements. The qualitative and NLP-augmented analysis was the most comprehensive. 
For the police question (n=68), the qualitative-only analysis yielded 4 primary findings and the NLP-only analysis yielded 4 slightly different findings. Again, the augmented qualitative and NLP analysis was the most comprehensive and produced the highest quality inferences, increasing our depth of understanding (ie, details and frequencies). In terms of time, the NLP-only approach was quicker than the qualitative-only approach for the drug (120 vs 270 minutes) and police (40 vs 270 minutes) questions. An approach beginning with qualitative analysis, followed by qualitative- or NLP-augmented analysis, took longer than one beginning with NLP for both the drug (450 vs 240 minutes) and police (390 vs 220 minutes) questions. NLP provides both a foundation to code qualitatively more quickly and a method to validate qualitative findings. NLP methods were able to identify major themes found with traditional qualitative analysis but were not useful in identifying nuances. Traditional qualitative text analysis added important details and context. ©Timothy C Guetterman, Tammy Chang, Melissa DeJonckheere, Tanmay Basu, Elizabeth Scruggs, VG Vinod Vydiswaran. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.06.2018.

  16. Identification of varying time scales in sediment transport using the Hilbert-Huang Transform method

    NASA Astrophysics Data System (ADS)

    Kuai, Ken Z.; Tsai, Christina W.

    2012-02-01

    Sediment transport processes vary over a range of time scales, from seconds, hours and days to months and years. Multiple time scales exist in the system of flow, sediment transport and bed elevation change processes. As such, identification and selection of appropriate time scales for flow and sediment processes can assist in formulating a system of flow and sediment governing equations representative of the dynamic interaction of flow and particles at the desired level of detail. Recognizing the importance of these varying time scales in the fluvial processes of sediment transport, we introduce the Hilbert-Huang Transform (HHT) method to the field of sediment transport for time scale analysis. The HHT uses the Empirical Mode Decomposition (EMD) method to decompose a time series into a collection of Intrinsic Mode Functions (IMFs), and uses Hilbert Spectral Analysis (HSA) to obtain instantaneous frequency data. The EMD extracts the variability of data with different time scales and improves the analysis of data series. The HSA can display the succession of time-varying time scales, which cannot be captured by the often-used Fast Fourier Transform (FFT) method. This study is one of the earlier attempts to introduce this state-of-the-art technique for multiple time scale analysis of sediment transport processes. Three practical applications of the HHT method to data analysis of both suspended sediment and bedload transport time series are presented. The analysis results show the strong impact of flood waves on the variations of flow and sediment time scales at a large sampling time scale, as well as the impact of flow turbulence on those time scales at a smaller sampling time scale. Our analysis reveals that the existence of multiple time scales in sediment transport processes may be attributed to the fractal nature of sediment transport. The HHT analysis also demonstrates that the bedload motion time scale is better represented by the ratio of the water depth to the settling velocity, h/w. In the final part, HHT results are compared with an available time scale formula from the literature.
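    The HSA stage of the HHT, extracting an instantaneous frequency from an IMF-like oscillation via the analytic signal, can be sketched as follows. The chirp, sampling rate, and FFT-based Hilbert transform are illustrative assumptions; the EMD sifting that would produce the IMFs is omitted for brevity.

```python
import numpy as np

# Made-up IMF-like chirp whose frequency sweeps from 3 Hz to 7 Hz
fs = 200.0
t = np.arange(0, 2.0, 1.0 / fs)
imf = np.cos(2 * np.pi * (3.0 * t + 1.0 * t ** 2))

# FFT-based analytic signal (equivalent to applying a Hilbert transform):
# zero the negative frequencies and double the positive ones
n = imf.size
spec = np.fft.fft(imf)
h = np.zeros(n)
h[0] = 1.0
h[1:n // 2] = 2.0
h[n // 2] = 1.0
analytic = np.fft.ifft(spec * h)

# Instantaneous frequency = derivative of the unwrapped phase / (2*pi)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
print(round(float(inst_freq[50]), 2), round(float(inst_freq[-50]), 2))
```

Away from the record edges, the recovered instantaneous frequency tracks the true sweep 3 + 2t Hz, which is the time-varying information an FFT spectrum would smear out.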

  17. An Energy-Based Similarity Measure for Time Series

    NASA Astrophysics Data System (ADS)

    Boudraa, Abdel-Ouahab; Cexus, Jean-Christophe; Groussat, Mathieu; Brunagel, Pierre

    2007-12-01

    A new similarity measure for time series analysis, called SimilB, based on the cross Ψ_B-energy operator (2004), is introduced. Ψ_B is a nonlinear measure which quantifies the interaction between two time series. Compared to the Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series using the first and second derivatives of the time series. SimilB is well suited for both nonstationary and stationary time series, and particularly those presenting discontinuities. Some new properties of Ψ_B are presented. In particular, we show that Ψ_B as a similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset, and compared to the CC and the ED measures.
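    A discrete sketch of the idea: a Teager-Kaiser-style cross-energy operator quantifies the interaction of two series through their local derivatives, and a normalized ratio of cross- to self-energies yields a scale-invariant similarity. The symmetric discretization and toy signals below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cross_energy(x, y):
    """Discrete cross-energy operator on two equal-length series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Symmetric discretization: with y = x this reduces to the classic
    # Teager-Kaiser operator x_n**2 - x_{n-1}*x_{n+1}
    return x[1:-1] * y[1:-1] - 0.5 * (x[:-2] * y[2:] + x[2:] * y[:-2])

def similb(x, y):
    """Normalized similarity built from cross- and self-energies."""
    num = np.sum(cross_energy(x, y))
    den = np.sqrt(np.sum(cross_energy(x, x)) * np.sum(cross_energy(y, y)))
    return num / den

t = np.linspace(0, 1, 200)
a = np.sin(2 * np.pi * 5 * t)
print(round(float(similb(a, a)), 3), round(float(similb(a, 2 * a)), 3))
```

The normalization makes the measure insensitive to amplitude scaling: a series compared with a rescaled copy of itself still scores 1, while a sign-flipped copy scores -1.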

  18. Timing characterization and analysis of the Linux-based, closed loop control computer for the Subaru Telescope laser guide star adaptive optics system

    NASA Astrophysics Data System (ADS)

    Dinkins, Matthew; Colley, Stephen

    2008-07-01

    Hardware and software specialized for real-time control reduce the timing jitter of executables when compared to off-the-shelf hardware and software. However, these specialized environments are costly in both money and development time. While conventional systems have a cost advantage, the jitter in these systems is much larger and potentially problematic. This study analyzes the timing characteristics of a standard Dell server running a fully featured Linux operating system to determine if such a system would be capable of meeting the timing requirements for closed loop operations. Investigations are performed on the effectiveness of tools designed to bring off-the-shelf system performance closer to that of specialized real-time systems. The GNU Compiler Collection (gcc) is compared to the Intel C Compiler (icc), compiler optimizations are investigated, and real-time extensions to Linux are evaluated.

  19. Differences in botulinum toxin dosing between patients with adductor spasmodic dysphonia and essential voice tremor.

    PubMed

    Orbelo, Diana M; Duffy, Joseph R; Hughes Borst, Becky J; Ekbom, Dale; Maragos, Nicolas E

    2014-01-01

    To explore possible dose differences in average botulinum toxin (BTX) given to patients with adductor spasmodic dysphonia (ADSD) compared with patients with essential voice tremor (EVT). A retrospective study compared the average BTX dose injected in equal doses to the thyroarytenoid (TA) muscles of 51 patients with ADSD with 52 patients with EVT. Those with ADSD received significantly higher total doses (6.80 ± 2.79 units) compared with those with EVT (5.02 ± 1.65 units). Dose at time of first injection, age at time of first injection, gender, year of first injection, and average time between injections were included in multivariate analysis but did not interact with total average dose findings. Patients with ADSD may need relatively higher doses of BTX injections to bilateral TA muscles compared with patients with EVT. Copyright © 2014 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  20. Time Exceedances for High Intensity Solar Proton Fluxes

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Adam, James H., Jr.; Dietrich, William F.

    2011-01-01

    A model is presented for times during a space mission that specified solar proton flux levels are exceeded. This includes both total time and continuous time periods during missions. Results for the solar maximum and solar minimum phases of the solar cycle are presented and compared for a broad range of proton energies and shielding levels. This type of approach is more amenable to reliability analysis for spacecraft systems and instrumentation than standard statistical models.

  1. Strategic Mobility 21: Rail Network Capacity Analysis

    DTIC Science & Technology

    2006-09-30

    of commercial freight and passenger train movements per peak day on the rail main lines in the study area for Year 2000 (actual...levels in Year 2010 and beyond with transit times comparable to Year 2000, at least four main tracks on the south slope of Cajon Pass are required... study . Rail Network Capacity Analysis 1 1.0 INTRODUCTION This document contains the analysis of rail capacity

  2. Comparative Investigation of Body Composition in Male Dogs Using CT and Body Fat Analysis Software

    PubMed Central

    KOBAYASHI, Toyokazu; KOIE, Hiroshi; KUSUMI, Akiko; KITAGAWA, Masato; KANAYAMA, Kiichi; OTSUJI, Kazuya

    2013-01-01

    In small animal veterinary practices, body condition score (BCS) is generally used to diagnose obesity. However, BCS does not constitute objective data. In this study, we investigated the value of using human body fat analysis software for male dogs. We also compared changes in body fat after neutering. Changes in body fat at the time of neutering (age 1 year) and 1 year later were compared by performing CT scanning and using human body fat analysis software. We found that body fat increased in all the individuals tested. In terms of the site of fat accumulation, subcutaneous fat was more pronounced than visceral fat, with a marked increase on the dorsal side of the abdomen rather than the thorax. PMID:24212506

  3. Who pays for the health benefits of exclusive breastfeeding? An analysis of maternal time costs.

    PubMed

    Smith, J P; Forrester, Robert

    2013-11-01

    The benefits of exclusive breastfeeding, including public health cost savings, are widely recognized, but breastfeeding requires maternal time investments. This study investigates the time taken to exclusively breastfeed at 6 months compared with not exclusively breastfeeding. Time use data were examined from an Australian survey of new mothers conducted during 2005-2006. Data from 139 mothers with infants aged 6 months were analyzed using chi-square tests of independence to examine socioeconomic and demographic characteristics, and 2-sided t tests to compare average weekly hours spent on milk feeding, feeding solids, preparing feeds, and the total of these. The comparison was of exclusively breastfeeding mothers with other mothers. We also compared exclusively breastfeeding with partially breastfeeding and formula-feeding mothers using a 1-way between-groups analysis of variance (ANOVA). The exclusively breastfeeding mothers spent an extra 7 hours weekly on milk feeding their infants, but 2 hours less feeding solids, than other mothers. These differences were statistically significant. ANOVA revealed significant differences between exclusively breastfeeding mothers, breastfeeding mothers who had introduced solids, and mothers who fed any formula in time spent feeding milk, feeding solids, and preparing feeds. Exclusive breastfeeding is time intensive, which is economically costly to women. This may contribute to premature weaning for women who are time-stressed, lack household help from family, or cannot afford paid help. Gaining the public health benefits of exclusive breastfeeding requires strategies to share maternal lactation costs more widely, such as additional help with housework or caring for children, enhanced leave, and workplace lactation breaks and suitable child care.

  4. Implications of the pion-decay gamma emission and neutron observations with CORONAS-F/SONG

    NASA Astrophysics Data System (ADS)

    Kurt, V.; Yushkov, B.; Kudela, K.

    2013-05-01

    We analyzed the high-energy gamma and neutron emissions observed by the SONG instrument onboard the CORONAS-F satellite during the August 25, 2001, October 28, 2003, November 4, 2003, and January 20, 2005 solar flares. These flares produced neutrons and/or protons recorded near Earth. The SONG response was consistent with detection of pion-decay gamma emission and neutrons in these events. We compared time profiles of various electromagnetic emissions and showed that the maximum of the pion-decay emission coincided in time most closely with the maximum of the soft X-ray derivative, dISXR/dt. We evaluated the energy of accelerated ions and compared it with the energy deposited by accelerated electrons. The ion energy becomes comparable to, or even higher than, the electron energy from a certain stage of flare development onward, so the time profile of dISXR/dt is a superposition of the energy deposited by both populations of accelerated particles. This result allowed us to use the time profile of dISXR/dt as a proxy of the time behavior of the energy release, at least in the analysis of major flares. In particular, the time interval when the dISXR/dt value exceeds 0.9 of its maximum can be used as a unified reference point for calculations of the time delay between high-energy proton acceleration and GLE onset. Analysis of the total set of pion-decay emission observations shows that such temporal closeness of the pion-decay emission maximum and the soft X-ray derivative maximum is typical but not obligatory.

  5. Classification of breast cancer in ultrasound imaging using a generic deep learning analysis software: a pilot study.

    PubMed

    Becker, Anton S; Mueller, Michael; Stoffel, Elina; Marcon, Magda; Ghafoor, Soleen; Boss, Andreas

    2018-02-01

    To train a generic deep learning software (DLS) to classify breast cancer on ultrasound images and to compare its performance to human readers with variable breast imaging experience. In this retrospective study, all breast ultrasound examinations from January 1, 2014 to December 31, 2014 at our institution were reviewed. Patients with post-surgical scars, initially indeterminate, or malignant lesions with histological diagnoses or 2-year follow-up were included. The DLS was trained with 70% of the images, and the remaining 30% were used to validate the performance. Three readers with variable expertise (radiologist, resident, medical student) also evaluated the validation set. Diagnostic accuracy was assessed with a receiver operating characteristic analysis. 82 patients with malignant and 550 with benign lesions were included. Time needed for training was 7 min (DLS). Evaluation time for the test data set was 3.7 s (DLS) and 28, 22 and 25 min for the human readers (in order of decreasing experience). Receiver operating characteristic analysis revealed non-significant differences (p-values 0.45-0.47) in the area under the curve: 0.84 (DLS), 0.88 (experienced and intermediate readers) and 0.79 (inexperienced reader). DLS may aid diagnosing cancer on breast ultrasound images with an accuracy comparable to radiologists, and learns better and faster than a human reader with no prior experience. Further clinical trials with dedicated algorithms are warranted. Advances in knowledge: DLS can be trained to classify cancer on breast ultrasound images with high accuracy even with comparatively few training cases. The fast evaluation speed makes real-time image analysis feasible.
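The receiver operating characteristic comparison above can be illustrated with a small sketch. The AUC is computed from the rank-sum (Mann-Whitney U) identity on synthetic labels and scores; the study's images and reader scores are of course not available, so only the cohort proportions are loosely mirrored.

```python
# Sketch: AUC via the Mann-Whitney rank identity. Labels and scores are
# synthetic; ~632 lesions with a minority malignant loosely mirror the cohort.
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve from the rank-sum (Mann-Whitney U) identity."""
    labels = np.asarray(labels, dtype=bool)
    ranks = np.argsort(np.argsort(scores)) + 1          # 1-based ranks
    n_pos, n_neg = labels.sum(), (~labels).sum()
    u = ranks[labels].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

rng = np.random.default_rng(1)
y = rng.random(632) < 0.13                  # malignant vs benign labels
scores = rng.normal(0, 1, 632) + 1.5 * y    # classifier scores, higher if malignant
print(f"AUC = {auc(y, scores):.3f}")
```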

  6. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    NASA Astrophysics Data System (ADS)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduced a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to traditional methods (Fourier, for example), while allowing almost continuous tracking of both the amplitude and frequency of signals over time. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we frame our main goal. Some possible directions for future work are outlined.
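A hedged illustration of the advantage described here, using a hand-rolled complex Morlet wavelet convolution in numpy (not the observatory's pipeline): a transient oscillation buried in noise lights up only around its time of occurrence at the matching scale, which a single global Fourier spectrum cannot show.

```python
# Sketch of the time localization wavelets provide (hand-rolled complex Morlet
# CWT; synthetic data). A burst of oscillation hidden in noise produces high
# wavelet power only near its time of occurrence at the matching scale.
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet."""
    n = len(x)
    t = np.arange(-n // 2, n // 2)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        wavelet = np.exp(1j * w0 * t / s - (t / s) ** 2 / 2) / np.sqrt(s)
        out[i] = np.convolve(x, wavelet, mode="same")
    return out

rng = np.random.default_rng(2)
n = 1024
x = 0.1 * rng.normal(size=n)
x[400:600] += np.sin(2 * np.pi * np.arange(200) / 50)   # burst, period 50 samples

scales = np.array([20.0, 50.0 * 6.0 / (2 * np.pi), 200.0])  # middle scale matches burst
power = np.abs(morlet_cwt(x, scales)) ** 2
print("peak power at time index", int(power[1].argmax()))
```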

  7. Phase walk analysis of leptokurtic time series.

    PubMed

    Schreiber, Korbinian; Modest, Heike I; Räth, Christoph

    2018-06-01

    The Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are here applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series whose probability density distributions are approximated by power laws. We use the day-to-day returns of the Dow Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time compared to embedding space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can be derived and parameterized very accurately, which allows for much more precise tests of nonlinearity.
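The central construction can be sketched in a few lines (a toy version, not the authors' implementation): unwrap the Fourier phases of a series and treat them as a walk. Uncorrelated phases wander widely; a fully phase-coupled signal, here the extreme case of a delta spike whose Fourier phases are all zero, gives a flat walk.

```python
# Toy sketch of the phase walk idea: unwrap the Fourier phases and measure the
# walk they form. Synthetic contrast: Gaussian noise (uncorrelated phases)
# versus a delta spike (all phases identical, fully phase-coupled).
import numpy as np

def phase_walk_range(x):
    """Range of the walk formed by unwrapped Fourier phases (DC/Nyquist dropped)."""
    phases = np.angle(np.fft.rfft(x))[1:-1]
    walk = np.unwrap(phases)
    return float(walk.max() - walk.min())

rng = np.random.default_rng(3)
noise = rng.normal(size=4096)              # linear, surrogate-like data
spike = np.zeros(4096)
spike[0] = 1.0                             # extreme phase correlation

print(phase_walk_range(noise), phase_walk_range(spike))
```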

  8. Comparative study of 2D ultrasound imaging methods in the f-k domain and evaluation of their performances in a realistic NDT configuration

    NASA Astrophysics Data System (ADS)

    Merabet, Lucas; Robert, Sébastien; Prada, Claire

    2018-04-01

    In this paper, we present two frequency-domain algorithms for 2D imaging with plane wave emissions, namely Stolt's migration and Lu's method. The theoretical background is first presented, followed by an analysis of the algorithms' complexities. The frequency-domain methods are then compared to time-domain plane wave imaging in a realistic inspection configuration where the array elements are not in contact with the specimen. Imaging of defects located far from the array aperture is assessed, and computation times for the three methods are presented as a function of the number of pixels in the reconstructed image. We show that Lu's method provides a time gain of up to a factor of 33 over the time-domain algorithm, and demonstrate the limitations of Stolt's migration for defects far from the aperture.

  9. Modular socket system versus traditionally laminated socket: a cost analysis.

    PubMed

    Normann, Elna; Olsson, Anna; Brodtkorb, Thor-Henrik

    2011-03-01

    Using the new modular socket system (MSS) to produce a prosthetic socket directly on the patient has the potential of being easier and quicker to manufacture but also incurring higher costs. The purpose of the study was to compare the costs of manufacturing a transtibial prosthetic socket using either a MSS or a standard laminated socket (PC). Concurrent controlled trial. A total of 20 patients at two orthopaedic facilities were followed with regards to the cost of manufacturing a prosthetic socket using either MSS or PC. Time aspects and material costs were considered in the cost analysis. Other factors studied include delivery time and number of visits. For the cost analysis, only direct costs pertaining to the prosthetic socket were considered. The total cost of MSS was found to be significantly higher (p < 0.01) compared to PC. However, the production and time cost was significantly lower. Delivery time to the patient was 1 day for MSS compared to 17 days for PC. Our study shows that the direct prosthetic cost of treating a patient using MSS is significantly higher than treating a patient using PC. However, the MSS prosthesis can be delivered significantly faster and with fewer visits. Further studies taking the full societal costs of MSS into account should therefore be performed. This study shows that the direct prosthetic cost of treating a patient with Modular Socket System is significantly higher than treating a patient with plastercasting with standard laminated socket. However, the Modular Socket System prosthesis can be delivered significantly faster and with fewer visits.

  10. A comparative analysis of localized and propagating surface plasmon resonance sensors: the binding of concanavalin a to a monosaccharide functionalized self-assembled monolayer.

    PubMed

    Yonzon, Chanda Ranjit; Jeoung, Eunhee; Zou, Shengli; Schatz, George C; Mrksich, Milan; Van Duyne, Richard P

    2004-10-06

    A comparative analysis of the properties of two optical biosensor platforms is presented: (1) the propagating surface plasmon resonance (SPR) sensor based on a planar, thin-film gold surface and (2) the localized surface plasmon resonance (LSPR) sensor based on surface-confined Ag nanoparticles fabricated by nanosphere lithography (NSL). The binding of Concanavalin A (ConA) to mannose-functionalized self-assembled monolayers (SAMs) was chosen to highlight the similarities and differences between the responses of the real-time angle shift SPR and wavelength shift LSPR biosensors. During the association phase in the real-time binding studies, both SPR and LSPR sensors exhibited qualitatively similar signal vs time curves. However, in the dissociation phase, the SPR sensor showed an approximately 5 times greater loss of signal than the LSPR sensor. A comprehensive set of nonspecific binding studies demonstrated that this signal difference was not the consequence of greater nonspecific binding to the LSPR sensor but rather a systematic function of the Ag nanoparticles' nanoscale structure. Ag nanoparticles with larger aspect ratios showed larger dissociation phase responses than those with smaller aspect ratios. A theoretical analysis based on finite element electrodynamics demonstrates that this results from the characteristic decay length of the electromagnetic fields surrounding Ag nanoparticles being of comparable dimensions to the ConA molecules. Finally, an elementary (2 x 1) multiplexed version of an LSPR carbohydrate sensing chip to probe the simultaneous binding of ConA to mannose- and galactose-functionalized SAMs has been demonstrated.

  11. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems

    PubMed Central

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Naik, Saraswathi V

    2016-01-01

    ABSTRACT Background: Primary root canals are considered to be most challenging due to their complex anatomy. "Wave one" and "one shape" are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of wave one and one shape files in primary root canals using a cone beam computed tomographic (CBCT) analysis. Study design: This is an experimental, in vitro study comparing the two groups. Materials and methods: A total of 24 extracted human primary teeth with minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation time were evaluated for each group. Results: A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave one showed less canal transportation as compared with one shape, and the mean instrumentation time of wave one was significantly less than that of one shape. Conclusion: The reciprocating single-file system was found to be faster, with far fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49. PMID:27274155

  12. Evaluation of a New Digital Automated Glycemic Pattern Detection Tool.

    PubMed

    Comellas, María José; Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg

    2017-11-01

    Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. Such tools are emerging in broad variety, often without formal evaluation. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying areas of improvement and potential safety risks early in the tool's development and implementation. In a multicenter web-based evaluation, 37 endocrinologists were asked to assess glycemic patterns in 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement on the presence or absence of defined patterns. The eDetecta module markedly reduced the time taken to analyze each case compared with nonautomated review of the emminens eConecta reports (CSII: 18 min; MDI: 12.5 min). Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement on patterns of glycemic variability. Further analysis of the low-agreement patterns identified areas where the algorithms used could be improved to optimize trend pattern identification. eDetecta was a useful tool for glycemic pattern detection, helping clinicians reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study.

  13. Comparison of helicopter and ground emergency medical service: a retrospective analysis of a German rescue helicopter base.

    PubMed

    Mommsen, Philipp; Bradt, Nikolas; Zeckey, Christian; Andruszkow, Hagen; Petri, Max; Frink, Michael; Hildebrand, Frank; Krettek, Christian; Probst, Christian

    2012-01-01

    In consideration of rising cost pressure in the German health care system, the usefulness of helicopter emergency medical service (HEMS) in terms of time- and cost-effectiveness is controversial. The aim of the present study was to investigate whether HEMS is associated with significantly decreased arrival and transportation times compared to ground EMS. In a retrospective study, we evaluated 1,548 primary emergency missions for time-sensitive diagnoses (multiple trauma, traumatic brain and burn injury, heart attack, stroke, and pediatric emergency) performed by a German HEMS using the medical database, NADIN, of the German Air Rescue Service. Arrival and transportation times were compared to calculated ground EMS times. HEMS showed significantly reduced arrival times at the scene in cases of heart attack, stroke and pediatric emergencies. In contrast, HEMS and ground EMS showed comparable arrival times in patients with multiple trauma, traumatic brain and burn injury due to an increased flight distance. HEMS showed a significantly decreased transportation time to the closest centre capable of specialist care in all diagnosis groups (p<0.001). The results of the present study indicate the time-effectiveness of German air ambulance services, with significantly decreased transportation times.

  14. Elementary Students' Retention of Environmental Science Knowledge: Connected Science Instruction versus Direct Instruction

    ERIC Educational Resources Information Center

    Upadhyay, Bhaskar; DeFranco, Cristina

    2008-01-01

    This study compares 3rd-grade elementary students' gain and retention of science vocabulary over time in two different classes--"connected science instruction" versus "direct instruction." Data analysis yielded that students who received connected science instruction showed less gain in science knowledge in the short term compared to students who…

  15. Dependability and performability analysis

    NASA Technical Reports Server (NTRS)

    Trivedi, Kishor S.; Ciardo, Gianfranco; Malhotra, Manish; Sahner, Robin A.

    1993-01-01

    Several practical issues regarding the specification and solution of dependability and performability models are discussed. Model types with and without rewards are compared. Continuous-time Markov chains (CTMCs) are compared with (continuous-time) Markov reward models (MRMs), and generalized stochastic Petri nets (GSPNs) are compared with stochastic reward nets (SRNs). It is shown that reward-based models can lead to more concise model specifications and to the solution of a variety of new measures. With respect to the solution of dependability and performability models, three practical issues are identified: largeness, stiffness, and non-exponentiality; a variety of approaches to deal with them are discussed, including some of the latest research efforts.
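As a concrete instance of the simplest model family mentioned here, a two-state CTMC (system up or down) with hypothetical failure and repair rates yields steady-state availability by solving pi Q = 0 with the normalization sum(pi) = 1:

```python
# Two-state availability model (up/down) as a continuous-time Markov chain.
# Failure and repair rates are hypothetical; steady state solves pi Q = 0.
import numpy as np

lam, mu = 0.001, 0.1            # failures/hour, repairs/hour (invented)
Q = np.array([[-lam, lam],      # generator: state 0 = up, state 1 = down
              [mu, -mu]])

# Replace one balance equation with the normalization sum(pi) = 1
A = np.vstack([Q.T[:-1], np.ones(2)])
b = np.array([0.0, 1.0])
pi = np.linalg.solve(A, b)
print(f"steady-state availability = {pi[0]:.6f}")   # analytically mu/(lam+mu)
```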

  16. Maternal Voice and Short-Term Outcomes in Preterm Infants

    PubMed Central

    Krueger, Charlene; Parker, Leslie; Chiu, Sheau-Huey; Theriaque, Douglas

    2013-01-01

    This study explored effects of exposure to maternal voice on short-term outcomes in very low birth weight preterm infants cared for within a neonatal intensive care unit (NICU) without an ongoing program of developmental care. Using a comparative design, 53 infants born during their 27th to 28th postmenstrual week were sampled by convenience. Experimental groups were exposed to maternal voice during two developmental time periods. Group 1 listened to a recording of their mothers reciting a rhyme from 28 to 34 postmenstrual weeks. Group 2 waited 4 weeks and heard the recording from 32 to 34 weeks. The control group received routine care. The primary analysis, combining the experimental groups, revealed that the experimental infants experienced significantly fewer episodes of feeding intolerance and achieved full enteral feeds more quickly than the control group. Further, in an analysis evaluating all three groups separately, Group 1 experienced significantly fewer episodes of feeding intolerance compared to the control group. Study findings warrant further investigation of exposure to maternal voice and the developmental timing at which exposure is begun. PMID:20112262

  17. Studies of mobile dust in scrape-off layer plasmas using silica aerogel collectors

    NASA Astrophysics Data System (ADS)

    Bergsåker, H.; Ratynskaia, S.; Litnovsky, A.; Ogata, D.; Sahle, W.

    2011-08-01

    Dust capture with ultralow density silica aerogel collectors is a new method, which allows time resolved in situ capture of dust particles in the scrape-off layers of fusion devices, without substantially damaging the particles. Particle composition and morphology, particle flux densities and particle velocity distributions can be determined through appropriate analysis of the aerogel surfaces after exposure. The method has been applied in comparative studies of intrinsic dust in the TEXTOR tokamak and in the Extrap T2R reversed field pinch. The analysis methods have been mainly optical microscopy and SEM. The method is shown to be applicable in both devices and the results are tentatively compared between the two plasma devices, which are very different in terms of edge plasma conditions, time scale, geometry and wall materials.

  18. Evaluation of methodology for the analysis of 'time-to-event' data in pharmacogenomic genome-wide association studies.

    PubMed

    Syed, Hamzah; Jorgensen, Andrea L; Morris, Andrew P

    2016-06-01

    To evaluate the power to detect associations between SNPs and time-to-event outcomes across a range of pharmacogenomic study designs while comparing alternative regression approaches. Simulations were conducted to compare Cox proportional hazards modeling accounting for censoring and logistic regression modeling of a dichotomized outcome at the end of the study. The Cox proportional hazards model was demonstrated to be more powerful than the logistic regression analysis. The difference in power between the approaches was highly dependent on the rate of censoring. Initial evaluation of single-nucleotide polymorphism association signals using computationally efficient software with dichotomized outcomes provides an effective screening tool for some design scenarios, and thus has important implications for the development of analytical protocols in pharmacogenomic studies.
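The design being compared can be sketched with a toy simulation (numpy only; all rates are invented, and a z-test on the log rate ratio under exponential survival stands in for the Cox model). With only administrative censoring at the study end, the two strategies land close together in power; as the abstract notes, the gap between them depends strongly on the censoring rate.

```python
# Toy simulation: two genotype groups with different event hazards followed to
# an administrative study end. "Time-to-event" uses events and follow-up time
# (an exponential-rate z-test standing in for Cox); "dichotomized" tests the
# proportion with an event by study end. All parameters are invented.
import numpy as np

rng = np.random.default_rng(4)
n, study_end, reps = 200, 2.0, 500
lam0, lam1 = 0.35, 0.60          # per-unit-time hazards in the two groups

hits_surv = hits_bin = 0
for _ in range(reps):
    t0 = rng.exponential(1 / lam0, n)
    t1 = rng.exponential(1 / lam1, n)
    ev0, ev1 = t0 <= study_end, t1 <= study_end      # event indicators
    fu0, fu1 = np.minimum(t0, study_end), np.minimum(t1, study_end)  # follow-up

    # time-to-event: z-test on the log rate ratio, SE = sqrt(1/d1 + 1/d0)
    z_s = (np.log(ev1.sum() / fu1.sum()) - np.log(ev0.sum() / fu0.sum())) \
        / np.sqrt(1 / ev1.sum() + 1 / ev0.sum())
    hits_surv += abs(z_s) > 1.96

    # dichotomized: two-proportion z-test on "event by study end"
    p = (ev0.sum() + ev1.sum()) / (2 * n)
    z_b = (ev1.mean() - ev0.mean()) / np.sqrt(2 * p * (1 - p) / n)
    hits_bin += abs(z_b) > 1.96

print(f"power, time-to-event: {hits_surv/reps:.2f}; dichotomized: {hits_bin/reps:.2f}")
```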

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shoaf, S.; APS Engineering Support Division

    A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.

  20. SeaQuaKE: Sea-optimized Quantum Key Exchange

    DTIC Science & Technology

    2015-01-01

    of photon pairs in both polarization [3] and time-bin [4] degrees of freedom simultaneously. Entanglement analysis components in both the... greater throughput per entangled photon pair compared to alternative sources that encode in only a... [Figure residue from a hyperentangled photon-pair source diagram: polarization entanglement and pair generation; wavelength availability, power, pulse rate; time-bin mux; waveguide options.]

  1. Root cause analysis of laboratory turnaround times for patients in the emergency department.

    PubMed

    Fernandes, Christopher M B; Worster, Andrew; Hill, Stephen; McCallum, Catherine; Eva, Kevin

    2004-03-01

    Laboratory investigations are essential to patient care and are conducted routinely in emergency departments (EDs). This study reports the turnaround times (TATs) at an academic, tertiary care ED, using root cause analysis to identify potential areas of improvement. Our objectives were to compare the laboratory turnaround times with established benchmarks and identify root causes for delays. Turnaround and process event times for a consecutive sample of hemoglobin and potassium measurements were recorded during an 8-day study period using synchronized time stamps. A log transformation (ln [minutes + 1]) was performed to normalize the time data, which were then compared with established benchmarks using one-sample t tests. The turnaround time for hemoglobin was significantly less than the established benchmark (n = 140, t = -5.69, p < 0.001) and that of potassium was significantly greater (n = 121, t = 12.65, p < 0.001). The hemolysis rate was 5.8%, with 0.017% of samples needing recollection. Causes of delays included order-processing time, a high proportion (43%) of tests performed on patients who had been admitted but were still in the ED waiting for a bed, and excessive laboratory process times for potassium. The turnaround time for hemoglobin (18 min) met the established benchmark, but that for potassium (49 min) did not. Root causes for delay were order-processing time, excessive queue and instrument times for potassium, and the volume of tests for admitted patients. Further study of these identified causes of delays is required to see whether laboratory TATs can be reduced.
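The transformation-and-test step described above can be sketched as follows; the turnaround times and the 20-minute benchmark here are synthetic stand-ins, not the study's data.

```python
# Sketch of the analysis step: turnaround times are right-skewed, so the
# normalizing transformation ln(minutes + 1) is applied before a one-sample
# t test against a benchmark. Times and benchmark are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
tat = rng.lognormal(mean=np.log(45), sigma=0.4, size=121)   # minutes, skewed
benchmark = 20.0                                            # hypothetical target

log_tat = np.log(tat + 1)                    # the normalizing transformation
t, p = stats.ttest_1samp(log_tat, np.log(benchmark + 1))
print(f"t = {t:.2f}, p = {p:.3g}")           # t > 0 means slower than benchmark
```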

  2. Evaluation of Hydrologic and Meteorological Impacts on Dengue Fever Incidences in Southern Taiwan using Time-Frequency Method

    NASA Astrophysics Data System (ADS)

    Tsai, Christina; Yeh, Ting-Gu

    2017-04-01

    Extreme weather events are occurring more frequently as a result of climate change. Recently, dengue fever has become a serious issue in southern Taiwan, and it may have characteristic temporal scales that can be identified. Some researchers have hypothesized that dengue fever incidence is related to climate change. This study applies time-frequency analysis to time series data on dengue fever and on hydrologic and meteorological variables. Results of three time-frequency analytical methods - the Hilbert Huang transform (HHT), the Wavelet Transform (WT) and the Short Time Fourier Transform (STFT) - are compared and discussed. A more effective time-frequency analysis method will be identified to analyze relevant time series data. The most influential time scales of hydrologic and meteorological variables associated with dengue fever are determined. Finally, the linkage between hydrologic/meteorological factors and dengue fever incidence can be established.
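Of the three methods named, the STFT is the simplest to sketch. Here a synthetic series whose dominant period halves midway stands in for incidence data (no real dengue data is used); the STFT recovers the change of dominant frequency over time, which a single global spectrum would blur.

```python
# Sketch (synthetic series, not dengue data): the Short Time Fourier Transform
# resolves a change in dominant period over time.
import numpy as np
from scipy.signal import stft

t = np.arange(512)                             # e.g. weekly samples
x = np.where(t < 256,
             np.sin(2 * np.pi * t / 52),       # annual-like cycle first
             np.sin(2 * np.pi * t / 26))       # then a semi-annual-like cycle

f, seg_times, Z = stft(x, fs=1.0, nperseg=128)
dominant = f[np.abs(Z).argmax(axis=0)]         # peak frequency per time segment
print(np.round(dominant, 4))
```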

  3. Managing Complexity in Evidence Analysis: A Worked Example in Pediatric Weight Management.

    PubMed

    Parrott, James Scott; Henry, Beverly; Thompson, Kyle L; Ziegler, Jane; Handu, Deepa

    2018-05-02

    Nutrition interventions are often complex and multicomponent. Typical approaches to meta-analyses that focus on individual causal relationships to provide guideline recommendations are not sufficient to capture this complexity. The objective of this study is to describe the method of meta-analysis used for the Pediatric Weight Management (PWM) Guidelines update and provide a worked example that can be applied in other areas of dietetics practice. The effects of PWM interventions were examined for body mass index (BMI), body mass index z-score (BMIZ), and waist circumference at four different time periods. For intervention-level effects, intervention types were identified empirically using multiple correspondence analysis paired with cluster analysis. Pooled effects of identified types were examined using random effects meta-analysis models. Differences in effects among types were examined using meta-regression. Context-level effects are examined using qualitative comparative analysis. Three distinct types (or families) of PWM interventions were identified: medical nutrition, behavioral, and missing components. Medical nutrition and behavioral types showed statistically significant improvements in BMIZ across all time points. Results were less consistent for BMI and waist circumference, although four distinct patterns of weight status change were identified. These varied by intervention type as well as outcome measure. Meta-regression indicated statistically significant differences between the medical nutrition and behavioral types vs the missing component type for both BMIZ and BMI, although the pattern varied by time period and intervention type. Qualitative comparative analysis identified distinct configurations of context characteristics at each time point that were consistent with positive outcomes among the intervention types. 
Although analysis of individual causal relationships is invaluable, this approach is inadequate to capture the complexity of dietetics practice. An alternative approach that integrates intervention-level with context-level meta-analyses may provide deeper understanding in the development of practice guidelines. Copyright © 2018 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  4. System identification of an unmanned quadcopter system using MRAN neural

    NASA Astrophysics Data System (ADS)

    Pairan, M. F.; Shamsudin, S. S.

    2017-12-01

    This project presents a performance analysis of a radial basis function (RBF) neural network trained with the Minimal Resource Allocating Network (MRAN) algorithm for real-time identification of a quadcopter. MRAN's performance is compared with that of an RBF network trained with the Constant Trace algorithm on 2500 input-output data pairs. MRAN uses an adding-and-pruning hidden-neuron strategy to obtain an optimal RBF structure, increase prediction accuracy, and reduce training time. The results indicate that the MRAN algorithm produces faster training and more accurate prediction than the standard RBF. The model proposed in this paper is capable of identifying and modelling a nonlinear representation of the quadcopter flight dynamics.
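What an RBF predictor computes can be shown in toy form. This sketch fits fixed-center Gaussian hidden units by least squares on invented 1-D data; it does not reproduce MRAN's online add/prune logic, only the model family being trained.

```python
# Toy RBF network (invented data): Gaussian hidden units with fixed centers,
# output weights fit by least squares. MRAN's online add/prune of neurons is
# not reproduced; this only shows what the predictor computes.
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian hidden-layer activations for 1-D inputs x."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

rng = np.random.default_rng(6)
x = np.linspace(-3, 3, 200)
y = np.sin(x) + 0.05 * rng.normal(size=x.size)   # stand-in "dynamics" to identify

centers = np.linspace(-3, 3, 12)                 # fixed; MRAN would adapt these
H = rbf_design(x, centers, width=0.6)
weights, *_ = np.linalg.lstsq(H, y, rcond=None)  # output-layer weights

rmse = float(np.sqrt(np.mean((H @ weights - y) ** 2)))
print(f"training RMSE = {rmse:.3f}")
```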

  5. Cost-Effectiveness Analysis of the Automation of a Circulation System.

    ERIC Educational Resources Information Center

    Mosley, Isobel

    A general methodology for cost effectiveness analysis was developed and applied to the Colorado State University library loan desk. The cost effectiveness of the existing semi-automated circulation system was compared with that of a fully manual one, based on the existing manual subsystem. Faculty users' time and computer operating costs were…

  6. The Economics of [Not] Closing Small Rural Schools.

    ERIC Educational Resources Information Center

    Witham, Mark

    This paper presents a preliminary analysis of the comparative costs and benefits of closing small rural schools in South Australia. The cost analysis includes accounting for the use of staff, goods, and services; distance education support; land and buildings; and the opportunity cost of children's bus travel time. The assumption that children's…

  7. The Arabs: Perception/Misperception. A Comparative View, Experimental Version.

    ERIC Educational Resources Information Center

    Otero, George G.

    In this unit, high-school students identify and evaluate their own images of the Arabs and begin to develop more accurate perceptions of the Arabs through data analysis. Activities emphasize social studies skills, such as mapmaking and reading, use of time lines and the concept of chronology, and data collection and analysis. Students compare…

  8. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with that of other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.

  9. Radiofrequency Ablation versus Cryoablation in the Treatment of Paroxysmal Atrial Fibrillation: A Meta-Analysis

    PubMed Central

    Hachem, Ali H.; Marine, Joseph E.; Tahboub, Housam A.; Kamdar, Sana; Kanjwal, Shaffi; Soni, Ronak

    2018-01-01

Background: Pulmonary vein isolation is commonly performed using radiofrequency energy, with cryoablation gaining acceptance. We performed a meta-analysis of randomized controlled trials which compared radiofrequency versus cryoablation for patients with atrial fibrillation. Methods: A systematic search strategy identified both published and unpublished articles from inception to November 10, 2016, in multiple databases. The primary outcomes for this meta-analysis were long-term freedom from atrial fibrillation at 12-month follow-up and overall postoperative complication rates. For all included studies, the methodological quality was assessed through the Cochrane Collaboration's tool for risk of bias. Results: A total of 247 articles were identified, with eight included in this review as they satisfied the prespecified inclusion criteria. Overall, there was no significant difference in freedom from atrial fibrillation at ≥12-month follow-up between those receiving cryoballoon and radiofrequency ablation (OR = 0.98, CI = 0.67–1.43, I2 = 56%, p=0.90). Additionally, the secondary outcomes of duration of ablation, fluoroscopy time, and ablation time failed to reach significance. Cryoballoon ablation had significantly greater odds of postoperative phrenic nerve injury at 12-month follow-up. Conclusions: Our meta-analysis suggests that cryoballoon ablation provides comparable benefits with regard to freedom from atrial fibrillation at medium-term follow-up, fluoroscopy time, ablation time, operative duration, and overall complication rate in comparison to radiofrequency ablation. PMID:29805800
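The pooled odds ratio reported in such a meta-analysis is typically produced by inverse-variance weighting of log odds ratios across trials. A fixed-effect sketch with invented 2×2 counts (not the trials included in this review):

```python
import math

# Inverse-variance (fixed-effect) pooling of log odds ratios -- the core
# computation behind a pooled OR with a 95% CI. Counts are invented.
studies = [  # (events_a, total_a, events_b, total_b)
    (30, 100, 28, 100),
    (45, 150, 50, 150),
    (12, 60, 15, 60),
]

num = den = 0.0
for ea, na, eb, nb in studies:
    a, b = ea, na - ea            # group A: events / non-events
    c, d = eb, nb - eb            # group B
    log_or = math.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d   # Woolf variance of the log OR
    w = 1.0 / var                 # inverse-variance weight
    num += w * log_or
    den += w

pooled_or = math.exp(num / den)
half = 1.96 / math.sqrt(den)                      # CI half-width on log scale
lo, hi = math.exp(num/den - half), math.exp(num/den + half)
print(round(pooled_or, 2), round(lo, 2), round(hi, 2))
```

A random-effects model (e.g. DerSimonian–Laird) would add a between-study variance component to each weight; the weighting structure is otherwise the same.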

  10. Time required for motor activity in lucid dreams.

    PubMed

    Erlacher, Daniel; Schredl, Michael

    2004-12-01

    The present study investigated the relationship between the time required for specific tasks (counting and performing squats) in lucid dreams and in the waking state. Five proficient lucid dreamers (26-34 yr. old, M=29.8, SD=3.0; one woman and four men) participated. Analysis showed that the time needed for counting in a lucid dream is comparable to the time needed for counting in wakefulness, but motor activities required more time in lucid dreams than in the waking state.

  11. Children's Well-Being during Parents' Marital Disruption Process: A Pooled Time-Series Analysis.

    ERIC Educational Resources Information Center

    Sun, Yongmin; Li, Yuanzhang

    2002-01-01

Examines the extent to which parents' marital disruption process affects children's academic performance and well-being both before and after parental divorce. Compared with peers in intact families, children of divorce fared less well. Discusses how family resources mediate detrimental effects over time. Similar results are noted for girls and…

  12. Full-time Faculty and Civil Service Salaries at Illinois Colleges and Universities.

    ERIC Educational Resources Information Center

    Illinois State Board of Higher Education, Springfield.

    This report presents an analysis of weighted average salaries for full-time faculty and civil service employees at Illinois public and independent colleges and universities, and the Illinois Mathematics and Science Academy. The report includes average salaries for fiscal years 1985, 1990, and 1996-98 and compares salaries with select economic…

  13. Time and Learning Efficiency in Internet-Based Learning: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Cook, David A.; Levinson, Anthony J.; Garside, Sarah

    2010-01-01

Authors have claimed that Internet-based instruction promotes greater learning efficiency than non-computer methods. Objectives: Determine, through a systematic synthesis of evidence in health professions education, how Internet-based instruction compares with non-computer instruction in time spent learning, and what features of Internet-based…

  14. Hierarchical Spatio-temporal Visual Analysis of Cluster Evolution in Electrocorticography Data

    DOE PAGES

    Murugesan, Sugeerth; Bouchard, Kristofer; Chang, Edward; ...

    2016-10-02

Here, we present ECoG ClusterFlow, a novel interactive visual analysis tool for the exploration of high-resolution Electrocorticography (ECoG) data. Our system detects and visualizes dynamic high-level structures, such as communities, using the time-varying spatial connectivity network derived from the high-resolution ECoG data. ECoG ClusterFlow provides a multi-scale visualization of the spatio-temporal patterns underlying the time-varying communities using two views: 1) an overview summarizing the evolution of clusters over time and 2) a hierarchical glyph-based technique that uses data aggregation and small multiples techniques to visualize the propagation of clusters in their spatial domain. ECoG ClusterFlow makes it possible 1) to compare the spatio-temporal evolution patterns across various time intervals, 2) to compare the temporal information at varying levels of granularity, and 3) to investigate the evolution of spatial patterns without occluding the spatial context information. Lastly, we present case studies done in collaboration with neuroscientists on our team for both simulated and real epileptic seizure data aimed at evaluating the effectiveness of our approach.

  15. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model.

    PubMed

    Musekiwa, Alfred; Manda, Samuel O M; Mwambi, Henry G; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes where we contrast different covariance structures for dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect size and utilize a practical example involving meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results.
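The contrast this record draws, between separate univariate pools and a joint model with dependence between time points, comes down to generalized least squares with a non-diagonal covariance matrix. A sketch with invented effect sizes, a common standard error, and an assumed within-study correlation rho (none of these numbers are from the 17 trials):

```python
import numpy as np

# GLS pooling of effect sizes at 4 time points (e.g. 6, 12, 18, 24 months)
# from 2 studies, with within-study correlation rho. Illustrative values.
rho = 0.6
times = 4
studies = [
    np.array([0.20, 0.25, 0.22, 0.30]),   # study 1 effect sizes over time
    np.array([0.15, 0.18, 0.25, 0.28]),   # study 2
]
se = 0.05                                  # common standard error, for simplicity

y = np.concatenate(studies)                # stacked effect sizes
X = np.tile(np.eye(times), (len(studies), 1))   # maps y to time-point means

# Block-diagonal covariance: each study gets se^2 * R, R exchangeable.
R = np.full((times, times), rho) + (1 - rho) * np.eye(times)
V = np.kron(np.eye(len(studies)), se**2 * R)

Vinv = np.linalg.inv(V)
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print(beta)   # pooled effect at each time point
```

With identical designs and a common se, GLS reduces to the per-time-point mean; with unequal variances or missing time points, the covariance structure starts to matter, which is the point the paper makes.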

  16. Multi-spacecraft solar energetic particle analysis of FERMI gamma-ray flare events within the HESPERIA H2020 project

    NASA Astrophysics Data System (ADS)

    Tziotziou, Kostas; Malandraki, Olga; Valtonen, Eino; Heber, Bernd; Zucca, Pietro; Klein, Karl-Ludwig; Vainio, Rami; Tsiropoula, Georgia; Share, Gerald

    2017-04-01

    Multi-spacecraft observations of solar energetic particle (SEP) events are important for understanding the acceleration processes and the interplanetary propagation of particles released during eruptive events. In this work, we have carefully studied 25 gamma-ray flare events observed by FERMI and investigated possible associations with SEP-related events observed with STEREO and L1 spacecraft in the heliosphere. A data-driven velocity dispersion analysis (VDA) and Time-Shifting Analysis (TSA) are used for deriving the release times of protons and electrons at the Sun and for comparing them with the respective times stemming from the gamma-ray event analysis and their X-ray signatures, in an attempt to interconnect the SEPs and Fermi events and better understand the physics involved. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
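The VDA mentioned above assumes particles of all energies are released at the same time and travel the same path length L, so the onset time at the spacecraft is t_onset = t_release + L/v. A straight-line fit of onset time against inverse speed then yields both quantities (synthetic data, illustrative units):

```python
import numpy as np

# Velocity dispersion analysis (VDA) sketch: fit t_onset = t_release + L/v.
# The intercept is the release time, the slope is the path length.
AU_KM = 1.496e8                     # 1 AU in km
L_true = 1.2 * AU_KM                # assumed path length, km
t_release_true = 1000.0             # release time, s (arbitrary epoch)

v = np.array([0.3, 0.5, 0.7, 0.9]) * 3.0e5   # particle speeds, km/s
t_onset = t_release_true + L_true / v         # noiseless synthetic onsets

slope, intercept = np.polyfit(1.0 / v, t_onset, 1)
print(intercept, slope / AU_KM)     # release time [s], path length [AU]
```

In practice the onset times carry uncertainties and the fit residuals indicate how well the common-release, common-path assumption holds.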

  17. Direct Analysis in Real Time-Mass Spectrometry for the Rapid Detection of Metabolites of Aconite Alkaloids in Intestinal Bacteria

    NASA Astrophysics Data System (ADS)

    Li, Xue; Hou, Guangyue; Xing, Junpeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2014-12-01

In the present work, direct analysis in real time (DART) ionization combined with multi-stage tandem mass spectrometry (DART-MSn) was used to investigate the metabolic profile of aconite alkaloids in rat intestinal bacteria. A total of 36 metabolites from three aconite alkaloids were identified by using DART-MSn, and the feasibility of quantitative analysis of these analytes was examined. Key parameters of the DART ion source, such as helium gas temperature and pressure, the source-to-MS distance, and the speed of the autosampler, were optimized to achieve high sensitivity, enhance reproducibility, and reduce the occurrence of fragmentation. The instrument analysis time for one sample can be less than 10 s with this method. Compared with ESI-MS and UPLC-MS, DART-MS is more efficient for directly detecting metabolic samples, and has the advantage of being a simple, high-speed, high-throughput method.

  18. Direct analysis in real time-mass spectrometry for the rapid detection of metabolites of aconite alkaloids in intestinal bacteria.

    PubMed

    Li, Xue; Hou, Guangyue; Xing, Junpeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2014-12-01

In the present work, direct analysis in real time (DART) ionization combined with multi-stage tandem mass spectrometry (DART-MS(n)) was used to investigate the metabolic profile of aconite alkaloids in rat intestinal bacteria. A total of 36 metabolites from three aconite alkaloids were identified by using DART-MS(n), and the feasibility of quantitative analysis of these analytes was examined. Key parameters of the DART ion source, such as helium gas temperature and pressure, the source-to-MS distance, and the speed of the autosampler, were optimized to achieve high sensitivity, enhance reproducibility, and reduce the occurrence of fragmentation. The instrument analysis time for one sample can be less than 10 s with this method. Compared with ESI-MS and UPLC-MS, DART-MS is more efficient for directly detecting metabolic samples, and has the advantage of being a simple, high-speed, high-throughput method.

  19. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals.

    PubMed

    Hedayatifar, L; Vahabi, M; Jafari, G R

    2011-08-01

When many variables are coupled to each other, studying a single one in isolation cannot give thorough and precise information. When the time series involved are stationary, different methods of random matrix analysis and complex networks can be used. In nonstationary cases, however, the multifractal detrended cross-correlation analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we extend MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated with each other. We calculate the multifractal properties of the coupled time series, and by comparing the CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which the series are coupled to each other. We illustrate the method with selected examples from air pollution and foreign exchange rates.

  20. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals

    NASA Astrophysics Data System (ADS)

    Hedayatifar, L.; Vahabi, M.; Jafari, G. R.

    2011-08-01

When many variables are coupled to each other, studying a single one in isolation cannot give thorough and precise information. When the time series involved are stationary, different methods of random matrix analysis and complex networks can be used. In nonstationary cases, however, the multifractal detrended cross-correlation analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we extend MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated with each other. We calculate the multifractal properties of the coupled time series, and by comparing the CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which the series are coupled to each other. We illustrate the method with selected examples from air pollution and foreign exchange rates.

  1. Detecting a periodic signal in the terrestrial cratering record

    NASA Technical Reports Server (NTRS)

    Grieve, Richard A. F.; Rupert, James D.; Goodacre, Alan K.; Sharpton, Virgil L.

    1988-01-01

    A time-series analysis of model periodic data, where the period and phase are known, has been performed in order to investigate whether a significant period can be detected consistently from a mix of random and periodic impacts. Special attention is given to the effect of age uncertainties and random ages in the detection of a periodic signal. An equivalent analysis is performed with observed data on crater ages and compared with the model data, and the effects of the temporal distribution of crater ages on the results from the time-series analysis are studied. Evidence for a consistent 30-m.y. period is found to be weak.
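A generic way to test a trial period against a list of event ages, in the spirit of the periodicity search described above, is to fold the ages modulo the period and measure phase clustering with the Rayleigh statistic (this is a standard technique, not necessarily the authors' exact method; the ages below are synthetic):

```python
import math

# Phase-folding periodicity test: fold each event age modulo a trial
# period P and measure clustering of the phases with the Rayleigh
# statistic R = |sum of unit phasors| / N.
# R near 1 => strong periodicity; R near 0 => no clustering at P.

def rayleigh(ages, period):
    n = len(ages)
    c = sum(math.cos(2 * math.pi * a / period) for a in ages)
    s = sum(math.sin(2 * math.pi * a / period) for a in ages)
    return math.sqrt(c * c + s * s) / n

# Synthetic "crater ages" (Myr): near-multiples of 30 with small jitter.
jitter = [0.4, -0.7, 1.1, 0.2, -0.9, 0.6, -0.2, 0.8]
periodic = [30.0 * k + j for k, j in enumerate(jitter)]
# Ages spread uniformly in phase at the same trial period.
unclustered = [0.0, 7.5, 15.0, 22.5]

print(rayleigh(periodic, 30.0))      # close to 1
print(rayleigh(unclustered, 30.0))   # essentially 0
```

Age uncertainties, the paper's central concern, smear the folded phases and pull R toward the random-ages value, which is why the 30-m.y. signal is hard to detect consistently.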

  2. Estimating short-period dynamics using an extended Kalman filter

    NASA Technical Reports Server (NTRS)

    Bauer, Jeffrey E.; Andrisani, Dominick

    1990-01-01

    An extended Kalman filter (EKF) is used to estimate the parameters of a low-order model from aircraft transient response data. The low-order model is a state space model derived from the short-period approximation of the longitudinal aircraft dynamics. The model corresponds to the pitch rate to stick force transfer function currently used in flying qualities analysis. Because of the model chosen, handling qualities information is also obtained. The parameters are estimated from flight data as well as from a six-degree-of-freedom, nonlinear simulation of the aircraft. These two estimates are then compared and the discrepancies noted. The low-order model is able to satisfactorily match both flight data and simulation data from a high-order computer simulation. The parameters obtained from the EKF analysis of flight data are compared to those obtained using frequency response analysis of the flight data. Time delays and damping ratios are compared and are in agreement. This technique demonstrates the potential to determine, in near real time, the extent of differences between computer models and the actual aircraft. Precise knowledge of these differences can help to determine the flying qualities of a test aircraft and lead to more efficient envelope expansion.
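The parameter-estimation idea, augmenting the state with unknown model parameters so an EKF can estimate them from response data, can be sketched on a toy first-order system (this is a stand-in for the short-period model, with invented dynamics and noise levels):

```python
import numpy as np

# EKF parameter estimation sketch: x[k+1] = a*x[k] + u with unknown 'a'.
# Augment the state to s = [x, a]; the EKF estimates both from noisy
# measurements z = x + v. Toy numbers, not aircraft data.
rng = np.random.default_rng(0)
a_true, u = 0.9, 1.0
xs = [0.0]
for _ in range(200):
    xs.append(a_true * xs[-1] + u)
zs = np.array(xs[1:]) + rng.normal(0, 0.1, 200)   # noisy measurements

s = np.array([0.0, 0.5])           # initial guesses for x and a
P = np.diag([1.0, 1.0])            # state covariance
Q = np.diag([1e-4, 1e-6])          # small process noise keeps the filter alive
Rm = 0.1 ** 2                      # measurement noise variance

for z in zs:
    # Predict: f(s) = [a*x + u, a]; Jacobian F = [[a, x], [0, 1]].
    F = np.array([[s[1], s[0]], [0.0, 1.0]])
    s = np.array([s[1] * s[0] + u, s[1]])
    P = F @ P @ F.T + Q
    # Update with z = H s + v, H = [1, 0].
    H = np.array([1.0, 0.0])
    S = H @ P @ H + Rm
    K = P @ H / S
    s = s + K * (z - s[0])
    P = P - np.outer(K, H @ P)

print(s[1])   # estimated a (true value 0.9)
```

The aircraft case is the same pattern with the short-period state-space model in place of the scalar recursion and the unknown stability derivatives in the augmented state.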

  3. Statistical analysis of flight times for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Graves, M. E.; Perlmutter, M.

    1974-01-01

    Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
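The Markov-chain-with-persistence idea can be sketched by letting daily weather follow a two-state chain and simulating trip durations by Monte Carlo (the transition probabilities, the leg count, and the restart rule are all illustrative assumptions, not the study's constraints):

```python
import random

# Monte Carlo sketch of weather-constrained trip duration: daily weather
# is a two-state Markov chain (persistence), and the trip needs LEGS
# consecutive good days. Departure day is assumed good. Toy numbers.
P = {"good": {"good": 0.8, "bad": 0.2},
     "bad":  {"good": 0.4, "bad": 0.6}}
LEGS = 3

def trip_days(rng):
    state, done, days = "good", 0, 0
    while done < LEGS:
        days += 1
        if state == "good":
            done += 1
        else:
            done = 0              # bad day: restart the streak (simplified rule)
        state = "good" if rng.random() < P[state]["good"] else "bad"
    return days

rng = random.Random(42)
samples = [trip_days(rng) for _ in range(10_000)]
print(sum(samples) / len(samples))   # mean trip duration in days
```

The full Markov-chain treatment computes the same distribution analytically from the transition matrix; Monte Carlo gives it by sampling, which is why the paper can check one method against the other.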

  4. Head movement compensation in real-time magnetoencephalographic recordings.

    PubMed

    Little, Graham; Boe, Shaun; Bardouille, Timothy

    2014-01-01

Neurofeedback- and brain-computer interface (BCI)-based interventions can be implemented using real-time analysis of magnetoencephalographic (MEG) recordings. Head movement during MEG recordings, however, can lead to inaccurate estimates of brain activity, reducing the efficacy of the intervention. Most real-time applications in MEG have utilized analyses that do not correct for head movement. Effective means of correcting for head movement are needed to optimize the use of MEG in such applications. Here we provide preliminary validation of a novel analysis technique, real-time source estimation (rtSE), that measures head movement and generates corrected current source time course estimates in real-time. rtSE was applied while recording a calibrated phantom to determine phantom position localization accuracy and source amplitude estimation accuracy under stationary and moving conditions. Results were compared to off-line analysis methods to assess the validity of the rtSE technique. The rtSE method allowed for accurate estimation of current source activity at the source level in real time, and accounted for movement of the source due to changes in phantom position. The rtSE technique requires modifications and specialized analysis of the following MEG workflow steps:

    • Data acquisition
    • Head position estimation
    • Source localization
    • Real-time source estimation

    This work explains the technical details and validates each of these steps.

  5. Some stylized facts of the Bitcoin market

    NASA Astrophysics Data System (ADS)

    Bariviera, Aurelio F.; Basgall, María José; Hasperué, Waldo; Naiouf, Marcelo

    2017-10-01

In recent years a new type of tradable asset has appeared, generically known as cryptocurrencies. Among them, the most widespread is Bitcoin. Given its novelty, this paper investigates some statistical properties of the Bitcoin market. This study compares the dynamics of Bitcoin and standard currencies and focuses on the analysis of returns at different time scales. We test the presence of long memory in return time series from 2011 to 2017, using transaction data from one Bitcoin platform. We compute the Hurst exponent by means of the Detrended Fluctuation Analysis method, using a sliding window in order to measure long-range dependence. We detect that the Hurst exponent changes significantly during the first years of existence of Bitcoin, tending to stabilize in recent times. Additionally, multiscale analysis shows a similar behavior of the Hurst exponent, implying a self-similar process.
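The DFA-based Hurst estimate used in this record can be sketched in a few lines: integrate the series, detrend it in windows of increasing size, and read the Hurst exponent off the log-log slope of the fluctuation function (a minimal sketch, not the authors' exact windowing or sliding-window scheme):

```python
import numpy as np

# Minimal detrended fluctuation analysis (DFA-1): H is the slope of
# log F(n) vs log n, where F(n) is the RMS fluctuation around linear
# trends in windows of size n. White noise should give H ~ 0.5.

def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fs = []
    for n in scales:
        m = len(y) // n
        f2 = []
        for i in range(m):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fs.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(fs), 1)
    return slope

rng = np.random.default_rng(0)
print(dfa_hurst(rng.standard_normal(4096)))    # uncorrelated returns: H ~ 0.5
```

Running this inside a sliding window over the return series, as the paper does, traces how H drifts over time; H persistently above 0.5 indicates long-range dependence.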

  6. Physiological demands of women's rugby union: time-motion analysis and heart rate response.

    PubMed

    Virr, Jody Lynn; Game, Alex; Bell, Gordon John; Syrotuik, Daniel

    2014-01-01

The aim of this study was to determine the physical demands of women's rugby union match play using time-motion analysis and heart rate (HR) response. Thirty-eight premier club level female rugby players, aged 18-34 years, were videotaped and HRs monitored for a full match. Performances were coded into 12 different movement categories: 5 speeds of locomotion (standing, walking, jogging, striding, sprinting), 4 forms of intensive non-running exertion (ruck/maul/tackle, pack down, scrum, lift) and 3 discrete activities (kick, jump, open field tackle). The main results revealed that backs spend significantly more time sprinting and walking whereas forwards spend more time in intensive non-running exertion and jogging. Forwards also had a significantly higher total work frequency compared to the backs, as well as a higher total rest frequency. In terms of HR responses, forwards displayed higher mean HRs throughout the match and more time above 80% of their maximum HR than backs. In summary, women's rugby union is characterised by intermittent bursts of high-intensity activity, where forwards and backs have similar anaerobic energy demands, but different specific match demands.

  7. Time to angiographic reperfusion in acute ischemic stroke: decision analysis.

    PubMed

    Vagal, Achala S; Khatri, Pooja; Broderick, Joseph P; Tomsick, Thomas A; Yeatts, Sharon D; Eckman, Mark H

    2014-12-01

    Our objective was to use decision analytic modeling to compare 2 treatment strategies of intravenous recombinant tissue-type plasminogen activator (r-tPA) alone versus combined intravenous r-tPA/endovascular therapy in a subgroup of patients with large vessel (internal carotid artery terminus, M1, and M2) occlusion based on varying times to angiographic reperfusion and varying rates of reperfusion. We developed a decision model using Interventional Management of Stroke (IMS) III trial data and comprehensive literature review. We performed 1-way sensitivity analyses for time to reperfusion and 2-way sensitivity for time to reperfusion and rate of reperfusion success. We also performed probabilistic sensitivity analyses to address uncertainty in total time to reperfusion for the endovascular approach. In the base case, endovascular approach yielded a higher expected utility (6.38 quality-adjusted life years) than the intravenous-only arm (5.42 quality-adjusted life years). One-way sensitivity analyses demonstrated superiority of endovascular treatment to intravenous-only arm unless time to reperfusion exceeded 347 minutes. Two-way sensitivity analysis demonstrated that endovascular treatment was preferred when probability of reperfusion is high and time to reperfusion is small. Probabilistic sensitivity results demonstrated an average gain for endovascular therapy of 0.76 quality-adjusted life years (SD 0.82) compared with the intravenous-only approach. In our post hoc model with its underlying limitations, endovascular therapy after intravenous r-tPA is the preferred treatment as compared with intravenous r-tPA alone. However, if time to reperfusion exceeds 347 minutes, intravenous r-tPA alone is the recommended strategy. This warrants validation in a randomized, prospective trial among patients with large vessel occlusions. © 2014 American Heart Association, Inc.

  8. Greek Children Living in Rural Areas Are Heavier but Fitter Compared to Their Urban Counterparts: A Comparative, Time-Series (1997-2008) Analysis

    ERIC Educational Resources Information Center

    Tambalis, Konstantinos D.; Panagiotakos, Demosthenes B.; Sidossis, Labros S.

    2011-01-01

    Purpose: To compare 12-year (1997-2008) trends in the distribution of Body Mass Index (BMI) status and physical fitness test performances among 8- to 9-year-old Greek children living in rural and urban areas. Methods: Population data derived from 11 national school-based health surveys conducted from 1997 to 2008. Anthropometric measurements and…

  9. Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun; Niu, Hongli

    2015-10-01

    Based on the epidemic dynamical system, we construct a new agent-based financial time series model. In order to check and testify its rationality, we compare the statistical properties of the time series model with the real stock market indices, Shanghai Stock Exchange Composite Index and Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine the multi-parameter analysis with the tail distribution analysis, the modified rescaled range analysis, and the multifractal detrended fluctuation analysis. For a better perspective, the three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in the real returns and the proposed model. Therefore, the new agent-based financial model can recurrence some important features of real stock markets.

  10. Assessment of the antidandruff activity of a new shampoo: a randomized, double-blind, controlled study by clinical and instrumental evaluations.

    PubMed

    Sparavigna, Adele; Setaro, Michele; Caserini, Maurizio; Bulgheroni, Anna

    2013-01-01

    The aim of this randomized, double-blind, controlled study was to evaluate the antidandruff activity exerted by a new shampoo on patients affected by dandruff and/or mild seborrheic dermatitis by means of both D-squame technique coupled with image analysis and clinical assessments. Thirty-four patients were enrolled and 1:1 randomly assigned to either a test shampoo or a comparative shampoo group. Treatment schedule was twice a week for 4 weeks. The D-squame technique was shown to be able to objectively record variations in scalp desquamation both between test and comparative groups and within the same group over time. The results obtained with this instrumental approach showed a statistically significant reduction by 52% vs baseline after 2 weeks of treatment. There was an even greater reduction after 4 weeks (-66%). This reduction was statistically significant compared with the comparative group at the same time points. The analysis of all the other parameters (except Wood's lamp) confirmed the superiority of the test vs the comparative shampoo. The test shampoo proved to be safe, well tolerated, and accepted by the patients for cosmetic acceptability and efficacy. The study confirmed the antidandruff efficacy of the test shampoo and its superiority vs the comparative shampoo.

  11. Validation of Methods to Control for Immortal Time Bias in a Pharmacoepidemiologic Analysis of Renin–Angiotensin System Inhibitors in Type 2 Diabetes

    PubMed Central

    Yang, Xilin; Kong, Alice PS; Luk, Andrea OY; Ozaki, Risa; Ko, Gary TC; Ma, Ronald CW; Chan, Juliana CN; So, Wing Yee

    2014-01-01

Background: Pharmacoepidemiologic analysis can confirm whether drug efficacy in a randomized controlled trial (RCT) translates to effectiveness in real settings. We examined methods used to control for immortal time bias in an analysis of renin–angiotensin system (RAS) inhibitors as the reference cardioprotective drug. Methods: We analyzed data from 3928 patients with type 2 diabetes who were recruited into the Hong Kong Diabetes Registry between 1996 and 2005 and followed up to July 30, 2005. Different Cox models were used to obtain hazard ratios (HRs) for cardiovascular disease (CVD) associated with RAS inhibitors. These HRs were then compared to the HR of 0.92 reported in a recent meta-analysis of RCTs. Results: During a median follow-up period of 5.45 years, 7.23% (n = 284) of patients developed CVD and 38.7% (n = 1519) were started on RAS inhibitors, with 39.1% of immortal time among the users. In multivariable analysis, time-dependent drug-exposure Cox models and Cox models that moved immortal time from users to nonusers both severely inflated the HR, and time-fixed models that included immortal time deflated the HR. Use of time-fixed Cox models that excluded immortal time resulted in an HR of only 0.89 (95% CI, 0.68–1.17) for CVD associated with RAS inhibitors, which is closer to the values reported in RCTs. Conclusions: In pharmacoepidemiologic analysis, time-dependent drug exposure models and models that move immortal time from users to nonusers may introduce substantial bias in investigations of the effects of RAS inhibitors on CVD in type 2 diabetes. PMID:24747198
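The mechanism of immortal time bias can be shown with a small simulation: treatment starts some time after cohort entry, so a user is guaranteed ("immortal") to have survived until initiation; counting that pre-initiation time as exposed deflates the users' event rate even when treatment does nothing. All rates and times below are invented:

```python
import random

# Toy simulation of immortal time bias. Events are exponential with the
# same true rate for everyone (treatment has NO effect); treatment starts
# at a random time after entry. Misclassifying pre-initiation time as
# exposed biases the users' event rate downward.
random.seed(7)
rate = 0.10                       # true event rate per year
n, follow = 20_000, 10.0          # cohort size, follow-up years

bad_time = good_time = events = 0.0
for _ in range(n):
    t_event = random.expovariate(rate)
    t_start = random.uniform(0, 5)        # treatment initiation time
    if t_event <= t_start:
        continue                          # died before initiating: never a "user"
    end = min(t_event, follow)
    events += 1 if t_event <= follow else 0
    bad_time += end                       # WRONG: counts immortal time as exposed
    good_time += end - t_start            # RIGHT: person-time after initiation only

print(events / bad_time)    # biased rate, spuriously low
print(events / good_time)   # ~0.10, recovers the true rate
```

This is the person-time analogue of what the time-fixed Cox model that includes immortal time does, and why the paper's corrected models move closer to the RCT benchmark.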

  12. Short-term vs. long-term heart rate variability in ischemic cardiomyopathy risk stratification.

    PubMed

    Voss, Andreas; Schroeder, Rico; Vallverdú, Montserrat; Schulz, Steffen; Cygankiewicz, Iwona; Vázquez, Rafael; Bayés de Luna, Antoni; Caminal, Pere

    2013-01-01

In industrialized countries with aging populations, heart failure affects 0.3-2% of the general population. The investigation of 24 h-ECG recordings revealed the potential of nonlinear indices of heart rate variability (HRV) for enhanced risk stratification in patients with ischemic heart failure (IHF). However, long-term analyses are time-consuming, expensive, and delay the initial diagnosis. The objective of this study was to investigate whether 30 min short-term HRV analysis is sufficient for comparable risk stratification in IHF in comparison to 24 h-HRV analysis. From 256 IHF patients [221 at low risk (IHFLR) and 35 at high risk (IHFHR)] we investigated (a) the 24 h beat-to-beat time series, (b) the first 30 min segment, (c) the 30 min most stationary day segment and (d) the 30 min most stationary night segment. We calculated linear (time and frequency domain) and nonlinear HRV analysis indices. Optimal parameter sets for risk stratification in IHF were determined for 24 h and for each 30 min segment by applying discriminant analysis on significant clinical and non-clinical indices. Long- and short-term HRV indices from the frequency domain and particularly from nonlinear dynamics revealed high univariate significance (p < 0.01) discriminating between IHFLR and IHFHR. For multivariate risk stratification, optimal mixed parameter sets consisting of 5 indices (clinical and nonlinear) achieved 80.4% AUC (area under the curve of receiver operating characteristics) from 24 h HRV analysis, 84.3% AUC from the first 30 min, 82.2% AUC from the daytime 30 min and 81.7% AUC from the nighttime 30 min. The optimal parameter set obtained from the first 30 min showed nearly the same classification power as the optimal 24 h parameter set. Results from the stationary daytime and nighttime 30 min segments indicate that short-term analyses of 30 min may provide at least comparable risk stratification power in IHF in comparison to a 24 h analysis period.
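Two of the standard time-domain HRV indices referred to in this record, SDNN (overall variability) and RMSSD (beat-to-beat variability), are simple functions of the RR-interval series. A sketch with synthetic RR values in milliseconds:

```python
import math

# Time-domain HRV indices from an RR-interval series (ms).
# SDNN: standard deviation of all intervals (overall variability).
# RMSSD: root mean square of successive differences (short-term variability).

def sdnn(rr):
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 798, 825, 790, 805, 820, 795, 810]   # ms, illustrative
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```

The frequency-domain and nonlinear indices the paper emphasizes build on the same RR series (spectral power in LF/HF bands, entropy and symbolic-dynamics measures), but these two linear indices show the basic input.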

  13. Motor current signature analysis for gearbox condition monitoring under transient speeds using wavelet analysis and dual-level time synchronous averaging

    NASA Astrophysics Data System (ADS)

    Bravo-Imaz, Inaki; Davari Ardakani, Hossein; Liu, Zongchang; García-Arribas, Alfredo; Arnaiz, Aitor; Lee, Jay

    2017-09-01

This paper focuses on analyzing the motor current signature for fault diagnosis of gearboxes operating under transient speed regimes. Two different strategies are evaluated, extensively tested and compared for analyzing the motor current signature in order to implement a condition monitoring system for gearboxes in industrial machinery. A specially designed test bench is used, thoroughly monitored to fully characterize the experiments, in which gears in different health states are tested. The measured signals are analyzed using discrete wavelet decomposition, at different decomposition levels using a range of mother wavelets. Moreover, a dual-level time synchronous averaging analysis is performed on the same signal to compare the performance of the two methods. From both analyses, the relevant features of the signals are extracted and cataloged using a self-organizing map, which allows for easy detection and classification of the diverse health states of the gears. The results demonstrate the effectiveness of both methods for diagnosing gearbox faults, with slightly better performance observed for the dual-level time synchronous averaging method. Based on the obtained results, the proposed methods can be used as effective and reliable procedures for gearbox condition monitoring using only the motor current signature.
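The core of time synchronous averaging is simple: resample the signal to the angle domain so every revolution has the same number of samples, then average across revolutions so rotation-synchronous components reinforce while asynchronous noise cancels. A one-level sketch with a synthetic signal (the dual-level variant in the paper repeats this at a second gearing stage):

```python
import numpy as np

# Time synchronous averaging (TSA) sketch: average over many revolutions.
# Deterministic per-revolution content survives; noise std drops ~sqrt(revs).
fs = 1000                   # samples per revolution (already angle-resampled)
revs = 200
rng = np.random.default_rng(0)

angle = 2 * np.pi * np.arange(fs) / fs
one_rev = np.sin(5 * angle) + 0.2 * np.sin(12 * angle)   # gear-mesh-like content
signal = np.tile(one_rev, revs) + rng.normal(0, 1.0, fs * revs)

tsa = signal.reshape(revs, fs).mean(axis=0)   # the synchronous average

print(np.std(tsa - one_rev))   # residual noise ~ 1/sqrt(200) ≈ 0.07
```

Under transient speeds, the angle-domain resampling step (driven by a tachometer or the current signature itself) is what makes this averaging valid, which is why the paper pairs it with wavelet analysis.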

  14. Sinus tarsi approach (STA) versus extensile lateral approach (ELA) for treatment of closed displaced intra-articular calcaneal fractures (DIACF): A meta-analysis.

    PubMed

    Bai, L; Hou, Y-L; Lin, G-H; Zhang, X; Liu, G-Q; Yu, B

    2018-04-01

Whether the sinus tarsi approach (STA) or the extensile lateral approach (ELA) is preferable for treatment of closed displaced intra-articular calcaneal fractures (DIACF) is still debated; our aim was to compare the two. A thorough search was carried out in the MEDLINE, EMBASE and Cochrane library databases from inception to December 2016. Only prospective or retrospective comparative studies were selected for this meta-analysis. Two independent reviewers conducted the literature search, data extraction and quality assessment. The primary outcomes were anatomical restoration and prevalence of complications. Secondary outcomes included operation time and functional recovery. Four randomized controlled trials involving 326 patients and three cohort studies involving 206 patients were included. The STA technique for DIACFs led to a decline in both operation time and incidence of complications. There were no significant differences between the groups in American Orthopedic Foot and Ankle Society scores, nor in changes in Böhler angle. This meta-analysis suggests that the STA technique may reduce the operation time and incidence of complications. In conclusion, the STA technique is reasonably an optimal choice for DIACF. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  15. Multidimensional Separation of Natural Products Using Liquid Chromatography Coupled to Hadamard Transform Ion Mobility Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Liu, Wenjie; Zhang, Xing; Knochenmuss, Richard; Siems, William F.; Hill, Herbert H.

    2016-05-01

    A high-performance liquid chromatograph (HPLC) was interfaced to an atmospheric-pressure drift tube ion mobility time-of-flight mass spectrometer. The power of multidimensional separation was demonstrated using chili pepper extracts. The ambient-pressure drift tube ion mobility provided high resolving powers of up to 166 for the HPLC eluent. With implementation of the Hadamard transform (HT), the duty cycle of the ion mobility drift tube was increased from less than 1% to 50%, and the ion transmission efficiency was improved by over 200 times compared with pulsed mode, improving the signal-to-noise ratio 10 times. HT ion mobility and TOF mass spectrometry provide an additional dimension of separation for complex samples without increasing the analysis time compared with conventional HPLC.
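    The duty-cycle argument can be illustrated with a toy multiplexing sketch. A real HT ion mobility instrument gates ions with a 0/1 pseudo-random sequence (an S-matrix); a ±1 Sylvester Hadamard matrix is used here purely for simplicity, and all signal values are invented:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
n = 64
x = np.zeros(n)
x[10], x[40] = 5.0, 3.0          # "true" arrival-time (mobility) spectrum

H = hadamard(n)
# Each multiplexed reading mixes ~half of the channels, so the fixed
# per-reading noise is shared across all of them ...
y = H @ x + rng.normal(0.0, 0.5, n)
# ... and the inverse transform recovers the spectrum with the noise
# attenuated by a factor ~sqrt(n) relative to single-gate acquisition.
x_rec = H.T @ y / n
```

    Here `x_rec` reproduces both peaks to well within the per-reading noise level, which is the multiplexing (Fellgett) advantage behind the reported signal-to-noise gain.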

  16. A comparison of anthropometric and training characteristics between recreational female marathoners and recreational female Ironman triathletes.

    PubMed

    Rüst, Christoph Alexander; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas

    2013-02-28

    A personal best marathon time has been reported as a strong predictor variable for an Ironman race time in recreational female Ironman triathletes. This raises the question whether recreational female Ironman triathletes are similar to recreational female marathoners. We investigated similarities and differences in anthropometry and training between 53 recreational female Ironman triathletes and 46 recreational female marathoners. The association of anthropometric variables and training characteristics with race time was investigated using bi- and multi-variate analysis. The Ironman triathletes were younger (P < 0.01), had a lower skin-fold thickness at pectoral (P < 0.001), axillar (P < 0.01), and subscapular (P < 0.05) site, but a thicker skin-fold thickness at the calf site (P < 0.01) compared to the marathoners. Overall weekly training hours were higher in the Ironman triathletes (P < 0.001). The triathletes were running faster during training than the marathoners (P < 0.05). For the triathletes, neither an anthropometric nor a training variable showed an association with overall Ironman race time after bi-variate analysis. In the multi-variate analysis, running speed during training was related to marathon split time for the Ironman triathletes (P = 0.01) and to marathon race time for the marathoners (P = 0.01). To conclude, although personal best marathon time is a strong predictor variable for performance in recreational female Ironman triathletes, there are differences in both anthropometry and training between recreational female Ironman triathletes and recreational female marathoners and different predictor variables for race performance in these two groups of athletes. These findings suggest that recreational female Ironman triathletes are not comparable to recreational female marathoners regarding the association between anthropometric and training characteristics with race time.

  17. WOMEN IN MANAGEMENT: Analysis of Current Population Survey Data

    DTIC Science & Technology

    2002-04-22

    were represented in management positions compared to their representation in all positions within particular industries, and (3) identified salary differentials between men and women in full- time management positions.

  18. The Future of Historical Family Demography

    PubMed Central

    Ruggles, Steven

    2013-01-01

    An explosion of new data sources describing historical family composition is opening unprecedented opportunities for discovery and analysis. The new data will allow comparative multilevel analysis of spatial patterns and will support studies of the transformation of living arrangements over the past 200 years. Using measurement methods that assess family choices at the individual level and analytic strategies that assess variations across space and time, we can dissect the decline of patriarchal family forms in the developed world, and place Northwestern Europe and North America in global comparative context. PMID:23946554

  19. Comparative analysis of breakdown mechanism in thin SiO2 oxide films in metal-oxide-semiconductor structures under the action of heavy charged particles and a pulsed voltage

    NASA Astrophysics Data System (ADS)

    Zinchenko, V. F.; Lavrent'ev, K. V.; Emel'yanov, V. V.; Vatuev, A. S.

    2016-02-01

    Regularities in the breakdown of thin SiO2 oxide films in metal-oxide-semiconductors structures of power field-effect transistors under the action of single heavy charged particles and a pulsed voltage are studied experimentally. Using a phenomenological approach, we carry out comparative analysis of physical mechanisms and energy criteria of the SiO2 breakdown in extreme conditions of excitation of the electron subsystem in the subpicosecond time range.

  20. Comparative genome analysis identifies novel nucleic acid diagnostic targets for use in the specific detection of Haemophilus influenzae.

    PubMed

    Coughlan, Helena; Reddington, Kate; Tuite, Nina; Boo, Teck Wee; Cormican, Martin; Barrett, Louise; Smith, Terry J; Clancy, Eoin; Barry, Thomas

    2015-10-01

    Haemophilus influenzae is recognised as an important human pathogen associated with invasive infections, including bloodstream infection and meningitis. Currently used molecular-based diagnostic assays lack specificity in correctly detecting and identifying H. influenzae. As such, there is a need to develop novel diagnostic assays for the specific identification of H. influenzae. Whole genome comparative analysis was performed to identify putative diagnostic targets, which are unique in nucleotide sequence to H. influenzae. From this analysis, we identified two H. influenzae putative diagnostic targets, phoB and pstA, for use in real-time PCR diagnostic assays. Real-time PCR diagnostic assays using these targets were designed and optimised to specifically detect and identify all 55 H. influenzae strains tested. These novel rapid assays can be applied to the specific detection and identification of H. influenzae for use in epidemiological studies and could also enable improved monitoring of invasive disease caused by these bacteria. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Calibration-free quantitative elemental analysis of meteor plasma using reference laser-induced breakdown spectroscopy of meteorite samples

    NASA Astrophysics Data System (ADS)

    Ferus, Martin; Koukal, Jakub; Lenža, Libor; Srba, Jiří; Kubelík, Petr; Laitl, Vojtěch; Zanozina, Ekaterina M.; Váňa, Pavel; Kaiserová, Tereza; Knížek, Antonín; Rimmer, Paul; Chatzitheodoridis, Elias; Civiš, Svatopluk

    2018-03-01

    Aims: We aim to analyse real-time Perseid and Leonid meteor spectra using a novel calibration-free (CF) method, which is usually applied in the laboratory for laser-induced breakdown spectroscopic (LIBS) chemical analysis. Methods: Reference laser ablation spectra of specimens of chondritic meteorites were measured in situ simultaneously with a high-resolution laboratory echelle spectrograph and a spectral camera for meteor observation. Laboratory data were subsequently evaluated via the CF method and compared with real meteor emission spectra. Additionally, spectral features related to airglow plasma were compared with the spectra of laser-induced breakdown and electric discharge in the air. Results: We show that this method can be applied in the evaluation of meteor spectral data observed in real time. Specifically, CF analysis can be used to determine the chemical composition of meteor plasma, which, in the case of the Perseid and Leonid meteors analysed in this study, corresponds to that of the C-group of chondrites.
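    Calibration-free LIBS conventionally extracts the plasma temperature from a Boltzmann plot: ln(Iλ/(gA)) is linear in the upper-level energy E_k with slope −1/(k_B T). A sketch with entirely synthetic line data (the atomic constants below are invented for illustration, not taken from the paper):

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

# Hypothetical emission lines of one species: upper-level energy E_k (eV),
# degeneracy g, transition probability A (1/s), wavelength lambda (nm)
E_k = np.array([2.0, 3.1, 4.4, 5.0])
g = np.array([5.0, 7.0, 3.0, 9.0])
A = np.array([2e7, 5e7, 1e7, 3e7])
lam = np.array([500.0, 420.0, 380.0, 350.0])

T_true = 9000.0  # K, used only to generate the synthetic intensities
I = (g * A / lam) * np.exp(-E_k / (K_B_EV * T_true))

# Boltzmann plot: ln(I*lam/(g*A)) vs E_k, slope = -1/(kB*T)
yvals = np.log(I * lam / (g * A))
slope, intercept = np.polyfit(E_k, yvals, 1)
T_est = -1.0 / (K_B_EV * slope)
```

    With noiseless synthetic data the fit recovers `T_true` exactly; on real meteor or laser-ablation spectra the scatter of the points around the line indicates how well the plasma satisfies the local-thermal-equilibrium assumption.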

  2. Pediatric procedural sedation with ketamine: time to discharge after intramuscular versus intravenous administration.

    PubMed

    Ramaswamy, Preeti; Babl, Franz E; Deasy, Conor; Sharwood, Lisa N

    2009-02-01

    Ketamine is an attractive agent for pediatric procedural sedation. There are limited data on time to discharge comparing intramuscular (IM) vs. intravenous (IV) ketamine. The authors set out to determine whether IM or IV ketamine leads to quicker discharge from the emergency department (ED) and how side effect profiles compare. All patients who had received ketamine IM or IV at a tertiary children's hospital ED during the 3-year study period (2004-2007) were identified. Prospective sedation registry data, retrospective medical records, and administrative data were reviewed for drug dosages, use of additional agents, time of drug administration to discharge, total ED time (triage to discharge), and adverse events. A subgroup analysis for patients requiring five or fewer sutures (short suture group) was performed. A total of 229 patients were enrolled (60% male) with median age of 2.8 years (IQR =1.8-4.3 years) and median weight of 15.7 kg (range = 8.7-74 kg). Ketamine was most frequently employed for laceration repair (80%) and foreign body removal (9%). Overall, 48% received ketamine IM and 52% received it IV. In the short-suture subgroup, 52% received ketamine IM, while 48% received it IV. Multivariate linear regression analysis determined time from drug administration to patient discharge as 21 minutes shorter for IV compared with IM administration, adjusted for age and number of additional doses (R(2) = -0.35; 95% CI = -0.5 to -0.19; p < 0.001). Total time in the ED (triage to discharge) comparing IV versus IM administration, adjusting for age and gender and number of additional doses, was not significantly different (p = 0.16). In the short-suture subgroup, time to discharge from administration was also shorter in the IV ketamine group (R(2) = -0.454; 95%CI = -0.66 to -0.25; p < 0.001) but similar for total time in ED (p = 0.16). 
Overall, adverse events occurred in 35% (95% CI = 27% to 45%) of the IM group and 20% (95% CI = 13% to 28%) of the IV group (p = 0.01). Only one patient required brief bag-mask ventilation. In this institution, time from drug injection to discharge was shorter in the IV compared to IM ketamine group, both overall and for the short-suture group. However, time from triage to discharge was similar.
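    The adjusted IM-vs-IV comparison reported above is, in essence, ordinary least squares with a route indicator plus covariates. A sketch on synthetic data with a −21 minute IV effect built in (all numbers invented, not the study data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
iv = rng.integers(0, 2, n)             # 1 = intravenous, 0 = intramuscular
age = rng.uniform(1, 10, n)            # years
extra_doses = rng.poisson(0.5, n)
# Synthetic discharge times with a built-in IV effect of -21 minutes
minutes = (90.0 - 21.0 * iv + 2.0 * age + 10.0 * extra_doses
           + rng.normal(0.0, 5.0, n))

# Design matrix: intercept, route indicator, covariates to adjust for
X = np.column_stack([np.ones(n), iv, age, extra_doses])
beta, *_ = np.linalg.lstsq(X, minutes, rcond=None)
adjusted_iv_effect = beta[1]           # close to -21 by construction
```

    The coefficient on the route indicator is the route difference adjusted for age and additional doses, which is the quantity the study reports.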

  3. Development and comparison of advanced reduced-basis methods for the transient structural analysis of unconstrained structures

    NASA Technical Reports Server (NTRS)

    Mcgowan, David M.; Bostic, Susan W.; Camarda, Charles J.

    1993-01-01

    The development of two advanced reduced-basis methods, the force derivative method and the Lanczos method, and two widely used modal methods, the mode displacement method and the mode acceleration method, for transient structural analysis of unconstrained structures is presented. Two example structural problems are studied: an undamped, unconstrained beam subject to a uniformly distributed load which varies as a sinusoidal function of time and an undamped high-speed civil transport aircraft subject to a normal wing tip load which varies as a sinusoidal function of time. These example problems are used to verify the methods and to compare the relative effectiveness of each of the four reduced-basis methods for performing transient structural analyses on unconstrained structures. The methods are verified with a solution obtained by integrating directly the full system of equations of motion, and they are compared using the number of basis vectors required to obtain a desired level of accuracy and the associated computational times as comparison criteria.
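    The distinction between the mode displacement and mode acceleration methods can be shown on a small constrained spring-mass chain (the paper treats unconstrained structures, whose zero-frequency rigid-body modes require extra care; the numbers here are hypothetical):

```python
import numpy as np

# 3-DOF spring-mass chain with unit masses, steady sinusoidal tip load
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
F = np.array([0.0, 0.0, 1.0])      # load amplitude vector
Omega = 0.3                         # forcing frequency, below the first mode

w2, Phi = np.linalg.eigh(K)         # eigenvalues w^2, mass-normalised modes
r = 1                               # retain only the first mode

# Exact steady-state amplitude for reference
u_exact = np.linalg.solve(K - Omega**2 * np.eye(3), F)

# Mode displacement method: plain truncated modal superposition
u_md = sum(Phi[:, i] * (Phi[:, i] @ F) / (w2[i] - Omega**2)
           for i in range(r))

# Mode acceleration method: carry the full static response K^-1 F and
# let the retained modes supply only the (small) dynamic residual
u_ma = np.linalg.solve(K, F) + sum(
    Phi[:, i] * (Phi[:, i] @ F) * Omega**2 / (w2[i] * (w2[i] - Omega**2))
    for i in range(r))

err_md = np.linalg.norm(u_md - u_exact)
err_ma = np.linalg.norm(u_ma - u_exact)
```

    Because the truncated modes' missing static contribution dominates the mode displacement error, `err_ma` is much smaller than `err_md` for the same number of retained basis vectors, which is the trade-off the comparison criteria in the paper quantify.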

  4. Treatment refusal and premature termination in psychotherapy, pharmacotherapy, and their combination: A meta-analysis of head-to-head comparisons.

    PubMed

    Swift, Joshua K; Greenberg, Roger P; Tompkins, Kelley A; Parkin, Susannah R

    2017-03-01

    The purpose of this meta-analysis was to examine rates of treatment refusal and premature termination for pharmacotherapy alone, psychotherapy alone, pharmacotherapy plus psychotherapy, and psychotherapy plus pill placebo treatments. A systematic review of the literature resulted in 186 comparative trials that included a report of treatment refusal and/or premature termination for at least 2 of the 4 treatment conditions. The data from these studies were pooled using a random-effects analysis. Odds Ratio effect sizes were then calculated to compare the rates between treatment conditions, once across all studies and then again for specific client disorder categories. An average treatment refusal rate of 8.2% was found across studies. Clients who were assigned to pharmacotherapy were 1.76 times more likely to refuse treatment compared with clients who were assigned psychotherapy. Differences in refusal rates for pharmacotherapy and psychotherapy were particularly evident for depressive disorders, panic disorder, and social anxiety disorder. On average, 21.9% of clients prematurely terminated their treatment. Across studies, clients who were assigned to pharmacotherapy were 1.20 times more likely to drop out compared with clients who were assigned to psychotherapy. Pharmacotherapy clients with anorexia/bulimia and depressive disorders dropped out at higher rates compared with psychotherapy clients with these disorders. Treatment refusal and dropout are significant problems in both psychotherapy and pharmacotherapy and providers of these treatments should seek to employ strategies to reduce their occurrence. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
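    Pooling odds ratios under a random-effects model is conventionally done with the DerSimonian-Laird estimator. A compact sketch with invented 2x2 refusal counts (not data from this meta-analysis):

```python
import math

def pooled_or_random_effects(tables):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    tables: list of (a, b, c, d) = events/non-events in the two arms."""
    y = [math.log((a * d) / (b * c)) for a, b, c, d in tables]  # log ORs
    v = [1/a + 1/b + 1/c + 1/d for a, b, c, d in tables]        # variances
    w = [1.0 / vi for vi in v]                                  # FE weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    Q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))        # heterogeneity
    c_fac = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c_fac)                 # between-study var
    w_re = [1.0 / (vi + tau2) for vi in v]                      # RE weights
    pooled_log = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return math.exp(pooled_log)

# Hypothetical trials: (refused drug, accepted drug,
#                       refused psychotherapy, accepted psychotherapy)
tables = [(12, 88, 7, 93), (20, 180, 12, 188), (9, 41, 5, 45)]
pooled = pooled_or_random_effects(tables)
```

    The pooled odds ratio is a variance-weighted average of the study log odds ratios, with `tau2` inflating the within-study variances to account for between-study heterogeneity.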

  5. A comparative cost analysis of robot-assisted versus traditional laparoscopic partial nephrectomy.

    PubMed

    Hyams, Elias; Pierorazio, Philip; Mullins, Jeffrey K; Ward, Maryann; Allaf, Mohamad

    2012-07-01

    Robot-assisted laparoscopic partial nephrectomy (RALPN) is supplanting traditional laparoscopic partial nephrectomy (LPN) as the technique of choice for minimally invasive nephron-sparing surgery. This evolution has resulted from potential clinical benefits, as well as proliferation of robotic systems and patient demand for robot-assisted surgery. We sought to quantify the costs associated with the use of robotics for minimally invasive partial nephrectomy. A cost analysis was performed for 20 consecutive robot-assisted partial nephrectomy (RPN) and LPN patients at our institution from 2009 to 2010. Data included actual perioperative and hospitalization costs as well as professional fees. Capital costs were estimated using purchase costs and amortization of two robotic systems from 2001 to 2009, as well as maintenance contract costs. The estimated cost/case was obtained using total robotic surgical volume during this period. Total estimated costs were compared between groups. A separate analysis was performed assuming "ideal" robotic utilization during a comparable period. RALPN had a cost premium of +$1066/case compared with LPN, assuming actual robot utilization from 2001 to 2009. Assuming "ideal" utilization during a comparable period, this premium decreased to +$334; capital costs per case decreased from $1907 to $1175. Tumor size, operative time, and length of stay were comparable between groups. RALPN is associated with a small to moderate cost premium depending on assumptions regarding robotic surgical volume. Saturated utilization of robotic systems decreases attributable capital costs and makes comparison with laparoscopy more favorable. Purported clinical benefits of RPN (eg, decreased warm ischemia time, increased utilization of nephron-sparing surgery) need further study, because these may have cost implications.

  6. Basic gait analysis based on continuous wave radar.

    PubMed

    Zhang, Jun

    2012-09-01

    A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human biological gait are discussed. The methods for extracting the gait parameters from the spectrogram are studied in depth, and experiments on more than twenty subjects have been performed to acquire the radar gait data. The gait parameters are calculated and compared. The gait differences between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Particle Simulation of Coulomb Collisions: Comparing the Methods of Takizuka & Abe and Nanbu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C; Lin, T; Caflisch, R

    2007-05-22

    The interactions of charged particles in a plasma are governed by long-range Coulomb collisions. We compare two widely used Monte Carlo models for Coulomb collisions: one developed by Takizuka and Abe in 1977, the other by Nanbu in 1997. We perform deterministic and stochastic error analysis with respect to particle number and time step. The two models produce similar stochastic errors, but Nanbu's model gives smaller time-step errors. Error comparisons between the two methods are presented.

  8. Comparing the Efficiencies of Third Molar Surgeries With and Without a Dentist Anesthesiologist

    PubMed Central

    Young, S.; Boukas, E.; Davidian, E.; Carnahan, J.

    2017-01-01

    Two different anesthesia models were compared in terms of surgical duration, safer outcomes, and economic implications. Third molar surgeries performed with and without a separate dentist anesthesiologist were evaluated by a retrospective data analysis of the surgical operative times. For more difficult surgeries, substantially shorter operative times were observed with the dentist anesthesiologist model, leading to a more favorable surgical outcome. An example calculation is presented to demonstrate economic advantages of scheduling the participation of a dentist anesthesiologist for more difficult surgeries. PMID:28128661

  9. Using Queue Time Predictions for Processor Allocation

    DTIC Science & Technology

    1997-01-01


  10. Hydrocarbon Reservoir Prediction Using Bi-Gaussian S Transform Based Time-Frequency Analysis Approach

    NASA Astrophysics Data System (ADS)

    Cheng, Z.; Chen, Y.; Liu, Y.; Liu, W.; Zhang, G.

    2015-12-01

    Among hydrocarbon reservoir detection techniques, the time-frequency analysis based approach is one of the most widely used because of its straightforward indication of low-frequency anomalies in the time-frequency maps; that is to say, low-frequency bright spots usually indicate potential hydrocarbon reservoirs. The time-frequency analysis based approach is easy to implement and, more importantly, is usually of high fidelity in reservoir prediction compared with the state-of-the-art approaches, and thus is of great interest to petroleum geologists, geophysicists, and reservoir engineers. The S transform has been frequently used to obtain the time-frequency maps because of its better control over the compromise between time and frequency resolutions than the alternatives, such as the short-time Fourier transform, Gabor transform, and continuous wavelet transform. The window function used in the majority of previous S transform applications is the symmetric Gaussian window. However, one problem with the symmetric Gaussian window is the degradation of time resolution in the time-frequency map due to the long front taper. In our study, a bi-Gaussian S transform that substitutes an asymmetric bi-Gaussian window for the symmetric Gaussian window is proposed to analyze multi-channel seismic data in order to predict hydrocarbon reservoirs. The bi-Gaussian window introduces asymmetry in the resultant time-frequency spectrum, with better time resolution in the front direction than in the back direction. This is the first time since its invention in 2003 that the bi-Gaussian S transform has been used to analyze multi-channel post-stack seismic data for hydrocarbon reservoir prediction. The superiority of the bi-Gaussian S transform over the traditional S transform is tested on a real land seismic data example. 
The performance shows that the enhanced temporal resolution helps depict the edges of the hydrocarbon reservoir more clearly, especially when the reservoir is thin (e.g., thin beds).
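    The asymmetric window at the heart of the bi-Gaussian S transform can be sketched directly: two half-Gaussians joined at the window centre, with a short front taper for sharper time resolution ahead of an event (parameter values invented):

```python
import numpy as np

def bi_gaussian_window(t, t0, sigma_front, sigma_back):
    """Asymmetric window: a narrow half-Gaussian before t0 (short front
    taper, better time resolution) joined to a wider one after t0."""
    return np.where(
        t < t0,
        np.exp(-0.5 * ((t - t0) / sigma_front) ** 2),
        np.exp(-0.5 * ((t - t0) / sigma_back) ** 2),
    )

t = np.linspace(-1.0, 1.0, 1001)
w = bi_gaussian_window(t, 0.0, sigma_front=0.05, sigma_back=0.2)
```

    In the S transform this window (with frequency-dependent widths) multiplies the signal before the Fourier transform; the fast decay on the front side is what sharpens the temporal resolution of the resulting spectrum.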

  11. Studies of ZVS soft switching of dual-active-bridge isolated bidirectional DC-DC converters

    NASA Astrophysics Data System (ADS)

    Xu, Fei; Zhao, Feng; Shi, Qibiao; Wen, Xuhui

    2018-05-01

    To operate a dual-active-bridge isolated bidirectional dc-dc converter (DAB) at high efficiency, the switches of the two bridges must operate with Zero-Voltage Switching (ZVS) over as wide an operating range as possible. This paper proposes a new perspective on realizing ZVS during dead-time. An exact theoretical analysis and mathematical model are built to explain the process of ZVS switching during dead-time under the Single Phase Shift (SPS) control strategy. To ensure that the switches of both bridges operate with soft switching, every SPS switching point is analyzed. Generally, the dead-time is determined once the power electronic devices are selected. The key factor in realizing ZVS is the end time of the resonance compared to the dead-time. Through detailed analysis, the conditions under which all switches achieve ZVS turn-on and turn-off can be obtained. Finally, simulation validates the theoretical analysis, and advice is given on realizing ZVS soft switching.

  12. Bayesian inference of interaction properties of noisy dynamical systems with time-varying coupling: capabilities and limitations

    NASA Astrophysics Data System (ADS)

    Wilting, Jens; Lehnertz, Klaus

    2015-08-01

    We investigate a recently published analysis framework based on Bayesian inference for the time-resolved characterization of interaction properties of noisy, coupled dynamical systems. It promises wide applicability and a better time resolution than well-established methods. Using representative model systems as examples, we show that the analysis framework has the same weaknesses as previous methods, particularly when investigating interacting, structurally different non-linear oscillators. We also inspect the tracking of time-varying interaction properties and propose a further modification of the algorithm, which improves the reliability of the obtained results. As an example, we investigate the suitability of this algorithm for inferring the strength and direction of interactions between various regions of the human brain during an epileptic seizure. Within the limitations of the applicability of this analysis tool, we show that the modified algorithm indeed allows a better time resolution through Bayesian inference when compared to previous methods based on least-squares fits.

  13. Failure time analysis with unobserved heterogeneity: Earthquake duration time of Turkey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ata, Nihal, E-mail: nihalata@hacettepe.edu.tr; Kadilar, Gamze Özel, E-mail: gamzeozl@hacettepe.edu.tr

    Failure time models assume that all units are subject to the same risks embodied in the hazard functions. In this paper, unobserved sources of heterogeneity that are not captured by covariates are included in the failure time models. Destructive earthquakes in Turkey since 1900 are used to illustrate the models, and the inter-event time between two consecutive earthquakes is defined as the failure time. The paper demonstrates how seismicity and tectonic/physical parameters can potentially influence the spatio-temporal variability of earthquakes, and the approach presents several advantages compared to more traditional ones.

  14. Cost-effectiveness analysis of left atrial appendage occlusion compared with pharmacological strategies for stroke prevention in atrial fibrillation.

    PubMed

    Lee, Vivian Wing-Yan; Tsai, Ronald Bing-Ching; Chow, Ines Hang-Iao; Yan, Bryan Ping-Yen; Kaya, Mehmet Gungor; Park, Jai-Wun; Lam, Yat-Yin

    2016-08-31

    Transcatheter left atrial appendage occlusion (LAAO) is a promising therapy for stroke prophylaxis in non-valvular atrial fibrillation (NVAF) but its cost-effectiveness remains understudied. This study evaluated the cost-effectiveness of LAAO for stroke prophylaxis in NVAF. A Markov decision analytic model was used to compare the cost-effectiveness of LAAO with 7 pharmacological strategies: aspirin alone, clopidogrel plus aspirin, warfarin, dabigatran 110 mg, dabigatran 150 mg, apixaban, and rivaroxaban. Outcome measures included quality-adjusted life years (QALYs), lifetime costs and incremental cost-effectiveness ratios (ICERs). Base-case data were derived from ACTIVE, RE-LY, ARISTOTLE, ROCKET-AF, PROTECT-AF and PREVAIL trials. One-way sensitivity analysis varied by CHADS2 score, HAS-BLED score, time horizons, and LAAO costs; and probabilistic sensitivity analysis using 10,000 Monte Carlo simulations was conducted to assess parameter uncertainty. LAAO was considered cost-effective compared with aspirin, clopidogrel plus aspirin, and warfarin, with ICERs of US$5,115, $2,447, and $6,298 per QALY gained, respectively. LAAO was dominant (i.e. less costly and more effective) compared to the other strategies. Sensitivity analysis demonstrated favorable ICERs of LAAO against other strategies in varied CHADS2 score, HAS-BLED score, time horizons (5 to 15 years) and LAAO costs. LAAO was cost-effective in 86.24 % of 10,000 simulations using a threshold of US$50,000/QALY. Transcatheter LAAO is cost-effective for prevention of stroke in NVAF compared with 7 pharmacological strategies. The transcatheter left atrial appendage occlusion (LAAO) is considered cost-effective against the standard 7 oral pharmacological strategies including acetylsalicylic acid (ASA) alone, clopidogrel plus ASA, warfarin, dabigatran 110 mg, dabigatran 150 mg, apixaban, and rivaroxaban for stroke prophylaxis in non-valvular atrial fibrillation management.
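    The underlying Markov decision model reduces to propagating a cohort through a transition matrix while accumulating discounted costs and QALYs. A deliberately simplified 3-state sketch with invented transition probabilities and costs (not the trial-derived inputs used in the paper):

```python
import numpy as np

def discounted_totals(P, cost, qaly, horizon, rate=0.03):
    """Markov cohort model: propagate a state distribution through the
    transition matrix P, accumulating discounted costs and QALYs
    (no half-cycle correction, for brevity)."""
    dist = np.zeros(P.shape[0])
    dist[0] = 1.0                         # whole cohort starts in state 0
    total_cost = total_qaly = 0.0
    for t in range(horizon):
        disc = 1.0 / (1.0 + rate) ** t
        total_cost += disc * (dist @ cost)
        total_qaly += disc * (dist @ qaly)
        dist = dist @ P                   # advance one annual cycle
    return total_cost, total_qaly

# Hypothetical 3-state model: well / post-stroke / dead, annual cycles
P_drug = np.array([[0.90, 0.06, 0.04],
                   [0.00, 0.85, 0.15],
                   [0.00, 0.00, 1.00]])
P_laao = np.array([[0.93, 0.03, 0.04],    # device lowers the stroke rate
                   [0.00, 0.85, 0.15],
                   [0.00, 0.00, 1.00]])
utility = np.array([0.85, 0.60, 0.0])         # QALY weight per state
cost_drug = np.array([1000.0, 8000.0, 0.0])   # annual cost per state
cost_laao = np.array([400.0, 8000.0, 0.0])    # cheaper maintenance therapy

c0, q0 = discounted_totals(P_drug, cost_drug, utility, horizon=15)
c1, q1 = discounted_totals(P_laao, cost_laao, utility, horizon=15)
c1 += 25000.0                             # up-front procedure cost
icer = (c1 - c0) / (q1 - q0)              # incremental cost per QALY gained
```

    With these made-up inputs the device arm gains QALYs at an extra cost, so the ICER is positive; whether it falls under the willingness-to-pay threshold is exactly the question the probabilistic sensitivity analysis addresses by resampling the inputs.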

  15. Time variations of solar UV irradiance as measured by the SOLSTICE (UARS) instrument

    NASA Technical Reports Server (NTRS)

    London, Julius; Rottman, Gary J.; Woods, Thomas N.; Wu, Fie

    1993-01-01

    An analysis is presented of solar ultraviolet irradiance measurements made by the SOLSTICE spectrometers on the Upper Atmosphere Research Satellite (UARS). Reported observations cover the wavelength interval 119-420 nm, and the analysis discussed here is for the time period 26 Nov 1991 to 31 Dec 1992, during which time solar activity decreased in intensity. At the time of peak activity, the average 27-day variation had a relative amplitude of about 8 percent at Ly-alpha, tailing off to about 0.6 percent at 260 nm. It is shown that over the spectral interval 119-260 nm, the relative 27-day harmonic was about a factor of two larger during the strongly disturbed as compared with the moderately disturbed period.

  16. Real-Time Stability Margin Measurements for X-38 Robustness Analysis

    NASA Technical Reports Server (NTRS)

    Bosworth, John T.; Stachowiak, Susan J.

    2005-01-01

    A method has been developed for real-time stability margin measurement calculations. The method relies on a tailored forced excitation targeted to a specific frequency range. Computation of the frequency response is matched to the specific frequencies contained in the excitation. A recursive Fourier transformation is used to make the method compatible with real-time calculation. The method was incorporated into the X-38 nonlinear simulation and applied to an X-38 robustness test. X-38 stability margins were calculated for different variations in aerodynamic and mass properties over the vehicle flight trajectory. The new method showed results comparable to more traditional stability analysis techniques while providing more complete coverage and increased efficiency.
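    A recursive (sliding) discrete Fourier transform updates a single frequency bin in O(1) per sample, which is what makes this kind of frequency-response tracking feasible in real time. A generic sketch (not the X-38 flight code):

```python
import cmath
import math
from collections import deque

class SlidingDFT:
    """Recursive single-bin DFT over a length-N window: each new sample
    costs one complex multiply-add, suited to real-time tracking of the
    response at a targeted excitation frequency."""
    def __init__(self, N, k):
        self.N, self.k = N, k
        self.twiddle = cmath.exp(2j * cmath.pi * k / N)
        self.buf = deque([0.0] * N, maxlen=N)
        self.X = 0.0 + 0.0j

    def update(self, x_new):
        x_old = self.buf[0]                     # sample leaving the window
        self.buf.append(x_new)
        self.X = (self.X + x_new - x_old) * self.twiddle
        return self.X

# Feed one full window of a cosine exactly at bin k: |X| converges to N/2
N, k = 16, 3
sdft = SlidingDFT(N, k)
for m in range(N):
    X = sdft.update(math.cos(2 * math.pi * k * m / N))
mag = abs(X)
```

    In a margin-measurement loop, the same update would run on both the injected excitation and the measured response, and the ratio of the two bins gives the open-loop gain and phase at that frequency.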

  17. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation in comparison to other methods.
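    The SSA imputation idea can be sketched in its plain least-squares form: embed the series in a Hankel matrix, take a low-rank approximation, average back to a series, and iterate on the gaps. The paper's L1-norm decomposition (its actual novelty) is not implemented here:

```python
import numpy as np

def ssa_impute(x, L, rank, n_iter=100):
    """Iterative SSA imputation, least-squares/SVD flavour (the paper's
    L1-SSA replaces the SVD with an L1-norm decomposition).
    Missing entries of x are NaN."""
    x = np.asarray(x, dtype=float)
    miss = np.isnan(x)
    y = np.where(miss, np.nanmean(x), x)       # crude initial fill
    N = len(y)
    K = N - L + 1
    for _ in range(n_iter):
        # Embed the series into its trajectory (Hankel) matrix
        H = np.column_stack([y[i:i + L] for i in range(K)])
        # Low-rank approximation via truncated SVD
        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        Hr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Anti-diagonal averaging back to a series
        rec = np.zeros(N)
        cnt = np.zeros(N)
        for j in range(K):
            rec[j:j + L] += Hr[:, j]
            cnt[j:j + L] += 1
        rec /= cnt
        y[miss] = rec[miss]                    # update only the gaps
    return y

truth = np.sin(2 * np.pi * np.arange(100) / 20)  # a rank-2 signal
x = truth.copy()
x[40:45] = np.nan                                # knock out a gap
y = ssa_impute(x, L=30, rank=2)
```

    Observed values are never altered; only the gap entries are pulled, iteration by iteration, toward the low-rank structure of the rest of the series.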

  18. Cost-effectiveness analysis of timely dialysis referral after renal transplant failure in Spain.

    PubMed

    Villa, Guillermo; Sánchez-Álvarez, Emilio; Cuervo, Jesús; Fernández-Ortiz, Lucía; Rebollo, Pablo; Ortega, Francisco

    2012-08-16

    A cost-effectiveness analysis of timely dialysis referral after renal transplant failure was undertaken from the perspective of the Public Administration. The current Spanish situation, where all the patients undergoing graft function loss are referred back to dialysis in a late manner, was compared to an ideal scenario where all the patients are timely referred. A Markov model was developed in which six health states were defined: hemodialysis, peritoneal dialysis, kidney transplantation, late referral hemodialysis, late referral peritoneal dialysis and death. The model carried out a simulation of the progression of renal disease for a hypothetical cohort of 1,000 patients aged 40, who were observed in a lifetime temporal horizon of 45 years. In-depth sensitivity analyses were performed in order to ensure the robustness of the results obtained. Considering a discount rate of 3 %, timely referral showed an incremental cost of 211 €, compared to late referral. This cost increase was however a consequence of the incremental survival observed. The incremental effectiveness was 0.0087 quality-adjusted life years (QALY). When comparing both scenarios, an incremental cost-effectiveness ratio of 24,390 €/QALY was obtained, meaning that timely dialysis referral might be an efficient alternative if a willingness-to-pay threshold of 45,000 €/QALY is considered. This result proved to be independent of the proportion of late referral patients observed. The acceptance probability of timely referral was 61.90 %, while late referral was acceptable in 38.10 % of the simulations. If we however restrict the analysis to those situations not involving any loss of effectiveness, the acceptance probability of timely referral was 70.10 %, more than twice that of late referral (29.90 %). Timely dialysis referral after graft function loss might be an efficient alternative in Spain, improving both patients' survival rates and health-related quality of life at an affordable cost. 
Spanish Public Health authorities might therefore promote the inclusion of specific recommendations for this group of patients within the existing clinical guidelines.

  19. Asymmetry in gait pattern following bicondylar tibial plateau fractures-A prospective one-year cohort study.

    PubMed

    Elsoe, Rasmus; Larsen, Peter

    2017-07-01

    Despite the high number of studies evaluating outcomes following tibial plateau fractures, the literature lacks studies including objective assessment of gait pattern. The purpose of the present study was to evaluate asymmetry in gait patterns at 12 months after frame removal following ring fixation of a tibial plateau fracture. The study design was a prospective cohort study. The primary outcome measurement was the gait pattern 12 months after frame removal, measured with a pressure-sensitive mat. The mat registers footprints and presents gait speed, cadence, and temporal and spatial parameters of the gait cycle. Gait patterns were compared to a healthy reference population. Twenty-three patients were included, with a mean age of 54.4 years (32-78 years). Patients presented with a shorter step-length of the injured leg compared to the non-injured leg (asymmetry of 11.3%). Analysis of single-support showed shorter support time of the injured leg compared to the non-injured leg (asymmetry of 8.7%). Moreover, analysis of swing-time showed increased swing-time of the injured leg (asymmetry of 8.9%). Compared to a healthy reference population, increased asymmetry in all gait patterns was observed. The association between asymmetry and health-related quality of life (HRQOL) showed moderate associations (single-support: R=0.50, P=0.03; step-length: R=0.43, P=0.07; swing-time: R=0.46, P=0.05). Compared to a healthy reference population, gait asymmetry is common 12 months after frame removal in patients treated with external ring fixation following a bicondylar tibial plateau fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Duration of motor block with intrathecal ropivacaine versus bupivacaine for caesarean section: a meta-analysis.

    PubMed

    Malhotra, R; Johnstone, C; Halpern, S; Hunter, J; Banerjee, A

    2016-08-01

    Bupivacaine is a commonly used local anaesthetic for spinal anaesthesia for caesarean section, but may produce prolonged motor block, delaying discharge from the post-anaesthesia care unit. Ropivacaine may have a shorter time to recovery of motor function compared with bupivacaine. We performed a meta-analysis to assess the time difference in duration of motor block with intrathecal ropivacaine compared with bupivacaine for caesarean section. We searched MEDLINE, EMBASE and Cochrane Central Register of Controlled Trials databases for randomised controlled trials comparing ropivacaine with bupivacaine in parturients undergoing elective caesarean section under spinal anaesthesia. The primary outcome was the duration of motor block. Secondary outcomes included the time to onset of sensory block, need for conversion to general anaesthesia and the incidence of hypotension. Thirteen trials comprising 743 spinal anaesthetics were included. Intrathecal ropivacaine resulted in a reduced duration of motor block, regressing 35.7min earlier compared with intrathecal bupivacaine (P<0.00001). There was no difference in the time to onset of sensory block (P=0.25) or the incidence of hypotension (P=0.10). Limited data suggested no difference in the rate of conversion to general anaesthesia, but an earlier request for postoperative analgesia with ropivacaine. Compared with bupivacaine, intrathecal ropivacaine is associated with more rapid recovery of motor block despite similar sensory properties and no increased rate of conversion to general anaesthesia. This may be useful in centres in which recovery of motor block is a criterion for discharge from the post-anaesthesia care unit. However, small numbers of trials and significant heterogeneity limit the interpretation of our results. Copyright © 2016 Elsevier Ltd. All rights reserved.
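    A pooled estimate like the 35.7 min difference above is typically an inverse-variance weighted mean difference. The sketch below shows the fixed-effect version of that calculation with invented trial numbers, not data from the thirteen included trials:

```python
# Fixed-effect inverse-variance pooling of mean differences, the generic
# technique behind meta-analyses like this one. Numbers are illustrative only.
import math

def pool_fixed_effect(mean_diffs, std_errs):
    """Return the pooled mean difference and its standard error."""
    weights = [1.0 / se**2 for se in std_errs]        # inverse-variance weights
    pooled = sum(w * d for w, d in zip(weights, mean_diffs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical trials reporting motor-block regression differences (min)
diffs = [-30.0, -40.0, -35.0]
ses = [8.0, 10.0, 6.0]
pooled, se = pool_fixed_effect(diffs, ses)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)         # 95% confidence interval
print(round(pooled, 1), round(se, 1), ci)
```

    Trials with smaller standard errors dominate the pooled value, which is why a few large trials can drive results like the one reported here.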

  1. Robotic assisted versus pure laparoscopic surgery of the adrenal glands: a case-control study comparing surgical techniques.

    PubMed

    Morelli, Luca; Tartaglia, Dario; Bronzoni, Jessica; Palmeri, Matteo; Guadagni, Simone; Di Franco, Gregorio; Gennai, Andrea; Bianchini, Matteo; Bastiani, Luca; Moglia, Andrea; Ferrari, Vincenzo; Fommei, Enza; Pietrabissa, Andrea; Di Candio, Giulio; Mosca, Franco

    2016-11-01

    The role of the da Vinci Robotic System® in adrenal gland surgery is not yet well defined. The goal of this study was to compare robotic-assisted surgery with pure laparoscopic surgery in a single center. One hundred and sixteen patients underwent minimally invasive adrenalectomies in our department between June 1994 and December 2014, 41 of whom were treated with a robotic-assisted approach (robotic adrenalectomy, RA). Patients who underwent RA were matched according to BMI, age, gender, and nodule dimensions, and compared with 41 patients who had undergone laparoscopic adrenalectomies (LA). Statistical analysis was performed using Student's t test for independent samples, and the relationship between operative time and other covariates was evaluated with a multivariable linear regression model. P < 0.05 was considered significant. Mean operative time was significantly shorter in the RA group compared to the LA group. The subgroup analysis showed a shorter mean operative time in the RA group in patients with nodules ≥6 cm, BMI ≥ 30 kg/m², and in those who had previous abdominal surgery (p < 0.05). Results from the multiple regression model confirmed a shorter mean operative time with RA for nodules ≥6 cm (p = 0.010). Conversion and postoperative complication rates were 2.4% and 4.8% in the LA group and 0% and 4.8% in the RA group, respectively. In our experience, RA shows potential benefits compared to classic LA, in particular in patients with nodules ≥6 cm, BMI ≥ 30 kg/m², and previous abdominal surgery.

  2. A comparative analysis of the dependences of the hemodynamic parameters on changes in ROI's position in perfusion CT scans

    NASA Astrophysics Data System (ADS)

    Choi, Yong-Seok; Cho, Jae-Hwan; Namgung, Jang-Sun; Kim, Hyo-Jin; Yoon, Dae-Young; Lee, Han-Joo

    2013-05-01

    This study performed a comparative analysis of cerebral blood volume (CBV), cerebral blood flow (CBF), mean transit time (MTT), and mean time-to-peak (TTP) obtained by changing the region of interest's (ROI) anatomical positions during CT brain perfusion. We acquired axial source images of perfusion CT from 20 patients undergoing CT perfusion exams due to brain trauma. Subsequently, the CBV, CBF, MTT, and TTP values were calculated through data-processing of the perfusion CT images. The color scales for the CBV, CBF, MTT, and TTP maps were obtained using the image data. The anterior cerebral artery (ACA) was taken as the standard ROI for the calculations of the perfusion values. Differences in the hemodynamic average values were compared in a quantitative analysis by placing ROIs and dividing the axial images anatomically into proximal, middle, and distal segments. By performing a qualitative analysis using a blind test, we observed changes in the sensory characteristics by using the color scales of the CBV, CBF, and MTT maps in the proximal, middle, and distal segments. According to these analyses, no differences were found in CBV, CBF, MTT, and TTP values of the proximal, middle, and distal segments, and no changes were detected in the color scales of the CBV, CBF, MTT, and TTP maps in the proximal, middle, and distal segments. We anticipate that the results of the study will be useful in assessing brain trauma patients using perfusion imaging.

  3. Analysis of the High-Frequency Content in Human QRS Complexes by the Continuous Wavelet Transform: An Automatized Analysis for the Prediction of Sudden Cardiac Death.

    PubMed

    García Iglesias, Daniel; Roqueñi Gutiérrez, Nieves; De Cos, Francisco Javier; Calvo, David

    2018-02-12

    Fragmentation and delayed potentials in the QRS signal of patients have been postulated as risk markers for Sudden Cardiac Death (SCD). The analysis of the high-frequency spectral content may be useful for quantification. Forty-two consecutive patients with a prior history of SCD or malignant arrhythmias (patients) were compared with 120 healthy individuals (controls). The QRS complexes were extracted with a modified Pan-Tompkins algorithm and processed with the Continuous Wavelet Transform to analyze the high-frequency content (85-130 Hz). Overall, the power of the high-frequency content was higher in patients compared with controls (170.9 vs. 47.3 10³ nV² Hz⁻¹; p = 0.007), with a prolonged time to reach the maximal power (68.9 vs. 64.8 ms; p = 0.002). An analysis of the signal intensity (instantaneous average of cumulative power) revealed a distinct function between patients and controls. The total intensity was higher in patients compared with controls (137.1 vs. 39 10³ nV² Hz⁻¹ s⁻¹; p = 0.001) and the time to reach the maximal intensity was also prolonged (88.7 vs. 82.1 ms; p < 0.001). The high-frequency content of the QRS complexes was distinct between patients at risk of SCD and healthy controls. The wavelet transform is an efficient tool for spectral analysis of the QRS complexes that may contribute to stratification of risk.
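    A minimal sketch of the kind of wavelet-based spectral analysis described above: a complex Morlet CWT built with NumPy, applied to a synthetic "QRS" with and without an added 100 Hz burst standing in for fragmentation. The signal, wavelet parameters, and implementation are illustrative assumptions, not the paper's method.

```python
# High-frequency (85-130 Hz) power of a QRS-like pulse via a complex Morlet
# continuous wavelet transform implemented with plain NumPy convolutions.
import numpy as np

fs = 1000.0                        # sampling rate in Hz
t = np.arange(0, 0.2, 1 / fs)      # 200 ms window around a "QRS"

def morlet_cwt_power(x, freqs, fs, w0=6.0):
    """Power of x at each analysis frequency via complex Morlet convolution."""
    powers = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        scale = w0 * fs / (2 * np.pi * f)            # wavelet width in samples
        n = int(10 * scale) | 1                      # odd support length
        k = np.arange(n) - n // 2
        wavelet = np.exp(1j * w0 * k / scale) * np.exp(-0.5 * (k / scale) ** 2)
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit energy
        powers[i] = np.abs(np.convolve(x, wavelet, mode="same")) ** 2
    return powers

# Smooth "QRS" plus a burst of 100 Hz content (a stand-in for fragmentation)
qrs = np.exp(-((t - 0.1) ** 2) / (2 * 0.01 ** 2))
burst = 0.2 * np.sin(2 * np.pi * 100 * t) * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))
fragmented = qrs + burst

freqs = np.arange(85, 131, 5)      # the 85-130 Hz band of interest
p_smooth = morlet_cwt_power(qrs, freqs, fs).sum()
p_frag = morlet_cwt_power(fragmented, freqs, fs).sum()
print(p_frag > p_smooth)
```

    The fragmented beat carries more power in the 85-130 Hz band, which is the kind of contrast the study quantifies between patients and controls.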

  4. Real-time solar magnetograph operation system software design and user's guide

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1984-01-01

    The Real Time Solar Magnetograph (RTSM) operation system software design on the PDP11/23+ is presented along with the User's Guide. The RTSM operation software is for real-time instrumentation control, data collection and data management. The data is used for vector analysis, plotting or graphics display. The processed data is then easily compared with solar data from other sources, such as the Solar Maximum Mission (SMM).

  5. Far-field radiation patterns of aperture antennas by the Winograd Fourier transform algorithm

    NASA Technical Reports Server (NTRS)

    Heisler, R.

    1978-01-01

    A more time-efficient algorithm for computing the discrete Fourier transform, the Winograd Fourier transform (WFT), is described. The WFT algorithm is compared with other transform algorithms. Results indicate that the WFT algorithm is a very successful application in antenna analysis. Significant savings in CPU time will improve computer turnaround time and circumvent the need to resort to weekend runs.
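    The Winograd algorithm itself is not available in common numerical libraries, but the underlying point, that a fast transform reproduces the O(N²) DFT definition exactly while scaling far better with N, is easy to illustrate (a sketch using NumPy's FFT as the fast transform):

```python
# A naive O(N^2) DFT versus NumPy's fast transform: identical spectra,
# very different cost as N grows.
import numpy as np

def naive_dft(x):
    """Direct O(N^2) evaluation of the DFT definition."""
    n = len(x)
    k = np.arange(n)
    # DFT matrix: element (j, k) = exp(-2*pi*i*j*k/n)
    w = np.exp(-2j * np.pi * np.outer(k, k) / n)
    return w @ x

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
assert np.allclose(naive_dft(x), np.fft.fft(x))   # identical spectra
```

    For the array sizes in a far-field pattern computation the asymptotic gap is exactly what turned weekend-length runs into routine ones.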

  6. Lung function in type 2 diabetes: the Normative Aging Study.

    PubMed

    Litonjua, Augusto A; Lazarus, Ross; Sparrow, David; Demolles, Debbie; Weiss, Scott T

    2005-12-01

    Cross-sectional studies have noted that subjects with diabetes have lower lung function than non-diabetic subjects. We conducted this analysis to determine whether diabetic subjects have different rates of lung function change compared with non-diabetic subjects. We conducted a nested case-control analysis in 352 men who developed diabetes and 352 non-diabetic subjects in a longitudinal observational study of aging in men. We assessed lung function among cases and controls at three time points: Time0, prior to meeting the definition of diabetes; Time1, the point when the definition of diabetes was met; and Time2, the most recent follow-up exam. Cases had lower forced expiratory volume in 1s (FEV1) and forced vital capacity (FVC) at all time points, even with adjustment for age, height, weight, and smoking. In multiple linear regression models adjusting for relevant covariates, there were no differences in rates of FEV1 or FVC change over time between cases and controls. Men who are predisposed to develop diabetes have decreased lung function many years prior to the diagnosis, compared with men who do not develop diabetes. This decrement in lung function remains after the development of diabetes. We postulate that mechanisms involved in the insulin resistant state contribute to the diminished lung function observed in our subjects.

  7. Reproducibility of DCE-MRI time-intensity curve-shape analysis in patients with knee arthritis: A comparison with qualitative and pharmacokinetic analyses.

    PubMed

    van der Leij, Christiaan; Lavini, Cristina; van de Sande, Marleen G H; de Hair, Marjolein J H; Wijffels, Christophe; Maas, Mario

    2015-12-01

    To compare the between-session reproducibility of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) combined with time-intensity curve (TIC)-shape analysis in arthritis patients, within one scanner and between two different scanners, and to compare this method with qualitative analysis and pharmacokinetic modeling (PKM). Fifteen knee joint arthritis patients were included and scanned twice on a closed-bore 1.5T scanner (n = 9, group 1), or on a closed-bore 1.5T and on an open-bore 1.0T scanner (n = 6, group 2). DCE-MRI data were postprocessed using in-house developed software ("Dynamo"). Disease activity was assessed. Disease activity was comparable between the two visits. In group 1 qualitative analysis showed the highest reproducibility with intraclass correlation coefficients (ICCs) between 0.78 and 0.98 and root mean square-coefficients of variation (RMS-CoV) of 8.0%-14.9%. TIC-shape analysis showed a slightly lower reproducibility with similar ICCs (0.78-0.97) but higher RMS-CoV (18.3%-42.9%). The PKM analysis showed the lowest reproducibility with ICCs between 0.39 and 0.64 (RMS-CoV 21.5%-51.9%). In group 2 TIC-shape analysis of the two most important TIC-shape types showed the highest reproducibility with ICCs of 0.78 and 0.71 (RMS-CoV 29.8% and 59.4%) and outperformed the reproducibility of the most important qualitative parameter (ICC 0.31, RMS-CoV 45.1%) and the within-scanner reproducibility of PKM analysis. TIC-shape analysis is a robust postprocessing method within one scanner, almost as reproducible as the qualitative analysis. Between scanners, the reproducibility of the most important TIC-shapes outperform that of the most important qualitative parameter and the within-scanner reproducibility of PKM analysis. © 2015 Wiley Periodicals, Inc.
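    The RMS coefficient of variation (RMS-CoV) quoted above is a common between-session reproducibility summary. A minimal sketch under one common definition (per-subject CV from the two visits, then the RMS across subjects), with invented measurements rather than the study's data:

```python
# RMS-CoV between two scanning sessions: the standard deviation of each
# subject's pair of measurements divided by their mean, pooled as an RMS.
import math

def rms_cov(visit1, visit2):
    """RMS coefficient of variation (%) across paired measurements."""
    cvs = []
    for a, b in zip(visit1, visit2):
        mean = (a + b) / 2
        sd = abs(a - b) / math.sqrt(2)   # sample SD of two paired values
        cvs.append(sd / mean)
    return 100 * math.sqrt(sum(cv**2 for cv in cvs) / len(cvs))

# Hypothetical DCE-MRI parameter values for four subjects at two visits
v1 = [10.0, 12.0, 9.0, 11.0]
v2 = [11.0, 11.5, 9.5, 10.0]
print(round(rms_cov(v1, v2), 1))
```

    Lower RMS-CoV means better between-session agreement, which is why the 18.3%-42.9% range for TIC-shape analysis reads as "slightly lower reproducibility" than the qualitative analysis.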

  8. Ultrasonic dissection versus electrocautery in mastectomy for breast cancer - a meta-analysis.

    PubMed

    Currie, A; Chong, K; Davies, G L; Cummins, R S

    2012-10-01

    Electrocautery has advanced the practice of mastectomy, but significant morbidity, such as seroma and blood loss, remains a concern. This has led to newer forms of dissection being introduced, including ultrasonic dissection devices, which are thought to reduce tissue damage. The aim of this systematic review was to compare the outcomes after mastectomy using novel ultrasonic dissection or standard electrocautery in published trials. Medline, Embase, trial registries, conference proceedings and reference lists were searched for comparative trials of ultrasonic dissection versus electrocautery for mastectomy. The primary outcomes were total postoperative drainage, seroma development and intra-operative blood loss. Secondary outcomes were operative time and wound complications. Odds ratios were calculated for categorical outcomes and standardised mean differences for continuous outcomes. Six trials comprising 287 mastectomies were included in the analysis. There was no difference in total postoperative drainage (pooled weighted mean difference: -0.21 (95% CI: -0.70-0.29); p = 0.41) or seroma development (pooled odds ratio: 0.77 (95% CI: 0.43-1.37); p = 0.37). Intra-operative blood loss was slightly less for ultrasonic dissection compared to standard electrocautery (pooled weighted mean difference: -1.04 (95% CI: -2.00 to -0.08); p = 0.03). Ultrasonic dissection and standard electrocautery had similar outcomes with regard to operative time and wound complications. Ultrasonic dissection and standard electrocautery appear to deliver similar results in the mastectomy setting. Further cost-effectiveness analysis may guide surgeon selection in the use of new technologies for mastectomy. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. A Comparative Analysis of the Costs of Administration of an OSCE.

    ERIC Educational Resources Information Center

    Cusimano, Michael D.; And Others

    1994-01-01

    A study compared the costs of a six-station structured oral examination and an objective structured clinical examination (OSCE) for a surgery clerkship, assessing both faculty and materials costs. The OSCE was found to be more time-consuming and expensive. Cost-cutting measures and guidelines to assist medical schools in selecting test type are…

  10. An Analysis of Mathematics Interventions: Increased Time-on-Task Compared with Computer-Assisted Mathematics Instruction

    ERIC Educational Resources Information Center

    Calhoun, James M., Jr.

    2011-01-01

    Student achievement is not progressing in mathematics as measured by state, national, and international assessments. Much of the research points to mathematics curriculum and instruction as the root cause of student failure to achieve at levels comparable to other nations. Since mathematics is regarded as a gatekeeper to many educational…

  11. 75 FR 57272 - The Dun & Bradstreet Corporation; Analysis of Agreement Containing Consent Order to Aid Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ..., demographic, and other information that allow their customers to market to teachers, administrators, schools... turning to the other company. By contrast, MCH lacked a K-12 database comparable to MDR or QED's..., including the time and cost to develop a database with market coverage and accuracy comparable to MDR or QED...

  12. Cover-Copy-Compare and Spelling: One versus Three Repetitions

    ERIC Educational Resources Information Center

    Erion, Joel; Davenport, Cindy; Rodax, Nicole; Scholl, Bethany; Hardy, Jennifer

    2009-01-01

    Cover, copy, compare (CCC) has been used with success to improve spelling skills. This study adds to existing research by completing an analysis of the rewriting component of the intervention. The impact of varying the number of times a subject copied a word following an error was examined with four elementary age students. An adaptive alternating…

  13. An Analysis of Peer-Reviewed Scores and Impact Factors with Different Citation Time Windows: A Case Study of 28 Ophthalmologic Journals

    PubMed Central

    Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu

    2015-01-01

    Background: An important attribute of the traditional impact factor was the controversial 2-year citation window. So far, several scholars have proposed using different citation time windows for evaluating journals. However, there is no confirmation of whether a longer citation time window would be better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? In order to answer these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. Methods: The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (including 2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 ophthalmologic journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. An analysis of the correlation between impact factors with different citation time windows and peer-reviewed scores was carried out. Results: Although impact factor values with different citation time windows were different, there was a high level of correlation between them when it came to evaluating journals. In the current study, for ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison, when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals.
Research Limitations: Our study is based on ophthalmology journals and we only analyzed impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. Originality/Value: We designed the survey questionnaire ourselves, specifically to assess the real influence of journals. We used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation. PMID:26295157

  14. An Analysis of Peer-Reviewed Scores and Impact Factors with Different Citation Time Windows: A Case Study of 28 Ophthalmologic Journals.

    PubMed

    Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu

    2015-01-01

    An important attribute of the traditional impact factor was the controversial 2-year citation window. So far, several scholars have proposed using different citation time windows for evaluating journals. However, there is no confirmation of whether a longer citation time window would be better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? In order to answer these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (including 2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 ophthalmologic journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. An analysis of the correlation between impact factors with different citation time windows and peer-reviewed scores was carried out. Although impact factor values with different citation time windows were different, there was a high level of correlation between them when it came to evaluating journals. In the current study, for ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison, when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals. Our study is based on ophthalmology journals and we only analyzed impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns.
We designed the survey questionnaire ourselves, specifically to assess the real influence of journals. We used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation.

  15. Comparing groundwater recharge and storage variability from GRACE satellite observations with observed water levels and recharge model simulations

    NASA Astrophysics Data System (ADS)

    Allen, D. M.; Henry, C.; Demon, H.; Kirste, D. M.; Huang, J.

    2011-12-01

    Sustainable management of groundwater resources, particularly in water-stressed regions, requires estimates of groundwater recharge. This study in southern Mali, Africa, compares approaches for estimating groundwater recharge and understanding recharge processes using a variety of methods encompassing groundwater level-climate data analysis, GRACE satellite data analysis, and recharge modelling for current and future climate conditions. Time series data for GRACE (2002-2006) and observed groundwater level data (1982-2001) do not overlap. To overcome this problem, GRACE time series data were appended to the observed historical time series data, and the records compared. Terrestrial water storage anomalies from GRACE were corrected for soil moisture (SM) using the Global Land Data Assimilation System (GLDAS) to obtain monthly groundwater storage anomalies (GRACE-SM) and monthly recharge estimates. Historical groundwater storage anomalies and recharge were determined using the water table fluctuation method with observation data from 15 wells. Historical annual recharge averaged 145.0 mm (or 15.9% of annual rainfall) and compared favourably with the GRACE-SM estimate of 149.7 mm (or 14.8% of annual rainfall). Both records show lows and peaks in May and September, respectively; however, the peak for the GRACE-SM data is shifted later in the year, to November, suggesting that the GLDAS may poorly predict the timing of soil water storage in this region. Recharge simulation results show good agreement between the timing and magnitude of the mean monthly simulated recharge and the regional mean monthly storage anomaly hydrograph generated from all monitoring wells. Under future climate conditions, annual recharge is projected to decrease by 8% for areas with luvisols and by 11% for areas with nitosols. Given this potential reduction in groundwater recharge, there may be added stress placed on an already stressed resource.
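    The water table fluctuation method used for the historical estimates reduces to recharge = specific yield × water-table rise. A minimal sketch with assumed values (the specific yield, rises, and rainfall below are illustrative, not the study's data):

```python
# Water-table fluctuation (WTF) method: recharge is the specific yield times
# the seasonal rise in the water table. Values are invented for illustration.
specific_yield = 0.05                 # dimensionless, aquifer-dependent
rises_mm = [800.0, 1100.0, 950.0]     # seasonal water-table rises per year (mm)

recharge_mm = [specific_yield * dh for dh in rises_mm]
mean_recharge = sum(recharge_mm) / len(recharge_mm)
annual_rainfall_mm = 900.0
fraction = mean_recharge / annual_rainfall_mm   # recharge as share of rainfall
print(round(mean_recharge, 1), round(100 * fraction, 1))
```

    The study's 145.0 mm (15.9% of rainfall) figure is this calculation averaged over 15 observation wells and two decades of water-level records.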

  16. Energy dependence of SEP electron and proton onset times

    NASA Astrophysics Data System (ADS)

    Xie, H.; Mäkelä, P.; Gopalswamy, N.; St. Cyr, O. C.

    2016-07-01

    We study the large solar energetic particle (SEP) events that were detected by GOES in the >10 MeV energy channel during December 2006 to March 2014. We derive and compare solar particle release (SPR) times for the 0.25-10.4 MeV electrons and 10-100 MeV protons for the 28 SEP events. In the study, the electron SPR times are derived with the time-shifting analysis (TSA) and the proton SPR times are derived using both the TSA and the velocity dispersion analysis (VDA). Electron anisotropies are computed to evaluate the amount of scattering for the events under study. Our main results include: (1) near-relativistic electrons and high-energy protons are released at the same time, within 8 min, for most (16 of 23) SEP events. (2) There exists a good correlation between electron and proton acceleration, peak intensity, and intensity time profiles. (3) The TSA SPR times for 90.5 MeV and 57.4 MeV protons have maximum errors of 6 min and 10 min compared to the proton VDA release times, respectively, while the maximum error for 15.4 MeV protons can reach 32 min. (4) For 7 low-intensity events among the 23, large delays occurred for 6.5 MeV electrons and 90.5 MeV protons relative to 0.5 MeV electrons. Whether these delays are due to the time needed for the evolving shock to strengthen or due to particle transport effects remains unsolved.
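    TSA in its simplest form shifts the observed onset back by the particle travel time over an assumed scatter-free path; a nominal 1.2 AU Parker-spiral length is a common choice. The sketch below uses that assumption with relativistic kinematics, not the paper's exact procedure:

```python
# Time-shifting analysis: solar release time = observed onset minus the
# travel time of a particle of speed beta*c along an assumed 1.2 AU path.
import math

C_AU_PER_S = 1.0 / 499.005            # light travels 1 AU in ~499 s
PROTON_REST_MEV = 938.272

def beta(kinetic_mev, rest_mev=PROTON_REST_MEV):
    """Relativistic speed (v/c) from kinetic energy."""
    gamma = 1.0 + kinetic_mev / rest_mev
    return math.sqrt(1.0 - 1.0 / gamma**2)

def tsa_release_offset_min(kinetic_mev, path_au=1.2):
    """Minutes to shift an observed onset back to the solar release time."""
    travel_s = path_au / (beta(kinetic_mev) * C_AU_PER_S)
    return travel_s / 60.0

for e in (15.4, 57.4, 90.5):          # proton channels discussed above (MeV)
    print(e, round(tsa_release_offset_min(e), 1))
```

    Lower-energy protons travel more slowly, so errors in the assumed path length translate into larger SPR-time errors, consistent with the much larger maximum error reported for the 15.4 MeV channel.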

  17. A COMPARATIVE STUDY OF REAL-TIME AND STATIC ULTRASONOGRAPHY DIAGNOSES FOR THE INCIDENTAL DETECTION OF DIFFUSE THYROID DISEASE.

    PubMed

    Kim, Dong Wook

    2015-08-01

    The aim of this study was to compare the diagnostic accuracy of real-time and static ultrasonography (US) for the incidental detection of diffuse thyroid disease (DTD). In 118 consecutive patients, a single radiologist performed real-time US before thyroidectomy. For static US, the same radiologist retrospectively investigated the sonographic findings on a picture-archiving and communication system after 3 months. The diagnostic categories of both real-time and static US diagnoses were determined based on the number of abnormal findings, and the diagnostic indices were calculated by a receiver operating characteristic (ROC) curve analysis using the histopathologic results as the reference standard. Histopathologic results included normal thyroid (n = 77), Hashimoto thyroiditis (n = 11), non-Hashimoto lymphocytic thyroiditis (n = 29), and diffuse hyperplasia (n = 1). Normal thyroid and DTD showed significant differences in echogenicity, echotexture, glandular margin, and vascularity on both real-time and static US. There was a positive correlation between US categories and histopathologic results in both real-time and static US. The highest diagnostic indices were obtained when the cutoff criteria of real-time and static US diagnoses were chosen as indeterminate and suspicious for DTD, respectively. The ROC curve analysis showed that real-time US was superior to static US in diagnostic accuracy. Both real-time and static US may be helpful for the detection of incidental DTD, but real-time US is superior to static US for detecting incidental DTD.

  18. Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)

    2001-01-01

    Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.

  19. Analysis of the unbalanced NBI rotation experiments in the ISX-B, PLT and PDX tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stacey, W.M. Jr.; Ryu, C.M.; Malik, M.A.

    1985-07-01

    The recently developed Stacey-Sigmar theory for toroidal momentum confinement, which is based upon neoclassical gyroviscosity, has been applied to the analysis of the unbalanced NBI rotation experiments in ISX-B, PLT and PDX. Measured steady-state rotation velocities, momentum confinement times inferred therefrom and momentum confinement times inferred from rotation decay after termination of NBI were compared with theoretical predictions. Good agreement between theory and experiment was obtained over a wide range of the parameters which enter the theory (R,Z,T,B).

  20. Tooth brushing frequency and risk of new carious lesions.

    PubMed

    Holmes, Richard D

    2016-12-01

Data sources: Medline, Embase, CINAHL and the Cochrane databases. Study selection: Two reviewers selected studies; case-control, prospective cohort, retrospective cohort and experimental trials evaluating the effect of toothbrushing frequency on the incidence or increment of new carious lesions were considered. Data extraction and synthesis: Two reviewers undertook data abstraction independently using pre-piloted forms. Study quality was assessed using a quality assessment tool for quantitative studies developed by the Effective Public Health Practice Project (EPHPP). Meta-analysis of caries outcomes was carried out using RevMan, and meta-regressions were undertaken to assess the influence of sample size, follow-up period, caries diagnosis level and study methodological quality. Results: Thirty-three studies were included, of which 13 were considered methodologically strong, 14 moderate and six weak. Twenty-five studies contributed to the quantitative analysis. Compared with frequent brushers, self-reported infrequent brushers demonstrated a higher incidence of carious lesions, OR = 1.50 (95% CI: 1.34-1.69). The odds of having carious lesions differed little in subgroup analyses comparing ≥2 times/day vs <2 times/day brushers, OR = 1.45 (95% CI: 1.21-1.74), and ≥1 time/day vs <1 time/day brushers, OR = 1.56 (95% CI: 1.37-1.78). Brushing <2 times/day was associated with a significantly greater increment of carious lesions than brushing ≥2 times/day, standardised mean difference (SMD) = 0.34 (95% CI: 0.18-0.49). Overall, infrequent brushing was associated with an increment of carious lesions, SMD = 0.28 (95% CI: 0.13-0.44). Meta-analysis with type of dentition as subgroups found that the effect of infrequent brushing on the incidence and increment of carious lesions was higher in the deciduous dentition, OR = 1.75 (95% CI: 1.49-2.06), than in the permanent dentition, OR = 1.39 (95% CI: 1.29-1.49).
Meta-regression indicated that none of the included variables influenced the effect estimate. Conclusions: Individuals who state that they brush their teeth infrequently are at greater risk for the incidence or increment of new carious lesions than those brushing more frequently. The effect is more pronounced in the deciduous than in the permanent dentition. A few studies indicate that this effect is independent of the presence of fluoride in toothpaste.
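The pooled odds ratios reported above come from standard inverse-variance meta-analysis. As an illustrative sketch only (not the review's RevMan computation, and with hypothetical inputs), fixed-effect pooling of log odds ratios whose standard errors are recovered from their 95% confidence intervals looks like this:

```python
import math

def pool_odds_ratios(ors, ci_los, ci_his):
    """Fixed-effect inverse-variance pooling of odds ratios.
    Each OR's standard error on the log scale is recovered from its
    95% CI: SE = (ln(hi) - ln(lo)) / (2 * 1.96)."""
    log_ors = [math.log(o) for o in ors]
    ses = [(math.log(h) - math.log(l)) / (2 * 1.96)
           for l, h in zip(ci_los, ci_his)]
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))
```

A random-effects variant would add a between-study variance term to each weight before pooling.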

  1. Comparison of ITRF2014 station coordinate input time series of DORIS, VLBI and GNSS

    NASA Astrophysics Data System (ADS)

    Tornatore, Vincenza; Tanır Kayıkçı, Emine; Roggero, Marco

    2016-12-01

In this paper station coordinate time series from three space geodesy techniques that have contributed to the realization of the International Terrestrial Reference Frame 2014 (ITRF2014) are compared. In particular, the height component time series extracted from official combined intra-technique solutions submitted for ITRF2014 by DORIS, VLBI and GNSS Combination Centers have been investigated. The main goal of this study is to assess the level of agreement among these three space geodetic techniques. A novel analytic method, modeling time series as discrete-time Markov processes, is presented and applied to the compared time series. The analysis method has proven particularly suited to obtaining quasi-cyclostationary residuals, an important property for carrying out a reliable harmonic analysis. We looked for common signatures among the three techniques. Frequencies and amplitudes of the detected signals have been reported along with their percentage of incidence. Our comparison shows that two of the estimated signals, having one-year and 14-day periods, are common to all the techniques. Different hypotheses on the nature of the signal having a period of 14 days are presented. As a final check we compared the estimated velocities and their standard deviations (STD) for sites with co-located VLBI, GNSS and DORIS stations, obtaining a good agreement among the three techniques both in the horizontal (1.0 mm/yr mean STD) and in the vertical (0.7 mm/yr mean STD) component, although some sites show larger STDs, mainly due to lack of data, different data spans or noisy observations.
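The harmonic analysis described above estimates the amplitudes of signals at candidate periods (e.g. one year and 14 days) in the residual series. A generic least-squares harmonic fit, not the paper's Markov-process method, can be sketched as:

```python
import numpy as np

def fit_harmonic(t, y, period):
    """Least-squares estimate of the amplitude and phase of a sinusoid
    with a known period in an evenly or unevenly sampled series."""
    w = 2 * np.pi / period
    # Design matrix: constant offset + cosine + sine terms
    A = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    offset, a, b = coeffs
    amplitude = np.hypot(a, b)     # sqrt(a^2 + b^2)
    phase = np.arctan2(b, a)
    return amplitude, phase
```

Repeating the fit over a grid of trial periods gives a simple periodogram-style scan for common signatures across techniques.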

  2. Early mortality among children and adults in antiretroviral therapy programs in Southwest Ethiopia, 2003-15.

    PubMed

    Gesesew, Hailay Abrha; Ward, Paul; Woldemichael, Kifle; Mwanri, Lillian

    2018-01-01

Several studies have reported that the majority of deaths in HIV-infected people occur during early antiretroviral therapy (ART) follow-up. Early mortality refers to death from any cause among people on ART for a follow-up period of under 24 months. The current study assessed predictors of early HIV mortality in Southwest Ethiopia. We conducted a retrospective analysis of 5299 patient records dating from June 2003 to March 2015. To estimate survival time and compare time to event among the different groups of patients, we used Kaplan-Meier curves and the log-rank test. To identify mortality predictors, we used Cox regression analysis. We used SPSS-20 for all analyses. A total of 326 patients died over the 12-year follow-up period, corresponding to a 6.2% cumulative incidence and an incidence rate of 21.7 deaths per 1000 person-years of observation. Eighty-nine percent of the total deaths were documented in the first two years of follow-up, the early-term ART period. Early HIV mortality rates among adults were 50% lower in separated, divorced or widowed patients compared with never-married patients, 1.6 times higher in patients with baseline CD4 count <200 cells/μL compared with baseline CD4 count ≥200 cells/μL, 1.5 times higher in patients with baseline WHO clinical stage 3 or 4 compared with baseline WHO clinical stage 1 or 2, 2.1 times higher in patients with immunologic failure compared with no immunologic failure, 60% lower in patients with fair or poor adherence compared with good adherence, 2.9 times higher in patients with bedridden functional status compared with working functional status, and 2.7 times higher in patients who had no history of HIV testing before diagnosis compared with those who had. Most predictors of early mortality were the same as the predictors of overall HIV mortality.
When discontinuation was treated as an event, the predictors of overall HIV mortality included age between 25 and 50 years, baseline CD4 count, immunologic failure, bedridden functional status, and no history of HIV testing before diagnosis. The great majority of deaths were documented in the first two years of ART, and several predictors of early HIV mortality also predicted overall mortality whether discontinuation was treated as an event or censored. Interventions to strengthen HIV programs during the first two years of ART follow-up are therefore warranted in this population.
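The survival comparisons in studies like this rest on the Kaplan-Meier estimator. A minimal version of that estimator, a generic sketch with hypothetical times and events rather than the authors' SPSS analysis, is:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per patient; events: 1 = died, 0 = censored.
    Returns a list of (event_time, survival_probability) pairs."""
    order = np.argsort(times)
    times = np.asarray(times)[order]
    events = np.asarray(events)[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                     # still in follow-up
        deaths = np.sum((times == t) & (events == 1))    # events at time t
        s *= 1.0 - deaths / at_risk
        surv.append((t, s))
    return surv
```

Comparing the resulting curves between groups (e.g. by baseline CD4 stratum) is what the log-rank test then formalizes.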

  3. Rapid simultaneous high-resolution mapping of myelin water fraction and relaxation times in human brain using BMC-mcDESPOT.

    PubMed

    Bouhrara, Mustapha; Spencer, Richard G

    2017-02-15

A number of central nervous system (CNS) diseases exhibit changes in myelin content and in the magnetic resonance longitudinal (T1) and transverse (T2) relaxation times, which therefore represent important biomarkers of CNS pathology. Among the methods applied for measurement of myelin water fraction (MWF) and relaxation times, the multicomponent driven equilibrium single pulse observation of T1 and T2 (mcDESPOT) approach is of particular interest. mcDESPOT permits whole-brain mapping of multicomponent T1 and T2, with data acquisition accomplished within a clinically realistic acquisition time. Unfortunately, previous studies have indicated the limited performance of mcDESPOT in the modest signal-to-noise range of high-resolution mapping, which is required for the depiction of small structures and to reduce partial volume effects. Recently, we showed that a new Bayesian Monte Carlo (BMC) analysis substantially improved determination of MWF from mcDESPOT imaging data. However, our previous study was limited in that it did not discuss determination of relaxation times. Here, we extend the BMC analysis to the simultaneous determination of whole-brain MWF and relaxation times using the two-component mcDESPOT signal model. Simulation analyses and in-vivo human brain studies indicate the overall greater performance of this approach compared to the stochastic region contraction (SRC) algorithm conventionally used to derive parameter estimates from mcDESPOT data. SRC estimates of the transverse relaxation time of the long-T2 fraction, T2,l, and the longitudinal relaxation time of the short-T1 fraction, T1,s, clustered towards the lower and upper parameter search space limits, respectively, indicating failure of the fitting procedure. We demonstrate that this effect is absent in the BMC analysis. Our results also showed improved parameter estimation for BMC as compared to SRC for high-resolution mapping.
Overall, we find that the combination of BMC analysis and mcDESPOT, BMC-mcDESPOT, shows excellent performance for accurate high-resolution whole-brain mapping of MWF and bicomponent transverse and longitudinal relaxation times within a clinically realistic acquisition time. Published by Elsevier Inc.

  4. Reliability of measuring sciatic and tibial nerve movement with diagnostic ultrasound during a neural mobilisation technique.

    PubMed

    Ellis, Richard; Hing, Wayne; Dilley, Andrew; McNair, Peter

    2008-08-01

    Diagnostic ultrasound provides a technique whereby real-time, in vivo analysis of peripheral nerve movement is possible. This study measured sciatic nerve movement during a "slider" neural mobilisation technique (ankle dorsiflexion/plantar flexion and cervical extension/flexion). Transverse and longitudinal movement was assessed from still ultrasound images and video sequences by using frame-by-frame cross-correlation software. Sciatic nerve movement was recorded in the transverse and longitudinal planes. For transverse movement, at the posterior midthigh (PMT) the mean value of lateral sciatic nerve movement was 3.54 mm (standard error of measurement [SEM] +/- 1.18 mm) compared with anterior-posterior/vertical (AP) movement of 1.61 mm (SEM +/- 0.78 mm). At the popliteal crease (PC) scanning location, lateral movement was 6.62 mm (SEM +/- 1.10 mm) compared with AP movement of 3.26 mm (SEM +/- 0.99 mm). Mean longitudinal sciatic nerve movement at the PMT was 3.47 mm (SEM +/- 0.79 mm; n = 27) compared with the PC of 5.22 mm (SEM +/- 0.05 mm; n = 3). The reliability of ultrasound measurement of transverse sciatic nerve movement was fair to excellent (Intraclass correlation coefficient [ICC] = 0.39-0.76) compared with excellent (ICC = 0.75) for analysis of longitudinal movement. Diagnostic ultrasound presents a reliable, noninvasive, real-time, in vivo method for analysis of sciatic nerve movement.
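The frame-by-frame cross-correlation used to quantify longitudinal nerve movement can be illustrated in one dimension. This toy sketch (not the study's analysis software) estimates the shift, in pixels, between two intensity profiles by locating the peak of their full cross-correlation:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the pixel shift of frame_a relative to frame_b from the
    peak of their full (demeaned) cross-correlation."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    xcorr = np.correlate(a, b, mode="full")
    # Zero lag sits at index len(b) - 1 of the full correlation
    return int(np.argmax(xcorr)) - (len(b) - 1)
```

Applied to successive ultrasound frames (and in 2-D, to image patches), accumulating these shifts yields a displacement-over-time trace for the tissue under the transducer.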

  5. Questionable sound exposure outside of the womb: frequency analysis of environmental noise in the neonatal intensive care unit.

    PubMed

    Lahav, Amir

    2015-01-01

    Recent research raises concerns about the adverse effects of noise exposure on the developing preterm infant. However, current guidelines for NICU noise remain focused on loudness levels, leaving the problem of exposure to potentially harmful sound frequencies largely overlooked. This study examined the frequency spectra present in a level-II NICU. Noise measurements were taken in two level-II open-bay nurseries. Measurements were taken over 5 days for a period of 24 h each. Spectral analysis was focused on comparing sound frequencies in the range of human speech during daytime (7 AM-7 PM) vs. night-time (7 PM-7 AM). On average, daytime noise levels (Leq = 60.05 dBA) were higher than night-time (Leq = 58.67 dBA). Spectral analysis of frequency bands (>50 dB) revealed that infants were exposed to frequencies <500 Hz 100% of the time and to frequencies >500 Hz 57% of the time. During daytime, infants were exposed to nearly 20% more sounds within the speech frequency range compared with night-time (p = 0.018). Measuring the frequency spectra of NICU sounds is necessary to attain a thorough understanding of both the noise levels and the type of sounds that preterm infants are exposed to throughout their hospital stay. The risk of high-frequency noise exposure in the preterm population is still unclear and warrants further investigation. © 2014 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  6. Four layer bandage compared with short stretch bandage for venous leg ulcers: systematic review and meta-analysis of randomised controlled trials with data from individual patients

    PubMed Central

    Tierney, Jayne; Cullum, Nicky; Bland, J Martin; Franks, Peter J; Mole, Trevor; Scriven, Mark

    2009-01-01

    Objective To compare the effectiveness of two types of compression treatment (four layer bandage and short stretch bandage) in people with venous leg ulceration. Design Systematic review and meta-analysis of patient level data. Data sources Electronic databases (the Cochrane Central Register of Controlled Trials, the Cochrane Wounds Group Specialised Register, Medline, Embase, CINAHL, and National Research Register) and reference lists of retrieved articles searched to identify relevant trials and primary investigators. Primary investigators of eligible trials were invited to contribute raw data for re-analysis. Review methods Randomised controlled trials of four layer bandage compared with short stretch bandage in people with venous leg ulceration were eligible for inclusion. The primary outcome for the meta-analysis was time to healing. Cox proportional hazards models were run to compare the methods in terms of time to healing with adjustment for independent predictors of healing. Secondary outcomes included incidence and number of adverse events per patient. Results Seven eligible trials were identified (887 patients), and patient level data were retrieved for five (797 patients, 90% of known randomised patients). The four layer bandage was associated with significantly shorter time to healing: hazard ratio (95% confidence interval) from multifactorial model based on five trials was 1.31 (1.09 to 1.58), P=0.005. Larger ulcer area at baseline, more chronic ulceration, and previous ulceration were all independent predictors of delayed healing. Data from two trials showed no evidence of a difference in adverse event profiles between the two bandage types. Conclusions Venous leg ulcers in patients treated with four layer bandages heal faster, on average, than those of people treated with the short stretch bandage. Benefits were consistent across patients with differing prognostic profiles. PMID:19376798

  7. A Study of the Factors Associated with Risk for Development of Pressure Ulcers: A Longitudinal Analysis.

    PubMed

    Thomas, Elizebeth; Vinodkumar, Sudhaya; Mathew, Silvia; Setia, Maninder Singh

    2015-01-01

Pressure ulcers (PUs) are prevalent in hospitalized patients; they may cause clinical, psychological, and economic problems in these patients. Previous studies have been cross-sectional, used pooled data, or used Cox regression models to assess the risk of developing PUs. However, PU risk scores change over time, and models that account for time-varying variables are useful for cohort analysis of data. The present longitudinal study was conducted to compare the risk of PUs between surgical and nonsurgical patients, and to evaluate the factors associated with the development of these ulcers over time. We evaluated 290 hospitalized patients over a 4-month period. The main outcomes for our analysis were: (1) score on the pressure risk assessment scale; and (2) the proportion of individuals at severe risk of developing PUs. We used random effects models for longitudinal analysis of the data. The mean PU score was significantly higher in nonsurgical patients than in surgical patients at baseline (15.23 [3.86] vs. 9.33 [4.57]; P < 0.01). About 7% of all patients had a score of >20 at baseline and were considered at high risk for PUs; the proportion was significantly higher among nonsurgical patients than among surgical patients (14% vs. 4%, P = 0.003). In the adjusted models, there was no difference in severe PU risk between surgical and nonsurgical patients (odds ratio [OR]: 0.37, 95% confidence interval [CI]: 0.01-12.80). An additional day in the ward was associated with a significantly higher likelihood of being at high risk for PUs (OR: 1.47, 95% CI: 1.16-1.86). There were no significant differences between patients admitted for surgery and those who were not. Each additional day in the ward, however, matters for the development of a high-risk PU score on the monitoring scale, and these patients require active interventions.

  8. Renal and Metabolic Toxicities Following Initiation of HIV-1 Treatment Regimen in a Diverse, Multinational Setting: A Focused Safety Analysis of ACTG PEARLS (A5175)

    PubMed Central

    Romo, F. Touzard; Smeaton, L.M.; Campbell, T.B.; Riviere, C.; Mngqibisa, R.; Nyirenda, M.; Supparatpinyo, K.; Kumarasamy, N.; Hakim, J.G.; Flanigan, T.P.

    2015-01-01

    Background Convenient dosing, potency, and low toxicity support use of tenofovir disoproxil fumarate (TDF) as preferred nucleotide reverse transcriptase inhibitor (NRTI) for HIV-1 treatment. However, renal and metabolic safety of TDF compared to other NRTIs has not been well described in resource-limited settings. Methods This was a secondary analysis examining the occurrence of renal abnormalities (RAs) and renal and metabolic serious non-AIDS-defining events (SNADEs) through study follow-up between participants randomized to zidovudine (ZDV)/lamivudine/efavirenz and TDF/emtricitabine/efavirenz treatment arms within A5175/PEARLS trial. Exact logistic regression explored associations between baseline covariates and RAs. Response profile longitudinal analysis compared creatinine clearance (CrCl) over time between NRTI groups. Results Twenty-one of 1,045 participants developed RAs through 192 weeks follow-up; there were 15 out of 21 in the TDF arm (P = .08). Age 41 years or older (odds ratio [OR], 3.35; 95% CI, 1.1–13.1), history of diabetes (OR, 10.7; 95% CI, 2.1–55), and lower baseline CrCl (OR, 3.1 per 25 mL/min decline; 95% CI, 1.7–5.8) were associated with development of RAs. Renal SNADEs occurred in 42 participants; 33 were urinary tract infections and 4 were renal failure/insufficiency; one event was attributed to TDF. Significantly lower CrCl values were maintained among patients receiving TDF compared to ZDV (repeated measures analysis P = .05), however worsening CrCl from baseline was not observed with TDF exposure over time. Metabolic SNADEs were rare, but were higher in the ZDV arm (20 vs 3; P < .001). Conclusions TDF is associated with lower serious metabolic toxicities but not higher risk of RAs, serious renal events, or worsening CrCl over time compared to ZDV in this randomized multinational study. PMID:25433664

  9. Newspaper coverage of maternal health in Bangladesh, Rwanda and South Africa: a quantitative and qualitative content analysis

    PubMed Central

    Gugsa, Frey; Karmarkar, Ellora; Cheyne, Andrew; Yamey, Gavin

    2016-01-01

Objective To examine newspaper coverage of maternal health in three countries that have made varying progress towards Millennium Development Goal 5 (MDG 5): Bangladesh (on track), Rwanda (making progress, but not on track) and South Africa (no progress). Design We analysed each country's leading national English-language newspaper: Bangladesh's The Daily Star, Rwanda's The New Times/The Sunday Times, and South Africa's Sunday Times/The Times. We quantified the number of maternal health articles published from 1 January 2008 to 31 March 2013. We conducted a content analysis of a subset of 190 articles published from 1 October 2010 to 31 March 2013. Results Bangladesh's The Daily Star published 579 articles related to maternal health from 1 January 2008 to 31 March 2013, compared to 342 in Rwanda's The New Times/The Sunday Times and 253 in South Africa's Sunday Times/The Times over the same time period. The Daily Star had the highest proportion of stories advocating for or raising awareness of maternal health. Most maternal health articles in The Daily Star (83%) and The New Times/The Sunday Times (69%) used a ‘human-rights’ or ‘policy-based’ frame compared to 41% of articles from Sunday Times/The Times. Conclusions In the three countries included in this study, which are on different trajectories towards MDG 5, there were differences in the frequency, tone and content of their newspaper coverage of maternal health. However, no causal conclusions can be drawn about this association between progress on MDG 5 and the amount and type of media coverage of maternal health. PMID:26769780

  10. Curricula and Programmes in Petroleum Engineering for Higher Technical Education Institutions: Comparative Analysis

    ERIC Educational Resources Information Center

    Tymkiv, Nadiya

    2018-01-01

The article presents an analysis of the curricula that regulate the main purposes, essence and directions of petroleum training. It stresses the importance of drawing on Austrian, English and Norwegian experience in training petroleum engineers for the petroleum industry. The structure and content of…

  11. The Use of Time Series Analysis and t Tests with Serially Correlated Data Tests.

    ERIC Educational Resources Information Center

    Nicolich, Mark J.; Weinstein, Carol S.

    1981-01-01

Results of three methods of analysis applied to simulated autocorrelated data sets with an intervention point (varying in autocorrelation degree, variance of error term, and magnitude of intervention effect) are compared and presented. The three methods are: t tests; maximum likelihood Box-Jenkins (ARIMA); and Bayesian Box-Jenkins. (Author/AEF)
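The reason ordinary t tests mislead on serially correlated data is that positive autocorrelation shrinks the effective sample size. A small illustrative sketch of Bartlett's approximation for an AR(1) process (not the authors' simulation code):

```python
def effective_n(n, rho):
    """Approximate effective sample size of n observations drawn from an
    AR(1) process with lag-1 autocorrelation rho (Bartlett's correction)."""
    return n * (1 - rho) / (1 + rho)
```

With rho = 0.5, 100 observations carry the information of only about 33 independent ones, so a t test that assumes independence uses standard errors that are far too small and its nominal p-values are overconfident.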

  12. Validity of electronic diet recording nutrient estimates compared to dietitian analysis of diet records: A randomized controlled trial

    USDA-ARS?s Scientific Manuscript database

    Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...

  13. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data

    NASA Astrophysics Data System (ADS)

    Esmaeili, Mostafa; Motagh, Mahdi

    2016-07-01

Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual- or quad-polarized data provides an additional source of information that can further improve InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase stability criterion for PSInSAR analysis. ADI optimization is performed using a Simulated Annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of the PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) SAR images acquired by TerraSAR-X from July 2013 to January 2014 over a subsidence area in Iran, and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared with single-pol data, the use of optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with high rates of deformation that suffer from loss of phase stability over time. Classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly single-bounce.
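The Amplitude Dispersion Index that the optimization targets is simply the ratio of the temporal standard deviation to the temporal mean of the amplitude at each pixel; low ADI flags stable scatterers. A minimal sketch of that definition (not the paper's Simulated Annealing optimization):

```python
import numpy as np

def amplitude_dispersion_index(amplitudes):
    """ADI per pixel over a stack of co-registered SAR amplitude images.
    amplitudes: array of shape (n_images, rows, cols).
    Pixels with low ADI are kept as persistent scatterer candidates."""
    return amplitudes.std(axis=0) / amplitudes.mean(axis=0)
```

Candidate selection then thresholds this map (values around 0.25 are a common cutoff in the PSInSAR literature) before phase-stability screening.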

  14. Semi-Automated Trajectory Analysis of Deep Ballistic Penetrating Brain Injury

    PubMed Central

    Folio, Les; Solomon, Jeffrey; Biassou, Nadia; Fischer, Tatjana; Dworzak, Jenny; Raymont, Vanessa; Sinaii, Ninet; Wassermann, Eric M.; Grafman, Jordan

    2016-01-01

    Background Penetrating head injuries (PHIs) are common in combat operations and most have visible wound paths on computed tomography (CT). Objective We assess agreement between an automated trajectory analysis-based assessment of brain injury and manual tracings of encephalomalacia on CT. Methods We analyzed 80 head CTs with ballistic PHI from the Institutional Review Board approved Vietnam head injury registry. Anatomic reports were generated from spatial coordinates of projectile entrance and terminal fragment location. These were compared to manual tracings of the regions of encephalomalacia. Dice’s similarity coefficients, kappa, sensitivities, and specificities were calculated to assess agreement. Times required for case analysis were also compared. Results Results show high specificity of anatomic regions identified on CT with semiautomated anatomical estimates and manual tracings of tissue damage. Radiologist’s and medical students’ anatomic region reports were similar (Kappa 0.8, t-test p < 0.001). Region of probable injury modeling of involved brain structures was sensitive (0.7) and specific (0.9) compared with manually traced structures. Semiautomated analysis was 9-fold faster than manual tracings. Conclusion Our region of probable injury spatial model approximates anatomical regions of encephalomalacia from ballistic PHI with time-saving over manual methods. Results show potential for automated anatomical reporting as an adjunct to current practice of radiologist/neurosurgical review of brain injury by penetrating projectiles. PMID:23707123

  15. C-reactive protein-to-albumin ratio is a predictor of hepatitis B virus related decompensated cirrhosis: time-dependent receiver operating characteristics and decision curve analysis.

    PubMed

    Huang, Si-Si; Xie, Dong-Mei; Cai, Yi-Jing; Wu, Jian-Min; Chen, Rui-Chong; Wang, Xiao-Dong; Song, Mei; Zheng, Ming-Hua; Wang, Yu-Qun; Lin, Zhuo; Shi, Ke-Qing

    2017-04-01

Hepatitis B virus (HBV) infection remains a major health problem, and HBV-related decompensated cirrhosis (HBV-DC) usually carries a poor prognosis. Our aim was to determine the utility of inflammatory biomarkers in predicting mortality in HBV-DC. A total of 329 HBV-DC patients were enrolled. Survival estimates for the entire study population were generated using the Kaplan-Meier method. The prognostic values of the model for end-stage liver disease (MELD) score, Child-Pugh score, and the inflammatory biomarkers neutrophil-to-lymphocyte ratio, C-reactive protein-to-albumin ratio (CAR), and lymphocyte-to-monocyte ratio (LMR) for HBV-DC were compared using time-dependent receiver operating characteristic (ROC) curves and time-dependent decision curves. The mean survival time was 23.1 ± 15.8 months. Multivariate analysis identified age, CAR, LMR, and platelet count as independent prognostic risk factors. Kaplan-Meier analysis indicated that CAR of at least 1.0 (hazard ratio, 7.19; 95% confidence interval, 4.69-11.03) and LMR less than 1.9 (hazard ratio, 2.40; 95% confidence interval, 1.69-3.41) were independently associated with mortality in HBV-DC. The time-dependent ROC analysis indicated that CAR showed the best performance in predicting mortality of HBV-DC compared with LMR, MELD score, and Child-Pugh score. The results were also confirmed by time-dependent decision curves. CAR and LMR were associated with the prognosis of HBV-DC, and CAR was superior to LMR, MELD score, and Child-Pugh score in HBV-DC mortality prediction.

  16. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
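The linearity figures quoted above (R2 values over the calibration range) summarize an ordinary linear calibration fit of instrument response against concentration. A generic sketch of how such values are computed, with hypothetical data rather than the authors' processing pipeline:

```python
import numpy as np

def calibration_r2(conc, response):
    """Fit a linear calibration curve (response vs. concentration) and
    return slope, intercept, and the coefficient of determination R^2."""
    conc = np.asarray(conc, float)
    response = np.asarray(response, float)
    slope, intercept = np.polyfit(conc, response, 1)
    predicted = slope * conc + intercept
    ss_res = np.sum((response - predicted) ** 2)   # residual sum of squares
    ss_tot = np.sum((response - response.mean()) ** 2)
    return slope, intercept, 1 - ss_res / ss_tot
```

With an internal standard, the response would be the analyte-to-standard peak-area ratio rather than the raw intensity, which cancels run-to-run ionization variation.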

  17. Validating the WRF-Chem model for wind energy applications using High Resolution Doppler Lidar data from a Utah 2012 field campaign

    NASA Astrophysics Data System (ADS)

    Mitchell, M. J.; Pichugina, Y. L.; Banta, R. M.

    2015-12-01

Models are important tools for assessing the potential of wind energy sites, but the accuracy of these projections has not been properly validated. In this study, High Resolution Doppler Lidar (HRDL) data obtained with high temporal and spatial resolution at the heights of modern turbine rotors were compared to output from the WRF-Chem model in order to help improve the performance of the model in producing accurate wind forecasts for the industry. HRDL data were collected from January 23 to March 1, 2012 during the Uintah Basin Winter Ozone Study (UBWOS) field campaign. The model validation method was based on qualitative comparison of wind field images, time-series analysis, and statistical analysis of the observed and modeled wind speed and direction, both for case studies and for the whole experiment. To compare the WRF-Chem model output to the HRDL observations, the model heights and forecast times were interpolated to match the observed times and heights. Time-height cross-sections of the HRDL and WRF-Chem wind speeds and directions were then plotted to select case studies. Cross-sections of the differences between the observed and forecast wind speed and direction were also plotted to visually analyze the model performance in different wind flow conditions. The statistical analysis includes the calculation of vertical profiles and time series of bias, correlation coefficient, root mean squared error, and coefficient of determination between the two datasets. The results from this analysis reveal where and when the model typically struggles in forecasting winds at the heights of modern turbine rotors, so that the model can be improved for the industry in the future.
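The verification statistics named above are standard forecast-verification metrics. A minimal sketch of bias, RMSE, and Pearson correlation between observed and modeled wind speeds (illustrative only, not the study's code):

```python
import numpy as np

def validation_stats(observed, modeled):
    """Bias, RMSE, and Pearson correlation between observed and
    modeled values (e.g. lidar vs. WRF wind speeds at one height)."""
    obs = np.asarray(observed, float)
    mod = np.asarray(modeled, float)
    bias = np.mean(mod - obs)                       # mean model error
    rmse = np.sqrt(np.mean((mod - obs) ** 2))       # typical error size
    r = np.corrcoef(obs, mod)[0, 1]                 # linear association
    return bias, rmse, r
```

Computing these per model level yields the vertical profiles described in the abstract; computing them per forecast hour yields the corresponding time series.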

  18. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2017-12-01

Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  19. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
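
    The linearity figures quoted above (R2 over a 300-2500 ng/mL calibration range) come from an ordinary least-squares calibration curve of response ratio (analyte to internal standard) against concentration. A minimal sketch, with invented response ratios rather than the paper's data:

```python
# Illustrative internal-standard calibration: fit ratio = slope*conc + intercept
# by ordinary least squares and report R^2, the linearity figure of merit the
# abstract cites. Concentrations and ratios below are made-up placeholders.

def calibrate(conc, ratio):
    """Return (slope, intercept, R^2) of the least-squares calibration line."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(ratio) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, ratio))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, ratio))
    ss_tot = sum((y - my) ** 2 for y in ratio)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical calibration points spanning the reported 300-2500 ng/mL range.
conc  = [300, 600, 1200, 1800, 2500]
ratio = [0.31, 0.59, 1.22, 1.79, 2.52]
slope, intercept, r2 = calibrate(conc, ratio)
print(f"slope={slope:.4g} intercept={intercept:.3g} R2={r2:.4f}")
```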

  20. Single-site Versus Multiport Robotic Hysterectomy in Benign Gynecologic Diseases: A Retrospective Evaluation of Surgical Outcomes and Cost Analysis.

    PubMed

    Bogliolo, Stefano; Ferrero, Simone; Cassani, Chiara; Musacchi, Valentina; Zanellini, Francesca; Dominoni, Mattia; Spinillo, Arsenio; Gardella, Barbara

    2016-01-01

    To compare the surgical outcomes and costs of robotic-assisted hysterectomy with the single-site (RSSH) or multiport approach (RH). A retrospective analysis of a prospectively collected database (Canadian Task Force classification II-1). A university hospital. Consecutive women who underwent robotic-assisted total laparoscopic hysterectomy and bilateral salpingo-oophorectomy for the treatment of benign gynecologic diseases. Data on surgical approach, surgical outcomes, and costs were collected in a prospective database and retrospectively analyzed. The total operative time, console time, docking time, estimated blood loss, conversion rate, and surgical complications rate were compared between the 2 study groups. Cost analysis was performed. One hundred four patients underwent total robotic-assisted hysterectomy and bilateral salpingo-oophorectomy (45 RSSH and 59 RH). There was no significant difference in the indications for surgery and in the characteristics of the patients between the 2 study groups. There was no significant difference between the single-site and multiport approach in console time, surgical complication rate, conversion rate, and postoperative pain. The docking time was lower in the RH group (p = .0001). The estimated blood loss and length of hospitalization were lower in the RSSH group (p = .0008 and p = .009, respectively). The cost analysis showed significant differences in favor of RSSH. RSSH should be preferred to RH when hysterectomy is performed for benign disease because it appears to be at least as effective and safe, with a potential cost reduction. However, because of the high cost and absence of clear advantages, the robotic approach should be considered only for selected patients. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  1. Embedded Hyperchaotic Generators: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Sadoudi, Said; Tanougast, Camel; Azzaz, Mohamad Salah; Dandache, Abbas

    In this paper, we present a comparative analysis of FPGA implementation performances, in terms of throughput and resources cost, of five well-known autonomous continuous hyperchaotic systems. The goal of this analysis is to identify the embedded hyperchaotic generator which leads to designs with small logic area cost, satisfactory throughput rates, low power consumption and low latency required for embedded applications such as secure digital communications between embedded systems. To implement the four-dimensional (4D) chaotic systems, we use a new structural hardware architecture based on direct VHDL description of the fourth-order Runge-Kutta method (RK-4). The comparative analysis shows that the hyperchaotic Lorenz generator provides attractive performance compared with the others. In fact, its hardware implementation requires only 2067 CLB-slices, 36 multipliers and no block RAMs, and achieves a throughput rate of 101.6 Mbps, at the output of the FPGA circuit, at a clock frequency of 25.315 MHz with a low latency time of 316 ns. Consequently, these implementation results make the embedded hyperchaotic Lorenz generator the best candidate for embedded communication applications.
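
    The designs above realize fourth-order Runge-Kutta (RK-4) stepping directly in VHDL. As a language-neutral sketch of the same numerical scheme, here is RK-4 applied to the classic 3D Lorenz system; a 4D hyperchaotic system is integrated identically, with one extra state component. The equations and parameters are the textbook Lorenz values, not taken from the paper:

```python
# Fixed-step RK-4 integrator, the numerical core the abstract describes.
# The 3D Lorenz system stands in for the paper's 4D hyperchaotic systems.

def lorenz(s, a=10.0, b=8.0 / 3.0, c=28.0):
    x, y, z = s
    return (a * (y - x), x * (c - z) - y, x * y - b * z)

def rk4_step(f, s, h):
    """One fourth-order Runge-Kutta step of size h from state s."""
    k1 = f(s)
    k2 = f(tuple(si + 0.5 * h * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + 0.5 * h * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + h * ki for si, ki in zip(s, k3)))
    return tuple(si + h / 6.0 * (a1 + 2 * a2 + 2 * a3 + a4)
                 for si, a1, a2, a3, a4 in zip(s, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
for _ in range(5000):                 # integrate 50 time units at h = 0.01
    state = rk4_step(lorenz, state, 0.01)
print(state)                          # a point on the Lorenz attractor
```

    A hardware pipeline evaluates the same four slope terms per step; the per-step latency the paper reports (316 ns) is essentially the depth of this computation in the FPGA fabric.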

  2. Using the cost-effectiveness of allogeneic islet transplantation to inform induced pluripotent stem cell-derived β-cell therapy reimbursement.

    PubMed

    Archibald, Peter R T; Williams, David J

    2015-11-01

    In the present study a cost-effectiveness analysis of allogeneic islet transplantation was performed and the financial feasibility of a human induced pluripotent stem cell-derived β-cell therapy was explored. Previously published cost and health benefit data for islet transplantation were utilized to perform the cost-effectiveness and sensitivity analyses. It was determined that, over a 9-year time horizon, islet transplantation would become cost saving and 'dominate' the comparator. Over a 20-year time horizon, islet transplantation would incur significant cost savings over the comparator (GB£59,000). Finally, assuming a similar cost of goods to islet transplantation and a lack of requirement for immunosuppression, a human induced pluripotent stem cell-derived β-cell therapy would dominate the comparator over an 8-year time horizon.
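
    The time-horizon logic above reduces to simple cumulative-cost arithmetic: a large up-front cost plus a low annual cost crosses below the comparator's running cost after some year. All figures in this sketch are invented placeholders, not the study's data:

```python
# Back-of-envelope break-even calculation for an intervention with a high
# up-front cost but lower annual cost than the comparator (illustrative only).

def break_even_year(upfront, annual, comparator_annual, horizon):
    """First year at which cumulative intervention cost <= comparator cost."""
    for year in range(1, horizon + 1):
        if upfront + annual * year <= comparator_annual * year:
            return year
    return None  # never breaks even within the horizon

# Hypothetical: GBP 80k up-front + 2k/yr versus 12k/yr standard care.
print(break_even_year(80_000, 2_000, 12_000, 20))  # -> 8
```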

  3. Electrodynamic actuators for rocket engine valves

    NASA Technical Reports Server (NTRS)

    Fiet, O.; Doshi, D.

    1972-01-01

    Actuators of the type employed in acoustic loudspeakers operate liquid rocket engine valves, with the light paper cones replaced by flexible metal diaphragms. Comparative analysis indicates better response time than solenoid actuators, and improved service life and reliability.

  4. Research related to improved computer aided design software package. [comparative efficiency of finite, boundary, and hybrid element methods in elastostatics

    NASA Technical Reports Server (NTRS)

    Walston, W. H., Jr.

    1986-01-01

    The comparative computational efficiencies of the finite element (FEM), boundary element (BEM), and hybrid boundary element-finite element (HVFEM) analysis techniques are evaluated for representative bounded domain interior and unbounded domain exterior problems in elastostatics. Computational efficiency is carefully defined in this study as the computer time required to attain a specified level of solution accuracy. The study found the FEM superior to the BEM for the interior problem, while the reverse was true for the exterior problem. The hybrid analysis technique was found to be comparable or superior to both the FEM and BEM for both the interior and exterior problems.

  5. Comprehensive Bibliography of Pakistan Archaeology: Paleolithic to Historic Times. South Asia Series, Occasional Paper No. 24.

    ERIC Educational Resources Information Center

    King, Denise E.

    The comprehensive bibliography is a compilation of twentieth-century documents about Pakistan prehistory from Paleolithic times to the arrival of the Greeks in approximately 330 B.C.; it also includes some of the major archaeological studies in adjacent countries which have a bearing on the interpretation and comparative analysis of Pakistan…

  6. Constant and Progressive Time Delay Procedures for Teaching Children with Autism: A Literature Review

    ERIC Educational Resources Information Center

    Walker, Gabriela

    2008-01-01

    A review of 22 empirical studies examining the use of constant (CTD) and progressive (PTD) time delay procedures employed with children with autism frames an indirect analysis of the demographic, procedural, methodological, and outcome parameters of existing research. None of the previous manuscripts compared the two response prompting procedures.…

  7. Integrated Formal Analysis of Time-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Time-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.
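
    The compression function at issue condenses the clock readings collected from several nodes into a single correction value while tolerating faulty inputs. As an illustration of that family of functions only (not TTE's exact definition, which is fixed by the SAE AS6802 specification), here is a generic fault-tolerant midpoint: discard the k largest and k smallest readings, then average the extremes of what remains:

```python
# Generic fault-tolerant midpoint, sketching the kind of clock-reading
# "compression" discussed above. Not the TTE-specified function.

def ft_midpoint(readings, k):
    """Midpoint of readings after discarding the k smallest and k largest."""
    if len(readings) <= 2 * k:
        raise ValueError("need more than 2k readings")
    trimmed = sorted(readings)[k:len(readings) - k]
    return (trimmed[0] + trimmed[-1]) / 2.0

# One wildly faulty reading (50.0) cannot drag the correction with k = 1.
print(ft_midpoint([-3.0, -0.2, 0.0, 0.1, 50.0], k=1))
```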

  8. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis.

    PubMed

    Schäfer, Sebastian; Nylund, Kim; Sævik, Fredrik; Engjom, Trond; Mézl, Martin; Jiřík, Radovan; Dimcevski, Georg; Gilja, Odd Helge; Tönnies, Klaus

    2015-08-01

    This paper presents a system for correcting motion influences in time-dependent 2D contrast-enhanced ultrasound (CEUS) images to assess tissue perfusion characteristics. The system consists of a semi-automatic frame selection method to find images with out-of-plane motion as well as a method for automatic motion compensation. Translational and non-rigid motion compensation is applied by introducing a temporal continuity assumption. A study consisting of 40 clinical datasets was conducted to compare the perfusion with simulated perfusion using pharmacokinetic modeling. Overall, the proposed approach decreased the mean average difference between the measured perfusion and the pharmacokinetic model estimation. It was non-inferior for three out of four patient cohorts to a manual approach and reduced the analysis time by 41% compared to manual processing. Copyright © 2014 Elsevier Ltd. All rights reserved.
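
    The paper's frame selection is semi-automatic and its motion model is more elaborate than can be shown here. As an assumed, much-simplified stand-in for the idea, candidate out-of-plane frames can be flagged by a drop in correlation between consecutive frames:

```python
# Simplified frame-selection cue (illustrative, not the paper's algorithm):
# a frame that correlates poorly with its predecessor is a candidate for
# out-of-plane motion and exclusion from perfusion analysis.

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def flag_out_of_plane(frames, threshold=0.8):
    """Indices of frames poorly correlated with the preceding frame."""
    return [i for i in range(1, len(frames))
            if pearson(frames[i - 1], frames[i]) < threshold]

# Toy flattened "frames": frame 2 is unrelated to its neighbours, so both
# transitions touching it are flagged.
frames = [[1, 2, 3, 4], [1.1, 2.0, 3.1, 3.9], [4, 1, 3, 2], [1, 2, 3, 4]]
print(flag_out_of_plane(frames))  # -> [2, 3]
```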

  9. Measurement of neoclassically predicted edge current density at ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Dunne, M. G.; McCarthy, P. J.; Wolfrum, E.; Fischer, R.; Giannone, L.; Burckhart, A.; the ASDEX Upgrade Team

    2012-12-01

    Experimental confirmation of neoclassically predicted edge current density in an ELMy H-mode plasma is presented. Current density analysis using the CLISTE equilibrium code is outlined and the rationale for accuracy of the reconstructions is explained. Sample profiles and time traces from analysis of data at ASDEX Upgrade are presented. A high time resolution is possible due to the use of an ELM-synchronization technique. Additionally, the flux-surface-averaged current density is calculated using a neoclassical approach. Results from these two separate methods are then compared and are found to validate the theoretical formula. Finally, several discharges are compared as part of a fuelling study, showing that the size and width of the edge current density peak at the low-field side can be explained by the electron density and temperature drives and their respective collisionality modifications.

  10. Near Real-Time Surveillance for Influenza Vaccine Safety: Proof-of-Concept in the Vaccine Safety Datalink Project

    PubMed Central

    Greene, Sharon K.; Kulldorff, Martin; Lewis, Edwin M.; Li, Rong; Yin, Ruihua; Weintraub, Eric S.; Fireman, Bruce H.; Lieu, Tracy A.; Nordin, James D.; Glanz, Jason M.; Baxter, Roger; Jacobsen, Steven J.; Broder, Karen R.; Lee, Grace M.

    2010-01-01

    The emergence of pandemic H1N1 influenza in 2009 has prompted public health responses, including production and licensure of new influenza A (H1N1) 2009 monovalent vaccines. Safety monitoring is a critical component of vaccination programs. As proof-of-concept, the authors mimicked near real-time prospective surveillance for prespecified neurologic and allergic adverse events among enrollees in 8 medical care organizations (the Vaccine Safety Datalink Project) who received seasonal trivalent inactivated influenza vaccine during the 2005/06–2007/08 influenza seasons. In self-controlled case series analysis, the risk of adverse events in a prespecified exposure period following vaccination was compared with the risk in 1 control period for the same individual either before or after vaccination. In difference-in-difference analysis, the relative risk in exposed versus control periods each season was compared with the relative risk in previous seasons since 2000/01. The authors used Poisson-based analysis to compare the risk of Guillian-Barré syndrome following vaccination in each season with that in previous seasons. Maximized sequential probability ratio tests were used to adjust for repeated analyses on weekly data. With administration of 1,195,552 doses to children under age 18 years and 4,773,956 doses to adults, no elevated risk of adverse events was identified. Near real-time surveillance for selected adverse events can be implemented prospectively to rapidly assess seasonal and pandemic influenza vaccine safety. PMID:19965887
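
    The self-controlled case series comparison above boils down to a rate ratio: the event rate in a post-vaccination risk window against the rate in a control window for the same individuals. A minimal sketch with invented counts and window lengths, not the study's data:

```python
# Core of a self-controlled case series estimate (illustrative numbers):
# relative incidence = rate in risk window / rate in control window.

def relative_incidence(events_risk, days_risk, events_ctrl, days_ctrl):
    """Rate ratio of the exposure window versus the control window."""
    return (events_risk / days_risk) / (events_ctrl / days_ctrl)

# Hypothetical: 12 events in a 42-day risk window vs 10 in a 42-day control.
print(round(relative_incidence(12, 42, 10, 42), 2))  # -> 1.2
```

    In the real analysis this ratio is estimated by conditional Poisson regression, and the maximized sequential probability ratio test keeps the repeated weekly looks from inflating the false-alarm rate.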

  11. A retrospective analysis of the effect of discussion in teleconference and face-to-face scientific peer-review panels

    PubMed Central

    Carpenter, Afton S; Sullivan, Joanne H; Deshmukh, Arati; Glisson, Scott R; Gallo, Stephen A

    2015-01-01

    Objective With the use of teleconferencing for grant peer-review panels increasing, further studies are necessary to determine the efficacy of the teleconference setting compared to the traditional onsite/face-to-face setting. The objective of this analysis was to examine the effects of discussion, namely changes in application scoring premeeting and postdiscussion, in these settings. We also investigated other parameters, including the magnitude of score shifts and application discussion time in face-to-face and teleconference review settings. Design The investigation involved a retrospective, quantitative analysis of premeeting and postdiscussion scores and discussion times for teleconference and face-to-face review panels. The analysis included 260 and 212 application score data points and 212 and 171 discussion time data points for the face-to-face and teleconference settings, respectively. Results The effect of discussion was found to be small, on average, in both settings. However, discussion was found to be important for at least 10% of applications, regardless of setting, with these applications moving over a potential funding line in either direction (fundable to unfundable or vice versa). Small differences were uncovered relating to the effect of discussion between settings, including a decrease in the magnitude of the effect in the teleconference panels as compared to face-to-face. Discussion time (despite teleconferences having shorter discussions) was observed to have little influence on the magnitude of the effect of discussion. Additionally, panel discussion was found to often result in a poorer score (as opposed to an improvement) when compared to reviewer premeeting scores. This was true regardless of setting or assigned reviewer type (primary or secondary reviewer). Conclusions Subtle differences were observed between settings, potentially due to reduced engagement in teleconferences. 
Overall, further research is required on the psychology of decision-making, team performance and persuasion to better elucidate the group dynamics of telephonic and virtual ad-hoc peer-review panels. PMID:26351194

  12. Laparoscopic versus open repair for perforated peptic ulcer: A meta-analysis of randomized controlled trials.

    PubMed

    Tan, Shanjun; Wu, Guohao; Zhuang, Qiulin; Xi, Qiulei; Meng, Qingyang; Jiang, Yi; Han, Yusong; Yu, Chao; Yu, Zhen; Li, Ning

    2016-09-01

    The role of laparoscopic surgery in the repair for peptic ulcer disease is unclear. The present study aimed to compare the safety and efficacy of laparoscopic versus open repair for peptic ulcer disease. Randomized controlled trials (RCTs) comparing laparoscopic versus open repair for peptic ulcer disease were identified from MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials, and references of identified articles and relevant reviews. Primary outcomes were postoperative complications, mortality, and reoperation. Secondary outcomes were operative time, postoperative pain, postoperative hospital stay, nasogastric tube duration, and time to resume diet. Statistical analysis was carried out by Review Manager software. Five RCTs investigating a total of 549 patients, of whom, 279 received laparoscopic repair and 270 received open repair, were included in the final analysis. There were no significant differences between these two procedures in the primary outcomes, including overall postoperative complication rate, mortality, and reoperation rate. Subcategory analysis of postoperative complications showed that laparoscopic repair had also similar rates of repair site leakage, intra-abdominal abscess, postoperative ileus, pneumonia, and urinary tract infection as open surgery, except for a lower surgical site infection rate (P < 0.05). In addition, there were also no significant differences between these two procedures in the secondary outcomes of operative time, postoperative hospital stay, and time to resume diet, but laparoscopic repair had shorter nasogastric tube duration (P < 0.05) and less postoperative pain (P < 0.05) than open surgery. Laparoscopic surgery is comparable with open surgery in the setting of repair for perforated peptic ulcer. The obvious advantages of laparoscopic surgery are the lower surgical site infection rate, shorter nasogastric tube duration and less postoperative pain. 
However, more high-quality studies are needed to further assess the safety and efficacy of laparoscopic repair for peptic ulcer disease. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.

  13. Spatiotemporal analysis of particulate matter, sulfur dioxide and carbon monoxide concentrations over the city of Rio de Janeiro, Brazil

    NASA Astrophysics Data System (ADS)

    Zeri, Marcelo; Oliveira-Júnior, José Francisco; Lyra, Gustavo Bastos

    2011-09-01

    Time series of pollutants and weather variables measured at four sites in the city of Rio de Janeiro, Brazil, between 2002 and 2004, were used to characterize temporal and spatial relationships of air pollution. Concentrations of particulate matter (PM10), sulfur dioxide (SO2) and carbon monoxide (CO) were compared to national and international standards. The annual median concentration of PM10 was higher than the standard set by the World Health Organization (WHO) on all sites and the 24 h means exceeded the standards on several occasions on two sites. SO2 and CO did not exceed the limits, but the daily maximum of CO in one of the stations was 27% higher on weekends compared to weekdays, due to increased activity in a nearby Convention Center. Air temperature and vapor pressure deficit showed the highest correlations with pollutant concentrations. The concentrations of SO2 and CO were not correlated between sites, suggesting that local sources are more important for those pollutants than for PM10. The time series of pollutants and air temperature were decomposed in time and frequency by wavelet analysis. The results revealed that the common variability of air temperature and PM10 is dominated by temporal scales of 1-8 days, time scales that are associated with the passage of weather events, such as cold fronts.
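
    The shared 1-8 day variability above was found by wavelet decomposition. As a much-simplified stand-in (assumed, not the paper's method), a discrete Fourier cross-spectrum of two daily series can locate their dominant common period; the synthetic series below carry an invented 4-day cycle:

```python
# Locate the dominant period shared by two daily series via the product of
# their DFT magnitudes (a crude Fourier analogue of the wavelet result above).
import cmath
import math

def dft(x):
    """Naive O(n^2) discrete Fourier transform (fine for short series)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def dominant_common_period(a, b):
    """Period in samples (days) maximizing |A_k| * |B_k| over k = 1..n/2."""
    fa, fb = dft(a), dft(b)
    n = len(a)
    k = max(range(1, n // 2 + 1), key=lambda j: abs(fa[j]) * abs(fb[j]))
    return n / k

n = 64  # 64 synthetic "days"
temp = [math.sin(2 * math.pi * t / 4) for t in range(n)]              # 4-day cycle
pm10 = [0.8 * math.sin(2 * math.pi * t / 4 + 0.3) for t in range(n)]  # same cycle, lagged
print(dominant_common_period(temp, pm10))  # -> 4.0
```

    Unlike this global spectrum, the wavelet transform keeps time resolution, which is what lets the paper tie the 1-8 day band to individual frontal passages.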

  14. Quantifying memory in complex physiological time-series.

    PubMed

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.
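
    A much-simplified illustration of the idea (assumed; the paper uses inverse statistical analysis, which is more involved): mark "rare events" where the series exceeds a threshold, then inspect the waiting times between them. In a memoryless process these intervals are geometrically distributed; structure in them is one signature of memory:

```python
# Rare-event waiting times in a series, a toy version of the memory notion
# discussed above (not the paper's inverse-statistics estimator).

def waiting_times(series, threshold):
    """Intervals between successive samples whose magnitude exceeds threshold."""
    events = [i for i, v in enumerate(series) if abs(v) > threshold]
    return [b - a for a, b in zip(events, events[1:])]

# Toy series with large excursions at indices 10, 30 and 35.
series = [0.0] * 50
for i in (10, 30, 35):
    series[i] = 5.0
print(waiting_times(series, threshold=3.0))  # -> [20, 5]
```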

  15. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  16. Comparative bioavailability of two oral formulations of ketorolac tromethamine: Dolac and Exodol.

    PubMed

    Flores-Murrieta, F J; Granados-Soto, V; Castañeda-Hernández, G; Herrera, J E; Hong, E

    1994-03-01

    The bioavailability of ketorolac after administration of two oral formulations containing 10 mg of ketorolac tromethamine, Exodol and Dolac, to 12 healthy Mexican volunteers was compared. Subjects received both formulations according to a randomized crossover design and blood samples were drawn at selected times during 24 h. Ketorolac plasma concentrations were determined by HPLC and individual plasma-concentration-against-time curves were constructed. Maximal plasma concentration and AUC0-24 values were compared by analysis of variance followed by Westlake's confidence interval test. 90% confidence limits ranged from 80 to 125% for Cmax and from 85 to 118% for AUC0-24. It is concluded that the two assayed formulations are bioequivalent.
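
    The bioequivalence criterion above is the standard one: the 90% confidence interval of the test/reference ratio, computed on the log scale, must fall within 80-125%. A minimal sketch with invented paired Cmax values; the normal quantile stands in for the exact t-quantile, so this is an approximation:

```python
# Approximate 90% CI for the geometric mean test/reference ratio in a
# paired (crossover) design. Illustrative data, normal-quantile approximation.
import math
from statistics import NormalDist, mean, stdev

def ratio_ci_90(test, ref):
    """(lower, upper) of the approximate 90% CI for the geometric mean ratio."""
    d = [math.log(t / r) for t, r in zip(test, ref)]
    half = NormalDist().inv_cdf(0.95) * stdev(d) / math.sqrt(len(d))
    m = mean(d)
    return math.exp(m - half), math.exp(m + half)

# Hypothetical paired Cmax values for 6 subjects.
test = [310, 295, 320, 280, 305, 298]
ref  = [300, 310, 300, 290, 310, 305]
lo, hi = ratio_ci_90(test, ref)
print(f"{100 * lo:.0f}%-{100 * hi:.0f}%",
      "within 80-125%" if 0.80 <= lo and hi <= 1.25 else "outside 80-125%")
```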

  17. Implementation of laparoscopic hysterectomy: maintenance of skills after a mentorship program.

    PubMed

    Twijnstra, A R H; Blikkendaal, M D; Kolkman, W; Smeets, M J G H; Rhemrev, J P T; Jansen, F W

    2010-01-01

    To evaluate the implementation and maintenance of advanced laparoscopic skills after a structured mentorship program in laparoscopic hysterectomy (LH). Cohort retrospective analysis of 104 successive LHs performed by two gynecologists during and after a mentorship program. LHs were compared for indication, patient characteristics and intraoperative characteristics. As a frame of reference, 94 LHs performed by the mentor were analyzed. With regard to indication, blood loss and adverse outcomes, both trainees performed LHs during their mentorship program comparable with the LHs performed by the mentor. The difference in mean operating time between trainees and mentor was not clinically significant. Both trainees progressed along a learning curve, while operating time remained statistically constant and comparable to that of the mentor. After completing the mentorship program, both gynecologists maintained their acquired skills as blood loss, adverse outcome rates and operating time were comparable with the results during their traineeship. A mentorship program is an effective and durable tool for implementing a new surgical procedure in a teaching hospital with respect to patient safety aspects, as indications, operating time and adverse outcome rates are comparable to those of the mentor in his own hospital during and after completing the mentorship program. Copyright © 2010 S. Karger AG, Basel.

  18. Machine translation project alternatives analysis

    NASA Technical Reports Server (NTRS)

    Bajis, Catherine J.; Bedford, Denise A. D.

    1993-01-01

    The Machine Translation Project consists of several components, two of which, the Project Plan and the Requirements Analysis, have already been delivered. The Project Plan details the overall rationale, objectives and time-table for the project as a whole. The Requirements Analysis compares a number of available machine translation systems, their capabilities, possible configurations, and costs. The Alternatives Analysis has resulted in a number of conclusions and recommendations to the NASA STI program concerning the acquisition of specific MT systems and related hardware and software.

  19. Longevity of Self-etch Dentin Bonding Adhesives Compared to Etch-and-rinse Dentin Bonding Adhesives: A Systematic Review.

    PubMed

    Masarwa, Nader; Mohamed, Ahmed; Abou-Rabii, Iyad; Abu Zaghlan, Rawan; Steier, Liviu

    2016-06-01

    A systematic review and meta-analysis were performed to compare the longevity of Self-Etch Dentin Bonding Adhesives to Etch-and-Rinse Dentin Bonding Adhesives. The following databases were searched: PubMed, MEDLINE, Web of Science, CINAHL, and the Cochrane Library, complemented by a manual search of the Journal of Adhesive Dentistry. The MeSH keywords used were: "etch and rinse," "total etch," "self-etch," "dentin bonding agent," "bond durability," and "bond degradation." Included were in-vitro experimental studies performed on human dental tissues of sound tooth structure origin. The examined Self-Etch Bonds were of two subtypes, Two Steps and One Step Self-Etch Bonds, while Etch-and-Rinse Bonds were of two subtypes, Two Steps and Three Steps. The included studies measured microtensile bond strength (μTBS) to evaluate bond strength and possible longevity of both types of dental adhesives at different times. The selected studies depended on water storage as the aging technique. Statistical analysis was performed for outcome measurements compared at 24 h, 3 months, 6 months and 12 months of water storage. After 24 hours (p-value = 0.051), 3 months (p-value = 0.756), 6 months (p-value = 0.267), and 12 months (p-value = 0.785) of water storage, self-etch adhesives showed lower μTBS than the etch-and-rinse adhesives, but the comparisons were not statistically significant. In this study, longevity of Dentin Bonds was related to the measured μTBS. Although Etch-and-Rinse bonds showed higher values at all times, the meta-analysis found no difference in longevity of the two types of bonds at the examined aging times. Copyright © 2016 Elsevier Inc. All rights reserved.
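
    The pooling step behind a meta-analysis like the one above is, in the simplest fixed-effect case, an inverse-variance weighted average of per-study effects. A sketch with invented study numbers, not the review's data:

```python
# Fixed-effect inverse-variance pooling of per-study mean differences
# (illustrative; real meta-analyses also assess heterogeneity and may use
# random-effects models).

def pooled_effect(effects, ses):
    """Inverse-variance weighted mean difference and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    total = sum(weights)
    est = sum(w * e for w, e in zip(weights, effects)) / total
    return est, (1.0 / total) ** 0.5

# Three hypothetical studies: uTBS difference in MPa (etch-and-rinse minus
# self-etch) and its standard error.
est, se = pooled_effect([4.0, 2.0, 3.0], [2.0, 1.0, 2.0])
print(round(est, 2), round(se, 2))  # -> 2.5 0.82
```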

  20. QuickView video preview software of colon capsule endoscopy: reliability in presenting colorectal polyps as compared to normal mode reading.

    PubMed

    Farnbacher, Michael J; Krause, Horst H; Hagel, Alexander F; Raithel, Martin; Neurath, Markus F; Schneider, Thomas

    2014-03-01

    OBJECTIVE. Colon capsule endoscopy (CCE) proved to be highly sensitive in detection of colorectal polyps (CP). Major limitation is the time-consuming video reading. The aim of this prospective, double-center study was to assess the theoretical time-saving potential and its possible impact on the reliability of "QuickView" (QV), in the presentation of CP as compared to normal mode (NM). METHODS. During NM reading of 65 CCE videos (mean patient age 56 years), all frames showing CPs were collected and compared to the number of frames presented by QV at increasing QV settings (10, 20, ... 80%). Reliability of QV in presenting polyps <6 mm and ≥6 mm (significant polyp), and identifying patients for subsequent therapeutic colonoscopy, capsule egestion rate, cleansing level, and estimated time-saving potential were assessed. RESULTS. At a 30% QV setting, the QV video presented 89% of the significant polyps and 86% of any polyps with ≥1 frame (per-polyp analysis) identified in NM before. At a 10% QV setting, 98% of the 52 patients with significant polyps could be identified (per-patient analysis) by QV video analysis. Capsule excretion rate was 74% and colon cleanliness was adequate in 85%. QV's presentation rate correlates to the QV setting, the polyp size, and the number of frames per finding. CONCLUSIONS. Depending on its setting, the reliability of QV in presenting CP as compared to NM reading is notable. However, if no significant polyp is presented by QV, NM reading must be performed afterwards. The reduction of frames to be analyzed in QV might speed up identification of candidates for therapeutic colonoscopy.

  1. Comparative Bioinformatic Analysis of Active Site Structures in Evolutionarily Remote Homologues of α,β-Hydrolase Superfamily Enzymes.

    PubMed

    Suplatov, D A; Arzhanik, V K; Svedas, V K

    2011-01-01

    Comparative bioinformatic analysis is the cornerstone of the study of enzymes' structure-function relationship. However, numerous enzymes that derive from a common ancestor and have undergone substantial functional alterations during natural selection appear not to have a sequence similarity acceptable for a statistically reliable comparative analysis. At the same time, their active site structures, in general, can be conserved, while other parts may largely differ. Therefore, it sounds both plausible and appealing to implement a comparative analysis of the most functionally important structural elements - the active site structures; that is, the amino acid residues involved in substrate binding and the catalytic mechanism. A computer algorithm has been developed to create a library of enzyme active site structures based on the use of the PDB database, together with programs of structural analysis and identification of functionally important amino acid residues and cavities in the enzyme structure. The proposed methodology has been used to compare some α,β-hydrolase superfamily enzymes. The analysis revealed a high structural similarity of catalytic site areas, including the conservative organization of a catalytic triad and oxyanion hole residues, despite the wide functional diversity among the remote homologues compared. The methodology can be used to compare the structural organization of the catalytic and substrate binding sites of various classes of enzymes, as well as to study enzyme evolution and to create a databank of enzyme active site structures.

  2. Comparison of retinal thickness by Fourier-domain optical coherence tomography and OCT retinal image analysis software segmentation analysis derived from Stratus optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Tátrai, Erika; Ranganathan, Sudarshan; Ferencz, Mária; Debuc, Delia Cabrera; Somfai, Gábor Márk

    2011-05-01

    Purpose: To compare thickness measurements between Fourier-domain optical coherence tomography (FD-OCT) and time-domain OCT images analyzed with a custom-built OCT retinal image analysis software (OCTRIMA). Methods: Macular mapping (MM) by StratusOCT and the MM5 and MM6 scanning protocols of an RTVue-100 FD-OCT device were performed on 11 subjects with no retinal pathology. Retinal thickness (RT) and the thickness of the ganglion cell complex (GCC) obtained with the MM6 protocol were compared for each early treatment diabetic retinopathy study (ETDRS)-like region with the corresponding results obtained with OCTRIMA. RT results were compared by analysis of variance with Dunnett post hoc test, while GCC results were compared by paired t-test. Results: A high correlation was obtained for RT between OCTRIMA and the MM5 and MM6 protocols. In all regions, StratusOCT provided the lowest RT values (mean difference 43 +/- 8 μm compared to OCTRIMA, and 42 +/- 14 μm compared to RTVue MM6). All RTVue GCC measurements were significantly thicker (mean difference between 6 and 12 μm) than the GCC measurements of OCTRIMA. Conclusion: High correspondence was obtained between FD-OCT and StratusOCT-derived OCTRIMA analysis, not only for RT but also for the segmentation of intraretinal layers. However, a correction factor is required to compensate for OCT-specific differences to make measurements more comparable across available OCT devices.
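    The paired t-test used above to compare device-specific GCC thickness can be sketched as follows. The region-wise thickness values here are hypothetical stand-ins (the study's raw data are not given), and SciPy's `ttest_rel` is assumed for the paired test.

```python
# Sketch of a paired comparison of GCC thickness per ETDRS-like region,
# using hypothetical values in micrometers (not the study's actual data).
from scipy import stats

# Hypothetical GCC thickness for the same regions measured by two methods.
rtvue_gcc   = [112, 108, 105, 110, 107, 109, 111, 106, 104]
octrima_gcc = [103, 100,  96, 101,  98, 100, 102,  97,  95]

# Paired t-test: each region is measured by both methods on the same eye.
t_stat, p_value = stats.ttest_rel(rtvue_gcc, octrima_gcc)
mean_diff = sum(a - b for a, b in zip(rtvue_gcc, octrima_gcc)) / len(rtvue_gcc)
print(f"mean difference = {mean_diff:.1f} um, p = {p_value:.4f}")
```

    A systematic mean difference with a small p-value is what motivates the correction factor mentioned in the conclusion.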

  3. An analysis of metropolitan land-use by machine processing of earth resources technology satellite data

    NASA Technical Reports Server (NTRS)

    Mausel, P. W.; Todd, W. J.; Baumgardner, M. F.

    1976-01-01

    A successful application of state-of-the-art remote sensing technology in classifying an urban area into its broad land use classes is reported. This research demonstrates that numerous urban features are amenable to classification using ERTS multispectral data automatically processed by computer. Furthermore, such automatic data processing (ADP) techniques permit areal analysis on an unprecedented scale with a minimum expenditure of time. Also, classification results obtained using ADP procedures are consistent, comparable, and replicable. The results of classification are compared with the proposed USGS land use classification system in order to determine the level of classification that is feasible to obtain through ERTS analysis of metropolitan areas.

  4. A correlational analysis of the effects of changing environmental conditions on the NR atomic hydrogen maser

    NASA Technical Reports Server (NTRS)

    Dragonette, Richard A.; Suter, Joseph J.

    1992-01-01

    An extensive statistical analysis has been undertaken to determine if a correlation exists between changes in an NR atomic hydrogen maser's frequency offset and changes in environmental conditions. Correlation analyses have been performed comparing barometric pressure, humidity, and temperature with maser frequency offset as a function of time for periods ranging from 5.5 to 17 days. Semipartial correlation coefficients as large as -0.9 have been found between barometric pressure and maser frequency offset. Correlation between maser frequency offset and humidity was small compared to barometric pressure and unpredictable. Analysis of temperature data indicates that in the most current design, temperature does not significantly affect maser frequency offset.
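    The semipartial correlation used above (relating maser frequency offset to the part of barometric pressure not explained by another environmental variable) can be sketched as follows. The data, variable names, and the choice of temperature as the control variable are illustrative assumptions, not the study's records.

```python
# Minimal sketch of a semipartial correlation on synthetic data: correlate y
# with the part of x left after regressing out a control variable from x only.
import numpy as np

def semipartial_corr(y, x, control):
    """Correlation of y with the residual of x after removing the control."""
    # Residualize x on the control variable via ordinary least squares.
    A = np.column_stack([np.ones_like(control), control])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    x_resid = x - A @ coef
    return np.corrcoef(y, x_resid)[0, 1]

# Synthetic stand-ins: pressure partly tracks temperature, and the frequency
# offset responds strongly (and negatively) to pressure.
rng = np.random.default_rng(0)
temperature = rng.normal(size=200)
pressure = 0.3 * temperature + rng.normal(size=200)
offset = -0.9 * pressure + 0.05 * rng.normal(size=200)

print(round(semipartial_corr(offset, pressure, temperature), 2))
```

    A strongly negative coefficient, as reported in the abstract, indicates that pressure carries predictive information about the offset beyond what the control variable explains.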

  5. Application of Linear Discriminant Analysis in Dimensionality Reduction for Hand Motion Classification

    NASA Astrophysics Data System (ADS)

    Phinyomark, A.; Hu, H.; Phukpattaranont, P.; Limsakul, C.

    2012-01-01

    The classification of upper-limb movements based on surface electromyography (EMG) signals is an important issue in the control of assistive devices and rehabilitation systems. Increasing the number of EMG channels and features in order to increase the number of control commands can yield a high-dimensional feature vector. To cope with the accuracy and computation problems associated with high dimensionality, it is commonplace to apply a processing step that transforms the data to a space of significantly lower dimensions with only a limited loss of useful information. Linear discriminant analysis (LDA) has been successfully applied as an EMG feature projection method. Recently, a number of extended LDA-based algorithms have been proposed that are competitive with classical LDA in terms of both classification accuracy and computational cost. This paper presents the findings of a comparative study of classical LDA and five extended LDA methods. In a quantitative comparison based on seven multi-feature sets, three extended LDA-based algorithms, consisting of uncorrelated LDA, orthogonal LDA and orthogonal fuzzy neighborhood discriminant analysis, produce better class separability when compared with a baseline system (without feature projection), principal component analysis (PCA), and classical LDA. Based on seven-dimensional time-domain and time-scale feature vectors, these methods achieved 95.2% and 93.2% classification accuracy, respectively, using a linear discriminant classifier.
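    The classical LDA projection step described above can be sketched with scikit-learn. The data here are synthetic stand-ins for multi-channel EMG feature vectors (the class count, channel count, and feature layout are assumptions for illustration); note that LDA projects to at most one fewer dimension than the number of classes.

```python
# Sketch of LDA-based dimensionality reduction on synthetic "EMG features",
# assuming scikit-learn is available. Shapes and class counts are illustrative.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
n_classes, n_per_class, n_features = 6, 50, 28  # e.g. 4 features x 7 channels

# Each hand-motion class gets its own cluster in feature space.
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Supervised projection: at most (n_classes - 1) discriminant dimensions.
lda = LinearDiscriminantAnalysis(n_components=n_classes - 1)
X_proj = lda.fit_transform(X, y)
print(X.shape, "->", X_proj.shape)
```

    The projected features would then feed a downstream classifier, mirroring the feature-projection-plus-linear-classifier pipeline compared in the paper.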

  6. Efficient Analysis of Mass Spectrometry Data Using the Isotope Wavelet

    NASA Astrophysics Data System (ADS)

    Hussong, Rene; Tholey, Andreas; Hildebrandt, Andreas

    2007-09-01

    Mass spectrometry (MS) has become the de facto standard for high-throughput analysis in proteomics research. Its applications range from toxicity analysis to MS-based diagnostics. Often, the time spent on the MS experiment itself is significantly less than the time necessary to interpret the measured signals, since the amount of data can easily exceed several gigabytes. In addition, automated analysis is hampered by baseline artifacts, chemical as well as electrical noise, and an irregular spacing of data points. Thus, filtering techniques originating from signal and image analysis are commonly employed to address these problems. Unfortunately, smoothing, baseline reduction, and in particular a resampling of data points can affect important characteristics of the experimental signal. To overcome these problems, we propose a new family of wavelet functions based on the isotope wavelet, which is hand-tailored for the analysis of mass spectrometry data. The resulting technique is theoretically well-founded and compares very well with standard peak picking tools, since it is highly robust against noise spoiling the data, but at the same time sufficiently sensitive to detect even low-abundant peptides.
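    Wavelet-based peak picking of the general kind discussed above can be sketched with SciPy's continuous-wavelet peak finder. This is a stand-in, not the isotope wavelet itself: `find_peaks_cwt` uses a Ricker (Mexican hat) wavelet, and the spectrum here is synthetic.

```python
# Generic continuous-wavelet peak picking on a noisy synthetic "spectrum".
# Stand-in for the isotope-wavelet method: SciPy's find_peaks_cwt applies a
# Ricker wavelet across a range of widths and keeps ridge-line maxima.
import numpy as np
from scipy.signal import find_peaks_cwt

x = np.linspace(0, 100, 2000)
signal = np.exp(-(x - 30) ** 2 / 2) + 0.6 * np.exp(-(x - 70) ** 2 / 2)
rng = np.random.default_rng(1)
noisy = signal + 0.05 * rng.normal(size=x.size)

# Widths are given in samples; the peaks here span roughly 20-50 samples.
peak_idx = find_peaks_cwt(noisy, widths=np.arange(10, 60))
print([round(float(x[i]), 1) for i in peak_idx])
```

    Matching the wavelet family to the expected peak shape (as the isotope wavelet does for isotopic envelopes) is what makes this approach robust to noise without resampling the raw data.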

  7. Comparative study of two protocols for quantitative image-analysis of serotonin transporter clustering in lymphocytes, a putative biomarker of therapeutic efficacy in major depression.

    PubMed

    Romay-Tallon, Raquel; Rivera-Baltanas, Tania; Allen, Josh; Olivares, Jose M; Kalynchuk, Lisa E; Caruncho, Hector J

    2017-01-01

    The pattern of serotonin transporter clustering on the plasma membrane of lymphocytes extracted from human whole blood samples has been identified as a putative biomarker of therapeutic efficacy in major depression. Here we evaluated the possibility of performing a similar analysis using blood smears obtained from rats, and from control human subjects and depression patients. We hypothesized that we could optimize a protocol to make the analysis of serotonin transporter clustering in blood smears comparable to the analysis using isolated lymphocytes. Our data indicate that blood smears require a longer fixation time and longer incubation times with primary and secondary antibodies. In addition, the image analysis settings need to be optimized for smears. When these steps are followed, the quantitative analysis of both the number and size of serotonin transporter clusters on the plasma membrane of lymphocytes is similar for blood smears and isolated lymphocytes. The development of this novel protocol will greatly facilitate the collection of appropriate samples by eliminating the necessity and cost of specialized personnel for drawing blood samples, and by being a less invasive procedure. Therefore, this protocol will help us advance the validation of membrane protein clustering in lymphocytes as a biomarker of therapeutic efficacy in major depression, and bring it closer to clinical application.

  8. Analysis of environmental microplastics by vibrational microspectroscopy: FTIR, Raman or both?

    PubMed

    Käppler, Andrea; Fischer, Dieter; Oberbeckmann, Sonja; Schernewski, Gerald; Labrenz, Matthias; Eichhorn, Klaus-Jochen; Voit, Brigitte

    2016-11-01

    The contamination of aquatic ecosystems with microplastics has recently been reported through many studies, and negative impacts on the aquatic biota have been described. For the chemical identification of microplastics, mainly Fourier transform infrared (FTIR) and Raman spectroscopy are used. However, a critical comparison and validation of the two spectroscopic methods with respect to microplastics analysis has been missing. To close this knowledge gap, we investigated environmental samples by both Raman and FTIR spectroscopy. Firstly, particles and fibres >500 μm extracted from beach sediment samples were analysed by Raman and FTIR microspectroscopic single measurements. Our results illustrate that both methods are in principle suitable to identify microplastics from the environment. However, in some cases, especially for coloured particles, a combination of both spectroscopic methods is necessary for a complete and reliable characterisation of the chemical composition. Secondly, a marine sample containing particles <400 μm was investigated by Raman imaging and FTIR transmission imaging. The results were compared regarding number, size and type of detectable microplastics as well as spectra quality, measurement time and handling. We show that FTIR imaging leads to significant underestimation (about 35 %) of microplastics compared to Raman imaging, especially in the size range <20 μm. However, the measurement time of Raman imaging is considerably higher compared to FTIR imaging. In summary, we propose a further size division within the smaller microplastics fraction into 500-50 μm (rapid and reliable analysis by FTIR imaging) and 50-1 μm (detailed and more time-consuming analysis by Raman imaging). Graphical abstract: marine microplastic sample (fraction <400 μm) on a silicon filter (middle) with the corresponding Raman and IR images.

  9. Cost-effectiveness of intensive multifactorial treatment compared with routine care for individuals with screen-detected Type 2 diabetes: analysis of the ADDITION-UK cluster-randomized controlled trial

    PubMed Central

    Tao, L; Wilson, E C F; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2015-01-01

    Aims To examine the short- and long-term cost-effectiveness of intensive multifactorial treatment compared with routine care among people with screen-detected Type 2 diabetes. Methods Cost–utility analysis in ADDITION-UK, a cluster-randomized controlled trial of early intensive treatment in people with screen-detected diabetes in 69 UK general practices. Unit treatment costs and utility decrement data were taken from published literature. Accumulated costs and quality-adjusted life years (QALYs) were calculated using ADDITION-UK data from 1 to 5 years (short-term analysis, n = 1024); trial data were extrapolated to 30 years using the UKPDS outcomes model (version 1.3) (long-term analysis; n = 999). All costs were transformed to the UK 2009/10 price level. Results Adjusted incremental costs to the NHS were £285, £935, £1190 and £1745 over a 1-, 5-, 10- and 30-year time horizon, respectively (discounted at 3.5%). Adjusted incremental QALYs were 0.0000, – 0.0040, 0.0140 and 0.0465 over the same time horizons. Point estimate incremental cost-effectiveness ratios (ICERs) suggested that the intervention was not cost-effective although the ratio improved over time: the ICER over 10 years was £82 250, falling to £37 500 over 30 years. The ICER fell below £30 000 only when the intervention cost was below £631 per patient: we estimated the cost at £981. Conclusion Given conventional thresholds of cost-effectiveness, the intensive treatment delivered in ADDITION was not cost-effective compared with routine care for individuals with screen-detected diabetes in the UK. The intervention may be cost-effective if it can be delivered at reduced cost. PMID:25661661

  10. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. 
(Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)

  11. Application of comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry method to identify potential biomarkers of perinatal asphyxia in a non-human primate model.

    PubMed

    Beckstrom, Andrew C; Humston, Elizabeth M; Snyder, Laura R; Synovec, Robert E; Juul, Sandra E

    2011-04-08

    Perinatal asphyxia is a leading cause of brain injury in infants, occurring in 2-4 per 1000 live births. The clinical response to asphyxia is variable and difficult to predict with current diagnostic tests. Reliable biomarkers are needed to help predict the timing and severity of asphyxia, as well as response to treatment. Two-dimensional gas chromatography-time-of-flight-mass spectrometry (GC×GC-TOFMS) was used herein, in conjunction with chemometric data analysis approaches for metabolomic analysis in order to identify significant metabolites affected by birth asphyxia. Blood was drawn before and after 15 or 18 min of cord occlusion in a Macaca nemestrina model of perinatal asphyxia. Postnatal samples were drawn at 5 min of age (n=20 subjects). Metabolomic profiles of asphyxiated animals were compared to four controls delivered at comparable gestational age. Fifty metabolites with the greatest change pre- to post-asphyxia were identified and quantified. The metabolic profile of post-asphyxia samples showed marked variability compared to the pre-asphyxia samples. Fifteen of the 50 metabolites showed significant elevation in response to asphyxia, ten of which remained significant upon comparison to the control animals. This metabolomic analysis confirmed lactate and creatinine as markers of asphyxia and discovered new metabolites including succinic acid and malate (intermediates in the Krebs cycle) and arachidonic acid (a brain fatty acid and inflammatory marker) as potential biomarkers. GC×GC-TOFMS coupled with chemometric data analysis are useful tools to identify acute biomarkers of brain injury. Further study is needed to correlate these metabolites with severity of disease, and response to treatment. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Application of Comprehensive Two-Dimensional Gas Chromatography with Time-of-Flight Mass Spectrometry Method to Identify Potential Biomarkers of Perinatal Asphyxia in a Non-human Primate Model

    PubMed Central

    Beckstrom, Andrew C.; Humston, Elizabeth M.; Snyder, Laura R.; Synovec, Robert E.; Juul, Sandra E.

    2011-01-01

    Perinatal asphyxia is a leading cause of brain injury in infants, occurring in 2–4 per 1000 live births. The clinical response to asphyxia is variable and difficult to predict with current diagnostic tests. Reliable biomarkers are needed to help predict the timing and severity of asphyxia, as well as response to treatment. Two-dimensional gas chromatography-time-of-flight-mass spectrometry (GC x GC-TOFMS) was used herein, in conjunction with chemometric data analysis approaches for metabolomic analysis in order to identify significant metabolites affected by birth asphyxia. Blood was drawn before and after 15 or 18 minutes of cord occlusion in a Macaca nemestrina model of perinatal asphyxia. Postnatal samples were drawn at 5 minutes of age (n=20 subjects). Metabolomic profiles of asphyxiated animals were compared to four controls delivered at comparable gestational age. Fifty metabolites with the greatest change pre- to post-asphyxia were identified and quantified. The metabolic profile of post-asphyxia samples showed marked variability compared to the pre-asphyxia samples. Fifteen of the 50 metabolites showed significant elevation in response to asphyxia, ten of which remained significant upon comparison to the control animals. This metabolomic analysis confirmed lactate and creatinine as markers of asphyxia and discovered new metabolites including succinic acid and malate (intermediates in the Krebs cycle) and arachidonic acid (a brain fatty acid and inflammatory marker) as potential biomarkers. GC × GC-TOFMS coupled with chemometric data analysis are useful tools to identify acute biomarkers of brain injury. Further study is needed to correlate these metabolites with severity of disease, and response to treatment. PMID:21353677

  13. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

    Physiologic complexity is an important concept to characterize time series from biological systems, which associated to multiscale analysis can contribute to comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as function of scale. In this study we purpose and evaluate a set of three complexity metrics as function of time scales. Complexity metrics are derived from nonadditive entropy supported by generation of surrogate data, i.e. SDiffqmax, qmax and qzero. In order to access accuracy of proposed complexity metrics, receiver operating characteristic (ROC) curves were built and area under the curves was computed for three physiological situations. Heart rate variability (HRV) time series in normal sinus rhythm, atrial fibrillation, and congestive heart failure data set were analyzed. Results show that proposed metric for complexity is accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate for lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. Multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in wide context.
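    The multiscale procedure underlying the metrics above can be sketched as follows. This is the classical additive variant (coarse-graining plus sample entropy), not the nonadditive Tsallis-based metrics (SDiffqmax, qmax, qzero) the study proposes; the series here is synthetic white noise.

```python
# Sketch of multiscale entropy: coarse-grain the series at each scale, then
# compute sample entropy of the coarse-grained series. Classical variant only.
import numpy as np

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn: -log of the ratio of (m+1)-length to m-length template matches."""
    r = r_frac * np.std(x)
    def matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between all template pairs.
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (np.sum(d <= r) - len(templates)) / 2  # exclude self-matches
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
noise = rng.normal(size=600)
for scale in (1, 2, 4):
    print(scale, round(float(sample_entropy(coarse_grain(noise, scale))), 2))
```

    Plotting entropy as a function of scale is what distinguishes, e.g., uncorrelated noise from the long-range-correlated dynamics seen in healthy heart rate variability.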

  14. Automatic segmentation of invasive breast carcinomas from dynamic contrast-enhanced MRI using time series analysis.

    PubMed

    Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A; Gombos, Eva

    2014-08-01

    To accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast-enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on imaging pulse sequence, timing of bolus injection, arterial input function, imaging noise, and fitting algorithms. We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist's segmentation and the output of a commercial software, CADstream. The results are quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured in terms of the Dice similarity coefficient (DSC). The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared with the radiologist's segmentation and 82.1% accuracy and 100% sensitivity when compared with the CADstream output. The overlap of the algorithm output with the radiologist's segmentation and CADstream output, computed in terms of the DSC was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise. Simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series. The amount of overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC = 0.95. The time-series analysis based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor from DCE-MRI. © 2013 Wiley Periodicals, Inc.
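    The idea of segmenting by fitted dynamics rather than raw intensities can be illustrated with a much simpler stand-in than the authors' LDS algorithm: fit a first-order linear model x[t+1] = a*x[t] + b to each enhancement curve and compare the implied steady-state level b/(1-a). The curves and thresholds here are synthetic assumptions.

```python
# Illustrative stand-in for dynamics-based segmentation (NOT the authors' LDS
# method): fit x[t+1] = a*x[t] + b per enhancement curve by least squares and
# characterize the curve by its implied steady-state level b / (1 - a).
import numpy as np

def fit_ar1(curve):
    x0, x1 = curve[:-1], curve[1:]
    A = np.column_stack([x0, np.ones_like(x0)])
    (a, b), *_ = np.linalg.lstsq(A, x1, rcond=None)
    return a, b

t = np.arange(20)
tumor = 100 * (1 - np.exp(-0.5 * t))        # fast, strong enhancement
background = 10 * (1 - np.exp(-0.1 * t))    # slow, weak enhancement

for name, curve in [("tumor", tumor), ("background", background)]:
    a, b = fit_ar1(curve)
    print(name, "steady-state ~", round(float(b / (1 - a)), 1))
```

    Thresholding a fitted system parameter, rather than a single time point, is what gives such approaches robustness to imaging noise, as the DSC = 0.95 result above suggests.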

  15. Automatic Segmentation of Invasive Breast Carcinomas from DCE-MRI using Time Series Analysis

    PubMed Central

    Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A.; Gombos, Eva

    2013-01-01

    Purpose Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on imaging pulse sequence, timing of bolus injection, arterial input function, imaging noise and fitting algorithms. To accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Methods We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist’s segmentation and the output of a commercial software, CADstream. The results are quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured in terms of the Dice similarity coefficient (DSC). Results The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared to the radiologist’s segmentation and 82.1% accuracy and 100% sensitivity when compared to the CADstream output. The overlap of the algorithm output with the radiologist’s segmentation and CADstream output, computed in terms of the DSC, was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise. Simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series. The amount of overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC=0.95. Conclusion The time-series analysis based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor from DCE-MRI. PMID:24115175

  16. Evaluation of a New Digital Automated Glycemic Pattern Detection Tool

    PubMed Central

    Albiñana, Emma; Artes, Maite; Corcoy, Rosa; Fernández-García, Diego; García-Alemán, Jorge; García-Cuartero, Beatriz; González, Cintia; Rivero, María Teresa; Casamira, Núria; Weissmann, Jörg

    2017-01-01

    Abstract Background: Blood glucose meters are reliable devices for data collection, providing electronic logs of historical data that are easier to interpret than handwritten logbooks. Automated tools to analyze these data are necessary to facilitate glucose pattern detection and support treatment adjustment. Such tools are emerging in broad variety, often without formal evaluation. The aim of this study was to compare eDetecta, a new automated pattern detection tool, to nonautomated pattern analysis in terms of time investment, data interpretation, and clinical utility, with the overarching goal of identifying areas of improvement and potential safety risks early in the development and implementation of the tool. Methods: Multicenter web-based evaluation in which 37 endocrinologists were asked to assess glycemic patterns of 4 real reports (2 continuous subcutaneous insulin infusion [CSII] and 2 multiple daily injection [MDI]). Endocrinologist and eDetecta analyses were compared on the time spent analyzing each report and on agreement about the presence or absence of defined patterns. Results: The eDetecta module markedly reduced analysis time: nonautomated review of the emminens eConecta reports took on average 18 min per CSII case and 12.5 min per MDI case, compared with the automatic eDetecta analysis. Agreement between endocrinologists and eDetecta varied depending on the patterns, with a high level of agreement on patterns of glycemic variability. Further analysis of cases with low agreement identified areas where the algorithms used could be improved to optimize trend pattern identification. Conclusion: eDetecta was a useful tool for glycemic pattern detection, helping clinicians reduce the time required to review emminens eConecta glycemic reports. No safety risks were identified during the study. PMID:29091477

  17. Central Fetal Monitoring With and Without Computer Analysis: A Randomized Controlled Trial.

    PubMed

    Nunes, Inês; Ayres-de-Campos, Diogo; Ugwumadu, Austin; Amin, Pina; Banfield, Philip; Nicoll, Antony; Cunningham, Simon; Sousa, Paulo; Costa-Santos, Cristina; Bernardes, João

    2017-01-01

    To evaluate whether intrapartum fetal monitoring with computer analysis and real-time alerts decreases the rate of newborn metabolic acidosis or obstetric intervention when compared with visual analysis. A randomized clinical trial carried out in five hospitals in the United Kingdom evaluated women with singleton, vertex fetuses of 36 weeks of gestation or greater during labor. Continuous central fetal monitoring by computer analysis and online alerts (experimental arm) was compared with visual analysis (control arm). Fetal blood sampling and electrocardiographic ST waveform analysis were available in both arms. The primary outcome was incidence of newborn metabolic acidosis (pH less than 7.05 and base deficit greater than 12 mmol/L). Prespecified secondary outcomes included operative delivery, use of fetal blood sampling, low 5-minute Apgar score, neonatal intensive care unit admission, hypoxic-ischemic encephalopathy, and perinatal death. A sample size of 3,660 per group (N=7,320) was planned to be able to detect a reduction in the rate of metabolic acidosis from 2.8% to 1.8% (two-tailed α of 0.05 with 80% power). From August 2011 through July 2014, 32,306 women were assessed for eligibility and 7,730 were randomized: 3,961 to computer analysis and online alerts, and 3,769 to visual analysis. Baseline characteristics were similar in both groups. Metabolic acidosis occurred in 16 participants (0.40%) in the experimental arm and 22 participants (0.58%) in the control arm (relative risk 0.69 [0.36-1.31]). No statistically significant differences were found in the incidence of secondary outcomes. Compared with visual analysis, computer analysis of fetal monitoring signals with real-time alerts did not significantly reduce the rate of metabolic acidosis or obstetric intervention. A lower-than-expected rate of newborn metabolic acidosis was observed in both arms of the trial. ISRCTN Registry, http://www.isrctn.com, ISRCTN42314164.

  18. Wide-band profile domain pulsar timing analysis

    NASA Astrophysics Data System (ADS)

    Lentati, L.; Kerr, M.; Dai, S.; Hobson, M. P.; Shannon, R. M.; Hobbs, G.; Bailes, M.; Bhat, N. D. Ramesh; Burke-Spolaor, S.; Coles, W.; Dempsey, J.; Lasky, P. D.; Levin, Y.; Manchester, R. N.; Osłowski, S.; Ravi, V.; Reardon, D. J.; Rosado, P. A.; Spiewak, R.; van Straten, W.; Toomey, L.; Wang, J.; Wen, L.; You, X.; Zhu, X.

    2017-04-01

    We extend profile domain pulsar timing to incorporate wide-band effects such as frequency-dependent profile evolution and broad-band shape variation in the pulse profile. We also incorporate models for temporal variations in both pulse width and in the separation in phase of the main pulse and interpulse. We perform the analysis with both nested sampling and Hamiltonian Monte Carlo methods. In the latter case, we introduce a new parametrization of the posterior that is extremely efficient in the low signal-to-noise regime and can be readily applied to a wide range of scientific problems. We apply this methodology to a series of simulations, and to between seven and nine years of observations for PSRs J1713+0747, J1744-1134 and J1909-3744 with frequency coverage that spans 700-3600 MHz. We use a smooth model for profile evolution across the full frequency range, and compare smooth and piecewise models for the temporal variations in dispersion measure (DM). We find that the profile domain framework consistently results in improved timing precision compared to the standard analysis paradigm by as much as 40 per cent for timing parameters. Incorporating smoothness in the DM variations into the model further improves timing precision by as much as 30 per cent. For PSR J1713+0747, we also detect pulse shape variation uncorrelated between epochs, which we attribute to variation intrinsic to the pulsar at a level consistent with previously published analyses. Not accounting for this shape variation biases the measured arrival times at the level of ˜30 ns, the same order of magnitude as the expected shift due to gravitational waves in the pulsar timing band.

  19. Comparison of CTT and Rasch-based approaches for the analysis of longitudinal Patient Reported Outcomes.

    PubMed

    Blanchin, Myriam; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Blanchard, Claire; Mirallié, Eric; Sébille, Véronique

    2011-04-15

    Health sciences frequently deal with Patient Reported Outcomes (PRO) data for the evaluation of concepts, in particular health-related quality of life, which cannot be directly measured and are often called latent variables. Two approaches are commonly used for the analysis of such data: Classical Test Theory (CTT) and Item Response Theory (IRT). Longitudinal data are often collected to analyze the evolution of an outcome over time. The most adequate strategy to analyze longitudinal latent variables, which can be either based on CTT or IRT models, remains to be identified. This strategy must take into account the latent characteristic of what PROs are intended to measure as well as the specificity of longitudinal designs. A simple and widely used IRT model is the Rasch model. The purpose of our study was to compare CTT and Rasch-based approaches to analyze longitudinal PRO data regarding type I error, power, and time effect estimation bias. Four methods were compared: the Score and Mixed models (SM) method based on the CTT approach, the Rasch and Mixed models (RM), the Plausible Values (PV), and the Longitudinal Rasch model (LRM) methods all based on the Rasch model. All methods have shown comparable results in terms of type I error, all close to 5 per cent. LRM and SM methods presented comparable power and unbiased time effect estimations, whereas RM and PV methods showed low power and biased time effect estimations. This suggests that RM and PV methods should be avoided to analyze longitudinal latent variables. Copyright © 2010 John Wiley & Sons, Ltd.
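The type I error comparison in the abstract above can be illustrated with a minimal simulation sketch (this is an illustrative toy, not the authors' code, and it uses a simple paired t-test on sum scores rather than the full mixed or Rasch models): simulate longitudinal scores driven by a latent trait with no true time effect, test for a time effect repeatedly, and check that the rejection rate stays near the nominal 5 per cent.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def rejection_rate(n_patients=50, n_sims=2000, alpha=0.05):
    """Estimate type I error for a time-effect test on observed scores.

    Each simulated patient has a fixed latent trait; observed scores at
    baseline and follow-up are the trait plus measurement noise, with NO
    true time effect, so the test should reject about alpha of the time.
    """
    rejections = 0
    for _ in range(n_sims):
        trait = rng.normal(0.0, 1.0, n_patients)       # latent variable
        t1 = trait + rng.normal(0.0, 0.5, n_patients)  # baseline scores
        t2 = trait + rng.normal(0.0, 0.5, n_patients)  # follow-up, no shift
        _, p = stats.ttest_rel(t1, t2)                 # test for a time effect
        rejections += p < alpha
    return rejections / n_sims

rate = rejection_rate()
print(rate)  # should be close to the nominal 0.05
```

In the study itself, the same logic is applied with the four candidate analysis strategies (SM, RM, PV, LRM) in place of the paired t-test, and power is estimated analogously by injecting a nonzero time effect.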

  20. The effect of kidney morcellation on operative time, incision complications, and postoperative analgesia after laparoscopic nephrectomy.

    PubMed

    Camargo, Affonso H; Rubenstein, Jonathan N; Ershoff, Brent D; Meng, Maxwell V; Kane, Christopher J; Stoller, Marshall L

    2006-01-01

To compare the outcomes between kidney morcellation and two types of open specimen extraction incisions, several covariates that have not yet been studied need to be taken into consideration. We retrospectively reviewed 153 consecutive patients who underwent laparoscopic nephrectomy at our institution: 107 who underwent specimen morcellation and 46 with intact specimen removal, either through connected port sites with a muscle-cutting incision or through a remote, muscle-splitting incision. Operative time, postoperative analgesia requirements, and incisional complications were evaluated using univariate and multivariate analysis, comparing variables such as patient age, gender, body mass index (BMI), laterality, benign versus cancerous renal conditions, estimated blood loss, specimen weight, overall complications, and length of stay. There was no significant difference in operative time between the 2 treatment groups (p = 0.65). Incision-related complications occurred in 2 patients (4.4%) from the intact specimen group but in none from the morcellation group (p = 0.03). Overall narcotic requirement was lower in patients with morcellated specimens (41 mg) compared to intact specimen retrieval (66 mg) on univariate (p = 0.03) and multivariate analysis (p = 0.049). Upon further stratification, however, there was no significant difference in mean narcotic requirement between the morcellation and muscle-splitting incision subgroups (p = 0.14). Morcellation does not extend operative time and is associated with significantly less postoperative pain compared to intact specimen retrieval overall, although this difference is not statistically significant if a remote, muscle-splitting incision is made. Morcellation markedly reduces the risk of incision-related complications.
